Category Archives: General

Azure ServiceBus Relay – 50200: Bad Gateway

TL;DR: This error message is not always caused by proxy issues. After last week's updates, an old version of the Service Bus DLLs (2.2.3) on the relay server side caused this error on the client side when calling service operations.

Last week I arrived at the office and was greeted by a status screen that contained a lot more red lights than when I had left the day before. That in itself wasn't too strange; we monitor customers' servers as well, and who knows what kind of update/reboot schedule those guys have. However, the fact that the only servers experiencing problems were the ones we host ourselves made me a bit suspicious.

After some investigation I noticed the error message from the title in our logging. Apparently it can be found in two variations: 50200: Bad Gateway and, of course, 502: Bad Gateway. I had encountered this issue before at a customer that uses a proxy, and every Google result led me to believe that this was indeed a proxy issue on our side as well. However, we don't have a proxy running in our network, and it was working fine before.

After some digging I noticed that only the servers that had received updates and were rebooted the night before were experiencing issues; servers that had not been updated were fine. It turned out that one of the updates did not play well with the old (2.2.3) version of the Service Bus DLLs we were still using (the software had been running fine for 3 years, why update?). So after updating to the latest version that could still run on .NET 4 (2.8.0 if I remember correctly) and updating the software on the rebooted servers, we were back in business again.
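For reference, the fix itself was just a NuGet package update. A minimal sketch with nuget.exe for a packages.config project (the WindowsAzure.ServiceBus package id is the one the library shipped under at the time; the 2.8.0 version is from memory, so verify it for your own setup):

nuget.exe update packages.config -Id WindowsAzure.ServiceBus -Version 2.8.0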

Visual Studio 2013 Update 2 dialog to trust IIS express SSL certificate keeps returning

The title says it all. Since I installed Visual Studio 2013 Update 2, running my SSL enabled web projects presented me with the following message:

But even after checking the 'Don't ask me again' checkbox, I was presented with the exact same message the next time I tried to debug.

I decided to start up my go-to tool for these situations, SysInternals ProcMon (http://technet.microsoft.com/en-us/sysinternals/bb896645), to find out what was going wrong. I suspected Visual Studio had problems writing to the correct registry key or something like that. It turned out I was right:

It was just a question of adding this DWORD value to the registry (HKCU\Software\Microsoft\VisualStudio\12.0\WebProjects\HideTrustIISCertificatePrompt) and setting it to 1 to make the message box disappear forever.
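If you prefer the command line over regedit, the same change can be made with reg.exe (run it as the user that starts Visual Studio):

reg add "HKCU\Software\Microsoft\VisualStudio\12.0\WebProjects" /v HideTrustIISCertificatePrompt /t REG_DWORD /d 1 /f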

Disclaimer: If you don’t know how to handle registry changes you should probably read up on that kind of stuff (or let your sysadmin fix it for you).

Legacy and .NET interop on x86 vs x64

The problem

When writing .NET applications that need to interoperate with (legacy) 32-bit ODBC drivers, COM libraries and the like, you might run into problems on 64-bit OS installations. By default, applications written using the .NET Framework 4 or earlier will be compiled with the "Platform Target" option set to "Any CPU". When such an application accesses a 32-bit COM component on a 64-bit machine, you might see the following kind of exception:

This operation failed because the QueryInterface call on the COM component for the interface with IID ‘{YOUR-GUID-HERE}’ failed due to the following error: Old format or invalid type library. (Exception from HRESULT: 0x80028019 (TYPE_E_UNSUPFORMAT)).

When trying to access an ODBC connection, you might find that there are separate 64-bit and 32-bit ODBC connections, and that older drivers can only be used for 32-bit ODBC connections.
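As a side note, on 64-bit Windows the two flavours of the ODBC Administrator live in confusingly named folders; the 32-bit one is the one that sees your legacy drivers:

C:\Windows\System32\odbcad32.exe (64-bit ODBC Administrator)
C:\Windows\SysWOW64\odbcad32.exe (32-bit ODBC Administrator)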

Solution 1

If you catch this during in-house testing, you can fix the problem by forcing the compiler to target the x86 platform. Go to the project properties, then to the "Build" tab, and set the "Platform Target" to "x86" like so:

However, be aware that this is a configuration-specific setting, so if you are wondering why your release builds still give you these problems, you should also apply this setting to the Release configuration.
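If you build from the command line, the same setting can also be forced per build without touching the project file; a sketch, with MyApp.csproj as a placeholder:

msbuild MyApp.csproj /p:Configuration=Release /p:PlatformTarget=x86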

If all of the components in your solution should be built using the x86 platform target, or if you want to be able to create separate x64 and x86 builds, it is probably better to create solution platforms for both. You can do this by right-clicking the solution and choosing "Configuration Manager":

Here you can create custom solution platforms and select the platform for each of your projects.

Solution 2

If you have been using the .NET Framework 4.5 from the start of your project, you might have noticed that you do not run into the above problems. This is because of a new setting that is on by default, called "Prefer 32-bit" (this setting was conveniently greyed out in my .NET Framework 4 project screenshot above).

If it is possible to upgrade to the .NET Framework 4.5, this is a much cleaner way of dealing with things. Just upgrading is not enough, though: after the upgrade you DO need to actually check the box to enable it.
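For command-line builds, the checkbox maps to the Prefer32Bit MSBuild property, so a sketch like this (MyApp.csproj is a placeholder) should have the same effect:

msbuild MyApp.csproj /p:Prefer32Bit=true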

More info on this option can be found here: http://blogs.microsoft.co.il/sasha/2012/04/04/what-anycpu-really-means-as-of-net-45-and-visual-studio-11/

Solution 3

So what if you shipped your product built against .NET 4.0 with the Platform Target set to "Any CPU"? The customer might not want to upgrade to your new and improved version, and you don't want to rebuild the version he is using. Or you just found out, and a release with the fix might be months away (and of course the customer can't wait that long).

When that happens, it is time to fall back to solution number 3: CorFlags.exe, a tool installed with Visual Studio and available from the "Developer Command Prompt".

This tool allows you to edit the header section of your .NET executable and can be used in the following way:

CorFlags.exe MyApp.exe /32BIT+
or
CorFlags.exe MyApp.exe /32BIT+ /Force

The latter is needed if you have a strong-named (signed) assembly. Beware: this will break the signature, so be careful with that. More info on the tool can be found here: http://msdn.microsoft.com/en-us/library/ms164699(v=vs.110).aspx
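Tip: running the tool with just the assembly name and no options prints the current header flags, so you can check what a binary is set to before and after the change:

CorFlags.exe MyApp.exe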


Updating DBML via drag/drop in Visual Studio 2012 not working

I don't usually work with LINQ to SQL DBMLs anymore, but today I had to work on a project that still used them. I needed to update some tables, so I removed them as I always do and then tried to drag/drop them from the Server Explorer.

But not today. Today Visual Studio decided that the Server Explorer would not even allow me to begin dragging a data connection's table, let alone drop it on the DBML.

After some creative googling I found out that the problem was with the file called dsref80.dll. This file is located in the following folder on an x64 machine: "C:\Program Files (x86)\Common Files\Microsoft Shared\Visual Database Tools". On my machine it had the following version:

That's right, it says Visual Studio 2013 Preview (something I had already removed due to some other problems a couple of weeks ago). So the uninstall of Visual Studio 2013 Preview had left this file lying around, causing Visual Studio 2012 to panic and not load it, which in turn broke the drag/drop action of a Table object from Server Explorer to my DBML surface.

You could probably fix this issue by manually removing the file and then running a Visual Studio 2012 repair, but that is a somewhat time-consuming endeavour. So I opted for the fast route, which goes as follows:

– Find a machine with Visual Studio 2012 installed that never had the Visual Studio 2013 Preview installed.
– Copy the dsref80.dll from that machine to your own machine, overwriting the 2013 one.
– I didn’t even have to restart VS or reload the DBML, drag/drop worked again instantly.
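For the record, the copy step boils down to something like this (GOODMACHINE is a placeholder for the unaffected machine; run from an elevated prompt because the target folder is under Program Files):

copy "\\GOODMACHINE\c$\Program Files (x86)\Common Files\Microsoft Shared\Visual Database Tools\dsref80.dll" "C:\Program Files (x86)\Common Files\Microsoft Shared\Visual Database Tools\dsref80.dll"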

The version of the correct file showed as follows:

Things I fixed today #5

I've planned myself one of those problem-fixing days again, and I started off with an easy one:

A critical SharePoint event log entry that said: The Execute method of job definition Microsoft.SharePoint.Administration.SPSqmTimerJobDefinition etc.

Luckily I was not the only one with this problem. Ian Anker wrote this up on his blog way better than I ever could. So here is a link:

http://ianankers.wordpress.com/2011/08/19/sharepoint-2010-the-execute-method-of-job-definition-microsoft-sharepoint-administration-spsqmtimerjobdefinition-threw-an-exception-event-id-6398/

Things I fixed today #4

Issue 4: SignTool stopped working

After a recent reinstall of my developer machine I received errors when compiling projects that use SignTool in the Post-build event to sign the built assembly with an authenticode certificate.

This is because the CAPICOM SDK is deprecated and won't work out of the box on x64 versions of Windows. These days there is probably a better way to sign assemblies (which I should research some time), but since we have quite a few build machines and legacy projects using this method, it seemed best to just get things working again.

I found the answer here:

http://www.peppercrew.nl/index.php/2011/02/windows-7-x64-signtool-error-signtool-requires-capicom-version-2-1-0-1-or-higher/

In short:
– Download the CAPICOM SDK from Microsoft
– Install it.
– Copy the capicom.dll to your SysWOW64 folder.
– Register it using the regsvr32.exe in that same folder.
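In command form, the last two steps look something like this (run from an elevated prompt; adjust the path to wherever the SDK installer dropped capicom.dll):

copy capicom.dll %windir%\SysWOW64
%windir%\SysWOW64\regsvr32.exe %windir%\SysWOW64\capicom.dll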

Things I fixed today #3

Issue 3: Problems accessing TFS’s Analysis Server from Excel

One of my co-workers wanted to use the “Create Report in Microsoft Excel” feature from the Team Explorer in Visual Studio 2012. However, once Excel had started, he was presented with the following error:

TF208056: An error occurred while the command was accessing the Analysis Services database Tfs_analysis on the server xxxxxx.

This happens because my co-worker's machine is located in another domain, and the configured server name was not a fully qualified domain name (FQDN). To fix this, I had to change the TFS Reporting configuration and set the SQL Server property to the FQDN.

First go to the TFS Administration console on your server:

Go to Reporting and then click the Edit button. You will be presented with a dialog stating that the jobs will be disabled. When you click Ok, the actual configuration screen appears:

Go to the Analysis Services tab and enter the FQDN of the SQL Server. You will have to enter the password for the data access account before you can hit the OK button. After you hit OK, be patient: it might take a few minutes before the dialog disappears. Once it has, start the warehouse and analysis jobs again with the button on the TFS Administration page described above.

It can take a while before the change is visible on all clients.

Things I fixed today #2

Issue 2: Another Event log Error on our TFS production server

This was also a SharePoint issue, though one a bit easier to solve. The error was:

Load control template file /_controltemplates/SearchArea.ascx failed: ‘Microsoft.SharePoint.WebControls.SearchArea,Microsoft.SharePoint,Version=12.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c’ is not allowed here because it does not extend class ‘System.Web.UI.UserControl’.

After some research I found out that the SearchArea.ascx file was left behind when we upgraded from WSS 3.0 to SharePoint Foundation 2010. Renaming the file on disk to SearchArea.ascx_old stopped the errors from popping up. I guess you could probably remove it altogether, but after that first issue I didn't want to tempt fate any more than I had to.
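In command form the rename looks like this (the "14" hive path applies to SharePoint Foundation 2010; older or newer versions use a different hive number):

cd /d "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\CONTROLTEMPLATES"
ren SearchArea.ascx SearchArea.ascx_old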

And now to lunch.

Things I fixed today #1

Sometimes you have those days where all you do is troubleshoot and fix ‘problems’. These can be your most fulfilling days, or the most frustrating ones depending on how many of these issues you can resolve before lunch (or the end of the day for that matter).

Now and again, I can plan my day of troubleshooting: pick up those event log errors on my TFS server that don't hurt too much but keep annoying my SCOM admin, or fix that thing you have a workaround for that costs you 10 extra minutes but isn't deemed very urgent.

Well, today I had one of those days, and I thought I'd share the solutions to the problems I solved.

Issue 1: An Event log Error on our TFS production server

This is a Sharepoint issue, so probably not TFS specific, but the full error was:

An exception occurred when trying to issue security token: Could not connect to http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc. TCP error code 10061: No connection could be made because the target machine actively refused it 127.0.0.1:32843.

It turned out there was an application pool down on the server (the "SharePoint Web Services Root" pool, to be precise). After starting this thing back up, the problems were solved, or so I thought. It turned out I had opened a Pandora's box which, had I known beforehand, would probably have stopped me from going the route I'm about to describe and made me just reinstall SharePoint on the TFS server.
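For reference, starting a stopped application pool can also be done from the command line with appcmd, assuming the default IIS paths:

%windir%\system32\inetsrv\appcmd.exe start apppool /apppool.name:"SharePoint Web Services Root"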

First of all, the "SharePoint Web Services" website in IIS Manager was presenting a strange error: The object identifier does not represent a valid object. With no good solution to be found on the Internet, I recreated the website by hand (including all bindings, advanced settings, virtual directories and applications). This solved the immediate problem of the error message in IIS Manager.

But the Security Token Service still wouldn't work, so I decided to compare the IIS settings and configuration to a working installation of SharePoint Foundation 2010. It turned out our TFS server did not have the SecurityTokenServiceApplication application configured. So I created it and pointed it at the correct app pool, folder, etc. After that I had to change the web.config of SecurityToken.svc, because it enabled the WindowsAuthentication module, which was already enabled.

To make a very long and tedious story short: it now works, and the errors have disappeared from my event log and the SharePoint Admin Monitoring page. But next time I think I'll just reinstall SharePoint Foundation 2010. This was 4 hours of my life I am not going to get back.

Your own private NuGet gallery

You probably know NuGet (and if you don't, just click that link). And, if you write a lot of the same types of projects, you probably have a few libraries you reuse in those projects. Wouldn't it be nice if you could install those libraries just as you install jQuery, Entity Framework or "your favorite package here", without sharing that package with the entire world?

As always, there are plenty of ways to achieve this. Each has its pros and cons, and I decided to do a write-up of what is possible. So, first the options:

  • Simple file share containing packages
  • Automatically synchronized (cloud) file share containing packages
  • Internal NuGet server
  • NaaS – NuGet as a Service

So now that we have the options, let's see what they are all about.

Simple file share containing packages

When you open the NuGet settings you will be able to add your own package source. This does not have to be a URL per se, but can also be a file location on a network drive or your own machine:

Package source settings
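For what it's worth, you can also register such a source from the command line with nuget.exe (the share name is a placeholder):

nuget.exe sources add -Name "Internal packages" -Source \\fileserver\nuget-packages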

Pros:

  • Easiest to setup.
  • When using a file share on a network drive, accessible for all users inside your network.

Cons:

  • Not easily available outside your company network.

Automatically synchronized (cloud) file share containing packages

The same as above, but now you use SkyDrive, Dropbox, or any other cloud storage provider to synchronize that folder to the cloud for easy access when working from home or on a plane.

Pros:

  • Still easy to set up (on your clients at least).
  • Package source always with you, and automatically syncs when you are online.

Cons:

  • Setting up access rights for more than a small team (4 or 5 people) is going to be a maintenance nightmare.
  • Try explaining a Dropbox or SkyDrive installation on your build server to Corporate IT.

Internal NuGet server

In this scenario you set up your own NuGet server, which will probably consist of a website and some web services hosted on IIS (I am not sure if there are any non-.NET NuGet servers out there). There are a few options, with varying degrees of features and installation effort needed.
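Once such a server is running, publishing a package to it typically looks like this (the URL and API key are placeholders; the exact push URL depends on the server implementation you pick):

nuget.exe push MyLibrary.1.0.0.nupkg -Source http://nuget.example.local/ -ApiKey YOUR-API-KEY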

Pros:

  • Once installed, easy to expose to outside the local network so you can access it from anywhere.
  • Works just like the official NuGet Package Source, which means that you don’t need separate build scripts for packages you want to push to the official source and the internal source.
  • Easily incorporated in automatic build scenarios.

Cons:

  • Hard(er) to set up.

NuGet as a Service

I recently found MyGet, which hosts NuGet feeds. The idea is quite simple: you register with them, a feed is created for you, and you can contribute to it. They have four different plans, of which only one is free. More details on those plans here.

Pros:

  • Easy to set up.
  • The enterprise plan has a lot of nice options.
  • Easily incorporated in automatic build scenarios.
  • Available wherever you have an internet connection.

Cons:

  • Advanced and user/quota management features come at a price.

Conclusion

As you can see, there are plenty of options to create your own NuGet feed. I have chosen to install my own NuGet server (because I only found out about MyGet after I finished, and I love to get my hands dirty with that kind of stuff). So I’ll probably write a tutorial on that one some day.

Whether you need/want a private NuGet package feed is totally up to you. But if you do, there are plenty of options to get you started, and I hope this post has helped you make a decision.