Using ADFS in API Management

Elaborating on the scenario described in the preceding post, we ask ourselves the question:
Can we create a virtual service in Azure API Management that exposes a backend service using ADFS?

More specifically, we ask ourselves the question:
How can we configure the OAuth 2.0 Authorization Server with ADFS security?

Well, this is the answer:

• Click the Security menu on the left hand side of the API Management Publisher Portal.
• Click OAuth 2.0 in the upper right corner.
• Click Add Authorization Server.
• Enter a Name and Description, for instance CalculatorAuthServer.
• Client Registration Page URL. This is the page that users can use to create and configure their own accounts in case your company supports that kind of self-service account management. In this example users do not create and configure their own accounts so a placeholder is used. Enter http://localhost.
• Set authorization grant types to Authorization Code. For further information, see:
• Enter the Authorization Endpoint URL: Note that the resource, being the App ID URI of the relying party trust, is appended to the URL.
• Enter Token Endpoint URL: https://
• Because we specified the resource in the authorization endpoint URL we don’t have to specify an additional body parameter named resource.
• Next specify the ClientId from ADFS. This step needs some explanation. We have to create a separate client for the API Management endpoint in ADFS. We can only create this endpoint after creating the Authorization Server in API Management. The reason is that we need to specify a RedirectURI when creating the ADFS client. This RedirectURI in turn is taken from API Management (see the last step). In this case we choose ClientId=CalculatorVirtualService.
• The ClientSecret can be left empty. ADFS 3.0 has no concept of a client secret, so you don’t have to specify one.
• Next – as we mentioned before – you will see the redirect URI for the authorization code grant type displayed. This field has to be used when creating an ADFS client for the virtual service.

PowerShell command to create the ADFS client:
Add-ADFSClient -Name "CalculatorVirtualService" -ClientId "CalculatorVirtualService" -RedirectUri

I didn’t mention the step to create the virtual service in API Management. But in this case I named the virtual service CalculatorAPI. Azure automatically appends to that name. That explains the above redirect URI.
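To make the shape of the resulting request concrete, here is a minimal sketch of how the authorization request against ADFS is built. The host name, redirect URI, and App ID URI below are placeholder assumptions; /adfs/oauth2/authorize is the standard ADFS 3.0 authorization endpoint path.

```python
from urllib.parse import urlencode

def build_adfs_authorize_url(adfs_host, client_id, redirect_uri, resource):
    """Build an OAuth 2.0 authorization-code request URL for ADFS.

    Note how the resource (the App ID URI of the relying party trust)
    travels as a query parameter on the authorization endpoint URL.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "resource": resource,
    }
    return "https://" + adfs_host + "/adfs/oauth2/authorize?" + urlencode(params)

# Placeholder values, matching the names used in this post:
url = build_adfs_authorize_url(
    adfs_host="sts.contoso.nl",              # hypothetical federation server
    client_id="CalculatorVirtualService",
    redirect_uri="https://contoso.portal.azure-api.net/docs/services/oauth2/console",  # hypothetical
    resource="https://contoso.nl/CalculatorService",  # hypothetical App ID URI
)
```

This also shows why we don’t need a separate resource body parameter: the resource is already part of the authorization request.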

For the sake of completeness: after creating the Authorization Server, we have to specify that the virtual service CalculatorAPI uses the Authorization Server just created.

• In API Management, click the APIs menu and select CalculatorAPI.
• Go to the Security tab, select OAuth 2.0, and select the Authorization Server.
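Once the virtual service is tied to the Authorization Server, clients call it with the ADFS-issued token in the Authorization header, plus the usual API Management subscription key. A minimal sketch of the headers involved; Ocp-Apim-Subscription-Key is API Management's standard subscription header, and the token and key values are placeholders:

```python
def build_apim_headers(access_token, subscription_key=None):
    """Headers for calling an API Management virtual service secured with
    OAuth 2.0: a Bearer token from ADFS, and optionally the subscription key."""
    headers = {"Authorization": "Bearer " + access_token}
    if subscription_key:
        headers["Ocp-Apim-Subscription-Key"] = subscription_key
    return headers

# Placeholder token and key:
headers = build_apim_headers("eyJ...token...", subscription_key="0123456789abcdef")
```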


Securing an Azure Web App with ADFS

Most companies start their journey into Azure with Office365. To secure access to Office365 applications, companies need to set up Azure Active Directory. You could take all user identities in your on-premise Active Directory and recreate them manually in the cloud, but obviously it makes more sense to synchronize the identities automatically. That raises the question of how we can synchronize the user identities between our on-premise Active Directory and the company’s Azure Active Directory. We can use Azure AD Connect (formerly AAD Sync or DirSync) for that. There are different options for synchronization and, as a consequence, there are different identity models:

  • Separate cloud identity and corporate identity. Cloud credentials are completely separate from the corporate credentials. Users simply have to memorize two accounts for two different environments.
  • Synchronized cloud identity and corporate identity, no password sync. In this scenario all user data are synchronized, except for the password.
  • Synchronized cloud identity and corporate identity with password sync. In this scenario all user data are synchronized, including the password. We can implement SSO.
  • Federated cloud identity. In this scenario all user data are synchronized, but there is no password in Azure Active Directory. That means users can sign in to Office365 or other cloud services using their corporate credentials. In Microsoft terms, this is the ADFS scenario. We can implement SSO.

In this post I want to focus on the federated cloud identity scenario. Typically the user is signed in on-premise using Active Directory. If the user wants to use an Office365 application, he can do so without having to sign in to Azure again. That is because we use single sign-on. Now let’s move away from the Office365 scenario. A user comes from outside the corporate network and wants to call an Azure web app that is secured by ADFS, i.e. the user needs a federated cloud identity. Can we accomplish this scenario using an Azure Web App? The short answer to this question is: YES, we can.

How do we go about that?

Step 1:

First, when creating the Azure Web App in Visual Studio, we need to select the Web API template and choose “Change Authentication”. Pick Work and School accounts and select “On-Premises”. Next you will have to specify the following information:

  • On-Premises Authority. This represents the URL of the metadata document of your authority. The authority is the fully qualified domain name of your federation server, for instance: sts.[companyname].nl. The metadata document allows Visual Studio to discover all relevant info about your ADFS (addresses, signing keys, identifiers, etc.). An example of the metadata document URL is: https://sts.[companyname].nl/FederationMetadata/2007-06/FederationMetadata.xml
  • App ID URI. Unique identifier (no URL) of the Azure Web App (=Service). In federation terms this is the realm of your application. The App ID URI is equal to the App ID URI you specify when creating the relying party trust (see step 2).
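To illustrate what a tool like Visual Studio reads from the metadata document, here is a small sketch that extracts the STS identifier (entityID) from a federation metadata document. The XML below is a heavily trimmed, hypothetical stand-in; real FederationMetadata.xml documents are much larger and digitally signed.

```python
import xml.etree.ElementTree as ET

# Heavily trimmed stand-in for a FederationMetadata.xml document (assumption).
SAMPLE_METADATA = (
    '<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata" '
    'entityID="http://sts.contoso.nl/adfs/services/trust"/>'
)

def read_sts_identifier(metadata_xml):
    """Return the entityID attribute, i.e. the identifier of the STS
    that tooling discovers from the federation metadata."""
    return ET.fromstring(metadata_xml).attrib["entityID"]
```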

After creating the Azure Web App as described above we will see that a few Owin/ADAL specific NuGet packages are added to the packages.config file. Also a file named Startup.Auth.cs is added to the App_Start folder. Finally we see that all controllers in our project are decorated with the [Authorize] attribute, meaning that only authenticated users can access our controller.

This is what the code in Startup.Auth.cs looks like:

using System;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;
using Microsoft.Owin.Security;
using Microsoft.Owin.Security.ActiveDirectory;
using Owin;

namespace _4_2_2_dlwopoc
{
    public partial class Startup
    {
        // For more information on configuring authentication, please visit
        public void ConfigureAuth(IAppBuilder app)
        {
            app.UseActiveDirectoryFederationServicesBearerAuthentication(
                new ActiveDirectoryFederationServicesBearerAuthenticationOptions
                {
                    Audience = ConfigurationManager.AppSettings["ida:Audience"],
                    MetadataEndpoint = ConfigurationManager.AppSettings["ida:AdfsMetadataEndpoint"]
                });
        }
    }
}

The code uses the on-premise authority and the App ID URI we specified earlier from the web.config file:
<add key="ida:AdfsMetadataEndpoint" value="" />
<add key="ida:Audience" value="" />

Step 2:

Next we have to create a so-called relying party trust in ADFS. In this case we use ADFS 3.0.


A wizard will guide you through the necessary steps. Important information:

  • Specify WS-Federation as the federation protocol.
  • The certificate can be left empty.
  • Specify a WS-Federation Relying Party URL (not a SAML URL). This is the URL the client must be redirected to after successful authentication. In this case it’s the base URL of the web app.


At the end of the wizard we can choose to add claims. Note that Windows Azure AD features a default set of claims; ADFS, however, does not issue any default claims. You will have to configure the claims yourself. You can, for instance, specify that you want a UPN claim that is taken from the Active Directory LDAP attribute User-Principal-Name.


Further information can be found in this post from Vittorio Bertocci:

Step 3:

Then we need to add a client to ADFS. We can only perform this step using PowerShell. An example PowerShell command is shown below:

Add-ADFSClient -Name "CalculatorClient" -ClientId "CalculatorClient" -RedirectUri

Step 4:

Now we are all set and ready to create our client:

private async void button_Click(object sender, EventArgs e)
{
    string authority = "";
    string resourceURI = "";
    string clientID = "CalculatorClient";
    string clientReturnURI = "";

    AuthenticationContext ac =
        new AuthenticationContext(authority, false);
    AuthenticationResult ar =
        ac.AcquireToken(resourceURI, clientID, new Uri(clientReturnURI));
    string authHeader = ar.CreateAuthorizationHeader();

    HttpClient client = new HttpClient();
    HttpRequestMessage request =
        new HttpRequestMessage(HttpMethod.Get, "");
    request.Headers.TryAddWithoutValidation("Authorization", authHeader);
    HttpResponseMessage response = await client.SendAsync(request);
    string responseString = await response.Content.ReadAsStringAsync();
}

Further information can be found in this post from Vittorio Bertocci:

Securing a Web API with ADFS on WS2012 R2 Got Even Easier

In the next post we will take the same scenario, but then we ask ourselves the question:
Can we create a virtual service in Azure API Management that uses ADFS?

Azure Stack

Very interesting stuff to follow: Bring Azure to your datacenter with Azure Stack.

It brings Azure functionality on-premise. This includes infrastructure services like virtual machines, virtual networks, storage accounts, etc. But it also includes all sorts of platform services like Azure App Services. The Azure Portal and the Azure Resource Manager will also become available on-premise. Very interesting!


Link: Azure Stack


Tomorrow I will talk to a client who is considering the use of NServiceBus instead of BizTalk. I’m not an expert on the matter, so I did some online research. The best source I found is the following link: NServiceBus. It’s a theoretical explanation, but note that there’s a link to a code example at the bottom of the post.

In short, NServiceBus is a code-based .NET implementation of a service bus. NServiceBus is not as rich a product as BizTalk Server, but it can be used for simpler scenarios that need pub-sub and message persistence via queues or databases. Additional features:

  • NServiceBus supports defining long-running processes using sagas.
  • NServiceBus can also be hosted in Azure (VM or website). You can also build a hybrid scenario using the Azure Service Bus Queues and Topics.
  • Finally, NServiceBus provides built-in message auditing for every endpoint. Just tell NServiceBus that you want auditing and it will capture a copy of every received message and forward it to a specified audit queue.
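NServiceBus itself is a .NET library, but the pub-sub and auditing ideas above are easy to sketch in a few lines. The toy bus below is purely illustrative, not NServiceBus code: subscribers register per message type, and when auditing is enabled a copy of every published message is forwarded to an audit queue.

```python
from collections import defaultdict, deque

class MiniBus:
    """Toy in-memory bus illustrating pub-sub plus message auditing."""

    def __init__(self, audit=False):
        self.handlers = defaultdict(list)  # message type -> subscriber handlers
        self.audit_queue = deque()         # copies of every published message
        self.audit = audit

    def subscribe(self, message_type, handler):
        self.handlers[message_type].append(handler)

    def publish(self, message_type, payload):
        if self.audit:
            # Forward a copy of every message to the audit queue.
            self.audit_queue.append((message_type, dict(payload)))
        for handler in self.handlers[message_type]:
            handler(payload)

# Usage:
bus = MiniBus(audit=True)
log = []
bus.subscribe("OrderPlaced", log.append)
bus.publish("OrderPlaced", {"order_id": 42})
```

A real service bus adds durable queues, retries, and distribution across machines, which is exactly where the maintenance concerns discussed below come in.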

Note that the documentation on NServiceBus is quite limited. I also found a commercial link from Particular Software: Link. It contains much information, including the fact that the product is not free. My general impression is that NServiceBus has two important weak points: it’s code-based, and there’s no big community and/or extensive documentation. Furthermore, it’s a distributed solution: you need to install software on all nodes you need to connect, and you have to add specific code and config to each endpoint you need to connect. Altogether this makes the product hard to maintain and not future-proof.

Azure Queues versus ServiceBus Queues

Microsoft Azure supports two types of queue mechanisms: Azure Queues and Service Bus Queues.

While both queuing technologies exist concurrently, Azure Queues were introduced first, as a dedicated queue storage mechanism built on top of the Azure storage services. Service Bus queues are built on top of the broader “brokered messaging” infrastructure designed to integrate applications or application components that may span multiple communication protocols, data contracts and/or network environments.

The article in the following link compares the two queue technologies offered by Azure. The article also provides guidance for choosing which features might best suit your application development needs. Link: Azure Queues vs ServiceBus Queues.
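As a rough rule of thumb based on the well-known feature differences (Service Bus queues offer guaranteed ordering via sessions, topics/subscriptions, duplicate detection, and larger messages, up to 256 KB at the time versus 64 KB for storage queues), the choice can be sketched as a simple decision function. The thresholds below are assumptions drawn from that comparison article:

```python
def choose_queue(needs_ordering=False, needs_topics=False,
                 needs_duplicate_detection=False, max_message_kb=1):
    """Pick a queue technology from a few feature requirements.
    Rule of thumb only; see the comparison article for the full picture."""
    if (needs_ordering or needs_topics or needs_duplicate_detection
            or max_message_kb > 64):  # storage queue messages max out at 64 KB
        return "Service Bus queue"
    return "Azure storage queue"
```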

Using Certificates in Azure

For a recent RFI I had to call a SOAP web service using an X.509 certificate. This made me look into the issue of using certificates. Obviously you don’t have a certificate store in Azure. So, how does it work?

First of all, I found out that things work differently for cloud services (web + worker roles) and Azure Websites. In Azure Cloud Services you can simply upload a certificate through the Azure Management Portal. Next you can specify that certificate’s thumbprint and install location in your role’s properties. On deployment, the fabric controller automatically installs the certificate for you. That makes sense, because each role gets deployed into a virtual machine (with a certificate store, I guess). The following code lists the certificates in the store:

private List<string> GetAvailableCertificatesFromStore()
{
    var list = new List<string>();
    var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
    store.Open(OpenFlags.ReadOnly);

    foreach (var cert in store.Certificates)
    {
        // todo: add friendly name
        list.Add(string.Format("{0}", cert.Subject));
    }

    store.Close();
    return list;
}

In the Azure Website scenario things work a little differently. Because of security restrictions in an Azure website, you just can’t install a certificate in the certificate store. To work with certificates, you would need to include the certificate’s PFX file in the App_Data folder. Still you may run into errors like “CryptographicException: The system cannot find the file specified.” Check the following blog post:

Later on in the original blog post, I read an alternative way to work with certificates. Not sure which version is the correct one. Working procedure:

  • Upload your certificate through the Azure Portal.
  • Add an app setting called WEBSITE_LOAD_CERTIFICATES and set its value to the thumbprint of the uploaded certificate (use a comma-separated list for multiple thumbprints, or use the * wildcard to load all your uploaded certificates). I’m presuming this forces the certificates to be loaded into memory.
  • To load your certificate, you can do the following:
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var certs = store.Certificates.Find(X509FindType.FindByThumbprint, YOUR_THUMBPRINT, false);

Change the 'false' to 'true' if you want to ensure the certificate is valid. I found this information here, which explains it much better than I have:

Additional reading:
Link 1: Signing a SOAP message using a certificate (non Azure)
Link 2: Export certificate + Add to Azure Portal (old?)

Working with Files in Azure

For a recent RFI, I had to sort out some information on file storage. Basically I had to use a File Connector to retrieve files and trigger a Logic App. As far as I can see, you can only use local file shares with hybrid connections when using the File Connector. Below is the information I found:

Link 1: Using the File Connector

Link 2: Azure File Storage

The second link is not directly related to the original question, but relevant for the issue at hand.

Microsoft Integration Roadmap

Received an email from BizTalk360 with important news. I have included the full text below:

For a very long time there were lots of debates and frustrations within the Microsoft BizTalk Server community about what Microsoft’s roadmap would be for BizTalk Server plus some of the new cloud integration offerings like MABS (Microsoft Azure BizTalk Services), Azure App Services (Logic Apps and API Apps), and to some extent Service Bus (Queues and Topics).

Until this point there were no clear directions, apart from a vague belief that BizTalk Server would be released every two years and that Microsoft would continue its investment in the cloud. Two days ago Microsoft released a clear roadmap document, “Microsoft Integration Roadmap 1.0”, which clearly explains the vision for Microsoft Integration.

Link 1: BizTalk360 Summary

Link 2: Microsoft Integration Roadmap



Azure MVC Web App with Active Directory security

When you develop an Azure web service, you typically create both a service application and a client application in Azure Active Directory. A website is run from the browser. That means you only create an application for the website in Active Directory. The problem I faced was basically: how do I pass the credentials from browser to website? And how do I configure the browser to use the correct AD settings? Obviously, I don’t have a web.config in the browser scenario.

Anyway, the solution was quite simple. Open Visual Studio and add a new project: ASP.NET Web Application, and select the MVC template. Next click Change Authentication. Select the correct Active Directory domain, typically [company]. Leave the Access Level at Single Sign On, or change it to Single Sign On, Read Directory Data. Finally, expand More Options and make sure the App ID URI is filled in appropriately. Next you can check the Host in the Cloud option. Normally I don’t do that, because I have more control when I manually create the Web App via the Azure Portal.

The above steps are all you have to do to create a default MVC web app with all necessary NuGet packages and references. The most important source code file is Startup.Auth.cs in the App_Start folder. This class file is very different from the same file you get when you create a web service with AD security. Also note that all controllers are decorated with the [Authorize] attribute. This ensures that users coming from a browser will have to authenticate. Hit F5 to run the application locally. You will get a sign-in page where you can enter the credentials and off you go.

Now comes the interesting part: you want to deploy your web app to Azure. Right-click the project and select Publish. I will not go through the publishing steps, but let’s say you were able to publish the app. At first you will receive an error like ‘Server Error in ‘/’ Application’. Not very informative. You can change the web.config file to turn off custom errors. Tip: you can perform this action without republishing by selecting the web.config from the Server Explorer. Visual Studio will say [remote] in the document tab. That’s exactly what you want. Make the change and hit Save. You will still receive an error, but this time somewhat clearer:

0x80131904): A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 52 – Unable to locate a Local Database Runtime installation. Verify that SQL Server Express is properly installed and that the Local Database Runtime feature is enabled.)]

A database error seems a bit strange, since the default MVC application doesn’t use a database. When you open the web.config file, however, you will see the following connection string:

What’s this about? Well, if you’ve written MVC applications you’ll know this is where Visual Studio stores data related to ASP.NET authentication. What the application is trying to do is grab two specific pieces of information so it can authenticate the client using SAML assertions against Azure AD. The problem is that once you deploy it to the web, this database is not accessible anymore. So how do you fix it, and what goes in the database?
I copied the following information from this link:

In order to fix the problem we have to look at the local database. To do that, do the following:
• In Server Explorer, expand the Data Connections node; you should already see a connection to the application’s database.
• If it doesn’t exist, add it as follows:
  • Right-click Data Connections and select Add Connection.
  • Change the data source to Microsoft SQL Server Database File.
  • Browse to the App_Data folder and select the .mdf file there.
  • Click OK.
• Expanding the table nodes, we find a very simple schema.
• The important tables are the IssuingAuthorityKeys and the Tenants tables. Right-click each table and select Show Data.

Each table has a single column and a single row. These tables need to be moved to a database accessible to the ASP.NET MVC application running in Azure. For our purposes we’ll use a SQL Database. For simplicity’s sake I’ve created the SQL statements that need to be executed against a SQL database. Replace the two insert statements with the values found in each of the respective tables, for example:

CREATE TABLE [dbo].[__MigrationHistory] (
    [ContextKey]     NVARCHAR (300) NOT NULL,
    [ProductVersion] NVARCHAR (32)  NOT NULL
);

CREATE TABLE [dbo].[IssuingAuthorityKeys] (
    [Id] NVARCHAR (128) NOT NULL  -- single column; match the local schema
);

CREATE TABLE [dbo].[Tenants] (
    [Id] NVARCHAR (128) NOT NULL  -- single column; match the local schema
);

INSERT INTO [dbo].[IssuingAuthorityKeys] VALUES ('<value from the local IssuingAuthorityKeys table>');
INSERT INTO [dbo].[Tenants] VALUES ('<value from the local Tenants table>');

If you don’t have an existing SQL database, create one in the Azure Management Portal. Now you can use the Query Editor by right-clicking the database in the Server Explorer. Paste the script in the SQL window and click Run (after replacing the issuing authority key and tenants key). Now we have tables that hold the appropriate data needed by the application. Let’s try publishing the application once again. This time, however, we’re going to make a change to the connection string. You can grab the connection string from the Azure Portal.

There’s one last problem. By default SSL is not set up. Yet SSL is required by default. You’ll get the following error message: ID1059: Cannot authenticate the user because the URL scheme is not https and requireSsl is set to true in the configuration, therefore the authentication cookie will not be sent. Change the URL scheme to https or set requireSsl to false on the cookieHandler element in configuration.

Open the web.config file and edit the following line:

Change “true” to “false”, then publish the application again.

Again we will receive an error, saying: This page can’t be displayed. What’s the problem now? Looking at the URL we can see that the callback for the logon takes us back to https://localhost:4430, only we aren’t running on our local machine anymore; we want to go to the deployed Azure URL. The fix is simple, but it’s important to understand the end-to-end flow. In Active Directory, drill into the appropriate application and click Configure. Update the Sign In URL and the Reply URL to the correct URL on Azure. Click Save. In a new browser window, enter the URL.
And it will work!

Now, a final word about logging on with a Microsoft account. For some reason (I don’t know why), when you try to log on with a Microsoft account the first time, it returns null. If you try logging on a second time it works fine. The fix is a simple change to the _LoginPartial.cshtml page, which doesn’t handle nulls at all. Add the following to the top of the _LoginPartial page:

var user = "Null User";
if (!String.IsNullOrEmpty(User.Identity.Name))
{
    user = User.Identity.Name;
}

And update the user action link to the following:
@Html.ActionLink(user, "UserProfile", "Home", routeValues: null, htmlAttributes: null)

Republish the application and now it will handle any situation.