Logic App Standard versus Logic App Consumption

In this post, I want to share some findings on Logic App Standard. Not a definitive guide, but food for thought. My two cents: Standard Logic Apps are definitely more complex. One piece of advice could be to use Standard Logic Apps only when you need to access on-premises resources via API Management. In that case, VNet integration could be a security requirement.

  • Standard Logic Apps are single tenant, not multi-tenant. Isolation can be good for security, reliability and performance.
  • A Standard Logic App has reserved capacity and dedicated resources. That’s why a Standard Logic App needs to be deployed to an App Service Plan, which makes Standard Logic Apps a very expensive option (see the pricing model below). In theory it’s possible to share one App Service Plan between multiple logic apps, but this results in a very complex model that makes performance problems hard to troubleshoot when they occur. Auto scaling is not an option.
  • Standard Logic Apps are required for VNet integration (replacing ISE). Consider the differences, however: the Consumption tier gives you an encrypted communication channel over the public internet, while the Standard tier lets you communicate over the Azure backbone. The latter is more secure, but only required for high-security scenarios. In modern applications, Identity and Access Management (via Azure AD) is preferred over network security.
  • Standard Logic Apps hold multiple workflows, just like Function Apps hold multiple functions. You can use stateful workflows next to stateless workflows. In theory, stateless workflows are best for workflows with a maximum duration under 5 minutes. In practice, I would rarely make use of stateless workflows, because you lose the run history.
  • Standard Logic Apps have changed, limited, unavailable, or unsupported capabilities, but the limited capabilities don’t need to be blocking. See: Logic App Tiers Comparison
  • Standard Logic Apps can be developed in Visual Studio Code. A developer performed a POC and reported back that the Visual Studio Code experience is not optimal.
  • Standard Logic Apps are not easily deployed with ARM, Bicep, or Terraform. Terraform has some workflow modules, but they are highly impractical. The only feasible way is to deploy Standard Logic Apps from a package. A developer reported back that he was able to deploy a simple workflow, but I could not get a parameterized Standard Logic App with connections to deploy. Logic App connections require separate JSON files (see the sketch after this list).
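
To give an idea, below is a minimal sketch of such a separate connections.json file for a Standard Logic App with one managed Service Bus connection. The connection name and the resource IDs are hypothetical placeholders, so treat this as an illustration of the file shape, not as a ready-made deployment artifact.

{
    "managedApiConnections": {
        "servicebus": {
            "api": {
                "id": "/subscriptions/<subscription-id>/providers/Microsoft.Web/locations/<region>/managedApis/servicebus"
            },
            "connection": {
                "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Web/connections/servicebus"
            },
            "connectionRuntimeUrl": "<connection-runtime-url>",
            "authentication": {
                "type": "ManagedServiceIdentity"
            }
        }
    }
}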

Standard has a fixed pricing model (no pay-as-you-go).

Perform Null Check in Logic App

Yet another issue that was very hard to solve and hard to test. I was performing an HTTP call that either returned a success, a failure or a time-out. In case of a time-out, I had to do a retry by abandoning the Service Bus message. In case of a regular error, I had to send the message to the dead-letter queue. The first thing I found out is that the time-out didn’t result in an actual time-out, but in a bad request with an error description shown in the run history, but without a message body.

First, I tried to check if outputs('Http')['body'] equals null. An error was returned that the expression could not be evaluated, because the property body could not be selected. In other words, I wasn’t able to perform a null check.

In the end, I didn’t find a way to test directly against null or undefined, but I found a workaround using coalesce (return the string value 'NotExist' if the body is null). This is what the expression looks like:

"expression": {
"or": [
              {
                  "equals": [
                              "@coalesce(body('HTTP'), 'NotExist')",
                              "NotExist"
                            ]
                }
              ]
}
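
To sketch how this check could drive the retry and dead-letter handling described above: the condition below abandons the Service Bus message when the body is missing (the time-out case) and dead-letters it otherwise. The action names and the Service Bus inputs are hypothetical placeholders, not the exact actions from my workflow. Note the runAfter on both Succeeded and Failed: the condition must also run when the HTTP action fails with the bad request described above.

"Check_timeout": {
    "type": "If",
    "expression": {
        "or": [
            {
                "equals": [
                    "@coalesce(body('HTTP'), 'NotExist')",
                    "NotExist"
                ]
            }
        ]
    },
    "actions": {
        "Abandon_the_message": {
            "type": "ApiConnection",
            "inputs": "<Service Bus abandon operation>"
        }
    },
    "else": {
        "actions": {
            "Dead-letter_the_message": {
                "type": "ApiConnection",
                "inputs": "<Service Bus dead-letter operation>"
            }
        }
    },
    "runAfter": {
        "HTTP": [ "Succeeded", "Failed" ]
    }
}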

Sequential Processing in a Logic App

In a Logic App, I wanted to process a number of descriptions sequentially, but the descriptions were processed out of order. How come?

By default, “For each” iterations run at the same time, or in parallel. To run sequentially, you must enable Concurrency Control in the settings of the “For each” loop and then set the limit to 1.

It’s as easy as that, but quite hard to pinpoint if you are not aware of this default behavior. Use it to your advantage!
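
In the underlying workflow definition, Concurrency Control with a limit of 1 corresponds to the runtimeConfiguration below. This is a minimal sketch; the action name, the foreach expression and the inner action are hypothetical.

"For_each_description": {
    "type": "Foreach",
    "foreach": "@body('Get_descriptions')",
    "actions": {
        "Process_description": {
            "type": "Compose",
            "inputs": "@item()"
        }
    },
    "runtimeConfiguration": {
        "concurrency": {
            "repetitions": 1
        }
    },
    "runAfter": {}
}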

Deploy Logic App via Portal

Another quick gotcha. If you want to migrate a Logic App from Development to Acceptance and you don’t want to use PowerShell/ARM or Visual Studio Team Services (DevOps), you can perform the following steps in the Azure Portal:

  1. Clone the Dev version of your Logic App and name it Acc_LogicApp (follow the correct naming convention).
  2. Go to the Dev resource group. Select your Logic App and move it to the Acc resource group.

Logic App parameters in SQL

In a previous post, I explained how you can use Logic App parameters in ARM templates. This solution is quite complicated: you have to parameterize your Logic Apps, and at deployment the parameters are substituted with actual values. But if you look up the logic app in the Azure Portal, you will notice that the value is not displayed, just a reference to a parameter. That’s difficult for support engineers: how do they know the value of a Logic App parameter without access to the parameter.json file?

At my current client, I saw an interesting alternative solution, whereby the workflow settings are stored in a SQL Server table. You can use the SQL Connector to retrieve the values. If you like, you can follow up with the Compose (convert to JSON) and Parse JSON (validate JSON and transform to a typed object) actions, but you don’t have to.

"Get_rows_-_destination_-_9000": {
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['sql_2']['connectionId']"
            }
        },
        "method": "get",
        "path": "/datasets/default/tables/@{encodeURIComponent(encodeURIComponent('[setup].[workflow_orchestration]'))}/items",
        "queries": {
            "$filter": "active eq true and interface eq '9000' and direction eq 'to_asb'"
        }
    },
    "runAfter": {
        "Set_variable_-_source_guid": [
            "Succeeded"
        ]
    },
    "type": "ApiConnection"
},
"Compose_-_destination": {
    "inputs": {
        "destination_application": "@{body('Get_rows_-_destination_-_9000')?['value'][0]['application']}",
        "destination_blob_archive_path": "@{body('Get_rows_-_destination_-_9000')?['value'][0]['blob_archive_path']}",
        "destination_blob_error_path": "@{body('Get_rows_-_destination_-_9000')?['value'][0]['blob_error_path']}",
        "destination_entity": "@{body('Get_rows_-_destination_-_9000')?['value'][0]['entity']}",
        "destination_file_path": "",
        "destination_file_server": "",
        "destination_folder": "@{body('Get_rows_-_destination_-_9000')?['value'][0]['blob_path']}",
        "destination_ftp_path": "@{body('Get_rows_-_destination_-_9000')?['value'][0]['ftp_path']}"
    },
    "runAfter": {
        "Get_rows_-_destination_-_9000": [
            "Succeeded"
        ]
    },
    "type": "Compose"
},
"Parse_JSON": {
    "inputs": {
        "content": "@body('blo-asb-afas-to-totara-conversion')",
        "schema": {
            "properties": {
                "Files": {
                    "items": {
                        "type": "string"
                    },
                    "type": "array"
                },
                "RecordCount": {
                    "type": "number"
                }
            },
            "type": "object"
        }
    },
    "runAfter": {
        "blo-asb-afas-to-totara-conversion": [
            "Succeeded"
        ]
    },
    "type": "ParseJson"
}

If we look at the SQL table, we see the following setup:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [setup].[workflow_orchestration](
[id] [bigint] IDENTITY(1,1) NOT NULL,
[active] [bit] NOT NULL,
[interface] [varchar](10) NOT NULL,
[application] [varchar](250) NOT NULL,
[entity] [varchar](250) NOT NULL,
[direction] [varchar](50) NOT NULL,
[ftp_server] [varchar](250) NULL,
[ftp_path] [varchar](250) NULL,
[storage_account] [varchar](250) NULL,
[blob_container] [varchar](250) NULL,
[blob_path] [varchar](250) NOT NULL,
[blob_archive_path] [varchar](250) NOT NULL,
[blob_error_path] [varchar](250) NOT NULL,
[file_extention] [varchar](250) NULL,
[storage_queue_s10] [varchar](250) NULL,
[storage_queue_s20] [varchar](250) NULL,
[storage_queue_s40] [varchar](250) NULL,
[environment] [varchar](250) NULL,
[ftp_interval] [varchar](250) NULL,
[connection_type] [varchar](50) NULL,
[destination_server] [varchar](250) NULL,
[destination_path] [varchar](max) NULL,
[comment] [varchar](max) NULL,
[retry] [bit] NOT NULL,
[blob_retry_path] [varchar](250) NULL,
PRIMARY KEY CLUSTERED
(
[id] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

Interface is equal to the interface number of the client. By setting active to 0, we can deactivate the logic app: no settings will be found, as active needs to be true (see the $filter clause in the SQL connector query above).

In this case, the values from the database are put in a queue message which triggers other logic apps. These logic apps use the config values.
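
For illustration, such a queue message could look like the sketch below. The values are hypothetical placeholders; the field names follow the Compose action shown earlier.

{
    "destination_application": "<application>",
    "destination_entity": "<entity>",
    "destination_folder": "<blob_path>",
    "destination_blob_archive_path": "<blob_archive_path>",
    "destination_blob_error_path": "<blob_error_path>",
    "destination_ftp_path": "<ftp_path>"
}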

As an example:
"destination_path": "@{body('Parse_JSON_-_destination')?['destination_folder']}"

Advantages of this solution:

  • You don’t have to parameterize your logic apps.
  • Deployment of logic apps doesn’t require a parameters file.
  • Parameter values can easily be visualized by querying SQL Server.

Run After in Logic Apps

Using the visual designer in Logic Apps, you can specify the Run After property. Let’s say you have multiple branches and an action that should run after one of the branches has succeeded. Note that if one of the branches is successfully executed, the other branches are skipped. That’s why you need a Run After property stating that the last action in each branch is either Succeeded or Skipped, as in the sketch below.
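
In the workflow definition, that looks roughly like this (the action names are hypothetical):

"Final_action": {
    "type": "Compose",
    "inputs": "@variables('result')",
    "runAfter": {
        "Last_action_of_branch_A": [
            "Succeeded",
            "Skipped"
        ],
        "Last_action_of_branch_B": [
            "Succeeded",
            "Skipped"
        ]
    }
}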

Import multiple APIs in one API Management Service

You can expose multiple Logic Apps via one and the same API Management service. Let’s say you define one gateway service for each external client. You want the client to be able to send orders, claims, articles and other resources via one and the same endpoint, while you use a different logic app to store each resource. Importing multiple logic apps into one service is possible in Azure API Management. Navigate to your API Management instance in the Azure portal.

  • Select APIs from under API MANAGEMENT.
  • Press the ellipsis (...) next to the API that you want to append another API to.
  • Select Import from the drop-down menu.
  • Select Logic App or another service to import an API.

In the example below, we created a service with display name WebshopGateway (specified in Settings).

Next, we added two operations named ProcessOrder and ProcessArticle. For each operation, we imported another Logic App. Per operation you can open the form-based Front-End editor to specify the display name and the POST URL /[resource].

Use basic authentication with Azure API Management

I developed the habit of unlocking Azure App Services using so-called Gateway Services. As the name implies, Gateway Services are nothing more than gatekeepers. They have a fixed set of responsibilities: give customers authorized access using basic authentication, store the posted entity in its original format (Azure Storage tables for XML/JSON, blobs for file attachments) and send an event message to a queue to kick off the process in a decoupled way.

I first implemented the Gateway Services via custom coding. I created a separate Web App for each customer: one customer, one gateway. After a while I realized I could implement the Gateway Service via a Logic App using out-of-the-box APIs (so without any custom coding): Request, Azure Storage Tables and Blobs (with looping for attachments), Azure Storage Queue, Response. Fair enough. The only remaining responsibility for the Web App was to call the Logic App and apply basic authentication. The next step was to call the Logic App from Azure API Management. You can’t miss the option to import a new API from a Logic App. Nice. Now there was only one problem left: how to perform basic authentication?
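
A skeleton of such a gateway workflow could look like the sketch below. The storage inputs are placeholders and the blob looping for attachments is omitted, so this only illustrates the trigger/store/queue/response shape.

{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "triggers": {
            "manual": {
                "type": "Request",
                "kind": "Http"
            }
        },
        "actions": {
            "Store_entity": {
                "type": "ApiConnection",
                "inputs": "<Azure Storage Tables insert operation>",
                "runAfter": {}
            },
            "Put_event_on_queue": {
                "type": "ApiConnection",
                "inputs": "<Azure Storage Queue put-message operation>",
                "runAfter": {
                    "Store_entity": [ "Succeeded" ]
                }
            },
            "Response": {
                "type": "Response",
                "inputs": {
                    "statusCode": 202
                },
                "runAfter": {
                    "Put_event_on_queue": [ "Succeeded" ]
                }
            }
        }
    }
}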

When using Azure Active Directory or ADFS 3.0, you need to define an Authorization Server. You can also use an OAuth 2.0 bearer token for external identity providers like Microsoft and Google. But the Security section is not what we need here. For basic authentication, you can use an inbound policy: check-header.

[box type="info"]
<policies>
    <inbound>
        <base />
        <check-header name="Authorization"
                      failed-check-httpcode="401"
                      failed-check-error-message="Not authorized"
                      ignore-case="false">
            <value>Basic a2xhbnQ6V2Vsa29tMjAxOA==</value>
        </check-header>
    </inbound>
</policies>
[/box]

The Authorization header looks quite complicated, but you can use an online tool like base64encode.org to generate it. The basic authentication header value is the base64 encoded string username:password, prefixed with 'Basic '.
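
As a sketch, you could also generate the value with the Logic Apps base64() function (username and password are placeholders here):

@{concat('Basic ', base64('username:password'))}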

When testing the API Management call to the logic app with the above policy applied, I received a rather cryptic error:
{
    "error": {
        "code": "DirectApiAuthorizationRequired",
        "message": "The request must be authenticated only by Shared Access scheme."
    }
}

Using this excellent blog, I found out that this was due to the fact that Logic Apps cannot handle the Authorization HTTP header. So, I had to find a way to remove the Authorization header after authentication/authorization. Luckily enough, that’s easy. You need to add another policy to the inbound section:

[box type="warning"]
<set-header name="Authorization" exists-action="delete" />
[/box]

Error Logic App: Workflow Parameter not found

I built a Logic App with the following action:
"Email1_-_replace_siteUrl": {
    "type": "Compose",
    "inputs": "@replace(actionBody('HTTPGetEnquete'),'{{siteUrl}}',parameters('customerSatisfactionSite'))" …

I could successfully deploy the logic app, but at runtime I received the following error:
Unable to process template language expressions in action 'Email1_-_replace_siteUrl' inputs at line '1' and column '2730': 'The workflow parameter 'customerSatisfactionSite' is not found.'.

I tried different things, like adding square brackets: [parameters('customerSatisfactionSite')]. Or using the string function: string(parameters('customerSatisfactionSite')). But in either case, I received the same runtime error, or couldn’t even deploy the logic app in the first place. As a side note, it’s not like I can’t use workflow parameters at all. Look at:
"enqueteURL": {
    "type": "Compose",
    "inputs": "[concat(parameters('customerSatisfactionSite'), '/Content/Html/motion10enquete.html')]",

Anyway, I couldn’t use workflow parameters in this case. So the next attempt was to use a variable instead of a parameter. I added a variables section via the Logic App code template:
"variables": {
    "customerSatisfactionSite": parameters('customerSatisfactionSite')
}

Next I changed the workflow action to:
@replace(actionBody('HTTPGetEnquete'),'{{siteUrl}}',variables('customerSatisfactionSite'))

Unfortunately that didn’t solve my problem as I received the error:
Variable cannot be used as it's not initialized.

I removed the variables section again and instead added an InitializeVariable action to initialize a variable named siteUrl:
"Init_siteUrl": {
    "type": "InitializeVariable",
    "inputs": {
        "variables": [
            {
                "name": "siteUrl",
                "type": "String",
                "value": "[parameters('customerSatisfactionSite')]"
            }
        ]
    }, …

Next I changed the workflow action Email1_-_replace_siteUrl as follows:
@replace(actionBody('HTTPGetEnquete'),'{{siteUrl}}',variables('siteUrl'))

It took me quite some experimenting, but after deploying the logic app, it finally worked.

[box type="success"] If you can’t use a workflow parameter because it can’t be found, consider using a variable. Remember that you shouldn’t add a variables section; instead, use the InitializeVariable action. [/box]

Import Logic App from Azure

I’m using PowerShell to deploy my API Apps, Logic Apps and Azure Functions. Logic Apps can be developed in Visual Studio, but for deployment to Azure they need to be parameterized. The parameterized version of the Logic Apps lives in my deployment folder, not in Visual Studio. Maintaining the Logic App in Visual Studio as well actually brings a lot of extra work. The question is: is that necessary?

I found a Microsoft resource that pointed me to the solution: “The Cloud Explorer allows you to download workflows published in an Azure subscription. In the Cloud Explorer, select Resource Types (instead of Resource Groups) at the top of the window. Right-click the logic app and select Open with Logic App Editor. A download button will be available which creates a resource template for the logic app. The template automatically includes parameters and connection resources.” For example:

"Put_a_message_on_a_queue_-_s50_-_totara": {
    "type": "ApiConnection",
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['azurequeues']['connectionId']"
            }
        },
        "method": "post",
        "body": "{…",
        "path": "/@{encodeURIComponent('blo-asb-acc-s50-blob-to-ftp-totara-sq')}/messages"
    },
    "runAfter": {}
}

[box type="success"] I think it’s safe not to keep the logic app in Visual Studio. You have it in your deployment folder and you have it in Azure. I don’t think losing your code is a big risk. [/box]