Logic App error using workflow parameters

I received the following error when deploying a Logic App via PowerShell/ARM:
InvalidTemplate. Unable to process template language expressions in action 'HTTP' inputs at line '1' and column '1420': 'The workflow parameter 'ahakStorageConnectorName' is not found.'.

At the top of the Logic App JSON file, I defined the parameters. The parameter values were contained in a separate parameters.json file. I assumed this would work when deploying from PowerShell with a statement that references both the JSON template file and the parameter file, like this:
$logicAppTemplate = $baseDir + '\ProcessAGP.json'
$logicAppParameter = $baseDir + '\ProcessAGP.parameters.json'
New-AzureRmResourceGroupDeployment -Name 'DeployAGPTst' -ResourceGroupName $resourcegroup -TemplateFile $logicAppTemplate -TemplateParameterFile $logicAppParameter

Because I received this error, I took a different approach. If you scroll down the JSON template file, you'll notice there are two additional parameter sections.
Into the first parameter section I copied the parameter definitions from the top of the JSON template file. Into the second parameter section I copied the parameter values from the parameters.json file.
Apparently this is the way you can make parameters available to a Logic App.

Example (an abbreviated excerpt of the template):
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",

"parameters": {
  "ahakStorageConnectorName": {
    "type": "string",
    "metadata": {
      "description": "Name of the AHakStorageConnector"
    }
  },
  "dspConnectorName": {
    "type": "string",
    "metadata": {
      "description": "Name of the DSPConnector"
    }
  },
  "logicAppUploadToSharepoint": {
    "type": "string",
    "metadata": {
      "description": "LogicApp UploadToSharepoint"
    }
  },
  "rg": {
    "type": "string",
    "metadata": {
      "description": "Resourcegroup"
    }
  },
  "rgFunctions": {
    "type": "string",
    "metadata": {
      "description": "Resourcegroup Functions"
    }
  },
  "functionContainer": {
    "type": "string",
    "metadata": {
      "description": "Function Container"
    }
  }
},
"triggers": {
  "manual": {
    "type": "Request",
    "kind": "Http",
    "inputs": {
      "schema": {
        "$schema": "http://json-schema.org/draft-04/schema#",
        "properties": {
          "Event": {
            "type": "string"
          },
          "Id": {
            "type": "string"
          }
        },
        "required": [
          "Event",
          "Id"
        ],
        "type": "object"
      }
    }
  }
},
"contentVersion": "1.0.0.0",
"outputs": {}
},
"parameters": {
  "ahakStorageConnectorName": {
    "value": "tstahakstorageconnector"
  },
  "dspConnectorName": {
    "value": "tstdspconnector"
  },
  "logicAppUploadToSharepoint": {
    "value": "TstUploadToSharePoint"
  },
  "rg": {
    "value": "ahak-appservices-tst"
  },
  "rgFunctions": {
    "value": "ahak-appfunctions-tst"
  },
  "functionContainer": {
    "value": "ahak-functions-tst"
  }
}
}
}
],
"outputs": {}
}

Deploy Logic App using PowerShell and ARM

To deploy a Logic App via PowerShell, I used the following approach. Create a directory structure with two folders, for example:
C:\Sources\PBaars\Deployment\Release1.5\AAA-DeployResources
C:\Sources\PBaars\Deployment\Release1.5\ProcessPlanning

The folder AAA-DeployResources contains reusable artefacts, like a parameter file and a PowerShell module with reusable code. In this case the reusable code uploads the files of a Web App or API App to an FTP folder; I will come back to that later. The parameters file is named ParametersTst.ps1, because I want to deploy the Logic App to a test environment. The parameters file can also be used for AppSettings, which are relevant for Web Apps and API Apps; I will come back to that later as well.
The folder ProcessPlanning contains the artefacts related to the Logic App ProcessPlanning:

  • DeployProcessPlanning.cmd
  • DeployProcessPlanning.ps1
  • ProcessPlanning.json
  • ProcessPlanningTst.parameters.json
DeployProcessPlanning.cmd has just one purpose: call DeployProcessPlanning.ps1, passing a parameter that points to the environment-specific parameter file. Note that we can pass a different parameter file for other environments like Acc and Prod.


@Echo Off

Echo Changing PowerShell Execution policy...

powershell Get-ExecutionPolicy
powershell Set-ExecutionPolicy Unrestricted
powershell %~dpn0.ps1 -paramFileName "ParametersTst.ps1"
pause


DeployProcessPlanning.ps1 is the PowerShell script that kicks off the Logic App deployment using both ProcessPlanning.json and ProcessPlanningTst.parameters.json. As we will see later, ProcessPlanningTst.parameters.json is, as the name suggests, used for parameterization of the JSON file. In this case we renamed the file to ProcessPlanningTst.parameters.json (note the Tst suffix), because this file parameterizes the Test version of the Logic App.
To use DeployProcessPlanning.ps1, we first have to add a parameter for the name of the Logic App to ParametersTst.ps1 in folder AAA-DeployResources.
# ProcessPlanning
$nameProcessPlanning = 'TstProcessPlanning'

Now we can edit the contents of DeployProcessPlanning.ps1:


param (
    [parameter(Mandatory = $true)][string] $paramFileName
)

Import-Module AzureRM.Resources

Write-Host "Login to Azure" -fore gray;
Login-AzureRmAccount

# $PSScriptRoot is null when you run from PowerShell ISE
# Run from cmd file
$baseDir = $PSScriptRoot
$pos = $baseDir.LastIndexOf('\')
$baseDirParam = $baseDir.Substring(0,$pos)
$paramFile = $baseDirParam + "\AAA-DeployResources\" + $paramFileName

#region Load Parameter file
. $paramFile
#endregion

$logicAppTemplate = $baseDir + '\ProcessPlanning' + $environment + '.json'
$logicAppParameter = $baseDir + '\ProcessPlanning' + $environment + '.parameters.json'

Write-Host "BaseDir: " $baseDir -fore gray;
Write-Host "LogicAppTemplate: " $logicAppTemplate -fore gray;
Write-Host "LogicAppParameter: " $logicAppParameter -fore gray;
Write-Host "ParameterFile: " $paramFile -fore gray;

# New-AzureRmResourceGroup -Name ahak-appservices-tst -Location "westeurope"
Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add logic app " $nameProcessPlanning " to Azure" -fore gray;
New-AzureRmResourceGroupDeployment -Name 'DeployPlanningTst' -ResourceGroupName $resourcegroup -TemplateFile $logicAppTemplate -TemplateParameterFile $logicAppParameter

# Check status of last command, i.e. deployment to Azure
if ($?)
{
    Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add logic app " $nameProcessPlanning " to Azure succeeded" -fore green
}
else
{
    Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add logic app " $nameProcessPlanning " to Azure failed" -fore red
}

# List Swagger
#Invoke-AzureRmResourceAction -ResourceGroupName ahak-appservices-tst -ResourceType Microsoft.Logic/workflows -ResourceName TstProcessAGPBeoordeling -Action listSwagger -ApiVersion 2016-06-01 -Force
# GET DevProcessAGPBeoordeling
Get-AzureRmResource -ResourceGroupName $resourcegroup -ResourceType Microsoft.Logic/workflows -ResourceName $nameProcessPlanning -ApiVersion 2016-06-01


There are a few things to note:

  • First you see how the parameter file can be passed to the ps1 script.
  • Next you see how the LogicAppTemplate and LogicAppParameter files are specified. Both the template file and the parameter file include the variable $environment, which makes them specific per environment. If we pass ParametersPrd.ps1 to the ps1 file, we can add a file ParametersPrd.ps1 to folder AAA-DeployResources and the files ProcessPlanningPrd.json and ProcessPlanningPrd.parameters.json to folder ProcessPlanning.
  • Next, you see that I use the parameter $nameProcessPlanning in multiple places.
  • Finally, I renamed the deployment to DeployProcessPlanning in the New-AzureRmResourceGroupDeployment command.

Now, let's turn our attention to ProcessPlanning.json and ProcessPlanningTst.parameters.json. Both files are copied from the original ResourceGroup project containing the Logic App. In the JSON file we see there are some environment-specific settings:
Custom Api App: "https://devdspconnector.azurewebsites.net/api/Planning"
Custom Api App: "https://devahakstorageconnector.azurewebsites.net/api/Storage/OpdrachtGegevens/@{triggerBody()['Id']}"
Reusable Workflow: "/subscriptions/1ea9735d-00df-4375-86d5-d0d35362dd7f/resourceGroups/ahak-appservices-dev/providers/Microsoft.Logic/workflows/DevUploadToSharePoint"

It's obvious that you want one version of the Logic App JSON file, but, depending on the environment, this Logic App has to call different API Apps (Dev/Tst/Prd) and different child workflows. To make this possible, we have to define a few parameters at the top of the JSON file, in the parameters section:


"ahakStorageConnectorName": {
  "type": "string",
  "metadata": {
    "description": "Name of the AHakStorageConnector"
  }
},
"dspConnectorName": {
  "type": "string",
  "metadata": {
    "description": "Name of the DSPConnector"
  }
},
"logicAppUploadToSharepoint": {
  "type": "string",
  "metadata": {
    "description": "LogicApp UploadToSharepoint"
  }
},
"rg": {
  "type": "string",
  "metadata": {
    "description": "Resourcegroup"
  }
}


Note that we have to enter the definitions of the custom parameters twice. Look for the parameters section between the Resources and Triggers sections, i.e. the parameters section of the workflow definition itself. If we don't enter the parameter definitions there as well, we will receive an error like: 'The only declared parameters for this definition are '$connections'' (in other words, there are no custom parameters).
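
To make the nesting concrete, below is a minimal sketch (not the full template) of where those two workflow-level parameter sections live. The resource is heavily abbreviated, only one parameter is shown, and its value is simply passed through from the template-level parameter:

{
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[parameters('logicAppLocation')]",
      "properties": {
        "definition": {
          "parameters": {
            "ahakStorageConnectorName": {
              "type": "string"
            }
          },
          "triggers": {},
          "actions": {},
          "outputs": {}
        },
        "parameters": {
          "ahakStorageConnectorName": {
            "value": "[parameters('ahakStorageConnectorName')]"
          }
        }
      }
    }
  ]
}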

Also note that we added a variable rg to the variables section. I can't tell exactly why this is necessary, but without it, it won't work.


"variables": {
  "rg": "[parameters('rg')]"
}


The values for these parameters have to be added to the environment-specific parameters file, in this case ProcessPlanningTst.parameters.json:


{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "TstProcessPlanning"
    },
    "logicAppLocation": {
      "value": "westeurope"
    },
    "ahakStorageConnectorName": {
      "value": "tstahakstorageconnector"
    },
    "dspConnectorName": {
      "value": "tstdspconnector"
    },
    "rg": {
      "value": "ahak-appservices-tst"
    },
    "logicAppUploadToSharepoint": {
      "value": "TstUploadToSharePoint"
    }
  }
}


The final step is to use the parameters in the JSON file:
Custom Api App: "https://@{encodeURIComponent(parameters('dspConnectorName'))}.azurewebsites.net/api/Planning"
Custom Api App: "https://@{encodeURIComponent(parameters('ahakStorageConnectorName'))}.azurewebsites.net/api/Storage/OpdrachtGegevens/@{triggerBody()['Id']}"
Reusable Workflow: "[concat('/subscriptions/1ea9735d-00df-4375-86d5-d0d35362dd7f/resourceGroups/', variables('rg'), '/providers/Microsoft.Logic/workflows/', parameters('logicAppUploadToSharepoint'))]"

We are almost done now. Go to the ProcessPlanning directory, right-click DeployProcessPlanning.cmd and select Run as administrator. If all goes well, the Logic App will be deployed.

Useful link on the Microsoft Azure site: ARM template for Logic App

Retry in Logic Apps


For easy reference this information is taken from the Microsoft Azure site.

HTTP actions and API Connections (like the Azure Storage API, the SharePoint API or child workflows) support retry policies. A retry policy applies to intermittent failures, characterized as HTTP status codes 408 and 429, 5xx status codes and any connectivity exceptions, and is described using the retryPolicy object.

The retry interval is specified in ISO 8601 duration format (for example, PT20S is 20 seconds and PT1H is one hour). Its default value is 20 seconds, which is also the minimum; the maximum is 1 hour. The default retry count is 4, which is also the maximum. To disable the retry policy, set its type to None.

Note: if you don't add a retry interval, the default retry interval will be applied. The default retry policy is 4 retries with a 20-second interval.

In the example below, we have defined a retry policy that will retry an action 2 times in case of intermittent failures, for a total of 3 executions, with a 30-second delay between each attempt:

"PostAGAssets": {
  "retryPolicy": {
    "type": "fixed",
    "interval": "PT30S",
    "count": 2
  }
}
For an explanation of the ISO 8601 interval notation, see the following link.
Warning: retry seems like a good idea, but it may have a counterproductive effect in the Azure Portal. Because of all the retries we see five failing Logic App runs instead of just one. This situation gets worse if we use nested workflows.
In the example below we see a child workflow being called with a retry policy of type None. After that we see an Http action. Note that the retry policy is contained in the inputs section of the action.

Child workflow:

"ProcessAGADocument": {
  "type": "Workflow",
  "inputs": {
    "host": {
      "triggerName": "manual",
      "workflow": {
        "id": "[concat('/subscriptions/1ea9735d-00df-4375-86d5-d0d35362dd7f/resourceGroups/', variables('rg'), '/providers/Microsoft.Logic/workflows/', parameters('logicAppProcessAGADocument'))]"
      }
    },
    "body": {
      "RegionCode": "@{triggerBody()['RegionCode']}",
      "BestandsNaam": "@{item().BestandsNaam}",
      "DocumentId": "@{item().DocumentId}",
      "IntegrationId": "@{body('ConvertIntegrationId').IntegrationId}",
      "FormCode": "@{item().FormCode}",
      "DocumentType": "@{item().DocumentType}"
    },
    "retryPolicy": {
      "type": "none"
    }
  },
  "runAfter": {}
}
},
"runAfter": {
  "FilterAGAAttachments": [
    "Succeeded"
  ]
}
}

Http Action:

"GetDocument": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://@{encodeURIComponent(parameters('gmfBaseUrl'))}/api/gmf/@{triggerBody()['RegionCode']}/document/@{item().DocumentId}",
    "headers": {
      "PlatformUser": "[parameters('gmfUser')]"
    },
    "authentication": {
      "type": "Basic",
      "username": "[parameters('gmfUser')]",
      "password": "[parameters('gmfPassword')]"
    },
    "retryPolicy": {
      "type": "none"
    }
  },
  "runAfter": {}
}

Http Action (2):

"PostAGAFoto": {
  "type": "Http",
  "inputs": {
    "method": "post",
    "queries": {
      "regionCode": "@{triggerBody()['RegionCode']}",
      "formCode": "@{triggerBody()['FormCode']}",
      "integrationId": "@{triggerBody()['IntegrationId']}",
      "documentId": "@{triggerBody()['DocumentId']}",
      "bestandsNaam": "@{triggerBody()['BestandsNaam']}"
    },
    "uri": "https://@{encodeURIComponent(parameters('ahakStorageConnectorName'))}.azurewebsites.net/api/Storage/PostAGAFoto",
    "retryPolicy": {
      "type": "none"
    }
  },
  "runAfter": {
    "DeleteAGAFoto": [
      "Succeeded"
    ]
  },
  "metadata": {
    "apiDefinitionUrl": "https://@{encodeURIComponent(parameters('ahakStorageConnectorName'))}.azurewebsites.net/swagger/docs/v1",
    "swaggerSource": "website"
  }
}

Deploy Logic App from Visual Studio

If you deploy a Logic App from Visual Studio, you may run into problems. In this specific case I entered a WDL expression as the body of an Http action: @{decodeBase64(actionBody('GetDocument')._buffer)}

After deploying the Logic App from Visual Studio, I looked at it in the Azure Portal. The code view looked OK, but in Design view the expression seemed to be replaced by actionBody('GetDocument')._buffer. Strange.

If I look at the run details of the Logic App run, the input document is different:

Correct:
{
  "uri": "https://dspconnector.azurewebsites.net/api/AGA",
  "method": "post",
  "body": "…"
}

Incorrect:
{
  "uri": "https://dspconnector.azurewebsites.net/api/AGA",
  "method": "query",
  "queries": {
    "document": "…"
  }
}

The solution to the problem is easy. Open the Logic App in the Azure Portal, and in Design view paste the expression into the body of the Http action; voilà, it works again. To keep the code in the Azure Portal and in Visual Studio consistent, you can, as a workaround, copy the code behind from the Azure Portal back to Visual Studio. I couldn't really tell the difference, but after doing that, the Visual Studio Design view suddenly looked the same as the Azure Portal Design view, and I could successfully redeploy again. No idea what's happening here.
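
For reference, this is roughly what the Http action should look like in code view once the expression is back in its body (a minimal sketch; the action name is made up and the other properties are omitted):

"PostDocument": {
  "type": "Http",
  "inputs": {
    "method": "post",
    "uri": "https://dspconnector.azurewebsites.net/api/AGA",
    "body": "@{decodeBase64(actionBody('GetDocument')._buffer)}"
  },
  "runAfter": {}
}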

Use custom API App in Logic App

Probably a beginner's mistake, but after creating and deploying my custom API App to Azure, I wasn't able to select the API App from a Logic App. Then I found this post on azure.microsoft.com and discovered that you have to enable CORS and set the API Definition property of your custom API App. This is very easy within the Azure Portal: simply open the settings blade of your API App, check under the API section that the 'API Definition' is set to https://{name}.azurewebsites.net/swagger/docs/v1, and add a CORS policy for '*' to allow requests from the Logic Apps Designer.
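
If you would rather bake this into the ARM template of the API App than click it in the portal, a config resource along the following lines should do the trick. This is a hedged sketch: the apiAppName parameter is hypothetical, and the apiVersion may differ from the one you use elsewhere.

{
  "type": "Microsoft.Web/sites/config",
  "apiVersion": "2016-08-01",
  "name": "[concat(parameters('apiAppName'), '/web')]",
  "dependsOn": [
    "[resourceId('Microsoft.Web/sites', parameters('apiAppName'))]"
  ],
  "properties": {
    "apiDefinition": {
      "url": "[concat('https://', parameters('apiAppName'), '.azurewebsites.net/swagger/docs/v1')]"
    },
    "cors": {
      "allowedOrigins": [
        "*"
      ]
    }
  }
}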

ForEach in a Logic App

I struggled for a while to get ForEach looping working in my Logic App. In this scenario I call an Azure Function FilterEvents (post) to filter the results of an Http call. The output of this function is an array, so I added a ForEach loop to go through the array. Within the ForEach section there is an element named actions; this is where you add the actions you want to include in your loop. Makes sense, but I didn't know you could nest actions that way. Next, note the element foreach with value @body('FilterEvents'). This means you are going through the output of the FilterEvents action one item at a time. Finally, the most important part: how do you refer to the items in the array? As you can see, that's via the item() element: @{item().IntegrationId}. Note that I don't want to return JSON but a string; that's why I use the expression notation @{ ... }, with the curly braces. Within the loop I call a child workflow. Like I said, you can add multiple actions within a loop, but I wanted a bit more modularization.

"FilterEvents": {
  "type": "Function",
  "inputs": {
    "body": "@body('HTTP')",
    "function": {
      "id": "/subscriptions/1ea9735d-00df-4375-86d5-d0d35362dd7f/resourceGroups/ahak-appfunctions/providers/Microsoft.Web/sites/ahak-functions/functions/FilterEvents"
    }
  },
  "runAfter": {
    "HTTP": [
      "Succeeded"
    ]
  }
},
"ForEachEvent": {
  "type": "Foreach",
  "foreach": "@body('FilterEvents')",
  "actions": {
    "ProcessAGADocuments": {
      "type": "Workflow",
      "inputs": {
        "host": {
          "triggerName": "manual",
          "workflow": {
            "id": "/subscriptions/1ea9735d-00df-4375-86d5-d0d35362dd7f/resourceGroups/ahak-appservices/providers/Microsoft.Logic/workflows/ProcessAGADocuments"
          }
        },
        "body": {
          "IntegrationId": "@{item().IntegrationId}"
        }
      },
      "runAfter": {}
    }
  },
  "runAfter": {
    "FilterEvents": [
      "Succeeded"
    ]
  }
}
}

Example Logic App with Azure Function

It's basic, but I am not that familiar with JSON yet. I saw an example where you use a Logic App to retrieve all tweets with the hashtag #LogicApps. Next, an Azure Function named ReturningComplexResp is called. See the screenshot below:

[Screenshot: TweetLogicApp]

What we see is that the Azure Function takes its input from the Twitter connector: the JSON property "text" is set to Tweet text and "by" is set to TweetedBy. Now, to give a better idea: what does the code of ReturningComplexResp look like? It returns in the Body a value Msg (a combination of the input parameters TweetedBy and TweetText) and a value FileName.

[Screenshot: FunctionComplexCode]

When we run the Azure Function, we get the following response:

[Screenshot: FunctionComplexOutput]
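
In case the screenshot doesn't come through, the response body is essentially a small JSON object along these lines (the values shown are illustrative):

{
  "Msg": "Tweeted by SomeUser: some tweet text",
  "FileName": "SomeFileName.txt"
}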

Now, as the last step, we add an action that writes a file to the root folder of a DropBox account. First we create a DropBox action with the Body of the ReturningComplexResp Azure Function as both the file name and the content.

[Screenshot: DropBoxFile]

Then we switch to code view and see how we can set the content to Body.Msg and the file name to Body.FileName, as returned by the Azure Function:

[Screenshot: DropBoxCodeView]

This is the actual JSON I wanted to show. Simple, but powerful. When we switch back to Design view, we see that the File Name and File Content parameters have changed accordingly. This shows that Code view and Design view are actually aligned. Nice! I've had a different experience with that before.

[Screenshot: DropBoxFileAdjusted]
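
Since that code view only survives here as a screenshot, the sketch below is a rough approximation of the DropBox action. Treat it with care: the connector path, query names and connection wiring are assumed from the standard Dropbox connector and may differ from the original.

"Create_file": {
  "type": "ApiConnection",
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['dropbox']['connectionId']"
      }
    },
    "method": "post",
    "path": "/datasets/default/files",
    "queries": {
      "folderPath": "/",
      "name": "@{body('ReturningComplexResp')['FileName']}"
    },
    "body": "@{body('ReturningComplexResp')['Msg']}"
  },
  "runAfter": {
    "ReturningComplexResp": [
      "Succeeded"
    ]
  }
}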

 

Call assembly that uses EF

I wanted to call an ExceptionHandler in a helper class. The ExceptionHandler uses Entity Framework to store the error. Then I tried to call the ExceptionHandler from a Logic App. The Logic App selects data, but does so via ADO.NET and not via Entity Framework. The first error I received was that the EF connection string could not be found.

Remember: you will have to add the EF connection string to the config file of the calling app. When it comes to the connection string, the config file of the calling app is what counts, not the config file of the ExceptionHandler itself. Below is an example of the connection strings I added (note that there are both an EF connection string and an ADO.NET connection string):

<connectionStrings>
  <add name="EntityModelCommon_v10" connectionString="metadata=res://*/CommonEntities.csdl|res://*/CommonEntities.ssdl|res://*/CommonEntities.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=bvgo-dev-sqlazure.database.windows.net;initial catalog=bvgo-appservices;user id=—;password=—;MultipleActiveResultSets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
  <add name="bvgo-appservices" connectionString="Server=tcp:bvgo-dev-sqlazure.database.windows.net,1433;Database=bvgo-appservices;User ID=—;Password=—;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" />
</connectionStrings>

Then I received the following error:
No Entity Framework provider found for the ADO.NET provider with invariant name 'System.Data.SqlClient'. Make sure the provider is registered in the 'entityFramework' section of the web.config file.

The solution to this problem was very simple. I had to add the Entity Framework NuGet package to the references of the Logic App. This may sound counterintuitive because the Logic App itself doesn't use EF, but believe me: it works! Below is the section that is added to the web.config file of the Logic App:

<entityFramework>
  <defaultConnectionFactory type="System.Data.Entity.Infrastructure.LocalDbConnectionFactory, EntityFramework">
    <parameters>
      <parameter value="mssqllocaldb" />
    </parameters>
  </defaultConnectionFactory>
  <providers>
    <provider invariantName="System.Data.SqlClient" type="System.Data.Entity.SqlServer.SqlProviderServices, EntityFramework.SqlServer" />
  </providers>
</entityFramework>

Note the provider System.Data.SqlClient, the provider that was missing.

Do Until in Logic App

In June 2015 the Do Until functionality in Logic Apps was announced. You can add a Do Until to every action in your Logic App. Let's say you want to do an Http POST until you receive an HTTP 200. You specify the do-until expression and one or more limits (for example: call at most 5 times, or run for at most 5 minutes). Below is an example:

[Screenshot: Do Until]
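
In code view, such a loop looks roughly like the following (a minimal sketch; the action name, URI and limits are illustrative):

"Until_HTTP_Succeeds": {
  "type": "Until",
  "expression": "@equals(outputs('HTTP')['statusCode'], 200)",
  "limit": {
    "count": 5,
    "timeout": "PT5M"
  },
  "actions": {
    "HTTP": {
      "type": "Http",
      "inputs": {
        "method": "POST",
        "uri": "https://example.org/api/endpoint"
      },
      "runAfter": {}
    }
  },
  "runAfter": {}
}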

Logic App without a trigger

Today I watched a hangout hosted by Jeff Hollan where I saw that you can just build a logic app with two SQL Connector actions and no trigger.

[Screenshot: LogicAppWithoutTrigger]

If you build a Logic App without a trigger, you are actually saying you want to start it manually. Let's say you want to call it from Postman or from another Logic App. The first question is: where do you find the URL to call?

[Screenshot: LogicAppAccess]

Go to the Settings of your Logic App and select Properties. There you will see (1) the HTTPS access endpoint and (2) the primary access key that you can use to generate a Base64-encoded Basic authentication token (note: the username is 'default', but you can create your own). Postman and Fiddler apparently have built-in tools to generate the Basic authentication token.

Next, when we call the Logic App from Postman, you will notice you can append run?api-version=2015-02-01-preview to the URL to call a specific version of your API. I didn't see where you can get the version string from, but that's probably easy to find.

[Screenshot: CallApiFromPostman]

After the call, note that you get a synchronous HTTP response. This response says the request was received in good order; it doesn't say the request was fully processed by the API. You can use the Operations tab to track the execution of your Logic App.

The alternative to using Postman is calling the Logic App from another Logic App. The example below has a Twitter action as a trigger. It says that whenever someone tweets a message with "logicappsio" in it, I catch the email address (which is one of the properties I can retrieve from the tweet body) and HTTP POST it to the first Logic App I showed. It's the exact same HTTP POST that was done via Postman before.

[Screenshot: Calling Logic App]
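
In code view, that calling step boils down to an Http action roughly like this (a sketch; the endpoint URL, the Authorization value and the body content are placeholders):

"PostToTriggerlessLogicApp": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://<https-access-endpoint-of-the-target-logic-app>/run?api-version=2015-02-01-preview",
    "headers": {
      "Authorization": "Basic <base64-encoded default:primary-access-key>"
    },
    "body": {
      "emailAddress": "<email address picked from the tweet body>"
    }
  },
  "runAfter": {}
}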

Get input parameters from the tweet:

[Screenshot: TweetValues]