Powershell script for APIManagementARMTemplateCreator fails

When using the APIManagementARMTemplateCreator, you run a Powershell script to actually generate the ARM templates. The Powershell script starts with the following command:
Install-Module -Name APIManagementTemplate -AllowClobber -Force

I had used the Powershell script extensively before, but suddenly I ran into problems. My problems are very well explained in the following post:
https://dev.to/darksmile92/powershell-disabled-support-for-tls-1-0-for-the-gallery-update-module-and-install-module-broken-1oii

In short, Microsoft announced that the PowerShell Gallery has deprecated Transport Layer Security (TLS) versions 1.0 and 1.1 as of April 2020. This means you suddenly can’t run Install-Module anymore. Run the following commands in Powershell ISE to fix the issue:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Install-Module PowerShellGet -RequiredVersion 2.2.4 -SkipPublisherCheck

Can be a real timesaver!

Deploy API Connection using ARM

Azure API Connections have a name and a display name. When you create a new API Connection via the Azure Portal, you are not able to specify the name of the API Connection. However, this name is actually used in Logic Apps. Example:
@parameters('$connections')['sftp_1']['connectionId']

In this case sftp_1 is the name, not the display name. If you deploy this Logic App from DEV to ACC, you might run into problems when connection name sftp_1 refers to a different SFTP server. The only way around this is to create the API Connection via Powershell using an ARM template. The ARM template will create a resource of type Microsoft.Web/connections.

Example ARM Template:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "allowedValues": [
        "[resourceGroup().location]",
        "eastasia",
        "southeastasia",
        "centralus",
        "eastus",
        "eastus2",
        "westus",
        "northcentralus",
        "southcentralus",
        "northeurope",
        "westeurope",
        "japanwest",
        "japaneast",
        "brazilsouth",
        "australiaeast",
        "australiasoutheast",
        "westcentralus",
        "westus2"
      ],
      "metadata": {
        "description": "Location of the Logic App."
      }
    },
    "sftp_name": {
      "type": "string",
      "defaultValue": "sftp"
    },
    "sftp_displayName": {
      "type": "string",
      "defaultValue": ""
    },
    "sftp_hostName": {
      "type": "string",
      "defaultValue": "",
      "metadata": {
        "description": "Host Server Address"
      }
    },
    "sftp_userName": {
      "type": "string",
      "defaultValue": "",
      "metadata": {
        "description": "User Name"
      }
    },
    "sftp_password": {
      "type": "securestring",
      "defaultValue": null,
      "metadata": {
        "description": "Password"
      }
    },
    "sftp_portNumber": {
      "type": "int",
      "defaultValue": 22,
      "metadata": {
        "description": "SFTP Port Number (example: 22)"
      }
    },
    "sftp_giveUpSecurityAndAcceptAnySshHostKey": {
      "type": "bool",
      "defaultValue": true,
      "metadata": {
        "description": "Disable SSH Host Key Validation? (True/False)"
      }
    },
    "sftp_sshHostKeyFingerprint": {
      "type": "string",
      "defaultValue": "",
      "metadata": {
        "description": "SSH Host Key Fingerprint"
      }
    },
    "sftp_disableUploadFilesResumeCapability": {
      "type": "bool",
      "defaultValue": false,
      "metadata": {
        "description": "Disable Resume Capability? (True/False)"
      }
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('sftp_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', parameters('logicAppLocation'), '/managedApis/sftp')]"
        },
        "displayName": "[parameters('sftp_displayName')]",
        "parameterValues": {
          "hostName": "[parameters('sftp_hostName')]",
          "userName": "[parameters('sftp_userName')]",
          "password": "[parameters('sftp_password')]",
          "portNumber": "[parameters('sftp_portNumber')]",
          "giveUpSecurityAndAcceptAnySshHostKey": "[parameters('sftp_giveUpSecurityAndAcceptAnySshHostKey')]",
          "sshHostKeyFingerprint": "[parameters('sftp_sshHostKeyFingerprint')]",
          "disableUploadFilesResumeCapability": "[parameters('sftp_disableUploadFilesResumeCapability')]"
        }
      }
    }
  ],
  "outputs": {}
}

It’s a bit too much to share the entire Powershell function, but I will explain its workings. The Powershell function reads settings like the subscriptionId and the resource group from a settings.xml file. Next, it checks whether the user is already logged on. If not, Login-AzureRmAccount is called without parameters. This opens a window where you can specify username and password.
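The logged-on check boils down to something like this (a sketch, assuming the AzureRM module; Get-AzureRmContext behavior varies slightly per module version):

```powershell
# Sketch: only prompt for credentials when there is no active Azure session
$context = Get-AzureRmContext -ErrorAction SilentlyContinue
if (-not $context -or -not $context.Account)
{
    # No active session: opens the interactive login window
    Login-AzureRmAccount
}
```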

Finally the following statement is run:
New-AzureRmResourceGroupDeployment -TemplateFile $templateFilePath -ResourceGroupName $resourceGroupName -TemplateParameterFile $parametersFilePath

This command uses the above template file and a parameters file for configuration.
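A matching parameters file could look like this (all values below are illustrative, not from a real environment):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "sftp_name": { "value": "sftp_1" },
    "sftp_displayName": { "value": "SFTP Acceptance" },
    "sftp_hostName": { "value": "sftp.example.com" },
    "sftp_userName": { "value": "sftpuser" },
    "sftp_password": { "value": "<password>" },
    "sftp_portNumber": { "value": 22 }
  }
}
```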

Deploy Azure Functions using Powershell and FTP

Deploying an Azure Function to Azure to a large extent resembles deploying an API App. See link: Deploy Api App. The only problem is that we can’t create a deployment package for Azure Functions via Visual Studio. Instead, go to the Azure Portal, open the relevant Function container and click Download App Content.

You will now get a zip file with a subfolder per Azure Function. Each Azure Function folder contains a csx file with the Azure Function code and a json file with configuration settings. Also note we can define parameters at the Function Container level. An example is a queue trigger function where the address of the Logic App is taken from the settings. Below you will see the ps1 file and an example csx file.
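For illustration, the json configuration file of a queue trigger function (function.json) looks roughly like this; the queue name is a made-up example, and the connection refers to an app setting such as ahakstorage_STORAGE:

```json
{
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "annuleringen",
      "connection": "ahakstorage_STORAGE"
    }
  ],
  "disabled": false
}
```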

DeployTriggerFunctions.ps1:
param (
    [parameter(Mandatory = $true)][string] $paramFileName
)

Import-Module AzureRM.Resources

Write-Host "Login to Azure" -fore gray;
Login-AzureRmAccount

# $PSScriptRoot is null when you run from Powershell ISE
# Run from cmd file
$baseDir = $PSScriptRoot
$pos = $baseDir.LastIndexOf('\')
$baseDirParam = $baseDir.Substring(0,$pos)
$dirParam = $baseDirParam + "\AAA-DeployResources\" + $paramFileName
$dirUtils = $baseDirParam + "\AAA-DeployResources\Utils.psm1"
Write-Host "BaseDir: " $baseDir -fore gray;
Write-Host "Dir ParameterFile: " $dirParam -fore gray;
Write-Host "Dir UtilsFile: " $dirUtils -fore gray;

#region Load Parameter file
. $dirParam
#endregion

#region Import Utils file
Import-Module "$dirUtils"
#endregion

Write-Host "Functions Container: " $nameFunctionsContainer -fore gray
$result = 'false'
$result = UploadToFTP -webAppName $nameFunctionsContainer -rg $resourcegroupFunctions -sourceDir $sourceDirTriggerFunctions

# Note: -eq, not =; an assignment here would always be treated as success
if ($result -eq 'true')
{
    Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add Azure Function Container " $nameFunctionsContainer " to Azure succeeded" -fore green
}
else
{
    Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add Azure Function Container " $nameFunctionsContainer " to Azure failed" -fore red
}

# Add ApplicationSettings
$appSettingsTriggerFunctions = @{"AzureWebJobsDashboard"=$azureWebJobsDashboard;"AzureWebJobsStorage"=$azureWebJobsStorage;"FUNCTIONS_EXTENSION_VERSION"=$FUNCTIONS_EXTENSION_VERSION; `
"WEBSITE_NODE_DEFAULT_VERSION"=$WEBSITE_NODE_DEFAULT_VERSION_FUNC;"ahakstorage_STORAGE"=$storageConnectionString;"ProcessAGAEventsURI"=$ProcessAGAEventsURI; `
"ProcessAGABeoordelingURI"=$ProcessAGABeoordelingURI;"ProcessAGPURI"=$ProcessAGPURI;"ProcessAGPBeoordelingURI"=$ProcessAGPBeoordelingURI;"ProcessAnnuleringenURI"=$ProcessAnnuleringenURI; `
"ProcessAnnuleringGereedURI"=$ProcessAnnuleringGereedURI;"ProcessBijstellingURI"=$ProcessBijstellingURI;"ProcessOpdrachtenURI"=$ProcessOpdrachtenURI;"ProcessOpdrachtInfoURI"=$ProcessOpdrachtInfoURI; `
"ProcessPlanningURI"=$ProcessPlanningURI;"ProcessStatusUpdateURI"=$ProcessStatusUpdateURI;"ProcessTGEventsURI"=$ProcessTGEventsURI;"ProcessWorkOrderURI"=$ProcessWorkOrderURI}
Set-AzureRmWebApp -Name $nameFunctionsContainer -AppSettings $appSettingsTriggerFunctions -ResourceGroupName $resourcegroupFunctions

run.csx:
using System;
using System.Threading.Tasks;
using System.Net.Http;
using System.Text;

private static string logicAppUri = Environment.GetEnvironmentVariable("ProcessAnnuleringenURI");

public static void Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    using (var client = new HttpClient())
    {
        var response = client.PostAsync(logicAppUri, new StringContent(myQueueItem, Encoding.UTF8, "application/json")).Result;
    }
}

Again we upload the files using FTP. We can use the same FTP script we saw when deploying the API App. There’s one gotcha. If we deploy the Azure Function this way and we go to the Integrate tab, we will notice that the storage account connection name (in case of a queue trigger) is not selected. We have to select this setting manually.

Deploy API App using Powershell and FTP


Deploying an API App to Azure to a large extent resembles deploying a Logic App. See link: Deploy Logic App. The difference is that an API App contains executable code that needs to be built. The trick here is to build a web deploy package in Visual Studio. From that web deploy package we get the files that need to be uploaded to apiappname.scm.azurewebsites.net (Kudu). We upload these files using FTP. For FTP to work we need to know the FTP URL to publish to, the FTP user and the FTP password. This information is contained in the so-called Publish Profile of the API App. We can get the publish profile of the API App via Powershell.
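The publish profile is an XML document; the part the FTP deployment needs looks roughly like this (host name and credentials below are illustrative):

```xml
<publishData>
  <publishProfile profileName="apiappname - FTP"
                  publishMethod="FTP"
                  publishUrl="ftp://waws-prod-xx-000.ftp.azurewebsites.windows.net/site/wwwroot"
                  userName="apiappname\$apiappname"
                  userPWD="secret" />
</publishData>
```

The userName, userPWD and publishUrl attributes are exactly the ones the XPath queries in the function below select.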

For reusability I have created a Powershell module (*.psm1) with a function named UploadToFTP. This function contains functionality to retrieve the publish profile, create the necessary folders and upload the DLLs, configs and other content files. The code is shown below:

function UploadToFTP
{
    param (
        [parameter(Mandatory = $true)][string] $webAppName,
        [parameter(Mandatory = $true)][string] $rg,
        [parameter(Mandatory = $true)][string] $sourceDir
    )

    # Get publishing profile for the web app
    [xml]$xml = (Get-AzureRmWebAppPublishingProfile -Name $webAppName -ResourceGroupName $rg -OutputFile null)
    #Write-Host "PublishingProfile: " $xml -fore gray;

    # Extract connection information from publishing profile
    $username = $xml.SelectSingleNode("//publishProfile[@publishMethod='FTP']/@userName").value
    $password = $xml.SelectSingleNode("//publishProfile[@publishMethod='FTP']/@userPWD").value
    $url = $xml.SelectSingleNode("//publishProfile[@publishMethod='FTP']/@publishUrl").value
    Write-Host "FTP Url: " $url -fore gray;

    # Upload files recursively
    Write-Host "Source Directory: " $sourceDir -fore gray;
    Set-Location $sourceDir
    $webclient = New-Object -TypeName System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)

    # Get folders and files
    $SrcEntries = Get-ChildItem $sourceDir -Recurse
    $Srcfolders = $SrcEntries | Where-Object{$_.PSIsContainer}
    $SrcFiles = $SrcEntries | Where-Object{!$_.PSIsContainer}

    # Create subdirectories
    foreach($folder in $Srcfolders)
    {
        $SrcFolderPath = $sourceDir -replace "\\","\\" -replace "\:","\:"
        $DesFolder = $folder.Fullname -replace $SrcFolderPath,($url+'\')
        Write-Host "FTP folder fullname: " $folder.Fullname -fore gray;
        $DesFolder = $DesFolder -replace "\\", "/"
        Write-Host "FTP folder: " $DesFolder -fore gray;

        try
        {
            $makeDirectory = [System.Net.WebRequest]::Create($DesFolder);
            $makeDirectory.Credentials = New-Object System.Net.NetworkCredential($username,$password);
            $makeDirectory.Method = [System.Net.WebRequestMethods+FTP]::MakeDirectory;
            $makeDirectory.GetResponse();
            Write-Host "FTP folder: " $DesFolder -fore gray;
        }
        catch [Net.WebException]
        {
            Write-Host "FTP folder not created. Possibly it already existed: " $DesFolder -fore gray;
        }
    }

    # Upload files from deployment package
    foreach($file in $SrcFiles)
    {
        $SrcFullname = $file.fullname
        $SrcName = $file.Name
        $SrcFilePath = $sourceDir -replace "\\","\\" -replace "\:","\:"
        $DesFile = $SrcFullname -replace $SrcFilePath,($url+'\')
        $DesFile = $DesFile -replace "\\", "/"
        Write-Host "SrcFile: " $SrcFullname -fore yellow;
        Write-Host "DesFile: " $DesFile -fore yellow;

        $uri = New-Object System.Uri($DesFile)
        $webclient.UploadFile($uri, $SrcFullname)
        #Write-Host "File uploaded: " $SrcFullname -fore yellow;
    }

    $webclient.Dispose()

    return 'true'
}

Another thing that is important for API Apps is to automatically set the App Settings. Below is the necessary Powershell script:

# Add ApplicationSettings
$appSettingsNav4PSGatewayClient = @{"WEBSITE_NODE_DEFAULT_VERSION"=$WEBSITE_NODE_DEFAULT_VERSION;"Nav4PSGatewayUrl"=$nav4PSGatewayUrl;"Nav4PSUser"=$nav4PSGatewayUser;"Nav4PSPassword"=$nav4PSGatewayPassword}
Set-AzureRmWebApp -Name $nameNav4PSGatewayClient -AppSettings $appSettingsNav4PSGatewayClient -ResourceGroupName $resourcegroup
Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " App settings " $nameNav4PSGatewayClient " updated" -fore green

Beware: you have to use unique names for app setting variables. In my case I had two apps: a DSPGateway (for incoming traffic) and a DSPConnector (for outgoing traffic). Both apps needed a username/password for authentication. I named these variables DSPUserName and DSPPassword for both apps. That’s wrong. You have to use unique names, so for example: DSPGatewayUser, DSPConnectorUser, DSPGatewayPassword and DSPConnectorPassword.
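For example, with unique names the two deployments no longer overwrite each other’s settings (the app and variable names below are illustrative):

```powershell
# Unique setting names per app prevent collisions between the two deployments
$appSettingsGateway   = @{"DSPGatewayUser"=$dspGatewayUser;"DSPGatewayPassword"=$dspGatewayPassword}
$appSettingsConnector = @{"DSPConnectorUser"=$dspConnectorUser;"DSPConnectorPassword"=$dspConnectorPassword}
Set-AzureRmWebApp -Name $nameDSPGateway -AppSettings $appSettingsGateway -ResourceGroupName $resourcegroup
Set-AzureRmWebApp -Name $nameDSPConnector -AppSettings $appSettingsConnector -ResourceGroupName $resourcegroup
```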

Logic App error using workflow parameters

I received the following error when deploying a Logic App via Powershell/ARM:
InvalidTemplate. Unable to process template language expressions in action 'HTTP' inputs at line '1' and column '1420': 'The workflow parameter 'ahakStorageConnectorName' is not found.'.

At the top of the Logic App Json file, I defined the parameters. The parameter values were contained in a separate parameters.json file. I thought it would work when using a deploy statement in Powershell referencing both the json template file and the parameter file, like this:
$logicAppTemplate = $baseDir + '\ProcessAGP.json'
$logicAppParameter = $baseDir + '\ProcessAGP.parameters.json'
New-AzureRmResourceGroupDeployment -Name 'DeployAGPTst' -ResourceGroupName $resourcegroup -TemplateFile $logicAppTemplate -TemplateParameterFile $logicAppParameter

As I received an error, I took a different approach. If you scroll down the json template file, you’ll notice there are an additional two parameter sections.
In the first parameter section I copied the parameter definitions from the top of the json template file. In the second parameter section I copied the parameter values from the parameters.json file.
This is apparently the way you can make parameters available to a Logic App.

Example:
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",

  "parameters": {
    "ahakStorageConnectorName": {
      "type": "string",
      "metadata": {
        "description": "Name of the AHakStorageConnector"
      }
    },
    "dspConnectorName": {
      "type": "string",
      "metadata": {
        "description": "Name of the DSPConnector"
      }
    },
    "logicAppUploadToSharepoint": {
      "type": "string",
      "metadata": {
        "description": "LogicApp UploadToSharepoint"
      }
    },
    "rg": {
      "type": "string",
      "metadata": {
        "description": "Resourcegroup"
      }
    },
    "rgFunctions": {
      "type": "string",
      "metadata": {
        "description": "Resourcegroup Functions"
      }
    },
    "functionContainer": {
      "type": "string",
      "metadata": {
        "description": "Function Container"
      }
    }
  },
  "triggers": {
    "manual": {
      "type": "Request",
      "kind": "Http",
      "inputs": {
        "schema": {
          "$schema": "http://json-schema.org/draft-04/schema#",
          "properties": {
            "Event": {
              "type": "string"
            },
            "Id": {
              "type": "string"
            }
          },
          "required": [
            "Event",
            "Id"
          ],
          "type": "object"
        }
      }
    }
  },
  "contentVersion": "1.0.0.0",
  "outputs": {}
},
"parameters": {
  "ahakStorageConnectorName": {
    "value": "tstahakstorageconnector"
  },
  "dspConnectorName": {
    "value": "tstdspconnector"
  },
  "logicAppUploadToSharepoint": {
    "value": "TstUploadToSharePoint"
  },
  "rg": {
    "value": "ahak-appservices-tst"
  },
  "rgFunctions": {
    "value": "ahak-appfunctions-tst"
  },
  "functionContainer": {
    "value": "ahak-functions-tst"
  }
}
}
}
],
"outputs": {}
}

Deploy Logic App using Powershell and ARM

To deploy a logic app via Powershell, I used the following approach.
Create a directory structure with two folders, e.g.:
C:\Sources\PBaars\Deployment\Release1.5\AAA-DeployResources
C:\Sources\PBaars\Deployment\Release1.5\ProcessPlanning

Folder AAA-DeployResources contains reusable artefacts like a parameter file and a Powershell module with reusable code. In this case the reusable code uploads files for a Web App or API App to an FTP folder. I will come back to that later. The parameters file is named ParametersTst.ps1, because I want to deploy a Logic App to a test environment. The parameters file can also be used for AppSettings. AppSettings are relevant for Web Apps and API Apps. I will come back to that later.
The folder ProcessPlanning contains artefacts related to Logic App ProcessPlanning. The artefacts are:
  • DeployProcessPlanning.cmd
  • DeployProcessPlanning.ps1
  • ProcessPlanning.json
  • ProcessPlanningTst.parameters.json
DeployProcessPlanning.cmd has just one purpose: call DeployProcessPlanning.ps1 passing a parameter for the environment-specific parameter file. Note that we can pass a different parameter file for different environments like Acc and Prod.


@Echo Off

Echo Changing PowerShell Execution policy...

powershell Get-ExecutionPolicy
powershell Set-ExecutionPolicy Unrestricted
powershell %~dpn0.ps1 -paramFileName "ParametersTst.ps1"
pause


DeployProcessPlanning.ps1 is the Powershell script that kicks off the Logic App deployment using both ProcessPlanning.json and ProcessPlanningTst.parameters.json. As we will see later, ProcessPlanningTst.parameters.json – as the name suggests – is used for parameterization of the json file. In this case we renamed the file to ProcessPlanningTst.parameters.json (note the Tst suffix), because this file performs parameterization of the Test version of the Logic App.
To use DeployProcessPlanning.ps1, we first have to add a parameter for the name of the Logic App to ParametersTst.ps1 in folder AAA-DeployResources.
# ProcessPlanning
$nameProcessPlanning = 'TstProcessPlanning'

Now we can edit the contents of DeployProcessPlanning.ps1:


param (
    [parameter(Mandatory = $true)][string] $paramFileName
)

Import-Module AzureRM.Resources

Write-Host "Login to Azure" -fore gray;
Login-AzureRmAccount

# $PSScriptRoot is null when you run from Powershell ISE
# Run from cmd file
$baseDir = $PSScriptRoot
$pos = $baseDir.LastIndexOf('\')
$baseDirParam = $baseDir.Substring(0,$pos)
$paramFile = $baseDirParam + "\AAA-DeployResources\" + $paramFileName

#region Load Parameter file
. $paramFile
#endregion

$logicAppTemplate = $baseDir + '\ProcessPlanning' + $environment + '.json'
$logicAppParameter = $baseDir + '\ProcessPlanning' + $environment + '.parameters.json'

Write-Host "BaseDir: " $baseDir -fore gray;
Write-Host "LogicAppTemplate: " $logicAppTemplate -fore gray;
Write-Host "LogicAppParameter: " $logicAppParameter -fore gray;
Write-Host "ParameterFile: " $paramFile -fore gray;

# New-AzureRmResourceGroup -Name ahak-appservices-tst -Location "westeurope"
Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add logic app " $nameProcessPlanning " to Azure" -fore gray;
New-AzureRmResourceGroupDeployment -Name 'DeployPlanningTst' -ResourceGroupName $resourcegroup -TemplateFile $logicAppTemplate -TemplateParameterFile $logicAppParameter

# Check status of last command, i.e. deployment to Azure
if ($?)
{
    Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add logic app " $nameProcessPlanning " to Azure succeeded" -fore green
}
else
{
    Write-Host $(Get-Date).ToString("yyyyMMdd_HHmss") " Add logic app " $nameProcessPlanning " to Azure failed" -fore red
}

# List Swagger
#Invoke-AzureRmResourceAction -ResourceGroupName ahak-appservices-tst -ResourceType Microsoft.Logic/workflows -ResourceName TstProcessAGPBeoordeling -Action listSwagger -ApiVersion 2016-06-01 -Force
# GET DevProcessAGPBeoordeling
Get-AzureRmResource -ResourceGroupName $resourcegroup -ResourceType Microsoft.Logic/workflows -ResourceName $nameProcessPlanning -ApiVersion 2016-06-01


There are a few things to note:

  • First you see how the parameter file can be passed to the ps1 script.
  • Next you see the LogicAppTemplate and the LogicAppParameter file being specified. Both the template file and the parameter file contain a variable $environment, which makes them specific per environment. If we pass parameter ParametersPrd.ps1 to the ps1 file, we can add a file ParametersPrd.ps1 to folder AAA-DeployResources and files ProcessPlanningPrd.json and ProcessPlanningPrd.parameters.json to folder ProcessPlanning.
  • Next you see that I used parameter $nameProcessPlanning in multiple places.
  • Finally I renamed the deployment to DeployProcessPlanning in command New-AzureRmResourceGroupDeployment.

Now, let’s turn our attention to ProcessPlanning.json and ProcessPlanningTst.parameters.json. Both files are copied from the original ResourceGroup project containing the Logic App. In the json file we see there are some environment-specific settings:
Custom Api App: "https://devdspconnector.azurewebsites.net/api/Planning"
Custom Api App: "https://devahakstorageconnector.azurewebsites.net/api/Storage/OpdrachtGegevens/@{triggerBody()['Id']}"
Reusable Workflow: "/subscriptions/1ea9735d-00df-4375-86d5-d0d35362dd7f/resourceGroups/ahak-appservices-dev/providers/Microsoft.Logic/workflows/DevUploadToSharePoint"

It’s obvious that you want one version of the Logic App json file, but, depending on the environment, this Logic App has to call different Api Apps (Dev/Tst/Prd) and different child workflows. To make this possible we have to define a few parameters at the top of the json file, in section parameters:


"ahakStorageConnectorName": {
  "type": "string",
  "metadata": {
    "description": "Name of the AHakStorageConnector"
  }
},
"dspConnectorName": {
  "type": "string",
  "metadata": {
    "description": "Name of the DSPConnector"
  }
},
"logicAppUploadToSharepoint": {
  "type": "string",
  "metadata": {
    "description": "LogicApp UploadToSharepoint"
  }
},
"rg": {
  "type": "string",
  "metadata": {
    "description": "Resourcegroup"
  }
}


Note that we have to enter the definitions of the custom parameters twice. Look for the parameters section between the sections Resources and Triggers. If we don’t enter the parameter definitions there, we will receive an error like: "The only declared parameters for this definition are '$connections'" (i.e. there are no custom parameters).

Also note that we added a variable rg to the variables section. I can’t tell why exactly this is necessary, but otherwise it won’t work.


"variables": {
  "rg": "[parameters('rg')]"
}


The values for these parameters have to be added to the environment-specific parameters file, in this case ProcessPlanningTst.parameters.json:


{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": { "value": "TstProcessPlanning" },
    "logicAppLocation": { "value": "westeurope" },
    "ahakStorageConnectorName": { "value": "tstahakstorageconnector" },
    "dspConnectorName": { "value": "tstdspconnector" },
    "rg": { "value": "ahak-appservices-tst" },
    "logicAppUploadToSharepoint": { "value": "TstUploadToSharePoint" }
  }
}


The final step, is to use the parameters in the json file:
Custom Api App: "https://@{encodeURIComponent(parameters('dspConnectorName'))}.azurewebsites.net/api/Planning"
Custom Api App: "https://@{encodeURIComponent(parameters('ahakStorageConnectorName'))}.azurewebsites.net/api/Storage/OpdrachtGegevens/@{triggerBody()['Id']}"
Reusable Workflow: "[concat('/subscriptions/1ea9735d-00df-4375-86d5-d0d35362dd7f/resourceGroups/', variables('rg'), '/providers/Microsoft.Logic/workflows/', parameters('logicAppUploadToSharepoint'))]"

We are almost done now. Go to directory ProcessPlanning, right-click DeployProcessPlanning.cmd and choose Run As Administrator. When everything goes well, the Logic App will be deployed.

Useful link on the Microsoft Azure site: ARM template for Logic App

SQL query in Powershell

For security reasons I only had access to the BizTalk Server, not to the SQL Server. I know I could also install SQL Server Management Studio on the BizTalk Server, but there’s another way, using Powershell. First you will have to install the PowerShell module for SQL Server, that is: SQLPS. You can download the module via the Microsoft SQL Server 2016 Feature Pack: Download. Locate the PowerShellTools.msi and install it on the BizTalk Server.

Next, open PowerShell ISE (run as administrator):

Import-Module Sqlps -DisableNameChecking;
#Get-Module -ListAvailable -Name Sqlps

$_server = Get-WmiObject -Namespace "root\MicrosoftBizTalkServer" -Class "MSBTS_GroupSetting"
$SqlServerName = $_server.MgmtDbServerName
$bizTalkMgmtDbName = $_server.MgmtDbName

$sql = "SELECT [Name] FROM [$bizTalkMgmtDbName].[dbo].[adm_Server]"
$biztalkServers = Invoke-Sqlcmd -ServerInstance $SqlServerName -Database $bizTalkMgmtDbName -Query $sql

foreach ($i in $biztalkServers)
{
    $bizTalkServer = $i.Item(0)
    Write-Host $bizTalkServer
}

Note that I didn’t specify a path when importing the Sqlps module. That’s because I added the module folder to the PSModulePath environment variable:

PSModulePath=C:\Windows\system32\WindowsPowerShell\v1.0\Modules\;C:\Program Files (x86)\Microsoft SQL Server\120\Tools\PowerShell\Modules\
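Alternatively, you can append the module folder to PSModulePath for the current session only; a sketch, assuming the same SQL Server 2014 (120) path:

```powershell
# Session-only alternative to a machine-wide PSModulePath entry
$env:PSModulePath += ";C:\Program Files (x86)\Microsoft SQL Server\120\Tools\PowerShell\Modules\"
Import-Module Sqlps -DisableNameChecking
```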


Call Powershell script from command file

Just another gotcha. I have a command file Kw1c_Build.cmd that calls a Powershell script Kw1c_Build.ps1. The command file has the following code:

@Echo Off

Echo Changing PowerShell Execution policy…
powershell Get-ExecutionPolicy
powershell Set-ExecutionPolicy Unrestricted

powershell %~dpn0.ps1

pause

What’s important to know is that %~dpn0 expands to the drive, path and file name (without extension) of the currently executing script. So, if your .cmd file and .ps1 file have the same name, you can use a call like that. You can also reference the ps1 file by its full path, or use a relative path like powershell .\Kw1c_Build.ps1.
I haven’t verified the relative variant, but it should work as long as the current directory is the folder containing the script.

Powershell settings file

In a Powershell (.ps1) script, you can use settings from a settings file. In this case we have a Powershell script Functions_Build.ps1 which uses a settings file Settings_BuildEnvironment.csv. You can use the following code in your PowerShell script:

Set-ExecutionPolicy Unrestricted
# Import general helpers using dot operator
. .\Functions_General.ps1

# Load parameters
$settings = Import-Csv Settings_BuildEnvironment.csv
foreach($setting in $settings)
{
    # The directory where the BizTalk projects are stored
    if($setting.'Name;Value'.Split(";")[0].Trim() -eq "projectsBaseDirectory") { $projectsBaseDirectory = $setting.'Name;Value'.Split(";")[1].Trim() }

    # The directory where the MSIs should be saved to
    if($setting.'Name;Value'.Split(";")[0].Trim() -eq "installersOutputDirectory") { $installersOutputDirectory = $setting.'Name;Value'.Split(";")[1].Trim() }

    # Directory where Visual Studio resides
    if($setting.'Name;Value'.Split(";")[0].Trim() -eq "visualStudioDirectory") { $visualStudioDirectory = $setting.'Name;Value'.Split(";")[1].Trim() }
}

The settings file looks like this:
Name;Value
projectsBaseDirectory;..\..\..\..\..\..\Trunk\ESB
installersOutputDirectory;..\..\..\..\..\..\Trunk\ESB\Deployments
visualStudioDirectory;C:\Program Files (x86)\Microsoft Visual Studio 14.0
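As a side note, Import-Csv can also parse the semicolon itself via its -Delimiter parameter, which removes the need for the Split calls; a sketch using the same settings file:

```powershell
# Parse the Name;Value pairs into a hashtable using ';' as the delimiter
$settings = @{}
Import-Csv Settings_BuildEnvironment.csv -Delimiter ';' | ForEach-Object {
    $settings[$_.Name.Trim()] = $_.Value.Trim()
}
$projectsBaseDirectory     = $settings['projectsBaseDirectory']
$installersOutputDirectory = $settings['installersOutputDirectory']
$visualStudioDirectory     = $settings['visualStudioDirectory']
```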

To use Functions_Build.ps1 from my main build script Kw1c_Build.ps1, use the following code. Note that we call a function BuildAndCreateBizTalkInstallers in Functions_Build.ps1.

Set-ExecutionPolicy Unrestricted
# Project specific settings
$projectName = "Source"
$applications = @("Kw1c.BizTalk.Skb.Groep")

# Import custom functions
. .\Functions_Build.ps1

# Build the applications
BuildAndCreateBizTalkInstallers $applications $projectName

# Wait for user to exit
WaitForKeyPress

Powershell script to enable/disable receive location

I had a situation in which I had to process a very large number of messages as an initial load. The messages had to be sent to a third-party webservice, but I wanted to prevent the webservice from being overloaded. In that case you can build a sequential convoy to send one message at a time. I didn’t want to build an orchestration for a one-time process though. That’s why I made a Powershell script that enables and disables the receive location in a configurable loop. Maybe not the nicest solution, but it solved my problem.

# Get the Receive Location of the specified name
$location = Get-WmiObject msbts_receivelocation -Namespace 'root\MicrosoftBizTalkServer' -Filter "name='RcvArticleInformationUpdate.MSMQ'"

# Initialize loop
$loop = 1

# Disable the Receive Location
$location.Disable()
"Disable Receive Location"

DO
{
    "Starting Loop $loop"

    # Enable the Receive Location
    $location.Enable()
    "Enable Receive Location"

    # Process messages for 30 seconds
    Start-Sleep -s 30

    # Disable the Receive Location
    $location.Disable()
    "Disable Receive Location"

    # Give the webservice 90 seconds to catch up
    Start-Sleep -s 90

    $loop++

} While ($loop -le 5)