
Modernising Consumption Logic Apps with Bicep and IaC


There have been many improvements in the Logic Apps space recently, with the Standard tier coming out of preview and Bicep becoming a mainstay in the world of Infrastructure as Code (IaC). Bicep has made creating the resources a lot simpler and less verbose. And the separation of infrastructure and code with Logic Apps Standard has made the day-to-day way of working a great deal easier once everything is set up, especially when the integration developer is responsible for their own DevOps. But what about Consumption? While it’s a slightly older product (and has tended to fall by the wayside), I think there are still many valid use cases for the Consumption tier. It’s been a shame to see it get left behind, both in the areas I see it used daily and in the online documentation and blogs. So, if you’re a Consumption aficionado, how can you elevate its use to align a little more closely with the way you work on Standard Logic Apps? While this is by no means gospel on how to do it, I’ve found the following solution has helped to considerably speed up development and make jumping between the two products a bit less jarring.

The Setup

First, let’s take a quick look at how I’ve taken to structuring things in the repo (aka repository). I separate what I treat as infrastructure and what I treat as code. Traditionally, the infrastructure folder might house things like your ARM, Bicep or Terraform files for a Function App or Logic App Standard, and then the code would contain your actual code to go in the function app or your Logic App workflows. With Consumption though, that line gets a bit blurred. It used to all be treated as one – especially if you tended to extract ARM templates directly from the portal or used some of the tools floating around. However, my goal here is to make it so that once you’ve set up the Logic App, connections and any other dependencies, you can then update the definition.json with actual workflow changes straight out of the portal code view (similar to how Standard tends to work if using the ZIP deploy method).
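As a rough sketch, the repo split might look like the below (the folder and file names are my own convention, not mandated by any tooling):

    ├── infra
    │   ├── main.bicep            // Logic App, API connections and dependencies
    │   └── main.parameters.json
    └── src
        └── definition.json       // the workflow definition, copied from the portal code view

The only contract between the two halves is that main.bicep knows where to find definition.json, which is what lets you update the workflow without touching the infrastructure files.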

The IaC

So, what does that look like in Bicep? If you take a look at the Microsoft docs, the actions of your Consumption Logic App can be defined in the Bicep, much like how the ARM looks when pulling from the portal.

The Logic App

The Bicep for the Consumption Logic App might look something like this:

resource logicAppConsumption 'Microsoft.Logic/workflows@2019-05-01' = {
  name: '${logicAppName}-${regionSuffix}-${environment}'
  location: location
  tags: tags
  dependsOn: [
    serviceBusConnection
    keyVaultConnection
  ]
  /*
  Note: Consumption can only have a system-assigned or a single user-assigned identity.
  I've done the MSI and RBAC with system-assigned here, even though user-assigned would be
  comparatively easier and would require less code.
  */
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    state: 'Enabled'
    // Allow only permitted IPs to access the Logic App
    accessControl: {
      actions: {
        allowedCallerIpAddresses: allowedIpsFull
      }
      contents: {
        allowedCallerIpAddresses: allowedIpsFull
      }
      triggers: {
        allowedCallerIpAddresses: allowedIpsFull
      }
      workflowManagement: {
        allowedCallerIpAddresses: allowedIpsFull
      }
    }
    // This is what normally makes the ARM/Bicep massive: the workflow definition.
    // Treat it as code and import it here instead of defining it inline.
    definition: logicAppDefinition.definition
    /*
    These parameters are technically part of the code and are seen in the code view. I think defining them
    here, where you can also see the API connection resources, makes life easier, and they change a lot
    less than the definition.
    */
    parameters: {
      '$connections': {
        value: {
          servicebus: {
            connectionId: '/subscriptions/${subscription().subscriptionId}/resourceGroups/${resourceGroup().name}/providers/Microsoft.Web/connections/${logicAppName}-servicebus-${regionSuffix}-${environment}'
            connectionName: '${logicAppName}-servicebus-${regionSuffix}-${environment}'
            connectionProperties: {
              authentication: {
                type: 'ManagedServiceIdentity'
              }
            }
            id: '/subscriptions/${subscription().subscriptionId}/providers/Microsoft.Web/locations/${location}/managedApis/servicebus'
          }
          keyvault: {
            connectionId: '/subscriptions/${subscription().subscriptionId}/resourceGroups/${resourceGroup().name}/providers/Microsoft.Web/connections/${logicAppName}-keyvault-${regionSuffix}-${environment}'
            connectionName: '${logicAppName}-keyvault-${regionSuffix}-${environment}'
            connectionProperties: {
              authentication: {
                type: 'ManagedServiceIdentity'
              }
            }
            id: '/subscriptions/${subscription().subscriptionId}/providers/Microsoft.Web/locations/${location}/managedApis/keyvault'
          }
        }
      }
    }
  }
}
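The logicAppDefinition symbol referenced in the definition property has to come from somewhere. One way to wire it up (a sketch, and the variable name and relative path are my own assumptions) is to load the repo's definition.json at compile time using Bicep's built-in loadTextContent and json functions:

    // Load the workflow definition from the repo; loadTextContent inlines
    // the file's contents into the template at compile time
    var logicAppDefinition = json(loadTextContent('../src/definition.json'))

Because the file is inlined at compile time, a change to definition.json alone is enough for the next deployment to pick up the new workflow.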

So, what’s going on here? The above is an extreme example, as there is a lot that could be omitted depending on your needs, but it does illustrate my point. First, I set up a system-assigned identity. I’ll use this later to set up RBAC to things like Service Bus or Key Vault, with MSI as the authentication type for those managed API connections. Next is the access control. We tend to IP-restrict things where we can for an extra layer of security, and there are some quirks to doing this that are worth discussing here. In most cases, I think the “accessControl” block will be left out when getting started and added later on.
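For reference, the allowedIpsFull parameter used in the accessControl block is just an array of address-range objects. A minimal sketch, with placeholder IPs:

    param allowedIpsFull array = [
      {
        addressRange: '203.0.113.10'
      }
      {
        addressRange: '203.0.113.0/24'
      }
    ]

Each entry can be a single address or a CIDR range, and the same array can be reused across the actions, contents, triggers and workflowManagement sections as shown above.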

Way of working and next steps

Right – how is it used? On initial deployment, you might have a close to empty definition.json file, which will look similar to the code from a newly created Consumption Logic App:
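Roughly speaking (the exact scaffold the portal generates may vary slightly), that near-empty definition.json would contain little more than:

    {
      "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "actions": {},
        "contentVersion": "1.0.0.0",
        "outputs": {},
        "parameters": {},
        "triggers": {}
      }
    }

Note the outer "definition" wrapper, which is what the Bicep's logicAppDefinition.definition reference expects.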

As you develop your Logic App, and because the definition.json file is ready to be loaded into the Bicep, you can just pluck the content of the code view out of the portal and drop it into definition.json for CI/CD to pick up. After that, your definition.json will be a bit more filled out:

{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Compose": {
        "inputs": "Pipeline CICD Test",
        "runAfter": {},
        "type": "Compose"
      },
      "List_secrets": {
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['keyvault']['connectionId']"
            }
          },
          "method": "get",
          "path": "/secrets"
        },
        "runAfter": {
          "Compose": [
            "Succeeded"
          ]
        },
        "type": "ApiConnection"
      },
      "Response": {
        "inputs": {
          "statusCode": 200
        },
        "kind": "Http",
        "runAfter": {
          "Send_message": [
            "Succeeded"
          ]
        },
        "type": "Response"
      },
      "Send_message": {
        "inputs": {
          "body": {
            "ContentData": "@{base64('test')}"
          },
          "host": {
            "connection": {
              "name": "@parameters('$connections')['servicebus']['connectionId']"
            }
          },
          "method": "post",
          "path": "/@{encodeURIComponent(encodeURIComponent('test'))}/messages",
          "queries": {
            "systemProperties": "None"
          }
        },
        "runAfter": {
          "List_secrets": [
            "Succeeded"
          ]
        },
        "type": "ApiConnection"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {
      "$connections": {
        "defaultValue": {},
        "type": "Object"
      }
    },
    "triggers": {
      "manual": {
        "inputs": {
          "schema": {}
        },
        "kind": "Http",
        "type": "Request"
      }
    }
  }
}

What next? In my next blog, I’ll talk about some of the additional resources that might sit alongside the Consumption Logic App IaC, including modules for setting up RBAC, as well as the CI/CD using YAML pipelines.
