Deploying Containerized Azure Functions with Terraform

I am a huge fan of serverless and its ability to create simpler deployments with less for me to worry about, while still providing the reliability and scalability I need. I am also a fan of containers and IaC (Infrastructure as Code), so the ability to combine all three is extremely attractive from a technical, operational, and cost-optimization standpoint.

In this post, I will go through a recent challenge I completed where I used HashiCorp Terraform to set up an Azure Function App whose backing code is hosted in a Docker container. I feel this is a much better way to handle serverless deployments than the referenced Zip file I have used in the past.

You need to be Premium

One of the things you first encounter when seeking out this approach is that Microsoft only allows Function Apps to use custom Docker images if they run on a Premium or Dedicated App Service Plan (https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-function-linux-custom-image?tabs=nodejs#create-an-app-from-the-image) on Linux.

For this tutorial I will use the basic Premium plan (SKU P1V2). A quick reminder: your Function App AND its App Service Plan MUST be in the same Azure region. I ran into a problem trying to work with Elastic Premium, which, as of this writing, is only available (in the US) in the East and West regions.

Terraform to create the App Service Plan (Premium P1V2)


resource "azurerm_app_service_plan" "plan" {
name = "${var.app_name}-premiumPlan"
resource_group_name = "${data.azurerm_resource_group.rg.name}"
location = "${data.azurerm_resource_group.rg.location}"
kind = "Linux"
reserved = true
sku {
tier = "Premium"
size = "P1V2"
}
}


Pretty straightforward. As far as I am aware, container-based hosting is ONLY available on Linux plans – Windows container support is no doubt coming, but I have no idea when it will be available, if ever.
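For reference, the plan above leans on a variable and a resource group data source that are not shown in the snippet. Here is a minimal sketch of those pieces, assuming you are deploying into an existing resource group (the resource_group_name variable is my placeholder, not part of the original script):


# Base name used for the plan and Function App below
variable "app_name" {}

# Placeholder variable naming the existing resource group to deploy into
variable "resource_group_name" {}

# Look up the existing resource group referenced throughout these scripts
data "azurerm_resource_group" "rg" {
  name = "${var.resource_group_name}"
}
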

Create the Dockerfile

No surprise that the Docker image has to have a certain internal structure for the Function App to be able to use it. Here is the generic Dockerfile you can get using the func helper from the Azure Functions Core Tools (npm).

func init MyFunctionProj --docker

This will start a new Azure Function App project that targets Docker. You can use this as a starting point or just to grab the Dockerfile. Below are the contents of that Dockerfile:


FROM microsoft/dotnet:2.2-sdk AS installer-env

COPY . /src/dotnet-function-app
RUN cd /src/dotnet-function-app && \
    mkdir -p /home/site/wwwroot && \
    dotnet publish *.csproj --output /home/site/wwwroot

# To enable ssh & remote debugging on app service change the base image to the one below
# FROM mcr.microsoft.com/azure-functions/dotnet:2.0-appservice
FROM mcr.microsoft.com/azure-functions/dotnet:2.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY --from=installer-env ["/home/site/wwwroot", "/home/site/wwwroot"]


At the very least it's a good starting point. I assume that when Azure runs the image it expects to find content mounted in known directories, hence the need to conform to the layout this Dockerfile produces.

Push the Image

As with any strategy that involves Docker containers, we need to push the source image to a spot where it can be accessed by other services. I won't go into how to do that, but I will assume you are hosting the image in Azure Container Registry.
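
Since the Function App script below reads the registry details through a data source, here is a minimal sketch of that lookup, assuming the registry already exists and has its admin account enabled (the registry_name variable is my placeholder, not part of the original script):


# Placeholder variable naming the existing Azure Container Registry
variable "registry_name" {}

# Read the login server and admin credentials of the existing registry;
# the admin account must be enabled for admin_username/admin_password
# to be populated.
data "azurerm_container_registry" "registry" {
  name                = "${var.registry_name}"
  resource_group_name = "${data.azurerm_resource_group.rg.name}"
}
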

Deploy the Function App

Back to our Terraform script, we need to deploy our Function App – here is the script to do this:


resource "azurerm_function_app" "funcApp" {
name = "userapi-${var.app_name}fa-${var.env_name}"
location = "${data.azurerm_resource_group.rg.location}"
resource_group_name = "${data.azurerm_resource_group.rg.name}"
app_service_plan_id = "${azurerm_app_service_plan.plan.id}"
storage_connection_string = "${azurerm_storage_account.storage.primary_connection_string}"
version = "~2"
app_settings = {
FUNCTION_APP_EDIT_MODE = "readOnly"
https_only = true
DOCKER_REGISTRY_SERVER_URL = "${data.azurerm_container_registry.registry.login_server}"
DOCKER_REGISTRY_SERVER_USERNAME = "${data.azurerm_container_registry.registry.admin_username}"
DOCKER_REGISTRY_SERVER_PASSWORD = "${data.azurerm_container_registry.registry.admin_password}"
WEBSITES_ENABLE_APP_SERVICE_STORAGE = false
}
site_config {
always_on = true
linux_fx_version = "DOCKER|${data.azurerm_container_registry.registry.login_server}/${var.image_name}:${var.tag}"
}
}


The MOST critical AppSetting here is WEBSITES_ENABLE_APP_SERVICE_STORAGE, and its value MUST be false. This tells Azure NOT to look in storage for the function metadata (as it normally would). The other all-caps AppSettings provide access to the Azure Container Registry – I assume these will change if you use something like Docker Hub to host the container image.

Note also the linux_fx_version setting. If you have visited my blog before, you will have seen this when deploying Azure App Service instances (not surprising, since a Function App is an App Service under the hood).
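
One other dependency the script does not show is the storage account behind storage_connection_string; every Function App needs a storage account for its internal bookkeeping. A minimal sketch, assuming a general-purpose account in the same resource group (the name expression is only a placeholder; storage account names must be globally unique, lowercase alphanumeric, and 24 characters or fewer):


# Backing storage account for the Function App; the name here is a
# placeholder and must satisfy Azure storage naming rules.
resource "azurerm_storage_account" "storage" {
  name                     = "${var.app_name}${var.env_name}store"
  resource_group_name      = "${data.azurerm_resource_group.rg.name}"
  location                 = "${data.azurerm_resource_group.rg.location}"
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
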

Troubleshooting Tips

By far the best way I found to troubleshoot this process was to access Kudu from Platform Features in the Azure Function App. Once in, you can get to the Docker container logs (you have to click through a couple of links), which give you the Docker output. You can use this to figure out why an image may not be starting.

This is what ultimately led me to discover the WEBSITES_ENABLE_APP_SERVICE_STORAGE setting (above) as the reason why, despite the container starting, I never saw my functions in the navigation.

Hope this helps people out. I think this is a very solid way to deploy Azure Functions moving forward, though I do wish a Premium plan were not required.
