I always enjoy applications where I can achieve complex functionality while writing the least amount of code. By doing so, I can reduce the amount of maintenance an application requires and reduce the possibility of bugs or other problems, because I am relying on functionality that others have tested; in the case of the REST APIs for Azure services, I am relying on code that millions of people use every day.
To this end, I have been constantly researching the concept of “codeless” applications. These are not necessarily devoid of code, but they limit the amount of custom code that must be written. Central to this are API Gateway-style tools, such as the gateway component found in Azure API Management.
In this blog post, I want to create a set of operations in API Management that allows me to save and retrieve an image placed in a Blob storage account. Using a managed identity for this improves security and eliminates the need to store Access Keys or other sensitive information.
Provision our Infrastructure and Identities
For this sample you will need to create an Azure API Management instance (roughly 20 minutes to deploy) and a Storage Account.
To start, open your API Management service and select Managed Identities under the Security section. Set the Status to On and click Save.
Once you hit Save, Azure will get to work creating a new Service Principal with the same name as your API Management instance. You will want to remember this name for later steps.
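If you want to confirm the identity programmatically, here is a small sketch assuming the azure-mgmt-apimanagement package; the subscription, resource group, and service names are placeholders you would substitute with your own:

```python
# pip install azure-identity azure-mgmt-apimanagement
from azure.identity import DefaultAzureCredential
from azure.mgmt.apimanagement import ApiManagementClient

# Placeholders: substitute your own subscription, resource group, and APIM name
subscription_id = "<subscription-id>"
client = ApiManagementClient(DefaultAzureCredential(), subscription_id)

service = client.api_management_service.get("<resource-group>", "<apim-name>")

# With the system-assigned identity enabled, this prints the service
# principal's object id, which the role assignments below will reference
print(service.identity.principal_id)
```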
Create Read/Write Role Access for Azure Storage
Open your Azure Storage account and select the Access control (IAM) option. We need to add two role assignments to this Storage Account.
Note: for simplicity we are adding these role assignments at the account level, which means the roles could access any container in the account. To further limit this, scope the assignment to the specific service or container instead.
From the Access control (IAM) page, select Add a role assignment. Here we can associate users and service principals with roles that grant them permissions to the storage account. We are interested in two roles for our needs:
- Storage Blob Data Reader
- Storage Blob Data Contributor
For the role select Storage Blob Data Reader.
Make sure Assign access to is set to the option indicating the assignment will be made to a service principal.
In the Select field, find the service principal created for your APIM instance. This field supports search, so just type your APIM instance name and select it when it appears.
Click Save and repeat for Storage Blob Data Contributor.
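The portal steps above are all you need, but if you prefer to script the assignments, here is a minimal sketch using the azure-mgmt-authorization package. Treat the exact SDK shapes as assumptions against your installed version; the two role definition GUIDs, however, are the well-known built-in values for these roles:

```python
# pip install azure-identity azure-mgmt-authorization
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"

# Scope of the assignment: the storage account (narrow to a container to
# limit access further, per the note above)
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/mystorageaccount123"
)

# Object id of the APIM managed identity created in the previous section
principal_id = "<apim-managed-identity-object-id>"

# Well-known built-in role definition guids
roles = {
    "Storage Blob Data Reader": "2a2b9908-6ea1-4ae2-8e65-a410df84e7d1",
    "Storage Blob Data Contributor": "ba92f5b4-2d11-453d-a403-e96b0029c9fe",
}

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
for role_name, role_guid in roles.items():
    client.role_assignments.create(
        scope,
        str(uuid.uuid4()),  # each new assignment needs a fresh guid
        RoleAssignmentCreateParameters(
            role_definition_id=(
                f"/subscriptions/{subscription_id}/providers"
                f"/Microsoft.Authorization/roleDefinitions/{role_guid}"
            ),
            principal_id=principal_id,
        ),
    )
```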
This is all you need to do for the Storage account. Make sure to create a Container and note its name. You will need it for the next step.
Create the Operations in API Management
To start with, we need to create an API that will hold the operations we call to save and retrieve images.
Click APIs under General.
As you create this API, fill out the fields as you desire, BUT be sure to set the Web Service Url to the base URL of your storage account plus the container. For example, my storage account is called mystorageaccount123 and the container is called images, so my base URL is:
https://mystorageaccount123.blob.core.windows.net/images
The reason for this is that we are going to route all calls within this API to the same base URL; that is simply how the storage account REST API works, since every blob is addressed as account/container/blob-name.
Click Create and your API will be created and added to the display column.
Now, the trick with this is that the processing needed to decorate the incoming request so it can communicate with Azure Storage is the same for all of the endpoints. Rather than duplicating that processing, we can select All Operations, enter the code view for Inbound processing, and use the following policy definitions to ensure all operations are affected (full gist: https://gist.github.com/xximjasonxx/273a751fd2011c91ffd06e804eafaaa9).
```xml
<authentication-managed-identity resource="https://storage.azure.com" />
<set-header name="x-ms-version" exists-action="override">
    <value>2017-11-09</value>
</set-header>
<set-header name="x-ms-blob-type" exists-action="override">
    <value>BlockBlob</value>
</set-header>
```
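To make the policy's effect concrete, here is a rough Python equivalent of the request API Management ends up forwarding to Blob storage. The bearer token stands in for the one authentication-managed-identity acquires on our behalf, and the account and container names reuse the example above:

```python
# pip install requests
import requests

# Stand-in for the token APIM acquires via authentication-managed-identity
token = "<bearer-token-for-storage.azure.com>"

base_url = "https://mystorageaccount123.blob.core.windows.net/images"

headers = {
    "Authorization": f"Bearer {token}",
    "x-ms-version": "2017-11-09",   # matches the policy's set-header
    "x-ms-blob-type": "BlockBlob",  # required by the Put Blob operation
}

# PUT /{id} -> Put Blob: writes the request body as a block blob
with open("photo.jpg", "rb") as f:
    resp = requests.put(f"{base_url}/photo.jpg", headers=headers, data=f)
print(resp.status_code)  # 201 Created on success

# GET /{id} -> Get Blob: reads the blob back
resp = requests.get(f"{base_url}/photo.jpg", headers=headers)
print(resp.status_code, len(resp.content))
```

Notice there is no storage key anywhere in this exchange; the only credential in play is the short-lived token the managed identity obtains.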
This is LITERALLY all we need to provide a “codeless” image store and retrieval system. It accesses our blob storage with only the access it needs, and we don't need to store keys or any other sensitive data anywhere.
All we have left to do is create our operations. You will need two:
- PUT /{id}
- GET /{id}
That is it. The {id} segment becomes the blob name, and you can now upload and download images from Blob storage.
Testing things out
API Management provides a very sophisticated and useful test harness. You can use it to test your PUT endpoint; you should receive a 201 Created if things are working. DO NOT attempt the GET endpoint with the test harness, though: it doesn't seem to like binary data coming back (assuming you uploaded an image and not a text file).
To test the GET endpoint, you will need to create a Subscription and use the subscription key to call the endpoint in either Postman or a browser; I recommend a browser. Here is how I did it (a scripted alternative follows the list):
- Access the Subscriptions link under General
- This page contains a list of ALL subscriptions that your API currently has assigned to it. You can use the Add Subscription option to create a One-Off subscription to an API
- Create a subscription to the API – I called mine images
- Open a browser and access your GET endpoint, passing the new subscription key in the subscription-key query string parameter
- Upon browsing you should receive the item you just uploaded
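If you would rather script this test, a small sketch with Python's requests works too; the gateway host and API suffix are placeholders for your own instance, and the key can be sent either as the Ocp-Apim-Subscription-Key header or the subscription-key query parameter:

```python
# pip install requests
import requests

# Placeholders: your APIM gateway URL, API suffix, and subscription key
base_url = "https://<apim-name>.azure-api.net/<api-suffix>"
key = "<subscription-key>"

# PUT /{id}: upload the image through API Management
with open("photo.jpg", "rb") as f:
    resp = requests.put(
        f"{base_url}/photo.jpg",
        headers={"Ocp-Apim-Subscription-Key": key},
        data=f,
    )
print(resp.status_code)  # expect 201 Created

# GET /{id}: retrieve it, using the query-string form of the key
resp = requests.get(f"{base_url}/photo.jpg", params={"subscription-key": key})
print(resp.status_code, resp.headers.get("Content-Type"))
```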
If that step worked, congrats: everything is in place.
Going Further
This really is only scratching the surface. When you start to involve things like a BlobTrigger in Azure Functions, or BlobCreated events registered with Azure Event Grid, you can start to see how much functionality you can get with a minimal amount of code. You can refer to my Event Pipeline series here, where I used Azure Function triggers to create a real-time data processing pipeline with roughly 40 lines of code total, and I am working to reduce that even further.
I really believe it is beneficial to look at these types of approaches because they have the potential to truly leverage the cloud, executing complex functionality with minimal code.