Deploying a Hugo site to Azure Storage static website hosting using Azure DevOps


Updated: 14th Dec 2018 - Static hosting now GA

Introduction

I’ve recently moved my blog from hosted Ghost Pro to a Hugo static site hosted in Azure Storage using the static website hosting feature. The main driver for moving was that I’m not a prolific blogger and I didn’t feel I was getting value from my annual Ghost subscription (this was before their rather hefty price rises, although I was locked in at the lower tariff). The static hosting feature has just reached General Availability (GA), so you can use it for production purposes.

This post will focus on setting up the CI/CD portion of the workflow once you start to use Hugo. In case you don’t know Hugo, it’s a super-fast static site generator built with Go. The Hugo documentation is pretty comprehensive; I’d recommend checking it out if you want to know more.

Outline

We’re going to use Azure Pipelines (part of Azure DevOps) to create the build pipeline. This pipeline will use the Hugo CLI to parse the site content/templates and generate the static site into a folder. Once we have generated the content, we’ll use a release pipeline to publish it to the Azure Storage account we’re using for static web hosting (in blob storage). To be honest, setting up the CI/CD pipeline was so easy I nearly didn’t bother writing this post; however, there are a couple of steps in the release pipeline that perhaps aren’t so obvious, so I thought it was better to document them than not.

Setting up the build pipeline

In my example I’m using Azure Repos to store my Hugo content/templates; you’re free to use whatever source control provider you want (e.g. Bitbucket, GitHub, etc.). Azure DevOps integration with external source control systems is excellent.

There is a wealth of documentation on getting started with Azure DevOps so I’ll focus on a couple of important/less obvious points around the build/deployment pipelines.

On that point, the build pipeline is actually trivial thanks to the Hugo build task from the marketplace. This extension lets you specify a number of options around publishing drafts, expired posts and so on; in my case I’m sticking to the defaults. I just need to specify the source folder and ensure the Hugo-generated site is created in the artifact staging directory.

[Image: build pipeline]

The build task downloads the Hugo CLI and then invokes it with the parameters supplied in the task configuration. It generates the static site into the public folder.
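
For context, the underlying call the task makes looks roughly like the following. This is a sketch: the $(Build.*) values are standard Azure DevOps predefined variables, but the exact paths the task wires up depend on how you’ve configured it.

hugo --source $(Build.SourcesDirectory) --destination $(Build.ArtifactStagingDirectory)/public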

Once we have the build working, we can enable the continuous integration trigger so the site is built on each check-in to the master branch.

[Image: build pipeline trigger]

To summarise this pipeline:

  1. Use Hugo to publish the site to the public folder
  2. Publish the public folder as an artifact we can use in the release pipeline

That’s all there is for the build step - pretty simple really.

Setting up the release pipeline

We’ve got the site content built; we now need to publish it to the Azure Storage static hosting container. This process has a couple of steps which aren’t as obvious as the build steps, in part because there is a special blob container used for Azure Storage static hosting called $web. Ordinarily the $ symbol is not a legal character in blob container names, so the tools used to interact with this container need to be aware of the static hosting feature in order to work properly, and it’s this that can trip up the unwary.

There are two steps to the release:

  1. Delete the contents of the $web container
  2. Copy over the new content from the build artifact

Step 1 is pretty straightforward: we can use the Azure CLI task to batch-delete the contents of the $web container using a simple inline script:

az storage blob delete-batch --account-name hfcblog --source $web
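
One thing to watch: if the Azure CLI task runs your inline script with Bash (e.g. on a Linux agent), $web will be expanded as an empty shell variable and the command will fail, so quote the container name. A quoted variant (hfcblog is my storage account name; substitute your own):

az storage blob delete-batch --account-name hfcblog --source '$web'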

[Image: deployment pipeline clean step]

For step 2 we use the Azure File Copy task to copy the contents of the build artifact to the newly cleaned $web container. Because the container used by static website hosting in Azure Storage is called $web, we can run into a few problems here.

[Image: deployment pipeline copy step]

Behind the scenes the Azure File Copy task uses AzCopy, and for it to work correctly with the $web container we have to use the latest version of AzCopy, which is the one the V2.* version of the task uses.

Additionally there are a couple of parameters you’ll want to specify:

  • /S ensures the contents of the folder are copied recursively, so all subfolders are also copied.

  • /SetContentType ensures that the MIME type of each file being copied is set correctly; in my case, without this all my files were marked with the content type “application/octet-stream”.

You can find the other flags used by AzCopy in the documentation.
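
To make the flags concrete, the command the task ends up running is along these lines. This is a sketch: the source path, account name and key placeholder are illustrative, and the task supplies the credentials itself.

AzCopy /Source:$(System.DefaultWorkingDirectory)/drop/public /Dest:https://hfcblog.blob.core.windows.net/$web /DestKey:<storage-account-key> /S /SetContentType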

And that’s about it - once we’re happy the release is working correctly, we can set up the trigger so a release is created automatically after each build.

[Image: deployment pipeline trigger]

Configuring static hosting feature

My site is already in place so there are no additional steps for me to take, but you may want to ensure your static hosting settings have been configured: things like setting the default and error documents and ensuring custom domains are mapped.
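
If you’re enabling the feature from scratch, this can also be done with the Azure CLI; a minimal sketch (the document names depend on what your Hugo theme generates):

az storage blob service-properties update --account-name hfcblog --static-website --index-document index.html --404-document 404.html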

Costs

I mentioned one of my motivations for moving away from Ghost Pro was to reduce the cost of running the blog; well, the static hosting is costing pennies, especially as I front the site with CloudFlare, which provides edge caching for free. With Azure DevOps you can get free private repos and 1,800 minutes per month for pipelines (which is plenty for me). You can also get free build minutes for open source projects, so if your blog is in a public repo you should be able to take advantage of this. Essentially this move means my blog hosting costs a few pennies a month rather than close to £100 per year. I don’t have a full month’s costs yet, but currently my consumption is showing at £0.08 for the current period.

Summary

Hosting static sites using Azure Storage static hosting is extremely cheap, and using Azure Pipelines to deploy the site is pretty trivial too. My workflow is to work on blog posts on my local machine using Hugo server, with a branch for each post. When I’m ready I simply merge to master and git push, and in a few seconds my post is automatically deployed and live on my site (that’s how this post was created). I mentioned I use CloudFlare to front my site; in a follow-up post I’ll show how I integrate with CloudFlare to purge my site’s caches when I deploy, to ensure everyone sees the latest content.
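
For anyone curious, that local loop looks roughly like this (the branch name is purely illustrative):

# preview locally with live reload, drafts included
hugo server -D
# write the post on its own branch
git checkout -b post/my-new-post
# when it's ready, merge and push - the pipeline does the rest
git checkout master
git merge post/my-new-post
git push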
