Writing Bash doesn’t have to be as painful as you think! Shellcheck to the rescue.

So I’ve found myself writing lots of bash scripts recently, and because they tend to do real things to the file system or cloud services they’re hard to test… it’s painful.


So it turns out there is an awesome linter/checker for bash called shellcheck, which you can use to catch a lot of those gotchas before they become a problem.
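
To give a flavour of what it catches, here’s a tiny made-up script (cleanup.sh and the path are just for illustration) with a classic word-splitting bug:

```bash
#!/bin/bash
# cleanup.sh - hypothetical example script
dir="build output"   # a path with a space in it
rm -rf $dir          # unquoted: this deletes 'build' and 'output' instead!
```

Running shellcheck cleanup.sh flags the unquoted $dir as SC2086 (“Double quote to prevent globbing and word splitting”) and tells you to write rm -rf "$dir" instead.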

There is a great plugin for VS Code, so you get instant feedback when you do something you shouldn’t.

Better still, it’s easy to get running in your build pipeline to keep everyone honest. Here is an example task for Azure DevOps that runs it on all scripts in the ./scripts folder.
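
The task itself boils down to a bash step that installs shellcheck and fails the build on any finding. Something along these lines (a sketch, assuming an Ubuntu hosted agent and scripts named *.sh) is all it takes:

```bash
#!/bin/bash
set -euo pipefail

# Install shellcheck on the build agent (Ubuntu hosted agents have apt).
sudo apt-get update -qq
sudo apt-get install -y -qq shellcheck

# Lint every script under ./scripts; any finding exits non-zero and fails the build.
find ./scripts -type f -name '*.sh' -print0 | xargs -0 -r shellcheck
```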

Next on my list is to play with the xUnit-inspired testing framework for bash called shunit2, but I kinda feel that if you have enough stuff to need tests you should probably be using Python.


Friends don’t let friends commit Terraform without fmt, linting and validation

So it starts out easy: you write a bit of terraform and all is going well. Then, as more and more people start committing and the code is churning, things start to get messy. Breaking commits block releases, formatting isn’t consistent and errors get repeated.

Seems a bit odd, right? In the middle of your devops pipeline, which dutifully checks that code passes tests and validation, you just give terraform a free pass.


The good news is terraform has tools to help you out here and make life better!

Here is my rough script for running during build to detect and fail early on a host of terraform errors. It’s also pinning terraform to a set release (hopefully the same one you use when releasing to prod) and doing a terraform init each time to make sure you have providers pinned (if not, the script will fail when a provider ships breaking changes and give you an early heads-up).
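
I won’t claim this is the exact script, but the rough shape is below. It assumes a linux amd64 build agent, the pinned version is a placeholder, and you may need to tweak flags for your terraform release:

```bash
#!/bin/bash
set -euo pipefail

# Pin terraform to a set release - ideally the one you release to prod with.
TERRAFORM_VERSION="0.11.13"   # placeholder: swap in your pinned version

# Fetch the pinned binary so the check doesn't depend on what's on the agent.
wget -q "https://releases.hashicorp.com/terraform/${TERRAFORM_VERSION}/terraform_${TERRAFORM_VERSION}_linux_amd64.zip"
unzip -o -q "terraform_${TERRAFORM_VERSION}_linux_amd64.zip"

# Init every build: providers are pulled fresh, so one shipping a breaking
# change fails here and gives you that early heads-up.
./terraform init -backend=false -input=false

# Fail the build if any file isn't 'terraform fmt' formatted.
./terraform fmt -check=true

# Catch syntax and interpolation errors.
./terraform validate
```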

It’s rough and ready, so make sure you’re happy with what it does before you give it a run. For an added bonus, the docker command below the script runs it inside an Azure DevOps container to emulate locally what should happen when you push.
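
Assuming you’ve saved the script above as check-terraform.sh (my name for it, pick your own), the local run looks something like this. The agent image and tag here are an assumption, so use whichever one matches your pipeline:

```bash
# Run the checks inside a container resembling the hosted build agent
# (image/tag is an assumption - pick one matching your agent).
docker run --rm -v "$(pwd)":/src -w /src \
  microsoft/vsts-agent:ubuntu-16.04 bash ./check-terraform.sh
```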

Optionally, you can add args like -var java_functions_zip_file=something to the terraform validate call.

Hope this helps as a quick rough guide!


Using .img files on Azure Files to get full Linux filesystem compatibility

Here is the scenario: I wanted to use a container to run a linux app in Azure, and I needed to persist changes to the filesystem via Azure Files. This initially appears to be nice and simple – you mount the Azure File share using CIFS and then pass this into docker as a volume mount. But what if your app relies on some linux-specific operations? Some, like symlinks, can be emulated (see here for details), but others can’t.

This is a really simple little hack which I learnt from the Azure Cloudshell implementation. It uses the ‘.img’ format to store a full EXT2 filesystem on Azure files.

How does it work? Well, if you open a cloudshell instance and use the ‘mount’ command, you can see it has two mounts: one for CIFS and one for a loop0 device.
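
You can check this for yourself from a cloudshell session:

```bash
# List the mounts backing your session: expect a CIFS share plus a /dev/loop0 device.
mount | grep -E 'cifs|loop'
```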

Seeing this started to pique my interest: what was the loop0 device mounting as my home directory? The cloudshell creates a storage account to persist your files between sessions, so next I took a look at that to see what I could find.


This is when I found the ‘.img’ file being used. So how do we use this approach for ourselves, as it seems to work nicely for cloudshell?

It’s actually pretty simple: we mount the Azure file share with CIFS, create the ‘.img’ file in the CIFS share, format it and then mount the ‘.img’ file. Done.

The key is to create an ‘.img’ file which is sparse, meaning we don’t write all the empty space to file storage; otherwise creating a 10GiB ‘.img’ file involves copying 10GiB up to Azure files. This is done by passing the ‘seek’ option to dd, as in the sketch below.
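
Here’s a sketch of the whole dance. The storage account, share name, image path and the 10GiB size are all placeholders, and it needs to run as root:

```bash
#!/bin/bash
set -euo pipefail

# Placeholders: swap in your own storage account, share and key.
STORAGE_ACCOUNT="mystorageacct"
SHARE_NAME="myshare"
STORAGE_KEY="<storage-account-key>"

# 1. Mount the Azure File share with CIFS.
mkdir -p /mnt/cifs
mount -t cifs "//${STORAGE_ACCOUNT}.file.core.windows.net/${SHARE_NAME}" /mnt/cifs \
  -o "vers=3.0,username=${STORAGE_ACCOUNT},password=${STORAGE_KEY},serverino"

# 2. Create a sparse 10GiB '.img' file: 'seek' jumps straight to the end,
#    so the empty space is never actually written to Azure Files.
if [ ! -f /mnt/cifs/data.img ]; then
  dd if=/dev/zero of=/mnt/cifs/data.img bs=1M count=0 seek=10240
  # 3. Format it as EXT2 (first run only; -F because it's a file, not a block device).
  mkfs.ext2 -F /mnt/cifs/data.img
fi

# 4. Loop-mount the image to get a fully compatible Linux filesystem.
mkdir -p /mnt/data
mount -o loop /mnt/cifs/data.img /mnt/data
```

You can then hand /mnt/data to docker as a volume mount instead of the raw CIFS path.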

So this gives you a fully compatible Linux disk stored on Azure files, so it persists across container restarts and machine moves.

P.S. If you’re replicating super-critical data, give this some extensive testing first; this approach worked nicely for my use case, but do exercise caution.
