#terraform, Coding, vscode

Terraform, Docker, Ubuntu 20.04, Go 1.14 and MemLock: Down the rabbit hole

I recently upgraded my machine and installed the latest Ubuntu 20.04 as part of that.

Very smugly I fired up the new install and, as I use devcontainers, looked forward to not installing lots of devtools: the Dockerfile in each project had all the tooling needed for VSCode to spin up and get going.

Sadly it wasn’t that smooth. After spinning up a project which uses Terraform, I found an odd message when running terraform plan:

failed to retrieve schema from provider "random": rpc error: code = Unavailable desc = connection error: desc = "transport: authentication handshake failed: EOF"

error from terraform plan

Terraform has a provider model which uses gRPC to talk between the CLI and the individual providers. Random is one of the HashiCorp-made providers, so it’s a really odd one to see a bug in.
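To reproduce it you don’t need anything fancy. A minimal sketch along these lines (the resource name is illustrative, not taken from my project) is enough, because terraform plan has to fetch the random provider’s schema over gRPC before it can do anything else:

resource "random_string" "example" {
  # Any random_* resource will do; planning it forces the CLI to ask the
  # provider for its schema over gRPC, which is where the handshake fails.
  length  = 16
  special = false
}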

Initially I assumed that the downloaded provider was corrupted. Nope, clearing the download and retrying didn’t help.

So assuming I’d messed something up I:

  1. Tried changing the Docker image used by the devcontainer. Nope. Same problem.
  2. Tried different versions of Terraform. Nope. Same problem.
  3. Updated the Docker version I was using. Nope. Same problem.
  4. Restarted the machine. Nope. Same problem.

Now feeling quite frustrated, I finally remembered a trick I’d used a lot when building my own Terraform providers: I enabled debug logging on the Terraform CLI.

TF_LOG=DEBUG terraform plan

This is where it gets interesting…

#terraform

Azure Databricks and Terraform: Create a Cluster and PAT Token

My starting point for a recent bit of work was to find a reliable, simple way to deploy and manage Databricks clusters in Azure. Terraform was already in use, so I set about trying to see how I could use it to also manage Databricks.

I had a look around and, after trying the Terraform REST provider and a third-party Databricks provider (I didn’t have much luck with either), found a Terraform Shell provider. This turned out to be exactly what I needed.

If you haven’t written a Terraform provider, here’s a crash course: you basically just define a method for each of create, read, update and delete, plus the parameters they take, and Terraform does the rest.

The Shell provider (https://github.com/scottwinkler/terraform-provider-shell) lets you do this by passing in scripts (bash, PowerShell, any executable that can read stdin and write stdout). In this case I wrote some PowerShell to wrap the databricks-cli.
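As a rough sketch of how that wiring can look (the shell_script resource and lifecycle_commands block follow the Shell provider’s docs, but the script paths and environment variable below are my own illustrative placeholders, not copied from the repo):

resource "shell_script" "databricks_cluster" {
  # Each lifecycle stage maps to a PowerShell wrapper around databricks-cli.
  # The script paths are hypothetical stand-ins.
  lifecycle_commands {
    create = "pwsh ./scripts/create-cluster.ps1"
    read   = "pwsh ./scripts/read-cluster.ps1"
    delete = "pwsh ./scripts/delete-cluster.ps1"
  }

  environment = {
    # Assumed variable; the real scripts pick up Databricks connection
    # details however you choose to supply them.
    DATABRICKS_HOST = var.databricks_host
  }
}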

It’s better than (or at least different from) local-exec with null_resources, as you can store information in the Terraform state and detect drift. If a read returns different information from what’s currently in the state then update will be called, for example.
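For contrast, here’s a hedged sketch of the null_resource route (the databricks-cli call and cluster.json file are illustrative): the script runs on create, but nothing about the real cluster lands in state, so a later plan has nothing to compare against.

resource "null_resource" "databricks_cluster" {
  # Re-runs only when the local (hypothetical) cluster config changes;
  # the actual state of the cluster in Databricks is never read back,
  # so drift goes unnoticed.
  triggers = {
    cluster_config = filemd5("${path.module}/cluster.json")
  }

  provisioner "local-exec" {
    command = "databricks clusters create --json-file ${path.module}/cluster.json"
  }
}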

So I took the work of Alexandre, wrapped it up using the Shell provider, and now have a simple, no-frills Databricks provider for Terraform which makes calls to Databricks via the databricks-cli.

This is currently a simple hack and hasn’t undergone any significant testing: https://github.com/lawrencegripper/hack-databricksterraform. The flow is as follows:

Hopefully this might be useful as a starting point for others.
