Posted by Daniel Tariq Metcalfe on February 07, 2022 | containers

How many times have you heard the phrase “but it works on my machine”? Container technologies like Docker have done a lot to ensure our applications can run reliably in different environments, but what about the reliability of the tools we use to actually build these applications?

Just as we can effortlessly pull a Docker container and use it right away, we should also be able to clone a repo and use it right away. Testing, debugging, linting, and anything else we need should work out-of-the-box so we can spend more time shipping and less time configuring.

Fear not, dev containers to the rescue! A dev container is just a container that is being used as a development environment. The benefits of containers apply doubly to development environments because they need to be able to run the application code and the tools we use to work on that code.

Fortunately, the idea of dev containers is gaining traction and popular products are making use of them. Visual Studio Code is building in first-class support for dev containers, and GitHub Codespaces is bringing this same convenience to the browser so developers can start tinkering without even having to clone the repo!

If you’re a bit lost or you haven’t used containers before, don’t worry. Throughout the rest of this post, we’re going to review what containers are then walk through a scenario step-by-step so you can see how the value proposition for dev containers naturally becomes clear as a team’s needs evolve.

What is a container?

According to Docker, a container is:

…a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another.

Your code alone has limited effectiveness unless it is run in an environment with everything it needs to perform its intended function and deliver value to your users.

Docker achieves containerisation by using operating system features (namespaces and control groups, on Linux) to give each container an isolated view of the system while sharing the host’s kernel. This allows a single computer to efficiently run multiple programs that would usually each require their own instance of an operating system set up to their needs. That would normally mean multiple computers, unless some other virtualisation technology was being used (but that’s outside the scope of this blog post).

containerized applications diagram

Why would you use a container?

Chapter I: Buggy Beginnings

Imagine that you are a developer at Example Inc. You know that their most popular product is the legendary json-prettifier-as-a-service. Your first task is to fix a bug in their legacy offering of this product so you clone the legacy-jpaas repo.

You open the readme and follow the getting started instructions. It says you need to install Node 8 and version 1.0 of the command-line tool jq. You go through the tedious manual process of installing these on your local machine. Several of the links are out-of-date but you figure it out and update the readme as you go.

Now that these essential dependencies are installed you can run the code. The tests are always a good place to start.

npm install
npm test

> Tests passed! ✅

So far, so good!

You fix the bug and ship it on your very first day. The entire company gathers around and applauds. Your boss has tears of joy in his eyes. You feel amazing… for now.

Chapter II: The Node Paradox

Your second assignment is to make a small modification to some automation scripts. You clone the automation-scripts repo.

You’re full of confidence. You don’t even bother to look at the readme; you go straight to the package.json and see that there is a test script. You run it, feeling invincible.

npm test

> SyntaxError: Unexpected token import
    at exports.runInThisContext (vm.js:53:16)
    at Module._compile (module.js:387:25)
    at Object.Module._extensions..js (module.js:422:10)
    at Module.load (module.js:357:32)
    at Function.Module._load (module.js:314:12)
    at Function.Module.runMain (module.js:447:10)
    at startup (node.js:140:18)
    at node.js:1001:3

Uh oh. You look at the readme and discover that these scripts use Node 16 and make use of the ES modules syntax supported in that version. Now you need two different Node versions on your local machine.
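To see the difference that trips Node 8 up, here’s a quick illustration (not part of the repo; it assumes a recent Node.js is installed locally, and uses Node’s real --input-type=module flag to force ES module parsing):

```shell
# CommonJS syntax: every Node version understands require(), including Node 8
node -e "const { readFile } = require('fs'); console.log(typeof readFile)"
# → function

# ES module syntax: the 'import' keyword is what Node 8 chokes on;
# Node 16 parses it natively (forced here with --input-type=module)
node --input-type=module -e "import { readFile } from 'fs'; console.log(typeof readFile)"
# → function
```

Both commands print the same thing, but only a modern Node can even parse the second one — on Node 8 it dies with the SyntaxError above.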

You could mess about with your system configuration to support both versions, but that would be time-consuming and painful. Instead, you uninstall your existing Node.js and install Node Version Manager (NVM). Now you can have multiple Node versions on your system! You try again.

nvm install 16
nvm use 16
npm test

> Tests passed! ✅

Whoo! The office erupts in cheers. You’re a hero!

Chapter III: The Dependency Dilemma

Now for your third assignment. You are asked to add a feature to Example Inc’s newer JSON prettifier. You clone their shiny-new-jpaas repo and open the readme.

ℹ️ You can clone a real-life version of this repo and follow along from this point onwards if you like.

git clone

Oh no! This app requires Node 17.4 and jq 1.6! NVM can handle the extra Node version but managing multiple versions of jq yourself would be a pain and you know that there’s no such thing as a “jq version manager”.

You know it won’t work, but you run the tests anyway.

nvm install 17.4
nvm use 17.4
npm test

> jq: Unknown option -S

Yep. This version of the prettifier adds the ability to sort the returned JSON by key but the older version of jq on your local machine doesn’t support that option!
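If you’re curious what that option does, here’s a quick illustration (assuming a jq recent enough to support sorting, such as 1.6, is on your PATH):

```shell
# Without -S, jq preserves the original key order
echo '{"b": 2, "a": 1}' | jq -c .
# → {"b":2,"a":1}

# With -S, keys are sorted alphabetically on output
echo '{"b": 2, "a": 1}' | jq -c -S .
# → {"a":1,"b":2}
```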

After a lot of wasted time, you eventually discover that there is a Dockerfile in the repo.

FROM node:17.4-alpine3.14
RUN apk add "jq>1.6"
WORKDIR /app
COPY ./package*.json ./
RUN npm ci

A previous developer added it 2 years ago but it’s not mentioned in the readme. This must have been a brave attempt to introduce Docker that never gained traction.

You install Docker and try to run the tests in the container instead.

docker build --tag shiny-new-jpaas .
docker run -v "$(pwd)"/src:/app/src shiny-new-jpaas npm run test

> Tests passed! ✅

With the tests running you’re able to continue development and finish the feature. Yet again you have found the right tool for the job and delivered for the company, nice one!

Over the next few months, you continue the noble efforts of the previous developer and introduce Dockerfiles to the other repos. You and your colleagues feel productivity benefits as you use Docker to switch effortlessly between different runtime environments depending on the repo you are working in.
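For example, legacy-jpaas from Chapter I might get a Dockerfile along these lines. This is just a sketch: the exact base image tag and the jq version available will depend on what the base image’s package repositories actually offer.

```dockerfile
# Node 8, as required by legacy-jpaas
FROM node:8-alpine

# The legacy prettifier shells out to jq
RUN apk add --no-cache jq

WORKDIR /app
COPY ./package*.json ./
RUN npm install
```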

Why would you use a dev container?

Chapter IV: The Developer’s Dream

Despite your efforts Dockerising all the things, you keep getting feedback from new developers that they’re losing time on other setup tasks, like installing command-line development tools and configuring their IDEs, before they can be productive.

Setting Up Dev Containers

All the issues in the previous section essentially come down to development dependencies, which can be managed with containers just like any others. Example Inc could leave their developers to set up these things themselves, but that would be time-consuming, error-prone, and would inevitably result in an inconsistent development experience across the team.

Alternatively, they could have one developer create a dev container to encapsulate all the necessary setup, which the rest of the team could use going forwards. Let’s see how we’d address one of the issues raised by creating a dev container that includes the Travis CLI in the shiny-new-jpaas repository.

Creating a Dev Dockerfile

First we create a new image using the production image as a base layer, in a file we’ll call Dockerfile.dev.

FROM shiny-new-jpaas
RUN apk update
RUN apk add "ruby>2.3.0"
RUN apk add ruby-dev
RUN apk add make
RUN apk add --no-cache build-base git && \
    gem install travis && \
    gem install travis-lint && \
    apk del build-base

Let’s test that Travis has been added properly.

docker build -f Dockerfile.dev --tag shiny-new-jpaas-dev .
docker run shiny-new-jpaas-dev travis -v

> 1.11.0

Great, now developers have Travis CLI right out of the box.

ℹ️ Example Inc could also maintain their own Docker registry to remove the build step and allow developers to go straight to the run command. Docker would automatically pull the pre-built image from the registry then run the command.

Integrate With Your IDE

Being able to run terminal commands is all well and good, but developers usually work in an IDE. Thankfully Visual Studio Code offers first-class support for dev containers; we just need to add a configuration file.

  1. Open Visual Studio Code.
  2. Open the command palette (command + shift + p on Mac) and select Remote Containers: Add Development Container Configuration Files.
  3. In the next prompt select From 'Dockerfile'.
  4. This will create .devcontainer/devcontainer.json. Open it and make the following updates to set up the Mocha extension.
     "settings": {
       "mochaExplorer.files": "**/*.test.mjs"
     },
     "extensions": [
       "hbenl.vscode-mocha-test-adapter"
     ]
  5. Now open the command palette again and select Remote Containers: Rebuild and Reopen in Container.

Visual Studio Code will re-open in the container and you’ll have access to the dev container environment (and any future developer who opens this repo in Visual Studio Code will be prompted to re-open in the container).
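For reference, the finished .devcontainer/devcontainer.json might look something like this. It’s a sketch: it assumes the dev Dockerfile is saved as Dockerfile.dev in the repo root, and the generated file will contain other fields and comments too.

```json
{
  "name": "shiny-new-jpaas",
  "build": {
    "dockerfile": "../Dockerfile.dev"
  },
  "settings": {
    "mochaExplorer.files": "**/*.test.mjs"
  },
  "extensions": [
    "hbenl.vscode-mocha-test-adapter"
  ]
}
```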

Let’s put this new container to the test. Open the terminal and check Travis is available.

travis -v

> 1.11.0

Yep! Now open the test explorer and see that the Mocha extension is set up. You’ll see all the tests and can run or debug them with one click.



The ability for developers to clone a repo, open it in their IDE, and immediately have everything they need to be productive is what makes dev containers so powerful.

I hope this post has shown you how easy it is to get started with them, happy coding!