Dockerizing a Node application

tabofa
Aug 23, 2020

I wrote a node application using ES6 syntax, i.e. import express from 'express';. Once done, to make life easier, I wanted to use Docker to host the application. So I googled my inquiry, as you do, and found a bunch of tutorials on how to create a Dockerfile and docker-compose.yml.

Sweet, let’s do it!

TL;DR

You can find all the code on GitHub @ https://github.com/Tabofa/demo-app

Once I had written all the instructions, I ran it: the image built and docker-compose started the container. Much to my dismay, I got errors. Node didn’t recognise import and complained loudly about it.

After what felt like eons of googling, I finally found out that there were two things required that I didn’t know about.

  1. I had to add { "type": "module" } as a top level field in package.json.
  2. And I needed Node 13 or later (you can check your local version as shown right after this list).
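A quick sanity check for that second requirement, assuming Node is already installed locally:

node --version
# should print v13.0.0 or later, otherwise import will still fail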

So with that information at hand, I could finally create my new image.

If you want to follow along, I’ve created a very basic node app to use for testing; if not, scroll down to ‘The Dockerfile’.

mkdir demo-app
cd demo-app
npm init
npm install express
npm install pm2 # I want pm2 to host my app in the container later
touch Dockerfile
touch docker-compose.yml
mkdir src
cd src
touch index.js

If you follow along, you should have a file structure like the following:

demo-app
| src
|   index.js
| Dockerfile
| docker-compose.yml
| package.json
| package-lock.json

The first trick, in order to make Node read my application as ES6 and not CommonJS, was that I had to alter the package.json file a little by adding the type field:

{
  "name": "demo-app",
  "version": "1.0.0",
  "description": "",
  "type": "module",
  "main": "src/index.js",
  "scripts": {
    ...
  }
}

In ./src/index.js I wrote a simple application that exposes a port and an endpoint that replies with ‘hello world’ when called, using the express package we installed at the beginning.

// src/index.js
import express from 'express'

const app = express()
const PORT = 3000

app.get('/', (req, res) => {
  res.send('hello world')
})

app.listen(PORT, () => {
  console.log(`listening on port ${PORT}`)
})

Once I’d tried it with node ./src/index.js, I could move on to creating the Dockerfile.
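To make that local test explicit: with the server running in one terminal, a request from another terminal (or the browser) should come back with the greeting. The lines below are just a sketch of that check using curl.

node ./src/index.js          # terminal 1: starts the server on port 3000
curl http://localhost:3000   # terminal 2: should print 'hello world'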

The Dockerfile

# Dockerfile
FROM node:14-alpine

This was the other trick: using a base image with Node 13 or later. I opted for 14, since that was the latest major version at the time of writing.

I continued adding the usual commands to the Dockerfile:

# Dockerfile
# Base image to use
FROM node:14-alpine
# Create the directory we're going to use
RUN mkdir -p /usr/src/app
# Set it as work directory
WORKDIR /usr/src/app
# Copy all the code to our work directory
COPY . .
# Install all our dependencies
RUN npm install
# Expose the port we want to communicate on
EXPOSE 3000
# Start the container
CMD [ "npx", "pm2-runtime", "start", "src/index.js" ]

On the last line of the Dockerfile I specify the command to execute when the container starts. In this case we’re using pm2-runtime; according to the pm2 documentation, this is the way to run it in a Docker container. Otherwise Docker might think the process is done and shut the container down, and we don’t want that.

I’m also using npx here, so we use the pm2 from our node_modules, giving us the luxury of not having to install it globally in the Dockerfile. We could, if we wanted, add RUN npm install -g pm2 after our RUN npm install statement to have it globally installed. But why bother?
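Before moving on to compose, the image can already be built and tested on its own. The tag demo-app below is just a name I’m picking for illustration; any tag will do.

docker build -t demo-app .
docker run -p 3000:3000 demo-app
# then, as before, http://localhost:3000 should answer with 'hello world'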

The docker-compose.yml file

On to the next part, our docker-compose.yml. This isn’t strictly needed, but it makes life easier in my opinion.

# docker-compose.yml
version: '3'
services:
  demo-app:
    build:
      context: .
    ports:
      - "3000:3000"

Now you can run docker-compose up in your terminal window, and if you open a browser and navigate to http://localhost:3000 you should see the words ‘hello world’ printed on the screen.
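Two compose commands that come in handy here: rebuilding the image when the code has changed, and shutting everything down cleanly when you’re done.

docker-compose up --build   # rebuild the image before starting the container
docker-compose down         # stop and remove the container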

Happy coding!
