Final Year Project: Part 1 - Dockerize a Node.js API for development and production
This is the first article about my work on the final year project of my engineering studies.
We are a team of 7 students working on the Ecobol project. As the project is not public yet, I will stick to my work on the technology and infrastructure side.
Infrastructure overview
Dockerize Node.js REST API
Objectives: Speed and environment harmony
The project only lasts 2 months, so we don't want to lose time installing the development environment on each developer's machine. Developing in Docker containers also avoids OS-specific issues: every developer works in the same environment as the production server.
Dealing with environments
We have 2 environments, development and production, stored in the NODE_ENV environment variable. The chosen environment impacts which dependencies are loaded and how the API behaves (e.g. logging level, pm2 usage).
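As an illustration, here is roughly how the API can branch on NODE_ENV at runtime. This is a minimal sketch with a hypothetical file and option names, not the exact code of the project:
// config/logger.js (hypothetical) - NODE_ENV drives the logging level
const isProduction = process.env.NODE_ENV === 'production';

module.exports = {
  // verbose logs while developing, warnings and errors only in production
  level: isProduction ? 'warn' : 'debug',
};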
The container must be able to run either of these environments properly. There are 3 major things to take into account when starting the API:
- Load only the necessary dependencies
- Pass the correct NODE_ENV environment variable
- Use the right command to start the API (nodemon in development, pm2 in production)
The solution involves a conditional Dockerfile, and multiple docker-compose.yml files able to deal with this Dockerfile.
Conditional Dockerfile
Here is the conditional Dockerfile:
FROM node:10.16.3

# Create app directory
WORKDIR /usr/src/api-ecobol

# NODE_ENV must come in as a build argument so the RUN conditional below can see it
ARG NODE_ENV=production

# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
COPY package*.json ./

# Install every dependency in development, production dependencies only otherwise
RUN if [ "$NODE_ENV" = "development" ]; \
    then npm install; \
    else npm ci --only=production; \
    fi

# Bundle app source
COPY . .

EXPOSE 3000
The key point here is the RUN line: NODE_ENV is received as a build argument, and if it is not set to development, only the production dependencies are installed with npm ci --only=production.
Also, note that this image does not start any process at the end; that will be the job of docker-compose.
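If you ever need to build the image by hand, outside of docker-compose, the build argument can be passed directly on the command line (the image tags here are just examples):
docker build --build-arg NODE_ENV=development -t api-ecobol:dev .
docker build --build-arg NODE_ENV=production -t api-ecobol:prod .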
Docker-Compose
Once the Dockerfile is ready, Docker-Compose is used to build it and provide a MongoDB instance.
docker-compose.yml
version: '3'
services:
  api-ecobol:
    env_file:
      - .env
    restart: on-failure
    build: .
    ports:
      - '127.0.0.1:3002:3000'
    links:
      - mongo
  mongo:
    image: 'mongo:4'
    volumes:
      - './data:/data/db'
    ports:
      - '127.0.0.1:27017:27017'
    restart: on-failure
This passes the .env file to our app correctly, but does not start the app yet.
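For reference, the .env file could look something like this; the variable names are hypothetical and depend on what your app actually reads:
# .env (example values only)
PORT=3000
MONGODB_URI=mongodb://mongo:27017/ecobol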
To start the app for the development environment, another docker-compose file is created to override some development-specific settings:
docker-compose.dev.yml
version: '3'
services:
  api-ecobol:
    build:
      context: .
      args:
        - NODE_ENV=development   # so the Dockerfile installs the devDependencies
    environment:
      - NODE_ENV=development
    volumes:
      - .:/usr/src/api-ecobol/
      - /usr/src/api-ecobol/node_modules   # keep the image's node_modules despite the bind mount
    command: ./node_modules/.bin/nodemon server.js
It:
- Forces NODE_ENV=development, both as a build argument (so devDependencies get installed) and as a runtime environment variable
- Mounts the source code as a volume so nodemon can watch it, while preserving the container's node_modules
- Starts the nodemon process
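With this setup, day-to-day npm commands can be run inside the container instead of on the host; for example, to add a new dependency (the package name is just a placeholder):
docker-compose -f docker-compose.yml -f docker-compose.dev.yml exec api-ecobol npm install express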
In the same way, the production environment has its own docker-compose file:
docker-compose.prod.yml
version: '3'
services:
  api-ecobol:
    command: ./node_modules/.bin/pm2-runtime server.js
    environment:
      - NODE_ENV=production
It:
- Forces NODE_ENV=production
- Starts the pm2-runtime process
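One detail to keep in mind: since the production image is built with npm ci --only=production, pm2 has to be listed in dependencies, while nodemon can stay in devDependencies. The relevant part of package.json would look roughly like this (version numbers are only examples):
{
  "dependencies": {
    "pm2": "^3.5.0"
  },
  "devDependencies": {
    "nodemon": "^1.19.0"
  }
}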
Using these docker-compose files
To start the development API, simply execute:
docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d
And to start the production API:
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d
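Whenever package.json changes, the image has to be rebuilt; the same commands accept a --build flag for that, for example in development:
docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d --build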
That’s it! 🚀