Overview:
Docker is an essential tool for developers, especially when building cloud-native applications. It allows you to package your application and its dependencies into containers, ensuring consistency across different environments. In this tutorial, we’ll show you how to set up a local cloud-like development environment using Docker and Docker Compose. This will help you run your application, database, and other services locally in containers, just like they would run in a production cloud environment.
By the end of this blog, you’ll have a simple web application running with a PostgreSQL database, all managed by Docker Compose.
Prerequisites:
- Install Docker.
- Install Docker Compose.
- Basic knowledge of Docker and Docker Compose.
- A simple web application (we’ll use Node.js as an example).
Step 1: Creating the Project Directory
First, create a directory to store your project files. This directory will contain the application code, Docker configuration, and database setup.
mkdir docker-local-env
cd docker-local-env
Step 2: Writing the Node.js Application
In this tutorial, we’ll build a simple Node.js application that connects to a PostgreSQL database.
- Create a server.js file:
touch server.js
- Add the following code to server.js:
const express = require('express');
const { Pool } = require('pg');

const app = express();
const port = 3000;

// PostgreSQL database connection pool
const pool = new Pool({
  user: 'postgres',
  host: 'db',
  database: 'mydb',
  password: 'password',
  port: 5432,
});

// Route to test database connection
app.get('/', async (req, res) => {
  try {
    const result = await pool.query('SELECT NOW()');
    res.send(`Database connected! Server time is: ${result.rows[0].now}`);
  } catch (error) {
    console.error(error);
    res.status(500).send('Error connecting to the database.');
  }
});

// Start the Express server
app.listen(port, () => {
  console.log(`App listening on port ${port}`);
});
This simple Express.js application connects to a PostgreSQL database and returns the current time from the database.
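Hardcoding credentials is fine for a quick demo, but a common refinement is to read the connection settings from environment variables, falling back to the values above. Here is a minimal sketch; the variable names (DB_HOST, DB_USER, and so on) are our own convention for this example, not anything Docker or pg requires:

```javascript
// Build the pg Pool configuration from environment variables,
// falling back to the tutorial's hardcoded defaults.
function dbConfigFromEnv(env = process.env) {
  return {
    user: env.DB_USER || 'postgres',
    host: env.DB_HOST || 'db',
    database: env.DB_NAME || 'mydb',
    password: env.DB_PASSWORD || 'password',
    port: parseInt(env.DB_PORT || '5432', 10),
  };
}

module.exports = { dbConfigFromEnv };
```

With this in place, you could change the connection settings via the environment section of docker-compose.yml without touching the application code.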
- Create a package.json file to define the dependencies:
touch package.json
- Add the following content to package.json:
{
  "name": "docker-local-env",
  "version": "1.0.0",
  "description": "A simple Node.js app connected to PostgreSQL using Docker",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "express": "^4.17.1",
    "pg": "^8.7.1"
  }
}
Step 3: Writing the Dockerfile
Next, you need a Dockerfile that defines how to package the Node.js application into a Docker image.
- Create a Dockerfile:
touch Dockerfile
- Add the following content to the Dockerfile:
# Use the official Node.js image as the base image
FROM node:18-alpine
# Set the working directory in the container
WORKDIR /usr/src/app
# Copy package.json and package-lock.json into the container
COPY package*.json ./
# Install the project dependencies
RUN npm install
# Copy the application code into the container
COPY . .
# Expose the application port
EXPOSE 3000
# Run the application
CMD ["npm", "start"]
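Because COPY . . copies everything in the project directory into the image, it is also worth adding a .dockerignore file so that a locally installed node_modules folder (and other clutter) does not get copied in; the npm install step inside the container installs the dependencies the image actually needs. A minimal example:

```
node_modules
npm-debug.log
.git
```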
Step 4: Writing the Docker Compose File
Now, we’ll use Docker Compose to define and run multiple containers, one for the Node.js app and another for PostgreSQL.
- Create a docker-compose.yml file:
touch docker-compose.yml
- Add the following content to docker-compose.yml:
version: '3.8'

services:
  app:
    build: .
    container_name: node_app
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=development
    depends_on:
      - db
    networks:
      - app-network

  db:
    image: postgres:13-alpine
    container_name: postgres_db
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
    ports:
      - "5432:5432"
    networks:
      - app-network

networks:
  app-network:
    driver: bridge
Explanation:
- app: This service builds the Node.js application using the Dockerfile and exposes it on port 3000.
- db: This service runs a PostgreSQL database, with environment variables to define the database name, user, and password.
- networks: Both services are connected to the same Docker network, allowing them to communicate (the app can reach the database using the hostname db).
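One caveat worth knowing: depends_on only controls start order; it does not wait for PostgreSQL to be ready to accept connections, so the app's very first query can fail while the database is still initializing. One option is a small retry wrapper around the first query. The helper below is our own illustration (its name and parameters are not part of pg or Compose), a minimal sketch of the idea:

```javascript
// Retry an async operation a few times with a fixed delay between
// attempts, e.g. the app's first query while Postgres starts up.
async function withRetry(fn, { attempts = 5, delayMs = 1000 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}

module.exports = { withRetry };
```

You would call it as withRetry(() => pool.query('SELECT 1')) at startup. Recent Compose versions also support a healthcheck on the db service combined with depends_on's condition: service_healthy, which solves the same problem at the orchestration level.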
Step 5: Building and Running the Containers
Now that we have defined the application and database in Docker, it’s time to build and run the containers.
- Build the containers:
docker-compose build
- Start the services:
docker-compose up
You should see the application and database containers start. The Node.js app will be available at http://localhost:3000.
Step 6: Testing the Application
Open a browser or use curl to test the application:
curl http://localhost:3000
You should see a response like this:
Database connected! Server time is: 2024-09-18T12:34:56.789Z
This confirms that the Node.js application is successfully connecting to the PostgreSQL database running inside the container.
Step 7: Managing Containers
7.1 Stopping the Services
To stop the containers, use:
docker-compose down
This will stop and remove the containers but keep any named volumes (and the data in them). If you want to remove the volumes as well, run:
docker-compose down --volumes
7.2 Checking Logs
To view the logs of the services, you can run:
docker-compose logs
You can also view the logs of individual services:
docker-compose logs app
docker-compose logs db
Step 8: Extending the Configuration (Optional)
Here are a few ways you can extend this setup:
8.1 Adding Redis
To add a Redis service to the docker-compose.yml, simply add the following under services:
  redis:
    image: redis:alpine
    container_name: redis_cache
    ports:
      - "6379:6379"
    networks:
      - app-network
This will spin up a Redis container alongside the app and database.
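If you then wire the app up to Redis (for example with the node-redis client), a typical use is the cache-aside pattern: check the cache first and only query PostgreSQL on a miss. The sketch below illustrates the pattern against any store with get/set methods; a plain Map stands in for the Redis client here, and loadFromDb is a hypothetical loader function, not a real API:

```javascript
// Cache-aside: return the cached value if present; otherwise load
// it from the backing store, cache it, and return it.
async function cachedLookup(cache, key, loadFromDb) {
  const hit = await cache.get(key);
  if (hit !== undefined && hit !== null) return hit;
  const value = await loadFromDb(key);
  await cache.set(key, value);
  return value;
}

module.exports = { cachedLookup };
```

The same function works unchanged with a real Redis client, since its get and set calls also return promises.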
8.2 Volumes for Database Persistence
You can add a volume to persist the database data between container restarts by modifying the db service:
  db:
    image: postgres:13-alpine
    container_name: postgres_db
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
    ports:
      - "5432:5432"
    volumes:
      - db_data:/var/lib/postgresql/data
    networks:
      - app-network

volumes:
  db_data:
Conclusion
In this tutorial, you learned how to set up a local cloud-like development environment using Docker and Docker Compose. We created a simple Node.js application that connects to a PostgreSQL database, all running in separate Docker containers. This setup simulates a typical microservice architecture in a cloud environment and can easily be extended with additional services like Redis, RabbitMQ, or Nginx.
Feel free to copy and modify the provided code, and try running it on your own system to get comfortable with using Docker for local cloud development!