# Docker Compose for Development
Until recently, my experience with containers was limited to production deployments and simple single-service Docker setups. This year, I finally took the plunge into using Docker Compose for local development, and it’s completely transformed how I approach building and testing multi-service applications.
## The Problem It Solved
I was working on a project that required:
- A Node.js API server
- A PostgreSQL database
- A Redis cache
- A React frontend
- An Nginx reverse proxy for local SSL
Getting all these services running locally used to require:
- Installing and configuring PostgreSQL locally
- Setting up Redis
- Managing different Node.js versions
- Configuring Nginx with SSL certificates
- Ensuring everyone on the team had identical setups
This setup was fragile, environment-specific, and a nightmare for onboarding new developers.
## Enter Docker Compose
Docker Compose promised to solve these problems by defining the entire development environment as code. Here’s the compose file that replaced hours of setup documentation:
```yaml
version: "3.8"

services:
  database:
    image: postgres:13
    environment:
      POSTGRES_DB: taskapp
      POSTGRES_USER: developer
      POSTGRES_PASSWORD: devpass123
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./init.sql:/docker-entrypoint-initdb.d/init.sql
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U developer"]
      interval: 30s
      timeout: 10s
      retries: 3

  redis:
    image: redis:6-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data

  api:
    build:
      context: ./api
      dockerfile: Dockerfile.dev
    ports:
      - "3001:3001"
    environment:
      - NODE_ENV=development
      - DATABASE_URL=postgresql://developer:devpass123@database:5432/taskapp
      - REDIS_URL=redis://redis:6379
    volumes:
      - ./api:/app
      - /app/node_modules
    depends_on:
      database:
        condition: service_healthy
      redis:
        condition: service_started
    command: npm run dev

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    environment:
      - REACT_APP_API_URL=http://localhost:3001
    volumes:
      - ./frontend:/app
      - /app/node_modules
    depends_on:
      - api

  proxy:
    image: nginx:alpine
    ports:
      - "443:443"
      - "80:80"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf
      - ./nginx/ssl:/etc/nginx/ssl
    depends_on:
      - frontend
      - api

volumes:
  postgres_data:
  redis_data:
```
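The `depends_on` conditions gate startup ordering, but they don't guard against the database dropping out later (or against slow first boots on other machines), so I still hedge with application-level retries. A minimal sketch, where `connect` stands in for whatever client call the API actually makes:

```javascript
// Sketch: retry an async connect function with a fixed delay between
// attempts, rethrowing the last error once the retries are exhausted.
async function connectWithRetry(connect, retries = 5, delayMs = 1000) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await connect();
    } catch (err) {
      if (attempt === retries) throw err;
      console.log(`Connection attempt ${attempt} failed, retrying in ${delayMs}ms`);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage (illustrative): connectWithRetry(() => pool.connect())
```

The delay and retry count here are arbitrary; an exponential backoff would also work.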
## The Development Experience Transformation

### One-Command Setup
The entire development environment now starts with a single command:
```shell
docker-compose up
```
New team members can be productive within minutes of cloning the repository, rather than spending hours configuring their local environment.
### Consistent Environments
Everyone on the team now works with identical database versions, Node.js versions, and service configurations. The “it works on my machine” problem became a thing of the past.
### Service Isolation
Each service runs in its own container, preventing conflicts between different projects’ requirements. I can work on multiple projects simultaneously without worrying about port conflicts or version incompatibilities.
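Compose also makes per-developer tweaks cheap: `docker-compose.override.yml` is merged automatically by `docker-compose up`, so local-only additions never touch the shared file. A sketch (the Adminer database UI here is an illustrative extra, not part of the project above):

```yaml
# docker-compose.override.yml - merged automatically by docker-compose up
services:
  adminer:
    image: adminer
    ports:
      - "8081:8080"
    depends_on:
      - database
```

Adding the override file to `.gitignore` keeps these personal additions out of version control.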
## Development Workflow Integration

### Hot Reloading
By mounting the source code as volumes, changes to the codebase are immediately reflected in the running containers. The development experience feels identical to running services natively, but with all the benefits of containerisation.
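This relies on the dev command actually watching the mounted files. The post doesn't show the API's `package.json`, but a typical (assumed) scripts section using nodemon would look like this; binding the inspector to `0.0.0.0` matters so the host can reach it through the published port:

```json
{
  "scripts": {
    "dev": "nodemon src/index.js",
    "dev:debug": "nodemon --inspect=0.0.0.0:9229 src/index.js"
  }
}
```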
### Debugging Support
I configured the Node.js service to support debugging by exposing the debug port:
```yaml
api:
  # ... other configuration
  ports:
    - "3001:3001"
    - "9229:9229" # Debug port
  command: npm run dev:debug
```
This allows IDEs to connect to the debugger running inside the container.
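In VS Code, for example, an attach configuration along these lines connects to that port; the path mapping mirrors the `./api:/app` bind mount (other editors have equivalents):

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to API container",
      "type": "node",
      "request": "attach",
      "port": 9229,
      "address": "localhost",
      "localRoot": "${workspaceFolder}/api",
      "remoteRoot": "/app",
      "restart": true
    }
  ]
}
```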
### Database Management
The PostgreSQL service includes an initialisation script that sets up the database schema and test data on first run:
```sql
-- init.sql
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  email VARCHAR(255) UNIQUE NOT NULL,
  name VARCHAR(255) NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

INSERT INTO users (email, name) VALUES
  ('dev@example.com', 'Development User'),
  ('test@example.com', 'Test User');
```
### Selective Service Management
Different services can be started independently for specific testing scenarios:
```shell
# Start only the database for migration testing
docker-compose up database

# Run the full stack except the frontend
docker-compose up database redis api proxy
```
## Testing Integration
Docker Compose transformed our testing strategy by enabling reliable integration tests:
### Test Database Isolation
```yaml
# docker-compose.test.yml
version: "3.8"

services:
  test-database:
    image: postgres:13
    environment:
      POSTGRES_DB: taskapp_test
      POSTGRES_USER: test
      POSTGRES_PASSWORD: testpass
    tmpfs:
      - /var/lib/postgresql/data

  test-api:
    build:
      context: ./api
      dockerfile: Dockerfile.test
    environment:
      - NODE_ENV=test
      - DATABASE_URL=postgresql://test:testpass@test-database:5432/taskapp_test
    depends_on:
      - test-database
    command: npm test
```
Tests run against a fresh database every time, eliminating test pollution and ensuring reliable results.
## Performance Considerations

### Build Optimisation
Initial Docker builds were slow, so I optimised the Dockerfiles for development:
```dockerfile
# Dockerfile.dev
FROM node:16-alpine

WORKDIR /app

# Copy package files first for better caching
COPY package*.json ./
RUN npm ci

# Copy source code
COPY . .

EXPOSE 3001 9229

CMD ["npm", "run", "dev"]
```
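Layer caching helps most when the build context itself stays small; without a `.dockerignore`, `COPY . .` ships the host's `node_modules` and `.git` into every build. A typical (assumed) `.dockerignore` for the api service:

```text
node_modules
npm-debug.log
.git
.env*
```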
### Volume Performance

On macOS, bind mounts can be slow for projects with many files. Declaring `/app/node_modules` as a separate (anonymous) volume, so the dependency tree stays inside the container instead of being bind-mounted from the host, significantly improved performance:
```yaml
volumes:
  - ./api:/app
  - /app/node_modules # This overrides the bind mount for node_modules
```
## Challenges and Solutions

### Network Communication
Understanding Docker Compose networking took time. Services communicate using service names as hostnames, not localhost:
```javascript
// Wrong - tries to connect to localhost inside the container
const dbUrl = "postgresql://user:pass@localhost:5432/db";

// Correct - uses the Docker Compose service name
const dbUrl = "postgresql://user:pass@database:5432/db";
```
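One small trick that helped me internalise this: the only thing that changes between the two forms is the hostname, so a connection string can be rewritten mechanically. A sketch using Node's built-in `URL` class (names taken from the compose file above):

```javascript
// Sketch: rewrite a localhost connection string to use the Compose
// service name, which is what resolves inside the Compose network.
function toComposeUrl(localUrl, serviceName) {
  const u = new URL(localUrl); // URL is a Node.js global
  u.hostname = serviceName;    // e.g. "localhost" -> "database"
  return u.toString();
}

console.log(
  toComposeUrl("postgresql://developer:devpass123@localhost:5432/taskapp", "database")
);
```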
### Environment Variable Management
Managing different environment configurations for development, testing, and production required careful organisation:
```text
.env.development
.env.test
.env.production
.env.example
```
Using .env files with Docker Compose made environment management much cleaner.
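Beyond the `.env` file Compose reads for variable substitution, `env_file` loads one of these files directly into a container's environment. A sketch, assuming the development file is the local default:

```yaml
services:
  api:
    env_file:
      - .env.development
```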
### File Permissions
On Linux systems, file permission issues arose when containers created files with root ownership. Adding user mapping solved this:
```yaml
api:
  user: "${UID}:${GID}" # Maps to host user
  # ... other configuration
```
## CI/CD Integration
The Docker Compose setup translates well to CI/CD pipelines:
```yaml
# GitHub Actions workflow
- name: Run integration tests
  run: |
    docker-compose -f docker-compose.test.yml up --build --abort-on-container-exit
    docker-compose -f docker-compose.test.yml down
```
This ensures that CI tests run in the same environment as local development.
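One caveat worth noting: `--abort-on-container-exit` stops the stack when any container exits, but the step's exit status should come from the test run itself. `docker-compose` supports `--exit-code-from` for exactly this (it implies abort-on-container-exit); a variant of the step above, assuming the service names from the test compose file:

```yaml
- name: Run integration tests
  run: |
    docker-compose -f docker-compose.test.yml up --build --exit-code-from test-api
    docker-compose -f docker-compose.test.yml down -v
```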
## Team Collaboration Benefits

### Onboarding Speed
New developers can be productive immediately:
- Clone the repository
- Run `docker-compose up`
- Start coding
### Documentation Reduction

The `docker-compose.yml` file serves as executable documentation of the system architecture and dependencies.
### Environment Parity
Development environments now match production more closely, reducing environment-specific bugs.
## Resource Management

### Memory Usage
Running multiple services locally does consume more memory than native installations. Monitoring Docker Desktop’s resource usage became important:
```shell
# Monitor resource usage
docker stats

# Clean up unused resources
docker system prune
```
### Startup Time
The full stack takes longer to start than individual services, but the consistency benefits outweigh this minor inconvenience.
## Looking Forward
Docker Compose has become an essential part of my development workflow. The benefits of consistent environments, easy onboarding, and simplified multi-service development far outweigh the learning curve and resource overhead.
Key principles I’ve learned:
- Start simple - Begin with basic service definitions and add complexity gradually
- Optimise for development - Use bind mounts and debug ports for a smooth development experience
- Document the non-obvious - Not everything is captured in the compose file; maintain README files for setup instructions
- Version everything - Pin image versions to avoid surprises when images are updated
The move to container-based development has improved both individual productivity and team collaboration. It’s become the foundation for all new multi-service projects.
Have you adopted Docker Compose for local development? What challenges did you face, and how has it changed your development workflow?