Last modified: Dec 02, 2025, by Alexander Williams
FastAPI Microservices with Docker Compose
Modern applications need to be scalable and maintainable. Microservices architecture helps achieve this. It breaks an app into small, independent services.
Each service focuses on a single business capability. They communicate over a network. This approach offers great flexibility and resilience.
FastAPI is a perfect framework for building these services. It is fast, modern, and Python-based. It automatically generates interactive API documentation.
Combining FastAPI with Docker Compose is a powerful strategy. Docker containers package each service with its dependencies. Compose orchestrates multiple containers as a single application.
This guide walks you through building a simple microservices system. You will create two FastAPI services and a database. Docker Compose will manage their lifecycle.
Why FastAPI and Docker Compose?
FastAPI provides high performance, on par with Node.js and Go. It offers automatic data validation using Pydantic models. This reduces bugs and development time.
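As a quick illustration, declaring a request body as a Pydantic model is enough to get validation; the Item model and endpoint below are illustrative only and not part of the project built in this guide.
# Illustrative only: automatic validation with a Pydantic model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

@app.post("/items/")
def create_item(item: Item):
    # FastAPI rejects requests with missing or wrongly typed fields
    # before this function is ever called.
    return {"name": item.name, "price": item.price}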
Docker ensures your application runs the same everywhere. It eliminates the "it works on my machine" problem. Each service lives in its own isolated container.
Docker Compose defines and runs multi-container applications. You describe your services in a YAML file. A single command spins up your entire stack.
This combination is ideal for development and production. It simplifies dependency management and service discovery. It also makes scaling individual services easier.
Project Structure
Start by creating a clear project directory. This keeps your code organized and manageable.
fastapi-microservices/
├── docker-compose.yml
├── service_a/
│   ├── Dockerfile
│   ├── main.py
│   └── requirements.txt
└── service_b/
    ├── Dockerfile
    ├── main.py
    └── requirements.txt
We have a root directory with a Compose file. Inside are two service directories. Each has its own Dockerfile and application code.
Creating the FastAPI Services
Let's create two simple services. Service A will handle user data. Service B will process orders.
First, define the dependencies for each service. Create a requirements.txt file in each service folder.
# service_a/requirements.txt
fastapi[all]
uvicorn
sqlalchemy
psycopg2-binary
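Service B does not talk to the database in this guide, so its requirements.txt can be a trimmed-down version (this slimmer file is an assumption; copying Service A's file also works):
# service_b/requirements.txt
fastapi[all]
uvicorn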
Now, write the FastAPI application for Service A. We'll create a basic endpoint.
# service_a/main.py
from fastapi import FastAPI

app = FastAPI(title="User Service")

@app.get("/")
def read_root():
    return {"service": "User Service", "status": "active"}

@app.get("/users/{user_id}")
def get_user(user_id: int):
    # In a real app, you would fetch from a database.
    # For a robust database setup, see our guide on FastAPI Database Migrations with Alembic.
    return {"user_id": user_id, "name": f"User {user_id}"}
Service B follows the same pattern but handles orders instead of users.
# service_b/main.py
from fastapi import FastAPI

app = FastAPI(title="Order Service")

@app.get("/")
def read_root():
    return {"service": "Order Service", "status": "active"}

@app.get("/orders/{order_id}")
def get_order(order_id: str):
    return {"order_id": order_id, "status": "shipped"}
These are minimal examples. Real services would use dependency injection and connect to databases. For advanced patterns, check our FastAPI Dependency Injection Best Practices guide.
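As a taste of what that looks like, here is a minimal, self-contained sketch of FastAPI's dependency injection; the get_settings helper and the hard-coded values are purely illustrative.
# Illustrative sketch of dependency injection (not part of the minimal services above).
from fastapi import Depends, FastAPI

app = FastAPI(title="User Service")

def get_settings() -> dict:
    # In a real service this might read environment variables or a config file.
    return {"db_url": "postgresql://admin:secret@db:5432/appdb"}

@app.get("/users/{user_id}")
def get_user(user_id: int, settings: dict = Depends(get_settings)):
    # FastAPI resolves get_settings and injects its return value on each request.
    return {"user_id": user_id, "configured_db": settings["db_url"]}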
Dockerizing Each Service
Each service needs a Dockerfile. This file defines how to build the container image.
Here is the Dockerfile for Service A. Service B's Dockerfile is identical in structure.
# service_a/Dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
The Dockerfile uses an official Python slim image. It sets the working directory and copies requirements. It installs dependencies and copies the app code.
The CMD instruction runs Uvicorn. This starts the FastAPI application. It listens on all network interfaces.
Orchestrating with Docker Compose
The docker-compose.yml file is the heart of the system. It defines all services, networks, and volumes.
# docker-compose.yml
version: '3.8'

services:
  service_a:
    build: ./service_a
    container_name: fastapi_service_a
    ports:
      - "8001:8000"
    networks:
      - microservices_net

  service_b:
    build: ./service_b
    container_name: fastapi_service_b
    ports:
      - "8002:8000"
    networks:
      - microservices_net

networks:
  microservices_net:
    driver: bridge
This file defines two services. Each builds from its directory. They are exposed on different host ports.
Both services connect to a custom bridge network. This allows them to communicate using service names as hostnames.
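For example, Service B could call Service A by its Compose service name. The sketch below is an assumption layered on the minimal Service B shown earlier: it requires adding httpx to service_b/requirements.txt, and the /orders/{order_id}/user route and hard-coded user ID are illustrative.
# service_b/main.py (additional endpoint, sketch only; requires httpx)
import httpx
from fastapi import FastAPI

app = FastAPI(title="Order Service")

@app.get("/orders/{order_id}/user")
async def get_order_user(order_id: str):
    # Inside the Compose network, "service_a" resolves to the User Service container.
    async with httpx.AsyncClient() as client:
        response = await client.get("http://service_a:8000/users/42")
    return {"order_id": order_id, "user": response.json()}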
Running and Testing the System
Navigate to your project root. Run the following command to start everything.
docker-compose up --build
The --build flag ensures images are rebuilt. You will see logs from both services in your terminal.
Open a new terminal to test the services. Use curl or a browser.
# Test Service A
curl http://localhost:8001/
curl http://localhost:8001/users/42
# Test Service B
curl http://localhost:8002/
curl http://localhost:8002/orders/abc123
You should see JSON responses from both services, for example {"user_id": 42, "name": "User 42"} from the second command. This confirms both containers are running and reachable.
To stop the services, press Ctrl+C in the terminal running Compose. To remove the containers and the network, run docker-compose down.
Adding a Database Service
Most microservices need a database. Let's add PostgreSQL for Service A.
Update the docker-compose.yml file to include a database.
# docker-compose.yml (updated; service_b and the networks section stay the same as before)
services:
  db:
    image: postgres:15
    container_name: microservices_db
    environment:
      POSTGRES_USER: admin
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: appdb
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - microservices_net

  service_a:
    build: ./service_a
    container_name: fastapi_service_a
    ports:
      - "8001:8000"
    environment:
      DATABASE_URL: postgresql://admin:secret@db:5432/appdb
    depends_on:
      - db
    networks:
      - microservices_net

volumes:
  postgres_data:
Service A now receives its connection string through the DATABASE_URL environment variable. The depends_on entry makes Compose start the db container before Service A, but it does not wait for PostgreSQL to be ready to accept connections, so the application should retry on startup or Compose should be given a healthcheck to wait on.
You would then update Service A's code to use SQLAlchemy. It would connect to the database using the provided URL.
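A minimal sketch of that connection setup, assuming SQLAlchemy and the DATABASE_URL variable defined in the Compose file (the database.py file name is our choice):
# service_a/database.py (sketch)
import os

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Connection string injected by Docker Compose; the fallback mirrors the Compose file.
DATABASE_URL = os.environ.get("DATABASE_URL", "postgresql://admin:secret@db:5432/appdb")

# psycopg2-binary from requirements.txt provides the PostgreSQL driver.
engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(bind=engine, autoflush=False)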
For production deployment considerations, our guide on Deploy FastAPI with Docker for Production offers crucial insights.
Benefits of This Approach
This setup offers significant advantages. Development environments become consistent across teams.
Scaling is straightforward. You can run several replicas of a single service, managed by Docker Compose or Kubernetes. (To scale with Compose, drop the fixed container_name and host port mapping so replicas do not conflict.)
Testing is easier. You can spin up the entire stack for integration tests. Then tear it down cleanly.
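For instance, with the stack running, a couple of pytest checks against the published ports can act as smoke tests (pytest and httpx on the host are assumptions, not part of the project above):
# tests/test_services.py (sketch; run with the Compose stack up)
import httpx

def test_user_service_root():
    response = httpx.get("http://localhost:8001/")
    assert response.status_code == 200
    assert response.json()["service"] == "User Service"

def test_order_service_root():
    response = httpx.get("http://localhost:8002/")
    assert response.status_code == 200
    assert response.json()["service"] == "Order Service"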
It also simplifies onboarding. New developers run one command. They get a fully working system.
Conclusion
Building FastAPI microservices with Docker Compose is a robust method. It combines FastAPI's speed with Docker's isolation.
You learned to create separate services. You containerized them and orchestrated their startup. You also added a shared database.
This foundation can be extended in many ways. You can add message queues, caching layers, or API gateways. Each becomes a new service in your Compose file.
Start with this simple structure. Experiment by adding more services or complex logic. You now have a powerful, scalable platform for modern web applications.