
Docker Infrastructure

The Módulo Empreendedorismo (Entrepreneurship Module) uses Docker to containerize every service, which simplifies both development and deployment.

Container Architecture

┌───────────────────────────────────────────────────────────────────┐
│                          DOCKER COMPOSE                           │
└───────────────────────────────────────────────────────────────────┘
                                  │
          ┌───────────────────────┼───────────────────────┐
          ▼                       ▼                       ▼
     ┌─────────┐             ┌─────────┐             ┌─────────┐
     │   db    │             │  redis  │             │  minio  │
     │  :5432  │             │  :6379  │             │:9000/01 │
     └────┬────┘             └────┬────┘             └────┬────┘
          │                       │                       │
          └───────────────────────┼───────────────────────┘
                                  ▼
       ┌─────────────────────────────────────────────────────┐
       │                    backend :8000                    │
       │                 (Django + Gunicorn)                 │
       └──────────────────────────┬──────────────────────────┘
                                  │
                  ┌───────────────┴───────────────┐
                  ▼                               ▼
       ┌─────────────────────┐         ┌──────────────────────┐
       │    celery-worker    │         │ celery-worker-emails │
       │  (default,reports)  │         │       (emails)       │
       └─────────────────────┘         └──────────────────────┘

       ┌─────────────────────────────────────────────────────┐
       │                   frontend :9501                    │
       │                   (React + Vite)                    │
       └─────────────────────────────────────────────────────┘
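
A quick way to check that the running stack matches the diagram is to list the containers Compose started and the ports they publish (assuming the docker-compose.yml shown later on this page):

# List running services, their state and published ports
docker-compose ps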

Services

PostgreSQL (db)

Primary relational database.

db:
  image: postgres:16-alpine
  environment:
    POSTGRES_DB: modulo_empreendedorismo
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
  ports:
    - "5432:5432"
  volumes:
    - postgres_data:/var/lib/postgresql/data
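
A quick sanity check against this service, using the credentials defined above:

# Open a psql session inside the db container
docker-compose exec db psql -U postgres -d modulo_empreendedorismo

# Or just confirm the database is accepting connections
docker-compose exec db pg_isready -U postgres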

Redis

Cache and message broker for Celery.

redis:
  image: redis:7-alpine
  ports:
    - "6379:6379"

MinIO

S3-compatible file storage.

minio:
  image: minio/minio
  command: server /data --console-address ":9001"
  environment:
    MINIO_ROOT_USER: minioadmin
    MINIO_ROOT_PASSWORD: minioadmin
  ports:
    - "9000:9000"  # API
    - "9001:9001"  # Console
  volumes:
    - minio_data:/data
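
The API answers on port 9000 and the web console on 9001 (log in with minioadmin / minioadmin). A minimal liveness check from the host, assuming curl is installed locally:

# Same liveness endpoint used by the Compose health check further below
curl -f http://localhost:9000/minio/health/live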

Backend (Django)

REST API built with Django and served by Gunicorn.

backend:
  build:
    context: ./backend
    dockerfile: Dockerfile
  command: gunicorn core.wsgi:application --bind 0.0.0.0:8000
  environment:
    - DEBUG=False
    - DATABASE_URL=postgres://postgres:postgres@db:5432/modulo_empreendedorismo
    - REDIS_URL=redis://redis:6379/0
    - MINIO_ENDPOINT=minio:9000
  ports:
    - "8000:8000"
  depends_on:
    - db
    - redis
    - minio
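
Once the container is up, the API can be smoke-tested from the host. The /api/health/ path is the endpoint listed in the health-check table at the end of this page:

# Run Django's system checks inside the container
docker-compose exec backend python manage.py check

# Hit the health endpoint through the published port
curl -f http://localhost:8000/api/health/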

Celery Workers

Workers for asynchronous task processing.

celery-worker:
  build:
    context: ./backend
  command: celery -A core worker -l info -Q default,reports
  depends_on:
    - backend
    - redis

celery-worker-emails:
  build:
    context: ./backend
  command: celery -A core worker -l info -Q emails -c 2
  depends_on:
    - backend
    - redis
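
To confirm that each worker is consuming the queues assigned by its -Q flag, Celery's inspect commands can be run inside either worker container:

# List the queues each worker is subscribed to
docker-compose exec celery-worker celery -A core inspect active_queues

# Ping all workers through the broker
docker-compose exec celery-worker celery -A core inspect ping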

Frontend (React)

React application built with Vite.

frontend:
  build:
    context: ./frontend
    dockerfile: Dockerfile
  ports:
    - "9501:80"
  depends_on:
    - backend
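
The production bundle is served by Nginx on port 80 inside the container, mapped to 9501 on the host:

# Fetch the built index.html through the published port
curl -I http://localhost:9501/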

Complete Docker Compose File

docker-compose.yml
version: '3.8'

services:
  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: modulo_empreendedorismo
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 5s
      retries: 5

  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    ports:
      - "9000:9000"
      - "9001:9001"
    volumes:
      - minio_data:/data
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"]
      interval: 30s
      timeout: 20s
      retries: 3

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    command: >
      sh -c "python manage.py migrate &&
             python manage.py setup_minio &&
             gunicorn core.wsgi:application --bind 0.0.0.0:8000"
    environment:
      - DEBUG=False
      - SECRET_KEY=${SECRET_KEY}
      - DATABASE_URL=postgres://postgres:postgres@db:5432/modulo_empreendedorismo
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
      - MINIO_ENDPOINT=minio:9000
      - MINIO_ACCESS_KEY=minioadmin
      - MINIO_SECRET_KEY=minioadmin
    ports:
      - "8000:8000"
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
      minio:
        condition: service_healthy
    volumes:
      - ./backend:/app

  celery-worker:
    build:
      context: ./backend
    command: celery -A core worker -l info -Q default,reports
    environment:
      - DATABASE_URL=postgres://postgres:postgres@db:5432/modulo_empreendedorismo
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - backend
      - redis

  celery-worker-emails:
    build:
      context: ./backend
    command: celery -A core worker -l info -Q emails -c 2
    environment:
      - DATABASE_URL=postgres://postgres:postgres@db:5432/modulo_empreendedorismo
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - backend
      - redis

  celery-beat:
    build:
      context: ./backend
    command: celery -A core beat -l info
    environment:
      - DATABASE_URL=postgres://postgres:postgres@db:5432/modulo_empreendedorismo
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - backend
      - redis

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    ports:
      - "9501:80"
    depends_on:
      - backend

volumes:
  postgres_data:
  minio_data:
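
The backend reads SECRET_KEY from the environment (${SECRET_KEY}), which Compose resolves from a .env file next to docker-compose.yml. A sketch for generating one and validating the file, assuming Python 3 is available on the host:

# Generate a random SECRET_KEY and store it in .env (read automatically by Compose)
echo "SECRET_KEY=$(python3 -c 'import secrets; print(secrets.token_urlsafe(50))')" > .env

# Validate variable interpolation and YAML syntax before starting anything
docker-compose config --quiet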

Dockerfiles

Backend Dockerfile

backend/Dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
libpq-dev \
gcc \
&& rm -rf /var/lib/apt/lists/*

# Install Python dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

# Collect static files
RUN python manage.py collectstatic --noinput

EXPOSE 8000

CMD ["gunicorn", "core.wsgi:application", "--bind", "0.0.0.0:8000"]

Frontend Dockerfile

frontend/Dockerfile
# Stage 1: Build
FROM node:20-alpine AS builder

WORKDIR /app

COPY package*.json ./
RUN npm ci

COPY . .
RUN npm run build

# Stage 2: Production
FROM nginx:alpine

COPY nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=builder /app/dist /usr/share/nginx/html

EXPOSE 80

CMD ["nginx", "-g", "daemon off;"]

Useful Commands

# Start all services in the background
docker-compose up -d

# Follow logs
docker-compose logs -f backend
docker-compose logs -f celery-worker

# Run database migrations
docker-compose exec backend python manage.py migrate

# Create a superuser
docker-compose exec backend python manage.py createsuperuser

# Seed the database
docker-compose exec backend python manage.py seed_all

# Rebuild a single service
docker-compose up -d --build backend

# Stop all services
docker-compose down

# Tear down and remove volumes (deletes persisted data)
docker-compose down -v

Health Checks

Health checks are configured for the following services:

Service       Endpoint / Command        Interval
PostgreSQL    pg_isready                5s
Redis         redis-cli ping            5s
MinIO         /minio/health/live        30s
Backend       /api/health/              30s
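
Compose surfaces the result of these checks in the container status; the health state of a single container can also be queried directly:

# The STATUS column shows (healthy) / (unhealthy) for services with a health check
docker-compose ps

# Query one container's health state (here db, by way of its container ID)
docker inspect --format '{{.State.Health.Status}}' $(docker-compose ps -q db)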