Upgrade your n8n from docker run to docker compose and simplify your workflow

“Upgrade your n8n workflow with Docker Compose and switch to PostgreSQL for a reliable and secure data storage solution, avoiding data loss and simplifying your workflow.”

🔥 Key Takeaways
🔥 Upgrading from Docker Run to Docker Compose simplifies your n8n workflow and ensures data safety.
🔥 Switching to PostgreSQL provides enterprise-grade reliability for your data storage.
🔥 Maintaining your new setup involves regular backups, log monitoring, and easy updates.

Running n8n with a single docker run command seems easy at first. But it’s a trap that can cost you hours of work and precious data.

I learned this the hard way when I lost three months of workflows after a failed update. I didn’t know n8n was storing everything in a SQLite database that doesn’t carry over when you update the container. That’s when I knew I needed a better solution. If you’re running n8n with docker run, you’re probably facing the same risks.

This guide will show you how to upgrade to docker compose, switch to PostgreSQL, and never worry about data loss again. I’ve helped dozens of n8n users make this switch, and now I’ll help you too.

The Problem with Docker Run

You need to constantly update your n8n version to keep up!

Your n8n data lives dangerously when using docker run. One wrong command during an update, and everything disappears.

SQLite, the default database, keeps your data in a single file. If that file gets corrupted, you lose everything. Here’s what makes it worse:

  • Manual backups are easy to forget
  • Updates often fail silently
  • Container restarts can corrupt your database

I’ve seen teams lose entire production workflows because they trusted docker run too much. Don’t let that be you.

Some bad news: you can’t easily migrate SQLite to PostgreSQL

I scoured the forums and found that there’s no direct way to migrate a SQLite database to PostgreSQL. You basically need to write code to copy each row of data across. I gave up after 45 minutes.
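One partial workaround worth knowing about: n8n’s own CLI can export workflows and credentials to JSON files, which you can then import into a fresh PostgreSQL-backed instance. It moves workflows and credentials but not execution history, so it’s an escape route rather than a true migration. A sketch, assuming your old container is named n8n (check with docker ps):

```shell
# Export workflows and credentials from the old SQLite-backed container
docker exec n8n n8n export:workflow --all --separate --output=/tmp/workflows/
docker exec n8n n8n export:credentials --all --separate --output=/tmp/credentials/

# Copy the exports to the host so the new setup can pick them up
docker cp n8n:/tmp/workflows ./n8n/backup/workflows
docker cp n8n:/tmp/credentials ./n8n/backup/credentials
```

Conveniently, the compose file later in this post includes an n8n-import service that reads from ./n8n/backup and runs the matching import commands on startup.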

Why Docker Compose is Better

Docker compose turns your n8n setup into a well-oiled machine. It’s like upgrading from a bicycle to a Tesla.

PostgreSQL brings enterprise-grade reliability to your data storage. Your workflows stay safe, even during updates. Here’s the real magic:

  • One command updates everything
  • Automatic backups become simple
  • Your data lives in a separate container

My update time dropped from 30 minutes to 2 minutes after switching because the YAML does all the command line work for me.

How do you do it? I found the official n8n install guide — just follow the guide here.

The official n8n install is designed to run locally, so it isn’t reachable from the internet out of the box. If you’re like me and run it on a Google Cloud machine, you need to set up Nginx as a reverse proxy. Prompt an A.I. assistant to help you with it.
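If you want a head start, a minimal Nginx reverse-proxy server block might look like the sketch below. The file path and the assumption that n8n is listening on port 5678 on the same machine are mine — adjust them, swap in your own domain, and add TLS (for example with certbot) before going live:

```nginx
# Sketch of /etc/nginx/sites-available/n8n — adjust domain, port, and paths
server {
    listen 80;
    server_name rumjahn.com;

    location / {
        proxy_pass http://127.0.0.1:5678;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
        # n8n's editor UI uses websockets for live updates
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

You’ll also need a DNS A record pointing your domain at the server’s IP before any of this resolves.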

Here’s the prompt I used:

“I just installed n8n locally using docker compose and it doesn’t handle the DNS settings. I want rumjahn.com to point to the server. How do I do that after I installed it?”

Below is what my YAML file looks like after adding DNS settings.

volumes:
  n8n_storage:
  postgres_storage:
  ollama_storage:
  qdrant_storage:

networks:
  demo:

# Shared base definition reused by every n8n container below
x-n8n: &service-n8n
  image: n8nio/n8n:latest
  networks: ['demo']
  environment:
    - DB_TYPE=postgresdb
    - DB_POSTGRESDB_HOST=postgres
    - DB_POSTGRESDB_USER=${POSTGRES_USER}
    - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
    - N8N_DIAGNOSTICS_ENABLED=false
    - N8N_PERSONALIZATION_ENABLED=false
    - N8N_ENCRYPTION_KEY={Insert your own}
    - N8N_USER_MANAGEMENT_JWT_SECRET
  links:
    - postgres

x-ollama: &service-ollama
  image: ollama/ollama:latest
  container_name: ollama
  networks: ['demo']
  restart: unless-stopped
  ports:
    - 11434:11434
  volumes:
    - ollama_storage:/root/.ollama

x-init-ollama: &init-ollama
  image: ollama/ollama:latest
  networks: ['demo']
  container_name: ollama-pull-llama
  volumes:
    - ollama_storage:/root/.ollama
  entrypoint: /bin/sh
  command:
    - "-c"
    - "sleep 3; OLLAMA_HOST=ollama:11434 ollama pull llama3.2"

services:
  postgres:
    image: postgres:16-alpine
    networks: ['demo']
    restart: unless-stopped
    environment:
      - POSTGRES_USER
      - POSTGRES_PASSWORD
      - POSTGRES_DB
    volumes:
      - postgres_storage:/var/lib/postgresql/data
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -h localhost -U ${POSTGRES_USER} -d ${POSTGRES_DB}']
      interval: 5s
      timeout: 5s
      retries: 10

  n8n-import:
    <<: *service-n8n
    container_name: n8n-import
    entrypoint: /bin/sh
    command:
      - "-c"
      - "n8n import:credentials --separate --input=/backup/credentials && n8n import:workflow --separate --input=/backup/workflows"
    volumes:
      - ./n8n/backup:/backup
    depends_on:
      postgres:
        condition: service_healthy

  n8n:
    <<: *service-n8n
    container_name: n8n
    restart: unless-stopped
    ports:
      - 5678:5678
    environment:
      # NOTE: this block replaces the anchor's environment list entirely
      # (YAML merge keys don't deep-merge lists), so the database settings
      # must be repeated here alongside the DNS settings
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres
      - DB_POSTGRESDB_USER=${POSTGRES_USER}
      - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
      - N8N_DIAGNOSTICS_ENABLED=false
      - N8N_PERSONALIZATION_ENABLED=false
      - N8N_ENCRYPTION_KEY={Insert your own}
      - N8N_USER_MANAGEMENT_JWT_SECRET
      # DNS settings so webhooks use the public domain instead of localhost
      - N8N_HOST=rumjahn.com
      - WEBHOOK_TUNNEL_URL=https://rumjahn.com/
      - WEBHOOK_URL=https://rumjahn.com/
    volumes:
      - n8n_storage:/home/node/.n8n
      - ./n8n/backup:/backup
      - ./shared:/data/shared
    depends_on:
      postgres:
        condition: service_healthy
      n8n-import:
        condition: service_completed_successfully
    env_file: .env

  qdrant:
    image: qdrant/qdrant
    container_name: qdrant
    networks: ['demo']
    restart: unless-stopped
    ports:
      - 6333:6333
    volumes:
      - qdrant_storage:/qdrant/storage

  ollama-cpu:
    profiles: ["cpu"]
    <<: *service-ollama

  ollama-gpu:
    profiles: ["gpu-nvidia"]
    <<: *service-ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

  ollama-pull-llama-cpu:
    profiles: ["cpu"]
    <<: *init-ollama
    depends_on:
      - ollama-cpu

  ollama-pull-llama-gpu:
    profiles: ["gpu-nvidia"]
    <<: *init-ollama
    depends_on:
      - ollama-gpu

  n8n-migrate:
    <<: *service-n8n
    container_name: n8n-migrate
    command: n8n start --tunnel
    depends_on:
      postgres:
        condition: service_healthy
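The compose file reads its secrets from a .env file sitting next to it — that’s what the ${POSTGRES_USER}-style references and the env_file: .env line point at. A minimal sketch with placeholder values (generate your own long random secrets):

```shell
# .env — placeholder values, replace every one of these with your own
POSTGRES_USER=n8n
POSTGRES_PASSWORD=change-me
POSTGRES_DB=n8n
N8N_ENCRYPTION_KEY=replace-with-a-long-random-string
N8N_USER_MANAGEMENT_JWT_SECRET=replace-with-another-random-string
```

Keep this file out of version control — it holds the key that encrypts your stored credentials.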

Maintaining Your New Setup

Updating n8n is now simple:

docker-compose pull
docker-compose up -d

Back up your data weekly (note the database service in this compose file is named postgres, not db, and the user and database names should match your .env file):

docker-compose exec postgres pg_dump -U n8n n8n > backup.sql
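To make that weekly backup automatic instead of something you remember to do, you could put it in cron. A sketch assuming the compose project lives at /home/you/n8n, the Postgres service is named postgres as in the compose file above, and both the user and database are named n8n — note that % must be escaped as \% inside a crontab entry:

```shell
# Every Sunday at 03:00, write a dated dump next to the compose file
# (-T disables the pseudo-TTY, which cron doesn't have)
0 3 * * 0  cd /home/you/n8n && docker-compose exec -T postgres pg_dump -U n8n n8n > backup-$(date +\%F).sql
```

Test the entry once by hand before trusting it, and check occasionally that the dump files aren’t empty.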

Watch your logs for issues:

docker-compose logs -f

Conclusion

Moving to Docker Compose transforms n8n from a fragile setup into a robust system. You’ll save time, keep your data safe, and stay current with updates.

Ready to upgrade? Start by creating that backup right now. Your future self will thank you.

Need help? Leave a comment and I’ll get back to you.
