I'm trying to run a Django project in a Docker container. Until now everything was working fine; I didn't change anything, but now I get the error shown at the bottom. Any ideas what is wrong? My PostgreSQL settings look like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'myproject',
        'HOST': 'db',
        'PORT': '5432',
    }
}
DATABASES = {'default': dj_database_url.config(default='postgres://[email protected]:5432/qdsfks')}
docker-compose.yml:
version: '3'
services:
  db:
    image: postgres
    environment:
      POSTGRES_USER: myproject
      POSTGRES_PASSWORD: somepass
    volumes:
      - "./local_pgdata:/var/lib/postgresql/data/pgdata"
  django:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8001
    volumes:
      - .:/code
    ports:
      - "8001:8001"
    depends_on:
      - db
Logs:
db_1 | The files belonging to this database system will be owned by user "postgres".
db_1 | This user must also own the server process.
db_1 |
db_1 | The database cluster will be initialized with locale "en_US.utf8".
db_1 | The default database encoding has accordingly been set to "UTF8".
db_1 | The default text search configuration will be set to "english".
db_1 |
db_1 | Data page checksums are disabled.
db_1 |
db_1 | initdb: directory "/var/lib/postgresql/data" exists but is not empty
db_1 | If you want to create a new database system, either remove or empty
db_1 | the directory "/var/lib/postgresql/data" or run initdb
db_1 | with an argument other than "/var/lib/postgresql/data".
Upvotes: 0
Views: 2352
Reputation: 4654
You should only map the host file system into your Docker container if you want to reuse an existing host database. Otherwise I suggest removing the volume mapping and using a custom Dockerfile to load the existing content, as described below.
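If you drop the volume mapping and the data that was written to ./local_pgdata is disposable, a rough sequence to start from a clean state (just a sketch, not the only way) could be:

docker-compose down -v        # stop the containers and remove named/anonymous volumes
rm -rf ./local_pgdata         # remove the old bind-mounted data (only if you really don't need it!)
docker-compose up --build     # rebuild; postgres re-runs initdb on an empty data directory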
./web/proj/settings.py
Improvement: you also need to set USER and PASSWORD in your settings.py:
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ['POSTGRES_NAME'],
        'USER': os.environ['POSTGRES_USER'],
        'PASSWORD': os.environ['POSTGRES_PASSWORD'],
        'HOST': os.environ['POSTGRES_HOST'],
        'PORT': os.environ['POSTGRES_PORT'],
    }
}
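If you prefer the settings to fall back to defaults when a variable is missing, a variant could look like this (the fallback values are only illustrative and mirror the .env file further down):

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        # os.environ.get() returns the fallback instead of raising KeyError
        'NAME': os.environ.get('POSTGRES_NAME', 'db_name'),
        'USER': os.environ.get('POSTGRES_USER', 'db_user'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', ''),
        'HOST': os.environ.get('POSTGRES_HOST', 'db'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}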
./docker-compose.yml
db:
  restart: always
  build:
    context: ./db
    dockerfile: Dockerfile
  env_file: .env
  expose:
    - "5432"
django:
  restart: always
  build:
    context: ./web
    dockerfile: Dockerfile
  volumes:
    - .:/app
  command: bash -c "cd ./web && ./manage.py makemigrations && ./manage.py migrate && ./manage.py runserver"
  expose:
    - "8000"
  env_file: .env
  depends_on:
    - db
  links:
    - db
./.env
POSTGRES_NAME=db_name
POSTGRES_USER=db_user
POSTGRES_PASSWORD=password
POSTGRES_HOST=db
POSTGRES_PORT=5432
./db/Dockerfile
FROM postgres
RUN mkdir /app
WORKDIR /app
ADD ./ /app/ # add all sources if you need them
ADD ./path-to-your-csv.csv /app/
RUN [import content here]
Here you can find how to load custom SQL code via initdb: Load custom sql script. There you will also find a way to load the content of your CSV file.
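As a rough sketch of that initdb route (the file and table names here are just examples): the official postgres image executes any *.sql or *.sh files you copy into /docker-entrypoint-initdb.d the first time the data directory is initialized, so the Dockerfile above could become something like:

./db/Dockerfile (init-script variant)
FROM postgres
# the CSV must be reachable from inside the container for COPY ... FROM
ADD ./path-to-your-csv.csv /app/
# scripts in this directory run once, on first initialization of the data directory
ADD ./init.sql /docker-entrypoint-initdb.d/

./db/init.sql
-- example table; adjust the columns to match your CSV
CREATE TABLE IF NOT EXISTS my_table (id integer, name text);
COPY my_table FROM '/app/path-to-your-csv.csv' WITH (FORMAT csv, HEADER true);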
Moreover, you should consider using Django fixtures instead of loading CSV or SQL scripts; that will be easier.
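A minimal fixtures sketch (the app, model and file names are made up): put a JSON fixture in your app's fixtures/ directory and load it with manage.py loaddata.

./web/proj/myapp/fixtures/initial_data.json
[
  {
    "model": "myapp.item",
    "pk": 1,
    "fields": {"name": "example"}
  }
]

Then load it inside the django service, e.g.:

docker-compose run django bash -c "cd ./web && ./manage.py loaddata initial_data"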
Hope this helps solve your problem.
Upvotes: 1