Reputation: 3171
Fairly new to Docker, I am trying to add the execution of a custom SQL script (triggers and functions) to Django's migration process, and I am starting to feel a bit lost. Overall, what I am trying to achieve follows this pretty clear tutorial. In that tutorial, migrations are run by an entrypoint script. In the Dockerfile:
# run entrypoint.sh
ENTRYPOINT ["/usr/src/my_app/entrypoint.sh"]
Here is the entrypoint.sh:
#!/bin/sh

if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."
    while ! nc -z $SQL_HOST $SQL_PORT; do
        sleep 0.1
    done
    echo "PostgreSQL started"
fi

# tried several with and without combinations
python manage.py flush --no-input
python manage.py makemigrations my_app
python manage.py migrate

exec "$@"
So far so good. Turning to the question of integrating the execution of custom SQL scripts into the migration process, most articles I read (this one, for instance) recommend creating an empty migration that executes the SQL statements. Here is what I have in
my_app/migrations/0001_initial_data.py
import os
from django.db import migrations, connection


def load_data_from_sql(filename):
    file_path = os.path.join(os.path.dirname(__file__), '../sql/', filename)
    sql_statement = open(file_path).read()
    with connection.cursor() as cursor:
        cursor.execute(sql_statement)


class Migration(migrations.Migration):

    dependencies = [
        ('my_app', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(load_data_from_sql('my_app_base.sql'))
    ]
As stated by its dependencies, this step depends on the initial one (0001_initial.py):
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Unpayed',
            fields=[
                etc etc
[The Issue] However, even when I try to migrate manually (docker-compose exec web python manage.py makemigrations my_app), I get the following error because the database in the postgresql container is empty:
File "/usr/src/my_app/my_app/migrations/0001_initial_data.py", line 21, in Migration
migrations.RunPython(load_data_from_sql('my_app_base.sql'))
File "/usr/local/lib/python3.7/site-packages/django/db/backends/utils.py", line 82, in _execute
....
return self.cursor.execute(sql)
django.db.utils.ProgrammingError: relation "auth_user" does not exist
[What I do not understand] However, when I log into the container, remove 0001_initial_data.py and run ./entrypoint.sh, everything works like a charm and the tables are created. I can add 0001_initial_data.py back manually later on, run entrypoint.sh again and have my functions. Same when I remove this file before running docker-compose up -d --build: the tables are created.
I feel like I am missing an obvious and easier way around trying to integrate SQL script migrations in this canonical way. All I need is for this script to be run after the 0001_initial migration is done. How would you do it?
[edit] docker-compose.yml:
version: '3.7'

services:
  web:
    build:
      context: ./my_app
      dockerfile: Dockerfile
    command: python /usr/src/my_app/manage.py runserver 0.0.0.0:8000
    volumes:
      - ./my_app/:/usr/src/my_app/
    ports:
      - 8000:8000
    environment:
      - SECRET_KEY='o@@xO=jrd=p0^17svmYpw!22-bnm3zz*%y(7=j+p*t%ei-4pi!'
      - SQL_ENGINE=django.db.backends.postgresql
      - SQL_DATABASE=postgres
      - SQL_USER=postgres
      - SQL_PASSWORD=N0tTh3D3favlTpAssw0rd
      - SQL_HOST=db
      - SQL_PORT=5432
    depends_on:
      - db
  db:
    image: postgres:10.5-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/

volumes:
  postgres_data:
django:2.2 python:3.7
Upvotes: 0
Views: 1063
Reputation: 3171
[The easiest way I could find] I simply disentangled the Django migrations from the creation of the custom functions in the DB. Migrations are run first so that the tables exist when the functions are created. Here is the entrypoint.sh:
#!/bin/sh

if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."
    while ! nc -z $SQL_HOST $SQL_PORT; do
        sleep 0.1
    done
    echo "PostgreSQL started"
fi

python manage.py flush --no-input
python manage.py migrate

# add custom sql functions to db
cat my_app/sql/my_app_base.sql | python manage.py dbshell

python manage.py collectstatic --no-input

exec "$@"
Keep in mind that manage.py dbshell requires postgresql-client to run. I just needed to add it to the Dockerfile:
# pull official base image
FROM python:3.7-alpine

...........

# install psycopg2
RUN apk update \
    && apk add --virtual build-deps gcc python3-dev musl-dev \
    && apk add postgresql-dev postgresql-client \
    && pip install psycopg2 \
    && apk del build-deps
Upvotes: 0
Reputation: 2359
I believe the issue has to do with you naming the migration file and manually setting your dependencies with the same prefix "0001". The reason I say this is that when you do reverse migrations, you can simply reference the prefix, i.e. if you wanted to go from your 7th migration back to your 6th, the command looks like this: python manage.py migrate my_app 0006
Either way, I would try deleting that migration file and creating a new one via python manage.py makemigrations my_app --empty, then moving your code into it. This should also write the dependencies for you.
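For illustration, here is a minimal sketch of what that empty migration could look like once the code is moved in. The file name 0002_load_sql.py is hypothetical (use whatever makemigrations --empty generates), and the sql/ layout is taken from the question. The important detail is that RunPython expects a callable taking (apps, schema_editor), so the function is passed itself rather than being called:

# my_app/migrations/0002_load_sql.py  (hypothetical name)
import os

from django.db import migrations


def load_data_from_sql(apps, schema_editor):
    # path assumes the my_app/sql/ layout from the question
    file_path = os.path.join(os.path.dirname(__file__), '..', 'sql', 'my_app_base.sql')
    with open(file_path) as f:
        sql_statement = f.read()
    # run the statements on the connection the migration is applied to
    with schema_editor.connection.cursor() as cursor:
        cursor.execute(sql_statement)


class Migration(migrations.Migration):

    dependencies = [
        ('my_app', '0001_initial'),
    ]

    operations = [
        # pass the function itself, not its return value, so the SQL only
        # runs when the migration is applied (not at import time)
        migrations.RunPython(load_data_from_sql, migrations.RunPython.noop),
    ]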
The error message, alongside the test you ran by adding the migration file afterwards, is indicative of the issue though. Somehow the initial migrations aren't running before the other one. I would also try dropping your DB, as it may have persisted some bad state: ./manage.py sqlflush
Upvotes: 1