Reputation: 3928
One thing I can't figure out is how to set up the database host when dockerizing a Rails app. For example, a Postgres DB is supposed to run on localhost on a dev machine. But in the docker-compose file the database service has its own name, and it is on that host that the database will be accessible to the other containers, for example:
version: '3'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - .:/usr/src/app
    env_file:
      - .env/development/database
      - .env/development/web
  redis:
    image: redis
  database:
    image: postgres
    env_file:
      - .env/development/database
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
Most examples assume that all commands related to Rails development (creating models, migrations, etc.) are executed from inside the container, e.g.
docker-compose exec web rails g scaffold User first_name:string last_name:string
And to run the above migration I'd have to run
docker-compose exec web rails db:migrate
This way it works. But why do I need to run Docker locally for my dev setup just to be able to access the app?
So I come back to my original essential question: when the app was generated, database.yml had the below settings (for Postgres):
default: &default
  adapter: postgresql
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>

development:
  <<: *default
  database: rails5-ember_development
This way, anyone can clone the project and continue developing with a Postgres DB running on localhost. Now, when dockerizing the app, how do I change/adapt the host value (localhost:5432 by default) so that the application can run both locally and in a Docker container?
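To illustrate the trade-off, here is a sketch of one common pattern (not from the original post; the helper name and the localhost fallback are assumptions): a single env variable with a fallback lets the same database.yml serve both setups.

```ruby
# Hypothetical helper mirroring what an ERB tag like
# <%= ENV.fetch("DATABASE_HOST", "localhost") %> in database.yml
# would evaluate to in each setup.
def database_host
  ENV.fetch("DATABASE_HOST", "localhost")
end

ENV.delete("DATABASE_HOST")        # plain local development: variable unset
puts database_host                 # => localhost

ENV["DATABASE_HOST"] = "database"  # inside the docker-compose web service
puts database_host                 # => database
```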
So, to sum up, my question is: to get the same behaviour in a dockerized Rails app, is the only solution to run it in a special environment other than development? In that case, I'd add that environment to database.yml and set the same DB values as in the docker-compose.yml file (username, host, etc.).
Thank you.
Upvotes: 2
Views: 4275
Reputation: 3928
Here is the solution I came to.
Dockerfile:
FROM ruby:2.5.1
LABEL maintainer="Serguei CAMBOUR <[email protected]>"
RUN apt-get update -yqq
RUN apt-get install -yqq --no-install-recommends nodejs
COPY Gemfile* /usr/src/app/
WORKDIR /usr/src/app
RUN bundle install
COPY . /usr/src/app/
CMD ["rails", "s", "-b", "0.0.0.0"]
docker-compose.yml as follows:
version: '3'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - .:/usr/src/app
    env_file:
      - .env/development/database
      - .env/development/web
  redis:
    image: redis
  database:
    image: postgres
    env_file:
      - .env/development/database
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
.env/development/database file as follows:
POSTGRES_USER=postgres
POSTGRES_DB=myapp_development
.env/development/web file as follows:
DATABASE_HOST=database
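Compose injects each KEY=VALUE line of such env files into the container's environment. As a minimal Ruby sketch of that parsing (hypothetical — this is not how Compose itself is implemented), here is what the container ends up seeing:

```ruby
# Parse simple KEY=VALUE lines the way docker-compose feeds env_file
# entries into a container's environment (blank lines and comments skipped).
def parse_env_file(text)
  text.each_line.with_object({}) do |line, env|
    line = line.strip
    next if line.empty? || line.start_with?("#")
    key, value = line.split("=", 2)
    env[key] = value
  end
end

env = parse_env_file(<<~ENVFILE)
  POSTGRES_USER=postgres
  POSTGRES_DB=myapp_development
  DATABASE_HOST=database
ENVFILE

puts env["DATABASE_HOST"] # => database
```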
database.yml as follows, to be able to read the env variable values:
default: &default
  adapter: postgresql
  encoding: unicode
  host: <%= ENV['DATABASE_HOST'] %>
  username: <%= ENV['POSTGRES_USER'] %>
  database: <%= ENV['POSTGRES_DB'] %>
  pool: 5
  variables:
    statement_timeout: 5000

development:
  <<: *default

test:
  <<: *default
  database: myapp_test

production:
  <<: *default
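To see how Rails resolves these settings, here is a small standalone sketch (the inlined template is a trimmed copy of the config above; the ERB-then-YAML evaluation mimics what Rails does when it loads database.yml, and the ENV values simulate the container environment from the .env files):

```ruby
require "erb"
require "yaml"

# Trimmed copy of the database.yml above, inlined for the demo.
template = <<~YML
  default: &default
    adapter: postgresql
    host: <%= ENV['DATABASE_HOST'] %>
    username: <%= ENV['POSTGRES_USER'] %>
    database: <%= ENV['POSTGRES_DB'] %>

  development:
    <<: *default
YML

# Simulate the environment the .env files give the web container.
ENV["DATABASE_HOST"] = "database"
ENV["POSTGRES_USER"] = "postgres"
ENV["POSTGRES_DB"]   = "myapp_development"

# Rails evaluates the ERB tags first, then parses the YAML
# (aliases must be allowed for the &default/*default merge to work).
config = YAML.safe_load(ERB.new(template).result, aliases: true)

puts config["development"]["host"]     # => database
puts config["development"]["database"] # => myapp_development
```

When DATABASE_HOST is unset (plain local development), the ERB tag evaluates to nothing and the pg adapter falls back to its local defaults, which is why the same file keeps working outside Docker.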
Now you can run your Rails app locally as usual with rails s and it will work. The same goes for all the Rails generators, migrations, etc.: they will work and communicate with your local PostgreSQL DB.
To run your code in a Docker container:
docker-compose build web (where web is the name of my service declared in docker-compose.yml before)
docker-compose up --build
docker-compose run --rm web rails db:create
docker-compose exec web rails db:migrate (or docker-compose exec web rails db:create db:migrate)
Hope this helps.
Upvotes: 1