Reputation: 191
I'm building a web app whose file structure looks something like this:
db/
    Dockerfile
    init/
        00-init.sql
graphql/
    Dockerfile
app/
    Dockerfile
    package.json
    bin/
        www
    public/
        index.html
        index.js
    app.js
    frontend/
        .babelrc
        package.json
        webpack.config.js
        src/
            index.jsx
docker-compose.yml
The project is composed of three containers: db runs a Postgres server, graphql runs PostGraphile (to auto-generate a GraphQL schema and handle GraphQL requests), and app is an Express app.
My frontend code is contained in a separate package within app/frontend. Typically, I'd run npm run build within that directory to invoke webpack, building the contents of app/frontend/src into a redistributable form in app/public/index.js. That way, building the frontend is an offline process, and the app itself just serves up the frontend.
Here's my dilemma: I'm trying to add Relay support to my frontend. This requires an additional step in the frontend build, namely running relay-compiler. But relay-compiler requires a GraphQL schema. Since I'm using PostGraphile, I don't actually have a schema during the offline build. PostGraphile can spit out a schema file (with --export-schema-graphql), but as of right now I'm only running PostGraphile within a container: my db container has to start up and initialize the database, at which point PostGraphile can connect and generate a schema, at which point I have a schema.graphql file... sitting inside the graphql container. But that all happens after I spin up the application - I need the schema file before that, as part of the offline build.
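For context, the two commands involved look roughly like this (the connection string and paths here are just examples, not my actual values):

    # PostGraphile writes the schema file out when it starts up:
    postgraphile \
      --connection postgres://user:pass@localhost:5432/mydb \
      --export-schema-graphql schema.graphql

    # relay-compiler then needs that file as input during the frontend build:
    relay-compiler --src ./src --schema ./schema.graphql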
I'm still pretty green with Docker, GraphQL, postgraphile, and especially Relay - so I'm hoping that somebody who's more familiar with these technologies can lay some wisdom on me. I haven't found a whole lot of resources that specifically address using Relay with postgraphile.
In the absence of specific best practices, maybe somebody who's more familiar with modern web development can help me sort through these options:
1. Do I update my frontend package so that it's fully capable of spinning up a Postgres server and running PostGraphile (as dev dependencies only), then add an offline script that generates a schema.graphql file for its own use? That seems like a big mess.
2. Do I update my graphql container so that it always writes out a schema file, and just manually copy that file into my frontend as needed?
3. Do I add an extra process, external to any container, which I'd run manually any time my database schema changed (via an alternative docker-compose file, perhaps) to spin up a subset of my application stack, dump out a GraphQL schema, and exit? If so, is there some way of directing a Docker container to write a file out to the host machine, or is mounting a local directory into the container the best way to accomplish that? (A rough sketch of what I have in mind follows this list.)
4. Is there a better way of structuring my project that avoids this problem in the first place? I'm new to Docker, and I feel like my brain is split between a pre-Docker and a post-Docker mindset.
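To make option 3 concrete, here's a rough sketch of the kind of override file I'm imagining (the graphql and db service names match my docker-compose.yml, but the credentials and paths are placeholders):

    # docker-compose.schema.yml - hypothetical override file
    version: "3"
    services:
      graphql:
        # Bind-mount a host directory so the exported schema lands
        # outside the container, next to the frontend package.
        volumes:
          - ./app/frontend:/schema
        command: >
          postgraphile
          --connection postgres://app_user:app_pass@db:5432/app_db
          --export-schema-graphql /schema/schema.graphql

I'd then run something like docker-compose -f docker-compose.yml -f docker-compose.schema.yml up db graphql to regenerate the schema. One wrinkle: as far as I can tell, PostGraphile writes the file at startup but keeps serving, so it wouldn't exit on its own.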
Upvotes: 0
Views: 356
Reputation: 7666
You can use graphql-cli's get-schema to download the schema file from an endpoint that has introspection enabled. This way you can simply fetch the schema from your remote or local PostGraphile endpoint.
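As a rough sketch (the endpoint URL, port, and output path are assumptions about your setup), you'd describe the endpoint in a .graphqlconfig:

    {
      "schemaPath": "app/frontend/schema.graphql",
      "extensions": {
        "endpoints": {
          "dev": "http://localhost:5000/graphql"
        }
      }
    }

and then, with your PostGraphile endpoint running, download the schema via introspection:

    npx graphql get-schema --endpoint dev

graphql-cli writes the result to whatever schemaPath points at, so you can aim it straight at your frontend package.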
Nevertheless, it can be very beneficial to also have a local PostGraphile instance running that reliably recreates your production schema.
Upvotes: 1