Reputation: 428
In my company, we have a system organized as microservices, with a dedicated git repository per service. We would like to introduce gRPC, and we were wondering how to share protobuf files and build libraries for our various languages. Based on the examples we collected, we decided in the end to go with a single repository holding all our protobuf files: it seems to be the most common approach, and it looks easier to maintain and use.
I would like to know if you have examples on your side. Do you have counter-examples of companies doing the exact opposite, i.e. hosting protobuf files in a distributed way?
Upvotes: 17
Views: 3498
Reputation: 5815
We have a distinct repo for the proto files (called schema) and a separate repo for each microservice. We also never store generated code: server and client files are generated from scratch by protoc during every CI build.
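As an illustration, here is a minimal sketch of what that generation step could look like for a Go service. The schema checkout path, proto file, and package name are made up, and it assumes protoc plus the protoc-gen-go and protoc-gen-go-grpc plugins are available on the CI image:

    // generate.go is hand-written and committed; the *.pb.go files it
    // produces are not. CI runs `go generate ./...` before every build,
    // so client and server code is regenerated from the proto files in
    // the schema checkout each time.
    package userpb

    //go:generate protoc -I ../schema --go_out=. --go_opt=paths=source_relative --go-grpc_out=. --go-grpc_opt=paths=source_relative user/v1/user.proto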
This approach works and fits our needs well, but there are two potential pitfalls:
1. Keeping the schema repo and the microservice repos in sync. Commits to two different git repos are not atomic, so whenever schema is updated there is always a short window during which schema has changed but a microservice's repo has not yet.
2. With Go, there is a potential problem around the move to Go modules introduced in Go 1.11. We haven't done comprehensive research on it yet.
Upvotes: 7
Reputation: 7642
Each of our microservices has its own API (one or several protobuf files), and each API lives in its own repository. We also have a CI job that builds the generated proto classes into a jar (and not only for Java but for other languages too) and publishes it to our central artifact repository. Then you just add a dependency on the API you need.
For example, for microservice A we also have a repository a-api (containing only the proto files), which the CI job builds into a jar (and into the other languages) published as com.api.a-service.<version>
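Since the same CI job publishes artifacts for other languages too, here is a rough sketch of what depending on such a published API could look like from Go; the module path, service, and message names (AService, GetA, GetARequest) are purely hypothetical:

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"

        // Hypothetical package generated from the a-api repo and published
        // per language by the CI job, alongside the com.api.a-service jar.
        apb "example.com/a-api/gen/go/a/v1"
    )

    func main() {
        // Connect to microservice A and call it through the generated client stub.
        conn, err := grpc.Dial("a-service:50051", grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatalf("dial: %v", err)
        }
        defer conn.Close()

        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        client := apb.NewAServiceClient(conn)
        resp, err := client.GetA(ctx, &apb.GetARequest{Id: "42"})
        if err != nil {
            log.Fatalf("GetA: %v", err)
        }
        log.Printf("response: %v", resp)
    }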
Upvotes: 3