Reputation: 780
I am trying to understand whether the concept of Consumer-Driven Contracts (CDC) is still valid in the context of a Kafka-based Event-Driven Architecture.
In the RESTful world, Consumer-Driven Contract verification is vital, because there is no schema-evolution-based compatibility testing happening at runtime. A producer is free to change its schema and deploy without notifying consumers, and if the APIs are not versioned properly and the producer has not paid attention to how consumers actually use the data, that change may break those consumers. So in my opinion CDC is the best way to ensure that producers stay consumer-compliant as they evolve their schemas.
Now, focusing on Kafka-based EDA, the Schema Registry plays a bigger role in consumer-producer relationships. It acts as a coordinator between consumers and producers, helping them evolve schemas under whatever compatibility level you choose, so that producers won't break consumers, or at least so that it is very difficult to do so. With that trust in place, I am trying to understand whether CDC is still applicable to Kafka / Schema Registry driven microservice architectures.
Referring to: https://docs.confluent.io/cloud/current/client-apps/testing.html#schema-management-and-evolution
"After your applications are running in production, schemas may evolve but still need to be compatible for all applications that rely on both old and new versions of a schema. Confluent Schema Registry allows for schema evolution and provides compatibility checks to ensure that the contract between producers and consumers is not broken."
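To make concrete what that compatibility check amounts to, here is a minimal sketch using the plain Avro library (the record and field names are made up; the registry's default BACKWARD mode is roughly this reader/writer resolution check):

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;
import org.apache.avro.SchemaCompatibility.SchemaPairCompatibility;

public class CompatibilityCheck {
    public static void main(String[] args) {
        // Old (v1) writer schema, and a new (v2) schema that adds a field
        // with a default value -- a backward-compatible change.
        Schema v1 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"OrderPlaced\",\"fields\":["
          + "{\"name\":\"orderId\",\"type\":\"string\"}]}");
        Schema v2 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"OrderPlaced\",\"fields\":["
          + "{\"name\":\"orderId\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\",\"default\":0.0}]}");

        // BACKWARD compatibility question: can a consumer on the new schema
        // (reader) still read data written with the old schema (writer)?
        SchemaPairCompatibility result =
            SchemaCompatibility.checkReaderWriterCompatibility(v2, v1);
        System.out.println(result.getType()); // COMPATIBLE
    }
}
```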
Update: Furthermore, in Kafka we mostly use schema-oriented serialization formats such as Avro, and support for such formats within contract-testing frameworks seems limited too.
Upvotes: 3
Views: 1346
Reputation: 1054
Schema Registry, when used properly, "guarantees" that the consumer can deserialise the message that the producer serialised. That is all.
"Producer driven contract (PDC) testing" should be viewed as an extra semantic layer, that when used properly "guarantees" that the consumer actually "understands" the de-serilised message and can react appropriately (e.g. deliver business value).
If you think about it, with proper contract testing you do not need Avro schema evolution at all, because the consumer contract tests drive which fields are necessary. If no consumer needs a field (i.e. no consumer contract test fails), then that field can be deleted by the producer. You still have to make consumers flexible enough to deserialise every (e.g. JSON) message from the "beginning of time", though, if you keep events forever and want to be able to read from offset 0.
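As a minimal sketch of what I mean (the schemas and field names are made up), the consumer's reader schema names only the fields that consumer actually uses, and a contract-style test simply reads the producer's payload through it:

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class ConsumerContractSketch {
    public static void main(String[] args) throws Exception {
        // The producer's (writer) schema still carries "legacyField".
        Schema writer = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"OrderPlaced\",\"fields\":["
          + "{\"name\":\"orderId\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"},"
          + "{\"name\":\"legacyField\",\"type\":\"string\"}]}");
        // The consumer's (reader) schema names only the fields it actually
        // uses; this is, in effect, that consumer's contract.
        Schema reader = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"OrderPlaced\",\"fields\":["
          + "{\"name\":\"orderId\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        GenericRecord produced = new GenericData.Record(writer);
        produced.put("orderId", "o-1");
        produced.put("amount", 9.99);
        produced.put("legacyField", "nobody reads this");

        // Serialise with the writer schema, as the producer would.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(writer).write(produced, enc);
        enc.flush();

        // Deserialise with the narrow reader schema: if this keeps passing for
        // every consumer, the producer is free to drop "legacyField".
        Decoder dec = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord consumed =
            new GenericDatumReader<GenericRecord>(writer, reader).read(null, dec);
        System.out.println(consumed.get("orderId") + " / " + consumed.get("amount"));
    }
}
```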
Especially if you are doing event-based microservices that emit/consume Avro events, I do not see any safe way to deploy your microservices independently just by using a Schema Registry and without doing PDC. Usually companies end up building a distributed monolith where all services are released at the same time and tested via time-consuming integration tests, or they release and deploy only the service that changed but still run those time-consuming integration tests.
EDIT: In my case I ended up adding support for Avro-based contracts. Take a look here: github.com/vspiliop/kafka-contract-test-producer and https://github.com/vspiliop/kafka-contract-test-consumer
Upvotes: 2
Reputation: 191983
By default, sure, but the registry checks can be disabled, or reversed so that only forward-compatible changes are enforced, with no backward-compatibility guarantee.
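For example, assuming the Confluent Java client (the registry URL and subject name here are placeholders), compatibility can be relaxed or changed per subject:

```java
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class RelaxCompatibility {
    public static void main(String[] args) throws Exception {
        // Placeholder registry URL and subject name.
        SchemaRegistryClient registry =
            new CachedSchemaRegistryClient("http://localhost:8081", 100);

        // NONE would switch the check off for this subject entirely.
        // FORWARD only guarantees that consumers on the old schema can read
        // data written with the new schema, not that a consumer on the new
        // schema can still read old data (e.g. when replaying from offset 0).
        registry.updateCompatibility("orders-value", "FORWARD");
        System.out.println(registry.getCompatibility("orders-value"));
    }
}
```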
It's still recommended to unit-test your consumer apps against expected inputs from producers, regardless of a gate-keeping registry (which is not a required property of any producer).
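A sketch of such a test, assuming the Confluent Avro serde with an in-memory MockSchemaRegistryClient (the topic, schema, and handler are made up):

```java
import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import java.util.Map;

public class ConsumerUnitTestSketch {
    public static void main(String[] args) {
        // In-memory registry shared by the serializer and deserializer.
        MockSchemaRegistryClient registry = new MockSchemaRegistryClient();
        Map<String, Object> cfg = Map.of("schema.registry.url", "mock://unit-test");

        KafkaAvroSerializer serializer = new KafkaAvroSerializer(registry, cfg);
        KafkaAvroDeserializer deserializer = new KafkaAvroDeserializer(registry, cfg);

        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"OrderPlaced\",\"fields\":["
          + "{\"name\":\"orderId\",\"type\":\"string\"}]}");
        GenericRecord expectedInput = new GenericData.Record(schema);
        expectedInput.put("orderId", "o-1");

        // Round-trip the payload the way a producer would emit it ...
        byte[] wireBytes = serializer.serialize("orders", expectedInput);
        GenericRecord received =
            (GenericRecord) deserializer.deserialize("orders", wireBytes);

        // ... then feed it to whatever handler the consumer uses and assert
        // on the behaviour, e.g. handler.onOrderPlaced(received) in a real test.
        System.out.println(received.get("orderId"));
    }
}
```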
Upvotes: 1