Reputation: 177
I'm looking for options for our eventsourcing solution. I received recommendations to use Eventbridge Archive since we are already using AWS Eventbridge.
From what I found about the Archive feature, it can store all the events that we send to the event bus and can replay them later based on a filter.
The problem is that I couldn't find a way to read data from this Archive without replaying the events.
Does anyone know if there is an API that allows me to do that?
Everything I can find on Google covers creating archives and replaying events, but nothing about reading the data stored in the Archive.
I found this other question but it didn't have any answers.
Upvotes: 5
Views: 5561
Reputation: 20621
(Disclaimer: this answer is informed only by a quick skim of the Eventbridge docs)
Event sourcing is a bit of an overloaded term: it's been used to mean everything from an architecture built on exchanging events to using persisted events as the source of truth.
In the latter usage of the term (which might be called "going full event sourcing"), strictly speaking, all reads are replays of an event stream, with there being multiple event streams and the same event often being written to multiple event streams (e.g. a "cart checked out" event may well be written to an event stream for that particular cart and to an event stream consisting only of "cart checked out" events).
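To make the multiple-streams idea concrete, here is a minimal sketch (all names hypothetical, not from any AWS API) of a function that decides which logical streams a given event should be written to. In an EventBridge setup, each stream name would map to an event bus name passed to `PutEvents`:

```python
def fan_out(event: dict) -> list[tuple[str, dict]]:
    """Return (stream, event) pairs: every event goes to its entity's
    stream, and some event types also go to type-specific streams.
    Stream and field names here are illustrative only."""
    streams = [f"cart-{event['cart_id']}"]   # per-entity stream
    if event["type"] == "CartCheckedOut":
        streams.append("cart-checkouts")     # per-event-type stream
    return [(stream, event) for stream in streams]

# A checkout event lands on both its cart's stream and the stream
# consisting only of checkout events.
event = {"type": "CartCheckedOut", "cart_id": "42", "total_cents": 1999}
for stream, evt in fan_out(event):
    print(stream, evt["type"])
```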
EventBridge and EventBridge Archive would appear to be sufficient to "go full event sourcing": each of these event streams becomes an event bus. It might not be ergonomic to create new buses on the fly, and the mechanism for publishing the same event to multiple streams may be complex (e.g. a service consuming base events and continuously projecting them onto other streams). This also assumes you will have somewhat fewer than 100 entities, since AWS accounts are limited to 100 event buses by default. If you have more than 100 entities, then EventBridge isn't a fit for event sourcing.
That said, event sourced systems almost universally have some read functionality which doesn't map well to replaying a stream (e.g. ad hoc queries/aggregations over the events). This is where Command Query Responsibility Segregation (CQRS) comes in: under CQRS it's OK for reads to be served from data models other than the event streams themselves.
The command-processing/write side of an application often benefits from event sourcing (especially if there's infrastructure to help with some combination of concurrency control/caching/maintaining single-writer). You then project event streams into other data models (e.g. databases or search systems or alerting infrastructure) which can be tuned for the queries that will be performed against them.
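A minimal sketch of such a projection, as a pure fold over an event stream (event types and field names are made up for illustration):

```python
def project(events: list[dict]) -> dict:
    """Fold an event stream into a query-friendly read model:
    here, a dict of cart_id -> current cart state."""
    carts: dict = {}
    for evt in events:
        cart = carts.setdefault(evt["cart_id"],
                                {"items": [], "checked_out": False})
        if evt["type"] == "ItemAdded":
            cart["items"].append(evt["sku"])
        elif evt["type"] == "CartCheckedOut":
            cart["checked_out"] = True
    return carts

events = [
    {"type": "ItemAdded", "cart_id": "42", "sku": "ABC"},
    {"type": "ItemAdded", "cart_id": "42", "sku": "DEF"},
    {"type": "CartCheckedOut", "cart_id": "42"},
]
print(project(events))
```

The point is that the read model (a key-value map here; a relational table or search index in practice) answers queries directly, without replaying the stream on every read.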
So your question is perhaps better expressed as "how do I do CQRS in an event sourced system", and thankfully, there's a well-established body of technique for that. In your case, sending all the events in (at least some of) the streams to a Lambda which writes them to a database is likely to be a reasonable way to support CQRS.
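As a rough sketch of that Lambda (the envelope keys `detail-type`, `time`, and `detail` are part of the standard EventBridge event structure; the record schema and the `write` callable are hypothetical stand-ins for a real sink such as a DynamoDB `put_item`):

```python
def handler(event: dict, context=None, write=print) -> dict:
    """Hypothetical Lambda handler: EventBridge delivers an envelope
    whose domain payload sits under "detail"; we flatten it into a
    record shaped for a key-value read model."""
    detail = event["detail"]
    record = {
        "pk": f"cart#{detail['cart_id']}",   # partition key (illustrative)
        "sk": event["time"],                 # sort by event timestamp
        "type": event["detail-type"],
        "payload": detail,
    }
    write(record)  # in practice: a DynamoDB put_item or SQL insert
    return record

# Abridged EventBridge envelope shape:
sample = {
    "detail-type": "CartCheckedOut",
    "time": "2023-01-01T00:00:00Z",
    "detail": {"cart_id": "42"},
}
handler(sample)
```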
Upvotes: 1