Reputation: 641
In my event sourcing model, I have an aggregate modeled after a warehouse. For this I have commands such as CreateBox and ChangeBoxLocation.
I have implemented the CreateBox command such that it has a 1:n relationship with events. That is, a valid CreateBox command will dispatch two events, BoxCreated and BoxLocationChanged, because an added box should be moved to a location.
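The 1:n relationship between a command and its events can be sketched as a command handler that returns a list of events. This is a minimal sketch; the event classes and handler name are hypothetical, not taken from the question's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical event types, named after the events in the question.
@dataclass(frozen=True)
class BoxCreated:
    box_id: str

@dataclass(frozen=True)
class BoxLocationChanged:
    box_id: str
    location: str

def handle_create_box(box_id: str, location: str) -> list:
    """A valid CreateBox command dispatches two events (1:n):
    the box is created, then immediately moved to its location."""
    return [BoxCreated(box_id), BoxLocationChanged(box_id, location)]
```

The handler stays pure (it only returns events), which keeps it easy to test: feed in a command, assert on the event list.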
I noticed that the users of the system actually add boxes in bulk. For example, 300 boxes of the same type may come in, and they scan them all into the system at once instead of scanning each one individually.
This led me to the idea of implementing a CreateBoxesInBulk command which, if valid, would dispatch BoxCreated and BoxLocationChanged N times, where N is the number of boxes scanned in bulk.
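The bulk command is then just a loop over the same event pair. A minimal sketch (function and event names are illustrative, not from the actual system):

```python
def handle_create_boxes_in_bulk(box_ids, location):
    """A valid CreateBoxesInBulk command dispatches BoxCreated and
    BoxLocationChanged once per box, so N boxes yield 2N events."""
    events = []
    for box_id in box_ids:
        events.append(("BoxCreated", box_id))
        events.append(("BoxLocationChanged", box_id, location))
    return events
```

So scanning 300 boxes produces a single command but 600 events, which is where the snapshotting question later in the post becomes relevant.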
My question is simply, is this a valid approach?
Additionally: what potential complications could this cause? I am thinking the complexity of this command could grow, and perhaps implementing snapshot optimization for the aggregate would be beneficial.
Upvotes: 1
Views: 43
Reputation: 57239
is this a valid approach?
Yes - there's nothing wrong with deriving N events from a single command.
What potential complications could this cause?
Figuring out the right groupings of your events.
Writing N events into a single document/stream will normally be fine.
Writing N events into N documents/streams is fine, if you can treat the write of each event as an independent thing (there are no invariants to maintain, or if it is OK to lose the events that are not consistent with the invariant of their own individual document).
Trying to manage an invariant across N different documents as a set is painful. A data race happens, every event but the last is written successfully, and the last write fails. Now what?
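The difference between the two groupings can be illustrated with a minimal in-memory sketch (all names hypothetical): appending N events to one stream is all-or-nothing under an expected-version check, while appending to N streams is N independent writes with no cross-stream transaction.

```python
class Conflict(Exception):
    pass

class Stream:
    """In-memory event stream. An append is all-or-nothing for this
    stream, guarded by an expected-version check (optimistic concurrency)."""
    def __init__(self):
        self.events = []

    def append(self, new_events, expected_version):
        if len(self.events) != expected_version:
            raise Conflict("stream has moved since it was read")
        self.events.extend(new_events)  # commits every event or none

# One stream for the whole batch: the 2N bulk events commit together.
warehouse = Stream()
warehouse.append(
    [("BoxCreated", "b1"), ("BoxLocationChanged", "b1", "dock")],
    expected_version=0,
)

# One stream per box: each box's write succeeds or fails on its own;
# a failure in one stream leaves the others already committed.
boxes = {bid: Stream() for bid in ("b1", "b2")}
for bid, stream in boxes.items():
    stream.append([("BoxCreated", bid)], expected_version=0)
```

In the per-box layout, a conflict on one stream cannot roll back the writes that already landed in the other streams, which is exactly the "now what?" above.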
Upvotes: 1