Reputation: 131
I have a question regarding the suggested implementation in the Binance documentation. The guidelines are available at the link: How to manage a local order book correctly
If I need a constant stream of @depth data, why do I need the first four steps they suggest? Why would I buffer the stream first and then take a snapshot, just to determine which data to throw away before continuing to listen to the stream? I don't understand the logical need for those steps, or whether they are even needed for my use case (tracking real-time order book data).
Upvotes: 4
Views: 5322
Reputation: 398
If you take a snapshot and then start listening to the stream, you may miss an event between getting the snapshot and starting the stream. That would make your local order book invalid (and you definitely don't want that in a trading application).
The idea behind taking the snapshot after you start buffering is that you are guaranteed to have every event that occurred after your snapshot. A side effect of this approach is that you may also have buffered some events from before the snapshot, so you discard the few (if any) you don't need by comparing their update IDs against the snapshot's lastUpdateId.
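To make the discard step concrete, here is a minimal sketch in Java. The DepthEvent class and all names here are assumptions for illustration, not Binance's API; the logic assumed is that each diff-depth event carries a first and final update ID ("U" and "u" in the payload), and any buffered event whose final update ID is at or below the snapshot's lastUpdateId is already reflected in the snapshot and can be dropped.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class OrderBookSync {

    // Minimal stand-in for a @depth stream event (hypothetical class).
    static class DepthEvent {
        final long firstUpdateId; // "U" in the Binance payload
        final long finalUpdateId; // "u" in the Binance payload

        DepthEvent(long firstUpdateId, long finalUpdateId) {
            this.firstUpdateId = firstUpdateId;
            this.finalUpdateId = finalUpdateId;
        }
    }

    // Events received on the WebSocket while we wait for the snapshot.
    private final Deque<DepthEvent> buffer = new ArrayDeque<>();

    // Step 1-2 of the guide: open the stream first and buffer everything.
    void onStreamEvent(DepthEvent event) {
        buffer.addLast(event);
    }

    // Called once the REST snapshot arrives. Drops every buffered event
    // that is already contained in the snapshot, leaving only the events
    // that happened after it, which can then be applied in order.
    void onSnapshot(long snapshotLastUpdateId) {
        while (!buffer.isEmpty()
                && buffer.peekFirst().finalUpdateId <= snapshotLastUpdateId) {
            buffer.removeFirst(); // already reflected in the snapshot
        }
        // Apply the remaining buffered events to the local book here,
        // then keep applying live events as they arrive.
    }
}
```

The key point the sketch illustrates: because buffering started before the snapshot request, there is no gap in which an event can be missed; at worst there is overlap, and overlap is cheap to detect and discard.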
I'm not sure what language you're using to manage yours, but if you want a Java implementation, let me know and I'll push mine to GitHub so you can use it.
Upvotes: 5