Reputation: 1027
I have a collection of 10 messages sorted by the number of likes each message has. Periodically I update that collection, replacing some of the old messages with new ones that got more likes in the meantime, so that the collection again contains 10 messages sorted by number of likes.
I have an API to insert or remove a message from the collection relative to an existing member message: insert(message_id, relative_to, above_or_bellow)
and remove(message_id)
. I want to minimize the number of API calls by optimizing the positions where I insert new messages, so that the collection is sorted and 10 messages long at the end. (During the process, length and order are irrelevant; only the final state matters.)
I know I can compute the new collection and then replace only the messages that don't match their new position, but I believe this can be optimized further and that algorithms for it already exist.
Edit:
Note the word "periodically": messages do not arrive one by one. In each time interval I collect new messages, sort them, and build a new collection, which I then publish to the site via the API. So I do have 2 collections: one is a simple array in memory, and the other is on the site.
The idea is to reuse the already-inserted messages that should be kept, and their order, in the updated collection, to save HTTP API calls. I believe there are existing algorithms I could reuse to transform the existing collection into the already-known resulting collection with a minimal number of insert and remove operations.
Upvotes: 0
Views: 104
Reputation: 749
First, remove all messages that are no longer among the top 10 liked messages.
To get the most out of the existing list, we should then look for the longest subsequence of messages that is already ordered by their likes. We can use the dynamic-programming algorithm described in How to determine the longest increasing subsequence using dynamic programming?, with the number of likes as the value.
We would then remove all other messages (those not in the subsequence) and insert the missing ones in their proper order.
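A sketch of this plan in Python, assuming the question's hypothetical `insert`/`remove` API. The `plan_updates` function, the `"above"`/`"below"` strings, and the message-id lists are all illustrative, not real API details:

```python
import bisect

def plan_updates(current, desired):
    """Plan the API calls that turn `current` into `desired`.

    current: message ids as they sit on the site (top first).
    desired: the new top-10 ids, sorted by likes (top first).
    Returns (removes, inserts); each insert is a
    (message_id, relative_to, "above"/"below") tuple, mirroring
    the question's hypothetical insert() signature.
    """
    pos = {m: i for i, m in enumerate(desired)}

    # 1. Messages that fell out of the top 10 must go.
    removes = [m for m in current if m not in pos]
    kept = [m for m in current if m in pos]

    # 2. Longest increasing subsequence of the kept messages'
    #    desired positions: those are already in the right relative
    #    order and can stay where they are (patience-sorting LIS).
    seq = [pos[m] for m in kept]
    tails, ends, parent = [], [], [-1] * len(seq)
    for i, v in enumerate(seq):
        j = bisect.bisect_left(tails, v)
        if j == len(tails):
            tails.append(v)
            ends.append(i)
        else:
            tails[j] = v
            ends[j] = i
        parent[i] = ends[j - 1] if j > 0 else -1
    lis = set()
    i = ends[-1] if ends else -1
    while i != -1:
        lis.add(seq[i])
        i = parent[i]
    keep = {m for m in kept if pos[m] in lis}

    # 3. Kept messages that are out of relative order are removed too.
    removes += [m for m in kept if m not in keep]

    # 4. Walk `desired` top to bottom; each missing message is
    #    inserted below the message placed just before it.
    inserts, prev = [], None
    for m in desired:
        if m in keep:
            prev = m
            continue
        if prev is None:
            # Nothing placed above yet: go above the first surviving
            # message. (A None anchor means the site collection is
            # empty and would need some bootstrap insert instead.)
            anchor = next((x for x in desired if x in keep), None)
            inserts.append((m, anchor, "above"))
        else:
            inserts.append((m, prev, "below"))
        prev = m
    return removes, inserts
```

Executing the removes first and then the inserts in the returned order keeps every `relative_to` anchor valid, since each missing message is anchored to one that is already in place.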
Upvotes: 1
Reputation: 14307
I think you only need to keep one list/vector of messages, keep it sorted at all times, and update it with every new message.
Since this collection is always sorted, and assuming it supports random access, you could use binary search to find the insertion point, i.e. O(log M) where M is your maximum list size, e.g. 10. But the insertion itself still requires O(M) to shift the elements. Therefore, I would just use a linked list and iterate it while the message to insert (or update) has fewer likes than the current one.
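A minimal sketch of that front-to-back walk, assuming messages are dicts with an integer `likes` field; the `insert_message` helper and the field names are invented for illustration (a Python list stands in for the linked list):

```python
def insert_message(messages, new_msg, max_size=10):
    """Keep `messages` sorted by likes, descending, capped at max_size.

    Walks the list front to back, as one would a linked list, and
    inserts the new message before the first entry with fewer likes.
    """
    for i, m in enumerate(messages):
        if new_msg["likes"] > m["likes"]:
            messages.insert(i, new_msg)
            break
    else:
        # Fewer likes than everything present: goes at the end.
        messages.append(new_msg)
    del messages[max_size:]  # drop anything past the top-10 cutoff
    return messages
```

With the in-memory list maintained this way, each periodic publish only has to diff it against the collection on the site.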
Upvotes: 1