Reputation:
On my page I have some logic that searches through a large array of items by a given term.
The items are a large list of data held on the client side.
For example, when I search 'user' it filters out 'user.edit', 'user.save' and so on.
When the list gets very large, filtering blocks the search input.
When I type 'a' it starts searching, and anything I type after that only gets rendered once filtering is complete.
I want the data to be on the client side for a couple of reasons.
Is there a good way to solve this issue?
My current suggestions are these:
1) Filter the items in batches of 2000 (or 5000, whatever makes sense).
Filter the first 2000 records and show the filtered results, then filter the next 2000, show them, and so on until all items are iterated.
2) Using setTimeout() for each batch - but this could cause an issue because I don't know how long each batch will take to filter.
3) Using setImmediate - "This method is used to break up long-running operations" - it may be a solution, but it is non-standard, and I don't know if it is going to break sometime in the future.
4) Using promises somehow - I think it will be hard with promises because the filtering code is synchronous (it uses indexOf, for example), so with or without promises it will still block the UI.
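To sketch what I mean in 1) and 2) - this is only an illustration, the names `items`, `term` and `onResults` are placeholders, not my real code:

```javascript
// Filter in fixed-size batches, yielding to the UI between batches via
// setTimeout so keystrokes and partial results can render in between.
function batchedFilter(items, term, onResults, batchSize = 2000) {
  let index = 0;
  const matches = [];

  function processBatch() {
    const end = Math.min(index + batchSize, items.length);
    for (; index < end; index++) {
      if (items[index].indexOf(term) !== -1) {
        matches.push(items[index]);
      }
    }
    const done = index >= items.length;
    onResults(matches.slice(), done); // show partial results so far
    if (!done) {
      setTimeout(processBatch, 0); // yield to the event loop
    }
  }

  processBatch();
}
```

The worry in 2) still applies: the pause between batches depends on how long one batch takes, which varies with the data.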
Can you recommend something? I would like to avoid large libraries and web workers.
Thank you.
Upvotes: 0
Views: 46
Reputation: 824
All of 1)-4) are valid points, but mainly the optimization of the search depends on your implementation. For example, if you search for strings containing a given query, you could build a suffix tree (https://www.geeksforgeeks.org/pattern-searching-using-suffix-tree/) to decrease the complexity.
Also, if you are re-searching through that array every time the user types a letter, I would debounce (https://underscorejs.org/#debounce) the search function so it executes only once the user stops typing.
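A minimal debounce sketch, the same idea as the underscore helper linked above (the 250ms wait and the `runFilter`/`searchInput` names in the usage comment are just examples):

```javascript
// Returns a wrapped function that only fires once calls have stopped
// arriving for `wait` milliseconds; earlier pending calls are cancelled.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}

// Usage: attach to the input instead of filtering on every keystroke.
// const onInput = debounce(term => runFilter(term), 250);
// searchInput.addEventListener('input', e => onInput(e.target.value));
```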
Upvotes: 0
Reputation: 1248
This does sound like a good use case for web workers. Since they run off the main thread, the search would not block user interaction.
If I understand correctly, the data is already loaded and it is searching the large dataset that is causing the delay. If so:
I think the general answer to your question is to use better data structures and algorithms to reduce the complexity.
Assuming the term does not need to match exactly, but the items simply need to "start with" it:
You could store the data in a trie, walk down the tree to the node for the query prefix, and return all of its children.
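A minimal trie sketch of that idea (built once up front; collecting matches then costs roughly the prefix length plus the number of hits, instead of scanning the whole array per keystroke):

```javascript
// Each node is a plain object keyed by character; '$' marks a stored item.
class Trie {
  constructor() { this.root = {}; }

  insert(word) {
    let node = this.root;
    for (const ch of word) node = node[ch] ?? (node[ch] = {});
    node.$ = word; // end-of-word marker holding the full item
  }

  startsWith(prefix) {
    let node = this.root;
    for (const ch of prefix) {
      node = node[ch];
      if (!node) return []; // no item has this prefix
    }
    const out = [];
    const walk = n => {
      if (n.$) out.push(n.$);
      for (const k of Object.keys(n)) if (k !== '$') walk(n[k]);
    };
    walk(node); // collect every item under the prefix node
    return out;
  }
}
```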
If the data is ordered, you could implement a variation of binary search to look for the index range of matching elements.
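A sketch of that binary-search variation: in a sorted array, all entries starting with a given prefix form one contiguous range, and both of its bounds can be found in O(log n). The `'\uffff'` sentinel trick assumes the items are ordinary strings.

```javascript
// Returns the slice of `sorted` whose elements start with `prefix`.
function prefixRange(sorted, prefix) {
  // Index of the first element >= target (classic lower bound).
  const lowerBound = target => {
    let lo = 0, hi = sorted.length;
    while (lo < hi) {
      const mid = (lo + hi) >> 1;
      if (sorted[mid] < target) lo = mid + 1; else hi = mid;
    }
    return lo;
  };
  const start = lowerBound(prefix);
  // '\uffff' sorts after any ordinary character, so prefix + '\uffff'
  // is an upper bound for every string starting with the prefix.
  const end = lowerBound(prefix + '\uffff');
  return sorted.slice(start, end);
}
```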
If the issue is handling the large dataset itself, then yes, loading it progressively would be best. APIs, for example, usually return a next-page token that you use to request the following page. You could do something similar: load a batch, complete the processing, and when it finishes, run the same operations on the next batch.
Upvotes: 1