user1592579

JSON / jQuery performance with lots of data

This is my first question on this website, and I'm hoping someone with experience can give me some advice.

I have built a JavaScript/jQuery app for a client, and its key feature is the filtering mechanism for certain items. There are around 2000 of those items, which are stored in a list like this:

<ul>
<li id='3000429'>Item</li>
<li id='3000430'>Item</li>
<li id='3000431'>Item</li>
<li id='3000432'>Item</li>
</ul>

So that's around 2000 lines of HTML just for the list. I've removed the onClick handlers from the <li> elements for demonstration purposes. The HTML list, with the appropriate id for each item, is generated from the MySQL db via PHP on page load.

What I do now is filter my results from the database and hide all the items first; then, once I get the calculated results back from PHP via ajax (an array of ids), I display only the items whose id is in the results array (using jQuery's .show() and .hide()). So all the items are present in the code all the time, it's just that some are hidden.
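To make it concrete, the current filtering looks roughly like this (a simplified sketch; filter.php and the #items wrapper are just placeholder names):

// Hide every <li>, then re-show the ones whose id came back from PHP.
function applyFilter(criteria) {
   $.getJSON('filter.php', criteria, function (ids) {      // ids: array of item ids
      $('#items li').hide();                               // hide all ~2000 items
      $.each(ids, function (i, id) {
         $(document.getElementById(id)).show();            // getElementById avoids selector issues with numeric ids
      });
   });
}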

Would it be better to use JSON instead: completely remove all the items from the HTML list when the results come back, and then generate new HTML items for just the filtered results using JSON objects and jQuery?
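What I have in mind is something like this rough sketch (again, filter.php, the response shape, and the #items wrapper are made up):

// Throw away the old <li> elements and rebuild the list from the returned JSON.
function rebuildList(criteria) {
   $.getJSON('filter.php', criteria, function (items) {    // items: [{ id: 3000429, name: 'Item' }, ...]
      var html = '';
      $.each(items, function (i, item) {
         html += '<li id="' + item.id + '">' + item.name + '</li>';
      });
      $('#items ul').html(html);                           // one DOM write instead of 2000 show/hide calls
   });
}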

I am asking because I know that some browsers don't take the stress of a lot of html very well (IE especially...).

When I get a lot of results back, say more than 1000, all browsers tend to lag a bit, probably because they are going through all the items and re-displaying them one by one (.show()). If the user triggers the filtering at short intervals, the user experience sucks.

So, from a performance point of view, do you guys think I would be better off loading JSON and constantly generating and deleting HTML, or sticking with the way I have it set up now (the show/hide method)? I'm new to JSON so I'm not quite familiar with the performance aspects of it when it comes to a lot of data.

Thanks in advance, I appreciate it!

Upvotes: 3

Views: 1231

Answers (4)

Jason Sperske

Reputation: 30446

I have an HTML+JavaScript web app that has a page containing as many as 50,000 rows, each with unique data. In the original code that I inherited, the server would generate the full markup (approximately 30MB of HTML) and then generate a filtered list (again on the server) every time you changed a filter constraint. The performance was so bad that the page would default to the most filtered (and least useful) view and include a warning that expanding the results only worked on modern computers. Here is how I fixed it and why I made each choice.

First, to address the volume of data being sent, I created a lean JSON representation (take a look at the Google Charts Datasource API for a solid example of this). Combined with GZip compression, this reduced the data down to about 1 MB.

The next problem was that a table with 50,000 rows (each with 4 cells) creates 250,000 DOM nodes (each <tr> and <td> is a node), which is a huge burden on selector engines and the rendering pipeline. To fix this I fetched the JSON via AJAX and stored its results in a variable outside the AJAX fetch function. Then I wrote code to use this cached data and render it in pages (so page change events don't go back to the server). I played with the best number of rows per page (500 seemed the most compatible for my needs). As for hiding versus rebuilding, I say rebuild: the performance gains of hiding specific cells are quickly lost as the number of things you are hiding increases.
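A minimal sketch of that caching/paging idea (the endpoint, field names, and markup here are placeholders, not the real app):

// Fetch the JSON once, keep it outside the AJAX callback, and render one page at a time.
var cachedRows = [];             // filled once, reused by every page change
var PAGE_SIZE = 500;             // the row count that worked best for me

function loadData() {
   $.getJSON('report-data.php', function (rows) {
      cachedRows = rows;         // e.g. [{ id: 1, name: '...', value: '...', extra: '...' }, ...]
      renderPage(0);
   });
}

function renderPage(pageIndex) {
   var start = pageIndex * PAGE_SIZE;
   var html = '';
   $.each(cachedRows.slice(start, start + PAGE_SIZE), function (i, row) {
      html += '<tr><td>' + row.id + '</td><td>' + row.name + '</td><td>' +
              row.value + '</td><td>' + row.extra + '</td></tr>';
   });
   $('#report tbody').html(html);   // page changes never go back to the server
}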

Finally, because not every node is visible on the page at once, things like Ctrl+F to find are broken. For this I created a filtered render function so table filtering can be handled with an in-page UI (which lets me do things like find based on specific columns, or even on data that isn't rendered in the table but is passed over in the JSON). The other challenge is printing, again because no single page has a visual representation of everything. To address this I did something a little more complicated: using Memcached I stored the filtered results on the server and added an export-to-PDF button that takes the cached copy of the result set (you have to adjust Memcached to store things larger than 1MB). From a user's perspective, the PDF (which is generated by a server that I can control) can take more punishment than a browser that I don't control.
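The filtered render is just a variation over the same cached data (column names here are invented; it reuses cachedRows and PAGE_SIZE from the sketch above):

// In-page filtering over the cached rows: this replaces Ctrl+F, and it can match
// against fields that are in the JSON but never rendered as a visible column.
function applyColumnFilter(columnName, searchText) {
   var needle = String(searchText).toLowerCase();
   var matches = $.grep(cachedRows, function (row) {
      return String(row[columnName]).toLowerCase().indexOf(needle) !== -1;
   });
   var html = '';
   $.each(matches.slice(0, PAGE_SIZE), function (i, row) {
      html += '<tr><td>' + row.id + '</td><td>' + row.name + '</td><td>' +
              row.value + '</td><td>' + row.extra + '</td></tr>';
   });
   $('#report tbody').html(html);
}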

In the end I'm very satisfied with the results. The page loads in seconds now, users like that their browsers don't crash, and the solution is generic enough that it is easy to apply to all of our other reports.

Upvotes: 0

dqhendricks

Reputation: 19251

Instead of filtering this on the client side, why not just generate the list on the server side and send it to the client?

html

<div id="bigAssList">
   <ul>
      <li></li>
      <!-- A billion rows... -->
   </ul>
</div>

js

$.ajax( {
   url: 'someurl.php',
   data: someFilterCriteria,
   success: function( response ) {
      $( '#bigAssList' ).html( response );
   }
} );

php

// query based on filter criteria
// output list in ul html form

And even better than that, paginate your data. I doubt anyone needs 2000 rows on one page. And why use Ajax to do this at all? Wouldn't a submit/page refresh be just as effective? You might be unnecessarily complicating this.
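If you do paginate, the same AJAX call just grows a couple of parameters (page and perPage are made-up names; the PHP side would turn them into a LIMIT ... OFFSET ... clause):

$.ajax( {
   url: 'someurl.php',
   data: $.extend( {}, someFilterCriteria, { page: 2, perPage: 100 } ),
   success: function( response ) {
      $( '#bigAssList' ).html( response );   // server returns only one page of <li> markup
   }
} );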

Upvotes: 0

tonino.j

Reputation: 3871

You should use an additional library, like the backbone & underscore stack (underscore has phenomenal filtering & other utilities, plus templating, and the stack is perfect for your purpose), or you should consider angular.js, which also operates at a much higher level on lists, objects, data & html. In angular you can achieve complex filtering & dynamic html updating with probably 2-3 lines of code.

In any case, regardless of performance, don't do it with html <li> items, do it with data. The libraries I suggested will make it much simpler & faster.
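For example, with underscore alone the filter-and-rerender step is only a few lines (a sketch; the data shape and the #items container are assumptions):

// Keep the items as data, filter with _.filter, and rebuild the list with a template.
var items = [
   { id: 3000429, name: 'Item' }
   // ...the other ~2000 items, loaded as data instead of <li> markup
];
var itemTemplate = _.template('<li id="<%= id %>"><%= name %></li>');

function render(matchingIds) {
   var visible = _.filter(items, function (item) {
      return _.contains(matchingIds, item.id);
   });
   $('#items ul').html(_.map(visible, itemTemplate).join(''));
}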

Worth learning.

Upvotes: 1

wezzy

Reputation: 5935

As a general tip for performance, try to reduce DOM manipulation. If you generate new HTML code and change the content of your container tag using $().html(), it's a lot faster than traversing 2000 tag elements and changing the visibility of 1000 of them.

You can also listen for the click event on the container itself (event delegation), avoiding 2000 separate event listeners.
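A quick sketch of that second point, using a delegated handler (assumes jQuery 1.7+ for .on(); the #items container id is just an example):

// One listener on the container handles clicks for every <li>,
// including items that are re-created later via $().html().
$('#items').on('click', 'li', function () {
   var clickedId = this.id;   // id of whichever item was clicked
   // ...do whatever the old per-item onClick handlers did
});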

Hope this helps

Upvotes: 0
