Reputation: 175
I have a global array containing the IDs of elements that I am currently working with. Every second I run a routine that does stuff to these elements.
var ids = ['abc', 'def', 'zyx'];
// the following code happens every second
for (var i = 0; i < ids.length; i++) {
    var el = $("#" + ids[i]);
    // do stuff with el
}
My question: would I see a notable performance hit or improvement if I did the following instead:
var ids = [];
ids.push($("#abc"));
ids.push($("#def"));
ids.push($("#zyx"));
for (var i = 0; i < ids.length; i++) {
    var el = ids[i];
    // do stuff with el
}
Upvotes: 1
Views: 329
Reputation: 149564
You don’t have to guess — why not create a jsPerf test case?
Anyhow, this change would greatly improve performance. There’s no reason to reconstruct jQuery objects for the same elements all the time.
My advice: keep things DRY — cache everything that can be reused.
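The caching idea can be sketched without a browser at all. In this sketch, `makeCache` and `lookup` are hypothetical names (not from jQuery); `lookup` stands in for an expensive call such as `$("#" + id)`:

```javascript
// Minimal memoization sketch: remember the result of an expensive
// lookup so repeated requests for the same key reuse the stored value.
function makeCache(lookup) {
    var store = {};
    return function (key) {
        if (!(key in store)) {
            store[key] = lookup(key); // first request: do the real work
        }
        return store[key];            // later requests: reuse the cached value
    };
}

// Usage: count how often the underlying lookup actually runs.
var calls = 0;
var cached = makeCache(function (id) {
    calls++;
    return { id: id }; // pretend this is a jQuery object
});

cached("abc");
cached("abc");
cached("def");
// calls is 2: "abc" was only looked up once
```

The same principle applies whether the cache is an object keyed by ID, as here, or simply an array built once up front.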
Also note that instead of:
var ids = [];
ids.push($("#abc"));
ids.push($("#def"));
ids.push($("#zyx"));
You could just do:
var ids = [ $("#abc"), $("#def"), $("#zyx") ];
This saves a few function calls.
Upvotes: 0
Reputation: 700342
You would get a slight improvement, as you are moving some work out of the loop and doing it only once.
For just three items and as seldom as once a second you would hardly notice the difference, though. Locating an element by ID is cheap, as the browser has a native method for exactly that (`document.getElementById`), and wrapping the result in a jQuery object is not much work either.
Upvotes: 1
Reputation: 254926
You will not see any notable performance gain, but it is simply a good habit: don't request the same thing twice when you can do it once and keep the result somewhere.
So my final advice: cache the jQuery objects once, and after that work with the array of jQuery objects.
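This advice can be sketched end to end. Here `$` is a hypothetical stub standing in for jQuery (so the sketch runs without a DOM); the point is that the lookup runs once per ID, not once per tick:

```javascript
// Stub selector standing in for jQuery's $, counting real lookups.
var lookups = 0;
function $(selector) {
    lookups++;
    return { selector: selector, doStuff: function () {} };
}

var ids = ["abc", "def", "zyx"];

// Cache once, up front: one lookup per ID.
var elements = ids.map(function (id) {
    return $("#" + id);
});

// The per-second routine then reuses the cached objects.
function tick() {
    for (var i = 0; i < elements.length; i++) {
        elements[i].doStuff(); // no new lookups here
    }
}

tick();
tick();
tick();
// lookups stays at 3 no matter how many times tick() runs
```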
Upvotes: 2