Reputation: 15732
I use jQuery to intentionally remove CSS classes from elements in a potentially large HTML table. See below for an explanation of why I am doing that.
Currently I am doing it like this:
var tableElements = $("#TreeListElemente").find("*").addBack();
tableElements.removeClass("dxtl dxtl__B2 dxtl__B0 dxtlSelectionCell dxtlHeader dxtl__B3 dxtlControl dx-wrap dxtl__IM dxeHyperlink");
The table sometimes is large and has many elements. I would like to speed up the page load / DOM manipulation.
IE's built-in JavaScript profiler tells me that the .addBack() call in particular is slow. It seems to perform some kind of sorting, which is totally unnecessary for my use case. Could I get rid of that? Is there another way to include the selected element itself, besides addBack()?
IE JavaScript profiler: execution times for a collection of about 60,000 elements. The inclusive times are in the third column.
Or is there another, more efficient way to remove classes from a large set of elements, i.e. from a selected element and all of its descendants, including the element itself?
Note: Why am I doing this: I am using the DevExpress TreeList component, which comes with its own styling. There is no easy way to "unstyle" it on the server side, so I chose to do it client-side, as demonstrated above. In the end, I am selecting the TreeList and all of its child elements, and removing the relevant CSS classes from them.
I have successfully implemented the solution proposed by Frédéric Hamidi and got quite an improvement:
IE JavaScript profiler: execution times for a collection of about 60,000 elements, using Frédéric's proposal. The inclusive times are in the third column.
The time needed for the addBack() operation is simply gone; only the other work remains. That is an overall improvement of more than a factor of 4. Yay!
I have also implemented the solution proposed by A. Wolff and got a slight additional improvement:
IE JavaScript profiler: execution times for a collection of about 60,000 elements, using A. Wolff's proposal. The inclusive times are in the third column.
The time needed for the find() operation is gone as well; again, only the other work remains. That is a further slight improvement of some tens of milliseconds on my machine. Cool!
This is the solution I am using now:
$("#TreeListElemente, #TreeListElemente [class]").removeClass("dxtl dxtl__B2 dxtl__B0 dxtlSelectionCell dxtlHeader dxtl__B3 dxtlControl dx-wrap dxtl__IM dxeHyperlink");
Upvotes: 8
Views: 247
Reputation: 74420
The relevant selector to select the element with ID TreeListElemente
and all of its descendants would be:
"#TreeListElemente, #TreeListElemente *"
Now you could filter the descendants down to those that actually have a class attribute:
"#TreeListElemente, #TreeListElemente [class]"
So it would give:
$("#TreeListElemente, #TreeListElemente [class]").removeClass("dxtl dxtl__B2 dxtl__B0 dxtlSelectionCell dxtlHeader dxtl__B3 dxtlControl dx-wrap dxtl__IM dxeHyperlink");
Upvotes: 3
Reputation: 46341
Here's a thought:
function deClassify(jq, classes) {
    var remove = classes.join(' ');                        // "remove remove2 remove3"
    jq.find('.' + classes.join(',.')).removeClass(remove); // descendants carrying any of the classes
    jq.removeClass(remove);                                // the root element itself
}
deClassify($('.keepme'), ['remove', 'remove2', 'remove3']);
.remove, .remove2, .remove3 {
color: red;
}
.keepme, .keepme2 {
font-weight: bold;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div class="keepme remove remove2">
<div class="keepme2 remove remove3">x</div>
</div>
This avoids "selecting" non-matching elements, reducing the load, and of course no extra sorting is involved...
Upvotes: 0
Reputation: 263047
addBack() does perform a sort to put the matched elements in document order. The easy alternative, add(), does the exact same thing, so it won't solve your problem.
However, the documentation is helpful enough to provide a solution:
To create a jQuery object with elements in a well-defined order and without sorting overhead, use the
$(array_of_DOM_elements)
signature.
Therefore, to avoid that overhead, you can write:
var ancestor = $("#TreeListElemente"),
    tableElements = $(ancestor.find("*").get().concat(ancestor[0]));
get() and concat() end up building two intermediate arrays under the hood, though, so that has a cost of its own. Whether the end result is faster than your current approach depends on the number of elements you match.
Upvotes: 4