Reputation: 605
We have a very large XML request containing nearly 10,000 XML elements, as shown below:
<root>
<message></message>
<message></message>
<message></message>
.....
.....
.....
.....
<message></message>
<message></message>
<message></message>
</root>
In Mule we are using an XPath extractor inside a for-each processor, which takes a huge amount of time.
Is there a way to process huge XML files faster in Mule?
<foreach doc:name="Foreach" batchSize="1" collection="#[xpath://message]">
<!-- stuff -->
</foreach>
Changing batchSize also didn't help.
Is there any other way of processing that would make this faster?
Upvotes: 1
Views: 130
Reputation: 51
We can achieve this by following a few best practices in Mule:
Use a SAX parser instead of DOM to load the XML.
Divide the file into chunks and process the chunks in parallel, as sketched below.
If you are storing data in several variables, do not store the complete XML, which can cause excessive memory use.
Remove unnecessary variables at the end of each Mule flow.
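The first two points can be sketched in plain Java outside of Mule. The sketch below uses StAX (a streaming pull parser with the same low-memory behaviour as a SAX parser) to read each message element without building a DOM, and hands fixed-size chunks to a thread pool. The file name huge-request.xml, the chunk size, and the per-message placeholder are illustrative assumptions, not part of the original flow.

import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class HugeXmlProcessor {

    private static final int CHUNK_SIZE = 500; // messages per parallel task (assumed value)

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());

        try (InputStream in = new FileInputStream("huge-request.xml")) {
            XMLInputFactory factory = XMLInputFactory.newFactory();
            XMLStreamReader reader = factory.createXMLStreamReader(in);

            List<String> chunk = new ArrayList<>();
            StringBuilder text = new StringBuilder();
            boolean inMessage = false;

            // Stream the document event by event; only the current
            // chunk is ever held in memory, never the whole XML tree.
            while (reader.hasNext()) {
                int event = reader.next();
                if (event == XMLStreamConstants.START_ELEMENT
                        && "message".equals(reader.getLocalName())) {
                    inMessage = true;
                    text.setLength(0);
                } else if (event == XMLStreamConstants.CHARACTERS && inMessage) {
                    text.append(reader.getText());
                } else if (event == XMLStreamConstants.END_ELEMENT
                        && "message".equals(reader.getLocalName())) {
                    inMessage = false;
                    chunk.add(text.toString());
                    if (chunk.size() == CHUNK_SIZE) {
                        submitChunk(pool, chunk);
                        chunk = new ArrayList<>();
                    }
                }
            }
            if (!chunk.isEmpty()) {
                submitChunk(pool, chunk); // last partial chunk
            }
            reader.close();
        }

        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
    }

    // Each chunk is processed on its own thread; a task only sees its
    // own list of message strings, not the complete document.
    private static void submitChunk(ExecutorService pool, List<String> messages) {
        pool.submit(() -> {
            for (String message : messages) {
                // placeholder for the per-message work done inside the for-each
            }
        });
    }
}

Inside Mule you would wire the equivalent behaviour into the flow (for example via a Java component or a streaming transformer); the Java above only illustrates the streaming-plus-chunking idea, not a drop-in replacement for the for-each configuration shown in the question.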
Upvotes: 1