Reputation: 12373
I've got a very large XML file (1.5GB) that I need to parse and then insert specific values into a MySQL table.
Now the way I would usually parse a DOM would be to use jQuery or PHP Simple HTML DOM Parser but in this situation, given the file size, I don't think either is suitable. I need the emphasis to be on performance. I've read a little about SimpleXML and XML Parser for PHP and it seems each has its advantages, but I'm not sure whether either is suitable for a file of 1.5GB.
I've also seen PEAR's XML parser mentioned but, again, I don't know if it is suitable in this situation. From what I've read it seems that I need to load into memory only the required nodes and not the whole tree itself. Even now I'm having trouble actually viewing the document due to its size. Vim seems to be the only editor that can handle it, but even then scrolling through the document can cause a crash.
If anyone can recommend one of these above the other, or even an entirely different solution that would be great.
That then brings me to my SQL inserts, which I was going to do on the fly — after I've parsed a node and pulled the values I require, I will insert them into the database. Again, any advice would be great.
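For what it's worth, here is a minimal sketch of the streaming approach described above, using PHP's XMLReader so that only one node is ever expanded in memory. The element names (`<product>`, `<name>`, `<price>`), the table layout, and the connection details are made-up placeholders, not from the question; the PDO lines are commented out so the sketch runs without a database:

```php
<?php
// Sample data; for the real 1.5GB file use $reader->open('huge.xml') instead.
$xml = <<<XML
<products>
  <product><name>Widget</name><price>9.99</price></product>
  <product><name>Gadget</name><price>19.99</price></product>
</products>
XML;

$reader = new XMLReader();
$reader->XML($xml);

// Hypothetical connection and prepared statement -- uncomment for real use:
// $pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
// $stmt = $pdo->prepare('INSERT INTO products (name, price) VALUES (?, ?)');

$doc  = new DOMDocument();
$rows = [];

// Skip forward until the cursor sits on the first <product> element.
while ($reader->read() && $reader->name !== 'product');

while ($reader->name === 'product') {
    // Expand only this one subtree into a DOM fragment; the rest of the
    // document never lives in memory.
    $node  = simplexml_import_dom($doc->importNode($reader->expand(), true));
    $name  = (string) $node->name;
    $price = (float)  $node->price;
    // $stmt->execute([$name, $price]);  // insert on the fly, as planned
    $rows[] = [$name, $price];
    $reader->next('product');            // jump straight to the next <product>
}
$reader->close();
```

Using a single prepared statement and executing it per node keeps the insert side cheap; wrapping batches of a few thousand executes in a transaction would speed it up further.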
Upvotes: 4
Views: 1817
Reputation:
SimpleXML and DOM are not meant for big XML files.
try:
or, even better/faster (but slightly more complicated to use):
Upvotes: 2
Reputation: 57690
For such a huge XML file it's recommended to use a SAX-based XML parser. In PHP you can do this with "XML Parser". It consumes less memory than its peers and is also very fast.
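A minimal SAX sketch with PHP's expat-based XML Parser extension, along the lines this answer suggests. The element names (`<product>`, `<name>`, `<price>`) are assumptions for illustration; in a real script the `INSERT` would happen where the end-element handler collects a finished record:

```php
<?php
$current = '';
$record  = [];
$rows    = [];

$parser = xml_parser_create();
xml_parser_set_option($parser, XML_OPTION_CASE_FOLDING, false);

xml_set_element_handler(
    $parser,
    // Start of an element: remember which tag we are inside.
    function ($p, $tag, $attrs) use (&$current, &$record) {
        $current = $tag;
        if ($tag === 'product') {
            $record = [];
        }
    },
    // End of an element: a closed <product> is a complete record.
    function ($p, $tag) use (&$current, &$record, &$rows) {
        if ($tag === 'product') {
            $rows[] = $record;   // a real script would INSERT here
        }
        $current = '';
    }
);

// Character data may arrive in several pieces, so append rather than assign.
xml_set_character_data_handler($parser, function ($p, $data) use (&$current, &$record) {
    if ($current === 'name' || $current === 'price') {
        $record[$current] = ($record[$current] ?? '') . $data;
    }
});

// Feed the document in small chunks so memory stays flat whatever the size;
// with the real file this would be fread() in a loop.
$xml = '<products><product><name>Widget</name><price>9.99</price></product></products>';
foreach (str_split($xml, 1024) as $chunk) {
    xml_parse($parser, $chunk, false);
}
xml_parse($parser, '', true);   // signal end of data
xml_parser_free($parser);
```

The trade-off versus XMLReader is that expat never builds any tree at all, so it stays fast and flat on memory, but you have to track parser state (which tag you are inside) yourself.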
Upvotes: 2