Reputation: 614
I've got a local eXist-db server running and I have over 1 million XML files I wish to add to it. The server is headless, so I'm using the CLI client. All of my files are in a collection of hundreds of zip files on the server, each containing 5,000 to 10,000 files. My current workflow is manually adding each zip file using the client started in this manner:
eXist-db/bin/client.sh --no-gui
I'm using the putzip command and waiting for the indexing to complete and return me to the prompt:
exist:/db/collection> putzip /home/user/data/batch_01/xml_doc_01.zip
entering directory doc0001.xml
storing Zip-entry document doc0001.xml (1 of 5000) ...done
...
entering directory doc5000.xml
storing Zip-entry document doc5000.xml (5000 of 5000) ...done
parsed 1234567 bytes in 6000ms.
... *several minute delay*
exist:/db/collection>
I have several hundred zip files, so this would take a very long time to do manually. Is there an automated way to do this? Thank you.
Upvotes: 1
Views: 61
Reputation: 15205
find /path/to/base-directory -type f -name 'xml*zip' |
while IFS= read -r name; do
    echo "putzip ${name}" | eXist-db/bin/client.sh --no-gui
done
Should work - obviously untested.
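If launching a fresh client (and JVM) for every zip turns out to be slow, a variant is to generate all the putzip commands first and pipe the whole stream into a single client session, so the startup cost is paid once. A minimal sketch, assuming a POSIX shell; `gen_putzip_commands` is an illustrative helper name, and the paths are placeholders:

```shell
# Print one "putzip <path>" line per zip file under the given directory.
# gen_putzip_commands is a hypothetical helper; adjust the -name pattern
# to match your actual zip file names.
gen_putzip_commands() {
    find "$1" -type f -name '*.zip' | sort |
    while IFS= read -r name; do
        printf 'putzip %s\n' "$name"
    done
}

# Pipe the whole command stream into ONE client process:
#   gen_putzip_commands /home/user/data | eXist-db/bin/client.sh --no-gui
```

Depending on where the client session starts, you may also need to prepend a `cd /db/collection` line to the stream (if your client version supports it) so the documents land in the right collection.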
Upvotes: 2