Reputation: 423
I am working on a Plone 4 site. I want to have a copy of all the uploaded images. What is the way to get them? I couldn't find appropriate documentation for this.
Upvotes: 1
Views: 346
Reputation: 41
If you are comfortable with the Linux command line and want something quick and dirty to fetch ALL images locally, you can add a Script (Python) via the ZMI into the portal_skins/custom folder of the Plone site, with the following code:
from Products.CMFCore.utils import getToolByName

# Base URL of the site; everything after it becomes the local path.
base = 'http://example.com/'

# Query the catalog for every Image object on the site.
pc = getToolByName(context, 'portal_catalog')
query = {'portal_type': 'Image'}
brains = pc(query)
for brain in brains:
    path = brain.getObject().absolute_url()[len(base):]
    dire = '/'.join(path.split('/')[:-1])
    # One shell command per image: create the folder, then download into it.
    print "mkdir -p '%s';wget %s%s -O '%s'" % (dire, base, path, path)
# In a Script (Python), 'printed' collects everything written by print.
return printed
(Replace example.com with your site domain name)
Call the script 'image_links'.
Then go to http://example.com/image_links
You should see a whole bunch of bash shell script commands on the web page. Copy and paste those into a terminal, and your machine should download the images.
Alternatively, put those commands into a shell script and download them all at once.
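For example, a minimal sketch of that route, assuming curl is available and the page is readable without logging in (otherwise save the page from your browser instead; the script name is just an example):

# Save the generated command list and run it.
curl -s http://example.com/image_links -o download_images.sh
bash download_images.sh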
This produces a directory structure with all images on the site.
I used something similar a few weeks ago to download 3500 images from a client site. It took an hour or two, but needed no involvement from me.
Keep in mind that this is quick and dirty: Images get the Plone object name, so might not have the correct extension, etc. YMMV.
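If the missing extensions matter, a quick cleanup sketch (assuming GNU find and file are available) is to detect each file's MIME type and append a matching extension:

# Append an extension based on the detected MIME type to files that have none.
find . -type f ! -name '*.*' | while read -r f; do
    ext=$(file -b --mime-type "$f" | cut -d/ -f2)   # e.g. jpeg, png, gif
    mv "$f" "$f.$ext"
done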
Upvotes: 1
Reputation: 7819
A simple (but manual) way is to connect to your Plone site using a WebDAV client and authenticate with your Plone credentials.
This way you can browse the Plone folder structure and drag & drop images from the remote site to your local machine.
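As a rough sketch, assuming WebDAV is enabled on the Zope instance (for example via a webdav-address line in buildout; the port 8081, site id and folder names below are only placeholders), a session with the command-line client cadaver could look like this. Any other WebDAV client, including graphical ones that support drag & drop, works the same way:

# Open a WebDAV session against the Plone site.
cadaver http://example.com:8081/Plone
# Inside the cadaver session:
#   ls              list the current folder
#   cd images       enter a subfolder
#   get logo.png    download a single image
#   mget *          download everything in the folder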
Upvotes: 0