Reputation: 485
Squarespace does not offer any way to export uploaded content directly. The only export option available is for WordPress, and it only generates a small XML file. What is the best way to download the actual image files from a gallery, other than right-clicking each image and choosing "Save as..."?
Upvotes: 4
Views: 17042
Reputation: 1
I installed the Image Downloader extension from the Chrome Web Store. It makes it super easy to download all the images into folders. Once installed, go to the URL of your website page, click the extension icon, and create a download folder. Done.
Upvotes: 0
Reputation: 1
You can use this repo to download the images from Squarespace. It has a Tkinter GUI to make it easier to use :)
I just coded it and it works fine on my end.
Github link: https://github.com/Mascobot/squarespace_image_downloader
Upvotes: 0
Reputation: 21
In Chrome: File > Save Page As > Web Page Complete
Do this for each page that you want to download the images from.
Upvotes: 2
Reputation: 1
Copy the image, open it in a photo editor like Preview, and then export it. That works well for a few images but not so well for many. Alternatively, take screenshots: make the image as large as possible on screen and capture it that way.
Upvotes: -2
Reputation: 1
If you don't have too many images, you can save them one at a time from a gallery. While viewing a gallery in Chrome, I can right-click an image, open it in a new tab, and save it from there (after removing the query parameters that follow the .jpeg extension).
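If you would rather strip those parameters with a couple of lines of code instead of editing the address bar by hand, here is a minimal Python sketch; the URL below is just a made-up example of what gets copied from the new tab.
from urllib.parse import urlsplit, urlunsplit

# Made-up example of a Squarespace image URL opened in a new tab.
url = "https://static1.squarespace.com/static/site-id/t/image-id/photo.jpeg?format=1500w"

# Drop the query string (everything after '?') to get the bare image URL.
parts = urlsplit(url)
print(urlunsplit((parts.scheme, parts.netloc, parts.path, "", "")))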
Upvotes: 0
Reputation: 21
This worked for me [Python]. If you take the XML file that Squarespace exports for you, you can run the following script against it.
I only had .png images uploaded, so you will have to modify the filter to include .jpg and other image formats (see the snippet after the script).
import os
import shutil
import xml.etree.ElementTree as ET
import requests

# Parse the WordPress-format XML export from Squarespace.
tree = ET.parse('filename.xml')
root = tree.getroot()

# Optional: print the attachment URLs the export declares. The namespace URI
# must match the xmlns:wp declaration at the top of your export file.
ns = {'wp': 'http://wordpress.org/export/1.2/'}
for url in root.findall('.//wp:attachment_url', ns):
    print(url.text)

# Collect the image URLs from the <link> elements that point at a .png file.
images = set(elem.text for elem in root.iter()
             if elem.tag == 'link' and elem.text and '.png' in elem.text)

os.makedirs('images', exist_ok=True)  # make sure the output folder exists
for img in images:
    # Request the largest rendition Squarespace serves (3000px wide) and stream it to disk.
    resp = requests.get(img + '?format=3000w', stream=True)
    resp.raw.decode_content = True
    with open(f'images/{img.split("/")[-1]}', 'wb') as local_file:
        shutil.copyfileobj(resp.raw, local_file)
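If your library also contains JPEGs or other formats, one way to widen the filter is to swap the set comprehension for something like the sketch below (IMAGE_EXTS is just an illustrative name).
IMAGE_EXTS = ('.png', '.jpg', '.jpeg', '.gif')
images = set(elem.text for elem in root.iter()
             if elem.tag == 'link' and elem.text
             and any(ext in elem.text.lower() for ext in IMAGE_EXTS))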
Upvotes: 2
Reputation: 1
Here's an alternative:
Use a crawler like Screaming Frog to crawl your entire domain and copy all of your image URLs.
Install the Chrome extension 'Tab Save' and paste all of those links into it.
Download them. Done!
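If you would rather skip the second extension, a minimal Python sketch like the following handles the download step, assuming you have pasted the copied URLs into a plain-text file with one URL per line (urls.txt is just an example name).
import os
import requests

os.makedirs('images', exist_ok=True)

# urls.txt is assumed to hold one image URL per line, e.g. pasted from the crawler.
with open('urls.txt') as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    filename = url.split('/')[-1].split('?')[0]  # drop any query string from the file name
    resp = requests.get(url)
    resp.raise_for_status()
    with open(os.path.join('images', filename), 'wb') as out:
        out.write(resp.content)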
Upvotes: -1
Reputation: 485
I just spent way too long figuring out how to do this, so I'm leaving this here in hopes that it will save someone else time. It's not pretty, and it involves a browser extension, but I believe this is the most efficient way. Broadly speaking, this is what the process looks like:
Repeat the following steps for each gallery:
That's it. All said and done, it's a pretty simple and straightforward process. I went through a lot of different WordPress plugins in an attempt to rehost the external links to the local wp-content folder, export the media library by post, etc. This ended up being much faster and much simpler. Hope it saves you some time.
Upvotes: 1