Reputation: 4389
I have a list of URLs, and I need to save files from them to my local folder.
All URLs are links to images, one image per URL. Do I need to read data from the URL or is there any sort of library where I can just specify the URL and file name (of course, get an image extension before) and save it?
Upvotes: 1
Views: 4571
Reputation: 2210
Being a Linux administrator, the first thing that comes to mind is wget's input-file option (-i). From Java you could call something like:
Process p = Runtime.getRuntime().exec("wget -i list_of_images.txt");
With the .txt file containing the list of all the image URLs, wget will download each one into the current working directory. This is a rough way to do it, but it should work.
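If you want a bit more control than a raw Runtime.exec call, here is a sketch using ProcessBuilder instead. The class and method names are my own, and it assumes wget is installed and on the PATH and that the list file exists:

```java
import java.io.IOException;

public class WgetRunner {
    // Runs an external command and returns its exit code.
    static int run(String... command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command)
                .inheritIO()      // forward wget's progress output to the console
                .start();
        return p.waitFor();       // block until all downloads finish
    }

    public static void main(String[] args) throws Exception {
        // Assumes wget is on the PATH and list_of_images.txt exists.
        int exit = run("wget", "-i", "list_of_images.txt");
        System.out.println("wget exited with " + exit);
    }
}
```

An exit code of 0 means wget fetched everything successfully; a non-zero code signals at least one failed download, which Runtime.exec on its own would silently ignore.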
Upvotes: 3
Reputation: 117587
Check this example: URL File Download and Save in the Local Directory.
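In case that link goes stale, here is a minimal sketch of the same idea using java.nio's Files.copy. The class name SaveImage and the method save are my own; any URL that URL#openStream accepts will work, including file: URLs:

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class SaveImage {
    // Downloads the resource at urlStr and saves it to target.
    static void save(String urlStr, Path target) throws Exception {
        try (InputStream in = new URL(urlStr).openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```

Files.copy handles the read/write loop for you, so this stays a one-liner per URL; you would still pick the target file name (and extension) yourself before calling it.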
Upvotes: 5
Reputation: 7838
Get an InputStream via URL#openStream and write it to a File via a FileOutputStream (not a FileWriter, which is for character data and will corrupt binary image bytes) at your desired location.
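A minimal sketch of this approach (the class and method names are placeholders); note the byte-oriented FileOutputStream, since images are binary data:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;

public class ImageDownloader {
    // Copies any InputStream to dest; closes both streams when done.
    static void copy(InputStream in, File dest) throws IOException {
        try (InputStream is = in;
             OutputStream out = new FileOutputStream(dest)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = is.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Usage: java ImageDownloader <url> <destination-file>
        if (args.length == 2) {
            copy(new URL(args[0]).openStream(), new File(args[1]));
        }
    }
}
```

The try-with-resources block guarantees both streams are closed even if the download fails partway through.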
Upvotes: 2