Reputation: 1461
I need to automatically download files from a page that has many separate download links (possibly more than 100 files). I know the login URL and I have credentials.
I would like to do this with an automated Java program. The only way to reach the download page is by logging in to the site.
Would the cURL command be helpful here?
Please advise me on how to do this.
Upvotes: 2
Views: 1492
Reputation: 87
You can use wget, which can download the files for you:
wget -r --no-parent --user=user --password=password --no-check-certificate <URL>
Or you can use Apache HttpClient in Java, along these lines:
public void saveFile(String url, String fileName) throws ClientProtocolException, IOException {
    HttpGet httpget = new HttpGet(url);
    HttpResponse response = httpClient.execute(httpget);
    HttpEntity entity = response.getEntity();
    if (entity != null) {
        // Stream the response body into the target file
        try (InputStream is = entity.getContent();
             FileOutputStream fos = new FileOutputStream(new File(fileName))) {
            IOUtils.copy(is, fos);
        }
    }
}
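To cover the login requirement and the many links, here is a minimal self-contained sketch of how that httpClient could be built with your credentials and reused for every link. It assumes the site accepts HTTP Basic (or Digest) authentication; if the login is a web form, you would instead POST the form once and let the client carry the session cookie. The class name, example URLs, and credentials below are placeholders:

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;
import java.util.List;

import org.apache.commons.io.IOUtils;
import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class BulkDownloader {
    private final CloseableHttpClient httpClient;

    public BulkDownloader(String user, String password) {
        // Register the credentials so every request authenticates automatically
        CredentialsProvider provider = new BasicCredentialsProvider();
        provider.setCredentials(AuthScope.ANY,
                new UsernamePasswordCredentials(user, password));
        this.httpClient = HttpClients.custom()
                .setDefaultCredentialsProvider(provider)
                .build();
    }

    public void saveFile(String url, String fileName) throws IOException {
        HttpGet httpget = new HttpGet(url);
        HttpResponse response = httpClient.execute(httpget);
        HttpEntity entity = response.getEntity();
        if (entity != null) {
            // Stream the response body into the target file
            try (InputStream is = entity.getContent();
                 FileOutputStream fos = new FileOutputStream(new File(fileName))) {
                IOUtils.copy(is, fos);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        BulkDownloader downloader = new BulkDownloader("user", "password");
        // Hypothetical list of download links scraped from the page
        List<String> links = Arrays.asList(
                "https://example.com/files/report1.pdf",
                "https://example.com/files/report2.pdf");
        for (String link : links) {
            String name = link.substring(link.lastIndexOf('/') + 1);
            downloader.saveFile(link, name);
        }
    }
}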
Upvotes: 1
Reputation: 317
If you just mean to copy a file from a URL to a local file, then you can use java.nio.file:
Files.copy(new URL("http://host/site/filename").openStream(), Paths.get(localfile));
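Note that copying a raw URL stream like this will not get you past the login by itself. If the site happens to accept HTTP Basic authentication (an assumption; your site may use a form-based login instead), a minimal sketch would be to send an Authorization header on the connection before copying the stream. The URL, credentials, and target file name here are placeholders:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.Base64;

public class SimpleDownload {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://host/site/filename");

        // Build an HTTP Basic auth header from the credentials
        String credentials = "user:password";
        String basicAuth = "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes(StandardCharsets.UTF_8));

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization", basicAuth);

        // Copy the authenticated response stream to a local file
        try (InputStream in = conn.getInputStream()) {
            Files.copy(in, Paths.get("localfile"), StandardCopyOption.REPLACE_EXISTING);
        }
    }
}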
Upvotes: 0