Estimate

Reputation: 1461

How to automate downloading files using a Java program

I need to automatically download the files linked from a page (there may be more than 100 files, each with its own link). I know the login URL and I have credentials.

I want to automate this in a Java program. The only way to reach the download page is by logging in to the site.

Would the cURL command help with this?

Please advise me on how to do this.

Upvotes: 2

Views: 1492

Answers (2)

Sn.

Reputation: 87

You can use wget, which can download the files:

wget -r --no-parent --user=user --password=password --no-check-certificate <URL>
  • You can pass headers with --header, e.g. --header "Cookie: JSONSESSIONID=3433434343434"
  • You can pass POST data using --post-data 'email=$EMAIL&password=$PASSWRD'
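
Put together, a hedged sketch of a cookie-based login followed by the recursive download (the /login URL, form field names, and cookie file name are placeholders for whatever the site actually uses):

# 1. Log in and store the session cookie (placeholder URL and field names)
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'email=user@example.com&password=secret' \
     -O /dev/null https://example.com/login

# 2. Reuse the stored cookie for the recursive download
wget --load-cookies cookies.txt -r --no-parent --no-check-certificate https://example.com/files/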

Or you can use Apache HttpClient in Java:

  • There are HttpClient examples for logging in and for passing POST/GET/header information (a combined sketch follows the saveFile method below)
  • First get the whole HTML page as a String
  • Either parse that String to extract the file links, or convert it to Java objects using an XML-to-object mapper such as https://github.com/FasterXML/jackson-dataformat-xml
  • Once you have the links, download each file using HttpClient:
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.commons.io.IOUtils;
import org.apache.http.HttpEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;

public void saveFile(CloseableHttpClient httpClient, String url, String fileName)
        throws IOException {
    HttpGet httpGet = new HttpGet(url);
    try (CloseableHttpResponse response = httpClient.execute(httpGet)) {
        HttpEntity entity = response.getEntity();
        if (entity != null) {
            // Stream the response body to the target file; both streams are closed.
            try (InputStream is = entity.getContent();
                 FileOutputStream fos = new FileOutputStream(fileName)) {
                IOUtils.copy(is, fos);
            }
        }
    }
}
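
For the earlier steps, here is a minimal sketch, assuming a form-based login at a hypothetical /login endpoint with placeholder field names, and a naive regex for pulling href links out of the page (a real HTML parser such as jsoup would be more robust):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;

public class FileDownloader {
    public static void main(String[] args) throws IOException {
        // The default client keeps session cookies, so the login carries over.
        try (CloseableHttpClient httpClient = HttpClients.createDefault()) {
            // 1. Log in (placeholder URL and form field names).
            HttpPost login = new HttpPost("https://example.com/login");
            login.setEntity(new UrlEncodedFormEntity(Arrays.asList(
                    new BasicNameValuePair("email", "user@example.com"),
                    new BasicNameValuePair("password", "secret")), StandardCharsets.UTF_8));
            try (CloseableHttpResponse response = httpClient.execute(login)) {
                EntityUtils.consume(response.getEntity());
            }

            // 2. Fetch the page that lists the files, as a String.
            String html;
            try (CloseableHttpResponse response =
                         httpClient.execute(new HttpGet("https://example.com/files"))) {
                html = EntityUtils.toString(response.getEntity());
            }

            // 3. Extract the href values from the HTML.
            List<String> links = new ArrayList<>();
            Matcher m = Pattern.compile("href=\"([^\"]+)\"").matcher(html);
            while (m.find()) {
                links.add(m.group(1));
            }

            // 4. Download each file with the saveFile method above,
            //    choosing a local file name for each link.
        }
    }
}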

Upvotes: 1

Jackcob

Reputation: 317

If you mean to copy a file from a site to a local file, then you can use java.nio.file:

Files.copy(new URL("http://host/site/filename").openStream(), Paths.get(localfile));
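
As a self-contained sketch (the URL and local path are placeholders; note this plain approach only works if the file is reachable without the login the question mentions):

import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class SimpleDownload {
    public static void main(String[] args) throws IOException {
        // Stream the remote resource straight into a local file,
        // overwriting it if it already exists.
        try (InputStream in = new URL("http://host/site/filename").openStream()) {
            Files.copy(in, Paths.get("filename"), StandardCopyOption.REPLACE_EXISTING);
        }
    }
}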

Upvotes: 0
