Reputation: 33783
I am designing an application that needs to load the HTML content of a specific URL on the server side using Java. How can I do this?
Regards,
Upvotes: 1
Views: 13520
Reputation: 2106
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLConnection;

public class URLContent {
    public static void main(String[] args) {
        URL url;
        try {
            // get URL content
            String a = "http://localhost:8080/TestWeb/index.jsp";
            url = new URL(a);
            URLConnection conn = url.openConnection();

            // open the stream and put it into BufferedReader
            BufferedReader br = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()));

            String inputLine;
            while ((inputLine = br.readLine()) != null) {
                System.out.println(inputLine);
            }
            br.close();

            System.out.println("Done");
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Upvotes: 0
Reputation: 49339
If all you need is to read the URL, you do not need to resort to third-party libraries; Java has built-in support for retrieving URLs.
import java.net.*;
import java.io.*;
public class URLConnectionReader {
public static void main(String[] args) throws Exception {
URL yahoo = new URL("http://www.yahoo.com/");
URLConnection yc = yahoo.openConnection();
BufferedReader in = new BufferedReader(
new InputStreamReader(
yc.getInputStream()));
String inputLine;
while ((inputLine = in.readLine()) != null)
System.out.println(inputLine);
in.close();
}
}
Upvotes: 1
Reputation: 9644
I have used the Apache Commons HttpClient library to do this. Have a look here: http://hc.apache.org/httpclient-3.x/tutorial.html
It is more feature-rich than the JDK's built-in HTTP client support.
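A minimal sketch of a GET request with the commons-httpclient 3.x API (the URL here is just a placeholder for illustration):

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;

public class HttpClientExample {
    public static void main(String[] args) throws Exception {
        // placeholder URL used only for illustration
        String url = "http://www.example.com/";

        HttpClient client = new HttpClient();
        GetMethod method = new GetMethod(url);
        try {
            // execute the GET request and read the response body
            int statusCode = client.executeMethod(method);
            System.out.println("Status: " + statusCode);
            String html = method.getResponseBodyAsString();
            System.out.println(html);
        } finally {
            // always release the connection when done
            method.releaseConnection();
        }
    }
}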
Upvotes: 4
Reputation: 37065
If it were PHP, you could use cURL, but since it's Java, you would use HttpURLConnection, as I just found out in this question:
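A minimal sketch using HttpURLConnection from the standard library (the URL is a placeholder):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class HttpURLConnectionExample {
    public static void main(String[] args) throws Exception {
        // placeholder URL used only for illustration
        URL url = new URL("http://www.example.com/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // read the response body line by line
        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        StringBuilder html = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            html.append(line).append("\n");
        }
        in.close();
        conn.disconnect();

        System.out.println(html);
    }
}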
Upvotes: 0