Reputation: 6532
I'm working on software that makes extensive queries to a database with an HTTP interface, so my program builds and handles queries in the form of long http:// addresses.
I've realized that the bottleneck of the whole system is the querying: the data transfer barely goes above 20 KB/s even though I'm on the university network with a gigabit connection. Recently a friend of mine suggested that I might have written my code in an inefficient way, and that this might be the reason for the lack of speed. So my question is: what is the fastest/most efficient way of getting data from a web source in Java?
Here's the code I have right now:
private void handleQuery(String urlQuery, int qNumber, BufferedWriter out) {
    BufferedReader reader;
    try {
        // IO - routines: read from the webservice and print to a log file
        reader = new BufferedReader(new InputStreamReader(openURL(urlQuery)));
        ....
    } catch (IOException e) {
        // log and handle the exception
    }
}
private InputStream openURL(String urlName) throws IOException {
    URL url = new URL(urlName);
    URLConnection urlConnection = url.openConnection();
    return urlConnection.getInputStream();
}
Upvotes: 2
Views: 104
Reputation: 75456
Your code looks fine to me; nothing in the snippet itself explains the slow read. The problem is likely elsewhere, and a profiler and a network trace will pinpoint it.
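One cheap way to separate the network from your processing, before reaching for a full profiler, is to time how long it takes just to drain the stream into a large buffer, discarding the data. A minimal sketch (the `ByteArrayInputStream` is a stand-in for the real `urlConnection.getInputStream()`):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class DrainTimer {

    // Read the stream to exhaustion with a 64 KB buffer, discarding
    // the data, and return the total number of bytes read.
    static long drain(InputStream in) throws IOException {
        byte[] buf = new byte[64 * 1024];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the real URL stream; replace with openURL(urlQuery).
        InputStream in = new ByteArrayInputStream(new byte[1_000_000]);

        long start = System.nanoTime();
        long bytes = drain(in);
        double seconds = (System.nanoTime() - start) / 1e9;

        System.out.println(bytes + " bytes read in " + seconds + " s");
    }
}
```

If draining alone is already stuck at 20 KB/s, the bottleneck is the server or the network path, not anything your Java code does with the characters afterwards.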
Upvotes: 2
Reputation: 718758
There is nothing in the code you have provided that should be a bottleneck. The problem is probably somewhere else; e.g. what you are doing with the characters after you read them, how the remote server is writing them, network or web proxy issues, etc.
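For what it's worth, if the per-character handling after the read does turn out to matter, one common tweak is to put a `BufferedInputStream` with an explicit buffer size between the socket stream and the `InputStreamReader`, so the reader pulls large chunks from the network instead of many small reads. A sketch, not the poster's code (the `ByteArrayInputStream` stands in for the real URL stream, and the 64 KB size is an assumption):

```java
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class BufferedOpen {

    // Wrap the raw stream so reads from the underlying source happen
    // in 64 KB chunks; the BufferedReader adds its own char buffer.
    static BufferedReader openBuffered(InputStream raw) {
        return new BufferedReader(
                new InputStreamReader(
                        new BufferedInputStream(raw, 64 * 1024),
                        StandardCharsets.UTF_8),
                64 * 1024);
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for openURL(urlQuery); replace with the real stream.
        InputStream raw = new ByteArrayInputStream(
                "line one\nline two\n".getBytes(StandardCharsets.UTF_8));

        BufferedReader reader = openBuffered(raw);
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    }
}
```

In practice this rarely turns 20 KB/s into gigabit speed on its own, which is why measuring first (profiler, network trace) is the right move.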
Upvotes: 1