Reputation: 21
We have a requirement to develop a service that should handle high volume, i.e. around 5 TPS. We have to make 3-4 parallel downstream REST calls (the POC below uses 2), aggregate all the responses, do some processing, and send the response back to the consumer. The service endpoint is synchronous.
I am trying to achieve this using CompletableFuture to make the REST calls asynchronously and join() to aggregate the responses, like below:

    ResponseWrapper response = null;

    CompletableFuture<Student> s = CompletableFuture.supplyAsync(
            () -> studentService.getStudentDetails(requestContext, id, name, correlationId), pool)
        .exceptionally(ex -> {
            log.error("Something went wrong in Student: ", ex);
            Student wrapper = new Student();
            wrapper.setStatus("500");
            Errors err = new Errors();
            err.setDescription(ex.getCause().toString());
            wrapper.setError(err);
            return wrapper;
        });

    CompletableFuture<Employee> s1 = CompletableFuture.supplyAsync(
            () -> employeeService.getEmployeeDetails(requestContext, id, name, correlationId), pool)
        .exceptionally(ex -> {
            log.error("Something went wrong in Employee: ", ex);
            Employee wrapper = new Employee();
            wrapper.setStatus(500);
            Errors err = new Errors();
            err.setDescription(ex.getCause().toString());
            wrapper.setError(err);
            return wrapper;
        });

    CompletableFuture<ResponseWrapper> combinedDataCompletionStage = CompletableFuture.allOf(s, s1)
        .thenApply(ignoredVoid -> combine(s.join(), s1.join()));

    try {
        response = combinedDataCompletionStage.get();
    } catch (InterruptedException | ExecutionException e) {
        response = new ResponseWrapper();   // create the wrapper first (it was still null here)
        Errors error = new Errors();
        error.setDescription(e.getCause() != null ? e.getCause().toString() : e.toString());
        response.setError(error);
    }

    private ResponseWrapper combine(Student s, Employee s1) {
        ResponseWrapper response = new ResponseWrapper();
        response.setStudentInfo(s);
        response.setEmployeeInfo(s1);
        return response;
    }
So I need help understanding: since we are using CompletableFuture, would the two threads running s and s1 be released as soon as each REST call completes, and so be available to handle other requests?
What is the behaviour of the join() method? From the Javadoc it seems it waits for s and s1 to return their responses, and only then is the combine step performed. Is this a blocking call?
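My understanding, roughly, annotated on the same code so the names match the snippet above:

    // My understanding: allOf(s, s1) completes only after both futures complete,
    // so the s.join()/s1.join() calls inside thenApply return already-computed values,
    // while the final get() blocks the request-handling thread until everything is done.
    CompletableFuture.allOf(s, s1)
        .thenApply(ignoredVoid -> combine(s.join(), s1.join()))
        .get();   // <-- is this where the calling thread blocks?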
And once the responses are aggregated, we have to call get(), which returns the final response, like below:

    response = combinedDataCompletionStage.get();
I tried performance testing with JMeter but could not reach a conclusion. Would this approach achieve the required throughput, do I need to modify the code above, or is there a limitation in Java given so many I/O calls? If there is, would it be feasible in Node.js?
Any suggestions would be appreciated!
Upvotes: 0
Views: 202
Reputation: 13535
So you want to maximize throughput with a low level of parallelism. The best way to do that is to use plain threads, not CompletableFuture. And of course Node.js is not a better option.
Let the main thread (the one handling the request) start parallel threads that make the downstream REST calls. Have those threads store their results in a special object that notifies the main thread when all REST calls are finished. Then the main thread does the processing and sends the response back to the consumer, as in the sketch below.
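A minimal sketch of that idea, assuming a CountDownLatch as the notifying object and hypothetical callStudentService()/callEmployeeService() helpers for the two REST calls (CountDownLatch, AtomicReference and TimeUnit come from java.util.concurrent):

    // shared results; AtomicReference makes the workers' writes safely visible to the main thread
    AtomicReference<Student> studentResult = new AtomicReference<>();
    AtomicReference<Employee> employeeResult = new AtomicReference<>();
    CountDownLatch done = new CountDownLatch(2);          // one count per downstream call

    new Thread(() -> {
        try {
            studentResult.set(callStudentService());      // hypothetical REST call
        } finally {
            done.countDown();                             // signal even if the call failed
        }
    }).start();

    new Thread(() -> {
        try {
            employeeResult.set(callEmployeeService());    // hypothetical REST call
        } finally {
            done.countDown();
        }
    }).start();

    // the request-handling thread waits here until both calls have finished (bounded by a timeout)
    if (done.await(5, TimeUnit.SECONDS)) {
        ResponseWrapper combined = combine(studentResult.get(), employeeResult.get());
        // ... do the processing and send the response back to the consumer
    }

Here the CountDownLatch plays the role of the "special object" that notifies the main thread when all the REST calls are done.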
Asynchronous I/O has lower performance and should be used only when the number of threads is so high that they cannot fit in the available memory.
Upvotes: 0