Shashank

Reputation: 416

High latency on increasing thread count

In the code below, DataGather = endDataGather - beginDataGather takes 1.7 ms, and the time for a service to respond = service_COMPLETED - service_REQUEST_SENT, which varies from 20 µs to 200 µs (the services are mocked dummies on the same LAN, hence so low). Now, if I increase the Tomcat 8 thread count from 10 to 200, DataGather increases to 150 ms+, and if I increase it further from 200 to 1000, it rises to 250 ms+. Machine specs: 8-core Xeon, 64 GB RAM. Time is measured while Apache Bench runs with the args -n 40000 -c 100. Is this due to thread scheduling/context switching or something else? How do I get rid of this variation? Will it remain when real services, with latencies of 20-100 ms, come into the picture?

       public List<ServiceResponse> getData(final List<Service> services, final Data data) {
          //beginDataGather

          final List<ServiceResponse> serviceResponses = Collections.synchronizedList(new ArrayList<>());
          try {
            final CountDownLatch latch = new CountDownLatch(services.size());
            Map<Future<HttpResponse>, HttpRequestBase> responseRequestMap = new HashMap<>();

            for (final Service service : services) {
              //creating request for a service
              try {
                HttpRequestBase request = RequestCreator.getRequestBase(service, data);
                //service_REQUEST_SENT
                Future<HttpResponse> response = client.execute(request,
                    new MyFutureCallback(service, data, latch, serviceResponses));
                responseRequestMap.put(response, request);
              } catch (Exception e) {
                latch.countDown();
              }
            }
            try {
              boolean finishedInTime = latch.await(timeout, TimeUnit.MILLISECONDS);
              if (!finishedInTime) {
                for (Future<HttpResponse> response : responseRequestMap.keySet()) {
                  if (!response.isDone()) {
                    response.cancel(true);
                  }
                }
              }
            } catch (InterruptedException e) {
              Thread.currentThread().interrupt(); // restore the interrupt flag rather than swallow it
            }
          } catch (Exception e) {
            // intentionally ignored: whatever responses arrived are still returned below
          }
          //endDataGather
          return serviceResponses;
    }


     public class MyFutureCallback implements FutureCallback<HttpResponse> {

        private Service service;
        private Data data;
        private CountDownLatch latch;
        private List<ServiceResponse> serviceResponses;

        public MyFutureCallback( Service service, Data data, CountDownLatch latch, List<ServiceResponse> serviceResponses) {
          this.service = service;
          this.data = data;
          this.latch = latch;
          this.serviceResponses = serviceResponses;
        }

        @Override
        public void completed(HttpResponse result) {
          try {
            ServiceResponse serviceResponse = parseResponse(result, data, service);
            serviceResponses.add(serviceResponse);
          } catch (Exception e) {
          } finally {
            //service_COMPLETED
            latch.countDown();
          }
        }

        @Override
        public void failed(Exception ex) {
          latch.countDown();
        }

        @Override
        public void cancelled() {
          latch.countDown();
        }
       }

Upvotes: 1

Views: 414

Answers (1)

Rahul Vedpathak

Reputation: 1436

Yes, this looks like context switching between threads. Increasing the number of threads won't help in this case. Instead, use a dedicated thread pool for the callbacks and try `PoolingClientAsyncConnectionManager`. Check this link for reference:

How to use HttpAsyncClient with multithreaded operation?
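The idea behind the linked answer: let a small, fixed pool of I/O reactor threads (roughly one per core) handle all the network work, and hand response parsing off to a separate executor so callbacks never block the reactor. A minimal sketch against the HttpAsyncClient 4.x API (the pool sizes and the URL are illustrative, not tuned values):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.apache.http.HttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.concurrent.FutureCallback;
import org.apache.http.impl.nio.client.CloseableHttpAsyncClient;
import org.apache.http.impl.nio.client.HttpAsyncClients;
import org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager;
import org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor;
import org.apache.http.impl.nio.reactor.IOReactorConfig;

public class AsyncClientSketch {
    public static void main(String[] args) throws Exception {
        // One I/O dispatcher thread per core is usually enough; piling on more
        // threads only adds the context-switching cost the question observes.
        IOReactorConfig ioConfig = IOReactorConfig.custom()
            .setIoThreadCount(Runtime.getRuntime().availableProcessors())
            .build();
        PoolingNHttpClientConnectionManager cm =
            new PoolingNHttpClientConnectionManager(new DefaultConnectingIOReactor(ioConfig));
        cm.setMaxTotal(200);           // illustrative cap on total open connections
        cm.setDefaultMaxPerRoute(50);  // illustrative per-host cap

        CloseableHttpAsyncClient client = HttpAsyncClients.custom()
            .setConnectionManager(cm)
            .build();
        client.start();

        // Separate pool for response parsing so callbacks never block the reactor.
        final ExecutorService callbackPool = Executors.newFixedThreadPool(8);

        client.execute(new HttpGet("http://localhost:8080/mock"),
            new FutureCallback<HttpResponse>() {
                @Override public void completed(HttpResponse result) {
                    // parseResponse(...) would run here, off the reactor thread
                    callbackPool.submit(() -> { });
                }
                @Override public void failed(Exception ex) { }
                @Override public void cancelled() { }
            });

        client.close();
        callbackPool.shutdown();
    }
}
```

With this split, the Tomcat worker count can stay modest: requests fan out on the reactor threads, and only the cheap bookkeeping in the callbacks competes for CPU.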

Upvotes: 1
