Master.Aurora

Reputation: 1038

Simulating latency in an HTTP POST call on Linux

Use Case

I am working on an enterprise-level payment application (written in Java). I am looking to simulate latency on an HTTP POST call that is made to a bank, so that I can test the different latency/unavailability scenarios that may occur.

The Code

The following piece of code sends the request to the bank:

// postVars (the URL-encoded POST body) and LOG are initialised elsewhere in the task
URLConnection urlConnection = null;
StringBuilder responseText = new StringBuilder();
int inputStreamCharacter;

try {
    // Set the location of the Bank of America payment gateway
    URL url = new URL(getParameterGatewayUrl());

    // Open the connection
    urlConnection = url.openConnection();

    // Set the connection timeout (in milliseconds)
    urlConnection.setConnectTimeout(getTimeoutSeconds() * 1000);

    // Set the DoOutput flag to true because we intend
    // to use the URL connection for output
    urlConnection.setDoOutput(true);

    // Send the transaction via HTTPS POST
    OutputStream outputStream = urlConnection.getOutputStream();
    outputStream.write(postVars.getBytes());
    outputStream.close();
} catch (SocketTimeoutException exception) {
    // The connect timeout expired before the connection was established
    LOG.warn("Connection to the payment gateway timed out", exception);
} catch (IOException exception) {
    LOG.error("Failed to send the payment request", exception);
}

try {
    // Set the read timeout (in milliseconds)
    urlConnection.setReadTimeout(getTimeoutSeconds() * 1000);

    // Get the response from Bank of America
    InputStream inputStream = urlConnection.getInputStream();
    while ((inputStreamCharacter = inputStream.read()) != -1) {
        responseText.append((char) inputStreamCharacter);
    }
    inputStream.close();
    LOG.debug("Bank of America responseText: " + responseText);
} catch (SocketTimeoutException exception) {
    // The read timeout expired while waiting for the bank's response;
    // this is the case we want to be able to simulate
    LOG.warn("Timed out waiting for the payment gateway response", exception);
} catch (IOException exception) {
    LOG.error("Failed to read the payment gateway response", exception);
}

This piece of code runs in an async payment task.
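For context, the task is submitted roughly like this (a simplified sketch; PaymentTask is a hypothetical stand-in for our actual worker class):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Simplified sketch; PaymentTask is a hypothetical Runnable wrapping the code above
ExecutorService paymentExecutor = Executors.newFixedThreadPool(4);
paymentExecutor.submit(new PaymentTask());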

Scenario

Here is what happens:

  1. We make a connection to the bank's payment gateway
  2. We send a payment request to the bank.
  3. We wait for the bank to send us a response to our request.
  4. We receive the response, parse it, and check for success.

Now, we would like to simulate latency after the request has been sent to the bank, i.e. before we receive a response from the bank, so that a SocketTimeoutException is raised in the second try/catch block.

Problem

The instance of the application that I need to work on is hosted on a server VM running Ubuntu 14.04.1. I have used Fiddler to introduce latency in our Windows-hosted applications in the past, but the tricky thing is that Fiddler is a UI-based program and I am not able to use it from a Linux shell. Also, we have not had much help from the bank; otherwise it would have been much easier to simulate all of this on the server side rather than on the client side.

Question

I have googled and have not been able to find a solution for this. Has anyone ever tried something along these lines? If so, how can we do it? Any suggestions would be welcome.

Upvotes: 4

Views: 1125

Answers (2)

EricLaw

Reputation: 57085

Fiddler is a proxy server; you can simply run it on one machine (anywhere) and point the proxy settings of the Linux client at it.

See Monitor Remote requests for more information.
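For a Java client, one way to do that (a sketch; the host below is a placeholder, and 8888 is Fiddler's default listening port) is to set the JVM's standard proxy properties before the connection is opened:

// Route the JVM's HTTP and HTTPS traffic through a remote Fiddler instance.
// 192.0.2.10 is a placeholder for the machine running Fiddler;
// 8888 is Fiddler's default listening port.
System.setProperty("http.proxyHost", "192.0.2.10");
System.setProperty("http.proxyPort", "8888");
System.setProperty("https.proxyHost", "192.0.2.10");
System.setProperty("https.proxyPort", "8888");

The same properties can also be passed on the command line as -D flags, and Fiddler needs "Allow remote computers to connect" enabled in its connection options.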

Upvotes: 0

Master.Aurora

Reputation: 1038

I have been able to find a workaround for testing this, taking help from this answer. While httpbin is an amazing project, it was missing the ability to delay the response to a POST request. I therefore forked the repository and added the required endpoint myself. The fork is available for anyone who needs it.

Now, one can simply change the gateway URL to an httpbin-based URL using the /post/delay/ endpoint, and a delayed response to the POST request will be generated as a result.
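For example, here is a minimal sketch of how the delayed endpoint can be used to force the read timeout (the host and the 5-second delay are placeholder assumptions; the read timeout just needs to be shorter than the server-side delay):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.SocketTimeoutException;
import java.net.URL;

public class DelayedGatewayTest {
    public static void main(String[] args) throws Exception {
        // Placeholder host running the httpbin fork; /post/delay/5 is assumed
        // to delay the POST response by 5 seconds
        URL url = new URL("http://httpbin.example.com/post/delay/5");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setConnectTimeout(2 * 1000);
        connection.setReadTimeout(2 * 1000); // shorter than the 5-second delay
        connection.setDoOutput(true);

        // Send a dummy POST body
        OutputStream outputStream = connection.getOutputStream();
        outputStream.write("amount=100&currency=USD".getBytes());
        outputStream.close();

        try {
            connection.getInputStream(); // blocks waiting for the delayed response
            System.out.println("Unexpected: the response arrived before the timeout");
        } catch (SocketTimeoutException expected) {
            // The read timeout fires first, reproducing the latency scenario
            System.out.println("Read timed out as intended: " + expected.getMessage());
        }
    }
}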

Upvotes: 2
