Fried Rice

Reputation: 3713

Handle Java socket concurrency

I am building a server that sends data to each user over a single TCP socket every 2 seconds, on a separate thread. Special events are also occasionally sent alongside the regular data. Sometimes the data from multiple packets gets mixed up, so I created a queue to make sure that does not happen. However, the issue is still there. Is my approach incorrect, or is there something wrong with my code?

    protected void sendData (byte[] data) {
        if (isSendingData) {
            dataQueue.add(data);
            return;
        }

        isSendingData = true;
        Thread sendThread = new Thread() {
            public void run () {
                try {
                    BufferedOutputStream outStream = new BufferedOutputStream(connectionSocket.getOutputStream());
                    outStream.write(data);
                    outStream.flush();

                    // check queue, if there are data, send
                    byte[] moreData = null;
                    if (dataQueue.size() > 0) {
                        moreData = dataQueue.remove(0);
                    }
                    isSendingData = false;
                    if (moreData != null) {
                        sendData(moreData);
                    }
                }
                catch (Exception e) {
                    System.out.println ("Error sending data to peripheral: " + e);
                    isSendingData = false;
                }
            }
        };

        sendThread.start ();
    }

Upvotes: 0

Views: 568

Answers (2)

user207421

Reputation: 310884

I would do this completely differently. You don't want arbitrarily long queues in your application.

  • Have your heartbeat thread synchronize on the socket when sending the heartbeat.
  • Don't have it sending anything else.
  • Get rid of the queue, isSendingData, etc.
  • Have your main application synchronize on the socket when it wants to send, and just send whenever it needs to.
  • Use the same BufferedOutputStream or BufferedWriter for all sending, and flush it after each send (a minimal sketch of this follows below).
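
A rough sketch of that arrangement, assuming the socket is the question's connectionSocket; the PeerConnection wrapper and its names are only illustrative:

    import java.io.BufferedOutputStream;
    import java.io.IOException;
    import java.net.Socket;

    public class PeerConnection {
        // One buffered stream per socket, created once and shared by every sender.
        private final BufferedOutputStream out;

        public PeerConnection(Socket connectionSocket) throws IOException {
            this.out = new BufferedOutputStream(connectionSocket.getOutputStream());
        }

        // Called by the heartbeat thread every 2 seconds and by the main
        // application whenever it has something to send. Synchronizing on the
        // stream means only one writer touches the socket at a time, so the
        // bytes of different messages can never interleave.
        public void send(byte[] data) throws IOException {
            synchronized (out) {
                out.write(data);
                out.flush();
            }
        }
    }

Because every send goes through the same lock and the same stream, there is nothing to queue and nothing to grow without bound; a slow send simply makes the next caller wait briefly.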

Upvotes: 1

Marko Topolnik

Reputation: 200148

The proper idiom to remove concurrency issues using a queue is to have a long-lived thread run an infinite loop which takes elements from the queue and processes them. Typically you'll use a blocking queue so that on each iteration the thread goes to sleep until there is an item to process.

Your solution deviates from the above in several respects. For example:

    if (isSendingData) {
        dataQueue.add(data);
        return;
    }

    isSendingData = true;

—if this method is called concurrently, this will result in a race condition: both threads can read isSendingData as false, then concurrently proceed to sending data over the network. If isSendingData isn't volatile, you've also got a data race on it (entirely separate from the race condition explained above).

                if (dataQueue.size() > 0) {
                    moreData = dataQueue.remove(0);
                }

—this is another race condition: after you read size as zero, the other thread can add an item to the queue. Now that item will possibly never be processed. It will linger in the queue until another such thread is started.

The more obvious way your solution is incorrect is that the thread you start has no loop: it is intended to process just one message, plus possibly one extra message from the queue. It should be reworked so that there are no special cases and sendData always, unconditionally, submits to a queue and never does any sending on its own.
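
As an illustration only, here is a minimal sketch of that idiom using a LinkedBlockingQueue and a single long-lived sender thread; the SocketSender class and its names are invented for the example, not taken from the question:

    import java.io.BufferedOutputStream;
    import java.io.IOException;
    import java.net.Socket;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class SocketSender {
        private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
        private final BufferedOutputStream out;

        public SocketSender(Socket connectionSocket) throws IOException {
            this.out = new BufferedOutputStream(connectionSocket.getOutputStream());
            Thread sender = new Thread(() -> {
                try {
                    while (true) {
                        // Blocks until something is queued, so the thread sleeps when idle.
                        byte[] data = queue.take();
                        out.write(data);
                        out.flush();
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();   // asked to shut down
                } catch (IOException e) {
                    System.out.println("Error sending data to peripheral: " + e);
                }
            });
            sender.setDaemon(true);
            sender.start();
        }

        // Producers (the 2-second heartbeat, special events, anything else)
        // only ever enqueue; the dedicated sender thread is the only code
        // that writes to the socket, so messages cannot interleave.
        public void sendData(byte[] data) {
            queue.add(data);
        }
    }

Since there is exactly one consumer and it processes one message at a time, the isSendingData flag and the manual size() check disappear along with their race conditions.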

Upvotes: 2
