Reputation: 996
This is a very unexpected behavior:
Here is a classic textbook short program that uses one thread to get characters one by one from a socket stream and display them, and a second thread to read keyboard input and send it over the same socket stream.
import socket
import threading
import getch
import sys

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_TCP, socket.TCP_NODELAY, 1)

def rxWorker():
    while True:
        r = s.recv(1)
        print(r.decode('ascii'), end='')

def txWorker():
    while True:
        i = input('')
        s.send(i.encode())

s.connect(('localhost', 9999))
threading.Thread(name='Rx', target=rxWorker).start()
threading.Thread(name='Tx', target=txWorker).start()
This works against a netcat listener that is running in a different terminal:
nc -l localhost 9999
At this point everything works well. Lines are sent from side to side and appear as expected.
Now the input is changed to immediate, so the python side sends characters as they are typed (not waiting for a newline), like so:
import socket
import threading
import getch
import sys

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_TCP, socket.TCP_NODELAY, 1)

def rxWorker():
    while True:
        r = s.recv(1)
        print(r.decode('ascii'), end='')

def txWorker():
    while True:
        # i = input('')
        i = getch.getche()
        s.send(i.encode())

s.connect(('localhost', 9999))
threading.Thread(name='Rx', target=rxWorker).start()
threading.Thread(name='Tx', target=txWorker).start()
Note that the only change is the way the input is read: i = getch.getche() vs. i = input('')
Now the behavior is different.
Characters from the python side appear at the netcat side correctly and immediately.
The problem: characters from the netcat side now do not show at the python side immediately. They actually do not show until one or several characters are sent from python to netcat.
This is very strange and kind of breaks my control flow.
Please advise.
System: Ubuntu 16.04, Python 3.5.2
Upvotes: 1
Views: 124
Reputation: 365895
The main problem is that you haven't actually enabled TCP_NODELAY.
This line does not do what you think it does:
s = socket.socket(socket.AF_INET, socket.TCP_NODELAY)
The arguments to socket are:
socket.socket(family=AF_INET, type=SOCK_STREAM, proto=0, fileno=None)
It just so happens that, at least on *nix systems, TCP_NODELAY is 1, and SOCK_STREAM is an enum with value 1, so by passing TCP_NODELAY as the type you're selecting SOCK_STREAM, aka TCP. When a future version of the socket library switches to using enums for sockopt values (as it already does for the types), this will become a TypeError instead of silently doing something unexpected that occasionally happens to sort of work.
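You can see the collision directly in the interpreter; a quick check (the values shown assume Linux, where both constants happen to be 1):

```python
import socket

# On Linux, the plain int TCP_NODELAY and the SOCK_STREAM enum share the
# value 1, so socket.socket(socket.AF_INET, socket.TCP_NODELAY) silently
# selects SOCK_STREAM rather than enabling any socket option.
print(int(socket.TCP_NODELAY))                    # 1 on Linux
print(int(socket.SOCK_STREAM))                    # 1
print(socket.TCP_NODELAY == socket.SOCK_STREAM)   # True on Linux
```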
If you want to enable a sockopt value like TCP_NODELAY, you have to call setsockopt on the socket. Like this:
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_TCP, socket.TCP_NODELAY, 1)
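If you want to be sure the option actually took effect, you can read it back with getsockopt; a quick sanity check (sketch, using the same SOL_TCP level as above):

```python
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_TCP, socket.TCP_NODELAY, 1)

# Read the option back: a nonzero value means Nagle's algorithm is disabled.
print(s.getsockopt(socket.SOL_TCP, socket.TCP_NODELAY))
s.close()
```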
As a side note, there's at least one other obvious error in your code, although it isn't causing any problems:
rxt = threading.Thread(name='Rx', target=rxWorker).start()
Thread.start returns None. So you're setting rxt (and txt) to None. Fortunately, you don't actually do anything with them, but as soon as you do, e.g., rxt.join(), you're going to get an AttributeError.
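To keep a usable handle, store the Thread object first and call start() on its own line; a minimal sketch (the empty worker body is just for illustration):

```python
import threading

def rxWorker():
    pass  # placeholder body for illustration

# Keep the Thread object itself; Thread.start() returns None.
rxt = threading.Thread(name='Rx', target=rxWorker)
rxt.start()
rxt.join()  # works now, because rxt is the Thread, not None
```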
Upvotes: 2