krispet krispet

Reputation: 1678

UDP broadcast and automatic server discovery in Python, TCP socket unavailable

I'm developing a reverse shell application in Python, and right now I'm trying to implement an autodiscovery feature. It should work as follows:

  1. The server broadcasts the IP/port it listens for connections on and waits for a client. If no client tries to connect within a few seconds, it broadcasts again (and repeats until a connection is made).
  2. The client tries to receive the server's broadcast and connects to the advertised IP/port.

The broadcast works fine: the client receives the IP/port and successfully connects. However, after using the connected pair of sockets, I get this on the server side:

socket.error: [Errno 35] Resource temporarily unavailable

Server side test code:

import socket

sckt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sckt.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sckt.settimeout(2)
sckt.bind(('', 9999))
sckt.listen(5)

broadcastSocket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcastSocket.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
broadcastSocket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
while True:
    broadcastSocket.sendto(socket.gethostbyname(socket.getfqdn()) + ' ' + str(9999), ('<broadcast>', 8888))
    try:
        sock, address = sckt.accept()
        break
    except socket.timeout:
        pass
broadcastSocket.close()
sckt.settimeout(None)

sock.send('test')
# if I add time.sleep(1) here, it works, but I don't get why
# would sock be unavailable at first, but available a second later
print sock.recv(1) # this is where it fails
# note that it also fails with any recv buffer size, for instance 1024

Why on earth would I want to receive 1 byte of data, you might ask. I have an algorithm that prefixes messages with their lengths, and the receiver reads this prefix byte by byte until a delimiter; that's why. Client side test code:

import socket

broadcastSocket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcastSocket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
broadcastSocket.settimeout(3)
broadcastSocket.bind(('', 8888))
while True:
    try:
        data = broadcastSocket.recv(1024)
        break
    except socket.timeout:
        pass
broadcastSocket.close()

sckt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sckt.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sckt.connect((str(data.split()[0]), int(data.split()[1])))

print sckt.recv(1024)
sckt.send('lel')
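For context, the length-prefix framing mentioned above looks roughly like this (a simplified sketch in Python 3 syntax, not the actual code; the ':' delimiter and the function names are illustrative):

```python
import socket

DELIM = b':'  # illustrative delimiter; the question doesn't say which one is used

def frame(payload):
    # b'hello' -> b'5:hello' -- length prefix, delimiter, then payload
    return str(len(payload)).encode() + DELIM + payload

def recv_framed(sock):
    # Read the length prefix one byte at a time until the delimiter,
    # which is why the server calls recv(1) in the first place.
    prefix = b''
    while True:
        byte = sock.recv(1)
        if not byte:
            raise EOFError('connection closed while reading length prefix')
        if byte == DELIM:
            break
        prefix += byte
    # Then read exactly <length> payload bytes (recv may return fewer per call).
    remaining = int(prefix)
    chunks = []
    while remaining:
        chunk = sock.recv(remaining)
        if not chunk:
            raise EOFError('connection closed mid-payload')
        chunks.append(chunk)
        remaining -= len(chunk)
    return b''.join(chunks)
```

With this scheme the receiver never needs to guess a buffer size: the prefix tells it exactly how many bytes make up one message.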

If I omit the whole broadcast and autodiscovery part of the code and simply enter the IP/port of the server manually, print sock.recv(1) doesn't fail. Any clues on what the issue might be?

Upvotes: 3

Views: 3944

Answers (1)

mementum

Reputation: 3203

Change sckt.settimeout(None) to sock.settimeout(None) in the server code.

You want the accepted socket (sock) in blocking mode, not the accepting one (sckt).

This ensures that sock.recv waits for an incoming message from the client.

P.S. sock.setblocking(1) does exactly the same thing.
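A minimal runnable sketch of the fix (Python 3 syntax, localhost with an OS-assigned port as a stand-in for the question's setup). In the asker's Python 2 environment the accepted socket inherited the listening socket's timeout, so the first recv raised EAGAIN; Python 3 clears that state automatically on accept(), but setting it explicitly is harmless and makes the intent clear:

```python
import socket
import threading

# Listening socket with a timeout, as in the question's server code.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.settimeout(2)                   # timeout on the *accepting* socket
server.bind(('127.0.0.1', 0))          # port 0: let the OS pick a free port
server.listen(5)
port = server.getsockname()[1]

def client():
    c = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    c.connect(('127.0.0.1', port))
    c.sendall(b'lel')
    c.close()

t = threading.Thread(target=client)
t.start()

sock, address = server.accept()
sock.settimeout(None)                  # the fix: blocking mode on the *accepted* socket

# recv now blocks until the client's bytes arrive, instead of failing
# immediately with "Resource temporarily unavailable".
data = b''
while len(data) < 3:
    chunk = sock.recv(3 - len(data))
    if not chunk:
        break
    data += chunk

t.join()
sock.close()
server.close()
print(data)
```

The timeout stays on sckt (so the broadcast loop keeps retrying), while the per-connection socket returned by accept() blocks normally.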

Upvotes: 2
