Reputation: 5006
Systems such as beanstalk client-server communications or HTTP pipelining preserve the ordering between multiple requests and responses over a shared connection.
In other words, the first request received by the server is the first one the client gets a response back for.
When using Python streams to access such a system as a client, does Python likewise maintain ordering between writers and readers? In this scenario we'd have a StreamWriter and a StreamReader instance. The code accessing the stream would live in a single place, such as in the example below, but could be called repeatedly from different parts of the app before any response is received back.
import asyncio

async def request(host, port, data):
    reader, writer = await asyncio.open_connection(host, port)
    writer.write(data)
    await writer.drain()
    response = await reader.read(100)  # a StreamReader is not callable; use read()/readline()
    ...
So when our application has called request() and written to the stream, the data is transmitted, the remote server does its work and eventually returns a response. Until then we await the response while other code runs, calling our request() function to send more requests and awaiting their responses in turn.
So, to put the question in more practical terms: do the many awaiting readers always receive the responses in the same order the requests were sent, or not?
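Concretely, the usage pattern can be sketched as follows. This is a runnable illustration, not the real setup: the local echo server, the ephemeral port, and the Test payloads are stand-ins for the actual remote service.

```python
import asyncio

# Illustrative stand-in for the remote service: echoes each line back.
async def handle(reader, writer):
    data = await reader.readline()
    writer.write(b"Got: " + data)
    await writer.drain()
    writer.close()

async def request(host, port, data):
    reader, writer = await asyncio.open_connection(host, port)
    writer.write(data)
    await writer.drain()
    response = await reader.readline()
    writer.close()
    return response

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]  # ephemeral port
    async with server:
        # Several requests in flight before any response is awaited back.
        responses = await asyncio.gather(
            *(request("127.0.0.1", port, b"Test %d\n" % n) for n in range(1, 6))
        )
    return responses

responses = asyncio.run(main())
print(responses)
```

Note that asyncio.gather() returns its results in call order by construction; it says nothing about the order in which the responses actually arrived on the wire.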
A somewhat similar question that, however, doesn't answer this one: Does Python's asyncio lock.acquire maintain order?
Upvotes: 2
Views: 653
Reputation: 1363
The answer is: no.
To my knowledge there is no ordered queue guaranteeing the order of processing.
A simple PoC is in this gist: https://gist.github.com/HQJaTu/345f7147065f1e10587169dc36cc1edb
When run with --server
, a trivial asyncio server is started. To simulate processing load, each request is given a bump-in-the-road of at most 3 seconds with: await asyncio.sleep(randrange(3000) / 1000)
To simulate your question, the client can run a specified number of parallel requests.
Example run:
$ asyncio-ordering-PoC/tester.py --client --count 5
2020-09-23 10:28:10,539 [MainThread ] [INFO ] Running client to localhost TCP-port: 8888
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Send: Test 1
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Send: Test 2
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Send: Test 3
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Send: Test 4
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Send: Test 5
2020-09-23 10:28:11,105 [MainThread ] [DEBUG] Received: Got: 'Test 2'
2020-09-23 10:28:11,105 [MainThread ] [DEBUG] Close the socket
2020-09-23 10:28:11,443 [MainThread ] [DEBUG] Received: Got: 'Test 4'
2020-09-23 10:28:11,444 [MainThread ] [DEBUG] Close the socket
2020-09-23 10:28:12,404 [MainThread ] [DEBUG] Received: Got: 'Test 1'
2020-09-23 10:28:12,404 [MainThread ] [DEBUG] Close the socket
2020-09-23 10:28:12,619 [MainThread ] [DEBUG] Received: Got: 'Test 5'
2020-09-23 10:28:12,619 [MainThread ] [DEBUG] Close the socket
2020-09-23 10:28:12,922 [MainThread ] [DEBUG] Received: Got: 'Test 3'
2020-09-23 10:28:12,922 [MainThread ] [DEBUG] Close the socket
2020-09-23 10:28:12,922 [MainThread ] [INFO ] Done.
Server log:
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Received Test 1 from ('127.0.0.1', 47252)
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Received Test 2 from ('127.0.0.1', 47254)
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Received Test 3 from ('127.0.0.1', 47256)
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Received Test 4 from ('127.0.0.1', 47258)
2020-09-23 10:28:10,541 [MainThread ] [DEBUG] Received Test 5 from ('127.0.0.1', 47260)
2020-09-23 10:28:11,104 [MainThread ] [DEBUG] Sending after delay of 562 ms: Test 2
2020-09-23 10:28:11,443 [MainThread ] [DEBUG] Sending after delay of 901 ms: Test 4
2020-09-23 10:28:12,404 [MainThread ] [DEBUG] Sending after delay of 1860 ms: Test 1
2020-09-23 10:28:12,618 [MainThread ] [DEBUG] Sending after delay of 2076 ms: Test 5
2020-09-23 10:28:12,921 [MainThread ] [DEBUG] Sending after delay of 2379 ms: Test 3
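For completeness: if strict request/response ordering is required, one possible workaround (my own sketch, not part of the gist) is to serialize the calls behind an asyncio.Lock, so that the next request is not sent until the previous response has been read. asyncio.Lock wakes its waiters in FIFO order, which is exactly what the lock.acquire question linked above is about. The server, delays, and payloads below are illustrative, with the PoC's random delay scaled down.

```python
import asyncio
from random import randrange

# Stand-in server with a scaled-down version of the PoC's random
# bump-in-the-road.
async def handle(reader, writer):
    data = await reader.readline()
    await asyncio.sleep(randrange(100) / 1000)
    writer.write(b"Got: " + data)
    await writer.drain()
    writer.close()

async def ordered_request(lock, port, n, received):
    # Serialize: the next request is not sent until the previous
    # response has been fully read.
    async with lock:
        reader, writer = await asyncio.open_connection("127.0.0.1", port)
        writer.write(b"Test %d\n" % n)
        await writer.drain()
        received.append(await reader.readline())
        writer.close()

async def main():
    lock = asyncio.Lock()  # created inside the running event loop
    received = []
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]  # ephemeral port
    async with server:
        await asyncio.gather(*(ordered_request(lock, port, n, received)
                               for n in range(1, 6)))
    return received

received = asyncio.run(main())
print(received)
```

This restores ordering at the cost of concurrency: the requests no longer overlap, so it is only worth doing when ordering matters more than throughput.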
Upvotes: 2