Reputation: 2272
To close the application as soon as possible, can I interrupt a requests.post call from another thread and have it terminate the connection immediately?
I played with adapters, but no luck so far:
for ad in self.client.session.adapters.values():
    ad.close()
Upvotes: 9
Views: 9326
Reputation: 15558
The right way to do this is to use message passing into the other thread. We can do a poor man's version of this by using a shared global variable. As an example, you can try running this script:
#!/usr/bin/env python
# A test script to verify that you can abort streaming downloads of large
# files.
import threading
import time

import requests

stop_download = False


def download(url):
    r = requests.get(url, stream=True)
    content_gen = r.iter_content(1024)   # generator over 1 KiB chunks
    while not stop_download:
        try:
            next(content_gen)            # pull the next chunk off the wire
        except StopIteration:            # download finished on its own
            break

    if stop_download:
        print('Killed from other thread!')
        r.close()


if __name__ == '__main__':
    t = threading.Thread(
        target=download,
        args=('http://ftp.freebsd.org/pub/FreeBSD/ISO-IMAGES-amd64/9.1/FreeBSD-9.1-RELEASE-amd64-dvd1.iso',),
    )
    t.start()
    time.sleep(5)
    stop_download = True
    time.sleep(5)  # Just to make sure you believe that the message actually stopped the other thread.
When doing this in production, especially if you don't have the protection of the GIL, you will want to use more caution around the message-passing state to avoid awkward multithreading bugs. I'm leaving that up to the implementor.
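As a minimal sketch of what that extra caution could look like (my own illustration, not part of the script above), the standard library's threading.Event can carry the stop message instead of a bare global; it is safe to set from any thread and makes the intent explicit:

import threading
import time

import requests

stop_download = threading.Event()

def download(url):
    r = requests.get(url, stream=True)
    try:
        for _ in r.iter_content(1024):
            if stop_download.is_set():   # checked once per chunk
                print('Killed from other thread!')
                break
    finally:
        r.close()

if __name__ == '__main__':
    t = threading.Thread(
        target=download,
        args=('http://ftp.freebsd.org/pub/FreeBSD/ISO-IMAGES-amd64/9.1/FreeBSD-9.1-RELEASE-amd64-dvd1.iso',),
    )
    t.start()
    time.sleep(5)
    stop_download.set()
    t.join()

The worker only notices the event between chunks, so a very large chunk size delays the stop accordingly.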
Upvotes: 5
Reputation: 2272
I found a way; here is how to interrupt the connection:
import logging
import threading
import time
import requests

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger(__name__)

def close():
    time.sleep(5)  # give the main thread time to start the download below
    r.raw._fp.close()  # private member: closes the socket under the response

threading.Thread(target=close).start()

print("getting")
s = requests.Session()
r = s.get("http://download.thinkbroadband.com/1GB.zip", stream=True)
for chunk in r.iter_content(1024):
    log.debug("got it: %s", len(chunk))
print("done")
However, it is a hack and I don't like it: private members can change in the future, so I am going back to urllib2.
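For reference, a rough sketch of what that urllib-based route might look like, using Python 3's urllib.request (the successor to urllib2). Here close() is a public method of the response object rather than a private member, though whether it actually unblocks a read that is already in progress still depends on the platform and timing:

import threading
import time
import urllib.request

r = urllib.request.urlopen("http://download.thinkbroadband.com/1GB.zip")

def close_later():
    time.sleep(5)
    r.close()  # public API on http.client.HTTPResponse

threading.Thread(target=close_later).start()

received = 0
try:
    while True:
        chunk = r.read(1024)
        if not chunk:  # read() returns b"" once the response is closed
            break
        received += len(chunk)
except (ValueError, OSError):
    pass  # a read caught mid-close may fail instead; either way the loop ends
print("stopped after", received, "bytes")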
Upvotes: 2
Reputation: 28845
So if you do the following from the interactive shell, you'll see that closing the adapters doesn't appear to do what you're looking for.
>>> import requests
>>> s = requests.session()
>>> s.close()
>>> s.get('http://httpbin.org/get')
<Response [200]>
>>> for _, adapter in s.adapters.items():
...     adapter.close()
...
>>> s.get('http://httpbin.org/get')
<Response [200]>
>>> s.get('https://httpbin.org/get')
<Response [200]>
This looks like it may be a bug in requests. In general, closing the adapter should prevent you from making further requests, but I'm not entirely sure it will interrupt currently running requests.
Looking at HTTPAdapter (which powers both the standard 'http://' and 'https://' adapters), calling close on it will call clear on the underlying urllib3 PoolManager. From urllib3's documentation of that method you see that:
This will not affect in-flight connections, but they will not be
re-used after completion.
So, in essence, you cannot affect a connection that has not yet completed.
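To make the relationship concrete, here is a small sketch (my own illustration, not from the documentation quoted above) using the public Session.get_adapter() helper to grab the mounted HTTPAdapter and close it. Idle pooled connections are dropped, later requests simply build a fresh pool, and, per the urllib3 docs quoted above, anything already in flight is left alone:

import requests

s = requests.Session()
s.get('https://httpbin.org/get')                    # opens and pools a connection

adapter = s.get_adapter('https://httpbin.org/get')  # the mounted HTTPAdapter
adapter.close()                                     # clears its urllib3 PoolManager

r = s.get('https://httpbin.org/get')                # still works: a new pool is created
print(r.status_code)                                # 200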
Upvotes: 0