Reputation: 75
I am trying to share a session between two functions. After logging in, I should be able to access other pages that are only available when authenticated.
import asyncio
import aiohttp
import time

class Http:
    async def __aenter__(self):
        self._session = aiohttp.ClientSession()
        return self

    async def __aexit__(self, *err):
        await self._session.close()
        self._session = None

    async def do_post(self, url, data, headers):
        async with self._session.post(url, data=data, headers=headers) as resp:
            resp.raise_for_status()
            return await resp.read()

    async def do_get(self, url, headers):
        async with self._session.get(url, headers=headers) as resp:
            resp.raise_for_status()
            return await resp.read()
async def Login():
    url = "https:/userlogin"
    data = {
        'email': '[email protected]',
        'pswd': '12345'
    }
    headers = {
        'Content-type': 'application/x-www-form-urlencoded; charset=UTF-8',
        'Accept': 'application/json, text/javascript, */*; q=0.01',
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:107.0) Gecko/20100101 Firefox/107.0',
    }
    async with Http() as http:
        try:
            data = await asyncio.gather(
                http.do_post(url, data=data, headers=headers))
            return data
        except Exception as e:
            print("Exception Login: ", e)

async def do_something():
    url = "https://url_test.it?IsWork=0"
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:107.0) Gecko/20100101 Firefox/107.0',
        'Content-type': 'application/x-www-form-urlencoded; charset=UTF-8'
    }
    async with Http() as http:
        try:
            data = await http.do_get(url, headers=headers)
            return data
        except Exception as e:
            print("Exception do_something: ", e)

results = asyncio.run(Login())
print(results)
time.sleep(10)
results = asyncio.run(do_something())
print(results)
I log in successfully, but when I then call the do_something() function, its return value tells me the session has timed out. How can I use the same aiohttp session across the two functions?
UPDATE
Obviously, with a requests session, following what is suggested in this post: python-requests keep session between function, it is possible to pass the same session between functions.
Is it possible to do the same thing with aiohttp.ClientSession()?
s = requests.session()

# we're now going to use the session in 3 different function calls
login_to_site(s)
page1 = scrape_page(s, 'page1')
page2 = scrape_page(s, 'page2')

# once this function ends we either need to pass the session up to the
# calling function or it will be gone forever

def login_to_site(s):
    s.post('http://www.example.com/login')

def scrape_page(s, name):
    page = s.get('http://www.example.com/secret_page/{}'.format(name))
    return page
Upvotes: 1
Views: 1787
Reputation: 926
Yes, you can create a ClientSession in aiohttp and pass it to other functions like this:
import asyncio
import aiohttp

async def login_to_site(s):
    await s.post('http://www.example.com/login')

async def scrape_page(s, name):
    page = await s.get('http://www.example.com/secret_page/{}'.format(name))
    return page

async def main():
    name = "some_name"
    async with aiohttp.ClientSession() as session:
        await login_to_site(session)
        result = await scrape_page(session, name)

if __name__ == "__main__":
    asyncio.run(main())
The session is closed after the with block.
To run async code you need to be inside an asynchronous function, so you first have to wrap the aiohttp part in an async function and then run it with asyncio.
To run an async function, you call it with (), which returns a coroutine, and then await that coroutine.
If you want to pass the session to other functions and use it there, those functions must also be coroutines.
After that your code should run as expected.
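Applied to the situation in the question, the key point is that one ClientSession must span both the login and the later request, because the login cookie lives in that session's cookie jar. Here is a self-contained sketch of that pattern; the endpoints, cookie name, and port are made up, and a tiny throwaway aiohttp server stands in for the real site so the example can actually run:

```python
import asyncio
import aiohttp
from aiohttp import web

async def handle_login(request):
    # Hypothetical login endpoint: sets a session cookie.
    resp = web.Response(text="logged in")
    resp.set_cookie("sessionid", "abc123")
    return resp

async def handle_secret(request):
    # Hypothetical protected endpoint: requires that cookie.
    if request.cookies.get("sessionid") != "abc123":
        return web.Response(text="please log in", status=401)
    return web.Response(text="secret page")

async def login_to_site(session, base):
    async with session.post(base + "/login") as resp:
        resp.raise_for_status()

async def scrape_page(session, base):
    async with session.get(base + "/secret") as resp:
        resp.raise_for_status()
        return await resp.text()

async def main():
    # Spin up a throwaway local server so the sketch is self-contained.
    app = web.Application()
    app.router.add_post("/login", handle_login)
    app.router.add_get("/secret", handle_secret)
    runner = web.AppRunner(app)
    await runner.setup()
    site = web.TCPSite(runner, "localhost", 8797)
    await site.start()
    try:
        # One ClientSession spans both calls: the cookie received at
        # login is stored in its jar and sent with the second request.
        async with aiohttp.ClientSession() as session:
            base = "http://localhost:8797"
            await login_to_site(session, base)
            return await scrape_page(session, base)
    finally:
        await runner.cleanup()

print(asyncio.run(main()))
```

With two separate asyncio.run() calls, as in the question, each `async with Http()` creates a fresh session with an empty cookie jar, which is why the second request looks unauthenticated.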
Upvotes: 1
Reputation: 75
I'd like to exploit the potential of aiohttp, and therefore of async/await. In my program there are calls that need to be invoked asynchronously, waiting for the result before moving forward.
Now this should be the same code using aiohttp.ClientSession:
from aiohttp import ClientSession
import asyncio

async def main():
    session = ClientSession()
    headers = {"Some": "Header"}
    data = {"username": "foo",
            "password": "bar",
            "csrf_token": csrf}
    async with session.post(URL_Login, data=data, headers=headers) as Login:
        print(await Login.text())
    async with session.get(URL_Scrape) as Scrape:
        html = await Scrape.text()

asyncio.run(main())
The URL_Scrape HTML page is still the login page (the JSON result invites you to log in, just as the website does after an unsuccessful login request: it redirects me to the login page). It is surely what you are saying: the session is closed after the with block.
How can I keep the same session active between multiple functions? Should cookies be used?
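On the cookie question: if the requests really must live in separate sessions (for example, separate asyncio.run() calls), aiohttp's CookieJar can be saved to disk with save() and restored with load(), then attached to a new session via ClientSession(cookie_jar=jar). A minimal sketch of the round trip; the cookie name, value, and URL are made up for illustration, and update_cookies() here simulates what a real login response would do:

```python
import asyncio
import os
import tempfile
import aiohttp
from yarl import URL

async def save_and_reload_cookies():
    path = os.path.join(tempfile.mkdtemp(), "cookies.pickle")

    # Pretend a login response for this URL set the cookie below.
    jar = aiohttp.CookieJar()
    jar.update_cookies({"sessionid": "abc123"}, URL("http://example.com"))
    jar.save(path)  # pickle the jar to disk

    # A later, separate run rebuilds a jar from the same file and can
    # pass it to aiohttp.ClientSession(cookie_jar=fresh).
    fresh = aiohttp.CookieJar()
    fresh.load(path)
    return fresh.filter_cookies(URL("http://example.com"))["sessionid"].value

print(asyncio.run(save_and_reload_cookies()))
```

That said, the simpler fix is usually structural: keep a single ClientSession open in one top-level coroutine and pass it into both the login and the scraping functions, so no cookie persistence is needed at all.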
Upvotes: 0