Reputation: 1
I'm trying to scrape ticket availability across multiple event dates.
Each event date has its own eventID; for the 23 possible dates the eventIDs are 1001-1023.
I have started doing this manually, and the code below gives me all the seat availability for a given date, but repeating it another 22 times cannot be the most efficient way:
import requests

f = open('tickets.txt', 'a')
r = requests.get('https://www.website.com/events/1000/tickets/seatmap?sectionid=3')
d = r.json()                        # parse the JSON response body
zones = d['zones']
for key, value in zones.items():    # .items() in Python 3 (was .iteritems())
    print((key, value), file=f)     # Python 3 print-to-file
I want to loop through the eventIDs and print the availability for all dates at once. However, I'm having trouble building up the request URL. So far I've created:
eventIDs = range(1001, 1024)   # range() excludes the end value, so this covers 1001-1023
baseurl = "https://www.website.com/events/"
sectionId = "/tickets/seatmap?sectionId=3"
UPDATE: I think I've got there; this is what I think works:
for i in eventIDs:
    url = baseurl + str(i) + sectionId
    r = requests.get(url)
    d = r.json()
    print(d, file=f)
Is this the best way to do this? Any help much appreciated. Thanks.
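For reference, the URL construction in the update can be condensed into a list comprehension. This is only a sketch using the question's placeholder domain; note that `range()` excludes its end value, so 1024 is needed to cover event IDs 1001-1023:

```python
# The domain is the question's placeholder; range() excludes its end,
# so range(1001, 1024) covers event IDs 1001-1023.
base_url = "https://www.website.com/events/"
section_id = "/tickets/seatmap?sectionId=3"
urls = [base_url + str(i) + section_id for i in range(1001, 1024)]
```

Each URL can then be fetched in the loop exactly as in the update.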
Upvotes: 0
Views: 412
Reputation: 6749
You should consider making your REST calls asynchronous. If you want to stick with a requests-ish style, you can use grequests:
# Python3
import grequests

event_ids = range(1001, 1024)   # exclusive end: covers IDs 1001-1023
base_url = "https://www.website.com/events/"
section_id = "/tickets/seatmap?sectionId=3"
# Create a list of urls
urls = [base_url + str(i) + section_id for i in event_ids]
# Prepare the requests
rs = (grequests.get(u) for u in urls)
# Send them all concurrently
results = grequests.map(rs)
Or you can make use of asyncio with aiohttp. If you're interested and want to see what that looks like, you can visit this question
Upvotes: 2