Reputation: 76
I am trying to obtain the COVID-19 data present in the different worksheets of the following Google Sheet. Although the sheet is open for public use, the URL only returns the first worksheet. I want to scrape all the worksheets. Can anyone help? Here's the Google Sheet link:
Upvotes: 3
Views: 6572
Reputation: 9440
You can do this with requests. When a Google Sheet is published to the web, every worksheet is rendered as a separate table in the source of a single HTML document, so you can simply iterate through the tables and write each one to its own CSV.
from bs4 import BeautifulSoup
import csv
import requests

# Fetch the published HTML version of the sheet; every worksheet
# appears as a separate <table> in this one document.
html = requests.get('https://docs.google.com/spreadsheets/d/e/2PACX-1vSc_2y5N0I67wDU38DjDh35IZSIS30rQf7_NYZhtYYGU1jJYT6_kDx4YpF-qw0LSlGsBYP8pqM_a1Pd/pubhtml').text
soup = BeautifulSoup(html, "lxml")

# Write each worksheet's table to its own numbered CSV file.
for index, table in enumerate(soup.find_all("table")):
    with open(f"{index}.csv", "w", newline="") as f:
        wr = csv.writer(f, quoting=csv.QUOTE_NONNUMERIC)
        wr.writerows([[td.text for td in row.find_all("td")]
                      for row in table.find_all("tr")])
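For comparison, here is a minimal sketch of the same idea using pandas.read_html, which parses every table on the page into a DataFrame in one call (this assumes pandas and lxml are installed; the URL is the same published-HTML link as above):

import pandas as pd

URL = ('https://docs.google.com/spreadsheets/d/e/2PACX-1vSc_2y5N0I67wDU38DjDh35'
       'IZSIS30rQf7_NYZhtYYGU1jJYT6_kDx4YpF-qw0LSlGsBYP8pqM_a1Pd/pubhtml')

# read_html returns one DataFrame per <table> in the document, so each
# worksheet of the published sheet becomes its own DataFrame.
for index, df in enumerate(pd.read_html(URL)):
    df.to_csv(f"{index}.csv", index=False)

Depending on how the sheet is published, read_html may pick up a header row of column letters, so you may need to inspect the output and drop that first row.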
Upvotes: 10