kratze

Reputation: 185

BeautifulSoup: parse thousands of pages

I have a script that parses a list of thousands of URLs, but my problem is that it takes ages to get through the whole list.

Each request takes around 4 seconds before the page is loaded and can be parsed.
Is there any way to parse a really large number of URLs quickly?

My code looks like this:

from bs4 import BeautifulSoup
import requests

#read url-list
with open('urls.txt') as f:
    content = f.readlines()
# remove whitespace characters
content = [line.strip('\n') for line in content]
 
#LOOP through url list and get information
for i in range(5):
    try:
        for url in content:
            #get information
            link = requests.get(url)
            data = link.text
            soup = BeautifulSoup(data, "html5lib")

            #just example scraping
            name = soup.find_all('h1', {'class': 'name'})
    except requests.exceptions.RequestException:
        #a try without an except is a syntax error; retry on a failed request
        continue
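Since most of those 4 seconds per URL are spent waiting on the network, one common speed-up is to fetch several pages at once with a thread pool. This is a minimal sketch of that idea, not the asker's code: the `fetch_and_parse`/`scrape_all` names and the injectable `get` parameter are mine, and it uses the stdlib `html.parser` instead of `html5lib` so only `requests` and `bs4` are needed:

```python
from concurrent.futures import ThreadPoolExecutor
from bs4 import BeautifulSoup
import requests

def fetch_and_parse(url, get=requests.get):
    # `get` is injectable so the function can be exercised without a network
    data = get(url).text
    soup = BeautifulSoup(data, "html.parser")
    # just example scraping, as in the original loop
    return [h1.get_text(strip=True)
            for h1 in soup.find_all('h1', {'class': 'name'})]

def scrape_all(urls, workers=10, get=requests.get):
    # fetch up to `workers` pages concurrently; pool.map keeps input order
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda u: fetch_and_parse(u, get), urls))
```

Because the threads spend almost all their time blocked on I/O, ten workers should get through the list roughly ten times faster than the serial loop.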

EDIT: How do I handle asynchronous requests with hooks in this example? I tried the following, as described on this site, Asynchronous Requests with Python requests:

from bs4 import BeautifulSoup   
import grequests

def parser(response, *args, **kwargs):
    #the response hook is called with the finished Response object,
    #so there is no need to fetch the URL again here

    #get information
    data = response.text
    soup = BeautifulSoup(data, "html5lib")

    #just example scraping
    name = soup.find_all('h1', {'class': 'name'})

#read urls.txt and store in list variable
with open('urls.txt') as f:
    urls= f.readlines()
# you may also want to remove whitespace characters 
urls = [line.strip('\n') for line in urls]

# A list to hold our things to do via async
async_list = []

for u in urls:
    # The "hooks = {..." part is where you define what you want to do
    #
    # Note the lack of parentheses following parser: the response
    # will be passed to it as the first argument automatically
    rs = grequests.get(u, hooks={'response': parser})

    # Add the task to our list of things to do via async
    async_list.append(rs)

# Do our list of things to do via async
grequests.map(async_list, size=5)

This doesn't work for me. I don't even get an error in the console; it just runs for a long time until it stops.
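One way to check that the hook wiring itself is correct, without `grequests` or a live network, is to mount a custom transport adapter that returns a canned page. This is a hedged, self-contained sketch: the `CannedAdapter` class, the `scraped` list, and the example URL are all mine, invented for illustration; only `Session.mount` and the `hooks={'response': ...}` mechanism are standard `requests` API:

```python
import requests
from requests.adapters import BaseAdapter
from requests.models import Response
from bs4 import BeautifulSoup

scraped = []

def parser(response, *args, **kwargs):
    # requests calls response hooks with the Response as the first argument
    soup = BeautifulSoup(response.text, "html.parser")
    scraped.extend(h1.get_text(strip=True)
                   for h1 in soup.find_all('h1', {'class': 'name'}))

class CannedAdapter(BaseAdapter):
    """Transport adapter that returns a fixed page without touching the network."""
    def send(self, request, **kwargs):
        resp = Response()
        resp.status_code = 200
        resp.url = request.url
        resp.request = request
        resp.encoding = 'utf-8'
        resp._content = b'<h1 class="name">kratze</h1>'
        return resp

    def close(self):
        pass

session = requests.Session()
session.mount('http://', CannedAdapter())  # all http:// URLs now hit the fake
session.get('http://example.invalid/page', hooks={'response': parser})
# scraped is now ['kratze'], proving the hook received the response
```

If the hook fires here but the real run still hangs, the problem is on the network side; passing a `timeout` to each `grequests.get` call would at least make stalled requests fail instead of blocking forever.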

Upvotes: 3

Views: 850

Answers (0)
