Reputation: 493
I am trying to write code that gets the first 1000 URLs of HTTP pages from a Google search for some word. I used this code in Python to get the first 1000 URLs:
import GoogleScraper
import urllib.parse

urls = GoogleScraper.scrape('english teachers', number_pages=2)
for url in urls:
    print(urllib.parse.unquote(url.geturl()))
print('[!] Received %d results by asking %d pages with %d results per page' %
      (len(urls), 2, 100))
but this code returns 0 received results. Is there another way to get a lot of URLs from a Google search in a convenient way? I also tried the xgoogle and pygoogle modules, but they can only handle a small number of page requests.
Upvotes: 0
Views: 2274
Reputation:
Google has a Custom Search API which allows you to make 100 queries a day for free. Given that each query returns 10 results per page, you can barely fit 1000 results into a day. xgoogle and pygoogle are just wrappers around this API, so I don't think you'll be able to get more results by using them.
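If you'd rather call the API directly, a minimal sketch of how the pagination works is below. It just builds one request URL per page of 10 results using the `start` parameter; the `api_key` and `cx` (search engine ID) values are placeholders you'd get from the Google developer console, and you'd still need to fetch each URL and read the `items` field of the JSON response yourself.

```python
# Sketch: build paginated request URLs for the Google Custom Search JSON API.
# API_KEY and CX below are placeholders, not real credentials.
from urllib.parse import urlencode

API_ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def paged_search_urls(query, api_key, cx, total=100, per_page=10):
    """Return one request URL per page; the API serves at most 10 results
    per request, paged via the 1-based 'start' parameter."""
    urls = []
    for start in range(1, total + 1, per_page):
        params = {
            "key": api_key,   # your API key (placeholder)
            "cx": cx,         # your custom search engine ID (placeholder)
            "q": query,
            "num": per_page,
            "start": start,   # 1, 11, 21, ... for successive pages
        }
        urls.append(API_ENDPOINT + "?" + urlencode(params))
    return urls

# Example: 100 results -> 10 request URLs
urls = paged_search_urls("english teachers", "API_KEY", "CX", total=100)
```

Each of these URLs costs one query against your daily quota, which is why 1000 results uses up the whole free tier.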
If you do need more, consider creating another Google account with another API key, which will effectively double your limit. If you're okay with slightly inferior results, you can try Bing's Search API (they offer 5000 requests a month).
Upvotes: 1