Reputation: 477
I am trying to scrape YouTube video URLs into a list based on search keywords:
import urllib.request
import re
search_keyword = "cute+cats"
html = urllib.request.urlopen("https://www.youtube.com/results?search_query=" + search_keyword)
video_ids = re.findall(r"watch\?v=(\S{11})", html.read().decode())
n = 0
for x in video_ids:
    print("https://www.youtube.com/watch?v=" + x)
    n += 1
print(n)
The problem is that I only get a very small number of results back (between 30 and 50), and a different number every time.
I was hoping to get something around 300 results...
Also, if I look at the YouTube API, the Search: list documentation says maxResults is at most 50.
That is why I am not using the API and am instead emulating a browser search, the way a regular user would. That gives many more results through endless scrolling.
How can this be resolved? I am new to this and pretty much lost.
Upvotes: 1
Views: 1600
Reputation: 26
When you want to search for videos related to a particular video, apply this:
from googleapiclient.discovery import build

youtube = build('youtube', 'v3', developerKey='your_youtube_api_key')
search_list_request = youtube.search().list(part='snippet', relatedToVideoId='video_id', maxResults=25)  # maxResults: videos per result page (max 50)
When you want to search for videos related to a particular keyword:
youtube = build('youtube', 'v3', developerKey='your_youtube_api_key')
search_list_request = youtube.search().list(part='snippet', q='write a keyword', maxResults=25)  # maxResults: videos per result page (max 50)
In summary, use the maxResults keyword argument to set the limit on the number of videos returned per page.
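Since maxResults is capped at 50 per page, getting roughly 300 results means paging through the response with pageToken. A minimal sketch, assuming a valid key in YOUR_API_KEY and the keyword 'cute cats' (names are illustrative):

from googleapiclient.discovery import build

youtube = build('youtube', 'v3', developerKey='YOUR_API_KEY')

video_ids = []
page_token = None
while len(video_ids) < 300:
    # Each call returns at most 50 items; nextPageToken points to the next page.
    response = youtube.search().list(
        part='snippet',
        q='cute cats',
        type='video',
        maxResults=50,
        pageToken=page_token,
    ).execute()
    video_ids += [item['id']['videoId'] for item in response['items']]
    page_token = response.get('nextPageToken')
    if not page_token:
        break

for video_id in video_ids:
    print("https://www.youtube.com/watch?v=" + video_id)
print(len(video_ids))

Keep in mind that every search().list() call consumes API quota, so fetching ~300 results costs six pages' worth.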
Upvotes: 1
Reputation: 113
I think the problem is that the URL you are using only loads a certain number of videos. If you went to "https://www.youtube.com/results?search_query=cute+cats" yourself and scrolled down the page, you would only get so many videos before the page has to load more.
I recommend looking into Selenium, as it lets you scroll down web pages (thereby loading more videos) and can also grab URLs and scrape page content.
Selenium scrolling example:
driver.execute_script("window.scrollTo(X, Y)")
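Building on that, a rough sketch of the scrolling approach, assuming Selenium 4 with a Chrome driver installed; the scroll count and sleep time are guesses you would tune:

import re
import time
from selenium import webdriver

search_keyword = "cute+cats"
driver = webdriver.Chrome()
driver.get("https://www.youtube.com/results?search_query=" + search_keyword)

# Scroll to the bottom repeatedly so YouTube keeps loading more results.
for _ in range(20):
    driver.execute_script("window.scrollTo(0, document.documentElement.scrollHeight)")
    time.sleep(2)  # give the page time to load the next batch

# Reuse the same regex as in the question on the fully loaded page source.
video_ids = set(re.findall(r"watch\?v=(\S{11})", driver.page_source))
for video_id in video_ids:
    print("https://www.youtube.com/watch?v=" + video_id)
print(len(video_ids))

driver.quit()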
Upvotes: 0