Tom

Reputation: 1

Adding videos to playlist - YouTube API SERVICE_UNAVAILABLE error during execution of a Python batch HTTP request

I've been experimenting with the YouTube API Python client for a few days now and have managed to add single videos to a YouTube playlist. I'm now trying to batch-add multiple videos to a playlist using batch HTTP requests (new_batch_http_request()). I'm aware that the alternative is to simply loop through a list of videos and execute an individual request for each, but I've read that batch requests may reduce quota usage. I've seen conflicting views on whether this is actually true, but I'm hoping to get to a point where I can see it for myself.

The issue I'm having is that when the batch request is executed, only one of the videos is actually added; the others return an exception:

<HttpError 409 when requesting https://youtube.googleapis.com/youtube/v3/playlistItems?part=snippet&alt=json returned "The operation was aborted.". Details: "[{'domain': 'youtube.CoreErrorDomain', 'reason': 'SERVICE_UNAVAILABLE'}]">

One of the videos in the list is consistently added, but it isn't necessarily the first or last one in the list.

The code used:

from googleapiclient.discovery import build
from authentication import authenticate

credentials = authenticate()
playlist_id = "PAAqJtw5-yz3jRraaH13HlxhnjE4hw_wDu" # dummy ID
video_ids = ["TWTX1T3yxzs", "1P1BSm_4FJg", "Pyx_FIYa7EE"] # dummy IDs

youtube = build("youtube", "v3", credentials=credentials)

batch_request = youtube.new_batch_http_request()

def callback(request_id, response, exception):
    if exception is not None:
        print(f"Error adding video {request_id}: {exception}")
    else:
        video_id = response['snippet']['resourceId']['videoId']
        print(f"Added video {video_id} to playlist {playlist_id}")

for video_id in video_ids:
    req_body = {"snippet": {"playlistId": playlist_id,
                        "resourceId": {
                            "kind": "youtube#video",
                            "videoId": video_id}
                            }}   
    request = youtube.playlistItems().insert(part="snippet", body=req_body)
    print(f"Adding video {video_id} to request...")
    batch_request.add(request, callback=callback)
    
print("Executing request...")
batch_request.execute()

I've considered that this might be a quota issue, but it's not (see the edits below). I've searched for different variations of this specific error but came up empty-handed. I'm not sure whether this is a limitation of playlistItems().insert when used inside batch requests.

EDIT:

Regarding the suggested duplicate: thank you for the suggestion, but that question deals specifically with quota errors, as both its body and its accepted answer indicate. It also concerns uploading videos rather than adding existing videos to a playlist.

As mentioned above, the issue I'm experiencing is not a quota issue. I'm well within the quota limits, and each time I re-run the script exactly one video is added to the playlist, which rules out quota limits. The error itself is also not quota-related (SERVICE_UNAVAILABLE), and the console confirms I haven't reached the daily quota (10,000 units a day). Additionally, the code in the suggested question simply iterates through files in a folder and creates an individual request for each upload, which eats up the quota (as its accepted answer indicates); that is the very reason I'm using batch requests, which is what led to this question. I don't expect any issues when looping through individual requests.

EDIT:

Latest quota report from today showing quota has not been reached: Screenshot.

EDIT:

I've just confirmed that batch HTTP requests in fact don't help reduce quota usage. I started with a fresh daily quota today (it resets every day), made a single batch request containing 3 API calls, and watched Google's console: 150 units were used, which lines up with a cost of 50 units per insert call. I repeated the same exercise and the usage increased to 300.
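The arithmetic I'm basing this on (assuming the 50-unit figure, which matches the documented cost of a playlistItems.insert write):

# Rough sanity check of the quota numbers observed in the console.
calls_in_batch = 3
cost_per_insert = 50  # assumed cost per playlistItems.insert call
print(calls_in_batch * cost_per_insert)      # 150 units for one batch execution
print(2 * calls_in_batch * cost_per_insert)  # 300 units after repeating the batch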

So although I could technically loop through the video list and make an individual HTTP request per video with no difference in quota usage, batching still seems preferable for reducing network overhead, and it would be interesting to know how to make it work properly. For reference, a sketch of the non-batched loop I'm comparing against is included below.
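This sketch reuses the same youtube client, playlist_id and video_ids as in the code above, and issues one standalone insert per video instead of one combined batch request:

for video_id in video_ids:
    req_body = {
        "snippet": {
            "playlistId": playlist_id,
            "resourceId": {
                "kind": "youtube#video",
                "videoId": video_id
            }
        }
    }
    # One individual HTTP request per video, executed immediately.
    response = youtube.playlistItems().insert(part="snippet", body=req_body).execute()
    print(f"Added video {response['snippet']['resourceId']['videoId']} to playlist {playlist_id}")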

Upvotes: 0

Views: 64

Answers (0)
