Reputation: 245
I'm using the Google Drive PHP SDK
to copy files to Google Drive, and my account has unlimited storage.
After 300-500 files are copied, a "User rate limit exceeded" error shows up and I can't upload anything until the next day:
{
  "code": 403,
  "message": "User rate limit exceeded."
}
function copydrive($driveid, $folder, $newname) {
    $fileMetadata = new Google_Service_Drive_DriveFile(array(
        'name' => $newname,
        'parents' => array($folder)
    ));
    $result = $GLOBALS['service']->files->copy($driveid, $fileMetadata);
    return $result->id;
}
$newid = copydrive($episode['drive'], $folderid, $filename);
if ($newid != '') {
    $qrun2 = $conn->prepare("update episode set mydrive='" . $newid . "' where id=" . $episode['id']);
    $qrun2->execute();
    echo "<script language='javascript'>";
    echo 'setTimeout(function(){window.location.reload(1);}, 2000);';
    echo "</script>";
}
Update: using batch requests, I'm still getting the error.
$qrun = $conn->prepare("SELECT * FROM episode WHERE drive!='' and mydrive is null limit 50");
$qrun->execute();
$episode = $qrun->fetchAll();

// Batch mode must be enabled so copy() returns a request object instead of executing immediately
$client->setUseBatch(true);
$batch = $service->createBatch();
$num = 0;
foreach ($episode as $item) {
    $fileMetadata = new Google_Service_Drive_DriveFile(array(
        'parents' => array('parent folder id')
    ));
    $request = $GLOBALS['service']->files->copy($item['drive'], $fileMetadata);
    $batch->add($request, 'copy-' . $num);
    $num++;
}
$results = $batch->execute();
$client->setUseBatch(false);

$total_rs = count($results);
for ($x = 0; $x < $total_rs; $x++) {
    $qrun2 = $conn->prepare("update episode set mydrive='" . $results['response-copy-' . $x]->id . "' where id=" . $episode[$x]['id']);
    $qrun2->execute();
}
In the API dashboard, the request count for the last hour is 344.
Upvotes: 0
Views: 2677
Reputation: 17613
That's happening because you're probably making your requests at too fast a rate. I would suggest:
slowing down, e.g. adding a delay between requests and backing off when the error comes up (see the sketch after this list).
If you can perform batch requests, then do so.
If it applies to you, try having other users make the requests under the project, i.e. sharding the requests across users.
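As an illustration of the first suggestion, here is a minimal sketch of retrying a single copy with an exponential backoff delay whenever the API returns a 403. It reuses the copydrive() function from the question; the function name copydrive_with_backoff, the retry count, and the sleep times are arbitrary choices, not anything prescribed by the SDK.

// Retry a Drive copy with exponential backoff when the API reports a rate-limit error.
// $maxRetries and the delays below are example values; tune them to your quota.
function copydrive_with_backoff($driveid, $folder, $newname, $maxRetries = 5) {
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        try {
            return copydrive($driveid, $folder, $newname); // function from the question
        } catch (Google_Service_Exception $e) {
            if ($e->getCode() == 403 && $attempt < $maxRetries) {
                // Wait 1s, 2s, 4s, 8s, ... plus a little random jitter, then try again.
                $delay = pow(2, $attempt) + (rand(0, 1000) / 1000);
                usleep((int)($delay * 1000000));
                continue;
            }
            throw $e; // not a rate-limit error, or out of retries
        }
    }
}

// Usage: same call as before, but rate-limit errors are retried instead of aborting the run.
$newid = copydrive_with_backoff($episode['drive'], $folderid, $filename);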
Upvotes: 1