Reputation: 31
I am trying to implement Google BigQuery. I had to install the Google client libraries with Composer because the StorageClient and BigQueryClient classes were not being found otherwise. My code reads a file from Cloud Storage and inserts its data into a BigQuery table. I am getting the following errors and warnings:
PHP Fatal error: Uncaught exception 'Google\Cloud\Core\Exception\ServiceException' with message 'Option 20056 is not supported by this curl implementation.' in /base/data/home/apps/s~xxx/1.402961109127965627/vendor/google/cloud/src/Core/RequestWrapper.php:241 Stack trace:
#0 /base/data/home/apps/s~xxx/1.402961109127965627/vendor/google/cloud/src/Core/RequestWrapper.php(150): Google\Cloud\Core\RequestWrapper->convertToGoogleException(Object(google\appengine\runtime\CurlLiteOptionNotSupportedException))
#1 /base/data/home/apps/s~xxx/1.402961109127965627/vendor/google/cloud/src/Core/RestTrait.php(96): Google\Cloud\Core\RequestWrapper->send(Object(GuzzleHttp\Psr7\Request), Array)
#2 /base/data/home/apps/s~xxx/1.402961109127965627/vendor/google/cloud/src/BigQuery/Connection/Rest.php(220): Google\Cloud\BigQuery\Connection\Rest->send('jobs', 'insert', Array)
#3 /base/data/home/apps/s~xxx/1.402961109127965627/vendor/google/cloud/src/BigQuery/Table.php(344): Google\Cloud in /base/data/home/apps/s~xxx/1.402961109127965627/vendor/google/cloud/src/Core/RequestWrapper.php on line 241
PHP Warning: php_sapi_name() has been disabled for security reasons. It can be re-enabled by adding it to the google_app_engine.enable_functions ini variable in your applications php.ini in /base/data/home/apps/s~datascrub-152522/1.402961109127965627/vendor/guzzlehttp/guzzle/src/Client.php on line 173
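Going by its message, the warning on its own looks fixable with a php.ini entry along these lines (untested sketch on my side; the directive name is taken verbatim from the warning above):

; php.ini deployed with the application
; re-enable functions that App Engine disables by default
google_app_engine.enable_functions = "php_sapi_name"

The fatal error is the real problem, though. Here is my code: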
<?php
session_start();

# Includes the autoloader for libraries installed with Composer
require __DIR__ . '/vendor/autoload.php';

# Imports the Google Cloud client library
use Google\Cloud\BigQuery\BigQueryClient;
use Google\Cloud\Storage\StorageClient;
use Google\Cloud\Core\ExponentialBackoff;

$projectId  = "xxx-1111";
$datasetId  = "datasetid";
$tableId    = "test";
$bucketName = "my-bucket"; // bare bucket name; no "gs://" prefix, StorageClient adds it
$objectName = "Cloud_SQL_Export.csv";

echo "\n" . $projectId . "==" . $datasetId . "==" . $tableId . "==" . $bucketName . "==" . $objectName . "\n";

import_from_storage($projectId, $datasetId, $tableId, $bucketName, $objectName);

function import_from_storage($projectId, $datasetId, $tableId, $bucketName, $objectName)
{
    // determine the import options from the object name
    $options = [];
    /*if ('.backup_info' === substr($objectName, -12)) {
        $options['jobConfig'] = ['sourceFormat' => 'DATASTORE_BACKUP'];
    } elseif ('.json' === substr($objectName, -5)) {
        $options['jobConfig'] = ['sourceFormat' => 'NEWLINE_DELIMITED_JSON'];
    }*/
    $options['jobConfig'] = ['sourceFormat' => 'CSV', 'skipLeadingRows' => 1];

    // instantiate the BigQuery table service
    $bigQuery = new BigQueryClient([
        'projectId' => $projectId,
    ]);
    $dataset = $bigQuery->dataset($datasetId);
    $table   = $dataset->table($tableId);

    // load the storage object
    $storage = new StorageClient([
        'projectId' => $projectId,
    ]);
    $object = $storage->bucket($bucketName)->object($objectName);

    // create the import job
    $job = $table->loadFromStorage($object, $options);

    // poll the job until it is complete
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($job) {
        echo 'Waiting for job to complete' . PHP_EOL;
        $job->reload();
        if (!$job->isComplete()) {
            throw new Exception('Job has not yet completed', 500);
        }
    });

    // check if the job has errors
    if (isset($job->info()['status']['errorResult'])) {
        $error = $job->info()['status']['errorResult']['message'];
        echo 'Error running job: ' . PHP_EOL . ' ' . $error;
    } else {
        echo 'Data imported successfully' . PHP_EOL;
    }
}
Upvotes: 0
Views: 1307
Reputation: 31
Got the issue resolved by adding extension = "php_curl.dll" to the php.ini file.
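For reference, this is the line I added to the application's php.ini:

; load the full cURL extension
extension = "php_curl.dll"

As far as I can tell, this makes PHP use the real cURL extension instead of the limited cURL-lite implementation App Engine falls back to by default (note the CurlLiteOptionNotSupportedException in the stack trace), which does not support all cURL options.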
Upvotes: 1