Reputation: 299
Similar issues were submitted but none of the solutions work.
When trying to do this tutorial from the Google Cloud doc, I'm getting the following error when trying to access the datastore:
google.api_core.exceptions.Forbidden: 403 Missing or insufficient permissions.
The executed file can be found here.
I did execute the following commands:
gcloud auth application-default login
export GOOGLE_APPLICATION_CREDENTIALS="file.json"
Please note that I'm executing the file on a local computer. The goal is to perform reads/writes on the datastore directly from a Google App Engine app.
Upvotes: 12
Views: 34064
Reputation: 1
Check that the correct service account is selected for your instance. After you add a new permission, stop the instance/cluster, go to Edit, then Service account, and reselect the account; this makes sure your instance doesn't keep the old permission config cached and picks up all the newly added permissions. Then restart the VM and run your task normally.
Upvotes: 0
Reputation: 1
I was facing the same issue until I added the "Cloud Datastore" permission to my service account.
Upvotes: 0
Reputation: 1
Actually it's simpler. I have a solution that should work in any environment, whether it's Heroku, GCP, or whatever cloud environment you fancy.
import functools
import json
import os
from typing import Callable, Optional

from google.cloud import ndb
from google.oauth2 import service_account


def get_client() -> ndb.Client:
    if is_heroku():
        # NOTE: when hosted on Heroku (or any platform other than GCP), the
        # service key should be saved as an environment variable
        app_credentials = json.loads(os.environ.get('GOOGLE_APPLICATION_CREDENTIALS'))
        credentials = service_account.Credentials.from_service_account_info(info=app_credentials)
        ndb_client: ndb.Client = ndb.Client(namespace="main", project=config_instance.PROJECT, credentials=credentials)
    else:
        # NOTE: could be GCP or another cloud environment
        ndb_client: ndb.Client = ndb.Client(namespace="main", project=config_instance.PROJECT)
    return ndb_client


def use_context(func: Callable) -> Callable:
    """
    **use_context**
        inserts an ndb context for working with ndb Cloud Databases.

    **NOTE**
        functions/methods need to be wrapped with this decorator whenever
        they interact with the database in any way.

    :param func: function to wrap
    :return: function wrapped with ndb.context
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs) -> Callable:
        ndb_client = get_client()
        print(f' ndb_client : {str(ndb_client)}')
        with ndb_client.context():
            return func(*args, **kwargs)
    return wrapper


@use_context
def save_model(model: Optional[ndb.Model]) -> Optional[ndb.Key]:
    """save ndb model to store and return ndb.Key"""
    return model.put() if isinstance(model, ndb.Model) else None
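To see what use_context does without touching GCP at all, here is a stdlib-only sketch of the same wrapper pattern; FakeClient and fake_context are hypothetical stand-ins for the ndb client and its context manager:

```python
import functools
from contextlib import contextmanager
from typing import Callable

@contextmanager
def fake_context():
    # stand-in for ndb_client.context(): setup, yield, then cleanup
    print("context entered")
    yield
    print("context exited")

class FakeClient:
    """Hypothetical stand-in for ndb.Client."""
    def context(self):
        return fake_context()

def get_client() -> FakeClient:
    return FakeClient()

def use_context(func: Callable) -> Callable:
    """Same shape as the decorator above: build a client, open its
    context, and call the wrapped function inside it."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        client = get_client()
        with client.context():
            return func(*args, **kwargs)
    return wrapper

@use_context
def answer() -> int:
    return 42

print(answer())  # runs inside the context, then prints 42
```

functools.wraps keeps the wrapped function's name and docstring intact, which is handy when several decorated functions need to be debugged or introspected.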
NOTE: on Heroku, or any cloud offering other than GCP, the GOOGLE_APPLICATION_CREDENTIALS environment variable needs to hold the contents of the JSON key file itself; copy the contents out of the file and set them as the variable's value.
On local development you can save the file on your local drive.
In Docker you can set the environment variable or use the file.
Control which is which with this logic:
if is_heroku():
In my case it's just a function that reads an environment variable to see whether the app is running on Heroku or not.
In your case it could be anything, as long as it tells you which environment you are running in, so you can choose to load your key from a local file or from an environment variable.
This is just so you can load the contents of a JSON file from environment variables:
app_credentials = json.loads(os.environ.get('GOOGLE_APPLICATION_CREDENTIALS'))
The line above lets you save the actual contents of a JSON file to an environment variable and then load it back as JSON again, to avoid having to ship the key file in the src folder or any other folder.
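A minimal, runnable sketch of that idea (the key material and project id below are fake placeholders, not real credentials):

```python
import json
import os

# Stash the *contents* of the service-account JSON in the env var...
fake_key = {
    "type": "service_account",
    "project_id": "my-project",  # placeholder project id
    "client_email": "svc@my-project.iam.gserviceaccount.com",
}
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = json.dumps(fake_key)

# ...and later parse it back into a dict, ready to pass to
# service_account.Credentials.from_service_account_info(info=...)
info = json.loads(os.environ["GOOGLE_APPLICATION_CREDENTIALS"])
print(info["project_id"])  # → my-project
```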
Upvotes: 1
Reputation: 651
If the above solutions do not work, go to your Firebase console, click on the settings icon, then navigate to
and set the GOOGLE_APPLICATION_CREDENTIALS variable to the JSON file path. This worked for me :)
Upvotes: 0
Reputation: 151
You're trying to use two different forms of authentication, which I wouldn't recommend.
From Google's documentation, gcloud auth application-default login is for when you want your local application to temporarily use your own user credentials for API access.
When you use export GOOGLE_APPLICATION_CREDENTIALS='file.json', per Google's documentation, you are setting an environment variable that points to file.json. This means you will need to create a Service Account, assign it the proper permissions, and create/download a key (which in this case is file.json); the environment variable then takes effect when your code is executed.
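The difference matters: with the service-account approach the variable holds a file *path*, and the client library opens that file. Here is a stdlib-only sketch of those path semantics (fake key contents written to a throwaway temp file):

```python
import json
import os
import tempfile

# Write a fake key file (placeholder contents, not a real key)
fake_key = {"type": "service_account", "project_id": "my-project"}
with tempfile.NamedTemporaryFile(
        mode="w", suffix=".json", delete=False) as handle:
    json.dump(fake_key, handle)
    key_path = handle.name

# The env var stores the *path* to file.json, not its contents
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = key_path

# Client libraries resolve the variable to a path and read the file
with open(os.environ["GOOGLE_APPLICATION_CREDENTIALS"]) as fh:
    info = json.load(fh)
print(info["project_id"])  # → my-project

os.remove(key_path)  # tidy up the throwaway file
```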
Since you're just getting started, I would recommend starting out using your Cloud Shell that's available in the Google Cloud Console and using an account that has full Owner rights on your Google Project. This will make it much easier for you to learn the basics (and then you can run it more securely later and/or in production). The Cloud Shell has everything installed and updated.
If you absolutely have to run this Quickstart through a local computer, I'd recommend the first option above: gcloud auth application-default login
. You will need to have the Google Cloud SDK installed for your operating system. When you run the command, it should open a browser and you will be prompted to log into your Google Cloud account. That will give you permissions to run the script locally. Hope this helps!
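If you want to confirm where that command cached your user credentials, the well-known Application Default Credentials location on Linux/macOS can be computed like this (Windows uses %APPDATA%\gcloud instead; treat the exact path as an assumption for your setup):

```python
import os

# Well-known ADC cache location written by
# `gcloud auth application-default login` on Linux/macOS
adc_path = os.path.join(
    os.path.expanduser("~"),
    ".config", "gcloud",
    "application_default_credentials.json",
)
print(adc_path, "exists:", os.path.exists(adc_path))
```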
Upvotes: 9
Reputation: 161
I was also having the same error message when running the tutorial from a local computer. I am using a service account (and not "gcloud auth application-default login"), as this is the preferred approach recommended in the Google tutorials.
However, after a lot of investigation I found that the problem was occurring due to an error in Google's documentation (it seems the documentation is not up to date).
Setting up authentication
To run the client library, you must first set up authentication by creating a service account and setting an environment variable. Complete the following steps to set up authentication. For more information, see the GCP authentication documentation.
1. In the GCP Console, go to the Create service account key page.
2. From the Service account drop-down list, select New service account.
3. In the Service account name field, enter a name.
4. From the Role drop-down list, select Project > Owner.
The error in the documentation has to do with step 4 of the instructions. In the current implementation of the GCP console, the Role cannot be set directly from the Service Account Key page. Instead, you must go to the "IAM & admin" page to set the 'Owner' role:
In your Google Cloud console select “IAM & admin”->”IAM”
You will see the “ADD” option. This will allow you to set permissions for your new Service Account. Click “ADD”.
You can then enter the service account and role ('Owner' if you are following the instructions in the tutorial).
The following article, "The Missing Guide To Setting Up Google Cloud Service Accounts For Google BigQuery", provides more information. The article is written in the context of BigQuery, but it is equally applicable to Google Datastore:
Upvotes: 11