Reputation: 3267
I wrote a script that extracts some data from an API and builds an Excel file. I'm not a dev; it's my first real program. I host the code on Google Colab.
The API secret keys are in the code in cleartext. I want to share the notebook via a Google Drive sharing link with the people who need to generate the Excel file, so that they can run it themselves. However, I would prefer not to include the secret keys in cleartext, to avoid accidental sharing outside the company.
I'm wondering how to hide them, or how to give users an alternative way to run the notebook without knowing the secrets. I don't have access to a web server internal to the company.
Regards
import json
import requests
from requests_oauthlib import OAuth1 as OAuth  # assuming requests-oauthlib provides the OAuth helper used below

CLIENT_KEY = '*****'
CLIENT_SECRET = '*****'
BASE_URL = '*****'
access_token_key = '*****'
access_token_secret = '*****'

print('Getting user profile...')
oauth = OAuth(CLIENT_KEY, client_secret=CLIENT_SECRET,
              resource_owner_key=access_token_key,
              resource_owner_secret=access_token_secret)
r = requests.get(url=BASE_URL + '1/user/me/profile', auth=oauth)
print(json.dumps(r.json(), sort_keys=True, indent=4, separators=(',', ': ')))
...
Upvotes: 39
Views: 24680
Reputation: 1911
You can now store your private keys in Colab's Secrets (the key icon in the left sidebar):
Add a secret named API_KEY, toggle on Notebook access for it, then use this code:
from google.colab import userdata
userdata.get('API_KEY')
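In Colab, the call above returns the secret as a plain string, which you then use like any other credential. A minimal sketch of typical usage (build_auth_header is a hypothetical helper, and "example-key" stands in for the real secret you would fetch with userdata.get):

```python
# Hypothetical helper showing how the value returned by
# userdata.get('API_KEY') would typically be used: most HTTP APIs
# expect it in an Authorization header.
def build_auth_header(api_key):
    return {"Authorization": f"Bearer {api_key}"}

# In Colab you would pass userdata.get('API_KEY') here instead.
headers = build_auth_header("example-key")
print(headers)
```

This way the secret never appears in the notebook source, only in Colab's per-user secret store.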
Upvotes: 9
Reputation: 884
You could store your credentials on Google Drive in a .env file, as follows:
API_KEY="xxxxxx"
SECRET_KEY="xxxxxx"
When you want to use them within Google Colab, here are the steps:
Mount Google Drive
from google.colab import drive
drive.mount('/content/drive')
Load the key
!pip install --quiet python-dotenv
import dotenv
import os
dotenv.load_dotenv('/content/drive/MyDrive/01 Work File/Credentials/.env')
secret_key = os.getenv('SECRET_KEY')
Print the key to ensure it was loaded
print(secret_key)
Now you can share your notebook without exposing your secret key.
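If python-dotenv ever isn't available, the .env format used above is simple enough to read with a few lines of stdlib Python. A minimal sketch (it handles only KEY=value lines with optional quotes, not the full dotenv syntax):

```python
# Minimal .env reader: handles KEY=value lines with optional quotes,
# skipping blanks and comment lines. Not a full python-dotenv replacement.
def read_env(path):
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"').strip("'")
    return env
```

Usage would be `secret_key = read_env('/content/drive/MyDrive/.../.env')['SECRET_KEY']` (path shortened here; use your own Drive path).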
Upvotes: 2
Reputation: 40858
You can save the secret key as a file on Google Drive, then read the file into Colab.
You can then set permissions on the key file in Google Drive: only you and the people you share the key file with can use it.
As @efbbrown suggested, you can create an AWS key file and store it in Google Drive, e.g.:
[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
But now (2020) you don't need pydrive any more; you can just mount your Drive and copy the file into place.
The default place to store the credentials is ~/.aws/config, so you can do this (assuming the file above is saved in your Drive as aws_config):
!mkdir -p ~/.aws
!cp /content/drive/MyDrive/aws_config ~/.aws/config
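Once the file is copied into place, tools like the AWS CLI and boto3 pick it up automatically. If you want to sanity-check the copy from the notebook, the file is standard INI syntax and can be read with the stdlib configparser. A sketch assuming the [default] layout shown above:

```python
import configparser

# Parse the copied config and check the expected section and key exist.
config = configparser.ConfigParser()
config.read("/root/.aws/config")  # ~/.aws/config resolves here in Colab

if "default" in config:
    # Show only the key id; never print the secret itself.
    print("aws_access_key_id:", config["default"].get("aws_access_key_id"))
else:
    print("no [default] section found - check the copy step")
```

configparser.read silently skips missing files, so the else branch also catches a failed copy.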
Upvotes: 19
Reputation: 381
I would recommend using GCP's Secret Manager:
You get useful features such as rights management (via IAM & Admin), you can update your passwords through secret versions, etc. Really useful.
Prerequisites: a GCP project with the Secret Manager API enabled, a secret already created in it, and an authenticated session in your Colab notebook.
Here is a way to get your secret with Python 3:
# Install the module and import it :
!pip install google-cloud-secret-manager
from google.cloud import secretmanager
# Create a Client:
client = secretmanager.SecretManagerServiceClient()
secret_name = "my-secret" # => To be replaced with your secret name
project_id = 'my-project' # => To be replaced with your GCP Project
# Forge the path to the latest version of your secret with an F-string:
resource_name = f"projects/{project_id}/secrets/{secret_name}/versions/latest"
# Get your secret :
response = client.access_secret_version(request={"name": resource_name})
secret_string = response.payload.data.decode('UTF-8')
# Tada! Your secret is now in the secret_string variable.
Do not use your real password or secret while testing this.
Enjoy!
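Since the answer mentions secret versions: the resource name format used in the code above can be factored into a small helper, so you can pin a specific version instead of always taking "latest". The helper name below is my own, not part of the client library:

```python
# Hypothetical helper: builds the Secret Manager resource name string.
# Pass a version number (e.g. "2") to pin a version, or keep "latest".
def secret_version_name(project_id, secret_name, version="latest"):
    return f"projects/{project_id}/secrets/{secret_name}/versions/{version}"

print(secret_version_name("my-project", "my-secret"))
print(secret_version_name("my-project", "my-secret", version="2"))
```

Pinning a version protects running notebooks from being broken by a password rotation until you are ready to switch.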
Upvotes: 26
Reputation: 4352
To expand on @Korakot Chaovavanich's answer, here is that solution step by step.
First, create a file named credentials in your Google Drive containing:
[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Then install PyDrive in Colab:
!pip install -U -q PyDrive
(Some of this code comes from @wenkesj's answer on this question.)
# Imports
import os
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# Google drive authentication
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
# File params
local_save_dir = "/root/.aws"
filename = "credentials"
save_path = "{0}/{1}".format(local_save_dir, filename)
# Choose/create a local (colab) directory to store the data.
local_download_path = os.path.expanduser(local_save_dir)
try:
    os.makedirs(local_download_path)
except FileExistsError:
    pass
drive_list = drive.ListFile().GetList()
f = [x for x in drive_list if x["title"] == filename][0]
print('title: %s, id: %s' % (f['title'], f['id']))
fname = os.path.join(local_download_path, f['title'])
print('downloading to {}'.format(fname))
f_ = drive.CreateFile({'id': f['id']})
f_.GetContentFile(fname)
with open(save_path) as creds:
    for i, line in enumerate(creds):
        if i == 1:
            access_token_key = line.replace("aws_access_key_id=", "").replace("\n", "")
        if i == 2:
            access_token_secret = line.replace("aws_secret_access_key=", "").replace("\n", "")
Now your AWS keys are in the two variables access_token_key and access_token_secret.
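The index-based loop above depends on the exact line order of the credentials file. A slightly more robust variant (a sketch, not part of the original answer) parses key=value pairs regardless of position:

```python
# Parse the credentials file into a dict, ignoring section headers
# like [default], so the line order no longer matters.
def parse_aws_credentials(path):
    creds = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if "=" in line and not line.startswith("["):
                key, _, value = line.partition("=")
                creds[key.strip()] = value.strip()
    return creds

# With the file downloaded above:
# access_token_key = parse_aws_credentials(save_path)["aws_access_key_id"]
```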
Upvotes: 8
Reputation: 38619
Try getpass. For example:
from getpass import getpass
secret = getpass('Enter the secret value: ')
Then, you can share the notebook and each user can enter a distinct value, which you can then use later in the notebook as a regular Python variable.
Upvotes: 27