d-_-b

Reputation: 4502

Boto3 Error: botocore.exceptions.NoCredentialsError: Unable to locate credentials

When I simply run the following code, I always get this error.

import uuid

import boto3

s3 = boto3.resource('s3')
bucket_name = "python-sdk-sample-%s" % uuid.uuid4()
print("Creating new bucket with name:", bucket_name)
s3.create_bucket(Bucket=bucket_name)

I have saved my credentials file in

C:\Users\myname\.aws\credentials, from where Boto should read my credentials.

Is my setting wrong?

Here is the output from boto3.set_stream_logger('botocore', level='DEBUG').

2015-10-24 14:22:28,761 botocore.credentials [DEBUG] Skipping environment variable credential check because profile name was explicitly set.
2015-10-24 14:22:28,761 botocore.credentials [DEBUG] Looking for credentials via: env
2015-10-24 14:22:28,773 botocore.credentials [DEBUG] Looking for credentials via: shared-credentials-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: config-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: ec2-credentials-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: boto-config
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: iam-role
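The log above shows botocore walking its provider chain (env, shared-credentials-file, config-file, ec2-credentials-file, boto-config, iam-role) without finding anything. A stdlib-only sanity check, sketched here assuming the default file location, is to parse the credentials file yourself and confirm it has a [default] section:

```python
import configparser
import os

# Default shared credentials file location; ~ expands to C:\Users\myname
# on Windows and $HOME on Unix.
path = os.path.expanduser(os.path.join("~", ".aws", "credentials"))

parser = configparser.ConfigParser()
found = parser.read(path)  # returns [] if the file does not exist

if not found:
    print("No credentials file at", path)
elif "default" not in parser:
    print("File exists but has no [default] section:", parser.sections())
else:
    print("[default] profile found with keys:", list(parser["default"]))
```

If this reports a missing file or a missing [default] section, boto3's shared-credentials-file step in the log above will fail the same way.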

Upvotes: 246

Views: 517033

Answers (20)

Yogendra Shinde

Reputation: 11

I faced a similar issue, and this is how I solved it.

import os

import boto3

os.environ['AWS_ACCESS_KEY_ID'] = '...'
os.environ['AWS_SECRET_ACCESS_KEY'] = '...'

client = boto3.client('s3')
bucket_name = 'yogendrafeb2022'
client.create_bucket(Bucket=bucket_name)

Upvotes: -4

Aniruddha Deshpande

Reputation: 1

I encountered the same issue, but this method worked for me (config() here reads the values from a settings source, e.g. a .env file via python-decouple):

session = boto3.Session(
    aws_access_key_id=config('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=config('AWS_SECRET_ACCESS_KEY'),
    region_name=config('AWS_S3_REGION_NAME'))
text_client = session.client('textract')
s3_client = session.client('s3')
s3 = session.resource('s3')

Upvotes: 0

Ivailo Bardarov

Reputation: 3885

If you run this code from an EC2 instance and the instance metadata endpoint is configured to require HTTP tokens (IMDSv2), you can get the same error. Make the HTTP tokens optional, or upgrade your client to a version that supports IMDSv2.
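For reference, the IMDSv2 handshake the client has to perform looks roughly like this. This is a stdlib-only sketch that is only meaningful when run on an EC2 instance; the endpoint address and header names are the documented metadata-service ones:

```python
import urllib.request

METADATA = "http://169.254.169.254"

def imds_token(ttl_seconds=21600):
    # IMDSv2: a PUT request fetches a short-lived session token
    req = urllib.request.Request(
        METADATA + "/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": str(ttl_seconds)},
    )
    return urllib.request.urlopen(req, timeout=2).read().decode()

def instance_role(token):
    # The token must accompany every subsequent metadata request
    req = urllib.request.Request(
        METADATA + "/latest/meta-data/iam/security-credentials/",
        headers={"X-aws-ec2-metadata-token": token},
    )
    return urllib.request.urlopen(req, timeout=2).read().decode()
```

If the endpoint requires tokens and the client only performs plain GETs (IMDSv1), the iam-role step of the credential chain fails and you end up with NoCredentialsError.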

Upvotes: 1

Dmitriy Kupch

Reputation: 101

In case of using AWS:

In my case I had to add the following policy to the IAM role to allow the EC2 tags to be read by the EC2 instances. That eliminated the Unable to locate credentials error:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "ec2:DescribeTags",
            "Resource": "*"
        }
    ]
}

Upvotes: 0

SHASHANK MADHAV

Reputation: 2086

Try specifying the keys manually:

s3 = boto3.resource('s3',
                    aws_access_key_id=ACCESS_ID,
                    aws_secret_access_key=ACCESS_KEY)

Make sure you don't include your ACCESS_ID and ACCESS_KEY in the code directly for security concerns. Consider using environment configs and injecting them in the code as suggested by @Tiger_Mike.

For Prod environments consider using rotating access keys: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html#Using_RotateAccessKey

Upvotes: 186

Gianmarco G

Reputation: 381

I have solved the problem like this:

aws configure

Afterwards I manually entered:

AWS Access Key ID [None]: xxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: just hit enter

After that it worked for me

Upvotes: 1

Fernando Ciciliati

Reputation: 1369

I had the same issue and found out that the format of my ~/.aws/credentials file was wrong.

It worked with a file containing:

[default]
aws_access_key_id=XXXXXXXXXXXXXX
aws_secret_access_key=YYYYYYYYYYYYYYYYYYYYYYYYYYY

Note that there must be a profile named "[default]". Some official documentation makes reference to a profile named "[credentials]", which did not work for me.

Upvotes: 126

mirekphd

Reputation: 6743

In the case of MLflow, a call to mlflow.log_artifact() will raise this error if you cannot write to the AWS S3/MinIO data lake.

The reason is not setting up the credentials in your Python environment (as these two env vars):

os.environ['DATA_AWS_ACCESS_KEY_ID'] = 'login'
os.environ['DATA_AWS_SECRET_ACCESS_KEY'] = 'password'

Note you may also access MLflow artifacts directly, using the MinIO client (which requires a separate connection to the data lake, apart from MLflow's own connection). This client can be started like this:

minio_client_mlflow = minio.Minio(os.environ['MLFLOW_S3_ENDPOINT_URL'].split('://')[1],
                                  access_key=os.environ['AWS_ACCESS_KEY_ID'],
                                  secret_key=os.environ['AWS_SECRET_ACCESS_KEY'],
                                  secure=False)

Upvotes: 3

stefandydx

Reputation: 13

I just had this problem. This is what worked for me:

pip install botocore==1.13.20

Source: https://github.com/boto/botocore/issues/1892

Upvotes: 1

ahmed meraj

Reputation: 886

Exporting the credentials also works. In Linux:

export AWS_SECRET_ACCESS_KEY="XXXXXXXXXXXX"
export AWS_ACCESS_KEY_ID="XXXXXXXXXXX"

Upvotes: 12

kathir raja

Reputation: 1016

Create an S3 client object with your credentials:

AWS_S3_CREDS = {
    "aws_access_key_id":"your access key", # os.getenv("AWS_ACCESS_KEY")
    "aws_secret_access_key":"your aws secret key" # os.getenv("AWS_SECRET_KEY")
}
s3_client = boto3.client('s3',**AWS_S3_CREDS)

It is always good to get the credentials from the OS environment.

To set the environment variables, run the following commands in the terminal.

If Linux or Mac:

$ export AWS_ACCESS_KEY="aws_access_key"
$ export AWS_SECRET_KEY="aws_secret_key"

If Windows:

C:\> set AWS_ACCESS_KEY="aws_access_key"
C:\> set AWS_SECRET_KEY="aws_secret_key"

Upvotes: 18

JJFord3

Reputation: 1985

I work for a large corporation and encountered this same error, but needed a different workaround. My issue was related to proxy settings: I had my proxy set up, so I needed to set no_proxy to whitelist AWS before I was able to get everything to work. You can set it in your bash profile as well if you don't want to muddy up your Python code with os settings.

Python:

import os
os.environ["NO_PROXY"] = "s3.amazonaws.com"

Bash:

export no_proxy="s3.amazonaws.com"

Edit: The above assumes a US East S3 region. For other regions, use s3.[region].amazonaws.com, where region is something like us-east-1 or us-west-2.
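If you need this for several regions, a small helper can build the regional value while preserving anything already in NO_PROXY. The function name here is hypothetical, not part of any AWS library:

```python
import os

# Hypothetical helper: build a NO_PROXY value for a regional S3 endpoint,
# appending to whatever is already set in the environment.
def s3_no_proxy(region=None):
    host = "s3.amazonaws.com" if region is None else "s3.%s.amazonaws.com" % region
    existing = os.environ.get("NO_PROXY", "")
    return existing + "," + host if existing else host

os.environ["NO_PROXY"] = s3_no_proxy("us-west-2")
```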

Upvotes: 6

Avinash Dalvi

Reputation: 9291

From the terminal, type:

aws configure

then fill in your keys and region.

After this, you can use any profile. You can have multiple keys depending on your account, and can manage multiple environments or keys:

import boto3
aws_session = boto3.Session(profile_name="prod")
# Create an S3 client
s3 = aws_session.client('s3')

Upvotes: 13

Nija I Pillai

Reputation: 1136

I also had the same issue. It can be solved by creating config and credentials files in the home directory. Below are the steps I followed to solve it.

Create a config file :

touch ~/.aws/config

And in that file I entered the region:

[default]
region = us-west-2

Then create the credential file:

touch ~/.aws/credentials

Then enter your credentials

[Profile1]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXX 
aws_secret_access_key = YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY

After setting all this up, here is my Python file to connect to the bucket. Running this file will list all the buckets.

import boto3
import os

os.environ['AWS_PROFILE'] = "Profile1"
os.environ['AWS_DEFAULT_REGION'] = "us-west-2"

s3 = boto3.client('s3', region_name='us-west-2')
print("[INFO:] Connecting to cloud")

# List all the buckets in the account

response = s3.list_buckets()
print('Buckets:', response)


Upvotes: 15

Sanket Patel

Reputation: 41

If you have multiple aws profiles in ~/.aws/credentials like...

[Profile 1]
aws_access_key_id = *******************
aws_secret_access_key = ******************************************
[Profile 2]
aws_access_key_id = *******************
aws_secret_access_key = ******************************************

Follow two steps:

  1. Make the one you want to use the default by running export AWS_DEFAULT_PROFILE="Profile 1" in the terminal (quote the name, since it contains a space).

  2. Make sure to run the above command in the same terminal from which you use boto3 or open your editor. [Understand the following scenario]

Scenario:

  • If you have two terminal open called t1 and t2.
  • And you run the export command in t1 and you open JupyterLab or any other from t2, you will get NoCredentialsError: Unable to locate credentials error.

Solution:

  • Run the export command in t1 and then open JupyterLab or any other from the same terminal t1.

Upvotes: 4

Mohamed Hamed

Reputation: 31

If you're sure you configured AWS correctly, make sure the user running the project can read from ~/.aws, or just run your project as root.

Upvotes: 0

TheWalkingData

Reputation: 1067

Make sure your ~/.aws/credentials file in Unix looks like this:

[MyProfile1]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey

[MyProfile2]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey

Your Python script should look like this, and it'll work:

from __future__ import print_function
import boto3
import os

os.environ['AWS_PROFILE'] = "MyProfile1"
os.environ['AWS_DEFAULT_REGION'] = "us-east-1"

ec2 = boto3.client('ec2')

# Retrieves all regions/endpoints that work with EC2
response = ec2.describe_regions()
print('Regions:', response['Regions'])

Source: https://boto3.readthedocs.io/en/latest/guide/configuration.html#interactive-configuration.

Upvotes: 40

Samuel Nde

Reputation: 2743

In my case, boto3 was looking for the credentials in a folder like

C:\ProgramData\Anaconda3\envs\tensorflow\Lib\site-packages\botocore\.aws

You should save the two files, credentials and config, in this folder.

You may want to check out the general order in which boto3 searches for credentials in this link. Look under the Configuring Credentials sub heading.

Upvotes: 0

hru_d

Reputation: 936

These instructions are for a Windows machine with a single user profile for AWS. Make sure your ~/.aws/credentials file looks like this:

[profile_name]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey

I had to set the AWS_DEFAULT_PROFILE environment variable to the profile_name found in my credentials file.
Then my Python was able to connect, e.g.:

import boto3

# Let's use Amazon S3
s3 = boto3.resource('s3')

# Print out bucket names
for bucket in s3.buckets.all():
    print(bucket.name)

Upvotes: 6

Amri

Reputation: 1100

If you are looking for an alternative way, try adding your credentials using the AWS CLI.

From the terminal, type:

aws configure

then fill in your keys and region.

Upvotes: 44
