mumbles

Reputation: 1149

AWS: Boto3: AssumeRole example which includes role usage

I'm trying to use AssumeRole in such a way that I'm traversing multiple accounts and retrieving assets for those accounts. I've made it to this point:

import boto3
sts_client = boto3.client('sts')

assumedRoleObject = sts_client.assume_role(
    RoleArn="arn:aws:iam::account-of-role-to-assume:role/name-of-role",
    RoleSessionName="AssumeRoleSession1")

Great, I have the assumedRoleObject. But now I want to use that to list things like ELBs or something that isn't a built-in low-level resource.

How does one go about doing that? If I may ask, please code out a full example, so that everyone can benefit.

Upvotes: 70

Views: 194970

Answers (12)

Anand Tripathi

Reputation: 16126

Here's the code snippet I used:

import boto3

sts_client = boto3.client('sts')
assumed_role_object = sts_client.assume_role(
    RoleArn="<arn of the role to assume>",
    RoleSessionName="<role session name>"
)
print(assumed_role_object)
credentials = assumed_role_object['Credentials']

session = boto3.Session(
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken']
)
s3 = session.client('s3')
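
As a quick usage sketch (assuming the assumed role is allowed to list buckets), you could then do:

# illustrative only: list the buckets visible to the assumed role
for bucket in s3.list_buckets()['Buckets']:
    print(bucket['Name'])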

Upvotes: 12

Akshat Shah

Reputation: 191

In case you already have profiles configured in your ~/.aws/config, you can simply do:

profile = boto3.session.Session(profile_name='profile_name')
client = profile.client('s3', region_name='region')
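
For context, a role-assuming profile in ~/.aws/config might look like this (the names and ARN below are illustrative):

[profile profile_name]
role_arn = arn:aws:iam::123456789012:role/name-of-role
source_profile = default
region = us-east-1

boto3 then performs the AssumeRole call for you whenever that profile is used.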

Upvotes: 0

Sar009

Reputation: 2276

With reference to the solution by @Jarrad, which was not working as of Feb 2021, and as a solution that does not use STS explicitly, please see the following:


import boto3
import botocore.session
from botocore.credentials import AssumeRoleCredentialFetcher, DeferredRefreshableCredentials


def get_boto3_session(assume_role_arn=None):
    session = boto3.Session(aws_access_key_id="abc", aws_secret_access_key="def")
    if not assume_role_arn:
        return session

    fetcher = AssumeRoleCredentialFetcher(
        client_creator=session.client,
        source_credentials=session.get_credentials(),
        role_arn=assume_role_arn,
    )
    botocore_session = botocore.session.Session()
    botocore_session._credentials = DeferredRefreshableCredentials(
        method='assume-role',
        refresh_using=fetcher.fetch_credentials
    )

    return boto3.Session(botocore_session=botocore_session)

The function can be called as follows:

ec2_client = get_boto3_session(assume_role_arn='my_role_arn').client('ec2', region_name='us-east-1')

Upvotes: 12

Cory Brickner

Reputation: 41

  • I just created a solution in Airflow for anyone not using the CLI where a service account uses an assumed role. I'm using Google's Secret Manager to house the credentials in JSON form. You can pass credentials however you'd like; our company just uses that service.
  • I get the secret location from an Airflow variable, which is why you're seeing these initial code blocks happen in that way.
  • My JSON for s3_bucket uses the s3://bucket/prefix/ notation, which is why I'm splitting the string with '/'.
  • The RoleSessionName parameter in the assume_role method can be whatever string you'd like. However, DurationSeconds has a minimum allowed value of 900.
  • This solution would work in general for creating any service the boto3 client handles. You would just substitute the AWS service name in the client definition instead of using S3.

import boto3
import logging
import json

from airflow.models import Variable
from google.cloud import secretmanager

sm_client = secretmanager.SecretManagerServiceClient()
config = Variable.get("project_variable", deserialize_json=True)
secret_location = config["s3_secret"]
secret_request = sm_client.access_secret_version(name=secret_location)
s3_creds = json.loads(secret_request.payload.data.decode("UTF-8"))

logging.info('Creating boto3 session.')
session = boto3.Session(
    aws_access_key_id=s3_creds["access_key"],
    aws_secret_access_key=s3_creds["secret_key"],
)

logging.info(f'Assuming S3 Role: {s3_creds["role_arn"]}')
sts_connection = session.client('sts')
assume_role_object = sts_connection.assume_role(RoleArn=s3_creds["role_arn"],
                                                RoleSessionName='DataEngineering',
                                                DurationSeconds=900)
credentials = assume_role_object['Credentials']

logging.info('Creating S3 resource.')
s3 = boto3.client('s3',
                  aws_access_key_id=credentials['AccessKeyId'],
                  aws_secret_access_key=credentials['SecretAccessKey'],
                  aws_session_token=credentials['SessionToken'])

bucket = s3_creds["s3_bucket"].split("/")[2]
prefix = f'{s3_creds["s3_bucket"].split("/")[3]}/'
objects = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, Delimiter='/')

Upvotes: 0

eatsfood

Reputation: 1088

Assuming that 1) the ~/.aws/config or ~/.aws/credentials file is populated with each of the roles that you wish to assume and that 2) the default profile is allowed to call sts:AssumeRole on each of those roles in its IAM policy (a sketch of such a policy statement follows the code), then you can simply (in pseudo-code) do the following and not have to fuss with STS:

import boto3

# get all of the profiles from the AWS config/credentials files
profiles = boto3.Session().available_profiles

for profile in profiles:

    # this is only used to fetch the available regions
    initial_session = boto3.Session(profile_name=profile)

    # get the regions
    regions = initial_session.get_available_regions('ec2')

    # cycle through the regions, setting up session, resource and client objects
    for region in regions:
        boto3_session = boto3.Session(profile_name=profile, region_name=region)
        boto3_resource = boto3_session.resource(service_name='s3', region_name=region)
        boto3_client = boto3_session.client(service_name='s3', region_name=region)

        # [ do something interesting with your session/resource/client here ]
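
For reference, the AssumeRole permission mentioned in assumption 2 is typically granted by a policy statement along these lines (the account ID and role name are illustrative):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": "arn:aws:iam::123456789012:role/name-of-role"
        }
    ]
}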

Upvotes: 1

alfredocambera

Reputation: 3410

#!/usr/bin/env python3

import boto3

sts_client = boto3.client('sts')
assumed_role = sts_client.assume_role(RoleArn =  "arn:aws:iam::123456789012:role/example_role",
                                      RoleSessionName = "AssumeRoleSession1",
                                      DurationSeconds = 1800)
session = boto3.Session(
    aws_access_key_id     = assumed_role['Credentials']['AccessKeyId'],
    aws_secret_access_key = assumed_role['Credentials']['SecretAccessKey'],
    aws_session_token     = assumed_role['Credentials']['SessionToken'],
    region_name           = 'us-west-1'
)

# now we make use of the role to retrieve a parameter from SSM
client = session.client('ssm')
response = client.get_parameter(
    Name = '/this/is/a/path/parameter',
    WithDecryption = True
)
print(response)

Upvotes: 4

Vinay

Reputation: 1651

Here's a code snippet from the official AWS documentation where an S3 resource is created for listing all S3 buckets. Boto3 resources or clients for other services can be built in a similar fashion.

import boto3

# create an STS client object that represents a live connection to the
# STS service
sts_client = boto3.client('sts')

# Call the assume_role method of the STSConnection object and pass the role
# ARN and a role session name.
assumed_role_object=sts_client.assume_role(
    RoleArn="arn:aws:iam::account-of-role-to-assume:role/name-of-role",
    RoleSessionName="AssumeRoleSession1"
)

# From the response that contains the assumed role, get the temporary 
# credentials that can be used to make subsequent API calls
credentials=assumed_role_object['Credentials']

# Use the temporary credentials that AssumeRole returns to make a 
# connection to Amazon S3  
s3_resource=boto3.resource(
    's3',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)

# Use the Amazon S3 resource object that is now configured with the 
# credentials to access your S3 buckets. 
for bucket in s3_resource.buckets.all():
    print(bucket.name)
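
Since the question asks about ELBs specifically, the same temporary credentials can be used to build a load-balancer client in the same fashion (a sketch; the region is an assumption, and 'elbv2' would be the service name for Application/Network Load Balancers):

# sketch: reuse the temporary credentials for classic Elastic Load Balancing
elb_client = boto3.client(
    'elb',
    region_name='us-east-1',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)

for lb in elb_client.describe_load_balancers()['LoadBalancerDescriptions']:
    print(lb['LoadBalancerName'])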

Upvotes: 87

meh Man

Reputation: 11

After a few days of searching, this is the simplest solution I have found. It is explained here, but it does not have a usage example.

import boto3


for profile in boto3.Session().available_profiles:

    boto3.DEFAULT_SESSION = boto3.session.Session(profile_name=profile)

    s3 = boto3.resource('s3')

    for bucket in s3.buckets.all():
        print(bucket)

This will switch the default profile you are using. To not make the profile the default, just do not assign it to boto3.DEFAULT_SESSION, but instead do the following:

testing_profile = boto3.session.Session(profile_name='mainTesting')
s3 = testing_profile.resource('s3')

for bucket in s3.buckets.all():
    print(bucket)

It is important to note that the ~/.aws/credentials file needs to be set up in a specific way:

[default]
aws_access_key_id = default_access_id
aws_secret_access_key = default_access_key

[main]
aws_access_key_id = main_profile_access_id
aws_secret_access_key = main_profile_access_key

[mainTesting]
source_profile = main
role_arn = Testing role arn
mfa_serial = mfa_arn_for_main_role

[mainProduction]
source_profile = main
role_arn = Production role arn
mfa_serial = mfa_arn_for_main_role

I don't know why, but the mfa_serial key has to be on the role profiles for this to work instead of on the source profile, which would make more sense.

Upvotes: 0

upaang saxena

Reputation: 789

You can assume a role using an STS token, like this:

from boto3.session import Session

class Boto3STSService(object):
    def __init__(self, arn):
        # ARN_ACCESS_KEY, ARN_SECRET_KEY and ARN_ROLE_SESSION_NAME are
        # expected to be defined elsewhere (e.g. settings or environment)
        sess = Session(aws_access_key_id=ARN_ACCESS_KEY,
                       aws_secret_access_key=ARN_SECRET_KEY)
        sts_connection = sess.client('sts')
        assume_role_object = sts_connection.assume_role(
            RoleArn=arn, RoleSessionName=ARN_ROLE_SESSION_NAME,
            DurationSeconds=3600)
        self.credentials = assume_role_object['Credentials']

This will give you a temporary access key and secret key, along with a session token. With these temporary credentials, you can access any service. For example, if you want to access ELB, you can use the code below:

# inside your own class, store the temporary credentials once:
self.tmp_credentials = Boto3STSService(arn).credentials

def get_boto3_session(self):
    tmp_access_key = self.tmp_credentials['AccessKeyId']
    tmp_secret_key = self.tmp_credentials['SecretAccessKey']
    security_token = self.tmp_credentials['SessionToken']

    boto3_session = Session(
        aws_access_key_id=tmp_access_key,
        aws_secret_access_key=tmp_secret_key, aws_session_token=security_token
    )
    return boto3_session

def get_elb_boto3_connection(self, region):
    sess = self.get_boto3_session()
    elb_conn = sess.client(service_name='elb', region_name=region)
    return elb_conn

Upvotes: 22

singh30

Reputation: 1503

import json
import boto3


roleARN = 'arn:aws:iam::account-of-role-to-assume:role/name-of-role'
client = boto3.client('sts')
response = client.assume_role(RoleArn=roleARN, 
                              RoleSessionName='RoleSessionName', 
                              DurationSeconds=900)

dynamodb_client = boto3.client('dynamodb', region_name='us-east-1',
                    aws_access_key_id=response['Credentials']['AccessKeyId'],
                    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
                    aws_session_token = response['Credentials']['SessionToken'])

response = dynamodb_client.get_item(
    Key={
        'key1': {
            'S': '1',
        },
        'key2': {
            'S': '2',
        },
    },
    TableName='TestTable')
print(response)

Upvotes: 1

billkw

Reputation: 3699

If you want a functional implementation, this is what I settled on:

from typing import Optional

import boto3


def filter_none_values(kwargs: dict) -> dict:
    """Returns a new dictionary excluding items where value was None"""
    return {k: v for k, v in kwargs.items() if v is not None}


def assume_session(
    role_session_name: str,
    role_arn: str,
    duration_seconds: Optional[int] = None,
    region_name: Optional[str] = None,
) -> boto3.Session:
    """
    Returns a session with the given name and role.
    If not specified, duration will be set by AWS, probably at 1 hour.
    If not specified, region will be left unset.
    Region can be overridden by each client or resource spawned from this session.
    """
    assume_role_kwargs = filter_none_values(
        {
            "RoleSessionName": role_session_name,
            "RoleArn": role_arn,
            "DurationSeconds": duration_seconds,
        }
    )
    credentials = boto3.client("sts").assume_role(**assume_role_kwargs)["Credentials"]
    create_session_kwargs = filter_none_values(
        {
            "aws_access_key_id": credentials["AccessKeyId"],
            "aws_secret_access_key": credentials["SecretAccessKey"],
            "aws_session_token": credentials["SessionToken"],
            "region_name": region_name,
        }
    )
    return boto3.Session(**create_session_kwargs)


def main() -> None:
    session = assume_session(
        "MyCustomSessionName",
        "arn:aws:iam::XXXXXXXXXXXX:role/TheRoleIWantToAssume",
        region_name="us-east-1",
    )
    client = session.client(service_name="ec2")
    print(client.describe_key_pairs())

Upvotes: 7

Jarrad

Reputation: 1027

To get a session with an assumed role:

import botocore
import boto3
import datetime
from dateutil.tz import tzlocal

assume_role_cache: dict = {}
def assumed_role_session(role_arn: str, base_session: botocore.session.Session = None):
    base_session = base_session or boto3.session.Session()._session
    fetcher = botocore.credentials.AssumeRoleCredentialFetcher(
        client_creator = base_session.create_client,
        source_credentials = base_session.get_credentials(),
        role_arn = role_arn,
        extra_args = {
        #    'RoleSessionName': None # set this if you want something non-default
        }
    )
    creds = botocore.credentials.DeferredRefreshableCredentials(
        method = 'assume-role',
        refresh_using = fetcher.fetch_credentials,
        time_fetcher = lambda: datetime.datetime.now(tzlocal())
    )
    botocore_session = botocore.session.Session()
    botocore_session._credentials = creds
    return boto3.Session(botocore_session = botocore_session)

# usage:
session = assumed_role_session('arn:aws:iam::ACCOUNTID:role/ROLE_NAME')
ec2 = session.client('ec2') # ... etc.

The resulting session's credentials will be automatically refreshed when required, which is quite nice.

Note: my previous answer was outright wrong but I can't delete it, so I've replaced it with a better, working answer.

Upvotes: 34
