everspader

Reputation: 1700

How to run Python Azure SDK in a Github Actions workflow with federated credentials authentication

I have a Python tool that interacts with the Azure SDK (listing and uploading blobs). Typically I would use something like:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

creds = DefaultAzureCredential()
client = BlobServiceClient(..., credential=creds)

Locally this relies on authentication via the user currently logged in to the CLI. I would now like to run this script in a GitHub Actions workflow.

The catch is that the authentication method in the GitHub action MUST be via federated credentials.

I have a service principal with the federated identity configured, and I can authenticate and log in using the azure/login action. The problem is that in the step where the Python script runs, the logged-in CLI context from the previous step does not seem to be carried over, because I see the following error:

azure.core.exceptions.HttpResponseError: This request is not authorized to perform this operation using this permission.

My workflow looks like this (simplified):

jobs:
  execute:
    runs-on: ubuntu-latest
    steps:
      # checkout and other stuff
      # ...

      - name: AZ Login
        uses: Azure/login@v1
        with:
          client-id: ${{ vars.AZURE_CLIENT_ID }}
          subscription-id: ${{ vars.AZURE_SUBSCRIPTION_ID }}
          tenant-id: ${{ vars.AZURE_TENANT_ID }}

      # setup python and install packages
      # ...

      - name: Run script
        run: |
          .venv/bin/python script.py
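For what it's worth, DefaultAzureCredential includes AzureCliCredential in its chain, so a login performed by azure/login in an earlier step of the same job is normally visible to later steps on the same runner. One way to check whether the CLI context actually carries over is to query the account in the script step (a sketch; the venv path matches the workflow above):

```yaml
      - name: Run script
        run: |
          # If this prints the expected subscription, the az login context
          # from the previous step is available in this step as well
          az account show --query "{subscription: id, user: user.name}"
          .venv/bin/python script.py
```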

Question is: how can I propagate the credentials from the azure/login step into the Python runtime environment?

Upvotes: 0

Views: 441

Answers (1)

Bhavani

Reputation: 5262

This request is not authorized to perform this operation using this permission.

If the service principal doesn't have permission to read or perform the operation, you will get the above error. To resolve it, assign the Storage Blob Data Contributor role to the service principal on the blob storage account as follows:

Step 1: Go to the storage account's Access control (IAM) blade, click Add, and select Add role assignment.

Step 2: Search for the Storage Blob Data Contributor role and select it.

Step 3: Select the service principal as the member of the role assignment.
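The same role assignment can also be made from the CLI instead of the portal (a sketch; the client id and scope below are placeholders for your own values):

```shell
# Grant the service principal the Storage Blob Data Contributor role,
# scoped to the storage account (ids below are placeholders)
az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee "<clientId>" \
  --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Storage/storageAccounts/<storageAccountName>"
```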

After the role assignment succeeds, you will be able to read a file or list the files in the container using the code below:

import io

import pandas as pd
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storageAccountName>.blob.core.windows.net"
container_name = "<containerName>"
file_path = "<filePath>"
TENANT_ID = "<tenantId>"
CLIENT_ID = "<clientId>"
CLIENT_SECRET = "<clientSecret>"

# Authenticate as the service principal and connect to the storage account
credentials = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
blob_service_client = BlobServiceClient(account_url=account_url, credential=credentials)
container_client = blob_service_client.get_container_client(container_name)

# List all blobs in the container
blobs_list = container_client.list_blobs()
for blob in blobs_list:
    print(blob.name)

# Download one blob and load its CSV content into a DataFrame
blob_client = container_client.get_blob_client(file_path)
blob_data = blob_client.download_blob()
csv_data = blob_data.content_as_text()
df = pd.read_csv(io.StringIO(csv_data))
print("Sample.CSV data:")
print(df)
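The question also mentions uploading blobs; the same container_client covers that through its upload_blob method. A minimal sketch (the helper name is mine, not part of the SDK):

```python
def upload_text_blob(container_client, blob_name, text):
    """Upload (or overwrite) a blob from an in-memory string.

    container_client is an azure.storage.blob.ContainerClient; text can be
    str or bytes -- the SDK accepts either for the data parameter.
    """
    container_client.upload_blob(name=blob_name, data=text, overwrite=True)
```

For example, `upload_text_blob(container_client, "report.csv", df.to_csv(index=False))` would write the DataFrame loaded above back to the container.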


Upvotes: 0
