Reputation: 2583
I have an Azure storage account with multiple containers in it. I need to give access to a particular container using a security group (/ via an access package).
Considering least-privileged access in Azure, how can I grant access to my data storage/container (for example, blob) so that users can PIM up before accessing a specific container (e.g. a blob container) in Azure?
The roles that need to go via Privileged Identity Management (PIM) could be:
Storage Blob Data Reader
Storage Blob Data Contributor
I was going through the MS tutorials here (Link1, Link2, Link3), but I'm unable to figure out the right and best approach for this. Is there any other step-by-step guide? Thanks
Upvotes: 0
Views: 1352
Reputation: 10859
To start with, you can use the Azure portal itself for the Storage Blob Data Contributor and Storage Blob Data Owner roles, where an Azure role assignment condition can be added as an optional, additional check on the role assignment to provide more fine-grained access control.
Add or edit Azure role assignment conditions: PORTAL (this is a preview feature).
And yes, the scope can be at the container level.
Click Add condition to further refine the role assignment based on storage attributes.
You can then add a resource and conditions, for example: if the selected user tries to read a blob without the Project=Cascade tag, access will not be allowed.
• Resource indicates that the attribute is on the resource, such as the container name.
You have to type the name yourself; there is no drop-down list of containers/blobs to pick from.
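For reference, what the portal's condition builder ends up storing on the role assignment for that tag example is roughly the fragment below. This is only a sketch, assuming a blob index tag key named Project with value Cascade; the exact attribute path depends on which attribute you pick in the builder, and it mirrors the shape of the condition used in the ARM template further down.
{
  "condition": "((!(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read'})) OR (@Resource[Microsoft.Storage/storageAccounts/blobServices/containers/blobs/tags:Project<$key_case_sensitive$>] StringEquals 'Cascade'))",
  "conditionVersion": "2.0"
}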
This can also be done from an ARM template.
The following template shows how to assign the Storage Blob Data Reader role with a condition. The condition checks whether the container name equals 'blobs-example-container', expressed as @Resource[Microsoft.Storage/storageAccounts/blobServices/containers:name] StringEquals 'blobs-example-container'.
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "principalId": {
      "type": "string",
      "metadata": {
        "description": "Principal ID to assign the role to"
      }
    },
    "principalType": {
      "type": "string",
      "metadata": {
        "description": "Type of principal"
      }
    },
    "roleAssignmentGuid": {
      "type": "string",
      "defaultValue": "[newGuid()]",
      "metadata": {
        "description": "New GUID used to identify the role assignment"
      }
    }
  },
  "variables": {
    "StorageBlobDataReader": "[concat(subscription().Id, '/providers/Microsoft.Authorization/roleDefinitions/2a2b9908-6ea1-4ae2-8e65-a410df84e7d1')]" // ID for Storage Blob Data Reader role, but can be any valid role ID
  },
  "resources": [
    {
      "name": "[parameters('roleAssignmentGuid')]",
      "type": "Microsoft.Authorization/roleAssignments",
      "apiVersion": "2020-04-01-preview", // API version to call the role assignment PUT.
      "properties": {
        "roleDefinitionId": "[variables('StorageBlobDataReader')]",
        "principalId": "[parameters('principalId')]",
        "principalType": "[parameters('principalType')]",
        "description": "Role assignment condition created with an ARM template",
        "condition": "((!(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read'})) OR (@Resource[Microsoft.Storage/storageAccounts/blobServices/containers:name] StringEquals 'blobs-example-container'))", // Role assignment condition
        "conditionVersion": "2.0"
      }
    }
  ]
}
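To tie the template back to your security-group requirement, a parameters file could look roughly like the sketch below; the principalId value is just a placeholder for the object ID of your security group, and principalType is set to Group (these values are assumptions, not part of the template above).
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "principalId": {
      "value": "<security-group-object-id>"
    },
    "principalType": {
      "value": "Group"
    }
  }
}
Note that the role assignment is created at the scope the template is deployed to (or the scope set on the role assignment resource), with the condition then narrowing access to the one container.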
But additionally, to secure and limit users with a time-bound setting, approval workflow, audit trail, and so on, on top of the fine-grained conditions, you need to use PIM and add extra properties such as scheduleInfo and the admin request type, just like the one you have linked to here, using the portal or ARM (pim-resource-roles-assign-roles):
{
  "properties": {
    "principalId": "a3bb8764-cb92-4276-9d2a-ca1e895e55ea",
    "roleDefinitionId": "/subscriptions/dfa2a084-766f-4003-8ae1-c4aeb893a99f/providers/Microsoft.Authorization/roleDefinitions/c8d4ff99-41c3-41a8-9f60-21dfdad59608",
    "requestType": "AdminAssign",
    "scheduleInfo": {
      "startDateTime": "2022-07-05T21:00:00.91Z",
      "expiration": {
        "type": "AfterDuration",
        "endDateTime": null,
        "duration": "P365D"
      }
    },
    "condition": "@Resource[Microsoft.Storage/storageAccounts/blobServices/containers:ContainerName] StringEqualsIgnoreCase 'foo_storage_container'",
    "conditionVersion": "1.0"
  }
}
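One caveat, since you specifically want members to PIM up before they get access: the body above (requestType AdminAssign on a role assignment schedule request) creates an active assignment. To make the security group eligible instead, so that members must activate the role, an equivalent body can be sent to the roleEligibilityScheduleRequests ARM API at the chosen scope. The following is only a rough sketch, with placeholder IDs and the container-name condition reused from the earlier template (none of these values come from your environment):
{
  "properties": {
    "principalId": "<security-group-object-id>",
    "roleDefinitionId": "/subscriptions/<subscription-id>/providers/Microsoft.Authorization/roleDefinitions/2a2b9908-6ea1-4ae2-8e65-a410df84e7d1",
    "requestType": "AdminAssign",
    "scheduleInfo": {
      "startDateTime": "2022-07-05T21:00:00.91Z",
      "expiration": {
        "type": "AfterDuration",
        "endDateTime": null,
        "duration": "P365D"
      }
    },
    "condition": "((!(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read'})) OR (@Resource[Microsoft.Storage/storageAccounts/blobServices/containers:name] StringEquals 'blobs-example-container'))",
    "conditionVersion": "2.0"
  }
}
Members of the group then see the role as eligible and activate it through PIM; activation requirements such as approval, justification and maximum duration come from the PIM role settings (role management policy) at that scope.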
You can check the condition syntax formats here.
You can also make use of a storage SAS for container-level access control, where you can set an expiry time and grant specific permissions for those resources.
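If you do go the SAS route, one option is the storage resource provider's listServiceSas operation (or generating the SAS from the portal/Storage Explorer). Below is a rough sketch of a request body for a container-scoped, read+list SAS with an expiry; the account and container names are placeholders:
{
  "canonicalizedResource": "/blob/<storage-account-name>/<container-name>",
  "signedResource": "c",
  "signedPermission": "rl",
  "signedProtocol": "https",
  "signedExpiry": "2022-12-31T00:00:00Z"
}
Keep in mind a service SAS is signed with the account key rather than tied to an Azure AD identity, so it does not give you the PIM-style activation flow or audit trail; it is more of a fallback when RBAC/ABAC scoping does not fit.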
Reference: Azure attribute-based access control (Azure ABAC)
Upvotes: 0