Reputation: 151
We are trying to rehydrate some compressed backup files from the Azure Archive tier to the Cool tier. We are not concerned about the price of the data itself (10 TB); what we are concerned about is the potential I/O charge for moving each individual file between tiers. The backup process saves each job in "chunks" (i.e. folders) which contain a file tree of individual compressed blocks of varying size. There could be 100 thousand or 100 billion of these individual blocks in the entire 10 TB; we don't know.
Are there any PowerShell scripts/commands that could give the count of individual objects in the entire 10 TB blob container? If we could determine the object count, we could then calculate the I/O costs. We have command-line access and know that some of the API calls only return results in 5,000-item pages. We also know that any recursive search could take a long time.
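For reference, the calculation we have in mind is roughly the object count divided by the billing unit of 10,000 operations, multiplied by the per-unit rate. A minimal sketch, assuming a placeholder rate ($readPricePer10k is illustrative only, not an actual Azure price):

$blobCount       = 250000   # example: object count returned by a listing script
$readPricePer10k = 5.00     # placeholder rate per 10,000 archive read operations; check current Azure pricing
$estimatedReadCost = ($blobCount / 10000) * $readPricePer10k
Write-Host ("Estimated rehydration read-operation cost: {0:C2}" -f $estimatedReadCost)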
Upvotes: 0
Views: 2404
Reputation: 72171
# Get the storage account so its context can be used for the blob calls
$storageAccount = Get-AzStorageAccount `
    -ResourceGroupName $resourceGroup `
    -Name $storageAccountName

$blobCount = 0
$token = $null
$MaxReturn = 5000   # the service returns at most 5,000 blobs per call

do {
    # Fetch the next page of blobs, continuing from the previous page's token
    $Blobs = Get-AzStorageBlob -Context $storageAccount.Context -Container $containerName `
        -MaxCount $MaxReturn -ContinuationToken $token
    if (-not $Blobs) { break }   # guard against an empty container or page
    $blobCount += $Blobs.Count
    # The continuation token is carried on the last blob of the page
    $token = $Blobs[$Blobs.Count - 1].ContinuationToken
} while ($token)
This is the script I came up with, based on this.
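The loop assumes you have already run Connect-AzAccount and that $resourceGroup, $storageAccountName and $containerName are set; a minimal sketch of wiring it up (the names below are placeholders):

$resourceGroup      = "backup-rg"        # placeholder resource group name
$storageAccountName = "backupstore01"    # placeholder storage account name
$containerName      = "backup-chunks"    # placeholder container name

# ... run the counting loop above, then report the total ...
Write-Host "Total blobs in '$containerName': $blobCount"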
Upvotes: 1