Imagine you have a whole lot of data stored in Azure and you want to save as much as possible on storage costs. By default, Azure Blob Storage puts every blob in the Hot tier, which has the highest storage cost but the cheapest reads.
To give you an idea of the cost savings, here is General Purpose v2 storage account pricing, in US dollars, at the time of publication:
| | Hot | Cool | Archive |
|---|---|---|---|
| Storage, first 50 TB / month | $0.0208 per GB | $0.0152 per GB | $0.0025 per GB |
| Storage, next 450 TB / month | $0.02 per GB | $0.0152 per GB | $0.0025 per GB |
| Storage, over 500 TB / month | $0.0192 per GB | $0.0152 per GB | $0.0025 per GB |
| Read operations (per 10,000) | $0.0044 | $0.01 | $5.50 |
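To make those rates concrete, here is a quick back-of-the-envelope sketch in PowerShell using the first-50-TB rates from the table above (the variable names are just placeholders for this illustration):

```powershell
# Monthly storage cost for 50 TB kept entirely in Hot vs entirely in Archive,
# using the first-50-TB per-GB rates from the pricing table above.
$gb      = 50 * 1024       # 50 TB expressed in GB
$hot     = $gb * 0.0208    # Hot tier rate per GB
$archive = $gb * 0.0025    # Archive tier rate per GB
"Hot: {0:C2} / month  Archive: {1:C2} / month" -f $hot, $archive
```

That works out to roughly $1,065 per month in Hot versus $128 per month in Archive, a saving of about $937 per month on the first 50 TB alone, before read costs are factored in.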
Upgrading to a v2 storage account is super easy.
With a v2 storage account, your only options were to set the storage tier for the entire storage account, or manually on individual blobs (which can be an administration nightmare) – until now.
I have written a PowerShell script that automates setting the tier on files (blobs) older than a specific number of days which you specify. Then you let the script do the work; it can be run interactively or fully automated using Azure Automation.

The script scans a designated storage account for all blobs, then sets the tier (Hot, Cool or Archive) on each blob that is older than the retention period in days that you specify.
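The core of that loop can be sketched roughly as follows. This is a minimal illustration, assuming the Az.Storage module and an already-authenticated session (`Connect-AzAccount`); the variables `$resourceGroup`, `$storageAccountName`, `$tier` and `$retentionDays` are placeholders you would supply:

```powershell
# Placeholder inputs for this sketch - substitute your own values.
$ctx = (Get-AzStorageAccount -ResourceGroupName $resourceGroup `
        -Name $storageAccountName).Context
$cutoff = (Get-Date).AddDays(-$retentionDays)

# Walk every container, find blobs last modified before the cutoff,
# and set their access tier (Hot, Cool or Archive).
foreach ($container in Get-AzStorageContainer -Context $ctx) {
    Get-AzStorageBlob -Container $container.Name -Context $ctx |
        Where-Object { $_.LastModified -lt $cutoff } |
        ForEach-Object { $_.ICloudBlob.SetStandardBlobTier($tier) }
}
```

Note that `SetStandardBlobTier` only applies to block blobs on supported account types, which is why the account-type restriction below matters.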
Setting the access tier at the object level is only supported for Standard (LRS/GRS/RA-GRS) Blob Storage and General Purpose v2 accounts. See https://aka.ms/blobtiering
There are two scripts: the script below, which can be run interactively and prompts you for the storage account, tier and retention in days; and an Azure Automation runbook which I have published, which you can put on a schedule and run daily across your storage accounts.
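If you go the runbook route, scheduling it daily can be sketched like this with the Az.Automation cmdlets. This assumes the runbook is already imported and published in your Automation account; the resource group, account, runbook and schedule names here are hypothetical examples:

```powershell
# Hypothetical names for this sketch - replace with your own resources.
$rg      = "MyResourceGroup"
$account = "MyAutomationAccount"

# Create a schedule that fires once a day, starting tomorrow.
New-AzAutomationSchedule -ResourceGroupName $rg `
    -AutomationAccountName $account `
    -Name "DailyBlobTiering" `
    -StartTime (Get-Date).AddDays(1) `
    -DayInterval 1

# Link the published runbook to that schedule.
Register-AzAutomationScheduledRunbook -ResourceGroupName $rg `
    -AutomationAccountName $account `
    -RunbookName "Set-BlobTierByAge" `
    -ScheduleName "DailyBlobTiering"
```

One schedule can be linked to the runbook multiple times with different parameters if you want to cover several storage accounts.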