Azure Storage PowerShell Module


This may not come as a surprise, but there is no shortage of children being born and raised these days. My wife and I are guilty of taking huge numbers of pictures to hold onto all those memories too. We have years of digital images piling up all over the place. Some of the places the pictures pile up:

  • Google Photos
  • Dropbox
  • Jump drives
  • Old computers
  • New computers

They are everywhere! Since purchasing a new machine, I have consolidated all of them into the appropriate year directories, then compressed each directory for archival purposes. They have sat in this state for the past couple of months, until a thought passed me by: "What if the drive on this machine has an issue? What happens if I lose these archives?" While that is not likely, it is possible. Given how irreplaceable these images are, something had to be done.

In steps Azure Storage to save us. If you have not used it before, it is disk space that you pay for in the cloud. There are a handful of ways to get files into Azure Storage, and choosing one of them presented me with the opportunity to work with the Azure PowerShell modules. So how did I do this? Great question, let's dig into the commands issued to move all this data!

First, the modules have to be installed. That is quickly done by executing Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force. Note, you may have to set a less restrictive execution policy, so if there is an error, try executing Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser and then re-execute the first command. Don't be alarmed, this will take some time; several tools are being added to your machine with this installation.
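If it helps to see them together, here are those same two commands as a block that can be pasted into a PowerShell session (the execution policy line is only needed if the install errors out):

 # Allow locally created scripts to run; downloaded scripts must still be signed.
 # Only needed if Install-Module fails due to a restrictive execution policy.
 Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

 # Install the Az modules for the current user from the PowerShell Gallery.
 Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force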

Now that the tools are there, we next need to authenticate ourselves by executing Connect-AzAccount -Tenant $tenantId. Your tenant ID can be found in the Azure portal. All the setup is now complete and it's time to deploy a new Azure Storage account. To describe these commands, I will first show you everything, then use the comments in the scripts to describe what each command performs.
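The scripts below lean on a few variables. As a minimal sketch of the setup they assume, here is what they might look like; the tenant ID, resource group, account, and container names are placeholders, not the real values, and the New-AzResourceGroup line is only needed if the resource group does not already exist:

 # Placeholder values; substitute your own. Storage account names must be globally
 # unique, 3-24 characters long, and contain only lowercase letters and numbers.
 $tenantId = "00000000-0000-0000-0000-000000000000"
 $resourceGroupName = "photo-archive-rg"
 $storageAccountName = "photoarchivestorage"
 $containerName = "photos"

 # Sign in to the tenant identified above.
 Connect-AzAccount -Tenant $tenantId

 # Create the resource group if it does not already exist.
 New-AzResourceGroup -Name $resourceGroupName -Location "centralus"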

# Creates a new storage account in the provided resource group. This is the command
# that deploys the Azure Storage service. The account object it returns carries a
# Context property, which will be used in subsequent commands to create containers
# and upload file content, so make sure to capture this information. This particular
# Azure Storage service will be the standard locally-redundant storage service
# located in the Central US region, defaulting storage containers to the cool access
# tier. For more information on the available access tiers check
# here: https://docs.microsoft.com/en-us/azure/storage/blobs/access-tiers-overview
 $storageAccount = New-AzStorageAccount `
 -ResourceGroupName $resourceGroupName `
 -Name $storageAccountName `
 -SkuName Standard_LRS `
 -Location "centralus" `
 -AccessTier "Cool"

 # The account's context is what the data-plane commands below authenticate with.
 $storageContext = $storageAccount.Context

 # Create a new container within the Azure Storage service provisioned above with
 # the provided name. A container is roughly synonymous with a directory; it is
 # where the uploaded blobs will live. The "blob" permission allows anonymous public
 # read access to individual blobs, but not listing of the container's contents.
 New-AzStorageContainer `
 -Name $containerName `
 -Context $storageContext `
 -Permission "blob"

 # Upload a file from the executing machine to the provided container within the
 # Azure Storage service provisioned above. The -File parameter is the path to the
 # file on disk, while the -Blob parameter is the name to be assigned to the asset
 # within the Azure Storage container.
 Set-AzStorageBlobContent `
 -File '.\image.png' `
 -Container $containerName `
 -Blob 'image.png' `
 -Context $storageContext

Great, at this point I have been able to upload single files to Azure. This could all be done within the Azure portal; in fact, this route took longer than doing it via the web tooling. So what can be done to speed this up? I had a handful of files that all shared the same file extension. How about scripting the upload of all those files? For this, two commands have to be piped together. The first:

 Get-ChildItem `
 -Path .\ `
 -Filter *.png `
 -File 

The above is a common PowerShell command to get all the items within a directory. In this case, the command is being executed in the same directory that contains the files. This is communicated by the -Path .\ parameter, where .\ is short for the current directory. The command then filters for all items whose names end with .png. Finally, the -File switch restricts the results to items that are files.

ForEach-Object { `
Set-AzStorageBlobContent `
-File (Join-Path -Path '.\' -ChildPath $_) `
-Container $containerName `
-Blob $_ `
-Context $storageContext }

Once we have all the files we are interested in, we want to iterate over each one and, for each item, issue a command to upload its content to Azure Storage. In the command that uploads the file, there is some path joining to dynamically create the relative path. $_ can be found in two locations in this upload script. It is a special variable inside the ForEach-Object script block that holds the current item being iterated over. Putting it all together, this is the command to execute:

 Get-ChildItem `
 -Path .\ `
 -Filter *.png `
 -File | ForEach-Object { `
 Set-AzStorageBlobContent `
 -File (Join-Path -Path '.\' -ChildPath $_) `
 -Container $containerName `
 -Blob $_ `
 -Context $storageContext }

Notice the pipe character before the ForEach-Object? It takes the output of the previous command and passes it as input to the next command. Placed where it is here, the output of the file search is passed as input to ForEach-Object to be iterated over. Pretty slick, huh?
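As an aside, the same piping pattern can be used to double-check what actually landed in the container. This is a minimal sketch rather than part of the archive workflow above: it lists the blobs in the container and selects a few properties to display.

 # List every blob in the container and keep only the columns of interest.
 Get-AzStorageBlob `
 -Container $containerName `
 -Context $storageContext |
 Select-Object Name, Length, LastModified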

Given the amount of power that one has in front of a PowerShell terminal, this scripting could have been taken a step further. But everything described solved what I was interested in completing with PowerShell. It was a quick and easy tool to execute in several directories on my machine to move all those precious memories of the children growing up to a more stable archive location.
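For anyone who does want to take it a step further, here is a minimal sketch of one possible next step: wrapping the pipeline in a reusable function. The function name and parameters are hypothetical, not part of the workflow above, and it uses the more explicit $_.FullName and $_.Name properties instead of relying on $_ converting to a file name.

 # A hypothetical helper that uploads every file matching a filter from a directory.
 function Send-DirectoryToAzureStorage {
     param(
         [string] $Path,
         [string] $Filter,
         [string] $ContainerName,
         $Context
     )

     Get-ChildItem -Path $Path -Filter $Filter -File |
         ForEach-Object {
             # Upload each file, naming the blob after the file itself.
             Set-AzStorageBlobContent `
                 -File $_.FullName `
                 -Container $ContainerName `
                 -Blob $_.Name `
                 -Context $Context
         }
 }

 # Example usage, assuming the variables defined earlier:
 # Send-DirectoryToAzureStorage -Path 'C:\Photos\2019' -Filter '*.zip' -ContainerName $containerName -Context $storageContext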