Configuration of Azure Blob storage for application access

The following steps are useful if you have an application (service) on premises that needs to access (download/upload) files in Microsoft Azure Blob storage.

Open the Azure web portal and configure a free tier account if you have not already done so.

Create a blob storage resource.



Give the storage account an appropriate name; remember, this must be unique across the whole of Azure. Select the account kind ‘Blob storage’ and select ‘New’ in the ‘Resource group’ section (if you have already created a resource group, you can select an existing one instead). Select ‘Pin to dashboard’ so you can access the account directly from the dashboard.



From the dashboard, click on the storage account ‘appazurestorage1’ you just created.



Create a new ‘Container’. A container is a kind of folder; in AWS and Google Cloud the equivalent is called a bucket. Keep the access level private.



Click on the ‘Access keys’ tab and note down the storage account name and key.



The basic configuration of Blob storage is now ready. You can upload a few files manually through the Azure portal by clicking on the container you just created.
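Each file stored this way is addressable through a predictable endpoint. As a small illustration, here is a minimal Python helper that builds a blob URL from the account, container and blob names (the container and file names below are made-up examples; a private container still requires the access key or a SAS token to actually read the blob):

```python
def blob_url(account: str, container: str, blob_name: str) -> str:
    """Build the endpoint URL for a blob.

    Azure Blob storage URLs follow the pattern:
    https://<account>.blob.core.windows.net/<container>/<blob>
    """
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"

# 'appdata' and 'report.csv' are hypothetical names for illustration.
print(blob_url("appazurestorage1", "appdata", "report.csv"))
# https://appazurestorage1.blob.core.windows.net/appdata/report.csv
```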



Testing container access:

Method 1: using CloudBerry
a. Download and install CloudBerry Explorer for Azure (freeware) on your on-premises server.
b. Connect to Azure Blob storage using the access key and account name.



c. Test copying (or cutting) and pasting files using the CloudBerry console.



Method 2: using the Microsoft AzCopy command-line tool.
Download and install AzCopy on the server. Open a command prompt and switch to the AzCopy directory, which is most likely "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy".

Download all blobs (files) in a container to local disk.
Sample syntax:

AzCopy /Source:https://<account>.blob.core.windows.net/<container> /Dest:C:\myfolder /SourceKey:<key> /S


AzCopy /Source:https://<account>.blob.core.windows.net/<container> /Dest:C:\temp\Azure /SourceKey:pQrvBr+rwoI9psWKx73SKcrE8M0JW+ZUQeIY05CJ+PJMGSFMpXV+U9Maygbtiwtc69+aPkabmZna6hxfhuw2NA== /S /SourceType:Blob




Upload all blobs (files) from a local folder to a container (bucket).

AzCopy /Source:C:\myfolder /Dest:https://<account>.blob.core.windows.net/<container> /DestKey:<key> /S


AzCopy /Source:C:\temp\Azure\upload /Dest:https://<account>.blob.core.windows.net/<container> /DestKey:pQrvBr+rwoI9psWKx73SKcrE8M0JW+ZUQeIY05CJ+PJMGSFMpXV+U9Maygbtiwtc69+aPkabmZna6hxfhuw2NA== /S
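Because these AzCopy calls follow a fixed argument pattern, scripted transfers can assemble the command programmatically. A minimal Python sketch of such a helper (the function name and the placeholder account/container/key values are for illustration only; this follows the older AzCopy v7-style flags shown above):

```python
def azcopy_args(source: str, dest: str, key: str, key_side: str = "source",
                recursive: bool = True) -> list:
    """Assemble an AzCopy (v7-style syntax) argument list.

    key_side selects /SourceKey (downloads) or /DestKey (uploads).
    """
    args = ["AzCopy", f"/Source:{source}", f"/Dest:{dest}"]
    args.append(f"/SourceKey:{key}" if key_side == "source" else f"/DestKey:{key}")
    if recursive:
        args.append("/S")  # copy blobs/files recursively
    return args

# Download: blob container -> local folder (placeholder values)
print(" ".join(azcopy_args(
    "https://<account>.blob.core.windows.net/<container>",
    r"C:\temp\Azure", "<key>", key_side="source")))
```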



More details on the AzCopy tool are available in Microsoft's AzCopy documentation.

Configuration of AWS S3 (Simple Storage Service) for application access.

The following steps are useful if you have an application (service) on premises that needs to access (download/upload) files in AWS S3 storage.

Sign in to the AWS Console with the root account credentials.


Select IAM under the ‘Security, Identity & Compliance’ section.


Add a new user for API or Console access.


Give an appropriate ‘User name’ and select the access type.

Please note, selecting both access types isn't recommended for production use. This demo requires only ‘Programmatic access’. You can use the same user account for delegation of AWS storage-related tasks managed via the AWS Console.


Select ‘Attach existing policies directly’, then search for S3 and attach ‘AmazonS3FullAccess’.


Review the settings and then click on ‘Create user’.

Note down the user name, access key ID, secret access key and sign-in URL. You can additionally download a CSV file containing all of this information.
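Rather than hard-coding these values in an application, the AWS CLI and SDKs conventionally read them from the shared credentials file (~/.aws/credentials on Linux, %USERPROFILE%\.aws\credentials on Windows). A sketch of its format, with placeholder values:

```ini
[default]
aws_access_key_id = <access-key-id>
aws_secret_access_key = <secret-access-key>
```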


Select ‘S3’ from ‘Storage’ section.



Click on ‘Create bucket’, give the bucket an appropriate name and select a ‘Region’; the bucket name must be unique across the whole of AWS. Then click on ‘Create’. I have skipped the remaining options, such as versioning, permissions and static-website settings, for this test. If you need specific settings, please refer to the AWS S3 documentation.
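Because bucket names must be unique and follow strict naming rules, it can help to sanity-check a candidate name before attempting creation. A hedged Python sketch of the core rules (3 to 63 characters; lowercase letters, digits, hyphens and dots; starting and ending with a letter or digit; this is not an exhaustive check of every S3 rule):

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check the core S3 bucket-naming rules (not an exhaustive list).

    - 3 to 63 characters long
    - lowercase letters, digits, hyphens and dots only
    - must start and end with a lowercase letter or digit
    """
    if not 3 <= len(name) <= 63:
        return False
    return re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name) is not None

print(is_valid_bucket_name("appdatatest1"))  # True
print(is_valid_bucket_name("AppDataTest"))   # False (uppercase not allowed)
```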


You can upload files manually using AWS Console.



Testing bucket access from an on-premises application.

Method 1: using CloudBerry

Install CloudBerry Explorer (freeware) and connect to the AWS S3 bucket.


You can copy (or cut) and paste files from the local machine to S3, or vice versa.



Method 2: using PowerShell

Install AWS Tools for PowerShell, open PowerShell, and use the following commands to test bucket access.

Set credentials:

Set-AWSCredentials -AccessKey AKIAI3ZDRI4HGSD4NOGQ -SecretKey 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

Get all buckets:

Get-S3Bucket


Show the contents of a specified bucket:

Get-S3Object -BucketName appdatatest1 -MaxKey 100 | Format-Table


Refer to the AWS Tools for PowerShell documentation for more details on the AWS PowerShell commands.

Enable Multi-Factor Authentication for your AWS account

If you are an AWS beginner like me and are wondering how to secure the account against unauthorized access, enable multi-factor authentication using a smartphone app code.

Sign in to the AWS console with the root account.



Select IAM (Identity & Access Management) under the ‘Security, Identity & Compliance’ section.



Under the Dashboard, select ‘Activate MFA on your root account’.



Click on ‘Manage MFA device’, then select ‘A virtual MFA device’.


Before clicking on ‘Next Step’, install the ‘Google Authenticator’ app on your smartphone. Following is the list of apps supported by AWS.

 Android: Google Authenticator; Authy 2-Factor Authentication
 iPhone: Google Authenticator; Authy 2-Factor Authentication
 Windows Phone: Authenticator
 Blackberry: Google Authenticator


Click on ‘Next Step’, then scan the barcode with the ‘Google Authenticator’ mobile app.



Once the barcode has been scanned successfully on the phone, enter two consecutive authentication codes (a new code is generated every 30 seconds), click on ‘Activate virtual MFA’ and then click on Finish.
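Virtual MFA apps such as Google Authenticator implement TOTP (RFC 6238): each code is an HMAC-SHA1 of the current 30-second time step, keyed with the shared secret encoded in the barcode, which is why AWS asks for two consecutive codes to confirm the secret and clock are in sync. A minimal Python sketch of the algorithm, for illustration only (checked against the standard 8-digit RFC 6238 test vector, not against AWS itself):

```python
import hmac
import hashlib
import struct

def totp(secret: bytes, unix_time: int, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 of the time-step counter, dynamically truncated."""
    counter = struct.pack(">Q", unix_time // period)   # 8-byte big-endian time step
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890", time 59 -> "94287082"
print(totp(b"12345678901234567890", 59, digits=8))  # 94287082
```

Google Authenticator uses the 6-digit variant of the same scheme with a 30-second period.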



Refresh the AWS page and you will see that MFA is activated.



If you sign out and then sign in again to the AWS console, you will need to supply the username, password and the authentication code generated by the mobile app.


Refer to the following links for more information.

Multi-Factor Authentication

Using Multi-Factor Authentication (MFA) in AWS