Configuration of Google Cloud Storage for application access

The following steps can be useful if you have an application (service) on premises that needs to access (download/upload) files in Google Cloud Storage.

Open https://cloud.google.com/ and configure your free account if you haven’t done so already.
Open the console and create a new project.

CreateProject.jpg

 

Select the correct project if you have access to multiple projects.

Select Project.jpg

Select Storage from the ‘Products &amp; services’ menu.

Storage

 

Click on ‘Create Bucket’ and give it a name, storage class and location.

CreateBucketOption.jpg

CreateBucketOption2.jpg

 

Upload some files/folders manually.

UploadFiles

 

Click on Settings, then Interoperability, and note down the cloud storage URL, access key and secret key. If a secret key isn’t present, click on ‘Create a new key’.

AccessKey.jpg
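
These interoperability keys let an application talk to the bucket through the S3-compatible XML API. The following is only a minimal sketch (not part of the original walkthrough): it assumes the AWS Tools for PowerShell module is installed, that your module version supports the -EndpointUrl parameter, and that the bucket name ‘myappbucket’ and the key values are placeholders.

# Hedged sketch: use the GCS interoperability (HMAC) access/secret key pair
# through the S3-compatible XML API. Bucket name and keys are placeholders.
Import-Module AWSPowerShell

Set-AWSCredentials -AccessKey 'GOOG1EXAMPLEACCESSKEY' -SecretKey '<your-interop-secret-key>'

# List objects in the bucket, pointing the S3 cmdlet at Google's endpoint instead of AWS
Get-S3Object -BucketName 'myappbucket' -EndpointUrl 'https://storage.googleapis.com' |
    Format-Table Key, Size, LastModified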

 

Testing bucket access:-

Install CloudBerry (freeware) for Google Cloud on your test server.
Connect to the Google Cloud bucket using the access key and secret key. Make sure the ‘Access & secret key’ authentication option is selected.

CloudBerrySetup.jpg

 

Test copy (or cut) and paste of file(s) using the CloudBerry console.

UploadFiles.jpg

 

 


Configuring Rackspace Cloud Files Storage for API access

The following steps can be useful if you have an application (service/API) in your environment that needs to access (download/upload) files in Rackspace Cloud Files storage.

Sign up for Rackspace cloud.
Go to the Rackspace control panel and sign in with the root user account you configured during the signup process.

loginconsole.jpg

 

Create a new user account for API access. Go to User Management from the Account tab.

UserManagement.jpg

 

Click on ‘Create user’. This console lists all users created so far; one default (root) user will be available by default.

CreateUser.jpg

Give the user details such as first name, last name, email, phone, etc. The contact type must be Technical; then select the appropriate permissions on Rackspace cloud.

UserPermission.jpg

 

 

Once the user is created successfully, go to the properties of the user account and copy the Rackspace API key.

UserAPIKey.jpg
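
As a side note (not part of the original steps), an application typically exchanges this username and API key for an authentication token via the Rackspace Identity v2.0 endpoint. A rough PowerShell sketch, with placeholder credentials, might look like this:

# Hedged sketch: exchange a Rackspace username + API key for an auth token.
# Username and API key below are placeholders.
$body = @{
    auth = @{
        'RAX-KSKEY:apiKeyCredentials' = @{
            username = 'apiuser1'
            apiKey   = '0123456789abcdef0123456789abcdef'
        }
    }
} | ConvertTo-Json -Depth 5

$response = Invoke-RestMethod -Method Post -Uri 'https://identity.api.rackspacecloud.com/v2.0/tokens' -Body $body -ContentType 'application/json'

# The token id is then sent as the X-Auth-Token header on Cloud Files requests
$response.access.token.id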

 

 

From the control panel, select Rackspace Cloud from the product list, then select Files from the Storage list.

storage-files.jpg

 

Create a new container and select the appropriate region. Keep the type as Private.

ContainerCreate.jpg

 

Manually upload some files using the console itself.

UploadFiles.jpg

 

Testing bucket (container) access

Download the Rackspace Command Line Interface (rack).
Go to the directory where you downloaded the rack binary and run the following command to configure the connection:

rack.exe configure

cmd1.jpg

Retrieves a list of containers.

rack files container list

cmd2.jpg

Lists all of the objects contained in a specified container.

rack files object list --container StorageAccess

cmd3.jpg

Uploads an object directory into a specified container

rack files object upload-dir --container StorageAccess --dir \temp\pictures

cmd4.jpg

You can also try checking cloud access using the CloudBerry backup tool.

 

Configuration of Azure Blob storage for application access

The following steps can be useful if you have an application (service) on premises that needs to access (download/upload) files in Microsoft Azure Blob storage.

Open the Azure web portal and configure a free tier account if you haven’t already.

Create a blob storage resource.

CreateResource.jpg

 

Give an appropriate name for the storage account (remember, this must be unique across Azure), select the account kind ‘Blob storage’, and select ‘New’ in the ‘Resource group’ section. If you have already created a resource group, you can select an existing one. Select ‘Pin to dashboard’ so you can access it directly from the dashboard.

StorageAccount.jpg

 

From the dashboard, click on the storage account ‘appazurestorage1’ you just created.

DashBoard.jpg

 

Create a new ‘Container’. It’s a kind of folder; in AWS and Google Cloud it’s called a bucket. Keep the access level as private.

Container.jpg

 

Click on the ‘Access keys’ tab and note down the storage account name and key.

AccessKey.jpg
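
If you prefer scripting, the same account name and key can also be used from the Azure PowerShell storage cmdlets. The following is only a minimal sketch, assuming those cmdlets are installed and using the ‘appazurestorage1’ account and ‘myappbucket’ container from this walkthrough (the key and file path are placeholders):

# Build a storage context from the account name and access key (placeholder key)
$ctx = New-AzureStorageContext -StorageAccountName 'appazurestorage1' -StorageAccountKey '<storage-account-key>'

# Upload a local file into the container, then list the blobs in it
Set-AzureStorageBlobContent -File 'C:\temp\Azure\upload\photo1.jpg' -Container 'myappbucket' -Context $ctx
Get-AzureStorageBlob -Container 'myappbucket' -Context $ctx | Format-Table Name, Length, LastModified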

 

The basic configuration of blob storage is ready. You can now upload a few files manually using the Azure portal by clicking on the container you just created.

Upload.jpg

 

Testing bucket access:-

Method 1: using CloudBerry
a. Download and install CloudBerry (freeware) for Azure on your on-premises server.
b. Connect to the Azure blob using the access key and account name.

CloudBerry.jpg

 

c. Test copy (or cut) and paste of file(s) using the CloudBerry console.

CloudBerryCopypaste.jpg

 

Method 2: using the MS AzCopy command-line tool
Download and install AzCopy on the server. Open a command prompt and switch to the AzCopy directory, which is most likely “C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy“.

Download all blobs (files) of a container to local disk.
Sample Syntax

AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer /Dest:C:\myfolder /SourceKey:key /S

Example.

AzCopy /Source:https://appazurestorage1.blob.core.windows.net/myappbucket /Dest:C:\temp\Azure /SourceKey:pQrvBr+rwoI9psWKx73SKcrE8M0JW+ZUQeIY05CJ+PJMGSFMpXV+U9Maygbtiwtc69+aPkabmZna6hxfhuw2NA== /S /sourceType:blob

 

c1.jpg

 

Upload all files to a container (bucket).
Syntax

AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:key /S

Example

AzCopy /Source:C:\temp\Azure\upload /Dest:https://appazurestorage1.blob.core.windows.net/myappbucket /DestKey:pQrvBr+rwoI9psWKx73SKcrE8M0JW+ZUQeIY05CJ+PJMGSFMpXV+U9Maygbtiwtc69+aPkabmZna6hxfhuw2NA== /S

 

c2.jpg

More details about the AzCopy tool.

Configuration of AWS S3 (Simple Storage Service) for application access.

The following steps can be useful if you have an application (service) on premises that needs to access (download/upload) files in AWS S3 storage.

Sign in to the AWS Console with the root account credentials.

loginroot.jpg

Select IAM under the ‘Security, Identity & Compliance’ section.

IAM

Add a new user for API or Console access.

CreateUser.jpg

Give an appropriate ‘User name’ and select the access type.

Please note, selecting both access types isn’t recommended for production use, as it broadens how the account can be accessed. This demo requires only ‘Programmatic access’. You can use the same user account to delegate AWS storage related tasks managed via the AWS console.

UserProperties.jpg

Select ‘Attach existing policies directly’, then search for S3 and attach ‘AmazonS3FullAccess’.

Permission.jpg

Review the settings and then click on ‘Create user’.

Note down the user name, access key ID, secret access key and sign-in URL. You can additionally download a CSV file containing all this information.

UserDetails.jpg

Select ‘S3’ from the ‘Storage’ section.

S3Page.jpg

 

Click on ‘Create Bucket’, give an appropriate name and select a ‘Region’; the bucket name must be unique across AWS. Then click on ‘Create’. I have skipped the remaining settings such as versioning, permissions and website hosting for this test. However, if you need specific settings, please refer to the relevant AWS documentation.

CreateBucket.jpg
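
As an aside, the same bucket can also be created from the AWS Tools for PowerShell instead of the console. This is only a sketch assuming the module is installed and credentials are configured; ‘appdatatest1’ is the bucket used later in this post and the region is a placeholder.

# Create the S3 bucket from PowerShell; the name must be globally unique
New-S3Bucket -BucketName appdatatest1 -Region us-east-1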

You can upload files manually using the AWS Console.

Upload.jpg

 

Testing bucket access using an on-premises application.

Method 1: using CloudBerry

Install CloudBerry (freeware) and connect to the AWS S3 bucket.

CloudBerryTest.jpg

You can copy (or cut) and paste files from the local machine to S3, or vice versa.

CloudBerry-copypaste.jpg

 

Method 2: using PowerShell

Install the AWS Tools for PowerShell, open PowerShell, and use the following commands to test bucket access.

Set credentials.

Set-AWSCredentials -AccessKey AKIAI3ZDRI4HGSD4NOGQ -SecretKey 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

Get all buckets.

Get-S3Bucket

Ps1.jpg

Show all contents of the specified bucket.

Get-S3Object -BucketName appdatatest1 -MaxKey 100 |Format-Table

PS2.jpg
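
Upload and download can be tested the same way; the local file paths below are placeholders.

# Upload a single local file to the bucket
Write-S3Object -BucketName appdatatest1 -File C:\temp\report.txt -Key report.txt

# Download it back to a local folder
Read-S3Object -BucketName appdatatest1 -Key report.txt -File C:\temp\download\report.txt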

Refer to the AWS Tools for PowerShell documentation for more details on the available commands.

Enable multi-factor authentication for your AWS account

If you are an AWS beginner like me and are wondering how to secure the account against unauthorized access, enable multi-factor authentication using a smartphone app code.

Sign in to the AWS console with the root account.

Signin

 

Select IAM (Identity & Access Management) under the ‘Security, Identity & Compliance’ section.

IAM.jpg

 

Under Dashboard, select ‘Activate MFA on your root account’

DashBoard.jpg

 

Click on ‘Manage MFA device’, then select ‘A virtual MFA device’.

ManageMFA.jpg

Before clicking on ‘Next Step’, install the ‘Google Authenticator’ app on your smartphone. Following is the list of apps supported by AWS.

  • Android: Google Authenticator; Authy 2-Factor Authentication
  • iPhone: Google Authenticator; Authy 2-Factor Authentication
  • Windows Phone: Authenticator
  • Blackberry: Google Authenticator

 

Click on ‘Next Step’, then scan the barcode with the ‘Google Authenticator’ mobile app.

Barcode1    Barcode2

 

Once the barcode is scanned successfully on the phone, enter two consecutive authentication codes (the app generates a new code at a regular interval), click on ‘Activate virtual MFA’ and then click on Finish.

activate MFA.jpg
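
As a side note, for regular IAM users (not the root account) the equivalent steps can also be scripted with the AWS Tools for PowerShell. The sketch below is only illustrative; the user name, device name, file path and codes are placeholders.

# Create a virtual MFA device and save its QR code so it can be scanned with Google Authenticator
$mfa = New-IAMVirtualMFADevice -VirtualMFADeviceName 'user1-mfa'
$png = [System.IO.File]::Create('C:\temp\user1-mfa-qr.png')
$mfa.QRCodePNG.CopyTo($png)
$png.Close()

# After scanning, bind the device by supplying two consecutive codes from the app (placeholders)
Enable-IAMMFADevice -UserName 'user1' -SerialNumber $mfa.SerialNumber -AuthenticationCode1 '123456' -AuthenticationCode2 '654321'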

 

Refresh the AWS page and you will see that MFA is activated.

AfterEanble.jpg

 

If you sign out and then sign in again to the AWS console, you will need to supply the username, password and the authentication code generated by the mobile app.

login2.jpg

Refer to the following links for more information.

Multi-Factor Authentication

Using Multi-Factor Authentication (MFA) in AWS

Retrieve Active Directory object properties using VBScript.

In many circumstances, you may need to isolate application and Active Directory issues that pop up because of a bad network or environment configuration. In one recent challenge, to isolate application performance issues with the network while retrieving AD objects, I used the following VBScript rather than a custom application.

Create a text file, e.g. RetrieveProxy.vbs, in your desired location (mine is c:\tools). Paste the following code into Notepad.

This will retrieve the proxy addresses of ‘user1’ from Active Directory.

StartTime=Now
REM wscript.echo Starttime
Set objUser=GetObject("LDAP://192.168.2.100/CN=user1,OU=Myusers,DC=domain,DC=local")
ProxyAddress=objUser.proxyAddresses
EndTime=Now
TimeTaken=DateDiff("s",StartTime,EndTime)
wscript.echo("Proxy Address: "&ProxyAddress&", Bind took "&TimeTaken&" Seconds, From "&StartTime&" To "&EndTime)

You can run it either from the command line:

commandout.jpg

Or by simply double-clicking on “RetrieveProxy.vbs”.

window1out.jpg

In the above script:

  • You can modify the property name you wish to retrieve from AD. I have used “proxyAddresses”; you can use “whenCreated”, “whenChanged” or any other available object property.
  • You can change the IP or FQDN of the domain controller. It can also work without an IP/FQDN, e.g. “LDAP://CN=user1,OU=Myusers,DC=domain,DC=local”; in that case it will connect to the nearest available domain controller.
  • Adding the ‘StartTime’ and ‘EndTime’ variables is completely optional. I have used them so I can see the time taken during retrieval and compare it with other applications doing similar work. If it takes longer than expected, then some network resource is likely at fault somewhere.

Following is a modified version that runs against all objects in an OU (organizational unit).

StartTime=Now
REM wscript.echo Starttime
Set objUsers=GetObject("LDAP://192.168.2.100/OU=MyUsers,DC=domain,DC=local")
objUsers.Filter = Array("User")
Dim AllUsersProxy
For Each obj In objUsers
   AllUsersProxy=AllUsersProxy&obj.cn&"  "&obj.proxyAddresses&vbnewline
Next
EndTime=Now
TimeTaken=DateDiff("s",StartTime,EndTime)
wscript.echo("Proxy Addresses: "&vbnewline&AllUsersProxy&"Bind took "&TimeTaken&" Seconds, From " &StartTime&" To "&EndTime)

Please note, this script retrieves two properties (cn and proxyAddresses) and the output has a new line for each object.

commandout3

windowout2.jpg

 

Analyze IIS log using PowerShell

In my last blog, you learned how IIS (or similar) logs can be uploaded to a SQL DB. Once the logs are uploaded, you can analyze them using PowerShell and create fancy reports as well. Let’s see how this can be done.

The following script gives the number of requests per IIS page along with the response code.

$Query1="
Use IISLogReview
     SELECT TOP 20 [sc-STATUS] Response, [cs-uri-stem] Access_Page,  count(*) Total_Request from IISLOG
     GROUP BY [sc-STATUS],  [cs-uri-stem]
     ORDER BY COUNT(*) DESC"
$SqlOut1=Invoke-Sqlcmd -ServerInstance SQL1 -Query $Query1 -QueryTimeout 65535
$PrintOut1=$SqlOut1|Select-Object Response,Access_Page,Total_Request|Format-Table -AutoSize
$PrintOut1

iis1.jpg

The following script gives the number of requests per user.

$Query2="
Use IISLogReview
    SELECT COUNT(*) Total_Request, [s-username] UserName  FROM IISLOG
    GROUP BY [s-username]
    ORDER BY Total_Request DESC"
$SqlOut2=Invoke-Sqlcmd -ServerInstance SQL1 -Query $Query2 -QueryTimeout 65535
$PrintOut2=$SqlOut2|Select-Object Total_Request,UserName |Format-Table -AutoSize
$PrintOut2

iis2.jpg

This is the complete tool/function; however, this time it will generate HTML output.

Function AnalyseIIS
{
    PARAM
    (
    #Default SQL server will be the local host
    [string]$SQLServer=$env:computername,
    #Default HTML file will be C:\temp\yyyyMMddhhmm.html
    [string]$HTMLFile='C:\TEMP\'+(Get-Date).tostring("yyyyMMddhhmm")+'.html'
    )

    $Query1=
        "Use IISLogReview
        SELECT TOP 20 [sc-STATUS] Response, [cs-uri-stem] Access_Page,  count(*) Total_Request from IISLOG
        GROUP BY [sc-STATUS],  [cs-uri-stem]
        ORDER BY COUNT(*) DESC"

    $Query2=
        "Use IISLogReview
        SELECT COUNT(*) Total_Request, [s-username] UserName  FROM IISLOG
        GROUP BY [s-username]
        ORDER BY Total_Request DESC"

    $a = "<style>"
    $a = $a + "BODY{background-color:peachpuff;font-family: Calibri; font-size: 12pt;}"
    $a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
    $a = $a + "TH{border-width: 1px;padding: 0px;border-style: solid;border-color: black;}"
    $a = $a + "TD{border-width: 1px;padding: 0px;border-style: solid;border-color: black;}"
    $a = $a + "</style>"

    $SqlOut1=Invoke-Sqlcmd -ServerInstance $SQLServer -Query $Query1 -QueryTimeout 65535
    $PrintOut1=$SqlOut1|Select-Object Response,Access_Page,Total_Request| ConvertTo-HTML -PreContent '<h2>Statistics Based on IIS Response, Page and Count</h2>' -head $a |Out-String

    $SqlOut2=Invoke-Sqlcmd -ServerInstance $SQLServer -Query $Query2 -QueryTimeout 65535
    $PrintOut2=$SqlOut2|Select-Object Total_Request,UserName | ConvertTo-HTML -PreContent '<h2>Statistics Based on UserName and Count</h2>' -head $a |Out-String

    ConvertTo-Html -Title "IIS Log Analysis" -PostContent $PrintOut1,$PrintOut2 |Out-File $HTMLFile
    Invoke-Item $HTMLFile #open html after processing
}

AnalyseIIS
#OR Run with Parameter, Eg.
#AnalyseIIS -SQLServer SQL1 -HTMLFile C:\TEMP\IISReview.html 

iis3.jpg

You can have different SELECT queries based on the analysis you need, and I believe my last two blogs can give you direction on how a large set of text files can be uploaded to SQL and then analyzed using PowerShell. For example, the short query below breaks requests down by client IP.
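
This assumes your IISLOG table also captured the standard c-ip field (not used in the earlier queries):

$Query3="
Use IISLogReview
    SELECT TOP 10 [c-ip] Client_IP, COUNT(*) Total_Request FROM IISLOG
    GROUP BY [c-ip]
    ORDER BY Total_Request DESC"
$SqlOut3=Invoke-Sqlcmd -ServerInstance SQL1 -Query $Query3 -QueryTimeout 65535
$SqlOut3|Select-Object Client_IP,Total_Request|Format-Table -AutoSize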

 
