Analyze Enterprise Vault Dtraces using a PowerShell-SQL tool

This is written for Enterprise Vault engineers; however, with slight modification this tool can help people working on other technologies as well. The most difficult work for a support engineer is analyzing a performance issue where thousands of traces have been collected over time. In the past, I developed a tool (a combination of PowerShell & SQL scripts) that can upload Enterprise Vault dtraces to a SQL server and then assist with the most common use cases, such as finding the delay in a specific function, searching for a given keyword OR extracting an entire thread.

You can either download the executable file OR use the PowerShell script on a SQL 2016 server. Download link

Prerequisites

  • Microsoft SQL 2016 installed on Windows 2012 R2.
  • The tool works with Windows authentication, so the logged-in user must have the dbcreator role OR equivalent permission on the SQL instance (see the quick check after this list).
  • The tool can be executed on any server where the SQL Server binaries are installed. For example, you can run it on SQL1 and connect to SQL2; both servers should have SQL installed.
  • Copy the latest dtrace_analyser exe file to the SQL server.
  • Upload & Analyse are two different operations. For example, you can upload a dtrace today and analyse it tomorrow.
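
To confirm the logged-in account has the required role, a quick check like the one below can help. This is a minimal sketch; it assumes the SqlServer PowerShell module is installed, and SQL1 is an example instance name.

# Returns 1 if the current Windows account holds the dbcreator role.
Invoke-Sqlcmd -ServerInstance 'SQL1' -Query "SELECT IS_SRVROLEMEMBER('dbcreator') AS IsDbCreator"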

Upload Dtrace

Double-click on dtrace_analyser.exe; a Windows form and a console (command prompt) will open. The Windows form enables the user to upload/analyze the data, while the console shows the status of the operation executing in the background. During upload/analyze, the form will freeze and all options will be disabled until execution completes.

Just place all the dtraces in a single directory, then supply the name of that directory.

First.jpg

Click on Upload Traces, then enter the dtrace folder location OR click on Browse to select the location.

If the database (DtraceReview) and table (DtraceContent) already exist in SQL, the tool will skip creating them, but it will truncate the existing set of uploaded dtraces for consistency and accuracy during analysis.

Upload speed depends on resources: a single 100 MB dtrace file takes approximately 10-15 seconds to upload when the SQL server has 12 GB RAM & 8 CPUs.

Analyze Traces

Click on the Find Delay option. By default, the output HTML file will be located at C:\temp\dtrace-YYYYMMDDHHMMSS.html; you can change the location by clicking on Browse or by typing the location in the text bar.

second.jpg

This tool automatically reviews each thread and prepares an HTML report. If the number of lines in a thread is huge, the form may go into a Not Responding state; leave it running (I will fix this in the next version). The actual progress can be seen in the console window.

Third.jpg

Once processing finishes, the HTML file will open automatically. The delay in seconds can be seen in the last column. Only threads where the delay is more than 2 seconds will be visible in the HTML output.

Fourth.jpg

Please note, this tool isn't intelligent enough to understand the lines written by dtrace logging. A few functions are expected to show delay, such as the one below. Engineers should understand this limitation because some functions complete only when their sub-functions complete.

Fifth.jpg
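
For the curious, the delay detection can be approximated with a window function over the uploaded rows. This is only a sketch of the idea, not the tool's actual query; the column names (ThreadId, LineTime, LineText) are assumptions for illustration.

# Flag any line that arrived more than 2 seconds after the previous line
# in the same thread (mirroring the 2-second threshold mentioned above).
Invoke-Sqlcmd -ServerInstance 'SQL1' -Database 'DtraceReview' -Query @"
WITH Timed AS (
    SELECT ThreadId, LineTime, LineText,
           DATEDIFF(SECOND,
                    LAG(LineTime) OVER (PARTITION BY ThreadId ORDER BY LineTime),
                    LineTime) AS DelaySeconds
    FROM dbo.DtraceContent
)
SELECT * FROM Timed WHERE DelaySeconds > 2;
"@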

All process IDs, process names and thread IDs can be seen in the HTML report.

Six.jpg

The last section of the HTML report shows events captured across all dtrace files.

Seven.jpg

Use the Search Keyword option to search for a specific function, exception OR line. By default, the output will be located at C:\temp\search_output_YYYYMMDDHHMMSS.txt. You can change the location and file name by clicking on Browse OR by typing the location manually.

Eight.jpg

Use the Extract Thread option to save all lines of a specific thread to a text file. By default, the output will be saved to a text file located at C:\temp\Dtrace_thread_ThreadID.txt. You can change the location and file name by clicking on Browse OR by typing the location manually.

Ninth.jpg
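
Both options boil down to simple queries against the uploaded table. Here is a rough equivalent; again, the LineText and ThreadId column names and the thread ID 4092 are assumptions for illustration.

# Search a keyword across all uploaded dtraces.
Invoke-Sqlcmd -ServerInstance 'SQL1' -Database 'DtraceReview' `
    -Query "SELECT * FROM dbo.DtraceContent WHERE LineText LIKE '%Exception%'"

# Extract every line of one thread to a text file.
Invoke-Sqlcmd -ServerInstance 'SQL1' -Database 'DtraceReview' `
    -Query "SELECT LineText FROM dbo.DtraceContent WHERE ThreadId = 4092" |
    Select-Object -ExpandProperty LineText |
    Out-File C:\temp\Dtrace_thread_4092.txt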

Figure out who is deleting files from the Windows operating system

Files OR folders might go missing from a Windows operating system (client/server) for many different reasons. A user may log on to the system interactively OR remotely and delete the file, or a malicious process may delete it. If you are unsure who is deleting files/folders, Windows auditing is the best way to figure this out.

Follow this sequence to understand the concepts.

Enable Windows auditing from Local Security Policy (Run – secpol.msc). If you are doing this against multiple servers, edit the group policies from a domain controller.

SCpolicy1

You can use the following PowerShell to automate this step.

#Export the current local security policy, enable success & failure
#auditing for Object Access (value 3), then re-apply the policy.
secedit /export /cfg c:\secpol.cfg
(gc C:\secpol.cfg).replace("AuditObjectAccess = 0", "AuditObjectAccess = 3") | Out-File C:\secpol.cfg
secedit /configure /db c:\windows\security\local.sdb /cfg c:\secpol.cfg /areas SECURITYPOLICY
rm -force c:\secpol.cfg -confirm:$false

Update group policy using the following command.

gpupdate /force

Select the folder that needs to be audited. In my example, I am enabling auditing for the Delete action on the c:\temp\temp folder.

SCpolicy2.jpg

You can use the PowerShell below.

#Uncomment if the folder you intend to audit doesn't exist yet.
#New-Item -Type Directory -Path C:\temp\temp
$Folder = "c:\temp\temp"
$ACL = Get-Acl $Folder
#Audit the Delete action for Everyone; "3" applies both Success and Failure flags.
$ar1 = New-Object System.Security.AccessControl.FileSystemAuditRule("Everyone","Delete","3")
$ACL.SetAuditRule($ar1)
Set-Acl $Folder $ACL
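
To verify the audit rule landed on the folder, you can read the SACL back; this requires an elevated prompt.

# Read back the audit (SACL) entries on the folder.
(Get-Acl -Path 'C:\temp\temp' -Audit).Audit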

Now if anyone (a user or a process) deletes your file, an event will be generated in Event Viewer. For example, I am deleting File1.txt using Windows Explorer (right click > Delete) and the second file, File2.txt, using PowerShell.

Remove-Item -Path C:\Temp\TEMP\File2.txt -Force -Confirm:$false

Open Event Viewer and search the Security log for Event ID 4656 with the “File System” task category and the “Accesses: DELETE” string. “Subject: XXXX” will show you who deleted the file.

Log Name:      Security
Source:        Microsoft-Windows-Security-Auditing
Date:          MM/DD/YYYY HH:MM:SS
Event ID:      4656
Task Category: File System
Level:         Information
Keywords:      Audit Success
User:          N/A
Computer:      server.domain.local
Description:
A handle to an object was requested.

Subject:
    Security ID:    domain\user1
    Account Name:   user1
    Account Domain: domain
    Logon ID:       0x98B5C

Object:
    Object Server:       Security
    Object Type:         File
    Object Name:         C:\Temp\Temp\File2.txt
    Handle ID:           0x774
    Resource Attributes: -

Process Information:
    Process ID:   0x4c4c
    Process Name: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Access Request Information:
    Transaction ID: {00000000-0000-0000-0000-000000000000}
    Accesses:       DELETE
                    ReadAttributes
    Access Reasons: DELETE: Granted by D:(A;ID;FA;;;BA)
                    ReadAttributes: Granted by D:(A;ID;FA;;;BA)
    Access Mask:    0x10080
    Privileges Used for Access Check: -
    Restricted SID Count: 0

Or you can use the following basic PowerShell to query the Security event log.

Get-EventLog -LogName Security -InstanceId 4656
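
To narrow the output down to actual delete requests and show who issued them, a small pipeline like the one below can help. It is a sketch that assumes the default 4656 message layout; run it from an elevated prompt.

# List time, account and object for every handle request that included
# DELETE access.
Get-EventLog -LogName Security -InstanceId 4656 |
    Where-Object { $_.Message -match 'Accesses:\s+DELETE' } |
    ForEach-Object {
        $who  = if ($_.Message -match 'Account Name:\s+(\S+)') { $Matches[1] }
        $what = if ($_.Message -match 'Object Name:\s+(.+)')   { $Matches[1].Trim() }
        '{0}  {1}  {2}' -f $_.TimeGenerated, $who, $what
    }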

Power2-OUT.jpg


Windows EC2 deployment using CloudFormation

The following YAML template can be used to deploy a Windows EC2 instance using CloudFormation.

Parameters:
  EnvironmentType:
   Description: Environment Type
   Type: String
   AllowedValues: [development, production]
   ConstraintDescription: must be development or production

  KeyName:
   Description: Name of an existing EC2 KeyPair to RDP this windows instance.
   Type: AWS::EC2::KeyPair::KeyName
   ConstraintDescription: must be the name of an existing EC2 KeyPair.

Mappings:
 EnvironmentToInstanceType:
  development:
   instanceType: t2.micro
  production:
   instanceType: t2.small

Resources:
 ServerSecurityGroup:
  Type: AWS::EC2::SecurityGroup
  Properties:
   GroupDescription: Allow RDP & HTTP access from all IP addresses
   SecurityGroupIngress:
    -   IpProtocol: tcp
        FromPort: 80
        ToPort: 80
        CidrIp: 0.0.0.0/0
    -   IpProtocol: tcp
        FromPort: 3389
        ToPort: 3389
        CidrIp: 0.0.0.0/0

 WindowsInstance:
  Type: AWS::EC2::Instance
  Properties:
   InstanceType: !FindInMap [EnvironmentToInstanceType, !Ref 'EnvironmentType', instanceType]
   #Choose the correct ImageId; ami-da003ebf belongs to a base Windows 2012 R2 image.
   ImageId: ami-da003ebf
   KeyName: !Ref KeyName
   SecurityGroups:
    - !Ref ServerSecurityGroup
    


Here are the steps.

  1. Save the above code in a WinEC2.YML file.
  2. Open the AWS management console. In the CloudFormation section, select New Template, then select Upload a template to Amazon S3. Select the WinEC2.YML file, then follow the wizard with all default options. You will be prompted for the Environment Type (Production/Development) & Key Pair.

EC2.jpg

  3. Once the deployment completes successfully, you will see events like the screenshot below.

EC2_Success.jpg
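
If you prefer the command line over the console wizard, roughly the same deployment can be driven with the AWS CLI. This is a sketch; it assumes the AWS CLI is installed and configured, and the stack name and key pair name are examples.

# Create the stack from the template saved above.
aws cloudformation create-stack `
    --stack-name WinEC2Demo `
    --template-body file://WinEC2.YML `
    --parameters ParameterKey=EnvironmentType,ParameterValue=development `
                 ParameterKey=KeyName,ParameterValue=my-keypair

# Block until stack creation finishes (or fails).
aws cloudformation wait stack-create-complete --stack-name WinEC2Demo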

If you wish to join the newly created Windows EC2 instance to Active Directory, use the following reference for the YAML code: https://aws.amazon.com/blogs/security/how-to-configure-your-ec2-instances-to-automatically-join-a-microsoft-active-directory-domain/


MS SQL deployment using CloudFormation in AWS

Here is the code snippet for an MS SQL (RDS) deployment using YAML code in AWS. If you wish to make it AD integrated, review the details given in the code comments.

AWSTemplateFormatVersion: '2010-09-09'
Description: Creates an empty SQL Server RDS database as an example for automated deployments.
Parameters:
  SqlServerInstanceName:
    NoEcho: 'false'
    Description: RDS SQL Server Instance Name
    Type: String
    Default: MyAppInstance
    MinLength: '1'
    MaxLength: '63'
    AllowedPattern: "[a-zA-Z][a-zA-Z0-9]*"
  DatabaseUsername:
    AllowedPattern: "[a-zA-Z0-9]+"
    ConstraintDescription: DBAdmin
    Description: Database Admin Account User Name
    MaxLength: '16'
    MinLength: '1'
    Type: String
    Default: 'DBAdmin'
  DatabasePassword:
    AllowedPattern: "^(?=.*[0-9])(?=.*[a-zA-Z])([a-zA-Z0-9]+)"
    ConstraintDescription: Must contain only alphanumeric characters, with at least one letter and one number.
    Description: The database admin account password.
    MaxLength: '41'
    MinLength: '6'
    NoEcho: 'true'
    Type: String
    Default: Admin123
  DBEngine:
    Description: Select Database Engine
    Type: String
    AllowedValues: [Express, Enterprise]
  #Following parameter can be placed if SQL needs to be AD integrated.
  #DomainID:
  # Description: Enter the Domain ID
  # Type: String

Mappings:
 SQLTOEngineType:
  Express:
   Engine: sqlserver-ex
  Enterprise:
   Engine: sqlserver-ee

Resources:
  SQLDatabase:
    Type: AWS::RDS::DBInstance
    Properties:
      DBInstanceIdentifier:
        Ref: SqlServerInstanceName
      LicenseModel: license-included
      Engine: !FindInMap [SQLTOEngineType, !Ref 'DBEngine', Engine]
      EngineVersion: 13.00.4466.4.v1
      DBInstanceClass: db.t2.micro
      AllocatedStorage: '20'
      MasterUsername:
        Ref: DatabaseUsername
      MasterUserPassword:
        Ref: DatabasePassword
      PubliclyAccessible: 'true'
      BackupRetentionPeriod: '1'
      #If SQL RDS needs to be Active Directory integrated, uncomment the following properties.
      #Domain: !ImportValue Directory-ID
      #OR
      #!Ref DomainID
      #An IAM role is mandatory for AD integration.
      #DomainIAMRoleName: 'rds-directoryservice-access-role'
Outputs:
   SQLDatabaseEndpoint:
     Description: Database endpoint
     Value: !Sub "${SQLDatabase.Endpoint.Address}:${SQLDatabase.Endpoint.Port}"

  1. Save the above code in a SQLRDS.YML file.
  2. Open the AWS management console. In the CloudFormation section, select New Template, then select Upload a template to Amazon S3. Select the SQLRDS.YML file, then follow the wizard with all default options.
  3. Once the deployment completes successfully, you will see events like the screenshot below.

sqlrds.jpg
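
Once the stack completes, the endpoint exposed in the Outputs section can also be read from the command line. This is a sketch; it assumes the AWS CLI is configured, and SQLRDSDemo is an example stack name.

# Print the SQLDatabaseEndpoint output value of the finished stack.
aws cloudformation describe-stacks --stack-name SQLRDSDemo `
    --query "Stacks[0].Outputs[?OutputKey=='SQLDatabaseEndpoint'].OutputValue" `
    --output text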


Active Directory deployment using CloudFormation in AWS

Paste the following code into Notepad and save the file with a YML extension (eg. ActiveDirectory.yml).

AWSTemplateFormatVersion: 2010-09-09
Parameters:
 ADDomainName:
  Description: "Name the AD domain, eg. Mydomain.LOCAL"
  Type: String
 AdminPassword:
  NoEcho: true
  Description: "Type the password of default 'Admin', hint Pass@me123"
  Type: String
 MyVPC:
  Description: VPC to operate in
  Type: AWS::EC2::VPC::Id
 EditionType:
  Description: "Type of AD"
  Type: String
  Default: Enterprise
  AllowedValues:
    - Standard
    - Enterprise
 PrivateSubnet1ID:
   Description: 'ID of the private subnet 1 in Availability Zone 1 (e.g., subnet-a0246dcd)'
   Type: 'AWS::EC2::Subnet::Id'
 PrivateSubnet2ID:
   Description: 'ID of the private subnet 2 in Availability Zone 2 (e.g., subnet-a0246dcd)'
   Type: 'AWS::EC2::Subnet::Id'

Resources:
  MYDIR:
    Type: 'AWS::DirectoryService::MicrosoftAD'
    Properties:
        Name: !Ref ADDomainName
        Password: !Ref AdminPassword
        Edition: !Ref EditionType
        VpcSettings:
            SubnetIds:
                - !Ref PrivateSubnet1ID
                - !Ref PrivateSubnet2ID
            VpcId: !Ref MyVPC
Outputs:
  DomainName:
    Description: Newly created domain name
    Value: !Ref ADDomainName
    Export:
      Name: DomainName
  DirectoryID:
    Description: ID of AD that will be used in EC2 & SQL servers
    Value: !Ref MYDIR
    Export:
     Name: Directory-ID
  DNS:
    Description: IP addresses of the DNS servers.
    Value: !Join
          - ','
          - !GetAtt MYDIR.DnsIpAddresses
    Export:
     Name: DnsIpAddresses

Open the AWS console. Go to the CloudFormation service and create a New stack, then browse and select the YML file created in the step above.

SelectFile.jpg

Specify the Stack name and parameters such as the AD name, Admin password, Edition, VPC and subnets.

Parameter.jpg

AWS will prepare the resources in the background; the status will remain CREATE_IN_PROGRESS.

Working.jpg

After completion, the status will turn to CREATE_COMPLETE and the Outputs tab will show the returned values. The value in the Export Name column can be used in any future CloudFormation deployment such as Windows EC2, AWS RDS, etc.

Final.jpg
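
You can confirm that the exported values (DomainName, Directory-ID, DnsIpAddresses) are available for cross-stack references with a quick CLI call, assuming the AWS CLI is configured.

# List every value the account currently exports for Fn::ImportValue.
aws cloudformation list-exports --output table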

Here are the details of Managed AD in AWS.
AWS Managed Microsoft AD
AD DS on AWS

Since this blog is about AWS CloudFormation, I will try to improve the above code and cover a few more use cases, such as accessing the managed AD, creating AWS RDS and joining EC2 instances to the domain in AWS.

IBM Cloud Object Storage (COS) configuration for API access

The following steps can be useful if you have an application (service) on premises that needs to access (download/upload) files in IBM Cloud Object Storage (COS).

Open the IBM COS URL https://console.bluemix.net/catalog/
‘Sign up’ for free access, then ‘Login’. Select Object Storage from the Storage section.

SIGNIN.jpg

Give an appropriate Service name, then click on Create.

ServiceName.jpg

Once the service is ready, either click on Create bucket OR create your first bucket as highlighted below.

CreateBucket.jpg

Give an appropriate Name to the bucket, select the appropriate options for Resiliency, Location and Storage class (Standard, Vault, Cold Vault, Flex), then click on Create.

bucketproperty.jpg

Select Endpoints and copy the endpoint name; it will be used by the API for access.

Endpoint.jpg

From the Service Credentials page, click on ‘New credential’ and give the credential an appropriate name. If you have already created a service ID, click Select Service ID; if not, use Create New Service ID. If you are doing this for the first time, I would create a new service ID (don’t click on Add as of now).

ServiceCredential.jpg

Give an appropriate New Service ID Name, and in the Add Inline… option paste {"HMAC":true}; this will generate an Access Key ID & Secret Access Key for API access. Now click on Add.

NewCredential.jpg

Now click on View Credential and note down the access key & secret key.

viewCredential.jpg

Use the endpoint, access key and secret key to connect to the bucket using CyberDuck, CloudBerry, S3Browser or any other equivalent tool.

CloudDuck.jpg

You can create buckets and folders, and upload & download objects using these tools to confirm the storage configuration.

Upload.jpg
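
If you would rather verify the configuration from code than from a GUI tool, something like the following works against the S3-compatible API. This is a sketch: it assumes the AWS Tools for PowerShell module is installed, the placeholder keys stand in for the credentials noted above, and the endpoint shown is only an example (use the one you copied from the Endpoints page).

# IBM COS speaks the S3 protocol, so generic S3 tooling can list buckets.
Import-Module AWSPowerShell
Set-AWSCredential -AccessKey 'YOUR-ACCESS-KEY-ID' -SecretKey 'YOUR-SECRET-ACCESS-KEY'
Get-S3Bucket -EndpointUrl 'https://s3-api.us-geo.objectstorage.softlayer.net'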

Configuring Dell EMC Elastic Cloud Storage (ECS) for API access

This document is designed for test environment use only; actual steps for production use might differ. Register for an ECS Test Drive account and complete the registration process. Once you have registered for your ECS Test Drive account, log in and click the credentials link at the top left of the page.

Register

Once you receive the credentials, try one of the following tools to create an S3 bucket.

Method 1: Using CyberDuck
Download and install CyberDuck.
Click on Open Connection. In the dialog window, choose S3 (Amazon Simple Storage Service) from the first drop-down list.

connect.jpg

As per the ECS credentials page:

Server name = object.ecstestdrive.com
Access Key ID = 131693042649205091@ecstestdrive.emc.com
Password = fbHIum2QY3A5xSr7Vlx63S+USGw3O1ULsHS9jmom

Then click on Connect

openconnection.jpg

Click on a blank area, click on ‘New Folder’, and give a name to the bucket (eg. ‘storage-ecs-part1’). This name should be lowercase and unique.

bucket create.jpg

Method 2: Using CloudBerry
Install CloudBerry (freeware) on your storage OR test server.
Select File > New S3 compatible account > S3 compatible.

cloudberry-connect.jpg

The display name can be anything you wish; supply the service point, access key and secret key the same as supplied in Method 1. The test connection must be successful.

CreateBucket_cloudberry.jpg

Create a new bucket.

cloudberry-createbucket.jpg

The following two methods can also be used to test bucket access:

  • Another GUI tool called S3Browser.
  • EMC ECS CLI (it requires registration with EMC).
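
A third option is the same S3-compatible approach shown in the IBM COS section above, pointed at the ECS Test Drive endpoint. This is a sketch; the credentials are the ones from the ECS credentials page, and it assumes the AWS Tools for PowerShell module is installed.

# List buckets on ECS Test Drive via the S3-compatible API.
Import-Module AWSPowerShell
Set-AWSCredential -AccessKey '131693042649205091@ecstestdrive.emc.com' -SecretKey 'fbHIum2QY3A5xSr7Vlx63S+USGw3O1ULsHS9jmom'
Get-S3Bucket -EndpointUrl 'https://object.ecstestdrive.com'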