Analyze Enterprise Vault Dtraces using a PowerShell-SQL tool.

This is written for Enterprise Vault engineers, but with slight modification the tool can help engineers working on other technologies as well. One of the most difficult tasks for a support engineer is analyzing a performance issue where thousands of traces have been collected over time. Some time ago I developed a tool (a combination of PowerShell and SQL scripts) that uploads Enterprise Vault dtraces to a SQL Server and then assists with the most common use cases, such as finding a delay in a specific function, searching for a given keyword, or extracting an entire thread.

You can either download the executable file or use the PowerShell script on a SQL 2016 server. Download link

Prerequisite

  • Microsoft SQL Server 2016 installed on Windows Server 2012 R2.
  • The tool uses Windows authentication, so the logged-in user must have the dbcreator role or equivalent permission on the SQL instance (see the permission check sketch after this list).
  • The tool must be executed on a server where the SQL Server binaries are installed. For example, you can run it on SQL1 and connect to SQL2; both servers must have SQL Server installed.
  • Copy the latest dtrace_analyser exe file to the SQL server.
  • Upload and Analyse are separate operations. For example, you can upload a dtrace today and analyse it tomorrow.
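
To verify the permission prerequisite before the first upload, you can check dbcreator role membership over Windows authentication; a minimal sketch, assuming an instance named SQL1:

# Returns 1 when the logged-in Windows user is a member of the dbcreator server role
Invoke-Sqlcmd -ServerInstance SQL1 -Query "SELECT IS_SRVROLEMEMBER('dbcreator') AS IsDbCreator"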

Upload D-trace

Double-click dtrace_analyser.exe; a Windows form and a console (command prompt) will open. The Windows form lets you upload and analyze the data, while the console shows the status of the operation executing in the background. During upload/analyze, the form will freeze and all options will be disabled until execution completes.

Just place all the dtraces in a single directory, then supply the name of that directory.

First.jpg

Click Upload Traces, then paste the dtrace folder location or click Browse to select the location.

If the database (DtraceReview) and table (DtraceContent) already exist in SQL, the tool skips their creation but truncates the existing set of uploaded dtraces, for consistency and accuracy during analysis.
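
For reference, this idempotent create-then-truncate behavior likely boils down to something like the sketch below. The DtraceReview/DtraceContent names come from the tool itself, but the column list here is purely an assumption for illustration:

# Hypothetical sketch: create the database and table only if missing, then truncate
$PrepareDtraceDB = "
IF NOT EXISTS (SELECT name FROM sys.databases WHERE name = 'DtraceReview')
    EXECUTE('CREATE DATABASE DtraceReview');
"
Invoke-Sqlcmd -ServerInstance SQL1 -Query $PrepareDtraceDB
$PrepareDtraceTable = "
USE DtraceReview
IF OBJECT_ID('dbo.DtraceContent') IS NULL
    CREATE TABLE dbo.DtraceContent (
        [ProcessId]   INT          NULL,
        [ProcessName] VARCHAR(64)  NULL,
        [ThreadId]    INT          NULL,
        [LineTime]    DATETIME2    NULL,
        [LineText]    VARCHAR(MAX) NULL
    );
TRUNCATE TABLE dbo.DtraceContent;
"
Invoke-Sqlcmd -ServerInstance SQL1 -Query $PrepareDtraceTable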

Upload speed depends on resources: a single 100 MB dtrace file takes approximately 10-15 seconds to upload when the SQL server has 12 GB RAM and 8 CPUs.

Analyze Traces

Click the Find Delay option. By default, the output HTML file will be C:\temp\dtrace-YYYYMMDDMMSS.html; you can change the location by clicking Browse or by typing the location in the text bar.

second.jpg

The tool automatically reviews each thread and prepares an HTML report. If the number of lines in a thread is huge, the form may go into Not Responding mode; leave it running (this will be fixed in the next version). The actual progress can be seen in the console window.

Third.jpg

Once processing finishes, the HTML file opens automatically. The delay in seconds is shown in the last column. Only threads with a delay of more than 2 seconds appear in the HTML output.

Fourth.jpg
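
Under the hood, a per-thread gap like this can be computed with the LAG window function (available in SQL Server 2012 and later). A hedged sketch against the hypothetical DtraceContent schema assumed earlier, not the tool's actual query:

# Gap in seconds between consecutive lines on the same thread; keep only gaps over 2 seconds
$DelayQuery = "
USE DtraceReview
SELECT ThreadId, LineTime, LineText,
       DATEDIFF(SECOND,
                LAG(LineTime) OVER (PARTITION BY ThreadId ORDER BY LineTime),
                LineTime) AS DelaySeconds
FROM dbo.DtraceContent"
Invoke-Sqlcmd -ServerInstance SQL1 -Query $DelayQuery -QueryTimeout 65535 |
    Where-Object { $_.DelaySeconds -gt 2 }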

Please note, this tool isn't intelligent enough to understand the meaning of the dtrace log lines. A few functions are expected to show delay, such as the one below. Engineers should understand this limitation, because some functions only complete when their sub-functions complete.

Fifth.jpg

All process IDs, process names and thread IDs can be seen in the HTML report.

Six.jpg

The last section of the HTML report shows events captured across all dtrace files.

Seven.jpg

Use the Search Keyword option to search for a specific function, exception or line. By default, the output will be located in C:\temp\search_output_YYYYMMDDHHMMSS.txt. You can change the location and file name by clicking Browse or by typing the location manually.

Eight.jpg

Use the Extract Thread option to download all lines of a specific thread to a text file. By default, the output is saved to a text file located at C:\temp\Dtrace_thread_ThreadID.txt. You can change the location and file name by clicking Browse or by typing the location manually.

Ninth.jpg
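
Both options amount to simple filters over the uploaded table, so they are easy to reproduce ad hoc; a sketch using the same assumed schema (the keyword and thread ID below are placeholders):

# Keyword search: dump matching lines to a text file
Invoke-Sqlcmd -ServerInstance SQL1 -Query "
    USE DtraceReview
    SELECT LineText FROM dbo.DtraceContent WHERE LineText LIKE '%Exception%'" |
    Select-Object -ExpandProperty LineText | Out-File C:\temp\search_output.txt

# Thread extraction: every line of one thread, in time order
Invoke-Sqlcmd -ServerInstance SQL1 -Query "
    USE DtraceReview
    SELECT LineText FROM dbo.DtraceContent WHERE ThreadId = 4242 ORDER BY LineTime" |
    Select-Object -ExpandProperty LineText | Out-File C:\temp\Dtrace_thread_4242.txt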

MS SQL deployment using CloudFormation in AWS.

Here is a code snippet for MS SQL deployment using YAML in AWS CloudFormation. If you wish to make it AD integrated, review the details given in the comments.

AWSTemplateFormatVersion: '2010-09-09'
Description: Creates an empty SQL Server RDS database as an example for automated deployments.
Parameters:
  SqlServerInstanceName:
    NoEcho: 'false'
    Description: RDS SQL Server Instance Name
    Type: String
    Default: MyAppInstance
    MinLength: '1'
    MaxLength: '63'
    AllowedPattern: "[a-zA-Z][a-zA-Z0-9]*"
  DatabaseUsername:
    AllowedPattern: "[a-zA-Z0-9]+"
    ConstraintDescription: Must contain only alphanumeric characters.
    Description: Database Admin Account User Name
    MaxLength: '16'
    MinLength: '1'
    Type: String
    Default: 'DBAdmin'
  DatabasePassword:
    AllowedPattern: "^(?=.*[0-9])(?=.*[a-zA-Z])([a-zA-Z0-9]+)"
    ConstraintDescription: Must contain only alphanumeric characters, with at least one letter and one number.
    Description: The database admin account password.
    MaxLength: '41'
    MinLength: '6'
    NoEcho: 'true'
    Type: String
    Default: Admin123
  DBEngine:
    Description: Select Database Engine
    Type: String
    AllowedValues: [Express, Enterprise]
  #The following parameter can be added if SQL needs to be AD integrated.
  #DomainID:
  # Description: Enter the Domain ID
  # Type: String

Mappings:
 SQLTOEngineType:
  Express:
   Engine: sqlserver-ex
  Enterprise:
   Engine: sqlserver-ee

Resources:
  SQLDatabase:
    Type: AWS::RDS::DBInstance
    Properties:
      DBInstanceIdentifier:
        Ref: SqlServerInstanceName
      LicenseModel: license-included
      Engine: !FindInMap [SQLTOEngineType, !Ref 'DBEngine', Engine]
      EngineVersion: 13.00.4466.4.v1
      DBInstanceClass: db.t2.micro
      AllocatedStorage: '20'
      MasterUsername:
        Ref: DatabaseUsername
      MasterUserPassword:
        Ref: DatabasePassword
      PubliclyAccessible: 'true'
      BackupRetentionPeriod: '1'
      #If SQL RDS needs to be Active Directory integrated, uncomment the following properties.
      #Domain: !ImportValue Directory-ID
      #OR
      #Domain: !Ref DomainID
      #An IAM role is mandatory for AD integration.
      #DomainIAMRoleName: 'rds-directoryservice-access-role'
Outputs:
   SQLDatabaseEndpoint:
     Description: Database endpoint
     Value: !Sub "${SQLDatabase.Endpoint.Address}:${SQLDatabase.Endpoint.Port}"
  1. Save the above code in a SQLRDS.YML file.
  2. Open the AWS Management Console. In the CloudFormation section, select New Template, then select Upload a template to Amazon S3. Select the SQLRDS.YML file and follow the wizard with all default options.
  3. Once the deployment completes successfully, you will see events like the screenshot below.

sqlrds.jpg
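
As an alternative to the console wizard, the same template can be deployed from PowerShell; a sketch assuming the AWS Tools for PowerShell module is installed and credentials are already configured:

# Create the stack straight from the local template file
New-CFNStack -StackName "sqlrds-demo" `
    -TemplateBody (Get-Content .\SQLRDS.YML -Raw) `
    -Parameter @( @{ ParameterKey = 'DBEngine'; ParameterValue = 'Express' } )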


Analyze IIS log using PowerShell

In my last blog, you learned how IIS (or similar) logs can be uploaded to a SQL database. Once the logs are uploaded, you can analyze them using PowerShell and create fancy reports as well. Let's see how this can be done.

The following script gives the number of requests per IIS page along with the response code.

$Query1="
Use IISLogReview
     SELECT TOP 20 [sc-STATUS] Response, [cs-uri-stem] Access_Page,  count(*) Total_Request from IISLOG
     GROUP BY [sc-STATUS],  [cs-uri-stem]
     ORDER BY COUNT(*) DESC"
$SqlOut1=Invoke-Sqlcmd -ServerInstance SQL1 -Query $Query1 -QueryTimeout 65535
$PrintOut1=$SqlOut1|Select-Object Response,Access_Page,Total_Request|Format-Table -AutoSize
$PrintOut1

iis1.jpg

The following script gives the number of requests per user.

$Query2="
Use IISLogReview
    SELECT COUNT(*) Total_Request, [s-username] UserName  FROM IISLOG
    GROUP BY [s-username]
    ORDER BY Total_Request DESC"
$SqlOut2=Invoke-Sqlcmd -ServerInstance SQL1 -Query $Query2 -QueryTimeout 65535
$PrintOut2=$SqlOut2|Select-Object Total_Request,UserName |Format-Table -AutoSize
$PrintOut2

iis2.jpg

Here is the complete tool/function; this time it generates an HTML output.

Function AnalyseIIS
{
    PARAM
    (
    #Default SQL server is the local host
    [string]$SQLServer=$env:computername,
    #Default HTML file is C:\TEMP\yyyyMMddhhmm.html
    [string]$HTMLFile='C:\TEMP\'+(Get-Date).tostring("yyyyMMddhhmm")+'.html'
    )

    $Query1=
        "Use IISLogReview
        SELECT TOP 20 [sc-STATUS] Response, [cs-uri-stem] Access_Page,  count(*) Total_Request from IISLOG
        GROUP BY [sc-STATUS],  [cs-uri-stem]
        ORDER BY COUNT(*) DESC"

    $Query2=
        "Use IISLogReview
        SELECT COUNT(*) Total_Request, [s-username] UserName  FROM IISLOG
        GROUP BY [s-username]
        ORDER BY Total_Request DESC"

    $a = "<style>"
    $a = $a + "BODY{background-color:peachpuff;font-family: Calibri; font-size: 12pt;}"
    $a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
    $a = $a + "TH{border-width: 1px;padding: 0px;border-style: solid;border-color: black;}"
    $a = $a + "TD{border-width: 1px;padding: 0px;border-style: solid;border-color: black;}"
    $a = $a + "</style>"

    $SqlOut1=Invoke-Sqlcmd -ServerInstance $SQLServer -Query $Query1 -QueryTimeout 65535
    $PrintOut1=$SqlOut1|Select-Object Response,Access_Page,Total_Request| ConvertTo-HTML -PreContent '</pre>
<h2>Statistics Based on IIS Reponse, Page and Count</h2>
<pre>
'  -head $a |Out-String

    $SqlOut2=Invoke-Sqlcmd -ServerInstance $SQLServer -Query $Query2 -QueryTimeout 65535
    $PrintOut2=$SqlOut2|Select-Object Total_Request,UserName | ConvertTo-HTML -PreContent '</pre>
<h2>Statistics Based on UserName and Count</h2>
<pre>
'  -head $a |Out-String

    ConvertTo-Html -Title "IIS Log Analysis" -PostContent $PrintOut1,$PrintOut2 |Out-File $HTMLFile
    Invoke-Item $HTMLFile #open html after processing
}

AnalyseIIS
#OR Run with Parameter, Eg.
#AnalyseIIS -SQLServer SQL1 -HTMLFile C:\TEMP\IISReview.html 

iis3.jpg

You can use different SELECT queries based on the analysis you need, and I believe my last two blogs give you direction on how a large set of text files can be uploaded to SQL and then analyzed using PowerShell.
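
For example, here is one more query you might add, a sketch that lists the slowest pages by average time-taken, using the same IISLOG table created in the previous blog:

$Query3="
Use IISLogReview
    SELECT TOP 10 [cs-uri-stem] Access_Page, AVG([time-taken]) Avg_Time_Ms, COUNT(*) Total_Request FROM IISLOG
    GROUP BY [cs-uri-stem]
    ORDER BY Avg_Time_Ms DESC"
Invoke-Sqlcmd -ServerInstance SQL1 -Query $Query3 -QueryTimeout 65535 | Format-Table -AutoSize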

 


Upload large log files to a SQL database

Most of us will agree if I say "SQL Server is one of the most powerful engines for querying, reporting and statistics". As part of application troubleshooting, we come across many different sets of logs that can be opened in Notepad or similar tools, but it is very difficult to take stats out of them, or even open them at all, if they are large. IIS logs from a web server are a nice example. Similarly, you may have logs from other applications, such as dtraces from Enterprise Vault.

Here in my blog, I am using IIS logs, which can be uploaded to a SQL Server database using a PowerShell script.

Create the database (using a PS script). If the database already exists, the CREATE command is not executed.

    $PrepareDB= "
    USE master
    DECLARE @DBname VARCHAR(100) ='IISLogReview'
    DECLARE @DBcreate varchar(max) ='CREATE DATABASE ' + @DBNAME
    IF  NOT EXISTS (SELECT name FROM sys.databases WHERE name = @DBname)
	    BEGIN
	    EXECUTE(@DBcreate)
	    END;
    GO
    "
    Invoke-Sqlcmd -ServerInstance SQL1 -Query $PrepareDB

Create the table. Any existing table is dropped and recreated, then truncated, so it holds only the newest set of rows.

    $PrepareTable= "
    USE IISLogReview
    DROP TABLE IF EXISTS dbo.IISLOG
    CREATE TABLE dbo.IISLOG (
     [DATE] [DATE] NULL,
     [TIME] [TIME] NULL,
     [s-ip] [VARCHAR] (16) NULL,
     [cs-method] [VARCHAR] (8) NULL,
     [cs-uri-stem] [VARCHAR] (255) NULL,
     [cs-uri-query] [VARCHAR] (2048) NULL,
     [s-port] [VARCHAR] (4) NULL,
     [s-username] [VARCHAR] (16) NULL,
     [c-ip] [VARCHAR] (16) NULL,
     [cs(User-Agent)] [VARCHAR] (1024) NULL,
     [cs(Referer)] [VARCHAR] (4096) NULL,
     [sc-STATUS] [INT] NULL,
     [sc-substatus] [INT] NULL,
     [sc-win32-STATUS] [INT] NULL,
     [time-taken] [INT] NULL,
     INDEX cci CLUSTERED COLUMNSTORE
    )
    TRUNCATE TABLE [IISLogReview].dbo.IISLOG
    "
    Invoke-Sqlcmd -ServerInstance SQL1 -Query $PrepareTable

Once the database and table are prepared, you can upload multiple IIS log files placed at a given location using either 'BULK INSERT' or 'BCP'.

Bulk Insert script

    $FolderPath='F:\IIS'
    $IISFiles=Get-ChildItem -Path $FolderPath -Name
    foreach ($File in $IISFiles)
        {
        $PATH=$FolderPath+'\'+$File
        $UploadQuery="
        BULK INSERT [IISLogReview].dbo.IISLOG
        FROM '$($PATH)'
        WITH (
            FIRSTROW = 2,
            FIELDTERMINATOR = ' ',
            ROWTERMINATOR = '\n'
              )"
        Invoke-Sqlcmd -ServerInstance SQL1 -Query $UploadQuery
        }

Please note, you may see "Bulk load data conversion" errors; these can be ignored, as IIS logs contain a few '#' header/comment lines (repeated mid-file when the service restarts) that are not well formatted and do not need to be uploaded.
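
If you prefer to avoid those errors altogether, one option is to strip the '#' comment lines into a staging folder first; a hedged sketch (F:\IIS_clean is a hypothetical staging location), after which you would point BULK INSERT at the clean folder and drop the FIRSTROW option:

    # Keep only data rows; IIS comment/header lines start with '#'
    New-Item -ItemType Directory -Path 'F:\IIS_clean' -Force | Out-Null
    Get-ChildItem -Path 'F:\IIS' -File | ForEach-Object {
        Get-Content $_.FullName |
            Where-Object { $_ -notmatch '^#' } |
            Set-Content (Join-Path 'F:\IIS_clean' $_.Name)
    }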

The upload activity can also be performed using BCP, my favorite for import/export to/from SQL Server.

    $FolderPath='F:\IIS'
    $IISFiles=Get-ChildItem -Path $FolderPath -Name
    foreach ($File in $IISFiles)
        {
        $PATH=$FolderPath+'\'+$File
        $PATH
        bcp [IISLogReview].dbo.IISLOG in $PATH -c -t" " -r\n -T -S SQL1
        }

Here is the complete script; make sure to change the SQL server name and IIS folder location at the end of the script.

Function UploadIIS
{
    PARAM
    (
    [Parameter(Mandatory=$true)]
    [string]$SQLServer,
    [Parameter(Mandatory=$true)]
    [string]$IISlogFolder
    )
    $PrepareDB= "
    USE master
    DECLARE @DBname VARCHAR(100) ='IISLogReview'
    DECLARE @DBcreate varchar(max) ='CREATE DATABASE ' + @DBNAME
    IF  NOT EXISTS (SELECT name FROM sys.databases WHERE name = @DBname)
	    BEGIN
	    EXECUTE(@DBcreate)
	    END;
    GO
    "
    $PrepareTable= "
    USE IISLogReview
    DROP TABLE IF EXISTS dbo.IISLOG
    CREATE TABLE dbo.IISLOG (
     [DATE] [DATE] NULL,
     [TIME] [TIME] NULL,
     [s-ip] [VARCHAR] (16) NULL,
     [cs-method] [VARCHAR] (8) NULL,
     [cs-uri-stem] [VARCHAR] (255) NULL,
     [cs-uri-query] [VARCHAR] (2048) NULL,
     [s-port] [VARCHAR] (4) NULL,
     [s-username] [VARCHAR] (16) NULL,
     [c-ip] [VARCHAR] (16) NULL,
     [cs(User-Agent)] [VARCHAR] (1024) NULL,
     [cs(Referer)] [VARCHAR] (4096) NULL,
     [sc-STATUS] [INT] NULL,
     [sc-substatus] [INT] NULL,
     [sc-win32-STATUS] [INT] NULL,
     [time-taken] [INT] NULL,
     INDEX cci CLUSTERED COLUMNSTORE
    )
    TRUNCATE TABLE [IISLogReview].dbo.IISLOG
    "
    $Queries = $PrepareDB,$PrepareTable
    foreach ($Query in $Queries)
        {
        Invoke-Sqlcmd -ServerInstance $SQLServer -Query $Query
        }
    $IISFiles=Get-ChildItem -Path $IISlogFolder -Name
    foreach ($File in $IISFiles)
        {
        $PATH=$IISlogFolder+'\'+$File
        bcp [IISLogReview].dbo.IISLOG in $PATH -c -t" " -r\n -T -S $SQLServer
        }
}

UploadIIS -SQLServer SQL1 -IISlogFolder F:\IISFolder

A few of my next blogs will be on analysis of the IIS logs uploaded to SQL, but you can always query the uploaded data manually with ad-hoc queries via SQL Server Management Studio.
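
For instance, a quick ad-hoc query you could run in Management Studio, top client IPs by request count, using the columns defined above:

    USE IISLogReview
    SELECT TOP 10 [c-ip] Client_IP, COUNT(*) Total_Request FROM IISLOG
    GROUP BY [c-ip]
    ORDER BY Total_Request DESC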

 

SQL table export using PowerShell

Most PowerShell users will at some point need to extract data from SQL databases. There are two or more ways to do this task; however, based on my findings in a recent project, including BCP in the script is the best way to achieve it. In the following two examples I am using the AdventureWorks DB and a SQL query that writes data into a text file.

Example 1 using Invoke-SQLCmd.

$SQLQuery= "USE AdventureworksDW2016CTP3
SELECT  ProductKey, OrderDateKey, ShipDate FROM [dbo].[FactResellerSalesXL_CCI]
WHERE ProductKey =532"
$StartTime = (Get-Date)
Invoke-Sqlcmd -ServerInstance SQL2 -Query $SQLQuery -QueryTimeout 65535 |Format-Table -HideTableHeaders -AutoSize |Out-File 'C:\temp\via_InvokeSQL.txt'  -Width 500
$Endtime=(Get-Date)
$TotalTime = "Total Elapsed Time : $(($Endtime-$StartTime).totalseconds) seconds"
$TotalTime

UsingSQLCmd.jpg

This script takes almost 14 seconds to extract the rows from SQL.

Example 2 using BCP.

$StartTime = (Get-Date)
bcp  "USE AdventureworksDW2016CTP3 SELECT  ProductKey, OrderDateKey, ShipDate FROM [dbo].[FactResellerSalesXL_CCI] WHERE ProductKey =532"  queryout 'C:\temp\via_BCP.txt' -T -c -S SQL2
$Endtime=(Get-Date)
$TotalTime = "Total Elapsed Time : $(($Endtime-$StartTime).totalseconds) seconds"
$TotalTime

UsingBCP.jpg

This query takes less than 1 second, with a neat and clean output file.
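
As a side note, the manual Get-Date bookkeeping in both examples can be replaced with Measure-Command; a small sketch wrapping the BCP export:

# Same timing via Measure-Command; put whichever export you are testing inside the script block
$Elapsed = Measure-Command {
    bcp "USE AdventureworksDW2016CTP3 SELECT ProductKey, OrderDateKey, ShipDate FROM [dbo].[FactResellerSalesXL_CCI] WHERE ProductKey =532" queryout 'C:\temp\via_BCP.txt' -T -c -S SQL2
}
"Total Elapsed Time : $($Elapsed.TotalSeconds) seconds"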

How to configure SQL Server Failover Cluster Instance (FCI).

This blog is part of a SQL HA-DR solution series. In my previous blogs I covered how Log Shipping, Mirroring and AlwaysOn Availability Groups can be configured; here you will get a step-by-step procedure for the SQL Failover Cluster Instance (FCI) high availability solution. SQL FCI is sometimes also known as AlwaysOn FCI, and it is a bit different from an AlwaysOn AG (Availability Group). AlwaysOn FCI needs shared storage that is accessible from all participating nodes, and it provides instance-level high availability. If your primary (active) server goes down, the secondary (passive) node takes over all SQL operations.

The details about SQL FCI can be found here.
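
Once an FCI is in place, the active/passive behavior can be exercised from PowerShell with the FailoverClusters module; a hedged sketch in which the clustered role and node names are placeholders for your environment:

# Show which node currently owns the SQL Server clustered role, then fail it over
Import-Module FailoverClusters
Get-ClusterGroup -Name "SQL Server (MSSQLSERVER)"
Move-ClusterGroup -Name "SQL Server (MSSQLSERVER)" -Node "NODE2"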


Unable to re-create an Availability Group with the same name

While working in my test environment, we deleted an existing Availability Group and tried creating a new one with the same name. Unfortunately it failed with the error "Failed to create availability group 'SQL AAG', because a Windows Server Failover Cluster (WSFC) group with the specified name already exists."

erroraag

We had successfully deleted the existing configuration as per Microsoft's guidance, and the failover cluster log did not give any clue about the existing AAG group.

Here is the solution.

  1. Drop the availability group if that was not done in a previous attempt, refer
  2. Take a backup of the registry, then delete the key "HKEY_LOCAL_MACHINE\Cluster\HadrAgNameToldMap" from all nodes participating in the cluster (see the sketch after this list).
  3. From SQL Server Configuration Manager, uncheck 'Enable AlwaysOn Availability Groups', click Apply/OK, then restart the SQL service.
config
  4. Check the 'Enable AlwaysOn Availability Groups' box again, click Apply/OK, and restart the SQL service.
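
A sketch of step 2 for a two-node cluster; the node names are placeholders, and the key is exported before it is deleted:

# Back up, then remove, the stale AG name mapping on every cluster node
Invoke-Command -ComputerName NODE1, NODE2 -ScriptBlock {
    reg export "HKLM\Cluster\HadrAgNameToldMap" "C:\temp\HadrAgNameToldMap.reg" /y
    Remove-Item -Path "HKLM:\Cluster\HadrAgNameToldMap" -Recurse
}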

Now you should be able to create an availability group with the same name.

successaag

#aag, #alwayson