Installing AzureTools Remotely via WinRM


Wanted to add AzureTools installation to our standard PowerShell VM-build script, which does various standard config. However, I discovered that BITS doesn't work / isn't supported over WinRM, so I used an alternative method via the Net.WebClient class, as per below.


Prerequisites :

1. You have the Azure PowerShell cmdlets installed and set up

2. You have installed the cert for WinRM (as per InstallWinRMCert.ps1) for the VMs you want to update

Do it :

Run the below in PowerShell, which will create the functions; then call for your VM, changing the params below

AddAzureToolsToVM -VMName SomeVM -UserName "bob" -Password "doofus"

# Returns a PSCredential, e.g. for use with WinRM
Function GetCredential
{
    Param
    (
        [Parameter(Mandatory=$true)] [string]$UserName,
        [Parameter(Mandatory=$true)] [string]$Password
    )

    $pass = ConvertTo-SecureString -AsPlainText $Password -Force
    New-Object System.Management.Automation.PSCredential -ArgumentList $UserName, $pass
} # Function GetCredential

# AddAzureToolsToVM downloads the AzureTools using WinRM, assumes cert installed
# AddAzureToolsToVM -VMName "SomeVM" -UserName "bob" -Password "doofer"
Function AddAzureToolsToVM
{
    Param
    (
        [Parameter(Mandatory=$true)] [string]$VMName,
        [Parameter(Mandatory=$false)] [string]$VMServiceName = $VMName,
        [Parameter(Mandatory=$true)] [string]$UserName,
        [Parameter(Mandatory=$true)] [string]$Password,
        [Parameter(Mandatory=$false)] [string]$AzureToolsDir = "c:\install\Azuretools"
    )

    Write-Host "*** Getting VM = $VMName ***"
    $VM = Get-AzureVM -Name $VMName -ServiceName $VMServiceName
    If ($VM -eq $Null)
    {
        Write-Host "!!! Error Getting VM = $VMName , quitting !!!"
        Return
    }

    # Uses GetCredential to build the credential object
    $Credential = GetCredential -UserName $UserName -Password $Password
    $uri = Get-AzureWinRMUri -ServiceName $VMServiceName -Name $VMName

    Write-Host "*** Connecting to $VMName, running hostname.exe to test WinRM ***"
    # Quick test to prove WinRM is working
    $Results = Invoke-Command -ConnectionUri $uri.ToString() -Credential $Credential -ScriptBlock { HostName }
    If ($Results -eq $Null)
    {
        Write-Host "!!! Error running hostname.exe on VM = $VMName , quitting !!!"
        Return
    }

    # Main doing part; output the downloaded file object into $WinRMResults
    $WinRMResults = Invoke-Command -ConnectionUri $uri.ToString() -Credential $Credential -ScriptBlock {
        # Assign passed-in arguments to friendly variable names
        $LocalAzureToolsDir = $args[0]

        # 1. Create dir
        If (!(Test-Path $LocalAzureToolsDir))
        {
            Write-Host "*** Creating directory $LocalAzureToolsDir ***"
            mkdir $LocalAzureToolsDir | Out-Null
        }
        Else
        {
            Write-Host "*** $LocalAzureToolsDir already exists ***"
        }

        # 2. Download tools
        $AzureToolsSource = ""
        $AzureToolsFullPath = $LocalAzureToolsDir + "\AzureTools.exe"

        Write-Host "*** Creating object Net.WebClient ***"
        $WebClient = New-Object Net.WebClient
        Write-Host "*** Starting download of $AzureToolsSource to $AzureToolsFullPath ***"
        $WebClient.DownloadFile($AzureToolsSource, $AzureToolsFullPath)

        # Finally get and return the file object
        Get-Item $AzureToolsFullPath
    } -ArgumentList $AzureToolsDir

    If ($WinRMResults -eq $Null)
    {
        Write-Error "!!! $AzureToolsDir\AzureTools.exe not found !!!"
    }
    Else
    {
        # If it worked, show the downloaded file details
        $WinRMResults
    }
} # Function AddAzureToolsToVM


New DMVs in SQL 2014 CTP 1

Just did a quick compare with a 2012 SP1 CU4 box; below are the ones only in SQL 2014 CTP 1. Enjoy
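The compare itself is easy to repeat: dump the dm_ object names on each instance and diff the two result sets. A minimal sketch, to be run on both boxes:

```sql
-- List dynamic management views/functions; run on each instance and diff the output
SELECT name, type_desc
FROM sys.system_objects
WHERE name LIKE 'dm[_]%'
ORDER BY name;
```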


The last thing to try before restore when you have a SQL Server 823 Error / OS Error 21 corruption

Short Version

Take the database offline and online again. Do this at your own risk.
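For reference, the offline / online cycle is just the below (MyDb is a placeholder name; ROLLBACK IMMEDIATE kicks out any open transactions, so again, at your own risk):

```sql
-- Cycle the database offline and back online
ALTER DATABASE MyDb SET OFFLINE WITH ROLLBACK IMMEDIATE;
ALTER DATABASE MyDb SET ONLINE;
```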

Long Version

  • SAN-based storage; disks / LUNs disappeared from Windows and reappeared again within about 1 minute
  • No errors in SQL Server when this happened
  • 12 hours later, 823 / Error 21s start appearing as SQL is used more
  • Narrowed the corruption down to 2 files/filegroups on 1 volume (out of around 10 volumes / 15 filegroups)
  • Problem files around 400 GB in size / 300 GB used
  • All other CHECKFILEGROUPs come back clean
  • CHECKFILEGROUP on the problem filegroups fails instantly with error 823 / latch error
  • Any SELECT statement against the problem filegroups' data comes back with a similar 823 / OS 21 error
  • Able to create a new 10 GB DB / log file on the alleged problem volume
  • Able to write 100 GB to / read fine from the alleged problem volume by doing a backup / verify-only of another db (data on a different volume with no issues). Showed that SQL Server had no general problems accessing / writing / reading it
  • Was just considering moving the problem files to a different volume, to see if the problem was being caused by that single volume
  • Thought I would try offline / online, as the 21 = "Device not ready" suggested that SQL Server was having problems with these particular .ndf's / thought there was a problem with the disk
  • To my surprise the db recovered successfully; half expected it to go suspect
  • CHECKFILEGROUP on the problem files now runs for the expected time / clean
  • CHECKDB on the whole DB clean
  • Backup of DB / VerifyOnly clean
  • SQL Server 2008 R2 SP1 CU5 / Windows Server 2003
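For anyone repeating this, the per-filegroup check used throughout was along these lines (database and filegroup names are placeholders):

```sql
USE MyDb;
-- Check one filegroup at a time to narrow the corruption down
DBCC CHECKFILEGROUP ('FG_Data1') WITH NO_INFOMSGS;
```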

Follow up

  • Obviously find root cause

Event Table in SQL Azure / sys.event_log

Stumbled across this excellent TechEd presentation on SQL admin and self-servicing, and was intrigued by the Event Table feature that records connection issues / timeouts etc. Found it has been implemented in the new portal:

  1. From the portal, click on SQL Databases on the left then Manage, or go straight to https://<yourinstancename>
  2. Type in your SQL admin credentials, leave Database blank
  3. Click on Administration, bottom left
  4. And then Events. You get a nice graph, aggregated by day

To see more granular detail, connect to the master database in Management Studio and run

SELECT * FROM sys.event_log
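To pull out just recent connectivity problems, a filter like the below works (the column and category names here are from the sys.event_log view in master as I found it; treat them as assumptions and check against your instance):

```sql
-- Connectivity events (failed connections, terminations, throttling) in the last day
SELECT start_time, database_name, event_type, event_count, description
FROM sys.event_log
WHERE event_category = 'connectivity'
  AND start_time >= DATEADD(day, -1, GETUTCDATE())
ORDER BY start_time DESC;
```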

Quite useful, as we had some weird stuff happen on the web site and I think it may have been down to idle connection timeouts


Potential Bill Shock in SQL Azure when creating / dropping same named databases on same day.

My Azure eval account got disabled recently as I had exceeded the monthly limit. When I looked closer, I had been charged 4 database units on a single day. I was bemused because I had only ever had 2 databases (Prod and Staging), so I raised a support case with the Microsoft Azure team.

To cut a long story short, Microsoft have confirmed that each time you create a database you will be billed for 1 database/day unit, regardless of whether you are re-creating the same database each time!

In my case I had dropped / re-created the Staging db several times within a few minutes while testing out a release process that went wrong.

The support rep couldn’t show me a link that showed this as being documented. They have promised to follow this up.

You can run the query below from the master db to see the usage per day, as well as looking at billing in the Azure portal

SELECT * FROM sys.database_usage
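If you want the per-day unit count rolled up, something like the below (the time / sku / quantity columns are as I recall the view; check them on your instance):

```sql
-- Database units billed per day, by edition
SELECT time, sku, SUM(quantity) AS database_units
FROM sys.database_usage
GROUP BY time, sku
ORDER BY time DESC;
```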

Backing up databases from SQL Azure to Amazon RDS

Been developing a SQL Azure based solution and so starting to think about Operational aspects like backup / offsite copies etc.

So far I have been using the Redgate SQL Azure Backup tool to back up the database to a .bacpac and restore it to a local SQL Express database on my laptop as a temp measure.

Saw that Amazon RDS now has a SQL Server offering and so decided to try it as a target for the restore. Also wanted a contingency plan for getting off the Azure platform (in case of outage, bill shock, etc).

Obviously this involves pulling data out of the Azure datacenter that will potentially increase data charges.

1. Get an Amazon RDS account and create a SQL Server instance

See here. Be sure to set up the DB security group to allow your IP address / client access to the instance*, which is like the Azure firewall. Test connectivity from Management Studio.

(* to find your public-facing IP address, search for "What is my IP Address" – it tells you at the top of the page)

2. Download and Install Red-Gate SQL Azure Backup Tool

3. Run the below, changing the <> params where required

RedGate.SQLAzureBackupCommandLine.exe /AzureServer:<AzureSQLInstanceName> /AzureDatabase:<AzureDBName> /AzureUserName:<AzureSQLLogin> /AzurePassword:<AzureSQLPassword> /LocalServer:<AmazonRDSInstanceName> /LocalDatabase:<AmazonDB> /LocalUserName:<AmazonUserID> /LocalPassword:<AmazonPassword> /DropLocal /Verbose

1. User is not permitted to perform requested operation

When I ran it, it took a while and eventually produced the output below

Error: User does not have permission to perform this action.
System.AggregateException: One or more errors occurred. ---> System.Data.SqlClient.SqlException: User does not have permission to perform this action.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)

But it had created the database and objects. So I ran SQL Profiler from my laptop against the Amazon instance (!) with the "User Error Message" event class added, and found the 2 statements below caused the error

A) sp_updatestats – quite easy to replace with your own code
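A rough stand-in, assuming all you need is stats refreshed on every user table (sp_updatestats itself is smarter about only touching stale statistics):

```sql
-- Build and run UPDATE STATISTICS for every user table
DECLARE @sql nvarchar(max) = N'';
SELECT @sql += N'UPDATE STATISTICS ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N'; '
FROM sys.tables AS t
JOIN sys.schemas AS s ON t.schema_id = s.schema_id;
EXEC sp_executesql @sql;
```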


Reproduced the error in a query window. I don't know at this stage whether these commands are specific to the .bacpac process or Redgate's tool. I'm guessing the former.

These are not show-stoppers, apart from the process producing an error.

Key takeaways

1. Using .bacpac allows you to backup between SQL 2012 (azure) and SQL 2008 R2 (Amazon)

This is quite cool. Obviously DTS / SSIS has had this for years, but the .bacpac method seems far less fiddly / more reliable so far when transferring a whole db

2. Profiler works against Amazon

I was quite happy when this worked. As someone who's been using SQL Trace since Oasis were just roadies, I had been feeling a bit exposed / nervous about supporting an instance without having it available.

3. Amazon has database backups built in

Noticed this when setting up my instance – retention for up to 5 days, scheduled, etc.

Next Steps.

1. Perform more detailed comparison of the 2 offerings
Nuff said

2. Attempt to migrate web site to Amazon

I.e. so we have a fully working alternative on another provider. We are using ACS 2.0 for auth, so it might need a bit of work 😉

3. Use Powershell to create / restore .bacpac

As the Redgate tool is discontinued

4. Azure data sync to Amazon RDS ?

See here. Don't think this will be possible, as an agent has to be installed on the target SQL server – i.e. Amazon

New Perfmon Counters for SQL Server 2012 RTM

Doing some work with SQL 2012, I came across a couple of System Monitor (Perfmon) counters that I hadn't seen before, so I decided to do a complete list by comparing with a 2008 R2 instance. Half finished, but I wanted to highlight their presence.

I would link to documentation, but it doesn't look like BOL for SQL Server 2012 has been updated with the new counters yet! (yes, I've provided corrective feedback…)
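The comparison itself can be scripted. A sketch of the approach: dump the counter paths on each box and diff the two files (the counter-set name pattern may need adjusting for named instances):

```powershell
# Dump all SQL Server related counter paths, sorted, ready for a file diff
Get-Counter -ListSet "*SQL*" |
    Select-Object -ExpandProperty Paths |
    Sort-Object |
    Set-Content SqlCounters.txt
```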

Object Name / Counter (Instance) – Comments

Access Methods
  • InSysXact waits/sec (N\A) – Not documented! "Number of times a reader needs to wait for a page because the InSysXact bit is set"

Batch Resp Statistics
  • Multiple time-range counters: >=000000ms & <000001ms, >=000001ms & <000002ms, etc. (instances: Batches, CPU Time:Requests) – Cool! So you can measure long-running batches in Perfmon, though the counts seem to be cumulative since SQL Server started. Explanation = "Number of SQL Batches having response time greater than or equal to 5ms but less than 10ms" etc.

Buffer Manager
  • Background writer pages/sec – "Number of pages flushed to enforce the recovery interval settings." Part of the new TARGET_RECOVERY_TIME per-database feature that allows finer-grained control of recovery / checkpoints
  • Integral Controller Slope – "The slope that integral controller for the buffer pool last used, times -10 billion" er, yeah

Databases
  • Log Flush Write Time (ms)

FileTable
  • Numerous (N\A) – For measuring performance of the new FileTable feature of FILESTREAM. Allows you to copy data to FILESTREAM via UNC; automatically populates the table with attributes etc.

Memory Broker Clerks
  • Numerous – TBD

Memory Manager
  • Database Cache Memory (KB)
  • External benefit of memory
  • Free Memory (KB)
  • Log Pool Memory (KB)
  • Reserved Server Memory (KB)
  • Stolen Server Memory (KB)

Memory Node (NUMA related)
  • Database Node Memory (KB)
  • Foreign Node Memory (KB)
  • Free Node Memory (KB)
  • Stolen Node Memory (KB)
  • Target Node Memory (KB)
  • Total Node Memory (KB)

Query Execution
  • Total Node Memory (KB)
  • Rem Req Cache Hit Ratio
  • Rem Req Cache Hit Ratio Base
  • Remote activations/sec
  • Remote requests/sec
  • Remote resend requests/sec