Running PowerShell Scripts in SQL Agent Using CmdExec

If you have tried to execute PowerShell scripts through SQL Server Agent, you may have noticed that the “version” (technically it’s not a version – see the links below for more info) of PowerShell run by the Agent is not the same as the command-line PowerShell you may have tested your scripts with. The differences are explained in some depth by Chad Miller in his “The Truth about SQLPS and PowerShell V2” post and in his shorter answer to a question on StackOverflow. In brief, in all versions of SQL Server (including 2012), SQL Server Agent runs PowerShell scripts through a sqlps.exe process instead of the powershell.exe we get at the command prompt. Most annoyingly, there are core syntax differences which cause issues when running scripts with SQL Agent under sqlps. For example, if you have a construct like $(), or a Get-Date call, SQL Server Agent will not recognise the syntax and you’ll get a syntax error; it looks like SQL Agent’s PowerShell task doesn’t like statements which can be interpreted as sqlcmd ones.

Luckily, there is a very simple workaround if we want to circumvent this behavior. All we have to do is save the script somewhere on the filesystem (somewhere SQL Server Agent’s service account can access, of course) and execute it through an Operating System (CmdExec) step instead of a PowerShell one.
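
As a quick illustration of the original problem, a line like the one below runs fine under powershell.exe (and therefore in a CmdExec step), but can draw a syntax error from the Agent's PowerShell step, which tries to interpret the $() construct as a sqlcmd-style token (a minimal sketch; the path and file name are hypothetical):

# Fails in a SQL Agent PowerShell step, works fine under powershell.exe
$fileName = "C:\Exports\sales_$(Get-Date -Format yyyyMMdd).csv"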

In the CmdExec step we can simply call our script with:

powershell <script>

For example:

powershell "C:\test.ps1"

This will execute the test.ps1 script. All we have to do is make sure that the SQL Server Agent account can see the script (permissions), and that it can read and write to the locations touched by the PowerShell script. This way, if you test scripts through the powershell.exe environment, you will get the same results in SQL Server Agent. And if you want to have the SQL Server cmdlets available, you can load them as a module as described here.
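
The linked post has the details, but as a rough sketch: on SQL Server 2012 the cmdlets ship as the sqlps module, while on 2008/2008 R2 they are exposed as snap-ins instead, so the top of the script would look something like this:

# SQL Server 2012 – load the sqlps module
Import-Module sqlps -DisableNameChecking

# SQL Server 2008/2008 R2 – register the snap-ins instead
# Add-PSSnapin SqlServerCmdletSnapin100
# Add-PSSnapin SqlServerProviderSnapin100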

Cleaning up Hanging Traces with PowerShell

This post is a brief follow-up to my previous post, Counting Number of Queries Executed in SSAS, where we saw how we can create and save server-side traces with PowerShell. If you have played with those scripts, chances are you have a few hanging traces on your servers. Other possible sources of hanging traces are Profiler crashes (not normal closes, as Profiler usually cleans up after itself), or other applications which do not drop the traces they create. To check what’s running on your servers, have a look at the XMLA scripts in this blog post by Karen Gulati. It would be nice to have a PowerShell script which drops all traces other than the ones you need, and the following script does just that. As it loops over a number of servers, you don’t need to install it on multiple machines (provided you have a suitable account). Note that if you are an ASTrace user you may not want to use this, as the traces ASTrace creates do not have distinctive names which you can add to the $ExcludeTraces array, so the script would just clean them up as well.

[Reflection.Assembly]::LoadWithPartialName(
    "Microsoft.AnalysisServices") > $null

# Servers to clean up, and traces which should be left running
$SSASServers = @("myserver1", "myserver2")
$ExcludeTraces = @("FlightRecorder", "Important Trace")

foreach($srv in $SSASServers) {
    $conn =
        New-Object Microsoft.AnalysisServices.Server
    $conn.Connect($srv)

    # Snapshot every trace not on the exclude list
    $traces = @($conn.Traces |
        where {!($ExcludeTraces -contains $_.Name)})
    $cnt = $traces.Count-1

    # Drop them one by one, iterating over the snapshot
    while($cnt -ge 0) {
        $traces[$cnt].Drop()
        $cnt--
    }

    $conn.Disconnect()
}
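
If you would rather see what is about to be dropped before running the cleanup, a quick dry run (a sketch reusing the $SSASServers list from the script above) is to list the traces on each server without dropping anything:

foreach($srv in $SSASServers) {
    $conn = New-Object Microsoft.AnalysisServices.Server
    $conn.Connect($srv)
    "Traces on $srv"
    $conn.Traces | Select-Object Name, ID
    $conn.Disconnect()
}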

Thanks to Darren Gosbell for his assistance with all things PowerShell 🙂

7 Ways to Process Analysis Services Objects

Since I get asked a bit too often how we can process Analysis Services databases (or cubes and dimensions), here is a list of 7 different methods:

1. Through the GUI

This one is obvious. We can do it through both SSMS and BIDS.

2. XMLA Script

To generate the script we can use the handy Script button in SSMS. Simply configure the processing settings and then, instead of clicking the all too usual OK, click the little Script button in the top left corner of the Process window.


Then we can just execute the generated script.
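
For reference, the script generated for a full process of a database looks roughly like this (a sketch; the exact DatabaseID and settings depend on your deployment):

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process>
      <Object>
        <DatabaseID>Adventure Works DW 2008</DatabaseID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
</Batch>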

3. SSIS Analysis Services Processing Task

This Control Flow task allows us to configure any settings and then add it to our ETL process. Quite handy.


4. SQL Server Agent Job

This one is really an automation of Method #2 – XMLA Script. We can encapsulate the script in a job step of the SQL Server Analysis Services Command type.


5. .NET Code

This allows us to process cubes as part of an application. Nice if we want to let our users process our cubes on demand. Of course, this is better left to application developers, but it is still a good trick to know, especially if we want to seem all-knowing when it comes to databases of any type. To achieve this objective we use AMO (Analysis Management Objects). The API reference can be found here:

http://technet.microsoft.com/en-us/library/microsoft.analysisservices(SQL.90).aspx

6. Command Line – ascmd

The ascmd command-line utility can do a lot, including processing SSAS objects. For the full readme you can go here:

http://msdn.microsoft.com/en-us/library/ms365187.aspx
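
For example, assuming we have saved an XMLA processing script like the one from Method #2 to a file (the file names here are hypothetical; the readme above documents the full set of switches), we could kick processing off with something like:

ascmd -S localhost -i ProcessDatabase.xmla -o results.xml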

7. Command Line – PowerShell

This PowerShell script will perform a Full Process of Adventure Works DW 2008 on localhost:

# Load AMO and connect to the server
[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") > $null
$servername = New-Object Microsoft.AnalysisServices.Server
$servername.Connect("localhost")

# Look the database up by name and fully process it
$databasename = $servername.Databases.GetByName("Adventure Works DW 2008")
$databasename.Process("ProcessFull")
$servername.Disconnect()

Using AMO we can do any maintenance task through PowerShell, including processing individual objects.
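
For example, continuing from the script above, we can process a single dimension or cube instead of the whole database (a sketch; the object names are assumptions based on the Adventure Works sample):

# Process a single dimension (ProcessUpdate picks up attribute changes)
$databasename.Dimensions.GetByName("Date").Process("ProcessUpdate")

# Or fully process a single cube
$databasename.Cubes.GetByName("Adventure Works").Process("ProcessFull")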

Probably not a fully exhaustive list, but I hope it helps by giving developers some options when it comes to this trivial yet crucial part of the development and deployment process.