Tuesday, January 26, 2016

VM IO stalls on HP DL360 G7 ESXi 5.5.0 2403361 with a HP Smart Array p410 and a spare drive



We have a product that is very sensitive to network issues. So much so that it logs an error whenever it goes 5 seconds without communication. It just so happened that we started logging this error after we moved that product to an HP DL360 G7 running ESXi 5.5. We first went down the development route to track down the issue because we were doing load testing on that product at the time.

At some point, we noticed these errors even when the system was not under load. We could see these perfectly timed latency spikes at the same time as we had those errors. Then we shifted our focus to system/infrastructure.

This is what our datastore latency looked like:



Then we discovered the strangest thing. We had this latency even when no VMs were running. We configured a second system just like the first to rule out the hardware. Sure enough, both systems had the problem.

I started juggling drivers. I was quick to load new drivers and firmware so that felt like a good place to start. I slowly walked the storage controller driver backwards until the issue went away. I settled on this driver:

Type: Driver - Storage Controller
Version:    5.5.0.60-1(11 Jun 2014)
Operating System(s):    VMware vSphere 5.5
File name:  scsi-hpsa-5.5.0.60-1OEM.550.0.0.1331820.x86_64.vib (65 KB)

  esxcli software vib install -v file:/tmp/scsi-hpsa-5.5.0.60-1OEM.550.0.0.1331820.x86_64.vib --force --no-sig-check --maintenance-mode

I didn’t like this as a solution because I now had new storage controller firmware paired with a very old storage controller driver. I found another DL360 G7 that was configured by someone else for another project and it was running fine. They didn’t update anything; they just loaded the ESXi DVD and ran with it.

I came to the conclusion that it had a valid firmware/driver combination. I liked that setup better than what I had, so that became the recommended deployment: don’t update the firmware and use the DVD so you don’t have to mess with drivers.

Then our next deployment used that configuration and the issue resurfaced. After ruling out some other variables that could have impacted the results, I ended up rolling the driver back to fix it.


After pinning the driver down, I went hunting for more information on it. Searching for storage latency issues gives you all kinds of useless results, so I was hopeful that searching for the driver itself would lead me to some interesting discussions.





I ended up with an answer to my issue in the last of those links, but there were a lot of other issues reported along the way.
In summary: any HPSA driver later than v60 driving an HP Smart Array P410 controller, of any firmware vintage, with a spare drive configured. The spare drive configuration seems to cause an I/O back-feed into the driver, blowing it up at irregular intervals. This issue will not surface unless a spare drive is configured on the array controller.

At the time of writing this, the v114 driver also does not fix the issue.





Friday, January 22, 2016

Powershell: Write-Host vs Write-Output in Functions?

This comes up every so often. This is the question I saw recently:

I like to have my scripts and functions display a minimal level of progress on-screen by default, with more detailed progress listed by using -verbose. Write-Host works fine for the minimal progress display, but I keep reading horror stories of how the sky will fall if you use Write-Host instead of Write-Output in Powershell. However ... Write-Output seems to pollute the pipeline when trying to use the results of a function in a pipeline.

Is there a best practice for this? Should you use Write-Verbose for everything, or is Write-Host ok sometimes, or is there some other common process? - /u/dnisthmnace

One could argue that he has a solid understanding of it and is on the right track. You can write a lot of code for yourself this way and everything will be OK.

The higher-level discussion is that the use of Write-Host is often a mental crutch that will slow the advancement of your Powershell skills. The problem is not so much that you are using it but why you think you need to be using it. But don't get hung up on that right now.

My advice is to start breaking your scripts into smaller pieces. Think of them as either tools or controller scripts. Tools are generic and reusable. The tools should be single purpose (do one thing and do that thing well). In your tools, use Write-Verbose.

Your controller scripts glue together your tools. It's OK for them to have a little more polish for the end user. This is where Write-Host would belong if you are using it (purists can still use Write-Output). But the thing is that this is an end-of-the-line script. It is not a script that should be consumed by other scripts.
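To make the tool/controller split concrete, here is a minimal sketch. The function and server names are my own placeholders, not anything from the original question:

```powershell
# Tool: single purpose, reusable, reports progress with Write-Verbose
function Get-DiskReport
{
    [cmdletbinding()]
    param([string[]]$ComputerName)

    foreach($node in $ComputerName)
    {
        Write-Verbose "Querying $node"
        # real work would go here; emit objects to the pipeline
        [pscustomobject]@{ ComputerName = $node; Status = 'OK' }
    }
}

# Controller: glues tools together; end of the line, not consumed by other scripts
Write-Host 'Starting disk report...'
$report = Get-DiskReport -ComputerName 'server1','server2' -Verbose
Write-Host "Collected $($report.Count) rows."
```

The tool stays quiet on the pipeline unless you ask for -Verbose; the controller is free to chat with the user.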

Personally, I use write-verbose for everything that I don't want on the pipe (even in my controller scripts). I do this so I can choose when I see the output and when I don't. When you start reusing code, this becomes more and more important. Before you know it, you have functions calling functions calling functions or functions executing with hundreds of data points. I can even set verbose on a function call but let that function suppress noisy sub functions.

I use Write-Verbose in my controller scripts because they can easily turn into tools and I don't have to go back and change the output.

I hope that sheds some light on the discussion. It's not the end of the world to use Write-Host, just know that you don't need to use it. Once you understand that, you will start writing better Powershell.


Thursday, January 21, 2016

Setting default parameters for cmdlets without changing them with $PSDefaultParameterValues

I ran across something again recently that I find to be really cool. You can add a command to your profile to add default parameters to any cmdlet. I first saw this as a way to have Format-Table execute with the –AutoSize flag every time.

$PSDefaultParameterValues.Add('format-table:autosize',$True)

That is a cool idea if you are stuck on Powershell 4 or older. Now that Powershell 5 kind of does that already, I really didn’t think much about it. Sometime later, I found myself wanting to share a cmdlet with someone and the default parameter values were specific to the way I used it. I didn’t really like that so I changed it. I made it into a better tool.

Except now I was supplying those values over and over every time I used it for myself. I decided there had to be a better way and that’s when I finally remembered this trick. And it can work on any cmdlet.

$PSDefaultParameterValues.Add('Unlock-ADAccount:Server','domain.com')

So now I can keep my cmdlets more generic and still get the benefit of default parameters tailored to how I use them. I can stick in common server names or credentials, really any value I can think of. Just add it to my profile and I am all set.
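A few more $PSDefaultParameterValues patterns I find handy. The cmdlet name and the $myCredential variable below are just placeholders for illustration:

```powershell
# Index syntax works as well as .Add(), and overwrites without erroring
$PSDefaultParameterValues['Unlock-ADAccount:Server'] = 'domain.com'

# A wildcard applies a default to every cmdlet that has that parameter
$PSDefaultParameterValues['*:Credential'] = $myCredential  # hypothetical saved credential

# Remove a single default, or switch all of them off temporarily
$PSDefaultParameterValues.Remove('Unlock-ADAccount:Server')
$PSDefaultParameterValues['Disabled'] = $true
```

The Disabled key is a built-in switch that turns the whole table off without clearing it, which is handy when a default gets in the way of troubleshooting.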







Wednesday, January 20, 2016

Powershell Error: Cannot process argument because the value of argument "value" is not valid. Change the value of the "value" argument and run the operation again.


This Powershell error, Cannot process argument because the value of argument "value" is not valid. Change the value of the "value" argument and run the operation again., has tripped me up a few times now. I can usually fix it without ever really knowing how. I hate that type of solution, so this time I decided to get to the bottom of it.

I started using a new template snippet for my advanced functions. I tracked it down to one small detail in that template that was causing this issue. Here is the code below:

function New-Function
{
<#
.SYNOPSIS

.EXAMPLE
New-Function -ComputerName server
.EXAMPLE

.NOTES

#>
    [cmdletbinding()]
    param(
        # Pipeline variable
        [Parameter(
            Mandatory         = $true,
            HelpMessage       = '',
            Position          = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
            )]
        [Alias('Server')]
        [string[]]$ComputerName
    )

    process
    {
        foreach($node in $ComputerName)
        {
            Write-Verbose $node
        }
    }
}

I start with this and fill in the rest of my code as I go. The full error message is this:

Cannot process argument because the value of argument "value" is not valid. Change the value of the "value" argument and run the operation again.
At line:16 char:11
+           [Parameter(
+           ~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : PropertyAssignmentException


The function also acts kind of strange. I can tab-complete the name but not any arguments, which tells me it is not loading correctly. If I try Get-Help on the command, I get the exact same error message.

It took me a little trial and error, but I narrowed it down to one of the parameter attributes. In this case, if HelpMessage was left as an empty string ('') then the function would error out. I would usually remove this line or add a real message eventually.

        # Pipeline variable
        [Parameter(
            Mandatory         = $true,
            HelpMessage       = '',
            Position          = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
            )]
        [Alias('Server')]
        [string[]]$ComputerName


So if you are getting this error message, pay attention to the actual values in the parameter. If any of them are blank or null, you may run into this.

In the end, I updated my snippet to use a different placeholder. Problem solved.
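For reference, the same snippet loads cleanly once HelpMessage has a real value (or the line is removed entirely). The message text here is just a placeholder:

```powershell
function New-Function
{
    [cmdletbinding()]
    param(
        # Pipeline variable
        [Parameter(
            Mandatory         = $true,
            HelpMessage       = 'Name of the computer to process',
            Position          = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
            )]
        [Alias('Server')]
        [string[]]$ComputerName
    )

    process
    {
        foreach($node in $ComputerName)
        {
            Write-Verbose $node
        }
    }
}
```

With a non-empty HelpMessage, tab completion of the arguments and Get-Help both work again.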


Wednesday, November 25, 2015

Here is my custom Powershell prompt

Get-Help about_Prompts
<#
LONG DESCRIPTION
    The Windows PowerShell command prompt indicates that Windows PowerShell
    is ready to run a command:

        PS C:\>

    The Windows PowerShell prompt is determined by the built-in Prompt
    function. You can customize the prompt by creating your own Prompt
    function and saving it in your Windows PowerShell profile.
#>

One of my biggest issues with the default prompt is that I work with a lot of nested folders and network shares, which makes the prompt very long because the full path is in there. So I change my prompt to show just the current folder and place the full path in the title bar.

One other thing I do is add basic command logging. I would use transcripts, but I don't want something that verbose. So I just save my last command to a text file whenever I run it.

The last thing I do is calculate where in the history the next command will be and add that to my prompt.

Here is my prompt function:

$PSLogPath = ("{0}{1}\Documents\WindowsPowerShell\log\{2:yyyyMMdd}-{3}.log" -f $env:HOMEDRIVE, $env:HOMEPATH,  (Get-Date), $PID)
Add-Content -Value "# $(Get-Date) $env:username $env:computername" -Path $PSLogPath
Add-Content -Value "# $(Get-Location)" -Path $PSLogPath

function prompt
{
    $LastCmd = Get-History -Count 1
    if($LastCmd)
    {
        $lastId = $LastCmd.Id
       
        Add-Content -Value "# $($LastCmd.StartExecutionTime)" -Path $PSLogPath
        Add-Content -Value "$($LastCmd.CommandLine)" -Path $PSLogPath
        Add-Content -Value "" -Path $PSLogPath
    }

    $nextCommand = $lastId + 1
    $currentDirectory = Split-Path (Get-Location) -Leaf
    $host.UI.RawUI.WindowTitle = Get-Location
    "$nextCommand PS:$currentDirectory>"







Monday, November 16, 2015

Powershell: Script injection with ScriptBlock.CheckRestrictedLanguage Method

I just had a post about importing hashtables from files. It basically loads the file into a script block and executes it. 

$content = Get-Content -Path $Path -Raw -ErrorAction Stop
$scriptBlock = [scriptblock]::Create($content)
$scriptBlock.CheckRestrictedLanguage([string[]]@(), [string[]]@(), $false)
Write-Output (& $scriptBlock)

If that sounds dangerous, that is because it is. Your module or script may be trusted, but it may be loading files that are not trusted. This could be a stealthy way for an attacker to use your script.

I took the extra step of using $scriptBlock.CheckRestrictedLanguage([string[]]@(), [string[]]@(), $false) to make sure the hashtable does not contain Powershell commands. There is one important gotcha to be aware of with this method: the arguments are not intuitive.

Let's take a closer look. Here is how the method is defined (this is the C++ signature from MSDN):

public:
void CheckRestrictedLanguage (
        IEnumerable allowedCommands,
        IEnumerable allowedVariables,
        bool allowEnvironmentVariables
)

The first argument is the allowed commands and the second is the allowed variables. One could reasonably assume that a $null value for the allowed commands would mean that nothing is allowed.

If you look at my code, I create an empty string array. That may look like a very strange thing to do and I kind of agree. This is because a $null value indicates that it should allow some default commands to execute. The only way to know this one is to read the documentation very closely. https://msdn.microsoft.com/en-us/library/system.management.automation.scriptblock.checkrestrictedlanguage(v=vs.85).aspx?cs-save-lang=1&cs-lang=cpp#code-snippet-2

By using an empty list of strings, I do not allow any Powershell commands. When importing a hashtable, this is exactly what I want.
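As a quick sanity check, you can watch the method reject content that contains a command. This is just a sketch; the hashtable contents here are made up:

```powershell
$safe      = [scriptblock]::Create("@{ Name = 'value'; Count = 42 }")
$dangerous = [scriptblock]::Create("@{ Name = (Get-Date) }")

# Passes: data only, no commands
$safe.CheckRestrictedLanguage([string[]]@(), [string[]]@(), $false)

# Throws: Get-Date is a command, and the allowed-commands list is empty
try
{
    $dangerous.CheckRestrictedLanguage([string[]]@(), [string[]]@(), $false)
}
catch
{
    Write-Verbose "Rejected: $($_.Exception.Message)" -Verbose
}
```

The check happens before the script block is ever invoked, which is exactly why it belongs between Create() and the & call.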

Sunday, November 08, 2015

Powershell: Importing hashtable from file or a psd1 file

Have you ever wanted to import a hashtable from a file? A module manifest that is saved in a *.psd1 file is a hashtable. If you ever wanted to read the metadata in it, this trick may help.

You import the contents into a script block, validate the script block, execute it and capture the resulting hashtable into a variable. Here is the sample code below:


$content = Get-Content -Path $Path -Raw -ErrorAction Stop
$scriptBlock = [scriptblock]::Create($content)
$scriptBlock.CheckRestrictedLanguage([string[]]@(), [string[]]@(), $false)
Write-Output (& $scriptBlock)
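Wrapped into a small reusable tool, those four lines might look like this. The function name Import-HashtableFile is my own invention:

```powershell
function Import-HashtableFile
{
    [cmdletbinding()]
    param(
        [Parameter(Mandatory = $true)]
        [string]$Path
    )

    $content = Get-Content -Path $Path -Raw -ErrorAction Stop
    $scriptBlock = [scriptblock]::Create($content)

    # Empty allow-lists: no commands, no variables, no environment variables
    $scriptBlock.CheckRestrictedLanguage([string[]]@(), [string[]]@(), $false)
    Write-Output (& $scriptBlock)
}
```

Point it at a *.psd1 file and you get the hashtable back as an object, with the restricted-language check guarding against anything executable sneaking in.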



If you target a module manifest, you can access all the attributes in it. 

Name                           Value
----                           -----
Copyright                      (c) 2015 Kevin.Marquette. All rights reserved.
CompanyName                    Self
GUID                           6ab379f9-41ed-4c1e-beda-7855d1c1e3c8
Author                         Kevin.Marquette
FunctionsToExport              *
VariablesToExport              *
RootModule                     .\my_module.psm1
AliasesToExport                *
CmdletsToExport                *
ModuleVersion                  1.0.1 

CheckRestrictedLanguage will throw an error if it finds any Powershell commands in the hashtable. Because you are executing code from an untrusted source in the middle of your script, you should validate it.

There is a second quick and dirty way to do the same thing without the validation. I almost don't want to mention it because it is so dangerous. So if you see this in the wild, know that there is a better way.

$HashTable = Invoke-Expression (Get-Content $Path -raw)

This blindly executes the file as if it were a script and is just asking to be exploited. Think XSS or SQL-injection styles of vulnerability.