Tuesday, January 26, 2016

VM IO stalls on HP DL360 G7 ESXi 5.5.0 2403361 with a HP Smart Array p410 and a spare drive



We have a product that is very sensitive to network issues. So much so that it logs an error whenever it goes five seconds without communication. It just so happened that we started logging this error after we moved that product to an HP DL360 G7 running ESXi 5.5. We first went down the development route to track down the issue because we were doing load testing on that product at the time.

At some point, we noticed these errors even when the system was not under load. We could see these perfectly timed latency spikes at the same time as we had those errors. Then we shifted our focus to system/infrastructure.

This is what our datastore latency looked like:



Then we discovered the strangest thing. We had this latency even when no VMs were running. We configured a second system just like the first to rule out the hardware. Sure enough, both systems had the problem.

I started juggling drivers. I had been quick to load new drivers and firmware, so that felt like a good place to start. I slowly walked the storage controller driver backwards until the issue went away. I settled on this driver:

Type:               Driver - Storage Controller
Version:            5.5.0.60-1 (11 Jun 2014)
Operating System(s): VMware vSphere 5.5
File name:          scsi-hpsa-5.5.0.60-1OEM.550.0.0.1331820.x86_64.vib (65 KB)

  esxcli software vib install -v file:/tmp/scsi-hpsa-5.5.0.60-1OEM.550.0.0.1331820.x86_64.vib --force --no-sig-check --maintenance-mode

I didn’t like this as a solution because I had new storage controller firmware paired with a very old storage controller driver. I found another DL360 G7 that was configured by someone else for another project, and it was running fine. They didn’t update anything; they just loaded the ESXi DVD and ran with it.

I came to the conclusion that it had a valid firmware/driver combination. I liked that setup better than what I had, so that became the recommended deployment: don’t update the firmware, and use the DVD so you don’t have to mess with drivers.

Then came our next deployment using that configuration, and the issue resurfaced. After ruling out some other variables that could have impacted the results, I ended up rolling the driver back to fix it.


After pinning the driver down, I went hunting for more information on it. Searching for storage latency issues gives you all kinds of useless results. I was hopeful that searching for the driver itself would lead me to some interesting discussions.





I ended up with an answer to my issue in that last link, though there were a lot of other issues discussed along the way.
In summary: any HPSA driver later than v60 driving an HP Smart Array P410 controller, of any firmware vintage, with a spare drive configured. The spare drive configuration seems to cause an I/O back-feed into the driver, blowing it up at irregular intervals. This issue will not surface unless a spare drive is configured on the array controller.

At the time of writing this, the v114 driver also does not fix the issue.





Friday, January 22, 2016

Powershell: Write-Host vs Write-Output in Functions?

This comes up every so often. This is the question I saw recently:

I like to have my scripts and functions display a minimal level of progress on-screen by default, with more detailed progress listed by using -verbose. Write-Host works fine for the minimal progress display, but I keep reading horror stories of how the sky will fall if you use Write-Host instead of Write-Output in Powershell. However ... Write-Output seems to pollute the pipeline when trying to use the results of a function in a pipeline.

Is there a best practice for this? Should you use Write-Verbose for everything, or is Write-Host ok sometimes, or is there some other common process? - /u/dnisthmnace

One could argue that he has a solid understanding of it and is on the right track. You can write a lot of code for yourself this way and everything will be OK.

The higher-level discussion is that use of Write-Host is often a mental crutch that will slow the advancement of your PowerShell skills. The problem is not so much that you are using it, but why you think you need to be using it. But don't get hung up on that right now.

My advice is to start breaking your scripts into smaller pieces. Think of them as either tools or controller scripts. Tools are generic and reusable. The tools should be single purpose (do one thing and do that thing well). In your tools, use Write-Verbose.

Your controller scripts glue together your tools. It's OK for them to have a little more polish for the end user. This is where Write-Host would belong if you are using it (purists can still use Write-Output). But the thing is that this is an end-of-the-line script. It is not a script that should be consumed by other scripts.

Personally, I use Write-Verbose for everything that I don't want on the pipeline (even in my controller scripts). I do this so I can choose when I see the output and when I don't. When you start reusing code, this becomes more and more important. Before you know it, you have functions calling functions calling functions, or functions executing against hundreds of data points. I can even set -Verbose on a function call but let that function suppress noisy sub-functions.

I use Write-Verbose in my controller scripts because they can easily turn into tools and I don't have to go back and change the output.
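The tool/controller split above might look something like this (the function name and computer names are just examples, not from the original post):

```powershell
# Tool: single purpose, pipeline-clean, progress goes to Write-Verbose
function Get-ServerInfo
{
    [CmdletBinding()]
    param([string[]]$ComputerName)

    foreach ($node in $ComputerName)
    {
        Write-Verbose "Querying $node"            # only shown with -Verbose
        [pscustomobject]@{ ComputerName = $node } # real output, safe to pipe
    }
}

# Controller: glues tools together; end of the line, so Write-Host is OK here
Write-Host 'Starting server report...'
Get-ServerInfo -ComputerName 'web01','web02' -Verbose |
    Where-Object ComputerName -like 'web*'
```

Because the tool only emits objects to the pipeline, its output can be filtered, exported, or fed into other tools without the progress text getting in the way.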

I hope that sheds some light on the discussion. It's not the end of the world to use Write-Host; just know that you don't need to use it. Once you understand that, you will start writing better PowerShell.


Thursday, January 21, 2016

Setting default parameters for cmdlets without changing them with $PSDefaultParameterValues

I ran across something again recently that I find really cool. You can add a command to your profile that sets default parameter values for any cmdlet. I first saw this as a way to have Format-Table execute with the -AutoSize flag every time.

$PSDefaultParameterValues.Add('format-table:autosize',$True)

That is a cool idea if you are stuck on PowerShell 4 or older. Now that PowerShell 5 kind of does that already, I really didn’t think much about it. Some time later, I found myself wanting to share a cmdlet with someone, and the default parameter values were specific to the way I used it. I didn’t really like that, so I changed it. I made it into a better tool.

Except now I was supplying those values over and over every time I used it for myself. I decided there had to be a better way and that’s when I finally remembered this trick. And it can work on any cmdlet.

$PSDefaultParameterValues.Add('Unlock-ADAccount:Server','domain.com')

So now I can keep my cmdlets more generic and still get the benefit of default parameters that benefit myself. I can stick in common server names or credentials. Any value that I can think of really. Just add it to my profile and I am all set.
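A few related tricks worth knowing (the cmdlet names here are just illustrations): the keys support wildcards, and the whole table can be switched off without losing your entries.

```powershell
# Set a default for a single cmdlet
$PSDefaultParameterValues.Add('Unlock-ADAccount:Server', 'domain.com')

# Wildcards apply the default to every matching cmdlet at once
$PSDefaultParameterValues['*-ADUser:Server'] = 'domain.com'

# Temporarily disable all defaults without removing them
$PSDefaultParameterValues['Disabled'] = $true
$PSDefaultParameterValues['Disabled'] = $false

# Remove a single default
$PSDefaultParameterValues.Remove('Unlock-ADAccount:Server')
```

Explicitly supplied parameters always win over these defaults, so they are safe to leave in your profile.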







Wednesday, January 20, 2016

Powershell Error: Cannot process argument because the value of argument "value" is not valid. Change the value of the "value" argument and run the operation again.


This PowerShell error, Cannot process argument because the value of argument "value" is not valid. Change the value of the "value" argument and run the operation again., has tripped me up a few times now. I can usually fix it without ever really knowing how. I hate that type of solution, so this time I decided to get to the bottom of it.

I started using a new template snippet for my advanced functions. I tracked it down to one small detail in that template that was causing this issue. Here is the code below:

function New-Function
{
<#
.SYNOPSIS

.EXAMPLE
New-Function -ComputerName server
.EXAMPLE

.NOTES

#>
    [cmdletbinding()]
    param(
        # Pipeline variable
        [Parameter(
            Mandatory         = $true,
            HelpMessage       = '',
            Position          = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
            )]
        [Alias('Server')]
        [string[]]$ComputerName
    )

    process
    {
        foreach($node in $ComputerName)
        {
            Write-Verbose $node
        }
    }
}

I start with this and fill in the rest of my code as I go. The full error message is this:

Cannot process argument because the value of argument "value" is not valid. Change the value of the "value" argument and run the operation again.
At line:16 char:11
+           [Parameter(
+           ~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : PropertyAssignmentException


The function also acts kind of strange: I can tab-complete the name but not any arguments, which tells me it is not loading correctly. If I try Get-Help on the command, I get the exact same error message.

It took a little trial and error, but I narrowed it down to one of the parameter attributes. In this case, if HelpMessage was left as '' then it would error out. I would usually remove this line or add a real message eventually.

        # Pipeline variable
        [Parameter(
            Mandatory         = $true,
            HelpMessage       = '',
            Position          = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
            )]
        [Alias('Server')]
        [string[]]$ComputerName


So if you are getting this error message, pay attention to the actual values in the parameter attributes. If any of them are blank or null, you may run into this.

In the end, I updated my snippet to use a different placeholder. Problem solved.
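For reference, the template loads fine once HelpMessage has a real, non-empty value. A working version might look like this (the help text is just a placeholder of my own, not from the original snippet):

```powershell
function New-Function
{
    [CmdletBinding()]
    param(
        # Pipeline variable
        [Parameter(
            Mandatory         = $true,
            HelpMessage       = 'Enter one or more computer names',
            Position          = 0,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true
            )]
        [Alias('Server')]
        [string[]]$ComputerName
    )

    process
    {
        foreach ($node in $ComputerName)
        {
            Write-Verbose $node
        }
    }
}
```

With a non-empty HelpMessage, tab completion of the arguments and Get-Help both work again.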