
Behind the PowerShell Pipeline

August 12, 2025

Managing Sysinternals with PowerShell Part 2

Last time I shared some PowerShell code and techniques for installing the Sysinternals suite using PowerShell. I demonstrated how you can use a thread job to parallelize processing. This is much faster than spinning up runspaces with ForEach-Object -Parallel, which is a common approach in PowerShell 7+.

Occasionally, Microsoft releases a new version of an individual tool. Microsoft doesn't maintain a versioned package for the entire suite. I could simply re-run the download code and overwrite files as needed, but that feels like overkill. Why should I download a file that hasn't changed?

To update files, I want to process only the files that have changed. I should be able to compare the last write time property of the local file with the last write time of the file on the Sysinternals site. If the remote file is newer, I will download it.
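The comparison can be sketched like this, assuming the tools are served from live.sysinternals.com (the base URL and file name here are illustrative, not necessarily what the final function uses). A HEAD request returns only the response headers, so nothing is downloaded unless the remote copy is newer:

```powershell
$base = 'https://live.sysinternals.com'
$local = Get-Item -Path "$Path\procexp.exe"

# HEAD request: headers only, no file body
$head = Invoke-WebRequest -Uri "$base/$($local.Name)" -Method Head

# Last-Modified is a string (or string array in PowerShell 7), so normalize it
$remoteTime = [datetime]($head.Headers['Last-Modified'] | Select-Object -First 1)

if ($remoteTime -gt $local.LastWriteTime) {
    # the remote copy is newer - download and overwrite the local file
    Invoke-WebRequest -Uri "$base/$($local.Name)" -OutFile $local.FullName
}
```

Note that the download will stamp the local file with the current time, so you may want to reset LastWriteTime to match the remote value after downloading.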

I still want to write code optimized for performance. For this task, I'm going to take a slightly different approach. Instead of updating each file in a thread job, I want to process them in "batches" of 10 files. Some of the files are small and spinning up a separate thread job for small files might be inefficient.

I can get all of the current local files and save them to an array.

$current = Get-ChildItem -Path $Path -File

Then I can group them into batches of 10 files using a for loop.

for ($i = 0; $i -lt $current.count; $i += 10) {
    [object[]]$files = $current[$i..($i + 9)]
    # start a thread job for each batch
}

Notice how I'm bumping the counter by 10 each time. This way, I can process 10 files at a time in a single thread job. But here's the tricky part. You might think to pass the files as an argument for the thread job like this:

Start-ThreadJob -ScriptBlock $sb -ArgumentList $files -StreamingHost $host

Even if the script block is defined to accept an array of files:

$sb = {
    param([object[]]$Files)
    ...
}

This will not work as expected. PowerShell will pass each file in $files as a separate argument. Instead, I need to pass the array as a single object. Pay close attention.

Start-ThreadJob -ScriptBlock $sb -ArgumentList @(,$files) -StreamingHost $host

Inserting the comma into the argument list forces PowerShell to treat $files as a single object, which is what I want.
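Putting the batching loop and the unary-comma trick together, the dispatch might look like this. This is a sketch; the per-file update logic inside the script block is elided:

```powershell
$sb = {
    param([object[]]$Files)
    foreach ($file in $Files) {
        # compare timestamps and download the file here if it is out of date
    }
}

$current = Get-ChildItem -Path $Path -File

for ($i = 0; $i -lt $current.Count; $i += 10) {
    [object[]]$files = $current[$i..($i + 9)]
    # the unary comma wraps the batch so it binds to $Files as one array
    Start-ThreadJob -ScriptBlock $sb -ArgumentList @(,$files) -StreamingHost $host
}
```

Note that slicing past the end of the array is safe; the final batch simply contains whatever files remain.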

Now, what about the script block?

I want to take advantage of the StreamingHost parameter of Start-ThreadJob so I can monitor what is happening in real time, using Write-Verbose statements to provide feedback on the script's progress. I also want to support -WhatIf. However, the thread job runs in its own scope. If my function uses CmdletBinding to support -WhatIf and -Verbose, I need to pass those preferences into the thread job. I can do this by using the using: scope modifier.

$sb = {
    param([object[]]$Files)
    $VerbosePreference = $using:VerbosePreference
    $WhatIfPreference = $using:WhatIfPreference
    ...
}

You may have seen this used in remoting examples. The same premise works with the thread job. We're telling PowerShell to set a value for $VerbosePreference and $WhatIfPreference in the thread job scope, using the values from the calling scope.

In my script block, I'm tweaking the verbose messaging. I am still using the Verbose stream, but I am formatting the string using ANSI escape sequences or $PSStyle. This allows me to color the text in the console to highlight different phases of the code execution.

Write-Verbose "`e[95m[$((Get-Date).TimeOfDay)] Spinning up a thread job to process $($files.count) file(s). WhatIf = $WhatIfPreference`e[0m"

I am also writing my own WhatIf messaging.

Write-Host "`e[96mWhat If: [$((Get-Date).TimeOfDay)] Updating $file`e[0m"
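In context, this message presumably stands in for the actual download when -WhatIf is in effect. A sketch of that branch inside the script block, with a hypothetical $base URL standing in for the real download source:

```powershell
foreach ($file in $Files) {
    if ($WhatIfPreference) {
        # custom WhatIf output in place of the built-in ShouldProcess message
        Write-Host "`e[96mWhat If: [$((Get-Date).TimeOfDay)] Updating $file`e[0m"
    }
    else {
        Invoke-WebRequest -Uri "$base/$($file.Name)" -OutFile $file.FullName
    }
}
```

Because $WhatIfPreference was seeded from the caller via $using:, running the function with -WhatIf reports what would be downloaded without touching any files.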