
Behind the PowerShell Pipeline

August 12, 2025

Managing Sysinternals with PowerShell Part 2

Last time I shared some PowerShell code and techniques for installing the Sysinternals suite using PowerShell. I demonstrated how you can use a thread job to parallelize processing. This is much faster than spinning up runspaces with ForEach-Object -Parallel, which is a common approach in PowerShell 7+.

Occasionally, Microsoft releases an updated version of an individual tool. Microsoft doesn't maintain a versioned package for the entire suite. I could simply re-run the download code and overwrite files as needed, but that feels like overkill. Why should I download a file that hasn't changed?

To update files, I want to process only the files that have changed. I should be able to compare the last write time property of the local file with the last write time of the file on the Sysinternals site. If the remote file is newer, I will download it.
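The comparison itself is a one-liner. Here is a minimal sketch, assuming the WebClient service is running so the WebDAV path resolves, and using PsExec.exe purely as an example file name:

```powershell
# Sketch: compare a local tool against its copy on the Sysinternals WebDAV share
# (assumes the WebClient service is running; PsExec.exe is just an example)
$local  = Get-Item -Path C:\tools\PsExec.exe
$online = Get-Item -Path \\live.sysinternals.com\tools\PsExec.exe

if ($online.LastWriteTime -gt $local.LastWriteTime) {
    'The remote file is newer - download it'
}
```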

I still want to write code optimized for performance. For this task, I'm going to take a slightly different approach. Instead of updating each file in its own thread job, I want to process them in batches of 10 files. Some of the files are small, and spinning up a separate thread job for each one would be inefficient.

I can get all of the current local files and save them to an array.

$current = Get-ChildItem -Path $Path -File

Then I can group them into batches of 10 files using a for loop.

for ($i = 0; $i -lt $current.count; $i += 10) {
    [object[]]$files = $current[$i..($i + 9)]
    # start a thread job for each batch
}
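A nice side effect of PowerShell's range operator: indexing past the end of an array doesn't throw; out-of-range indexes are silently skipped. So the final batch can safely contain fewer than 10 files without any extra guard logic:

```powershell
# Indexing past the end of an array returns only the elements that exist,
# so the last batch may be smaller than 10 with no error and no $null padding
$sample = 1..25
$sample[20..29]          # returns 21 through 25 - five elements
($sample[20..29]).Count  # 5
```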

Notice how I'm bumping the counter by 10 each time. This way, I can process 10 files at a time in a single thread job. But here's the tricky part. You might think to pass the files as an argument for the thread job like this:

Start-ThreadJob -ScriptBlock $sb -ArgumentList $files -StreamingHost $host

Even if the script block is defined to accept an array of files:

$sb = {
    param([object[]]$Files)
    ...
}

This will fail. PowerShell will pass each file in $files as a separate argument. Instead, I need to pass the array as a single object. Pay close attention:

Start-ThreadJob -ScriptBlock $sb -ArgumentList @(,$files) -StreamingHost $host

Inserting the comma into the argument list forces PowerShell to treat $files as a single object, which is what I want.
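Here is a minimal way to see the difference for yourself. This sketch assumes the ThreadJob module is available; the script block simply reports what it received:

```powershell
# Demonstrating how -ArgumentList unrolls an array unless you wrap it
$files = 1..3
$sb = { param($Files) "Bound $($Files.Count) item(s); leftover args: $($args.Count)" }

Start-ThreadJob -ScriptBlock $sb -ArgumentList $files | Receive-Job -Wait -AutoRemoveJob
# Bound 1 item(s); leftover args: 2   <- each element became its own argument

Start-ThreadJob -ScriptBlock $sb -ArgumentList @(,$files) | Receive-Job -Wait -AutoRemoveJob
# Bound 3 item(s); leftover args: 0   <- the array arrived as one object
```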

Now, what about the script block?

I want to take advantage of the StreamingHost parameter of Start-ThreadJob so I can monitor what is happening in real time, using Write-Verbose to provide feedback on the script's progress. I also want to support -WhatIf. However, the thread job runs in its own scope. If my function uses CmdletBinding to support -WhatIf and -Verbose, I need to pass those preferences into the thread job. I can do this with the using: scope modifier.

$sb = {
    param([object[]]$Files)
    $VerbosePreference = $using:VerbosePreference
    $WhatIfPreference = $using:WhatIfPreference
    ...
}

You may have seen this used in remoting examples. The same premise works with the thread job. We're telling PowerShell to set a value for $VerbosePreference and $WhatIfPreference in the thread job scope, using the values from the global scope.

In my script block, I'm tweaking the verbose messaging. I am still using the Verbose stream, but I am formatting the string using ANSI escape sequences or $PSStyle. This allows me to color the text in the console to highlight different phases of the code execution.

Write-Verbose "`e[95m[$((Get-Date).TimeOfDay)] Spinning up a thread job to process $($files.count) file(s). WhatIf = $WhatIfPreference`e[0m"

I am also writing my own WhatIf messaging.

Write-Host "`e[96mWhat If: [$((Get-Date).TimeOfDay)] Updating $file`e[0m"

This gives my output a uniform appearance.
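If you'd rather not embed raw escape sequences, $PSStyle (built into PowerShell 7.2 and later) produces the same result and is arguably easier to read:

```powershell
# The same verbose message using $PSStyle instead of literal escape sequences
Write-Verbose "$($PSStyle.Foreground.BrightMagenta)[$((Get-Date).TimeOfDay)] Spinning up a thread job to process $($files.Count) file(s)$($PSStyle.Reset)"
```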

Figure 1. Verbose and WhatIf output

In the script block, I can process each file in the $files array. I can construct a path to the live web version and compare dates. If the file is newer and exists, I can download it.

foreach ($file in $files) {
    #construct a path to the live web version and compare dates
    $online = Join-Path -Path \\live.sysinternals.com\tools -ChildPath $file.name

    Write-Verbose "`e[93m[$((Get-Date).TimeOfDay)] Testing $online`e[0m"
    try {
        $get = Get-Item -Path $online -ErrorAction Stop
        #file found online
        $found = $True
    }
    catch {
        Write-Warning "[$((Get-Date).TimeOfDay)] $($_.Exception.Message)"
        $found = $false
    }

    if ($found -AND ($get.LastWriteTime.Date -gt $file.LastWriteTime.Date)) {
        if ($WhatIfPreference) {
            #write my own WhatIf message
            Write-Host "`e[96mWhat If: [$((Get-Date).TimeOfDay)] Updating $file`e[0m"
        }
        else {
            Write-Verbose "`e[92m[$((Get-Date).TimeOfDay)] Updating $file`e[0m"
            Copy-Item -Path $online -Destination $Path -Force
        }
    }
    else {
        Write-Verbose "`e[33m[$((Get-Date).TimeOfDay)] Skipping $file`e[0m"
    }
}

I am catching errors in case the file does not exist on the Sysinternals site or if something else goes wrong. Note that my messaging serves double duty as documentation. You should be able to read the code and understand what it does without needing to run it.

Update-Sysinternals.ps1

Here's the complete script file.

#requires -version 7.5
#requires -RunAsAdministrator
#requires -module Microsoft.PowerShell.ThreadJob

<#
  C:\Scripts\Update-SysInternals.ps1
  Download Sysinternals tools from the web to a local folder.
  It is assumed you have already downloaded the files to the destination folder.
#>

function Update-Sysinternals {
  [cmdletbinding(SupportsShouldProcess)]

  param(
    [Parameter(Position = 0)]
    [ValidateScript( { Test-Path $_ }, ErrorMessage = 'Cannot validate the path {0}.')]
    [ValidateNotNullOrEmpty()]
    [string]$Path = "$Env:OneDrive\tools"
  )

  #start a timer
  $sw = [System.Diagnostics.Stopwatch]::new()
  $sw.Start()
  Write-Verbose "`e[96m[$((Get-Date).TimeOfDay)] Starting $($MyInvocation.MyCommand)`e[0m"
  #$PSBoundParameters | Out-String | Write-Verbose

  #Ensure the WebClient service is running
  $svc = Get-Service WebClient
  switch ($svc.Status) {
    'Stopped' {
      #start the WebClient service if it is not running
      Write-Verbose "`e[92m[$((Get-Date).TimeOfDay)] Starting WebClient`e[0m"
      #always start the web client service even if using -WhatIf
      try {
        Start-Service WebClient -WhatIf:$false -ErrorAction Stop
        $Stopped = $True
      }
      catch {
        #it is possible the service start will fail
        throw "Cannot start the required WebClient service. $($_.Exception.Message)"
        #bail out
        return
      }
    }
    'Running' {
      <#
      Define a variable to indicate service was already running
      so that we don't stop it.
      #>
      $Stopped = $False
    }
    default {
      #service is in some other state
      throw "The WebClient service status is $($svc.Status). Cannot continue."
      #Bail out
    }
  } #close Switch

  #get current files in destination
  $Path = Convert-Path -Path $Path
  Write-Verbose "`e[96m[$((Get-Date).TimeOfDay)] Getting current listing of files from $Path`e[0m"
  $current = Get-ChildItem -Path $Path -File

  Write-Verbose "`e[96m[$((Get-Date).TimeOfDay)] Updating Sysinternals $($current.count) tools from \\live.sysinternals.com\tools to $Path`e[0m"

  #define the script block that will be run in a thread job
  $sb = {
    param([object[]]$Files)
    #$PSBoundParameters | Out-String | Write-Host -ForegroundColor Magenta
    $VerbosePreference = $using:VerbosePreference
    $WhatIfPreference = $using:WhatIfPreference

    #get the destination path from the first file
    $Path = $files[0] | Split-Path | Convert-Path
    #this is for demo or troubleshooting purposes. The line isn't necessary and can be commented out.
    Write-Verbose "`e[95m[$((Get-Date).TimeOfDay)] Spinning up a thread job to process $($files.count) file(s). WhatIf = $WhatIfPreference`e[0m"
    foreach ($file in $files) {
      #construct a path to the live web version and compare dates
      $online = Join-Path -Path \\live.sysinternals.com\tools -ChildPath $file.name

      Write-Verbose "`e[93m[$((Get-Date).TimeOfDay)] Testing $online`e[0m"

      try {
        $get = Get-Item -Path $online -ErrorAction Stop
        #file found online
        $found = $True
      }
      catch {
        Write-Warning "[$((Get-Date).TimeOfDay)] $($_.Exception.Message)"
        $found = $false
      }

      if ($found -AND ($get.LastWriteTime.Date -gt $file.LastWriteTime.Date)) {
        if ($WhatIfPreference) {
          #write my own WhatIf message
          Write-Host "`e[96mWhat If: [$((Get-Date).TimeOfDay)] Updating $file`e[0m"
        }
        else {
          Write-Verbose "`e[92m[$((Get-Date).TimeOfDay)] Updating $file`e[0m"
          Copy-Item -Path $online -Destination $Path -Force
        }
      }
      else {
        Write-Verbose "`e[33m[$((Get-Date).TimeOfDay)] Skipping $file`e[0m"
      }
    }
  } #close script block

  #initialize an array for thread jobs
  $j = @()

  for ($i = 0; $i -lt $current.count; $i += 10) {
    [object[]]$files = $current[$i..($i + 9)]
    #pass the file array as an input object so that they are processed as a single unit
    $j += Start-ThreadJob -ScriptBlock $sb -ArgumentList @(, $files) -StreamingHost $Host
    #add each job to the array
  }

  Write-Verbose "`e[92m[$((Get-Date).TimeOfDay)] Waiting for $($j.count) jobs to complete.`e[0m"
  #always remove the thread job
  $j | Wait-Job | Remove-Job -Force -WhatIf:$false

  $sw.stop()
  if ($Stopped) {
    Write-Verbose "`e[92m[$((Get-Date).TimeOfDay)] Stopping web client`e[0m"
    #always stop the service even if using -WhatIf
    Stop-Service WebClient -WhatIf:$False
  }

  Write-Verbose "`e[96m[$((Get-Date).TimeOfDay)] Sysinternals update complete in $($sw.Elapsed)`e[0m"
}
#EOF
Figure 2. Updating Sysinternals tools

18 seconds isn't too bad. I might be able to squeeze out a little more performance by increasing the throttle limit or adjusting the number of files in each batch. Another possibility is that there is an ideal total file size per batch: with a lot of small files, I might increase the number of files in each batch; with larger files, I might decrease it. If I were to use this code for other purposes, especially with many files to process, I might invest the time to research and test this further. I have fewer than 160 files that rarely change, so I can live with the current code.
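If I ever revisit this, one idea is to batch by cumulative file size instead of a fixed count, so each thread job gets roughly the same amount of work. A rough, untested sketch of that grouping logic, with a hypothetical 5 MB target per batch:

```powershell
# Hypothetical alternative: group files so each batch totals roughly 5 MB
$targetBytes = 5MB
$batches = [System.Collections.Generic.List[object]]::new()
$batch = @()
$size = 0

foreach ($file in (Get-ChildItem -Path $Path -File)) {
    $batch += $file
    $size += $file.Length
    if ($size -ge $targetBytes) {
        $batches.Add($batch)  # the whole batch is added as one element
        $batch = @()
        $size = 0
    }
}
if ($batch) { $batches.Add($batch) }
# each entry in $batches could then be passed to Start-ThreadJob with @(,$batch)
```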

Summary

I hope you found this journey informative. I don't expect you to have a need to manage Sysinternals tools in this way. But I do hope you learned something about using thread jobs to process data in parallel, and how to leverage PowerShell's capabilities to improve performance and efficiency.

(c) 2022-2025 JDH Information Technology Solutions, Inc. - all rights reserved