Managing Sysinternals with PowerShell
Today's newsletter is another excuse for me to use PowerShell. You may have no practical need for the code, but I am hoping you'll learn something nonetheless.
I have been using the Sysinternals tools for years, probably since they were introduced almost 30 years ago. Even though Mark Russinovich has his hands full with Azure, he still finds time to update the tools from time to time. Or at least someone does. The tools are documented at https://learn.microsoft.com/sysinternals/
Even better, Microsoft exposes the tools at https://live.sysinternals.com/tools. The address https://live.sysinternals.com/ will also work.

You can click on a tool to download it. But wouldn't it be nicer to download all of the tools at once? Maybe using PowerShell?
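Before tackling the bulk download, it's worth noting that a single tool is easy to grab over HTTPS. Here is a minimal sketch using Invoke-WebRequest; the destination path is just an example, and the URL follows the tools address shown above.
PS C:\> Invoke-WebRequest -Uri https://live.sysinternals.com/tools/whois.exe -OutFile D:\temp\whois.exe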
Downloading
You can access the web site like a file folder by using the WebClient service, which is the WebDAV client built into Windows. Most likely, this service is not running on your computer.
PS C:\> Get-Service WebClient
Status   Name               DisplayName
------   ----               -----------
Stopped  WebClient          WebClient
Assuming it hasn't been disabled or blocked by Group Policy, you should be able to start it.
PS C:\> Start-Service WebClient
This will allow you to reference the web site as if it were a UNC path.
PS C:\> $file = Join-Path "\\live.sysinternals.com\tools" -ChildPath "whois.exe"
PS C:\> Copy-Item -Path $file -Destination d:\temp -PassThru -Force
    Directory: D:\temp

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a---           4/6/2020  5:39 AM         398712 whois.exe
Of course, I want to download all of the tools and related files. Here is a PowerShell script that will do just that.
#requires -version 5.1

# Download Sysinternals tools from web to a local folder
param(
    [Parameter(
        Position = 0,
        Mandatory,
        HelpMessage = "Enter the name or path to the destination folder. It will be created if it doesn't exist."
    )]
    [ValidateNotNullOrEmpty()]
    [string]$Path
)

if (-not (Test-Path $Path)) {
    New-Item -Path $Path -ItemType Directory
}

try {
    $svc = Get-Service -Name WebClient -ErrorAction Stop
}
catch {
    Write-Warning "Failed to get the WebClient service. Cannot continue. $($_.Exception.Message)"
    #bail out
    return
}

if ($svc.Status -eq 'Running') {
    <#
    Define a variable to indicate service was already running
    so that we don't stop it.
    #>
    $Stopped = $False
}
else {
    #start the WebClient service if it is not running
    Write-Host 'Starting WebClient' -ForegroundColor Magenta
    try {
        Start-Service -Name WebClient -ErrorAction Stop
        $Stopped = $True
    }
    catch {
        Write-Warning "Failed to start the WebClient service. Cannot continue. $($_.Exception.Message)"
        #bail out
        return
    }
}

Write-Host "Downloading Sysinternals tools from \\live.sysinternals.com\tools to $Path" -ForegroundColor Cyan

#start a timer
$sw = [System.Diagnostics.Stopwatch]::new()
$sw.Start()

$files = Get-ChildItem -Path \\live.sysinternals.com\tools -File

#there may be null characters in the path which will cause problems
foreach ($file in $files) {
    $file = $file -replace "`0", ""
    Copy-Item -Path $file -Destination $Path -PassThru -Force
}

if ($Stopped) {
    Write-Host 'Stopping the web client' -ForegroundColor Magenta
    Stop-Service WebClient
}

$sw.Stop()
Write-Host "Sysinternals download complete in $($sw.Elapsed)" -ForegroundColor Cyan
I have inserted Write-Host statements to indicate the progress of the script. You can remove them if you want a cleaner output.
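If you would rather keep the progress messages without cluttering normal output, one alternative is Write-Verbose. This is only a sketch of that pattern, not part of my script; with a [CmdletBinding()] attribute on the param() block, the messages appear only when you run the script with -Verbose.
[CmdletBinding()]
param(
    [Parameter(Position = 0, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]$Path
)
#this line only produces output when the script is run with -Verbose
Write-Verbose "Downloading Sysinternals tools from \\live.sysinternals.com\tools to $Path"
Either way, here is what a run of my script looks like.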
PS C:\> c:\scripts\download-SysInternals2.ps1 -Path D:\temp\tools\
Starting WebClient
Downloading Sysinternals tools from \\live.sysinternals.com\tools to D:\temp\tools\
    Directory: D:\temp\tools

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a---          5/11/2022 12:49 PM        1468320 accesschk.exe
-a---          5/11/2022 12:49 PM         810416 accesschk64.exe
-a---          9/29/2022  4:25 PM         264088 AccessEnum.exe
-a---         11/28/2022 12:36 PM          50379 AdExplorer.chm
...
If you get an error like "The file size exceeds the limit allowed and cannot be saved: '\\live.sysinternals.com\tools\RDCMan.exe'", you won't be able to use the WebClient service to download that file. You will need to download the file from the web site another way; see the sketch after this output listing for one option.
My script uses a Stopwatch object to measure how long it takes to run.
...
-a---          1/27/2022  3:56 PM        1781632 Winobj64.exe
-a---         12/16/2024  8:12 AM        1676848 ZoomIt.exe
-a---         12/16/2024  8:12 AM         897568 ZoomIt64.exe
Stopping the web client
Sysinternals download complete in 00:04:22.4773172
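For the handful of files that hit that size limit, you can fall back to the HTTPS download shown at the start of this article. Here's a sketch using RDCMan.exe, the file from the error message, with the destination folder from my run as an example:
PS C:\> Invoke-WebRequest -Uri https://live.sysinternals.com/tools/RDCMan.exe -OutFile D:\temp\tools\RDCMan.exe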
Wouldn't it be nice to have this run faster? Maybe download files in parallel?
In PowerShell 7, I could use ForEach-Object -Parallel to download files in parallel. However, this incurs an overhead cost of creating a new runspace for each file, which might negate any performance gains. Instead, I can use Start-ThreadJob to create a thread job for each file. This allows me to download files in parallel without the overhead of creating a new runspace. Each job runs in its own thread.
$jobs = @()
$sb = {
    param([string]$File, $Destination)
    Write-Host "[$((Get-Date).TimeOfDay)] Downloading $File" -ForegroundColor Cyan
    Copy-Item -Path $File -Destination $Destination -Force
}
foreach ($file in $files) {
    #there may be null characters in the path which will cause problems
    $file = $file -replace "`0", ''
    $jobs += Start-ThreadJob -ScriptBlock $sb -ArgumentList $file, $Path -StreamingHost $Host
}
I'm also taking advantage of the StreamingHost parameter to display the Write-Host output as the jobs run.
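Without StreamingHost, the Write-Host output is buffered inside each job, and you would retrieve it yourself when collecting the jobs, along the lines of this sketch:
#without -StreamingHost, collect each job's output when it finishes
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job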
Here's the revised script.
#requires -version 5.1
#requires -Modules Microsoft.PowerShell.ThreadJob

# Download Sysinternals tools from web to a local folder using thread jobs
param(
    [Parameter(
        Position = 0,
        Mandatory,
        HelpMessage = "Enter the name or path to the destination folder. It will be created if it doesn't exist."
    )]
    [ValidateNotNullOrEmpty()]
    [string]$Path
)

if (-not (Test-Path $Path)) {
    New-Item -Path $Path -ItemType Directory
}

try {
    $svc = Get-Service -Name WebClient -ErrorAction Stop
}
catch {
    Write-Warning "Failed to get the WebClient service. Cannot continue. $($_.Exception.Message)"
    #bail out
    return
}

if ($svc.Status -eq 'Running') {
    <#
    Define a variable to indicate service was already running
    so that we don't stop it.
    #>
    $Stopped = $False
}
else {
    #start the WebClient service if it is not running
    Write-Host 'Starting WebClient' -ForegroundColor Magenta
    try {
        Start-Service -Name WebClient -ErrorAction Stop
        $Stopped = $True
    }
    catch {
        Write-Warning "Failed to start the WebClient service. Cannot continue. $($_.Exception.Message)"
        #bail out
        return
    }
}

Write-Host "Downloading Sysinternals tools from \\live.sysinternals.com\tools to $Path" -ForegroundColor Cyan

#start a timer
$sw = [System.Diagnostics.Stopwatch]::new()
$sw.Start()

#filter out larger files which can't be downloaded
$files = Get-ChildItem -Path \\live.sysinternals.com\tools -File |
    Where-Object { $_.Length -lt 10MB }

#download each file in a thread job
$jobs = @()
$sb = {
    param([string]$File, $Destination)
    Write-Host "[$((Get-Date).TimeOfDay)] Downloading $File" -ForegroundColor Cyan
    Copy-Item -Path $File -Destination $Destination -Force
}
foreach ($file in $files) {
    #there may be null characters in the path which will cause problems
    $file = $file -replace "`0", ''
    $jobs += Start-ThreadJob -ScriptBlock $sb -ArgumentList $file, $Path -StreamingHost $Host
}

#wait for jobs to complete and then clean up
$jobs | Wait-Job | Remove-Job

if ($Stopped) {
    Write-Host 'Stopping the web client' -ForegroundColor Magenta
    Stop-Service WebClient
}

$sw.Stop()
Write-Host "Sysinternals download complete in $($sw.Elapsed)" -ForegroundColor Cyan
This runs considerably faster.
PS C:\> c:\scripts\download-SysInternals3.ps1 -Path D:\temp\tools2\
Starting WebClient
Downloading Sysinternals tools from \\live.sysinternals.com\tools to D:\temp\tools2\
[14:23:31.5230920] Downloading \\live.sysinternals.com\tools\accesschk.exe
[14:23:31.5651650] Downloading \\live.sysinternals.com\tools\accesschk64.exe
[14:23:31.6043804] Downloading \\live.sysinternals.com\tools\AccessEnum.exe
[14:23:31.6561763] Downloading \\live.sysinternals.com\tools\AdExplorer.chm
[14:23:31.6914218] Downloading \\live.sysinternals.com\tools\ADExplorer.exe
...
[14:24:07.3568224] Downloading \\live.sysinternals.com\tools\Winobj64.exe
[14:24:07.3922646] Downloading \\live.sysinternals.com\tools\ZoomIt.exe
[14:24:07.6291258] Downloading \\live.sysinternals.com\tools\ZoomIt64.exe
Stopping the web client
Sysinternals download complete in 00:01:43.6805029
I can probably eke out a little more performance by increasing the -ThrottleLimit parameter of Start-ThreadJob. This controls how many jobs can run in parallel; the default is 5. I set it to 10 in my script. I also removed the Write-Host output.
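Based on that description, the thread job call would look something like this sketch, with the Write-Host line also removed from the script block and -StreamingHost no longer needed:
$jobs += Start-ThreadJob -ScriptBlock $sb -ArgumentList $file, $Path -ThrottleLimit 10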
PS C:\> c:\scripts\download-SysInternals3.ps1 -Path D:\temp\tools3\
    Directory: D:\temp

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
d-----          8/1/2025   2:35 PM                tools3
Starting WebClient
Downloading Sysinternals tools from \\live.sysinternals.com\tools to D:\temp\tools3\
Stopping the web client
Sysinternals download complete in 00:00:18.2913703
This time I downloaded 161 files in about 18 seconds.
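If you want to verify the result, a quick file count of the destination folder does the trick. D:\temp\tools3 is the path from my run:
PS C:\> (Get-ChildItem -Path D:\temp\tools3 -File | Measure-Object).Count
161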
Summary
As I mentioned in the introduction, you may not have a practical need for the code I shared. But hopefully you can apply the techniques and concepts to your own work. Next time, I want to show you how I update my Sysinternals folder. From a practical perspective, it would be just as easy to re-download the entire folder using my thread job script. However, every task is a learning opportunity and I hate to see those go to waste.