Creating a Monitoring Service with PowerShell
In this issue:
Over the last several weeks, we learned about using event subscriptions to monitor events on a local or remote computer. One potential drawback is that if you close the PowerShell session where you created the event subscription, the subscription is removed, as are any events stored in the event queue.
Today, I thought I'd share a solution that I use to create a more semi-permanent monitoring environment. This only makes sense if you intend to use an action script block and respond to events as they occur. My solution is also not for situations where you need to interact with the event subscriber or the events themselves. You could think of it as a pseudo-service. Let's take a look.
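To set expectations, the general idea is to launch the monitoring script in its own long-running, hidden PowerShell process. Here's a rough sketch of that launch, using a placeholder script path that I made up for illustration:

```powershell
# One way to host a monitoring script in its own hidden, persistent process.
# The script path is a placeholder for this sketch.
Start-Process -FilePath powershell.exe -WindowStyle Hidden -ArgumentList '-NoProfile -NoExit -File C:\Scripts\FileWatch.ps1'
```

The -NoExit switch keeps the process, and therefore the event subscription, alive after the script finishes running.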
Creating the Monitoring Script
The first step is to create a PowerShell script that contains the event subscription or subscriptions. There's no reason you can't create multiple subscriptions and event handlers in the same script. For my demonstration, I will keep it simple. I want to monitor file changes in a given folder and log information to a CSV file. I can use the data in the CSV file later to create an incremental backup. That's not the focus of this article, though.
I'll define a variable for the path to monitor.
$Path = "C:\Scripts"
Based on earlier articles, you know there are a few ways I can monitor file changes. For this script, I am going to use a FileSystemWatcher object.
$fsw = [System.IO.FileSystemWatcher]::New($Path)
$fsw.NotifyFilter = "LastWrite"
$fsw.EnableRaisingEvents = $true
$fsw.IncludeSubdirectories = $true
I am only interested in changes to files, so I will monitor the Changed event when I get to creating the event subscription. Because I am using the information to plan for a backup, I only care about changes where the file's last write time is updated.
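To make that intent concrete, the subscription I'll eventually create looks something like this sketch, assuming the $action script block defined below and a source identifier name of my own choosing:

```powershell
# Subscribe to the watcher's Changed event and run $action for each event.
# 'FSWatch' is an arbitrary source identifier used for this sketch.
Register-ObjectEvent -InputObject $fsw -EventName Changed -SourceIdentifier FSWatch -Action $action
```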
Defining the Event Handler
When an event fires, I want to capture the information to a CSV file.
$action = {
$csv = "c:\logs\FileWatch.csv"
...
The FileSystemWatcher returns a very small subset of information about the file. All I really need is the path, but I want to capture a little more information, such as the file size. This means I need to get the file with Get-Item.
Try {
$file = Get-Item -Path $event.SourceEventArgs.FullPath -Force -ErrorAction Stop
...
} #try
Catch {
#ignore if the file can't be found, it may already be deleted
#in which case I don't need to log it.
}
I want to point out a few details. I am using -Force because I need to capture hidden and system files. The FileSystemWatcher will detect hidden file changes, but Get-Item will not return them unless I use -Force. I am also using -ErrorAction Stop so that if the file is deleted before I can get it, I can handle the error in the Catch block. Although in reality, I don't need to do anything in the Catch block. If the file is deleted, there is nothing to back up, so there is nothing to log.
Duplicate Logic
Now for a tricky bit. Because of the way the FileSystemWatcher works, I may get multiple events for a single file change. I am trying to mitigate this by using a single notify filter, but I need to do more. When testing with events in the event queue, I noticed that I would get successive events for the same file, usually within a few milliseconds. I figured that I could compare the file in the current event with the one from the previous event: if it is the same file, ignore it; otherwise, continue to process the event.
I can use Compare-Object to compare the Length and LastWriteTime properties of the two files. If they are the same, I can ignore the event. To do this, I need to store the last file in a global variable.
if ($global:last) {
    <#
    Compare this event to the most recent event to see if the file information has changed
    and warrants logging.
    #>
    if (-Not (Compare-Object -ReferenceObject $global:last -DifferenceObject $file -Property Length,LastWriteTime)) {
        #bail out and don't do anything
        return
    }
}
I'll update $global:last at the end of the action script block, after I have processed the event.
$global:last = $file
Creating the CSV Log Entry
If you recall, the FileSystemWatcher will also detect changes to folders, which I don't care about. I also have a few file extensions that I use for temporary files that never need to be backed up. I can filter those out in the action script block.
if (($file -is [System.IO.FileInfo]) -AND ($file.Extension -NotMatch "adoc|foo")) {
...
If the detected file passes this check, then I can create a custom object with the information I want to log and append it to the CSV file.
[PSCustomObject]@{
    Date          = $event.TimeGenerated
    Directory     = $file.Directory
    Fullname      = $file.Fullname
    Name          = $file.Name
    Size          = $file.Length
    LastWriteTime = $file.LastWriteTime
} | Export-Csv -Path $csv -Append -Encoding UTF8
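Later, when it is time to build the incremental backup, I can read the log back in and reduce it to the set of files that changed. This is only a sketch of that step; the cutoff date is a hypothetical value:

```powershell
# Get the unique files changed since the last backup from the log.
# The cutoff date is purely illustrative.
$lastBackup = Get-Date "1/15/2023"
Import-Csv -Path C:\logs\FileWatch.csv |
    Where-Object { [datetime]$_.Date -ge $lastBackup } |
    Sort-Object -Property Fullname -Unique
```

Because Export-Csv stores everything as strings, the Date property needs to be cast back to a [datetime] before comparing.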