ConvertTo-HTML and XML – Better together

The PowerShell ConvertTo-HTML cmdlet is a great tool for creating nicely-formatted reports from PowerShell objects. By default it converts objects passed to it on the pipeline into an HTML table inside an HTML document.

One of the slightly frustrating things about the table produced by ConvertTo-HTML is that it is very vanilla. There are no styling options.

You can use the -CssUri parameter of ConvertTo-HTML to reference an external CSS style sheet. This should allow the use of CSS classes or IDs to style elements, but the output from ConvertTo-HTML doesn’t assign classes to any of the elements it generates. If only there were some way to assign classes… Enter XML.

HTML doesn’t have to be XML. See my previous article for that. Fortunately the HTML produced by ConvertTo-HTML when used with the -Fragment switch parameter generates text that can be coerced into an XML object. From there we can use XPath to post-process the table.
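To see what that looks like, here’s a minimal sketch using a couple of stand-in objects (the service names are invented for illustration):

```powershell
# The fragment emitted by ConvertTo-Html is an array of strings, but it is
# well formed, so PowerShell can coerce it into an XML document.
$rows = [pscustomobject]@{ Name = 'Spooler'; Status = 'Stopped' },
        [pscustomobject]@{ Name = 'WinRM';   Status = 'Running' }
[xml]$table = ($rows | ConvertTo-Html -Fragment) -join "`n"
$table.SelectNodes('//td').Count   # → 4 (two cells per row, two rows)
```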

Enough theory. Here’s a practical example.

Say we want to create an HTML report on the state of the services on our PC using PowerShell. We want to group the services by StartType and highlight any services with the ‘Manual’ StartType that are running, and any with the ‘Automatic’ StartType that are stopped.

First let’s get the data:

$services = Get-Service | Group-Object -Property StartType -AsHashtable -AsString

This code grabs all of the services on our machine, groups them by StartType and stores them in a hashtable. Next we take each group of services and create an HTML table for it.

$automatictable = $services.Automatic | Select-Object ServiceName,DisplayName,Status | ConvertTo-HTML -Fragment
$disabledtable = $services.Disabled | Select-Object ServiceName,DisplayName,Status | ConvertTo-HTML -Fragment
$manualtable = $services.Manual | Select-Object ServiceName,DisplayName,Status | ConvertTo-HTML -Fragment

We can create an HTML document containing the tables like so:

ConvertTo-HTML -Body "$automatictable $disabledtable $manualtable" | Out-File -FilePath 'servicereport.html'

When we view servicereport.html using a browser we get something like this:

Basic table without CSS styling

Not very inspiring, is it? On to the styling!

We’re going to embed the CSS stylesheet into the HTML report file, in order to avoid any external dependencies or security problems with CSS files on the local filesystem.

As it stands, we can only select elements to style based on the element types themselves, so the first version of the stylesheet looks like this:

body {
    font-family: 'Segoe UI', Arial, Helvetica, sans-serif;
}
table {
    table-layout: fixed;
    width: 100%;
    white-space: nowrap;
}
td {
    white-space: nowrap;
    overflow: hidden;
    text-overflow: ellipsis;
}
th {
    background: darkblue;
    color: white;
}
td, th {
    text-align: left;
    padding: 5px 10px;
}
tr:nth-child(even) {
    background: lightblue;
}

We can embed the stylesheet in the HTML file using a here-string in PowerShell. Now we have a slightly better-looking report but we can’t highlight those outliers.
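As a sketch, the embedded stylesheet might be built like this (the CSS is abridged; $style is the variable used in the final command later in this post):

```powershell
# Wrap the stylesheet in a <style> element, ready for the -Body parameter
$style = @"
<style>
body { font-family: 'Segoe UI', Arial, Helvetica, sans-serif; }
th { background: darkblue; color: white; }
td, th { text-align: left; padding: 5px 10px; }
tr:nth-child(even) { background: lightblue; }
</style>
"@
```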

Basic table with CSS styling

For that we need CSS classes and XPath. Let’s start with the table of services with a StartType of ‘Automatic’ – it’s stored in the variable $automatictable. If we use GetType we can see that although this is well-formed HTML, PowerShell just considers it an array of Strings.


We can do something about this.

[xml]$automatictable = $automatictable

Here $automatictable is coerced into an XML document. GetType confirms this.


We now have access to all of the methods and properties of the System.Xml.XmlDocument class, so we can use XPath to select specific TD elements of a table, such as those containing ‘Stopped’.
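For example, against a hand-built stand-in for one of our tables:

```powershell
# A hypothetical fragment standing in for $automatictable after the [xml] cast
[xml]$table = @'
<table>
<tr><th>ServiceName</th><th>DisplayName</th><th>Status</th></tr>
<tr><td>Spooler</td><td>Print Spooler</td><td>Stopped</td></tr>
<tr><td>WinRM</td><td>Windows Remote Management</td><td>Running</td></tr>
</table>
'@
# Select every td element whose text is 'Stopped'
$table.SelectNodes('//td[. = "Stopped"]').Count   # → 1
```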


This could return any TDs with a value of ‘Stopped’. Let’s make sure we’re only looking at the 3rd TD.
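A positional predicate in the XPath takes care of that. A sketch, with contrived data where the first row mentions ‘Stopped’ outside the Status column:

```powershell
[xml]$table = @'
<table>
<tr><td>Stopped</td><td>A service named Stopped</td><td>Running</td></tr>
<tr><td>Spooler</td><td>Print Spooler</td><td>Stopped</td></tr>
</table>
'@
$table.SelectNodes('//td[. = "Stopped"]').Count         # matches both rows → 2
$table.SelectNodes('//tr/td[3][. = "Stopped"]').Count   # third cell only → 1
```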


Using this mechanism we can set a class attribute to style the appropriate TDs.
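Something along these lines (SelectNodes returns the matching elements, and SetAttribute adds the class):

```powershell
[xml]$table = @'
<table>
<tr><td>Spooler</td><td>Print Spooler</td><td>Stopped</td></tr>
</table>
'@
# Tag every matching Status cell with the back-red class
foreach ($td in $table.SelectNodes('//tr/td[3][. = "Stopped"]')) {
    $td.SetAttribute('class', 'back-red')
}
$table.OuterXml   # the matching td now carries class="back-red"
```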


Or even every TD in the row:
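A sketch of the row-level variant: select the tr whose third cell reads ‘Stopped’, then style each of its cells:

```powershell
[xml]$table = @'
<table>
<tr><td>Spooler</td><td>Print Spooler</td><td>Stopped</td></tr>
<tr><td>WinRM</td><td>Windows Remote Management</td><td>Running</td></tr>
</table>
'@
# Select every td in any row whose third td is 'Stopped'
foreach ($td in $table.SelectNodes('//tr[td[3] = "Stopped"]/td')) {
    $td.SetAttribute('class', 'back-red')
}
$table.SelectNodes('//td[@class = "back-red"]').Count   # → 3
```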


All we need to do now is add a matching class to our CSS stylesheet:

.back-red {
     background: red;
     color: white;
}
Using the same approach we can style the $manualtable table to highlight running services in green.

When writing the XML versions of our tables into the HTML body for use by ConvertTo-HTML we need to use the XmlDocument’s OuterXml property, so our command line to create the HTML file now looks like this:

ConvertTo-HTML -Body "$style $($automatictable.OuterXml) $($disabledtable.OuterXml) $($manualtable.OuterXml)" | Out-File -FilePath 'servicereport.html'

The end result looks closer to what I was aiming for:

Table with TD highlighting

Having assembled all of the techniques we need we can turn this into a script. Have a look at this Gist for a sample of what can be done, as shown below:

Advanced table with TD highlighting

This approach to generating HTML from a script can be used with any kind of PowerShell objects, not just services. This article should have given you an insight into what is possible. Let me know if this article has helped in any way.

Additional reading

System.Xml.XmlDocument .Net class
W3Schools HTML Tables
W3Schools CSS
W3Schools XPath

Posted in PowerShell | Leave a comment

Formatting XML Output using PowerShell

Like many people I have to deal with XML output from commands. My main source of pain has been the toXML method of Windows Event Log events, which gives lots of useful information but is hard on the eye. Using the following PowerShell:

> $e = Get-WinEvent -LogName Security -FilterXPath "*[EventData[Data[@Name='TargetUserName']='Angelique.Cortez']]" -MaxEvents 20
> $e[-1].toXML()

Gives the following output:

<Event xmlns=''><System><Provider
 Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-A5BA-
<Correlation/><Execution ProcessID='600' ThreadID='2176'/><Channel>Security</Channel>
<Data Name='Dummy'>-</Data><Data Name='TargetUserName'>Angelique.Cortez</Data><Data
 Name='TargetDomainName'>CARISBROOKELABS</Data><Data Name='TargetSid'>S-1-5-21-447422785-
3715515833-3878445295-1308</Data><Data Name='SubjectUserSid'>S-1-5-21-447422785-
3715515833-3878445295-500</Data><Data Name='SubjectUserName'>Administrator</Data><Data
 Name='SubjectLogonId'>0x8eb4d</Data><Data Name='PrivilegeList'>-</Data><Data
 Name='SamAccountName'>-</Data><Data Name='DisplayName'>-</Data><Data
 Name='UserPrincipalName'>-</Data><Data Name='HomeDirectory'>-</Data><Data
 Name='HomePath'>-</Data><Data Name='ScriptPath'>-</Data><Data Name='ProfilePath'>-
</Data><Data Name='UserWorkstations'>-</Data><Data Name='PasswordLastSet'>-</Data><Data
 Name='AccountExpires'>-</Data><Data Name='PrimaryGroupId'>-</Data><Data
 Name='AllowedToDelegateTo'>-</Data><Data Name='OldUacValue'>-</Data><Data
 Name='NewUacValue'>-</Data><Data Name='UserAccountControl'>-</Data><Data
 Name='UserParameters'>-</Data><Data Name='SidHistory'>-</Data><Data Name='LogonHours'>-

It’s not easy to find the particular XML node or attribute you are looking for. One great shortcut is to coerce the output (which is a string object) into an XML object, then use the Save method to output to the console.
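A sketch of the shortcut (the event XML here is a made-up stub standing in for the real toXML output):

```powershell
# Coerce the string into an XmlDocument, then Save() it to the console;
# Save() re-emits the XML with indentation applied.
$xmlText = '<Event><System><Provider Name="Demo"/></System></Event>'
([xml]$xmlText).Save([Console]::Out)
```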


This produces nicely-formatted output. Unfortunately this doesn’t work when you are connected to a computer using PowerShell remoting: the [Console] device isn’t there for you. If you spend any amount of time in remote PowerShell sessions this can become a little annoying. Another solution is required.

Some examples on the web show the use of the System.XMl.XmlTextWriter class to output to the screen. Interestingly, the Microsoft Docs entry for XmlTextWriter carries the following note:

Starting with the .NET Framework 2.0, we recommend that you create XmlWriter instances by using the XmlWriter.Create method and the XmlWriterSettings class to take advantage of new functionality.

Microsoft Docs – XMLTextWriter

So here’s an updated function to do just that, Format-XMLText (also available as a gist):

Function Format-XMLText {
    Param (
        [Parameter(ValueFromPipeline = $true)]
        [xml]$XmlText
    )
    Process {
        # Use a StringWriter, an XMLWriter and an XMLWriterSettings to format XML
        $stringWriter = New-Object System.IO.StringWriter
        $stringWriterSettings = New-Object System.Xml.XmlWriterSettings

        # Turn on indentation
        $stringWriterSettings.Indent = $true

        # Turn off XML declaration
        $stringWriterSettings.OmitXmlDeclaration = $true

        # Create the XMLWriter from the StringWriter
        $xmlWriter = [System.Xml.XmlWriter]::Create($stringWriter,$stringWriterSettings)

        # Write the XML using the XMLWriter
        $XmlText.WriteContentTo($xmlWriter)

        # Don't forget to flush!
        $xmlWriter.Flush()
        $stringWriter.Flush()

        # Output the text
        # This works in a remote session, when [Console]::Out doesn't
        $stringWriter.ToString()
    }
}
So now we can pipe the results of toXML to Format-XMLText:

> $e = Get-WinEvent -LogName Security -FilterXPath "*[EventData[Data[@Name='TargetUserName']='Angelique.Cortez']]" -MaxEvents 20
> $e[-1].toXML() | Format-XMLText

The output from Format-XMLText looks like this:

<Event xmlns="">
    <System>
        <Provider Name="Microsoft-Windows-Security-Auditing" Guid="{54849625-5478-4994-A5BA-3E3B0328C30D}" />
        <TimeCreated SystemTime="2019-10-08T08:45:56.119393800Z" />
        <Correlation />
        <Execution ProcessID="600" ThreadID="2176" />
        <Channel>Security</Channel>
        <Security />
    </System>
    <EventData>
        <Data Name="Dummy">-</Data>
        <Data Name="TargetUserName">Angelique.Cortez</Data>
        <Data Name="TargetDomainName">CARISBROOKELABS</Data>
        <Data Name="TargetSid">S-1-5-21-447422785-3715515833-3878445295-1308</Data>
        <Data Name="SubjectUserSid">S-1-5-21-447422785-3715515833-3878445295-500</Data>
        <Data Name="SubjectUserName">Administrator</Data>
        <Data Name="SubjectDomainName">CARISBROOKELABS</Data>
        <Data Name="SubjectLogonId">0x8eb4d</Data>
        <Data Name="PrivilegeList">-</Data>
        <Data Name="SamAccountName">-</Data>
        <Data Name="DisplayName">-</Data>
        <Data Name="UserPrincipalName">-</Data>
        <Data Name="HomeDirectory">-</Data>
        <Data Name="HomePath">-</Data>
        <Data Name="ScriptPath">-</Data>
        <Data Name="ProfilePath">-</Data>
        <Data Name="UserWorkstations">-</Data>
        <Data Name="PasswordLastSet">-</Data>
        <Data Name="AccountExpires">-</Data>
        <Data Name="PrimaryGroupId">-</Data>
        <Data Name="AllowedToDelegateTo">-</Data>
        <Data Name="OldUacValue">-</Data>
        <Data Name="NewUacValue">-</Data>
        <Data Name="UserAccountControl">-</Data>
        <Data Name="UserParameters">-</Data>
        <Data Name="SidHistory">-</Data>
        <Data Name="LogonHours">-</Data>
    </EventData>
</Event>

I think you’ll agree that the output is much easier for the human eye to parse. This function works inside a remoting session and can handle multiple XML objects via the pipeline.

Let me know if this post has helped you in any way.

Posted in PowerShell | Leave a comment

Who’s The Host?

A question was asked on r/PowerShell about how to tell whether a script was running within an instance of the Console or the Integrated Scripting Environment.

A little research identified that the Get-Host cmdlet can provide the required information via the Name property.

Interestingly, Get-Host can also identify when a script is running under Visual Studio Code or via the System.Management.Automation.PowerShell class.
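A minimal check might look like this:

```powershell
# Get-Host exposes the hosting application via its Name property
switch ((Get-Host).Name) {
    'ConsoleHost'                 { 'Running in the console' }
    'Windows PowerShell ISE Host' { 'Running in the ISE' }
    'Visual Studio Code Host'     { 'Running in Visual Studio Code' }
    'Default Host'                { 'Running via the PowerShell class' }
    default                       { "Running in $_" }
}
```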

Host                                     Host Name
Console                                  ConsoleHost
Integrated Scripting Environment         Windows PowerShell ISE Host
Visual Studio Code Integrated Console    Visual Studio Code Host
System.Management.Automation.PowerShell  Default Host

Posted in PowerShell | Leave a comment

Validating XHTML using PowerShell and .Net

XML and HTML both originally derived from Standard Generalized Markup Language (SGML) but they have a slightly difficult relationship. It is possible to create HTML that isn’t XML, but Web browsers will read and accept it.

Later versions of HTML (or XHTML) were developed to be more easily processed by XML tools. HTML5, the most recent version of HTML, is now its own entity, not necessarily conforming to previous XML or HTML standards. W3 Schools has an interesting history page for HTML5.

Using XHTML allows you to use automation techniques when building HTML artefacts, such as validating them. These artefacts don’t need to be HTML pages as such; they could be emails or content streams for other applications.

My interest in all of this stems from the fact that Microsoft OneNote uses XHTML when you interact with Notes using the Microsoft Graph API. I wanted to be able to validate the XHTML before I sent it to OneNote using my PowerShell module, and I assumed that it would be easy to use a schema file using PowerShell and the .Net Framework (.Net).

I was nearly right.


To make HTML act as if it’s XML you need to make it well formed. This means that tags can’t overlap, and must always be closed (including ’empty’ tags). Element and attribute names must be in lower case. Attribute values must always be quoted. For a full list of criteria see the W3C page on the differences between XHTML and HTML4. Once these issues are sorted, you can use XML tools to manipulate HTML.
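A quick demonstration of the difference:

```powershell
# '<br>' is legal HTML but not well-formed XML, so the [xml] cast fails;
# close the empty tag XHTML-style and the cast succeeds.
try {
    $null = [xml]'<p>plain html<br></p>'
} catch {
    'Not well formed - cast failed'
}
$doc = [xml]'<p>xhtml style<br/></p>'
$doc.DocumentElement.Name   # → p
```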


OneNoteUtilitiesGraph (OUG) is a project of mine on GitHub. It’s a PowerShell module designed to manipulate data stored in OneNote using the Microsoft Graph.

One of the capabilities I’ve been developing is creating content using OUG and accessing it consistently.

OneNote uses what it calls input and output html. This is a subset of HTML tags with additional attributes designed specifically for use managing OneNote content.

My interest in validation came from wanting to sanity check my OneNote input HTML.

Schemas and Validation

There are a number of XHTML schemas you can use to validate your XHTML.
These are available from the World Wide Web Consortium (W3C) web site, so you can access them from your PowerShell scripts.

As usual PowerShell works with .Net to help us do the validation. We can build an XmlDocument object $document and add a schema to it, then call the Validate method. Here’s a sample XHTML document:

<html xmlns="" lang="en" xml:lang="en">
    <head>
        <title>Title of document</title>
    </head>
    <body>
        <p>some content</p> <div>hello</div>
        <h7>a heading?</h7>
        <p data-id='hello' id=''>
            <img src='img00100.jpg' alt='image'/>
        </p>
    </body>
</html>

It has a few issues that validation may help to identify.

Notice that in the following script I’ve supplied a script block to the Validate method that handles each error as it occurs. This is known as a callback or a delegate. I’ve learned that in C# you would provide the name of a function to be called each time a validation error occurs. PowerShell allows us to use a script block. This can be inline, as below.

I’ve cast the Schemas.Add method call to [Void], so that its output doesn’t appear on screen.

$document = New-Object System.Xml.XmlDocument
[Void]$document.Schemas.Add("", "")
$document.Load('sample.xhtml') # load the sample document above (path is illustrative)
$document.Validate({Write-Host $args[1].Message})

When the above code is run in a script I get the following results:

The 'data-id' attribute is not declared.
The 'id' attribute is invalid - The value '' is invalid according to its datatype '' - The empty string '' is not a valid name.
The element 'body' in namespace '' has invalid child element 'h7' in namespace ''. List of possible elements expected: 'p, h1, h2, h3, h4, h5, h6, div, ul, ol, dl, pre, hr, blockquote, address, fieldset, table, form, noscript, ins, del, script' in namespace ''.

All of this error text has been generated by the validation process, from the schema file. The result is quite verbose, but also helpful. data-id is an attribute used by OneNote, so using this schema may not be as straightforward as I thought.

If had wanted to have a more complex ValidationEventHandler routine I could have created a script block using a here-string like the one below:

$code = @'
param($sender, [System.Xml.Schema.ValidationEventArgs]$evtArgs)
if ($evtArgs.Severity -eq [System.Xml.Schema.XmlSeverityType]::Warning) {
    Write-Host "`nWARNING: "
} elseif ($evtArgs.Severity -eq [System.Xml.Schema.XmlSeverityType]::Error) {
    Write-Host "`nERROR: "
}
Write-Host $evtArgs.Message
'@
$validationEventHandler = [ScriptBlock]::Create($code)

I would then put a reference to $validationEventHandler in the Validate method call, as in:

$document.Validate($validationEventHandler)
The script block stored in the $validationEventHandler variable is called every time a ValidationEvent occurs.

Performance Issues

Performance is an issue using the primary source schema documents. Downloading and processing them can take seconds. This is fine for one-off validations but users get twitchy if things appear to ‘hang’. Maybe there’s a better way?

A Possible Workaround

I can download the XHTML schema using:

Invoke-WebRequest '' -OutFile xhtml1-strict.xsd

and load it from the local file system using:

$document.Schemas.Add("", "xhtml1-strict.xsd")

This works OK, but now I’ve got a local dependency on an .xsd file that I don’t own.

A Solution?

Having worked out how to download and use schema files from the web, it now looks like a local schema file might be the best solution. This could be tailored for the OneNote version of HTML.

This could be a longer project! Expect a further blog post on this.

Going through the process of working out how schemas and validation works has been educational.

What have I learned?

New knowledge I have gained includes:

  • Schemas
    How to download them, add them and use them to validate.
  • Validation
    How to make it work for XML/XHTML documents.
  • Callbacks and Delegates
    How to specify a delegate or callback using an inline script block or a reference to one created from a here-string.
  • HTML5
    Isn’t SGML or XML or old-style HTML.

Wrapping up

So you can use PowerShell and .Net to validate XML or XHTML documents using schemas. Unfortunately I can’t use the publicly available schemas to validate OneNote input HTML directly. Hopefully these notes will help anyone else looking to use PowerShell to validate XHTML or XML documents using schemas. Please let me know how you get on.


OneNoteUtilitiesGraph on GitHub
HTML5 Intro on W3 Schools
XML Document Class – Validate Method on Microsoft Docs
XHTML Schemas on the W3C site
Creating OneNote Pages on Microsoft Docs

Posted in Uncategorized | Tagged , | 1 Comment

Adam the Automator and Friends

I’ve recently joined Adam Bertram on his Adam the Automator site as a contributor. My ATA page has a list of my articles which is automatically updated as I produce new content. This doesn’t mean the end of the Mad Scientist blog; I will be producing more commercial content for ATA – my personal musings, e-books etc. will continue to appear here.

Posted in Uncategorized | Leave a comment

I have published a Kindle book – PowerShell and Windows Event Logs

Front Cover

I’m happy to announce that my second Kindle book is published!

As the title suggests, PowerShell and Windows Event Logs covers accessing and managing Windows Event Logs using PowerShell.

As well as covering the two families of eventlog cmdlets available in Windows I also introduce the underlying .Net classes and how they can be used, along with our old friend WMI. I also talk a little bit about the history of Windows Event Logs (yes there is a screenshot of Windows NT 3.51).

There is information on using XML and XPath to filter for the records you are interested in, as well as an approach to extracting data from event logs – a task that is not as easy as it might first appear.

There are also a number of ‘Task Helpers’ that cover the use of .Net from PowerShell where the cmdlets cannot be used directly to achieve a given result, as well as some common XPath patterns you can use to build your own event log queries.

PowerShell and Windows Event Logs is available now from the Amazon kindle book store. You can preview the book or view the book page on Amazon.

Posted in My Books, PowerShell | Leave a comment

PowerShell and OneNote for Windows 10

I have long been a fan of OneNote. I have always wanted a way of using it with PowerShell. My initial work in this area focused on the Windows desktop version, but there is now a REST API available via the Microsoft Graph.

OneNoteUtilitiesGraph is a PowerShell module designed to be used either in scripts or at the command line. Its purpose is to allow both read and write access to your OneNote data. For example:

Get-ONPages "startswith(title,'August')"

This will return a list of pages whose titles start with the word ‘August’.

The module also has the capability to create new NoteBooks, Sections, SectionGroups and Pages.

This module is available on GitHub.

Future plans:

  • Page templates (script-side)
  • Support for Tags
  • HTML Templates for Pages
  • Merge-to-OneNote
  • Out-ONPage
  • New Graph REST features as they are released
Posted in My Software, PowerShell | Leave a comment

Liverpool Football Club Season Review 2018-19

My history with Liverpool Football Club (LFC) is documented elsewhere on this blog.

This season ended as did the last one, with a Champions League final, but first things first…

The years since my previous report have seen the development of Jürgen Klopp’s vision for LFC and also the departure of some significant players, including Philippe Coutinho and Raheem Sterling. Finishing 8th in the English Premier League (EPL) for the 2015-16 season was a low point, but there has been gradual improvement, with successive 4th place finishes in 2016-17 and 2017-18. The arrivals of Mané and Salah have boosted our attacking threat, and the development of a world-class defence around van Dijk, Matip, Robertson and Alexander-Arnold has made us difficult to beat, home or away. The 2018-19 season seemed unlikely to replicate the ‘Heavy Metal’ football of the previous season which saw LFC scoring many goals – Salah being top scorer with an amazing 44 goals in all competitions – but confidence was high that we could achieve something special, something to counter the heartbreak of the Champions League final loss of the previous year.

Liverpool were different this year – not perfect, but almost so. They lost only one EPL game, drew 7 and won the remaining 30, for a total of 97 points. The team’s approach seemed more clinical, more determined, calmer. In the end they scored more goals (89) than the previous season, but conceded fewer too. The team’s style of football was praised as exciting by pundits (some more grudging in their praise than others).

Unfortunately there was another team enjoying (another) great season. Manchester City’s all-stars finished one point ahead of Liverpool, 98 points to 97. The race to the end of the season was the most enthralling for many years, but again Liverpool fell short.

At the same time as the EPL campaign, Liverpool were taking part in the 27th season of the UEFA Champions League. In all honesty Liverpool struggled in Group C, winning three games and losing three, finishing second in the group behind Paris Saint-Germain. Once in the knock-out stages, though, things were different. Bayern Munich and FC Porto were brushed aside. Barcelona awaited in the semi-final.

The first leg at the Camp Nou would have been considered a disaster in previous seasons: Liverpool were defeated 3-0, including the obligatory magical free-kick from Lionel Messi. Somehow it didn’t feel like the end of the world, not this season, not with this team. The second, home leg was still to come.

The 7th of May 2019 saw one of the most remarkable Liverpool performances I have ever seen, topped only by that night in Istanbul. Braces of goals from Divock Origi and Georginio Wijnaldum, and one of the quickest-thinking corners I have ever seen resulted in Messi, Coutinho, Suarez and associates leaving the competition and Liverpool advancing to the final. They would be joined by Tottenham Hotspur, who managed their own recovery from a losing position the following day.

June the 1st 2019 at the Estadio Metropolitano, Madrid, Spain was the venue for the final, the first to be contested between two English teams for a decade. Confidence was high that Liverpool could win, given that they had defeated Spurs at both meetings during the EPL season. All expectations were exceeded early on, following a penalty awarded only thirty seconds or so into the game, which Salah duly dispatched. From then on the game settled into a strange pattern. Both teams were clearly lacking sharpness, possibly due to the three week break from the end of the EPL, possibly due to the heat and humidity in Spain. Spurs retained the majority of the possession. The first half finished at 1-0 to Liverpool.

The second half saw Spurs increase their intensity – forcing Alisson Becker to make a number of excellent saves, but the final word fell to Liverpool, via the boot of Divock Origi, scorer of the decisive goal against Barcelona. His quick-witted finish, a low driven shot across the keeper in the 87th minute, meant that the contest was effectively over. Liverpool were crowned European Champions for the 6th time, placing them third on the all-time list of winners, and the top English team. Once again an English team, captained by an English player, raised the European Cup in triumph.

This season in statistics:

  • Beat every EPL team at least once except Manchester City
  • Beat French Ligue 1 Champions 2018-19 – Paris Saint-Germain
  • Beat German Bundesliga Champions 2018-19 – Bayern Munich
  • Beat Spanish La Liga Champions 2018-19 – Barcelona
  • Beat Italian Serie A 2nd Placed Team 2018-19 – Napoli
  • Two players shared the EPL Golden Boot – Sadio Mané and Mohamed Salah (22 goals each)
  • EPL Golden Glove – Alisson Becker (21 clean sheets)
  • Premier League Player of the Season – Virgil van Dijk
  • PFA Players’ Player of the Year – Virgil van Dijk

After all the statistics are analysed, all of the emotions have subsided, one thing that Michael Owen said during the TV commentary stuck in my mind – I’ll try to paraphrase it.

Liverpool played brilliant football in the Premier League, gained 97 points and ended up with nothing. Liverpool played poorly today and still came away with the trophy.

Michael Owen, BT Sport

That’s the difference between Liverpool 2017-18 and 2018-19: the ability to take a difficult situation and still come through triumphant.

Here’s to the Red Men:

Fabinho, Virgil van Dijk, Georginio Wijnaldum, Dejan Lovren, James Milner, Naby Keïta, Roberto Firmino, Sadio Mané, Mohamed Salah, Joe Gomez, Alisson Becker, Jordan Henderson, Daniel Sturridge, Alberto Moreno, Adam Lallana, Alex Oxlade-Chamberlain, Simon Mignolet, Xherdan Shaqiri, Andrew Robertson, Divock Origi, Joël Matip, Curtis Jones, Ki-Jana Hoever, Rafael Camacho, Trent Alexander-Arnold, Nathaniel Clyne

and of course

Jürgen Klopp

See you next season.

#YNWA #SixTimes

Posted in Football | Tagged , | Leave a comment

Offline Windows Update Scans using PowerShell

It’s important to keep your Windows systems up to date. Your computer is just as much at risk from other devices on your network as it is from threats on the internet. So even if your computer isn’t accessing the internet, if it shares a network with others that are, it should be kept up to date.

There are a number of offerings that can help with this. Microsoft’s WSUS and SCCM can both act as a source of updates for internet disconnected machines, but what if you don’t have that infrastructure in place? Enter Offline Update Scanner.

Offline Update Scanner is a PowerShell script which leverages the Windows Update Agent COM object, the Microsoft Offline Update scan package CAB file and the Windows Scheduled Tasks COM object (I know about the Scheduled Tasks cmdlets but not all versions of Windows in circulation have these).

The current version of this tool is available on GitHub.

Offline Update Scanner (henceforth OUS) is designed to give you a list of updates that are required on a given machine. It is designed to run locally, or can be triggered via some built-in scheduled task functionality if you need to run it in a PSSession or via Invoke-Command, thus getting around the limitations of remotely managing Windows Updates.

OUS uses two parameter sets, one using -AddTask, the other using -Run.

OUS and Scheduled Tasks

Running OUS with -AddTask creates a Scheduled Task that will run as System on the computer. The task’s job is to run OUS with the -Run parameter to trigger the actual scan. This is useful because you cannot use a lot of the functionality of the Windows Update Agent COM object via any kind of remote command – see Using WUA from a Remote Computer on the Microsoft docs site. Running the script from a Scheduled Task makes it a local action, so the COM object is happy to oblige. Once the task has completed it deletes itself from the Task Scheduler.

Components of a Scheduled Task

In order to register a Scheduled Task we need to set up a number of components:

  • A connection to the Scheduled Task service
  • A Task Definition
  • Registration Information
  • Settings
  • An Action
  • A Trigger
  • A Principal

Connect to the Scheduled Task service

Connecting to the service involves creating a Schedule.Service object and calling its Connect method.

$STService = New-Object -ComObject Schedule.Service
$STService.Connect()

A new Task Definition

Once we have a connection, we can create the other objects that we need. First comes a new Task Definition.

$NewTaskDef = $STService.NewTask(0)

Registration Information

Then from the new Task Definition we create some Registration Information.

$RegInfo = $NewTaskDef.RegistrationInfo
$RegInfo.Description = "Offline Update Scan"
$RegInfo.Author = "Stuart Squibb"

Security Principal

Next we create the security principal under which the scheduled task will run. This is important to ensure that the Schedule Task runs under an account with the correct level of access to do its job.

$Principal = $NewTaskDef.Principal
$Principal.LogonType = [TASK_LOGON_TYPE]::Task_Logon_Service_Account
$Principal.UserId = 'NT AUTHORITY\SYSTEM'
$Principal.Id = 'System'

Here I’ve used an Enum to define the types of logon available to Scheduled Tasks and chosen the value that specifies a Service Account. I want to use the local SYSTEM account, so that’s the UserId I’ve specified.
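The [TASK_LOGON_TYPE] type isn’t built into PowerShell, so the script has to define it; here’s a sketch using the PowerShell enum keyword, with the values taken from the Task Scheduler’s taskschd.h header:

```powershell
# TASK_LOGON_TYPE values as defined in the Task Scheduler API (taskschd.h)
enum TASK_LOGON_TYPE {
    TASK_LOGON_NONE = 0
    TASK_LOGON_PASSWORD = 1
    TASK_LOGON_S4U = 2
    TASK_LOGON_INTERACTIVE_TOKEN = 3
    TASK_LOGON_GROUP = 4
    TASK_LOGON_SERVICE_ACCOUNT = 5
    TASK_LOGON_INTERACTIVE_TOKEN_OR_PASSWORD = 6
}

[int][TASK_LOGON_TYPE]::TASK_LOGON_SERVICE_ACCOUNT   # → 5
```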


Settings

Next I’m going to define some general settings for the Task.

$Settings = $NewTaskDef.Settings
$Settings.Enabled = $True
$Settings.DisallowStartIfOnBatteries = $False

The DisallowStartIfOnBatteries property caught me out. I do a lot of work on a laptop. I couldn’t work out why the tasks I was creating were not running. By default DisallowStartIfOnBatteries is set to True.

The Enabled property specifies whether the task can run or not.


Triggers

Next we need to create a Trigger for our task.

$Trigger = $NewTaskDef.Triggers.Create([TASK_TRIGGER_TYPE2]::TASK_TRIGGER_TIME)

Here I’m using an Enum to specify a time-based trigger. Here it is:

public enum TASK_TRIGGER_TYPE2
{
    TASK_TRIGGER_TIME = 1,
    TASK_TRIGGER_IDLE = 6,
    TASK_TRIGGER_BOOT = 8,
    // other values omitted
}

Unrelated to Windows Updates, one of the interesting Trigger types is TASK_TRIGGER_SESSION_STATE_CHANGE. Here’s the Enum for the different types of session change that can be acted on:

public enum TASK_SESSION_STATE_CHANGE_TYPE
{
    TASK_CONSOLE_CONNECT = 1,
    TASK_CONSOLE_DISCONNECT = 2,
    TASK_REMOTE_CONNECT = 3,
    TASK_REMOTE_DISCONNECT = 4,
    TASK_SESSION_LOCK = 7,
    TASK_SESSION_UNLOCK = 8,
}

Scheduled tasks can be triggered by console and RDP connections and disconnections as well as when a session is locked or unlocked.

Anyway, now that we have our Trigger we can define when we want the task to run.

$Trigger.StartBoundary = $StartTime
$Trigger.EndBoundary = $EndTime
$Trigger.ExecutionTimeLimit = "PT5M"
$Trigger.Id = "TimeTriggerId"
$Trigger.Enabled = $True 

The $StartTime and $EndTime variables are defined elsewhere in the script. By default I set the task to start 45 seconds after it is created. I also set an execution time limit of 5 minutes using the string format defined for this purpose. At this point I also set the Enabled property.
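The exact expressions are in the script, but $StartTime and $EndTime might be built something like this (the 30-minute end boundary is an assumption for illustration); the Task Scheduler expects ISO 8601 date-time strings for trigger boundaries:

```powershell
# Start 45 seconds from now; the end boundary here is illustrative
$StartTime = (Get-Date).AddSeconds(45).ToString('yyyy-MM-ddTHH:mm:ss')
$EndTime   = (Get-Date).AddMinutes(30).ToString('yyyy-MM-ddTHH:mm:ss')
```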


Actions

Next I create the Action – the actual thing I want the task to do.

$Action = $NewTaskDef.Actions.Create([TASK_ACTION_TYPE]::TASK_ACTION_EXEC)
$Action.Path = "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"
$Action.Arguments = "-ExecutionPolicy ByPass -NoProfile -NonInteractive -File C:\OfflineUpdateScan\OfflineUpdateScan.ps1 -Run -Format $Format -Path $Path"
$Action.WorkingDirectory = "C:\OfflineUpdateScan"

Here I create the Action object and assign the Path, Arguments and WorkingDirectory properties. Notice that I’m using variable substitution to create a dynamic set of arguments based on arguments passed to the script.

Registering the Task

The last step is to register the task with a folder.

$RootFolder = $STService.GetFolder("\")
[Void]$RootFolder.RegisterTaskDefinition($ScriptGuid, $NewTaskDef,[TASK_CREATION]::TASK_CREATE_OR_UPDATE,$Null,$Null,$Null)

The $ScriptGuid variable is used as the name of the Task and is defined elsewhere. I pass in the TaskDefinition object and tell the service to create a new task or update an existing one. The Task is registered. Note the use of [Void] – this is because the RegisterTaskDefinition method usually echoes an XML verison of the Task to the screen.

OUS and the Windows Update Agent

Running OUS with -Run begins a scan of the computer using the Microsoft Offline Update CAB file ( which is available here . Currently it is 560MB in size. You can specify the report file format (CSV, HTML or XML) and the file path using the -Format and -Path parameters. If you are running the script interactively you can also choose Console as a format.

The end result should be a file containing a list of the updates that you need to install on the machine.

Creating an Offline Scan

In order to run a Windows Updates scan we need to create a few objects.

  • An Update Service Manager
  • An Update Service
  • An Update Session
  • An Update Searcher
  • An Update Search Result Collection

Creating an Update Service Manager and Service

To create a Service Manager we create a new Microsoft.Update.ServiceManager object. We then create a new UpdateService object by calling the AddScanPackageService method with the path to the CAB file and a descriptive name.

$UpdateServiceManager = New-Object -ComObject "Microsoft.Update.ServiceManager"
$UpdateService = $UpdateServiceManager.AddScanPackageService("Offline Sync Service", $CabLocation, 1)

Creating an Update Session and Searcher

Next we create a new Microsoft.Update.Session object, then call its CreateUpdateSearcher method to create a new Searcher.

$UpdateSession = New-Object -ComObject "Microsoft.Update.Session"
$UpdateSearcher = $UpdateSession.CreateUpdateSearcher()

Joining up the Objects

Before we can search we need to link the Searcher to the Service.

$UpdateSearcher.ServerSelection = [ServerSelection]::SSOthers
$UpdateSearcher.ServiceID = $UpdateService.ServiceID

The [ServerSelection]::SSOthers constant is required for the Searcher to accept the Service when configured for offline scanning.


We actually search by calling the Search method of the Searcher, passing in a query. The query syntax can be found here. The result is a SearchResult object. The list of updates is stored in the Updates property of the SearchResult as an UpdateCollection object. We can use this to export data about the updates to a file.

$SearchResult = $UpdateSearcher.Search("IsInstalled=0")
$Updates = $SearchResult.Updates

Exporting the data

Once we’ve got a collection of updates we need to collect the properties of each update that we wish to export. The script exports the MsrcSeverity, Title, MaxDownloadSize and MinDownloadSize properties, along with a computed field. This joins up the possibly multiple values stored in an update’s KBArticleIds field into a single semicolon (;) delimited string. The computed field is created using:

@{Name="KBs";Expression={$_.KBArticleIds -join ';'}}

Lastly, we use a switch statement to decide what to do with our results.

Re-Purposing the code

Of course there’s nothing to stop you from using this code to do an online scan, simply remove the lines that set up the Offline Service. By default you will be connected to Windows Update online.


Using the Windows Update API

Using the Task Scheduler

Posted in PowerShell | Tagged , | Leave a comment

My Amazon Author Page

Now that ‘PowerShell and Active Directory’ is out and beginning to sell, I have begun to set up my Author Page on Amazon. This will be home to all of my Kindle books.

Posted in My Books | Leave a comment