Tooling with PowerShell: Keep it simple


“In one hour, you’ve replaced three years of unmaintained software tools and given us the ability to resolve every open support issue in the queue. So, thanks for that.”

These words came from a client. Their operations queue was long, the engineers cranky and overworked. They lacked the tools necessary to adequately manage an application and its data. So I gave the team “proper” tooling – tools that allow them to address the arbitrary problems that users tend to find, as those users find them. And yes, it took less than an hour. In fact, it took almost no new code at all. Let me show you how, and in the process offer you a unique perspective on what these tools should look like and how you can quietly leverage your development expertise in the manic world of operations.

For whom does it “just work?”

Microsoft and the community have done a great job of providing software frameworks that solve problems for developers creating applications. The relevant example from my client is Entity Framework, which solves the common problem of persisting objects from memory to a relational data store like SQL Server. In fact, it does so in a way that is nearly effortless. Simply define the models that need to persist and declare a data access type (two things necessary for any object persistence solution), and Entity Framework figures out the logic from the model’s metadata. Adhere to some simple conventions, and the object persistence “just works” (well, you know, caveat emptor), letting the developer move on to the next problem.

Unfortunately, these frameworks do little on their own to address the types of problems faced by the team supporting and managing the application and its infrastructure – what I refer to as “operations” in this post. The tools for this group need to build a bridge between operations and development, encouraging communication and understanding between teams and fostering a culture of devops.

The client’s broken toolset

When I walked into this client, their entire application tooling was based on a technique known as scaffolding. This is a common approach in ASP.NET MVC sites built around Entity Framework. Remember that metadata that is used to determine how the model is stored? Well, it turns out you can use that metadata for other things too, like generating user interfaces. In this case, scaffolding defines a set of handlers in the web site that allow the execution of CRUD operations on a model. That is, you get specific pages designed for creating, reading, updating, and deleting a specific type of entity. Since this scaffolding operation is built into the ASP.NET MVC templates for Visual Studio, the developer is essentially getting these pages for free.

The image below shows the scaffolding UI used to create a new “User.” This User might represent a single user for a website or service.  The User model contains a few properties such as their name, email and status, and the generated UI provides a straightforward way to set each of these properties using well-known controls.

The PowerShell toolset

When I left this project, the client had a new toolset based entirely on PowerShell. The toolset uses the EntityShell open-source project to enable CRUD operations on my client’s entities right from the command-line shell. So, remember that metadata in the model that is used to determine how the model is stored and for UI scaffolding? EntityShell uses this metadata to implement what’s known as a PowerShell Provider. This provider allows you to navigate the models stored by Entity Framework as if they were files on an “entity drive.” This example shows common filesystem commands, like cd and dir, being used to navigate and search an Entities drive containing our User models:
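A minimal sketch of that kind of session, assuming EntityShell has mapped the Entity Framework context to an entities: drive (the entity and property names here are illustrative):

```powershell
# Treat the Entity Framework models like a filesystem.
cd entities:/users            # move into the Users entity set
dir                           # list the User entities, like files in a folder
dir | where Name -like 'J*'   # filter entities with standard PowerShell cmdlets
```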

Viability of the toolsets for operations

From the perspective of the developer, both tool sets provide a no-effort solution that facilitates CRUD workflows on entities. To the operations engineer however, the UI tooling represents a minimally viable toolset for the problems they need to solve, while the PowerShell tooling adapts to their ever-changing needs. In addition, by sharing the underlying entity set between operations and development, the PowerShell toolset fosters a greater level of communication between teams.

Let’s see how these two toolsets would be used to solve some common entity management problems.

Case 1: “We’ve got a trade show tomorrow, Jake in sales says he needs 100 demo users ready to go. And give them all emails of ‘’ so we can find them later.”

The main aspect of this scenario is the number of users that need to be created.  Creating a user using the UI is a manual process:

  • Click the “Create New” link
  • Type in the User’s name
  • Type in the User’s email address
  • Set the User’s active status
  • Click the “Create” button

Having to repeat this manual process is where the tool loses all value. We just created one user, and we need 99 more. With the GUI, our only option is to repeat the five-step manual process until all 100 test accounts are created. This is where the engineer begins to tear up.

Now consider the PowerShell toolset. Creating 100 users is simply a matter of using the standard PowerShell new-item cmdlet on our Entities drive and specifying property values for the new user:

0..99 | foreach { new-item -path entities:/users -username "User$_" -useremail "" -isActive $true }

This script reads: for each number 0 to 99, create a new user with a name that includes the current number, the email account, and ensure this user is active.

So: one line of PowerShell, or 100 manual iterations of a five-step process – the end result is the same. This comparison highlights the first key requirement of operational tools: they need to be repeatable and automatable. GUI tools can’t meet this requirement very well – they tend to assume a human presence for every individual operation.
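To make that repeatability explicit, the one-liner can be captured as a small function the whole team can call. A sketch – the function name, parameters, and the empty email placeholder are hypothetical:

```powershell
# A reusable wrapper around the bulk-create one-liner above.
function New-DemoUser {
    param(
        [int]$Count = 100,     # how many demo users to create
        [string]$Email = ''    # tag email so the users can be found later
    )
    0..($Count - 1) | foreach {
        new-item -path entities:/users -username "User$_" -useremail $Email -isActive $true
    }
}
```

Run `New-DemoUser -Count 100` today; when the next trade show needs 500 users, only the argument changes.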

Case 2: “Ok, the trade show is over.  Now disable those 100 users we created, we don’t want them using the product for free anymore.”

We’re looking at another bulk operation here, but the first step is isolating the 100 users on which we need to operate.  Using our barebones GUI toolset, our only option is a visual inspection for the target email address our boss asked us to use:

Once we find a target user, we need to edit the active status of the entity by:

  • Clicking the target user’s Edit link;
  • Unchecking the IsActive checkbox on the user’s Edit page;
  • Clicking the “Save” button to commit our changes for this user.

One down. 99 to go. Now the engineer is sobbing, occasionally casting a longing glance towards a picture of her family that she hasn’t seen in days.

The PowerShell tool performs a similar workflow: first the engineer must isolate the target users, then edit their active status, and finally commit those changes.  The difference is how we find and edit our target users:

$users = dir entities:/users | where Email -eq ''

That’s right, no visual search. We just tell the tool what we need: all users with a specific email address. The tool does the work, the engineer sips coffee. Once we have our targets isolated in the $users variable, we edit them in bulk and are done with it:

$users | set-item -isactive $false

This second case showcases how devops tooling automates the components of a workflow rather than the entire workflow en masse. Consider how our GUI forces us to complete a path for each user we need to change: find the user, edit that user, and commit the change. That entire path must be repeated in order for the tool to solve our problem.

The PowerShell tool offers a far more efficient approach: find all target users first, then edit them all in the same way, and commit the entire set of changes in bulk.
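Because each step is just a pipeline stage, the whole find-edit-commit workflow can even collapse into a single composed command (a sketch against the same hypothetical entities: drive):

```powershell
# Find all the trade-show users and deactivate them in one pipeline:
# dir finds the users, where filters to the target accounts,
# and set-item deactivates the whole set in bulk.
dir entities:/users | where Email -eq '' | set-item -isactive $false
```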

Case 3: “Hey, sales sent over this CSV file with data for 1000 paid user accounts from that trade show. Get them in the system ASAP.”

"Fakey von Fakerschmidt",""
"I. M. Knotwreal",""

The noteworthy aspect of this case is the source of the input – our engineer is receiving a structured data file. Unfortunately, our GUI tooling has no capability to process it. Now our already frenzied engineer must either ask for new tooling – which, given the ASAP requirement imposed by the boss, will not arrive in time – or she must bite the bullet and enter each record from the CSV into the GUI tool by hand. Oh dear, look at her – she’s now mumbling to herself and rocking back and forth. I’m not sure she’s going to make it around here much longer.

If only she had the PowerShell tooling, it would be so easy. PowerShell doesn’t care where the input comes from. It simply wants to help get the work done. It’s happy to read the CSV file records as structured objects using the import-csv cmdlet, making them ready for the new-item cmdlet to consume:

import-csv salesdata.csv | foreach { new-item -path entities:/users -username $_.Name -useremail $_.Email -isActive $true }
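One wrinkle worth noting: if the sales file has no header row, as in the sample above, import-csv must be told the column names up front (Name and Email here are assumptions about the file’s layout):

```powershell
# Supply column names for a headerless CSV before piping to new-item.
import-csv salesdata.csv -Header Name, Email |
    foreach { new-item -path entities:/users -username $_.Name -useremail $_.Email -isActive $true }
```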

This is perhaps the most important aspect of a devops tool: fostering agility in the engineer’s efforts. Strict paths are good for keeping end-users on the happy path, but operations needs flexibility. Operations tools should enable engineers to think creatively, combine concepts, and adapt quickly to solve the problem at hand.

Case 4: “Here is another file from marketing with some more user accounts we want in the system. I think Erin put them in last time, but she ran out of here screaming this morning, arms flailing and… yeah, so she doesn’t work here anymore.”

This case is interesting for two reasons: the engineer is being asked to perform a task that someone did in the past, and that someone is no longer around. Poor Erin. I hope she’s okay.

The thing is, with the GUI toolset there is no way Erin could have captured her solution. Each problem Erin took to the GUI was an isolated event, the solution lost as soon as it was created, leaving no way to record the activity so someone can learn from it or reproduce it later. Since you can’t save and share a mouse click, you’re doomed to repeat them.

Not so with the PowerShell tooling. A PowerShell operation is expressed as text, and text can be easily captured, shared, and reused in the form of script files. Erin could have implicitly documented the operation as a script, and then shared it with the new engineer so he could learn from her hard work. Then Erin could have gone on a sweet vacation to check out the music scene in Portland and gone on to have a wonderful, productive life.
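Captured as a script, Erin’s workflow might have looked something like this – a hypothetical import-users.ps1, with the parameter and column names purely illustrative:

```powershell
# import-users.ps1 - a captured, rerunnable version of the import workflow.
# Anyone on the team can run it against the next file from marketing.
param(
    [Parameter(Mandatory)]
    [string]$Path    # path to the CSV of accounts to import
)

import-csv $Path -Header Name, Email |
    foreach { new-item -path entities:/users -username $_.Name -useremail $_.Email -isActive $true }
```

The next engineer runs `.\import-users.ps1 -Path marketingdata.csv` and inherits Erin’s solution instead of her fate.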

Keeping the operations perspective as a developer

The golden rule of making good software is this: know thy user. When your user works in operations, their primary goal is to keep the application working. Creating tools to make that easy for them doesn’t have to be hard:

  • Make it automatable;
  • Make it composable;
  • Make it foster agility;
  • Make it easy to capture, share, and reuse.

Just remember to focus on these aspects of your application tooling.



Jim Christopher

Jim Christopher is the Curriculum Director for Enterprise Content at Pluralsight. He has over 18 years of experience developing software for aerospace, education, gaming, and business. Jim is a multi-year Microsoft MVP, avid speaker, Pluralsight author, and general lover of life. You can follow him on twitter, where he's known as @beefarino.