PowerShell Best Practices: Quit with the Text Files

Instead of dropping a bunch of computer names into a file, I strongly encourage you to think about how you're getting those computer names and define those criteria in the script itself.

As a PowerShell beginner, you're probably using the Google. A lot. And you should. You'll find a lot of scripts out there; some good and some bad. One pattern to watch out for is scripts that read computer names from a text file as their input. For example, let's say you need to run a command on multiple computers.

When someone is demonstrating how to run a command on multiple computers, the easiest way to do it is to dump some computer names into a text file, save it and then use Get-Content or Import-Csv to read the file and send the names to the command. I strongly encourage you NOT to do this.
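Just so we're on the same page, here's a sketch of the pattern I'm talking about. The file path and the per-computer task (checking the Spooler service) are placeholders; the shape is what matters:

# The pattern to avoid: names were manually pasted into computers.txt
$Computers = Get-Content -Path 'C:\Scripts\computers.txt'
foreach ($Computer in $Computers) {
    # Run whatever command you need against each name
    Invoke-Command -ComputerName $Computer -ScriptBlock { Get-Service -Name 'Spooler' }
}

Nothing here is broken, exactly. The problem is everything that happened before Get-Content ran: someone had to build that text file by hand.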

When you just blindly dump a bunch of computer names into a text file, you're probably getting them from Active Directory, a database, DNS, etc. The source doesn't really matter.

What you may not realize is that with just a few more lines of code in that script, you not only remove the manual process of creating the text file, but you’re also making your script more dynamic and portable.

Here’s a great example.

I once had two VBScript scripts that disabled computer accounts based on their lastLogonDate attribute, and the whole process probably took 10+ minutes of my time to run. Occasionally, I would:

  • Manually go into AD and sort all computers by lastLogonDate.
  • Copy all of the computer names into a text file.
  • Google "how to read lines from a text file" and slap that code into another script, which filtered out the computer accounts with recent lastLogonDate attributes and wrote the remainder to another text file.
  • Run a third script that read that list and disabled the accounts.

This workflow was crazy, but it was the quickest way I could see to do it at the time. I was just blindly copying/pasting code that I found and shoehorning it into the task I was trying to accomplish.

Doing this had three distinct problems:

Manual intervention - I had to find all of the AD computers myself and copy/paste them into a text file.

Introducing more complexity than necessary - What if the folder I was writing these text files to didn't have the right permissions? What if I goofed up the file names in the script? These were questions I wouldn't have had to think about at all if I'd written it correctly in the first place.

Static - The set of computer names changed all the time, which is why I was continually copying/pasting a new set of computers. I needed the script to be flexible enough that I could run it repeatedly over an extended period of time, and it would just work every time.

What was the right way to do it? See below. The PowerShell snippet below removes the need to manually find the computer names and create the first text file. It also makes the script more flexible, allowing me to run it at any time of the year by dynamically retrieving the current date with Get-Date.

Import-Module ActiveDirectory

$DaysOld = 90
$OldDate = (Get-Date).AddDays(-$DaysOld)

Get-ADComputer -Filter * -Properties lastLogonDate |
    Where-Object { ($_.lastLogonDate -le $OldDate) -and ($_.Enabled -eq $true) } |
    Disable-ADAccount -Confirm:$false

The example demonstrated here pertains to AD, but this logic doesn't just apply there. You can apply this tactic to whatever source you get the computer names from. Simply define your criteria (e.g., 90 days old) and then write the script to pull from the source directly instead of introducing the unnecessary step of creating a text file.
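For instance, if your names live in DNS rather than AD, the same idea applies: query the source instead of pasting records into a file. This is a hedged sketch using the DnsServer module; the zone name and server are placeholders, and you'd swap in whatever command fits your environment:

# Pull names straight from a DNS zone instead of a hand-built text file
$Computers = Get-DnsServerResourceRecord -ComputerName 'dns01' -ZoneName 'corp.example.com' -RRType A |
    Select-Object -ExpandProperty HostName

The point isn't the specific cmdlet; it's that $Computers is now built by a query you can rerun any time, against any criteria you define.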


The next time you find a script on the Internet, don’t take it at face value. Reach out to experts in the field and get their opinion. You may find out that the script that reads information from a text file could be written much better by eliminating that text file altogether!
