Is my PowerShell script inefficient?

I wrote a script to delete temporary files/folders older than 90 days on remote servers. The server.txt file is loaded with Get-Content, and I use 'net use' to map to the IPC$ share. I'm worried that I'm not using Best Practices to delete the old temp files. Here is the meat of my script:
net use \\$server\IPC$ /user:$Authname $pw /persistent:yes
Get-ChildItem -Path "\\$($server)\C$\Temp" -Recurse | Where-Object {!$_.PSIsContainer -and $_.LastAccessTime -lt $cutoffdate} | Remove-Item -Recurse
(Get-ChildItem -Path "\\$($server)\C$\Temp" -recurse | Where-Object {$_.PSIsContainer -eq $True}) | Where-Object {$_.GetFiles().Count -eq 0} | Remove-Item -Recurse
net use \\$Server\IPC$ /delete
The first gci deletes old files, the second deletes empty folders.
The reason I'm concerned is that in my initial tests, it's taking about half an hour to delete approximately 4 GB from one server. And I work in a big shop; my script needs to be run against about 10,000 servers. At that rate my script won't finish for more than six months, and I was hoping to run it on a quarterly basis.
Am I doing something the hard way?

Get a list of your servers.
Cycle through the list and use Invoke-Command -ComputerName.
Your command will be executed on the remote server rather than pulling all the data across the network, which is very slow.
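A minimal sketch of that approach, assuming the same server list from server.txt and the $cutoffdate variable from the question (untested):
$cred = Get-Credential
foreach ($server in $servers) {
    Invoke-Command -ComputerName $server -Credential $cred -ScriptBlock {
        param($cutoff)
        # Runs locally on the remote server, so no file data crosses the network
        Get-ChildItem -Path 'C:\Temp' -Recurse -File |
            Where-Object { $_.LastAccessTime -lt $cutoff } |
            Remove-Item -Force
        # Then remove any folders left empty by the deletions
        Get-ChildItem -Path 'C:\Temp' -Recurse -Directory |
            Where-Object { (Get-ChildItem $_.FullName -Recurse -File).Count -eq 0 } |
            Remove-Item -Recurse -Force
    } -ArgumentList $cutoffdate
}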


Get-ChildItem on Multiple Computers, Performance Issues

I want to improve my script to accomplish the following:
Scan servers based on get-adcomputer on specific OUs.
Scan each server based on whatever drive letter it has.
Scan each server for log4j.
Export all results to a CSV that identifies the folder path, name of file, and the server that the file was found on.
I have been using the following code to start with:
$Servers = Get-ADComputer -Filter * -SearchBase "OU=..." | Select -ExpandProperty Name
foreach ($server in $Servers){
    Invoke-Command -ComputerName $Server -ScriptBlock {
        $Drives = (Get-PSDrive -PSProvider FileSystem).Root
        foreach ($drive in $Drives){
            Get-ChildItem -Path $drive -Recurse -Force -Filter *log4j* -ErrorAction SilentlyContinue |
            foreach{
                $Item = $_
                $Type = $_.Extension
                $Path = $_.FullName
                $Folder = $_.PSIsContainer
                $Age = $_.CreationTime
                $Path | Select-Object `
                    @{n="Name";e={$Item}}, `
                    @{n="Created";e={$Age}}, `
                    @{n="FilePath";e={$Path}}, `
                    @{n="Extension";e={if($Folder){"Folder"}else{$Type}}}
            } | Export-Csv C:\Results.csv -NoType
        }
    }
}
I am having the following issues and would like to address them to learn.
How would I be able to get the CSV to appear the way I want, but have it collect the information and store it on my machine instead of having it on each local server?
I have noticed extreme performance issues on the remote hosts when running this. WinRM takes 100% of the processor while it is running. I have tried -Include first, then -Filter, but to no avail. How can this be improved so that at worst, it's solely my workstation that's eating the performance hit?
What exactly do the ` marks do?
I agree with @SantiagoSquarzon - that's going to be a performance hit.
Consider writing a function to run Get-ChildItem recursively with a -MaxDepth parameter, including a Start-Sleep command to pause occasionally (see the sketch after these notes). Also, you may want to note this link
You'd also want to Export-CSV to a shared network drive to collect all the machines' results.
The backticks indicate a continuation of the line, like \ in bash.
Finally, consider using a Scheduled Task or starting a PowerShell sub-process with a lowered process priority; maybe that will help.
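A rough sketch of the throttled, depth-limited scan described above, writing results to a share rather than each server's local disk. The Search-Throttled function name, the 200 ms pause, the depth limit, and the \\fileserver\results share are all placeholder assumptions; untested:
function Search-Throttled {
    param(
        [string]$Path,
        [string]$Filter,
        [int]$MaxDepth = 5,
        [int]$Depth = 0
    )
    # Search only the current directory level (no -Recurse), then pause briefly
    Get-ChildItem -Path $Path -Filter $Filter -Force -ErrorAction SilentlyContinue
    Start-Sleep -Milliseconds 200
    if ($Depth -lt $MaxDepth) {
        # Walk subdirectories one at a time instead of one huge recursive query
        Get-ChildItem -Path $Path -Directory -Force -ErrorAction SilentlyContinue |
            ForEach-Object {
                Search-Throttled -Path $_.FullName -Filter $Filter -MaxDepth $MaxDepth -Depth ($Depth + 1)
            }
    }
}

# Inside the Invoke-Command script block, export to a network share instead of C:\Results.csv
Search-Throttled -Path 'C:\' -Filter '*log4j*' |
    Select-Object Name, CreationTime, FullName |
    Export-Csv "\\fileserver\results\$env:COMPUTERNAME.csv" -NoTypeInformation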

Powershell script which will search folders with regex and which will delete files older than XX

I need a PowerShell script;
it must search for subfolders whose names start with a character between 1 and 6 (like 1xxxx or 2xxx),
and, using the names of those folders as a variable, it must look under each folder for *.XML files that are older than 30 minutes,
and if it finds any, it must delete them.
There may be more than one folder meeting these conditions at the same time, so IMO using an array is a good choice. But I'm always open to other ideas.
Can anybody help me, please?
Basically I was using this before the requirements changed, but now it doesn't help me.
powershell -nologo -command Get-ChildItem -Path C:\geniusopen\inbox\000\ready\processed | Where CreationTime -lt (Get-Date).AddDays(-10) | remove-item
Thank you
You can do something like the following and just remove -WhatIf if you are satisfied with the results:
$Time = (Get-Date).AddMinutes(-30)
Get-ChildItem -Path 'C:\MostCommonLeaf' -Recurse -File -Filter '*.xml' |
Where {$_.CreationTime -lt $Time -and (Split-Path $_.DirectoryName -Leaf) -match '^[1-6]' -and $_.Extension -eq '.xml'} |
Remove-Item -WhatIf
MostCommonLeaf would be the lowest-level folder that can serve as your root search node; we essentially don't want to traverse directories for nothing.
You could potentially make the script above better if you know more about your directory structure. For example, if it is predictable where in the path the 1xxx folders will be, you can construct the -Path parameter to use the [1-6] range wildcard (sketched below). -Filter '*.xml' could also return .xmls files, for example, so that's why there is an additional extension condition in the Where.
Using -Recurse and -Include together generally results in much slower queries. So even if tempted, I would avoid a solution that uses those together.
If there are millions of files/directories, a different command construction could be better. Running Split-Path millions of times could be less efficient than just matching on the directory name, e.g. where {$_.DirectoryName -match '\\[1-6][^\\]*$'}.
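For instance, if the 1xxx folders always sit directly under a known parent, a variant using the [1-6] range wildcard in -Path might look like this (folder names here are made up; remove -WhatIf when satisfied):
$Time = (Get-Date).AddMinutes(-30)
# The [1-6]* wildcard limits the search to subfolders whose names start with 1-6
Get-ChildItem -Path 'C:\MostCommonLeaf\[1-6]*' -Recurse -File -Filter '*.xml' |
    Where-Object {$_.CreationTime -lt $Time -and $_.Extension -eq '.xml'} |
    Remove-Item -WhatIf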
I think you are looking for something like this:
$limit = (Get-Date).AddMinutes(-30)
$path = "C:\Users\you\xxx"
$Extension = "*.xml"
Get-ChildItem -Path $path -Filter $Extension -Force | Where-Object {$_.CreationTime -lt $limit} | Remove-Item
I haven't tested it though.
Keep in mind whether you need $_.CreationTime or $_.LastWriteTime.

Increase speed of PowerShell Get-ChildItem large directory

I have a script that references a .csv document of filenames and then runs a Get-ChildItem over a large directory to find the file and pull the 'owner'. Finally the info outputs into another .csv document. We use this to find who created files. Additionally I have it create .txt files with filename and timestamp to see how fast the script is finding the data. The code is as follows:
Get-ChildItem -Path $serverPath -Filter $properFilename -Recurse -ErrorAction 'SilentlyContinue' |
    Where-Object {$_.LastWriteTime -lt (get-date).AddDays(30) -and
        $_.Extension -eq ".jpg"} |
    Select-Object -Property @{
        Name='Owner'
        Expression={(Get-Acl -Path $_.FullName).Owner}
    },'*' |
    Export-Csv -Path "$desktopPath\Owner_Reports\Owners.csv" -NoTypeInformation -Append
$time = (get-date -f 'hhmm')
out-file "$desktopPath\Owner_Reports\${fileName}_$time.txt"
}
This script serves its purpose but is extremely slow because of the large size of the parent directory. Currently it takes 12 minutes per filename. We query approximately 150 files at a time, and this long wait time is hindering production.
Does anyone have better logic that could increase the speed? I assume that each time the script runs Get-ChildItem it recreates the index of the parent directory, but I am not sure. Is there a way we can create the index one time instead of for each filename (one possible approach is sketched below)?
I am open to any and all suggestions! If more data is required (such as the variable naming etc) I will provide upon request.
Thanks!
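One way to avoid re-walking the directory for every filename, as the question suggests, is to enumerate the tree once and then match the CSV entries against that in-memory listing. A rough, untested sketch; the filenames.csv path and its FileName column are assumptions:
# Walk the directory tree once and index the results by file name
$index = @{}
Get-ChildItem -Path $serverPath -Recurse -File -ErrorAction SilentlyContinue |
    Where-Object { $_.Extension -eq '.jpg' } |
    ForEach-Object { $index[$_.Name] = $_ }

# Look each requested filename up in the in-memory index instead of rescanning
Import-Csv "$desktopPath\filenames.csv" | ForEach-Object {
    $file = $index[$_.FileName]
    if ($file) {
        [pscustomobject]@{
            Name  = $file.Name
            Owner = (Get-Acl -Path $file.FullName).Owner
        }
    }
} | Export-Csv "$desktopPath\Owner_Reports\Owners.csv" -NoTypeInformation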

Powershell: Recursively search a drive or directory for a file type in a specific time frame of creation

I am trying to incorporate Powershell into my everyday workflow so I can move up from a Desktop Support guy to a Systems Admin. One question that I encountered when helping a coworker was how to search for a lost or forgotten file saved in an unknown directory. The pipeline I came up with was:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Out-File pdfs.txt
This code performed exactly how I wanted, but now I want to extend this command and make it more efficient, especially since my company has clients with very messy file management.
What I want to do with this pipeline:
Recursively search for a specific file type that was created in a specified time frame. Let's say the oldest file allowed in this search is a file from two days ago.
Save the results to a text file with columns for the file name and FullName (path), sorted by creation time in descending order.
What I have so far:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Select-Object Name, FullName | Out-File pdfs.txt
I really need help creating a filter for the time the file was created. I think I need to use the Where-Object cmdlet right after the dir pipe and before the Select-Object pipe, but I don't know how to set that up. This is what I wrote: Where-Object {$_.CreationTime <
You're on the right track. To get files from a specific creation-date range, you can pipe the dir command's results to:
Where-Object {$_.CreationTime -ge "06/20/2017" -and $_.CreationTime -le "06/22/2017"}
If you want something more repeatable where you don't have to hard-code the dates every time and just want to search for files from up to 2 days ago, you can set variables:
$today = (Get-Date)
$daysago = (Get-Date).AddDays(-2)
then plugin the variables:
Where-Object {$_.CreationTime -ge $daysago -and $_.CreationTime -le $today}
I'm not near my Windows PC to test this but I think it should work!
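Putting the pieces together with the sorted output the question asks for (the pdfs.txt name is just carried over from the question; untested):
$daysago = (Get-Date).AddDays(-2)
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force |
    Where-Object { $_.CreationTime -ge $daysago } |
    Sort-Object CreationTime -Descending |
    Select-Object Name, FullName, CreationTime |
    Out-File pdfs.txt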
See if this helps
dir c:\ -Recurse -Filter *.ps1 -ErrorAction SilentlyContinue -Force | select LastWriteTime,Name | Where-Object {$_.LastWriteTime -ge [DateTime]::Now.AddDays(-2) } | Out-File Temp.txt

Powershell network drive Get-ChildItem issues

Essentially I'm trying to use PowerShell to find files with certain file extensions on a network drive created on or after June 1st of this year. I thought I could do that with the following statement:
Get-ChildItem NETWORKPATH*. -recurse -include .xlsx | Where-Object { $_.CreationTime -ge "06/01/2014" }
I run into 2 problems:
The command only returns files from the root and one folder on the network drive; there are over 100 folders on this network drive.
The command returns 3 out of the 5 files created after 6/1/14 and one created well before my creation time date.
I have access to all of the folders on the network drive. When I run Windows 7 search it finds all of the files. It doesn't matter if I run Powershell as administrator or not. It doesn't matter if I run it from my machine (Windows 7) or from one of our 2008 servers. The network drive I'm trying to search through is on a 2003 file server. What am I doing wrong?
Make sure you add a wildcard to your Include parameter. Also you should never use strings for date comparison. See the example of why not here. Try the following:
$testDate = new-object DateTime (2014,06,01)
Get-ChildItem NETWORKPATH*. -recurse -include *.xlsx | Where-Object { $_.CreationTime -ge $testDate }
Also note that files and folders marked as hidden will not show up unless you add a -force to the get-childitem. Not sure if that is part of the issue or not.
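If hidden or system files turn out to be part of the problem, the same command with -Force added (still using the placeholder network path from the question) would be:
$testDate = new-object DateTime (2014,06,01)
Get-ChildItem NETWORKPATH*. -recurse -include *.xlsx -Force | Where-Object { $_.CreationTime -ge $testDate }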
gci -path PATH -recurse | where {$_.extension -match "xlsx"} was the silver bullet to all of this.
This is what I use.
$Extensions = '*.xlsx','*.csv','*.xls'
$path = 'Network path'
Get-ChildItem "$path" -Include $Extensions -Recurse -Force |
    where {$_.CreationTime -gt [datetime]"10/05/2018"} |
    Select * |
    Export-Csv -Path C:\TestExcelfiles.csv -NoTypeInformation