PowerShell Workflow Chugging at Memory and Crashing

I'm dabbling with workflows in PowerShell and I'm noticing some odd behavior. The script below works when the directory doesn't contain a lot of files. At some point it will hang on line 6 (when run in the ISE you'll see the workflow status bar), munch up memory, then eventually crash (after at least half an hour). The crash happens when the directory holds at least 1.25GB of files, but not when $Path has only 50MB of files. Here's an easy test:
Workflow Test-Me {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript {
        Get-ChildItem -Path $using:Path -File -Recurse -Force |
            Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days)) }
    }
    $Files
}
Now the odd thing is that when Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object {$_.LastWriteTime -lt ((get-date).AddDays(-$using:Days))} is run outside of the workflow (in a regular function or directly at the shell) it completes in less than a minute, even with 1.25GB of files.
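For example, timing the bare command outside the workflow with Measure-Command (a minimal sketch using the question's defaults):
$Path = "c:\temp"; $Days = 0
Measure-Command {
    Get-ChildItem -Path $Path -File -Recurse -Force |
        Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$Days)) }
}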
What is the workflow doing that causes it to eat memory, take a long time, and crash? It's obviously doing something unexpected. Again, it works if there's only a few files in the directory.
Also, a solution/workaround would be great.
Research:
Activity to invoke the Microsoft.PowerShell.Management\Get-ChildItem command in a workflow
Running Windows PowerShell Commands in a Workflow

The problem here appears to be with the retention of object data. Adding a Select-Object reduces the size of the returned object data so much that searching 100GB+ of files did not cause a crash. The solution is as follows:
Workflow Test-Me {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript {
        Get-ChildItem -Path $using:Path -File -Recurse -Force |
            Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days)) } |
            Select-Object Name   # FileInfo has no 'filename' property; Name keeps the returned objects small
    }
    $Files
}
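A variant sketch (hypothetical, not from the original post) that returns plain path strings instead of objects, so even less data has to cross the workflow boundary:
Workflow Test-Me2 {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript {
        Get-ChildItem -Path $using:Path -File -Recurse -Force |
            Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days)) } |
            ForEach-Object { $_.FullName }   # strings serialize far smaller than FileInfo objects
    }
    $Files
}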

Related

Powershell script which will search folders with regex and which will delete files older than XX

I need a PowerShell script;
it must search some subfolders whose names start with a character between 1 and 6 (like 1xxxx or 2xxx),
and, using the names of these folders as a variable, it must look under each folder for *.XML files that are older than 30 minutes,
and if it finds any, it must delete them.
There may be more than one folder meeting these conditions at the same time, so IMO using an array is a good choice. But I'm always open to other ideas.
Can anybody help me, please?
Basically I was using the command below before the requirements changed, but now it doesn't help me.
powershell -nologo -command "Get-ChildItem -Path C:\geniusopen\inbox\000\ready\processed | Where CreationTime -lt (Get-Date).AddDays(-10) | Remove-Item"
Thank you
You can do something like the following and just remove -WhatIf if you are satisfied with the results:
$Time = (Get-Date).AddMinutes(-30)
Get-ChildItem -Path 'C:\MostCommonLeaf' -Recurse -File -Filter '*.xml' |
    Where {$_.CreationTime -lt $Time -and (Split-Path $_.DirectoryName -Leaf) -match '^[1-6]' -and $_.Extension -eq '.xml'} |
    Remove-Item -WhatIf
MostCommonLeaf would be the lowest-level folder that can serve as your root search node; we essentially don't want to traverse directories for nothing.
You could potentially make the script above better if you know more about your directory structure. For example, if it is predictable where in the path the 1xxx folders will be, you can construct the -Path parameter to use the [1-6] range wildcard (see the sketch below). -Filter '*.xml' could also return .xmls files, for example, which is why there is an additional extension condition in the Where.
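A minimal sketch of the range-wildcard idea, assuming the numbered folders sit directly under the root (the layout is hypothetical):
# Range wildcard in -Path limits recursion to the numbered folders
Get-ChildItem -Path 'C:\MostCommonLeaf\[1-6]*' -Recurse -File -Filter '*.xml'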
Using -Recurse and -Include together generally results in much slower queries. So even if tempted, I would avoid a solution that uses those together.
If there are millions of files/directories, a different command construction could be better. Running Split-Path millions of times could be less efficient than just matching on the directory name, e.g. where {$_.DirectoryName -match '\\[1-6][^\\]*$'}.
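A sketch along those lines, matching on DirectoryName in the pipeline instead of calling Split-Path per file:
$Time = (Get-Date).AddMinutes(-30)
Get-ChildItem -Path 'C:\MostCommonLeaf' -Recurse -File -Filter '*.xml' |
    Where-Object { $_.CreationTime -lt $Time -and $_.DirectoryName -match '\\[1-6][^\\]*$' } |
    Remove-Item -WhatIf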
I think you are looking for something like this:
$limit = (Get-Date).AddMinutes(-30)
$path = "C:\Users\you\xxx"
$Extension = "*.xml"
Get-ChildItem -Path $path -Filter $Extension -Force | Where-Object {$_.CreationTime -lt $limit} | Remove-Item
I haven't tested it though.
Keep in mind whether you need $_.CreationTime or $_.LastWriteTime.
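A quick way to check which timestamp you actually care about (a sketch reusing the variables above):
# Inspect both timestamps before deciding which one to filter on
Get-ChildItem -Path $path -Filter $Extension |
    Select-Object Name, CreationTime, LastWriteTime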

Issues with creating a scheduled task that runs a powershell script with parameters

I am trying to execute a PowerShell script with parameters as a scheduled task. On the Start a program screen I have
Program/script
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
and in Add arguments
-Command "& C:\Test\MoveFiles.ps1 -destinationRoot \\OB-VM-ME-Data\ME-Data\Archived\Test"
What am I doing incorrectly?
EDIT: Attached is the script in question
Param (
    [Parameter(Mandatory=$true)][string]$destinationRoot
)
$path = (Get-Item -Path ".\").FullName
Get-ChildItem $path\* -Include *.bmp, *.svg | Where-Object {
    $_.LastWriteTime -lt (Get-Date).AddDays(-30)
} | ForEach-Object {
    $content = $path + "\" + $_.Name
    $year = (Get-Item $content).LastWriteTime.Year.ToString()
    $monthNumber = (Get-Item $content).LastWriteTime.Month
    $month = (Get-Culture).DateTimeFormat.GetMonthName($monthNumber)
    $destination = $destinationRoot + "\" + $year + "\" + $month
    New-Item -ItemType Directory -Force -Path $destination
    Move-Item -Path $content -Destination $destination -Force
}
Don't use -Command if you want to execute a PowerShell script (with or without parameters); use -File instead. Change the argument list of the scheduled task to something like this:
-File "C:\Test\MoveFiles.ps1" -destinationRoot "\\OB-VM-ME-Data\ME-Data\Archived\Test"
Edit:
I don't see anything inherently wrong with your script. The only thing sticking out that might prove problematic is that it reads files from the current working directory (Get-Item -Path ".\"), which may or may not be what you think it is. You can configure the working directory in the scheduled task's settings (the "Start in" field), though, to remove this variable from the equation.
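Alternatively, a one-line sketch that anchors the script to its own folder rather than to the working directory:
# Resolve the script's own directory instead of relying on the working directory
$path = Split-Path -Parent $MyInvocation.MyCommand.Path   # or simply $PSScriptRoot on PowerShell 3.0+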
Since scheduled tasks are notoriously difficult to debug, and it's not even clear what the actual problem is or what causes it, your best bet is probably to follow the debugging steps I outlined in an answer to a similar question.

Windows PowerShell - Delete Files Older than X Days

I am new to PowerShell, and based on information gathered on the net I have created a script that deletes files found within a folder whose LastWriteTime is more than 1 day old.
Currently the script is as follows:
$timeLimit = (Get-Date).AddDays(-1)
$oldBackups = Get-ChildItem -Path $dest -Recurse -Force -Filter "backup_cap_*" |
    Where-Object {$_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit}
foreach ($backup in $oldBackups)
{
    Remove-Item $dest\$backup -Recurse -Force -WhatIf
}
As far as I know, the -WhatIf switch outputs to the console what the command would do in a real run. The problem is that -WhatIf does not output anything, and even if I remove it the files are not deleted as expected.
The server is Windows 2012 R2 and the command is being run within PowerShell ISE v3.
When the command will work it will be "translated" into a task that will run each night after another task has finished backing up some stuff.
I did it in the pipeline:
Get-ChildItem C:\temp | ? { $_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit } | Remove-Item -WhatIf
This worked for me, so you don't have to take care of building the right path to each file.
Another solution:
$timeLimit = (Get-Date).AddDays(-1)
Get-ChildItem C:\temp2 -Directory | where LastWriteTime -lt $timeLimit | Remove-Item -Force -Recurse
The original issue was that $dest\$backup assumes every file is in the root folder. By using the FullName property on $backup, you don't need to statically define the directory.
One other note: Remove-Item takes arrays of strings, so you can also get rid of the foreach.
Here's the fix to your script, without using the pipeline. Note that since I used the .where() method, this requires at least version 4:
$timeLimit = (Get-Date).AddDays(-1)
$Backups = Get-ChildItem -Path $dest -Directory -Recurse -Force -Filter "backup_cap_*"
$oldBackups = $backups.where{$_.LastWriteTime -lt $timeLimit}
Remove-Item $oldBackups.fullname -Recurse -Force -WhatIf
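Since the question mentions ISE v3, a pipeline-based equivalent that avoids the .where() method (a sketch; untested here):
Get-ChildItem -Path $dest -Directory -Recurse -Force -Filter "backup_cap_*" |
    Where-Object { $_.LastWriteTime -lt $timeLimit } |
    Remove-Item -Recurse -Force -WhatIf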

Why is my PowerShell script writing blank lines to console?

I have a bit of an odd problem. Or maybe not so odd. I had to implement a "custom clean" for a PowerShell script developed for building some unique configurations for my current project (the whys are not particularly important). Basically it copies a bunch of files from the release directories into some temporary directories with this code:
$Paths = Get-ChildItem $ProjectPath -Recurse |
    Where-Object { ($_.PSIsContainer -eq $true) -and
        (Test-Path ($_.FullName + '\bin\release')) } |   # note the leading backslash before 'bin'
    Select-Object FullName
ForEach ($Path in $Paths)
{
    $CopyPath = $Path.FullName + '\bin\Temp'
    $DeletePath = $Path.FullName + '\bin\Release'
    New-Item -ItemType Directory -Path $CopyPath
    Copy-Item $DeletePath $CopyPath -Recurse
    Remove-Item $DeletePath -Recurse   # was missing the hyphen on -Recurse
}
And after the build copies it back with:
ForEach ($Path in $Paths)
{
    $CopiedPath = $Path.FullName + '\bin\Temp\'
    $DeletedPath = $Path.FullName + '\bin\Release\'
    $Files = Get-ChildItem $CopiedPath -Recurse |
        Where-Object { -not $_.PSIsContainer }   # was $_PSIsContainer, missing the dot
    ForEach ($File in $Files)
    {
        if (-not (Test-Path ($DeletedPath + $File.Name)))
        {
            Copy-Item $File.FullName ($DeletedPath + $File.Name)
        }
    }
    Remove-Item $CopiedPath -Recurse -Force   # was $CopyPath, left over from the first loop
}
This is pretty clunky and noobish (sorry, I'm a PowerShell noob drinking from a fire hose), but it works for the purpose and I will clean it up later. However, when it executes the initial copy to the temp directories, it writes a lot of blank lines to the screen, which isn't ideal: I display a message while this process is executing to reassure our CM that nothing is broken, but that message gets pushed away by the blank lines. Do you know what might be causing this and how I might solve it? I'm using PowerShell 2.0 out of the box, and due to the nature of this project I can't upgrade or pull in any outside libraries. Thanks guys.
If the only thing you're looking to do is clean up the console output, then all you need to do is suppress the pipeline output. You can cast the command to [void], which discards everything it returns, or you can pipe the whole thing into the Out-Null cmdlet, which swallows all output.
The New-Item cmdlet returns output to the pipeline by default on my version of Windows PowerShell (4.0). This may not be true on previous versions, but I think it is... Remove-Item usually doesn't return any output. If I were to take a stab, I'd kill the output on the lines that use the "Item" noun, using one of the methods mentioned above.
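For example, applied to the New-Item line from the question, any of these suppresses the unwanted output:
[void](New-Item -ItemType Directory -Path $CopyPath)
New-Item -ItemType Directory -Path $CopyPath | Out-Null
$null = New-Item -ItemType Directory -Path $CopyPath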

Deleting cabinet files

I'm trying to create a script to delete cabinet files in virtual servers. For some reason, the code that I've created ends up not deleting any cabinet files and instead tries to delete the entire WINDOWS Directory, and I have no idea why this is occurring. Was curious if anyone might have any ideas on what the issue may be, since I can't find anything:
$dir = "\\$server" + '\C$\windows'
$cabinetArray = #()
foreach ($item in get-childitem -path $dir){
if ($item.name -like "*.cab"){
$cabinetArray = $cabinetArray + $item
}
}
for ($i = 0; $i -le $cabinetArray.length; $i++){
$removal = $dir + "\" + $cabinetArray[$i]
remove-item $removal -force -recurse
}
I did some testing and it seems that for some reason my array that I'm trying to use to gather all the cabinet files isn't even getting filled for some reason. I'm not sure if there's a specific way to only gather the .cab files since right now whenever I run this on my test server it tries deleting everything.
I don't know if deleting all the cab files in that folder is a good idea or not, but I'll answer your question. You're doing a lot of math and building your own collection of objects when PowerShell will do it all for you. Try something like this:
$dir = "\\" + $server + '\C$\windows'
$cabinetFiles = Get-ChildItem -Path $dir -Filter "*.cab" -Recurse
$cabinetFiles | %{
Remove-Item -Path $_.FullName -Force
}
Or, as a one liner:
Get-ChildItem -Path ("\\" + $server + '\C$\windows') -Filter "*.cab" -Recurse | %{Remove-Item -Path $_.FullName -Force}
Use the pipeline. Here's a simplified version of your code (remove -WhatIf to delete the files). The code gets all *.cab files from the windows directory of the remote box (recursively), makes sure that only file objects pass on, and then deletes them.
Get-ChildItem "\\$server\admin$" -Filter *.cab -Recurse |
Where-Object {!$_.PSIsContainer} |
Remove-Item -Force -WhatIf
For some reason, the code that I've created ends up not deleting any cabinet files and instead tries to delete the entire WINDOWS Directory, and I have no idea why this is occurring.
It is occurring because your for loop is entered even though $cabinetArray's length is zero: the condition uses -le, so with $i = 0 and a length of 0 the test 0 -le 0 is true and the body runs once. On that pass $cabinetArray[0] is $null, so the $removal variable is assigned the value of $dir plus a trailing backslash. You are then calling Remove-Item on the Windows directory itself.
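A minimal repro of the off-by-one (sketch):
$empty = @()
for ($i = 0; $i -le $empty.Length; $i++) {
    # With -le the body runs once even for an empty array; $empty[0] is $null
    "iteration $i, item: '$($empty[$i])'"
}
# Using -lt instead of -le (and -Filter *.cab to fill the array, per the first answer) avoids both problems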