How to make a script run only while another one is running - PowerShell

I need your help resolving an exercise in PowerShell.
I have to make a script that runs only while another one is running.
For example, I have to run a script that deletes files older than 1 day while a script that restarts a process is running.
I tried to use jobs to make the scripts run in parallel, but I haven't had any success.
# The script to delete files
Get-ChildItem -Path "a path" -Include *.txt -Recurse | Where-Object {$_.LastWriteTime -lt $DateToDelete} | Remove-Item -Force
# The script to restart a process
Get-Process notepad | Stop-Process | Start-Process

I think your problem is with the second script; you can't restart a process like that.
If you try the line Get-Process notepad | Stop-Process | Start-Process in the console, it will prompt you for the FilePath of the process you want to start. That's because Stop-Process does not return anything to the pipeline, so Start-Process receives nothing from it.
Look here to see how to restart a process using PowerShell,
and take a look at this MS module: Restart-Process.
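If the goal is simply to restart a process, one pattern (a sketch; it assumes the process exposes its executable path) is to capture the path before stopping the process and then start it again:
# Sketch: capture the executable path first, stop the process, then relaunch it.
$procs = Get-Process notepad -ErrorAction SilentlyContinue
foreach ($p in $procs) {
    $exePath = $p.Path                          # may be $null for some system processes
    Stop-Process -Id $p.Id -Force
    if ($exePath) { Start-Process -FilePath $exePath }
}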
Use this code to run the scripts as jobs (note the $using: prefix, which a background job needs in order to see a variable from the calling session):
$days = -1
$currentDate = Get-Date
$DateToDelete = $currentDate.AddDays($days)
Start-Job -ScriptBlock {
    # $using: makes the variable from the calling session visible inside the job
    Get-ChildItem -Path "a path" -Include *.txt -Recurse | Where-Object {$_.LastWriteTime -lt $using:DateToDelete} | Remove-Item -Force
}
Start-Job -ScriptBlock {
    Get-Process notepad | Stop-Process
}
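To get closer to the original requirement of running the cleanup only while the other script is running, one approach (a sketch; the restart script path and the sleep interval are assumptions) is to start the restart script as a job and loop the cleanup only while that job is still in the Running state:
# Sketch: run the cleanup loop only while the restart job is still running.
# The restart script path and the 5-second pause are placeholders.
$DateToDelete = (Get-Date).AddDays(-1)
$restartJob = Start-Job -ScriptBlock {
    & "C:\scripts\restart-script.ps1"   # hypothetical path to the other script
}
while ($restartJob.State -eq 'Running') {
    Get-ChildItem -Path "a path" -Include *.txt -Recurse |
        Where-Object { $_.LastWriteTime -lt $DateToDelete } |
        Remove-Item -Force
    Start-Sleep -Seconds 5
}
Receive-Job $restartJob -Wait -AutoRemoveJob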

Related

How to repeat a PowerShell command again and again?

Is there any way we can repeat a PowerShell script once it ends? I want to run this command and, once it completes, run it again. I don't want to use Task Scheduler.
gci . -Recurse -Directory | % { if(!(gci -Path $_.FullName)) {ri -Force -Recurse $_.FullName} }
I usually use a while loop:
while ($true) {
# do something
}
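Applied to the command from the question, that loop might look like the following sketch (the pause between passes is an assumption):
# Sketch: repeat the empty-directory cleanup forever, pausing between passes.
while ($true) {
    Get-ChildItem . -Recurse -Directory | ForEach-Object {
        if (-not (Get-ChildItem -Path $_.FullName)) {
            Remove-Item -Force -Recurse $_.FullName
        }
    }
    Start-Sleep -Seconds 60   # assumed interval between passes
}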

Run all .exe in a folder

I'm playing with malware in a VM and every script I try gets stuck. Basically I need to run every .exe in a folder. I've tried batch files using start, PowerShell, etc. The issue happens when AV moves some file to quarantine, or some process keeps running, and then the script doesn't jump to the next one.
CMD's start works but shows popups when it doesn't find some file, and then you have to keep clicking to jump to the next file.
These work but get stuck after a while:
Get-ChildItem 'C:\Users\LAB\Desktop\test' | ForEach-Object {
    & $_.FullName
}
Same here:
for %%v in ("C:\Users\LAB\Desktop\test\*.exe") do start "" "%%~v"
and here:
for %%i in (C:\Users\LAB\Desktop\test\*.exe) do %%i
You need to provide some form of code so we can help you troubleshoot it; this is not a request-a-script page.
Anyway, you would be looking at something like this:
# Assuming the .exe files are located in the C:\ root.
Get-ChildItem -Path C:\ | Where-Object {$_.Extension -like ".exe"} | ForEach-Object {Start-Process $_.FullName}
# In PowerShell, we like to filter as far left as possible for faster results.
Get-ChildItem -Path C:\ -File -Filter "*.exe" | ForEach-Object {Start-Process $_.FullName}
# Running the commands as a job so it doesn't wait on any to finish before running the next.
Start-Job { Get-ChildItem -Path C:\ -File -Filter "*.exe" | ForEach-Object {Start-Process $_.FullName} }
Start-Sleep 2
Get-Job | Remove-Job
Please refer to the following link: How to ask a question
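Since the real problem described above is that a launched .exe can hang and block the rest of the run, a variation like this sketch (folder path taken from the question, 60-second timeout assumed) starts each file, waits up to the timeout, then kills it and moves on:
# Sketch: run each .exe, wait up to a timeout, then kill it and continue.
$folder  = 'C:\Users\LAB\Desktop\test'   # path taken from the question
$timeout = 60                            # assumed timeout in seconds
Get-ChildItem -Path $folder -File -Filter '*.exe' | ForEach-Object {
    $proc = Start-Process -FilePath $_.FullName -PassThru
    if (-not $proc.WaitForExit($timeout * 1000)) {
        # Still running after the timeout: stop it so the loop can move on.
        Stop-Process -Id $proc.Id -Force -ErrorAction SilentlyContinue
    }
}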

PowerShell: using Compress-Archive with Start-Job won't work

I'm trying to use PowerShell to compress a bunch of video files on my H:\ drive. However, running this serially would take a long time as the drive is quite large. Here is a short snippet of the code that I'm using. Some parts have been withheld.
$shows = Get-ChildItem H:\
foreach($show in $shows){
    Start-Job -ArgumentList $show -ScriptBlock {
        param($show)
        $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
        Compress-Archive -Path $show.FullName -DestinationPath $destPath
    }
}
When I run a Get-Job, the job shows up as completed with no reason in the JobStateInfo, but no .zip was created. I've run some tests by replacing the Compress-Archive command with an Out-File of the $destPath variable using Start-Job as well.
Start-Job -ArgumentList $shows[0] -ScriptBlock {
    param($show)
    $show = [System.IO.FileInfo]$show
    $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
    $destPath | Out-File "$($show.DirectoryName)\test.txt"
}
A text file IS created and it shows the correct destination path. I've run PowerShell as an Administrator and tried again but that doesn't appear to work either. Not sure if it matters, but I'm running on Windows 10 (latest).
Any help would be appreciated. Thanks!
For some reason, inside the job, the serialized FileInfo object has no BaseName (it's a script property). If you have thread jobs (Start-ThreadJob), that works.
dir | start-job { $input } | receive-job -wait -auto |
select name,basename,fullname
Name      basename FullName
----      -------- --------
file1.txt          C:\Users\js\foo\file1.txt
file2.txt          C:\Users\js\foo\file2.txt
file3.txt          C:\Users\js\foo\file3.txt
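If the ThreadJob module is available (it ships with PowerShell 7 and can be installed on Windows PowerShell), the original loop keeps BaseName because thread jobs pass live objects instead of serialized copies; a sketch:
# Sketch: thread jobs run in-process, so the FileInfo objects keep BaseName.
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Start-ThreadJob -ArgumentList $show -ScriptBlock {
        param($show)
        $destPath = "$($show.DirectoryName)\$($show.BaseName).zip"
        Compress-Archive -Path $show.FullName -DestinationPath $destPath
    }
}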
I'm not sure if that's what you want, but I think you first need to create an archive and then update it with the shows, so I created a zip called archive and looped through, adding the files.
$shows = Get-ChildItem H:\
foreach($show in $shows){
    Compress-Archive -Path $show.FullName -Update -DestinationPath "C:\archive"
}
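Another workaround for the original BaseName problem, building on the asker's own test, is to pass plain path strings into the job and re-resolve them inside it (a sketch):
# Sketch: pass plain strings into the job and re-resolve them inside it,
# so the BaseName script property is available again.
$shows = Get-ChildItem H:\
foreach ($show in $shows) {
    Start-Job -ArgumentList $show.FullName -ScriptBlock {
        param($path)
        $file = Get-Item -LiteralPath $path
        $destPath = "$($file.DirectoryName)\$($file.BaseName).zip"
        Compress-Archive -Path $file.FullName -DestinationPath $destPath
    }
}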

Windows PowerShell - Delete Files Older than X Days

I am new to PowerShell, and I have created a script, based on information gathered on the net, that deletes files within a folder whose LastWriteTime is older than 1 day.
Currently the script is as follows:
$timeLimit = (Get-Date).AddDays(-1)
$oldBackups = Get-ChildItem -Path $dest -Recurse -Force -Filter "backup_cap_*" |
    Where-Object {$_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit}
foreach($backup in $oldBackups)
{
    Remove-Item $dest\$backup -Recurse -Force -WhatIf
}
As far as I know, the -WhatIf switch outputs to the console what the command "would" do in a real run. The problem is that -WhatIf does not output anything, and even if I remove it the files are not deleted as expected.
The server is Windows 2012 R2 and the command is being run within PowerShell ISE v3.
Once the command works, it will be "translated" into a task that runs each night after another task has finished backing up some stuff.
I did it in the pipeline:
Get-ChildItem C:\temp | ? { $_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit } | Remove-Item -WhatIf
This worked for me, so you don't have to take care of building the right path to each file.
Other solution:
$timeLimit = (Get-Date).AddDays(-1)
Get-ChildItem C:\temp2 -Directory | where LastWriteTime -lt $timeLimit | Remove-Item -Force -Recurse
The original issue was that $dest\$backup assumed every file was in the root folder. By using the FullName property on $backup, you don't need to statically build the path.
One other note is that Remove-Item accepts arrays of strings, so you can also get rid of the foreach.
Here's the fix to your script, without using the pipeline. Note that since I used the .Where() method, this requires at least version 4:
$timeLimit = (Get-Date).AddDays(-1)
$backups = Get-ChildItem -Path $dest -Directory -Recurse -Force -Filter "backup_cap_*"
$oldBackups = $backups.Where{$_.LastWriteTime -lt $timeLimit}
Remove-Item $oldBackups.FullName -Recurse -Force -WhatIf
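Since the question mentions running this nightly after a backup task, one way to schedule it (a sketch; the task name, start time, and script path are placeholders) is the ScheduledTasks module that ships with Server 2012 R2:
# Sketch: register a nightly task that runs the cleanup script.
# Task name, start time, and script path are placeholders.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
           -Argument '-NoProfile -File "C:\scripts\Remove-OldBackups.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Remove old backups' -Action $action -Trigger $trigger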

PowerShell Workflow Chugging at Memory and Crashing

I'm dabbling with workflows in PowerShell and I'm noticing some odd behavior. The script below works when the directory doesn't contain a lot of files. At some point it will hold on line 6 (when run in the ISE you'll see the workflow status bar), munch up memory, then eventually crash (after at least half an hour). This crash happens when the directory of files is at least 1.25 GB, but not when the $Path has only 50 MB of files. Here's an easy test:
Workflow Test-Me {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript{
        Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object {$_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days))}
    }
    $Files
}
Now the odd thing is that when Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object {$_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days))} is run outside of the workflow (in a regular function or just at the shell) it completes in less than a minute, even with 1.25 GB of files.
What is the workflow doing that causes it to eat memory, take a long time, and crash? It's obviously doing something unexpected. Again, it works if there are only a few files in the directory.
Also, a solution/workaround would be great.
Research:
Activity to invoke the Microsoft.PowerShell.Management\Get-ChildItem command in a workflow
Running Windows PowerShell Commands in a Workflow
The problem here appears to be the retention of object data. Adding a select reduces the size of the returned object data so much that searching 100 GB+ did not cause a crash. The solution is as follows:
Workflow Test-Me {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript{
        Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object {$_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days))} | Select-Object FullName
    }
    $Files
}
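If only the paths are needed, returning plain strings from the InlineScript trims the serialized data even further; a minimal sketch of that variation:
# Sketch: return only the full path strings, so the workflow serializes
# lightweight strings instead of whole FileInfo objects.
Workflow Test-Me {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript{
        Get-ChildItem -Path $using:Path -File -Recurse -Force |
            Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days)) } |
            Select-Object -ExpandProperty FullName
    }
    $Files
}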