PowerShell script not running when set in Task Scheduler

I'm attempting to create a task via PowerShell to delete files older than 6 hours. If I execute the script from PowerShell there are no issues, but if I try to execute it from Task Scheduler, nothing happens.
I call powershell.exe in my scheduled task:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
with these parameters:
-NoProfile -ExecutionPolicy Bypass -Command -NonInteractive -File "C:\Scripts\DeleteFilesDiff3H.PS1"
What could be the reason Task Scheduler isn't launching my script?
I've tried applying solutions provided for similar issues, without success.
$Path = "E:\MyPath"
$now = Get-Date
Get-Childitem * |
Where-Object { $_.LastWriteTime -le $now.AddHours(-6) } |
Remove-Item -Recurse -Force
I get these messages:
Task Scheduler started "{38dcd44b-4210-473b-921e-3cc1442ff03b}" instance of the "\Delete Files 3H" task for user "my user".
Task Engine "S-1-5-21-159114655-2248028564-2417230598-213599:My User:Interactive:LUA[2]" received a message from Task Scheduler service requesting to launch task "\Delete Files 3H" .
Task Scheduler launched "{38dcd44b-4210-473b-921e-3cc1442ff03b}" instance of task "\Delete Files 3H" due to a time trigger condition.
Task Scheduler successfully completed task "\Delete Files 3H" , instance "{618e6f44-b523-4c56-ae0b-04d3552391cc}" , action "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" with return code 0.

You don't make use of the defined variable $Path, so Get-ChildItem will never look there. Update your code as follows and check whether this works for you:
$Path = "E:\MyPath"
$now = Get-Date
Get-ChildItem -Path $Path |
Where-Object { $_.LastWriteTime -le $now.AddHours(-6) } |
Remove-Item -Recurse -Force
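Separately, the parameter line quoted in the question mixes -Command and -File, which are mutually exclusive ways of telling powershell.exe what to run, so the -File argument may never be honored. A cleaner task action (a sketch, assuming the same script path) would drop -Command and put -File last:

```powershell
# Program/script:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
# Add arguments (-Command removed; -File must be the last parameter, followed by the script path):
-NoProfile -NonInteractive -ExecutionPolicy Bypass -File "C:\Scripts\DeleteFilesDiff3H.PS1"
```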

Related

PowerShell script on VM as scheduled task

I wrote a simple script to move files that contain a specific substring in the file name from one folder to another.
$Source = Get-ChildItem -Path "C:\source" -Filter *.xlsm
$Target = "C:\target"
$substring = 'copy'
foreach ($file in $Source) {
    if ($file.Name -match $substring) {
        Move-Item -Path $file.FullName -Destination $Target -Force
    }
}
I want to automate it on a VM. It works fine when I run it manually, and via Task Scheduler while I'm logged in to the VM; however, when I switch the task to 'Run whether user is logged on or not' in Task Scheduler (task properties), it won't work. I run it with the following parameters:
-noprofile -executionpolicy unrestricted -noninteractive -file "path to my script"
Any ideas?
When the option "Run whether user is logged on or not" is selected, the scheduled task runs in a different session.
This means it does not have access to your mapped network drives!
So you need to either map the drives inside your script or use the fully qualified UNC name (i.e., \\servername\sharename).
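A minimal sketch of both options, using a hypothetical \\fileserver\reports share in place of the mapped drive:

```powershell
# Option 1: use the UNC path directly (no drive letter needed)
$Source = Get-ChildItem -Path '\\fileserver\reports' -Filter *.xlsm

# Option 2: map a drive inside the script, for this session only
New-PSDrive -Name 'S' -PSProvider FileSystem -Root '\\fileserver\reports' | Out-Null
$Source = Get-ChildItem -Path 'S:\' -Filter *.xlsm
```

Option 1 is usually simpler for unattended tasks, since there is no mapping to clean up.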

PS1 script stops execution in the middle when run silently

I'm trying to set up a PS1 script to restart a Windows service on a remote machine. The script is supposed to be run automatically by the PRTG platform (a monitoring solution), which has a built-in feature that runs scripts for you. The problem is that when PRTG runs the script, it stops halfway through, without an error, right after the Stop-Service cmdlet. When I remove that cmdlet, the full script runs. I've tried different variations of the script, but it always behaves as if it stops once Stop-Service has finished.
The script is supposed to be run in silent mode by PRTG. Do you have any idea what could be the cause, or how to check it?
$service = Get-Service -ComputerName 123.123.123.123 -Name Tomcat
Stop-Service -InputObject $service -Force
Move-Item -Path "\\123.123.123.123\C$\Program Files (x86)\tomcat\logs\tomcat.log" -Destination "\\123.123.123.123\C$\Program Files (x86)\tomcat\logs\log_archieve"
Get-ChildItem "\\123.123.123.123\C$\Program Files (x86)\tomcat\logs\log_archieve\tomcat.log" | ForEach-Object {
Rename-Item $_.FullName "$BackupFolder$($_.BaseName -replace " ", "_" -replace '\..*?$')-$(Get-Date -Format "ddMMyyyy")_oldlog.log"
}
Start-Service -InputObject $service -Verbose
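Since PRTG swallows the console, one way to surface whatever happens after Stop-Service is to stop on the first error and log the whole run to a transcript file (the log path below is a hypothetical example, and the move/rename steps are elided):

```powershell
$ErrorActionPreference = 'Stop'
Start-Transcript -Path 'C:\Temp\restart-tomcat-run.log' -Append
try {
    $service = Get-Service -ComputerName 123.123.123.123 -Name Tomcat
    Stop-Service -InputObject $service -Force
    # Wait until the service has really stopped before touching its log files
    $service.WaitForStatus('Stopped', [TimeSpan]::FromMinutes(2))
    # ...the Move-Item / Rename-Item steps from the original script go here...
    Start-Service -InputObject $service -Verbose
}
catch {
    Write-Output "Restart failed: $_"
}
finally {
    Stop-Transcript
}
```

Whatever exception was previously silent should then appear in the transcript.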

How to make a script run only while another one is running

I need your help resolving an exercise in PowerShell.
I have to make a script that runs only while another one is running.
For example, I have to run a script that deletes files older than 1 day while a script that restarts a process is running.
I tried to use jobs to make the scripts run in parallel, but I haven't had any success.
--The script to delete files
Get-ChildItem -Path "a path" -include *.txt -Recurse | Where-Object {$_.LastWriteTime -lt $DateToDelete} | Remove-Item -Force
--The script to restart a process
Get-Process notepad | Stop-Process | Start-Process
I think your problem is with the second script: you can't restart a process like that.
If you try the line Get-Process notepad | Stop-Process | Start-Process in the console, it will prompt you for the FilePath of the process you want to start. That's because Stop-Process does not return anything to the pipeline, so Start-Process receives nothing from it.
Look here to see how to restart process using PowerShell
And take a look at this MS module Restart-Process
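As a rough sketch of the restart idea: capture the executable path before stopping the process, since the process object is gone afterwards.

```powershell
# Grab the process and remember where its executable lives
$proc = Get-Process -Name notepad | Select-Object -First 1
$exePath = $proc.Path

# Stop it, wait for it to exit, then start it again from the saved path
Stop-Process -Id $proc.Id -Force
$proc.WaitForExit()
Start-Process -FilePath $exePath
```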
Use this code to run the scripts as jobs (note that a job runs in a separate session, so variables like $DateToDelete must be passed in with the $using: scope modifier):
$days = -1
$currentDate = Get-Date
$DateToDelete = $currentDate.AddDays($days)
Start-Job -ScriptBlock {
    # $using: passes the caller's variable into the job's separate runspace
    Get-ChildItem -Path "a path" -Include *.txt -Recurse |
        Where-Object { $_.LastWriteTime -lt $using:DateToDelete } |
        Remove-Item -Force
}
Start-Job -ScriptBlock {
    Get-Process notepad | Stop-Process
}
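To make the cleanup run only while the restart script is running, one option (a sketch, with a hypothetical script path) is to start the restart script as a job and loop on the job's state:

```powershell
# Start the process-restart script as a background job
$restartJob = Start-Job -FilePath 'C:\Scripts\Restart-Notepad.ps1'   # hypothetical path

# Keep deleting old files only for as long as that job is still running
while ($restartJob.State -eq 'Running') {
    Get-ChildItem -Path 'a path' -Include *.txt -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-1) } |
        Remove-Item -Force
    Start-Sleep -Seconds 5
}
```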

Get-ChildItem not working through Task Scheduler

I am trying to run this script through a GPO-deployed scheduled task:
$registryPath = 'HKLM:\Software\CC\PST_Discovery_Script\Already_Run'
if (!(Test-Path -Path $registryPath)) {
    $dir = "c:\Users"
    $ext = "pst"
    Get-ChildItem "$dir" *$ext -r | Select-Object FullName,LastAccessTime,LastWriteTime,CreationTime,@{N='Owner';E={$_.GetAccessControl().Owner}}, @{Name="MegaBytes"; Expression={"{0:F2}" -f ($_.Length / 1MB)}}, @{N='Hostname';E={$env:computername}} | Export-Csv "c:\PST_Discovery.csv" -Append -NoTypeInformation
    New-Item -Path HKLM:\Software\CC\PST_Discovery_Script -Name Already_Run -Force
}
It works fine if I run the script manually through the PowerShell console/ISE, but not through a scheduled task.
If I run it through a scheduled task, I know the script is running because it reads the registry key and, if the key doesn't exist, writes it, but it doesn't actually run the Get-ChildItem line or export a CSV.
The scheduled task shows up on the client, and it runs using Domain Admin credentials (mine).
EDIT: Sorry, my formatting for the code went all wrong, i think it should be fixed up now
kaspermoerch: Yes, it's a domain admin, and thus has full permissions over the file system
boxdog: I actually had it writing to a UNC share, but changed it to local computer because it wasn't working. I'll try some other combinations of output location and user.
TheIncorrigible: Originally it was system, but it wasn't working so I edited the pushed out scheduled task and am using my domain admin account.
Adam:
- Yes, scheduled task is created
- Yes, the task runs the script using the following command: Powershell.exe -ExecutionPolicy Bypass \\server1\share1\PST_Discovery_Script.ps1
- Yes, it runs using my DA creds
- Yes, the file isn't created, though it still writes the registry value
I've checked the scheduled task (see my screenshot). I'm elevating Task Scheduler and manually running the task.
Just to make sure I understand correctly.
The scheduled task is created.
The scheduled task runs a script containing the following code.
The scheduled task executes the script using your credentials.
Your account is in the Domain Admin group.
By "not working", you mean the PST_Discovery.csv file isn't created.
Correct me if I misunderstood anything.
First: I'd verify the scheduled task is running in an elevated context (run as administrator). Even if the account executing the task is an administrator, the job needs to run in an elevated context.
Second: I'm curious how this works for you but not for me. I've never seen a call to Select-Object quite like the one you've got. If I try to pipe gci | Select-Object -Property @(...what you have...) | Export-Csv..., I get an exception complaining about FullName.
$registryPath = 'HKLM:\Software\CC\PST_Discovery_Script\Already_Run'
if (!(Test-Path -Path $registryPath)) {
    $dir = 'c:\Users'
    $ext = 'pst'
    Get-ChildItem -Path $dir -Filter *$ext -Recurse |
        Select-Object -Property @(
            FullName
            LastAccessTime
            LastWriteTime
            CreationTime
            @{N = 'Owner'; E = {$_.GetAccessControl().Owner}}
            @{Name = "MegaBytes"; Expression = {"{0:F2}" -f ($_.Length / 1MB)}}
            @{N = 'Hostname'; E = {$env:computername}}
        ) |
        Export-Csv -Path c:\PST_Discovery.csv -Append -NoTypeInformation
    New-Item -Path HKLM:\Software\CC\PST_Discovery_Script -Name Already_Run -Force
}
You're 100% sure that works? I had to change that snippet to the following:
Select-Object -Property @(
    , 'FullName'
    , 'LastAccessTime'
    , 'LastWriteTime'
    , 'CreationTime'
    @{N = 'Owner'; E = {$_.GetAccessControl().Owner}}
    @{Name = "MegaBytes"; Expression = {"{0:F2}" -f ($_.Length / 1MB)}}
    @{N = 'Hostname'; E = {$env:computername}}
)
I'm running PowerShell 5 on Windows 10, executing your example from the CLI.
I copied the code format you laid out, and it worked fine for me.
A quick and easy way to capture anything going wrong is to set the error-action preference to Stop at the start of your script, so that the process exits at the first error and the error flows back to the scheduled task's Last Run Result. From there you should be able to see the exception.
$ErrorActionPreference = 'Stop'
That should at least give you a starting point if you haven't already solved it. If the Last Run Result is anything other than 0, you know to build in some error handling with try/catch, and perhaps an output log, to get to the bottom of it.
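A sketch of what that might look like wrapped around the original script (the error-log path is a hypothetical example); a non-zero exit code surfaces in the task's Last Run Result:

```powershell
$ErrorActionPreference = 'Stop'
try {
    # ...original Get-ChildItem / Export-Csv / New-Item body goes here...
    exit 0
}
catch {
    # Log the exception somewhere the task account can write to
    $_ | Out-File -FilePath 'C:\PST_Discovery_error.log' -Append
    exit 1
}
```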

How PowerShell handles returns from & calls

We have field devices, and we decided to use a PowerShell script to handle 'updates' for them in the future. The script runs every 5 minutes and executes rsync to see whether it should download any new files. If it sees any files of type .ps1, .exe, .bat, etc., it attempts to execute them using the & operator. At the conclusion of execution, the script writes the name of the executed file to an excludes file (so that rsync will not download it again) and removes the file. My problem is that the return from the executed code (called by &) behaves differently depending on how the main script is called.
This is the main 'guts' of the script:
Get-ChildItem -Path $ScriptDir\Installs\* -Include @("*.ps1","*.exe","*.cmd","*.VBS","*.MSI") | ForEach {
    Write-Verbose "Executing: $_"
    & $_
    $CommandName = Split-Path -Leaf $_
    Write-Verbose "Adding $CommandName to rsync excludes"
    Write-Output "$CommandName" | Out-File -FilePath $ScriptDir\excludes -Append -Encoding ASCII
    Write-Verbose "Deleting '$_'"
    Remove-Item $_
}
When invoking PowerShell (powershell.exe -ExecutionPolicy Bypass) and then executing the script (.\Update.ps1 -Verbose), the script runs perfectly (i.e. the file is written to excludes and deleted) and you can see the verbose output (writing and deleting).
If you run the following (similar to Task Scheduler): powershell.exe -ExecutionPolicy Bypass -NoProfile -File "C:\Update.ps1" -Verbose, you can see the new script get executed, but none of the steps afterwards execute (no adding to excludes, no removing the file, and no verbose output).
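One way to narrow this down (a diagnostic sketch, not a fix) is to wrap the & call so that any terminating error or unexpected exit from the child script is logged, and the bookkeeping steps run regardless:

```powershell
Get-ChildItem -Path $ScriptDir\Installs\* -Include @("*.ps1","*.exe","*.cmd","*.VBS","*.MSI") | ForEach-Object {
    $file = $_   # keep a reference; inside catch, $_ becomes the error record
    try {
        Write-Verbose "Executing: $file"
        & $file.FullName
        Write-Verbose "Child exit code: $LASTEXITCODE"
    }
    catch {
        Write-Warning "Execution of $file failed: $_"
    }
    # Bookkeeping runs regardless of how the child behaved
    $file.Name | Out-File -FilePath $ScriptDir\excludes -Append -Encoding ASCII
    Remove-Item $file.FullName
}
```

If a child script is terminating the whole host under -File, the warning (or its absence) in the task's captured output should make that visible.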