Trouble Controlling PowerShell Workflow Timeout

I have a PowerShell workflow that generates the following error:
"The operation did not complete within the allotted timeout of
00:00:30. The time allotted to this operation may have been a portion
of a longer timeout"
The workflow script is:
Workflow Test-Me {
    Param (
        $Path = "c:\temp",
        # $Days is referenced below via $using:Days, so it needs to be declared here
        $Days
    )
    $Files = InlineScript {
        Get-ChildItem -Path $using:Path -File -Recurse -Force |
            Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days)) } |
            Select-Object FullName
    }
    Foreach -Parallel ($File in $Files) {
        Sequence {
            InlineScript {
                Remove-Item -Path $using:File.FullName -Force -ErrorAction:SilentlyContinue
            } -PSActionRunningTimeoutSec 300
        }
    }
}
The line that generates the error is the InlineScript that handles the Remove-Item operation. It runs for about 30 seconds after reaching that operation, then quits with the error referenced above. I've added the -PSActionRunningTimeoutSec parameter to the InlineScript and that didn't affect the error. I've also set the workflow common parameters as follows:
-PSRunningTimeoutSec = 300
-PSElapsedTimeoutSec = 0
I call the workflow as follows:
PS C:\> . "c:\path\to\Test-Me.ps1"
PS C:\> Test-Me -PSRunningTimeoutSec 300 -PSElapsedTimeoutSec 0
There's obviously a timeout somewhere that I can't find or don't know about, but PowerShell isn't being specific. What timeout did I miss and how do I change it?
References:
about_WorkflowCommonParameters
PowerShell Workflows: Using Parameters
Syntactic Differences Between Script Workflows and Scripts
about_InlineScript

Related

PowerShell: Write event logs

I have a script which moves some files from a folder to the temp folder, archives them, and cleans the temp folder at the end.
I want my script to also write information about this to the Windows event log.
Here is my script:
Get-ChildItem C:\Users\Administrator\Desktop\test1\ | Where-Object {$_.LastWriteTime -lt "09/24/2018 09:00 PM"} | Move-Item -Destination C:\Users\Administrator\Desktop\data\
Compress-Archive -path C:\Users\Administrator\Desktop\data\ -CompressionLevel Optimal -DestinationPath C:\Users\Administrator\Desktop\data1\test.zip
Remove-Item C:\Users\Administrator\Desktop\data\*
I want to add code that writes an event to the Windows event log for any error.
Per the comments, you can use Write-EventLog to write to the Windows event log. If you want to record any errors that occur during those commands, then you probably want to use a Try..Catch block to catch the errors and handle them:
Try {
    $PrevEAP = $ErrorActionPreference
    $ErrorActionPreference = 'Stop'

    Get-ChildItem C:\Users\Administrator\Desktop\test1\ | Where-Object {$_.LastWriteTime -lt "09/24/2018 09:00 PM"} | Move-Item -Destination C:\Users\Administrator\Desktop\data\
    Compress-Archive -Path C:\Users\Administrator\Desktop\data\ -CompressionLevel Optimal -DestinationPath C:\Users\Administrator\Desktop\data1\test.zip
    Remove-Item C:\Users\Administrator\Desktop\data\*
}
Catch {
    Write-Error $_

    $ErrorEvent = @{
        LogName   = 'Application'
        Source    = 'YourScript'
        EventID   = 123
        EntryType = 'Information'
        Message   = $_
    }
    Write-EventLog @ErrorEvent
}
Finally {
    $ErrorActionPreference = $PrevEAP
}
In order for an exception (error) to trigger a Try..Catch, the exception needs to be terminating (vs non-terminating). You can force cmdlets to throw terminating errors by setting the cmdlet's -ErrorAction parameter to Stop, or you can do this globally via the $ErrorActionPreference variable.
In the Catch block, the error is held in the special variable $_, so we can use Write-Error to still write it out to the console (if you want to), and then we use Write-EventLog to write it into the event log.
Customise LogName, Source, EventID, Message etc. as per your needs. Note that LogName needs to be one of the existing logs and EntryType needs to be one of the valid entry types (Information, Warning, Error).

Move-Item : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input?

I wanted to simply move files older than 30 days from "x" to "y"; however, I get the following error:
Move-Item : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
I thought the script was simple enough. Any suggestions?
# Move All Files From One Location To New Location Older than 30 day(s)
$LocationX = "\\Server\LogFiles\"
$LocationY = "\\BackupServer\LogFiles\"
$Daysback = "-30"
$CurrentDate = Get-Date
$DatetoMove = $CurrentDate.AddDays($Daysback)
Get-ChildItem $LocationX | Where-Object { $_.LastWriteTime -lt $DatetoMove } | Move-Item $LocationX $LocationY -Force
Look at Example 4 from Move-Item on TechNet:
PS C:\> Get-ChildItem -Path ".\*.txt" -Recurse | Move-Item -Destination "C:\TextFiles"
You will see the destination is the only path parameter specified. Notice the source path is omitted, since the FileInfo object being passed down the pipeline already carries that information.
In your pipeline you are filling in both path parameters yourself, so the object coming down the pipeline has nowhere left to bind, hence the latter half of the error you are getting:
the input and its properties do not match any of the parameters that take pipeline input.
So, using your code and the logic from the help, you should be able to do this and get the results you expect:
Get-ChildItem $LocationX |
Where-Object { $_.LastWriteTime -lt $DatetoMove } |
Move-Item -Destination $LocationY -Force

Powershell to Event

I've tried to make a PowerShell script that writes an event when the text files in a folder are older than 15 minutes.
This is the code I'm trying to use:
$targetfile=get-childitem C:\Users\user\Desktop\bizin\*.txt
if ($targetfile.AddMinutes -15 )
{
write-eventlog -logname Application -source ESENT -eventID 9999 -entrytype Error -message "No new files in 15min gate is down!" -category 3
'create event log to say bad things have happened'
}
I always get the event when I run the script, but I would like to only get an event when a file is older than 15 minutes. It's going to be a scheduled task that runs every 15 minutes.
You need to access the LastWriteTime property of the file and compare it with a DateTime object:
$targetFiles = Get-ChildItem -Path 'C:\Users\user\Desktop\bizin\' -Filter *.txt
foreach ($targetFile in $targetFiles)
{
    if ($targetFile.LastWriteTime -lt [DateTime]::Now.AddMinutes(-15))
    {
        # Write the eventlog....
    }
}
Note: You can use the -Filter parameter of the Get-ChildItem cmdlet to retrieve only .txt files.
Here is the same example using aliases and the pipeline in just one line:
gci -Path 'C:\Users\user\Desktop\bizin\' -Filter *.txt | ? { $_.LastWriteTime -lt [DateTime]::Now.AddMinutes(-15) } | % { <# Write the eventlog... #> }

Wait for a script to finish

I am using a script which backs up the folders and then, in the next block, tries to delete those folders from their original location. This is the script:
if ($confirmation -eq 'y') {
    # 3. BACKUP script
    ./bakup_mysite.ps1

    # 4. DELETE CONTENTS OF my_site
    Get-ChildItem "C:\inetpub\wwwroot\my_site\" -Recurse | % {
        Remove-Item $_.FullName -Recurse -Force
    }
}
If I put a Read-Host after step 3, it does stop and ask the user to press a key, and then it runs the delete block. But I want to put a wait in so the user doesn't have to press any key and everything happens automatically.
This is the backup code in bakup_mysite.ps1 that gets called:
$Service_folder_name = 'C:\Services\'
$Pics_folder_name = 'C:\Pics\'
$Date = Get-Date
$folder_date = $Date.ToString("yyyy-MM-dd_HHmm")
$backup_folder_name = 'c:\_Backups\my_site\' + $folder_date

if (!(Test-Path -Path $backup_folder_name)) {
    New-Item $backup_folder_name -Type directory
}
if ((Test-Path -Path $Pics_folder_name)) {
    gi $pics_folder_name | .\Library\out-zip.ps1 $backup_folder_name\pics.zip $_
}
if ((Test-Path -Path $Service_folder_name)) {
    gi $Service_folder_name | .\Library\out-zip.ps1 $backup_folder_name\Services.zip $_
}
For PowerShell cmdlets or functions, PowerShell waits for each command to finish before starting the next one. If that is not the case for your backup script, the trick is to pipe it to Out-Null:
./bakup_mysite.ps1 | Out-Null
PowerShell will wait until your script has exited before continuing.
Another option is to use a background job:
$BackupJob = Start-Job -FilePath "\Path\To\bakup_mysite.ps1"
Wait-Job $BackupJob
PowerShell will wait until the job $BackupJob has completed before moving on to the next commands.

Backing up .thumbnails

I have written a backup script which backs up files and logs errors. It works fine, except for some .thumbnails files; many other .thumbnails do get copied!
Of 54,000 files copied, the same 480 .thumbnails never get copied or logged. I will be checking the attributes; however, I feel the Copy-Item function should have done the job. Any other recommendations are welcome as well, but please stay on topic. Thanks!
Here is my backUP script:
Function backUP {
    Param ([string]$destination1, $list1)
    $destination2 = $destination1

    # Extract the last path segment of the destination to build the backup-log folder path
    $index = $destination2.LastIndexOf("\")
    $count = $destination2.length - $index
    $source1 = $destination2.Substring($index, $count)
    $finalstr2 = $logdrive + $source1

    Foreach ($item in $list1) {
        Copy-Item -Container:$true -Recurse -Force -Path $item -Destination $destination1 -ErrorAction Continue
        if (-not $?) {
            Write-Output "ERROR de copiado : " $error | Format-List | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            Foreach ($erritem in $error) {
                Write-Output "Error Data:" $erritem.TargetObject | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            }
            $error.Clear()
        }
    }
}
Are you sure your backUP function is receiving .thumbnails files in $list1? If the files are hidden, then Get-ChildItem will only return them if the -Force switch is used.
As for other recommendations, Robocopy.exe is a good dedicated tool for performing file synchronization.
Apparently I did not have permissions to the thumbnails folder.
With that set, the script worked fine!