Wait for a script to finish - PowerShell

I am using a script which backs up some folders and then, in the next block, tries to delete those folders from their original location. This is the script:
if ($confirmation -eq 'y') {
    # 3. BACKUP script
    ./bakup_mysite.ps1

    # 4. DELETE CONTENTS OF my_site
    Get-ChildItem "C:\inetpub\wwwroot\my_site\" -Recurse | % {
        Remove-Item $_.FullName -Recurse -Force
    }
}
If I put a Read-Host after step 3, it does stop and ask the user to press a key, and only then does the next block run and delete the files. But I want a wait instead, so the user doesn't have to press any key and everything happens automatically.
This is the backup code which gets called from my_site.ps1:
$Service_folder_name = 'C:\Services\'
$Pics_folder_name = 'C:\Pics\'

$Date = Get-Date
$folder_date = $Date.ToString("yyyy-MM-dd_HHmm")
$backup_folder_name = 'c:\_Backups\my_site\' + $folder_date

if (!(Test-Path -Path $backup_folder_name)) {
    New-Item $backup_folder_name -Type Directory
}

if ((Test-Path -Path $Pics_folder_name)) {
    gi $pics_folder_name | .\Library\out-zip.ps1 $backup_folder_name\pics.zip $_
}

if ((Test-Path -Path $Service_folder_name)) {
    gi $Service_folder_name | .\Library\out-zip.ps1 $backup_folder_name\Services.zip $_
}

For PowerShell cmdlets and functions, PowerShell waits for the command to finish before starting the next one. If that is not the case for your backup script, the trick is to pipe it to Out-Null:
./bakup_mysite.ps1 | Out-Null
PowerShell will wait until your script has exited before continuing.
Another option is to use a background job:
$BackupJob = Start-Job -FilePath "\Path\To\bakup_mysite.ps1"
Wait-Job $BackupJob
PowerShell will wait until the job $BackupJob has completed before moving on to the next command.
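Applied to the confirmation block from the question, the job-based option might look like this (a sketch only; bakup_mysite.ps1 and the my_site path are taken from the question, and Receive-Job is added here just to surface the backup script's output):
if ($confirmation -eq 'y') {
    # 3. BACKUP script - run it as a background job and block until it finishes
    $BackupJob = Start-Job -FilePath ".\bakup_mysite.ps1"
    Wait-Job $BackupJob | Out-Null
    Receive-Job $BackupJob    # show whatever the backup script wrote

    # 4. DELETE CONTENTS OF my_site - only reached once the job has completed
    Get-ChildItem "C:\inetpub\wwwroot\my_site\" -Recurse | ForEach-Object {
        Remove-Item $_.FullName -Recurse -Force
    }
}
Note that a job runs in its own session, so any relative paths inside bakup_mysite.ps1 (such as .\Library\out-zip.ps1) may need to be made absolute for this variant to work.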

Related

Script file: wait until file generation has finished

I have a script file (a batch file) which generates three files in a specific folder. Then I have a .ps1 file which copies / moves the generated files to another server / folder. Separately, everything works properly.
I'd like to merge the two, if possible, with a wait between the scripts: the copy / move .ps1 should only be launched once the three files have been generated correctly.
The following assumes:
that the files are created and written in full in a single operation.
that it is the appearance of a *.zip file that signals that all files of interest have been created (though they may still be in the process of being written to), as you've indicated in a later comment.
$inFolder = '.'      # set to the folder of interest.
$outFolder = './out' # ditto

Write-Verbose -vb 'Waiting for a *.zip file to appear...'
while (-not (Test-Path "$inFolder/*.zip")) { Start-Sleep 1 }

# Get a list of all files.
$files = Get-ChildItem -File $inFolder

Write-Verbose -vb 'Waiting for all files to be written completely...'
$files | ForEach-Object {
    do {
        # Infer from the ability to obtain an exclusive lock that the file
        # has been written in its entirety.
        try { [IO.File]::Open($_.FullName, 'Open', 'Read', 'None').Dispose(); return }
        catch { Start-Sleep 1 }
    } while ($true)
}

# Move the files elsewhere.
Write-Verbose -vb 'Moving...'
$files | Move-Item -Destination $outFolder -WhatIf
Note: The -WhatIf common parameter in the last command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
param(
    [String]$sourceDirectory = "c:\tmp\001",
    [String]$destDirectory = "c:\tmp\001"
)

Get-ChildItem $sourceDirectory | ? {
    # This step waits until no other process holds a write lock on the file.
    [bool]$flag = $false
    while (!$flag) {
        try {
            # Opening for write fails while another process is still writing the file.
            $FileStream = [System.IO.File]::Open($_.FullName, 'Open', 'Write')
            $FileStream.Close()
            $FileStream.Dispose()
            $flag = $true
        }
        catch {
            Start-Sleep -Seconds 1
        }
    }
    $true
} | Copy-Item -Destination $destDirectory
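Saved as a script, the snippet above could then be called with explicit paths, for example (wait-copy.ps1 and both directories are placeholders, not names from the question):
# Copies each file from the source folder only once it can be opened for writing,
# i.e. once nothing else is holding a lock on it.
.\wait-copy.ps1 -sourceDirectory 'E:\incoming' -destDirectory 'H:\processed'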

PowerShell: move files to a new folder once they are no longer being written to in the source folder

I have a PowerShell script that moves files from a source directory over to a target directory every 15 minutes. Files of around 1 MB are written into the source directory by an SFTP server, so the files can be written at any time by the SFTP clients.
The Move-Item command is moving files, however it seems to move them without making sure the file isn't still being written to (in use?).
I need some help coming up with a way to write the files from the source to the target while making sure the entire file gets to the target. Has anyone run across this issue before with PowerShell?
I searched and found a few functions that claimed to solve the problem, but when I tried them out I wasn't seeing the expected results.
The existing PowerShell script is below:
Move-Item "E:\SFTP_Server\UserFolder\*.*" "H:\TargetFolder\" -Verbose -Force *>&1 | Out-File -FilePath E:\Powershell_Scripts\LOGS\MoveFilesToTarget-$(get-date -f yyyy-MM-dd-HH-mm-ss).txt
I ended up cobbling together a few things and got this working as I wanted. Basically I'm looping through the files and checking the length of each file once, then waiting a second and checking the length again to see if it has changed. This seems to be working well. Here's a copy of the script in case it helps anyone in the future!
$logfile = "H:\WriteTest\LogFile_$(Get-Date -Format `"yyyyMMdd_hhmmsstt`").txt"

function log($string, $color)
{
    if ($color -eq $null) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $string | Out-File -FilePath $logfile -Append
}

$SourcePath = "E:\SFTP_Server\UserFolder\"
$TargetPath = "H:\TargetFolder\"

$Stuff = Get-ChildItem "$SourcePath\*.*" | Select-Object name, fullname

ForEach ($I in $Stuff) {
    log "Starting to process $($I.name)" green
    $newfile = $TargetPath + $I.name

    $LastLength = 1
    $NewLength = (Get-Item $I.fullname).length

    while ($NewLength -ne $LastLength) {
        $LastLength = $NewLength
        Start-Sleep -Seconds 1
        log "Waiting 1 Second" green
        $NewLength = (Get-Item $I.fullname).length
        log "Current File Length = $NewLength" green
    }

    log "File Not In Use - Ready To Move!" green
    Move-Item $I.fullname $TargetPath
}

Trouble Controlling PowerShell Workflow Timeout

I have a PowerShell workflow which is generating the error:
"The operation did not complete within the allotted timeout of
00:00:30. The time allotted to this operation may have been a portion
of a longer timeout"
The workflow script is:
Workflow Test-Me () {
    Param
    (
        $Path = "c:\temp"
    )

    $Files = InlineScript {
        Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object {$_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days))} | Select FullName
    }

    Foreach -parallel ($File in $Files) {
        sequence {
            InlineScript {
                Remove-Item -Path $using:File.FullName -Force -ErrorAction:SilentlyContinue
            } -PSActionRunningTimeoutSec 300
        }
    }
}
The line that generates the error is the InlineScript which handles the Remove-Item operation. It runs for 30 seconds after reaching that operation and then quits with the error referenced above. I've added the -PSActionRunningTimeoutSec parameter to the InlineScript and that didn't affect the error. I've also set the workflow common parameters as follows:
-PSRunningTimeoutSec = 300
-PSElapsedTimeoutSec = 0
I call the workflow cmdlet with the following process:
PS C:\> . "c:\path\to\Test-Me.ps1"
PS C:\> Test-Me -PSRunningTimeoutSec 300 -PSElapsedTimeoutSec 0
There's obviously a timeout somewhere that I can't find or don't know about, but PowerShell isn't being specific. What timeout did I miss and how do I change it?
References:
about_WorkflowCommonParameters
PowerShell Workflows: Using Parameters
Syntactic Differences Between Script Workflows and Scripts
about_InlineScript

Monitor a log file with Get-Content -Wait until the end of the day

I have a script monitoring a log file with Get-Content -Wait so it picks up new lines as they are written. The logger changes the name of the log file every day, to System_$date.log.
I am trying to make my Get-Content wait for new lines but also break when the date changes, and then re-execute itself with the new date in the filename. Does anybody have an idea how this could be done?
Thanks
Edit
The script looks like this:
Get-Content System_201371.log -Wait | where {$_ -match "some regex"} |
    foreach {
        send_email($_)
    }
The file name, System_201371 here, changes every day; it will be System_201372 tomorrow, and so on. Also, this script runs as a service, and I need it to break and re-execute itself with the new filename.
You could use jobs for this. Read the file in a background job, and let the "foreground" script wait until the day has changed. Once the day has changed, kill the old job and start a new one looking at the new file.
while ($true)
{
    $now = Get-Date
    $fileName = 'System_{0}{1}{2}.log' -f $now.Year, $now.Month, $now.Day
    $fullPath = "some directory\$fileName"

    Write-Host "[$(Get-Date)] Starting job for file $fullPath"
    $latest = Start-Job -Arg $fullPath -ScriptBlock {
        param($file)

        # wait until the file exists, just in case
        while (-not (Test-Path $file)) { sleep -sec 10 }

        Get-Content $file -Wait | where {$_ -match "some regex"} |
            foreach { send_email($_) }
    }

    # wait until the day changes, or whatever would cause a new log file to be created
    while ($now.Date -eq (Get-Date).Date) { sleep -Sec 10 }

    # kill the job and start over
    Write-Host "[$(Get-Date)] Stopping job for file $fullPath"
    $latest | Stop-Job
}
This should always monitor today's log and detect the date change:
do {
    $CurrentDate = Get-Date -UFormat %Y%j
    $CurrentLog = ("C:\somepath\System_" + $CurrentDate + ".log")

    Start-Job -Name Tail -Arg $CurrentLog -ScriptBlock {
        param($CurrentLog)
        while (! (Test-Path $CurrentLog)) { sleep -sec 10 }
        Write-Host ("Monitoring " + $CurrentLog)
        Get-Content $CurrentLog -Wait | where {$_ -match "some regex"} |
            foreach { send_email($_) }
    }

    while ($CurrentDate -eq (Get-Date -UFormat %Y%j)) { sleep -sec 10 }
    Stop-Job -Name Tail
} while ($true)

Backing up .thumbnails with PowerShell

I have written a backup script which backs up and logs errors. It works fine, except for some .thumbnails; many other .thumbnails do get copied!
Of 54,000 files copied, the same 480 .thumbnails never get copied or logged. I will be checking the attributes, however I feel the Copy-Item function should have done the job. Any other recommendations are welcome as well, but please stay on topic, thanks!
Here is my backup script:
Function backUP { Param ([string]$destination1, $list1)
    $destination2 = $destination1

    # extract newly made string for the backup log
    $index = $destination2.LastIndexOf("\")
    $count = $destination2.length - $index
    $source1 = $destination2.Substring($index, $count)
    $finalstr2 = $logdrive + $source1

    Foreach ($item in $list1) {
        Copy-Item -Container: $true -Recurse -Force -Path $item -Destination $destination1 -ErrorAction Continue
        if (-not $?)
        {
            write-output "ERROR de copiado : " $error | format-list | out-file -Append "$finalstr2\GCI-ERRORS-backup.txt"
            Foreach ($erritem in $error) {
                write-output "Error Data:" $erritem.TargetObject | out-file -Append "$finalstr2\GCI-ERRORS-backup.txt"
            }
            $error.Clear()
        }
    }
}
Are you sure your backUP function is receiving .thumbnails files in $list1? If the files are hidden, then Get-ChildItem will only return them if the -Force switch is used.
As for other recommendations, Robocopy.exe is a good dedicated tool for performing file synchronization.
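As a rough sketch of both suggestions (all paths below are placeholders): -Force makes Get-ChildItem return hidden and system files, which many .thumbnails are, and Robocopy takes care of the copying, retries and logging by itself:
# Include hidden/system files when building the $list1 passed to backUP.
$list1 = Get-ChildItem 'C:\Users\someuser\Pictures' -Recurse -Force

# Alternatively, let Robocopy do the copy: /E copies subfolders (including empty ones),
# /R and /W limit retries on locked files, /LOG writes a report file.
robocopy 'C:\Users\someuser\Pictures' 'D:\Backup\Pictures' /E /R:3 /W:5 /LOG:D:\Backup\robocopy.log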
Apparently I did not have permissions on the .thumbnails folder. With that set, the script worked fine!