SFTP upload of files that were created since the last run - powershell

I am very new to PowerShell and I am in the process of writing a script that performs an SFTP file transfer via WinSCP. I will then be creating a Task on Windows Task Scheduler to run this script every 15 minutes indefinitely. Currently I have this line of code that gets all files in a directory whose last write time is more than 20 seconds old:
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
where { $_.LastWriteTime -lt (Get-Date).AddSeconds(-20) }
I have been told that this needs to be changed so that it gets all files since the last time the Task was run (15 minutes prior) instead, but I have had very little luck in finding the answer.
I have tried using Get-ScheduledTask but that only seems to get me basic information about the task and doesn't seem like it is what I need for my script. Also, I have already downloaded the WinSCP .dll file and unblocked it in PowerShell. Any help is welcome, TIA.

Using the time the task last ran is, in my opinion, not reliable. There's still room to miss some files or to transfer some files repeatedly.
Instead, consider remembering the timestamp of the most recently uploaded file.
Assuming you use Session.PutFiles, you can use code like this:
$transferResult =
    $session.PutFiles($sourcePath, $destPath, $False, $transferOptions)
$transferResult.Check()

# Find the latest uploaded file
$latestTransfer =
    $transferResult.Transfers |
        Sort-Object -Property @{ Expression = { (Get-Item $_.Source).LastWriteTime } } `
            -Descending |
        Select-Object -First 1
Then save the timestamp of $latestTransfer to a file for the next run. Or loop the code with a 15-minute delay instead of scheduling it every 15 minutes.
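A minimal sketch of that approach, with placeholder paths and file names:
# Placeholder path for storing the timestamp between runs
$timestampFile = "C:\Users\lsarm\IHS\lastupload.txt"

# Load the timestamp of the most recent file uploaded by the previous run
$lastTimestamp = [datetime]::MinValue
if (Test-Path $timestampFile) {
    $lastTimestamp = [datetime](Get-Content $timestampFile -Raw)
}

# Collect only files newer than that timestamp
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $_.LastWriteTime -gt $lastTimestamp }

# ... PutFiles / Check / $latestTransfer as shown above ...

# Remember the timestamp of the latest uploaded file for the next run
if ($latestTransfer) {
    (Get-Item $latestTransfer.Source).LastWriteTime.ToString("o") |
        Set-Content $timestampFile
}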
Another option is to remember the already transferred files.
Both options are covered in more detail in:
How do I transfer new/modified files only?
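For the second option (remembering the already transferred files), a rough sketch with a placeholder log file:
# Placeholder log of file names that were already uploaded
$uploadedLog = "C:\Users\lsarm\IHS\uploaded.txt"
$alreadyUploaded = @()
if (Test-Path $uploadedLog) { $alreadyUploaded = @(Get-Content $uploadedLog) }

# Collect only files that have not been uploaded before
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $alreadyUploaded -notcontains $_.Name }

# After a successful transfer, append the new names to the log
$filelist | ForEach-Object { Add-Content $uploadedLog $_.Name }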

Related

Sent mail based on time that previous mail was sent

I have a script that checks if a site is online and sends a mail if it's down.
The script is configured with a scheduled task that runs every 30 minutes.
The problem is the following:
If the site is down during the weekend or evening (or a day when I'm not monitoring the mailbox), the mails keep being sent.
I was wondering, what method could I use to only send mails if the last mail was sent 3 hours before?
That way, I can send a mail only once every 3 hours.
I have researched the use of registry keys but was wondering if this would be the correct approach.
Rather than the registry, I'd use a simple configuration file stored in ProgramData (or AppData, should you need a per-user configuration).
This makes the process of loading/saving parameters and adding new ones very easy.
Also, should you need to save logs and / or other data, you can just put them inside that same folder.
$ConfigFullPath = "$env:APPDATA\My Monitoring solution\config.json"
# This creates the config file if none is present.
if (-not (Test-Path $ConfigFullPath)) {
    New-Item -ItemType File -Path $ConfigFullPath -Value ([PSCustomObject]@{ 'LastEmailSent' = [datetime]::MinValue } | ConvertTo-Json) -Force
}
$ConfigFileParams = ConvertFrom-Json -InputObject (Get-Content $ConfigFullPath -Raw)
$SendEmail = ([datetime]::UtcNow - ([datetime]$ConfigFileParams.LastEmailSent)).TotalHours -ge 3
if ($SendEmail) {
    try {
        # Send-MailMessage -ErrorAction Stop
        # Once the email is sent, we update the config
        $ConfigFileParams.LastEmailSent = [datetime]::UtcNow
        $ConfigFileParams | ConvertTo-Json | Out-File $ConfigFullPath
    }
    catch {
        # Manage what to do in case of failure
    }
}
That being said, you can definitely use the registry to do the same.
For convenience and ease of use, though, I strongly suggest the simpler JSON-file-based approach.
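If you do go the registry route, a sketch with a made-up key path boils down to a property read and write:
# Made-up registry location for the timestamp
$regPath = "HKCU:\Software\MyMonitoringSolution"
if (-not (Test-Path $regPath)) { New-Item -Path $regPath -Force | Out-Null }

# Read the stored value, falling back to MinValue when it does not exist yet
$stored = (Get-ItemProperty -Path $regPath -Name LastEmailSent -ErrorAction SilentlyContinue).LastEmailSent
$lastEmailSent = if ($stored) { [datetime]$stored } else { [datetime]::MinValue }

if (([datetime]::UtcNow - $lastEmailSent).TotalHours -ge 3) {
    # Send-MailMessage ... then record when we sent it
    Set-ItemProperty -Path $regPath -Name LastEmailSent -Value ([datetime]::UtcNow.ToString("o"))
}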
I think the best option would be to write the starting hour to a file saved on disk, and every time you run the script test whether
(currentHour - hourFirstSent) % 3 == 0 && currentMinute < 30
Name the file yyyy-mm-dd; if that file exists, read the starting hour from it, otherwise create it and save the starting hour in it.
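A rough sketch of that idea, with a made-up folder for the per-day marker files:
# Made-up folder for the per-day marker files
$markerDir = "C:\ProgramData\SiteMonitor"
$markerFile = Join-Path $markerDir ((Get-Date).ToString("yyyy-MM-dd") + ".txt")

if (-not (Test-Path $markerFile)) {
    # First alert of the day: record the starting hour
    New-Item -ItemType Directory -Path $markerDir -Force | Out-Null
    (Get-Date).Hour | Set-Content $markerFile
}
$hourFirstSent = [int](Get-Content $markerFile)

# With a 30-minute schedule, this fires only once every 3 hours
if ((((Get-Date).Hour - $hourFirstSent) % 3 -eq 0) -and ((Get-Date).Minute -lt 30)) {
    # Send-MailMessage ...
}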

Powershell Delete Locked File But Keep In Memory

Until recently, we've been deploying .exe applications by simply copying them manually to the destination folder on the server. Often, though, the file was already running at the time of deployment (the file is called from a SQL Server job), sometimes even as multiple instances. We don't want to kill the process while it's running. We also can't wait for it to finish, because it keeps on being invoked, sometimes multiple times concurrently.
As a workaround, what we've done is "cut and paste" the .exe file into another folder via Windows Explorer. Apparently, this moves the file (effectively a delete) but keeps it in memory, so the processes that are using it can continue without issues. Then we'd put the new file there, and any later invocation would pick it up.
We've now moved to an automated deploy tool and we need an automated way of doing this.
Stop-Process -name SomeProcess
in PowerShell would kill the process, which I don't want to do.
Is there a way to do this?
(C# would also be OK.)
Thanks,
function moverunningprocess($process, $path)
{
    # Normalize the path (strip a trailing backslash)
    if ($path.Substring($path.Length - 1, 1) -eq "\") { $path = $path.Substring(0, $path.Length - 1) }
    $fullpath = $path + "\" + $process
    $movetopath = $path + "--Backups\$(Get-Date -f MM-dd-yyyy_HH_mm_ss)"
    $moveprocess = $false

    # Check whether the executable at $fullpath is currently running
    $runningprocess = Get-WmiObject Win32_Process -Filter "name = '$process'" | Select-Object CommandLine
    foreach ($tp in $runningprocess)
    {
        if ($null -ne $tp.CommandLine) {
            $p = $tp.CommandLine.Replace('"', '').Trim()
            if ($p -eq $fullpath) { $moveprocess = $true }
        }
    }

    # If it is running, move the folder contents to a timestamped backup folder;
    # the running process keeps its mapping of the moved file and continues unaffected
    if ($moveprocess -eq $true)
    {
        New-Item -ItemType Directory -Force -Path $movetopath
        Move-Item -Path "$path\*.*" -Destination "$movetopath\"
    }
}
moverunningprocess "processname.exe" "D:\Programs\ServiceFolder"
Since you're utilizing SQL Server to call the EXE, why not add a table that contains the path to the latest version of the file and modify the code that fires the EXE? That way, when a new version is rolled out, you can create a new folder, place the file in it, and update the table to point to it. That allows any still-active threads to keep access to the old version while any new threads pick up the new executable. You can then delete the old file once it's no longer needed.
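As a loose illustration of that idea, assuming the job step can run PowerShell with the SqlServer module available, and using made-up server, database, table, and column names:
# Made-up table: dbo.ExeVersions(Path, IsCurrent)
$exePath = (Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDb" `
    -Query "SELECT TOP 1 Path FROM dbo.ExeVersions WHERE IsCurrent = 1").Path

# Launch whatever version the table currently points to
Start-Process -FilePath $exePath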

Powershell - Check if file is finished writing

I have a powershell code that acts as a file listener on a given folder path. The listener kicks off a command line call to another program that opens and plays with the file.
The problem is that the PowerShell code immediately kicks off the command line call as soon as any file is put into this folder. This is a problem if the file is very large (say 100+ MB), because when a person copies the file into the folder, the file may be only 5% done 'writing' when the command kicks off, tries to open the file, and fails.
Is there a way in PowerShell to check if a file is still being written to? That way I could build a loop that checks every x seconds and only runs once the write has completed.
Does a file maintain a "lock" while it is being written to? Can this be checked for in PowerShell?
Thanks everyone!
There may be a lock check available in System.IO.FileInfo, or somewhere like that, but I use a simple length check. It goes in the called script, not the file-watcher script.
$LastLength = 1
$NewLength = (Get-Item $FileName).Length
while ($NewLength -ne $LastLength) {
    $LastLength = $NewLength
    Start-Sleep -Seconds 60
    $NewLength = (Get-Item $FileName).Length
}
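If you do want an explicit lock check rather than the length check, one common approach (not part of the answer above) is to try opening the file exclusively and treat a failure as "still being written":
function Test-FileLocked($path) {
    try {
        # Opening with FileShare 'None' fails while another process still has the file open
        $stream = [System.IO.File]::Open($path, 'Open', 'ReadWrite', 'None')
        $stream.Close()
        return $false
    }
    catch {
        return $true
    }
}

# Poll until the writer releases the file
while (Test-FileLocked $FileName) {
    Start-Sleep -Seconds 10
}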
To answer your main question, which is how to check the status of a file download: the easiest way is to check the file's last modified time and see whether more than 5 minutes have passed since then (just to be on the safe side for network latency, etc.).
I had multiple files, so the code below works at the folder level, but you could simply change the path for a single file too.
# Path being monitored (folder level here; point it at a single file if you prefer)
$path = "C:\Downloads\incoming"
# Treat the download as complete once nothing has been written for 5 minutes
$timespan = New-TimeSpan -Minutes 5

# Print user feedback to show whether downloading has completed
Write-Host "started downloading"

function status_checker() {
    # Re-read the last write time on every check
    $lastWrite = (Get-Item $path).LastWriteTime
    if (((Get-Date) - $lastWrite) -gt $timespan) {
        Write-Host "Downloading completed"
        break
    } else {
        Write-Host "still downloading" (Get-Date)
    }
}

# Check every 5 minutes
while (1)
{
    status_checker
    Start-Sleep -Seconds 300
}

Starting an exe file with parameters on a remote PC

We have a program running on about 400 PCs (All W7). This program is called Wisa.
We receive regular updates for this program, named something like wisa_update1.0.exe, wisa_update1.1.exe, wisa_update2.0.exe, etc. The users cannot do the update themselves due to account restrictions.
We manage to do the update once and distribute it with a copy-item to all PCs. Then with Enter-PSSession I can go to each PC and update the program with the following command:
wisa_update3.0 /verysilent
(with the argument /verysilent no questions are asked)
This is already a major gain in time, but I want to do the update more automatically.
I have a file "pc.txt" with all 400 PCs in it. I use this file already for the Copy-Item via Get-Content. Now I want to use this file to do the updates with the above command, but I can't find a good way to use a remote executable with a parameter in PowerShell.
What you want to do is load the list with Get-Content -Path $PClist and then run your script actions in a foreach loop. You'll want to adapt this example to your own script:
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    # code actions to perform (see the sketch below)
}
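For the remote execution itself, a minimal sketch using Invoke-Command; the installer path is only an example, so point it at wherever you copied the update:
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        # Example path; adjust to where the update was copied on the remote PC
        & 'C:\Temp\wisa_update3.0.exe' /verysilent
    }
}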
Also, you can use multithreading and get it done in a fraction of the time (provided you have a good machine). The link below explains how to do it well.
http://www.get-blog.com/?p=22

copy / create multiple files

I need to first create and then copy some hundreds of folders & files via powershell (first create them on a local store and then copy them to a remote store).
However, when my foreach loop runs, every 40th or so write attempt fails because "another process" is blocking the file or folder.
For now I've worked around the issue with a simple sleep (100 ms) between every file creation. However, I wonder if there is a better way to do this, especially since when copying multiple files the right sleep time depends on the network latency, which doesn't seem like a good solution to me.
Is there a way to "wait" until the write operation of a file has completed before starting another operation? Or to check if a file is still blocked by a process and wait until it's free again?
Have you tried running your code as a job? Example:
foreach ($file in $files) {
    $job = Start-Job -ScriptBlock {
        # operation here..
    } | Wait-Job
    # Log the result of the job, e.g. use '$job | Receive-Job' to get its output
}
You could also extend it to create multiple jobs, and then use Get-Job | Wait-Job to wait for them all to finish before you proceed.
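For example, a small sketch of that parallel variant; the script block body and the destination path are just placeholders:
$jobs = foreach ($file in $files) {
    Start-Job -ScriptBlock {
        param($f)
        # Placeholder operation: copy the file to the remote store
        Copy-Item -Path $f -Destination '\\server\share'
    } -ArgumentList $file
}

# Wait for all jobs to finish, then collect their output
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job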