Powershell - Check if file is finished writing

I have PowerShell code that acts as a file listener on a given folder path. The listener kicks off a command-line call to another program that opens and plays with the file.
The problem is that the PowerShell code immediately kicks off the command-line call if any file is put into this folder. This is a problem if the file is very large (say 100+ MB), because when a person copies the file into the folder, the file may only be 5% done 'writing' when the command function kicks off and tries to open the file (and fails).
Is there a way in PowerShell to check if a file is still being written to? That way I could build a loop that would check every x seconds and only run once the write has completed.
Does a file maintain a "lock" if it is being written to? Can this be checked for in PowerShell?
Thanks everyone!

There may be a lock check available in System.IO.FileInfo, or somewhere like that, but I use a simple length check. It goes in the called script, not the file-watcher script.
# Poll the file's length until it stops growing.
$LastLength = 1
$NewLength = (Get-Item $FileName).Length
while ($NewLength -ne $LastLength) {
    $LastLength = $NewLength
    Start-Sleep -Seconds 60
    $NewLength = (Get-Item $FileName).Length
}
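If you do want a lock check instead, a common approach (a sketch, not part of the original answer) is to try opening the file with an exclusive handle and treat a failure as "still being written". This only works if the writing process actually keeps the file open for the duration of the copy, which Windows copy operations do:
function Test-FileLocked {
    param([string]$Path)
    try {
        # Ask for an exclusive handle; this fails while a writer holds the file.
        $stream = [System.IO.File]::Open($Path, 'Open', 'Read', 'None')
        $stream.Close()
        return $false   # open succeeded, nothing else holds the file
    } catch [System.IO.IOException] {
        return $true    # the writer still has the file open/locked
    }
}

# Wait until the copy has finished before kicking off the command-line call.
while (Test-FileLocked $FileName) {
    Start-Sleep -Seconds 5
}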

To answer your main question, which is how to check the status of a file download: the easiest way is to check the last-modified time of the file and see whether it has been more than 5 minutes since the last write (just to be on the safe side for network latency etc.).
I had multiple files, so the code below works at the folder level, but you could simply change the path for a single file too.
# Print user feedback so you can see when downloading has completed.
Write-Host "started downloading"

$path = "C:\Downloads"                # folder (or file) being watched - adjust as needed
$timespan = New-TimeSpan -Minutes 5   # how long the item must stay unmodified

function status_checker {
    $lastWrite = (Get-Item $path).LastWriteTime
    if (((Get-Date) - $lastWrite) -gt $timespan) {
        Write-Host "Downloading completed"
        break   # break propagates up and exits the caller's while loop
    } else {
        Write-Host "still downloading" (Get-Date)
    }
}

# Check every 5 minutes
while ($true) {
    status_checker
    Start-Sleep -Seconds 300
}

Related

SFTP upload of files that were created since the last run

I am very new to PowerShell and I am in the process of writing a script that performs an SFTP file transfer via WinSCP. I will then create a task in Windows Task Scheduler to run this script every 15 minutes indefinitely. Currently I have this line of code that gets all files in a directory whose last write time is more than 20 seconds old:
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddSeconds(-20) }
I have been told that this needs to be changed so that it gets all files created since the last time the task ran (15 minutes prior) instead, but I have had very little luck finding the answer.
I have tried using Get-ScheduledTask, but that only seems to give me basic information about the task and doesn't seem to be what I need for my script. Also, I have already downloaded the WinSCP .dll file and unblocked it in PowerShell. Any help is welcome, TIA.
Using the time the task last ran is, IMO, not reliable. There's still room for you to miss some files or transfer some files repeatedly.
Instead, consider remembering the timestamp of the most recently uploaded file.
Assuming you use Session.PutFiles, you can use code like this:
$transferResult =
    $session.PutFiles($sourcePath, $destPath, $False, $transferOptions)
$transferResult.Check()

# Find the latest uploaded file
$latestTransfer =
    $transferResult.Transfers |
        Sort-Object -Property @{ Expression = { (Get-Item $_.Source).LastWriteTime } } `
                    -Descending |
        Select-Object -First 1
And save the $latestTransfer timestamp to a file for the next run. Or loop the code with a 15-minute delay, instead of scheduling it every 15 minutes.
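For example, a minimal sketch of persisting that timestamp between runs (the stamp-file path and the use of the round-trip 'o' date format are my own choices, not from the original answer):
$stampFile = 'C:\Users\lsarm\IHS\lastUpload.txt'   # hypothetical state file

# On startup, read the timestamp saved by the previous run.
$lastUpload = if (Test-Path $stampFile) {
    [datetime](Get-Content $stampFile)
} else {
    [datetime]::MinValue   # first run: treat every file as new
}

# Select only files modified since the last successful upload.
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $_.LastWriteTime -gt $lastUpload }

# ... upload $filelist with $session.PutFiles() as above ...

# After the transfer, persist the timestamp of the newest uploaded file.
if ($latestTransfer) {
    (Get-Item $latestTransfer.Source).LastWriteTime.ToString('o') |
        Set-Content $stampFile
}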
Another option is to remember the already transferred files.
Both options are covered in more detail in:
How do I transfer new/modified files only?

Powershell - Loop run console app > Wait for Ctrl+C

I have a Windows console app that currently runs to process some files; at the end of the run, if successful, it starts a Windows service and I get the output > xxx service is now running, press control_c to exit.
The console app reads a config file to pull some parameters. I need to be able to re-run it multiple times, changing the parameters in the config file each time. To do this manually I'd do the following:
change the config file
run the app from PowerShell
wait for the message above to appear
press Ctrl+C to terminate
change the config file and run again
I thought it made sense to automate this in a PS script where I can just pass the config values for all the runs; the script then loops through the values, edits the config file and runs the exe.
The issue I have is that the loop gets "stuck" on the first run because the application is waiting for the Ctrl+C command, so it never progresses through the loop.
What I have at the moment looks like this:
foreach ($dt in $datesarr) {
    ## edit config values with stuff in $dt
    $output = (<path to app here>)
    while ($output[-1] -notlike "*Control-C*") {
        Start-Sleep -Seconds 10
    }
}
The problem is the script never reaches the while loop; it's just stuck after running the app, waiting for Ctrl+C... What I want it to do is launch the app, wait for it to get to the Ctrl+C bit, then exit the loop and pick the second value in the parameter.
Any thoughts would be hugely appreciated!
Try the following approach, which is based on direct use of the following, closely related .NET APIs:
System.Diagnostics.ProcessStartInfo
System.Diagnostics.Process
Instead of trying to programmatically press Ctrl-C, the process associated with the external program is simply killed (terminated).
# Remove the next line if you don't want to see verbose output.
$VerbosePreference = 'Continue'
$psi = [System.Diagnostics.ProcessStartInfo] @{
    UseShellExecute        = $false
    WindowStyle            = 'Hidden'
    FileName               = '<path to app here>'
    Arguments              = '<arguments string here>' # only if args must be passed
    RedirectStandardOutput = $true
    RedirectStandardError  = $true # optional - if you also want to examine stderr
}
Write-Verbose "Launching $($psi.FileName)..."
$ps = [System.Diagnostics.Process]::Start($psi)
Write-Verbose "Waiting for launched process $($ps.Id) to output the line of interest..."
$found = $false
while (
    -not $ps.HasExited -and
    -not ($found = ($line = $ps.StandardOutput.ReadLine()) -match 'Control-C')
) {
    Write-Verbose "Stdout line received: $line"
}
if ($found) {
    Write-Verbose "Line of interest received; terminating process $($ps.Id)."
    # Note: If the process has already terminated, this will be a no-op.
    # .Kill() kills only the target process itself.
    # In .NET Core / .NET 5+, you can use .Kill($true) to also
    # kill descendants of the process, i.e. all processes launched
    # by it, directly and via its children.
    $ps.Kill()
} else {
    Write-Error "Process $($ps.Id) terminated before producing the expected output."
}
$ps.Dispose()
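To cover the looping requirement from the question, the sequence above can be wrapped in the original foreach (a sketch; the config-editing step is left as a placeholder, as in the question):
foreach ($dt in $datesarr) {
    # ... edit config values with stuff in $dt here ...

    $ps = [System.Diagnostics.Process]::Start($psi)   # $psi built as above

    # Drain stdout until the app announces it is waiting for Ctrl+C.
    while (-not $ps.HasExited -and
           -not (($line = $ps.StandardOutput.ReadLine()) -match 'Control-C')) { }

    # Terminate this run, then move on to the next config value.
    if (-not $ps.HasExited) { $ps.Kill() }
    $ps.Dispose()
}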

Do Until with Get-SmbOpenFile

I'm trying to check whether any files are open on a server with PowerShell.
I've got the script below, which kind of works.
If I start it when no files are open, it checks first, waits 10 seconds and then prints the message "No files are opened". If I open any files and start the script, it says "Files are opened, please wait...", but when I close all files and disconnect all sessions it still says the same.
Clear-Host
$CheckOpenfiles = Get-SmbOpenFile
Do {
    "$(Get-Date) Files are opened, please wait..."
    Start-Sleep 10
} Until (!$CheckOpenfiles)
"No files are opened"
As LotPings notes in the comments, you assign your value to $CheckOpenfiles before you start looping. This means that it is not reevaluated in your Until conditional.
Do {
    "$(Get-Date) Files are opened, please wait..."
    Start-Sleep 10
} Until (!(Get-SmbOpenFile))
"No files are opened"

Powershell: Brute-forcing password-protected .zip file (speeding up the process)

First-time questioner, so here's hoping I'm doing it right. :)
A co-worker and I have been playing around with PowerShell, getting the lay of the land and seeing what you can do with it. Using info we found online (mostly here), we've managed to whip together a script to brute-force a password-protected .zip file using a .txt file containing a list of passwords:
# Stopwatch for measurement
$stopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$7zipExec = """-7z.exe (7zip) location-"""
$zipFile = """-.zip location-"""   # renamed from $input, which is a reserved automatic variable in PowerShell
$output = """-where to drop contents of .zip file-"""
$passwordfile = "-location of .txt file containing passwords-"
$windowStyle = "Hidden"
[long] $counter = 0

# Correct password is 12341234
foreach ($password in (Get-Content $passwordfile)) {
    $counter++
    Write-Host -NoNewLine "Attempt #$($counter): $password"
    $arguments = "x -o$output -p$password -aoa $zipFile"
    $p = Start-Process $7zipExec -ArgumentList $arguments -Wait -PassThru -WindowStyle $windowStyle
    if ($p.ExitCode -eq 0) {
        # Password OK
        Write-Host " ...OK!"
        Write-Host ""
        Write-Host "Password is $password, found it after $counter tries."
        break
    }
    elseif ($p.ExitCode -eq 2) {
        # Wrong password
        Write-Host " ...wrong"
    }
    else {
        # Unknown
        Write-Host " ...ERROR"
    }
}

# Halt the stopwatch and display the time spent for this process
$stopWatch.Stop()
Write-Host
Write-Host "Done in $($stopWatch.Elapsed.Hours) hour(s), $($stopWatch.Elapsed.Minutes) minute(s) and $($stopWatch.Elapsed.Seconds) second(s)"
Read-Host -Prompt "Press Enter to exit"
It actually works! Probably not as clean as it could be, but we've managed to reach our goal of making a functioning script.
However! It takes about 1 second for each password try, and if you have a file with, say, the 10,000 most common passwords...that could take a while.
So now we're trying to figure out how to speed up the process, but we've hit a wall and need help. I'm not asking for someone to get 'er done, but I would really appreciate some tips/tricks/hints for someone who has only recently started getting into Powershell (and loving it so far!).
Took a while to get back to this (real life and all that), but while I did not manage to speed up the script, I did manage to speed up the process.
What I do now is run 4 instances of the script simultaneously, using an extra PS script to start them, which itself can be started with a batch file.
All of them have their own list of passwords and their own output directory (I found that when they use the same location, the file extracted by the script that finds the password becomes unusable).
This way, it takes about 7-8 hours to attempt 100,000 of the most commonly used passwords! While I'm sure there are quicker scripts/programs out there, I'm pretty happy with the result.
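A rough sketch of such a launcher, assuming the brute-force script has been parameterized to accept a password file and an output directory (the -PasswordFile and -OutputDir parameters and all paths below are hypothetical):
# Split the master password list into 4 chunks and start one instance per chunk.
$passwords = Get-Content 'C:\bruteforce\passwords.txt'
$chunkSize = [math]::Ceiling($passwords.Count / 4)

for ($i = 0; $i -lt 4; $i++) {
    $chunkFile = "C:\bruteforce\passwords_$i.txt"
    $passwords | Select-Object -Skip ($i * $chunkSize) -First $chunkSize |
        Set-Content $chunkFile

    # Each instance gets its own list and its own output directory.
    Start-Process powershell -ArgumentList @(
        '-File', 'C:\bruteforce\bruteforce.ps1',
        '-PasswordFile', $chunkFile,
        '-OutputDir', "C:\bruteforce\out_$i"
    )
}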
Thanks all for the input!

Copy / create multiple files

I need to first create and then copy several hundred folders & files via PowerShell (first create them on a local store and then copy them to a remote store).
However, when my foreach loop runs, every 40th or so write attempt fails due to "another process" blocking the file/folder.
I currently work around the issue using a simple sleep between every file creation (100 ms). However, I wonder if there is a better way to do this? Especially when copying multiple files, the right sleep duration would depend on the network latency, so it doesn't seem like a good solution to me.
Is there a way to "wait" till the write operation on a file has completed before starting another operation? Or to check whether a file is still blocked by another process and wait till it's free again?
Have you tried running your code as a job? Example:
foreach ($file in $files) {
    $job = Start-Job -ScriptBlock {
        # operation here..
    } | Wait-Job
    # Log the result of the job, e.g. using '$job | Receive-Job' to get the output
}
You could also extend it to create multiple jobs, and then use Get-Job | Wait-Job to wait for them all to finish before you proceed.
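A minimal sketch of that multi-job variant (the Copy-Item operation and destination path are placeholders for whatever your loop actually does):
# Queue one background job per file, then wait for all of them at once.
$jobs = foreach ($file in $files) {
    Start-Job -ScriptBlock {
        param($path)
        # Placeholder operation - replace with your create/copy logic.
        Copy-Item -Path $path -Destination '\\server\share' -Force
    } -ArgumentList $file.FullName
}

# Block until every job has finished, then collect their output.
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job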