Do Until with Get-SmbOpenFile - powershell

I'm trying to check if any files are opened on a server with powershell.
I've got the below script which kind of works
If I start it when there are no files open, it checks first, waits 10 seconds, and then prints the message "No files are opened". If I open any files and start the script, it says "Files are opened, please wait...", but when I close all files and disconnect all sessions it still says the same.
Clear-Host
$CheckOpenfiles = Get-SmbOpenFile
Do
{
"$(get-date) Files are opened, please wait..."
Start-Sleep 10
} Until (!$CheckOpenfiles)
"No files are opened"

As LotPings notes in the comments, you assign your value to $CheckOpenfiles once, before you start looping. This means it is never reevaluated in your Until condition. Move the call into the condition instead:
Do {
"$(get-date) Files are opened, please wait..."
Start-Sleep 10
} Until (!(Get-SmbOpenFile))
"No files are opened"
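As a further hardening step (not part of the original answer), the polling loop can be given a timeout so it cannot spin forever. `Wait-UntilClear` and its parameter names below are illustrative; the condition is passed in as a scriptblock so the same loop works for `Get-SmbOpenFile` or any other check:

```powershell
# Generic poll-until-clear loop with a timeout (sketch; names are illustrative)
function Wait-UntilClear {
    param(
        [scriptblock]$Check,          # returns items while busy, nothing when clear
        [int]$TimeoutSeconds = 300,
        [int]$PollSeconds    = 10
    )
    $deadline = (Get-Date).AddSeconds($TimeoutSeconds)
    while (& $Check) {
        if ((Get-Date) -gt $deadline) { return $false }   # gave up waiting
        Write-Host "$(Get-Date) Files are opened, please wait..."
        Start-Sleep -Seconds $PollSeconds
    }
    return $true                                          # condition cleared
}

# Usage against SMB (requires the SmbShare module, run on the file server):
# if (Wait-UntilClear -Check { Get-SmbOpenFile } -TimeoutSeconds 600) { "No files are opened" }
```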

Related

Azcopy upload then delete local

I'm working on a PowerShell script to upload files from a local directory to an Azure blob, then remove the local files if the upload succeeds. The blob is just a staging location and the files are moved on from there, so I cannot sync. I think the only way to do this might be to check the status of the copy from the log? I thought the loop below might do what I need, but it just starts uploading all the files again when it cycles back through and never goes on to remove the source. Any help would be greatly appreciated.
while($true){
.\azcopy copy "$Upload\*.iso" (SAS URL) --log-level ERROR
Start-Sleep -Seconds 60 -Verbose
}
Daniel's link got me going in the right direction.
.\azcopy copy "$Upload\*.iso" "SAS" --log-level ERROR
if ($LASTEXITCODE -eq 0) {
    Write-Host "Transfer Successful"
    Write-Host "Remove files after Azure copy is done"
    rm "Source" -Verbose
    Stop-Transcript
}
else {
    Write-Host "Transfer Failed"
    Stop-Transcript
}

Check if last created file exists and create an event log entry with PowerShell

I'm writing files with PowerShell to store specific information in them.
I also want to create Windows event log entries to verify that the newly created file is really there.
New-EventLog -LogName System -Source "Files I store information in"
Write-EventLog -LogName System -Source "Files I store information in" -EntryType Information -EventId 1 -Message "Information written to file script started"
$FilePath = "C:\Path\Files"
command.exe | Out-File $FilePath\${env:computername}_$(get-date -f dd-MM-yyyy-hhmm).file
Basically, I'm searching for a way to verify that the above command.exe created a file. I'm not sure how to do that because I'm using get-date to build the file name.
If the file was created I want to write a success event log entry. If it wasn't created I want to write a failure event log entry.
Does anyone have a hint on this?
A try/catch would work:
try {
    $FilePath = "C:\Path\Files"
    # build the name once so the same value can be tested afterwards
    $OutFile = "$FilePath\${env:computername}_$(get-date -f dd-MM-yyyy-hhmm).file"
    command.exe | Out-File $OutFile
    # write a success log entry here
    # you could also add more checks, like below:
    if (Test-Path $OutFile) {
        # write a success log entry here
    }
    else {
        # write a failure log entry here
    }
}
catch {
    # write a failure log entry here
}

Moving content from one external storage location to a network with email confirmation

This is my first non-very-basic attempt at PowerShell scripting, so please excuse any poor etiquette.
I have a need to transfer approximately 30GB of video data from USB-attached storage to a local network share. As I started this little project, I quickly identified that the checks I perform naturally when doing this task by hand need to be accounted for in the script. So my question is: how do I lay this all out, achieve my end goal of a simple copy, and allow for those checks?
This is what I have thus far:
### The purpose of this script is to automate the offloading of the Driver Cameras during FP1 and FP2 on a Friday.
### Please remember that the cameras still need to be cleared after fireups on a Thursday.
Write-Host -NoNewLine "This script will move the footage captured on the SD cards of the FWD and RWD cameras and copy them to a defined network share" -ForegroundColor Green `n
Start-Sleep -Seconds 10
# Execute the copy from the foward facing camera to the network and remove the local files once complete
Write-Host "FWD Camera copy in progress" -ForegroundColor White -BackgroundColor Magenta `n
Start-Sleep -Seconds 5
Get-ChildItem -Path "SourcePath" -Recurse |
Move-Item -destination "DestinationPath" -Verbose
# Execute the copy from the rearward facing camera to the network and remove the local files once complete
Write-Host "RWD Camera copy in progress" -ForegroundColor White -BackgroundColor Magenta `n
Start-Sleep -Seconds 5
Get-ChildItem -Path "SourcePath" -Recurse |
Move-Item -destination "DestinationPath" -Verbose
Write-Host "Sending email confirmation" -ForegroundColor Green `n
Send-MailMessage -smtpserver ServerIP -To "EmailAddress" -From "EmailAddress" -Subject "Camera offload" -body BodyHere -bodyasHTML
Write-Host "All tasks have completed" -ForegroundColor Green `n
Read-Host "Press any key to exit..."
exit
What I'd like to add is fault tolerance, communicated to the end user dynamically via email. Please find my criteria below:
There's a chance the cable connecting the storage to the machine running the script could become disconnected and only have moved a number of items, can I add something to aid this?
If a file transfer fails, how do I restart and track it? Can I add a loop to confirm all the items have been moved?
How do I reference a fault code to dynamically update the content of the email sent to the end user?
Finally, are there any other common practice references I've missed and that need to be included?
Many thanks in advance for any help.
This topic is a bit broad, but let me try to address your questions to help you get started. Of course I won't give you the whole code, just an explanation of what to use and why.
There's a chance the cable connecting the storage to the machine running the script could become disconnected and only have moved a number of items, can I add something to aid this?
First of all, as vonPryz said in comments, use robocopy!
It should survive network interruptions (e.g. check this post). As a general approach, I'd first make sure that the content was copied successfully before deleting it. For example, you could use something like this:
(Get-ChildItem -Recurse).FullName.Replace("C:\old\path","D:\new\path") | Test-Path
Of course the above only checks whether each file exists, not whether it has the same content. To compare whether the files are identical you could use Get-FileHash.
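As a sketch of that hash comparison (the paths below are placeholders, not from the original post): delete the source only once its SHA256 hash matches the copy's.

```powershell
# Delete a source file only after its SHA256 hash matches the copied one (sketch)
$src = "C:\old\path\video.mp4"   # placeholder paths
$dst = "D:\new\path\video.mp4"

$srcHash = (Get-FileHash -Path $src -Algorithm SHA256).Hash
$dstHash = (Get-FileHash -Path $dst -Algorithm SHA256).Hash

if ($srcHash -eq $dstHash) {
    Remove-Item -Path $src -Verbose   # contents verified identical; safe to delete
} else {
    Write-Warning "Hash mismatch for $src; keeping the source copy"
}
```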
If a file transfer fails how do i restart and track this? Can I add a loop to confirm all the items have been moved?
Partially answered above. Robocopy has this feature built-in. And yes, you can add a loop to check.
How do I reference a fault code to dynamically update the content of the email sent to the end user?
Check the example from here:
robocopy b:\destinationdoesnotexist C:\documents /MIR
if ($lastexitcode -eq 0)
{
write-host "Success"
}
else
{
write-host "Failure with exit code:" $lastexitcode
}
Also, there's an article on the Microsoft site listing all the exit codes, which might be helpful for handling exceptions. All you have to do is add $LASTEXITCODE to the email body.
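Putting those pieces together, a sketch of feeding the exit code into the email (the SMTP details are placeholders; the success threshold reflects robocopy's documented convention that codes below 8 are success variants and 8 or above indicate failures):

```powershell
robocopy "-source-" "-destination-" /MIR
$code = $LASTEXITCODE

# Robocopy exit codes are bit flags: 1 = files copied, 2 = extra files,
# 4 = mismatches; anything >= 8 means at least one copy failed.
$status = if ($code -lt 8) { "Success" } else { "Failure" }
$body   = "Camera offload finished. Robocopy exit code: $code ($status)."

# Send-MailMessage -SmtpServer "-ServerIP-" -To "-EmailAddress-" -From "-EmailAddress-" `
#     -Subject "Camera offload" -Body $body -BodyAsHtml
```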

Powershell: Brute-forcing password-protected .zip file (speeding up the process)

First-time questioner, so here's hoping I'm doing it right. :)
A co-worker and I have been playing around with Powershell, getting the lay of the land and see what you can do with it. Using info we found online (mostly here), we've managed to whip together a script to brute-force a password-protected .zip file using a .txt containing a list of passwords:
# Stopwatch for measurement
$stopWatch = [System.Diagnostics.Stopwatch]::startNew()
$7zipExec = """-7z.exe (7zip) location-"""
$input = """-.zip location-"""
$output = """-where to drop contents of .zip file-"""
$passwordfile = "-location of .txt file containing passwords-"
$windowStyle = "Hidden"
[long] $counter = 0
# Correct password is 12341234
foreach ($password in (get-content $passwordfile)) {
$counter++
Write-Host -NoNewLine "Attempt #$($counter): $password"
$arguments = "x -o$output -p$password -aoa $input"
$p = Start-Process $7zipExec -ArgumentList $arguments -Wait -PassThru -WindowStyle $windowStyle
if ($p.ExitCode -eq 0) {
# Password OK
Write-Host " ...OK!"
Write-Host ""
Write-Host "Password is $password, found it after $counter tries."
break
}
elseif ($p.ExitCode -eq 2) {
# Wrong password
Write-Host " ...wrong"
}
else {
# Unknown
Write-Host " ...ERROR"
}
}
# Halt the stopwatch and display the time spent for this process
$stopWatch.Stop()
Write-Host
Write-Host "Done in $($stopWatch.Elapsed.Hours) hour(s), $($stopWatch.Elapsed.Minutes) minute(s) and $($stopWatch.Elapsed.Seconds) second(s)"
Read-Host -Prompt "Press Enter to exit"
It actually works! Probably not as clean as it could be, but we've managed to reach our goal to make a functioning script.
However! It takes about 1 second for each password try, and if you have a file with, say, the 10,000 most common passwords...that could take a while.
So now we're trying to figure out how to speed up the process, but we've hit a wall and need help. I'm not asking for someone to get 'er done, but I would really appreciate some tips/tricks/hints for someone who has only recently started getting into Powershell (and loving it so far!).
Took a while to get back to this, real life and all that, but while I did not manage to speed up the script, I did manage to speed up the process.
What I do now is run 4 instances of the script simultaneously (using an extra PS script to start them, which itself can be started with a batch file).
All of them have their own lists of passwords, and their own output directory (I found that when they use the same location, the file extracted by the script that found the password becomes unusable).
This way, it takes about 7-8 hours to attempt 100,000 of the most commonly used passwords! While I'm sure there are quicker scripts/programs out there, I'm pretty happy with the result.
Thanks all for the input!
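A sketch of that fan-out (not the poster's actual script): split the password list into one chunk per worker, then hand each chunk to a background job. `Split-List` is an illustrative helper; each `Start-Job` body would wrap the 7-Zip attempt loop from the question, writing to its own output directory.

```powershell
# Deal items round-robin into N chunks, one per parallel worker (sketch)
function Split-List {
    param([object[]]$Items, [int]$Chunks)
    for ($i = 0; $i -lt $Chunks; $i++) {
        $chunk = @(for ($j = $i; $j -lt $Items.Count; $j += $Chunks) { $Items[$j] })
        ,$chunk   # leading comma emits each chunk as a single array
    }
}

# $chunks = @(Split-List -Items (Get-Content $passwordfile) -Chunks 4)
# $jobs = foreach ($chunk in $chunks) {
#     Start-Job -ArgumentList (,$chunk) -ScriptBlock {
#         param($passwords)
#         # ... run the 7z attempt loop from the question over $passwords,
#         #     each job using its own output directory ...
#     }
# }
# $jobs | Wait-Job | Receive-Job
```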

Powershell - Check if file is finished writing

I have PowerShell code that acts as a file listener on a given folder path. The listener kicks off a command-line call to another program that opens and works with the file.
The problem is that the PowerShell code immediately kicks off the command-line call as soon as any file is put into this folder. This is a problem if the file is very large (say 100+ MB), because when a person copies the file into the folder, the file may only be 5% done writing when the command kicks off and tries to open the file (and fails).
Is there a way in PowerShell to check if a file is still being written to? That way I could build a loop that checks every x seconds and only runs once the write has completed.
Does a file maintain a "lock" while it is being written to? Can this be checked for in PowerShell?
Thanks everyone!
There may be a lock check available in System.IO.FileInfo, or somewhere like that, but I use a simple length check. It goes in the called script, not the file-watcher script.
$LastLength = 1
$NewLength = (Get-Item $FileName).length
while ($NewLength -ne $LastLength) {
$LastLength = $NewLength
Start-Sleep -Seconds 60
$NewLength = (Get-Item $FileName).length
}
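The lock check mentioned above can be sketched via System.IO.File: try to open the file exclusively, and treat a sharing violation as "still being written". `Test-FileLocked` is an illustrative name, not a built-in cmdlet:

```powershell
# Returns $true while another process still holds the file open (sketch)
function Test-FileLocked {
    param([string]$Path)
    if (-not (Test-Path $Path)) { return $false }   # nothing there to be locked
    try {
        # Ask for an exclusive handle; this fails while a writer holds the file
        $fs = [System.IO.File]::Open($Path, 'Open', 'ReadWrite', 'None')
        $fs.Close()
        return $false
    } catch [System.IO.IOException] {
        return $true    # sharing violation: the file is still in use
    }
}

# while (Test-FileLocked $FileName) { Start-Sleep -Seconds 10 }
```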
To answer your main question, which is how to check the status of a file download: the easiest way is to check the last-modified time of the file and see whether it has exceeded 5 minutes (just to be on the safe side for network latency etc.).
I had multiple files, so the code below works at the folder level, but you could simply change the path for a single file too.
#print user feedback to see if downloading is completed
write-host "started downloading"
#how long the folder must go untouched before we consider the download done
$timespan = New-TimeSpan -Minutes 5
$folder = "-watched folder path-"
function status_checker() {
    #re-read the last write time on every check
    $lastWrite = (Get-Item $folder).LastWriteTime
    if (((get-date) - $lastWrite) -gt $timespan) {
        write-host "Downloading completed"
        break
    } else {
        write-host "still downloading" (Get-Date)
    }
}
#check every 5 minutes
while(1)
{
status_checker
# 5 minutes
start-sleep -seconds 300
}