Copy / create multiple files - PowerShell

I need to first create and then copy several hundred folders and files via PowerShell (create them on a local store first, then copy them to a remote store).
However, when my foreach loop runs, roughly every 40th write attempt fails because "another process" is blocking the file/folder.
I currently work around the issue with a simple sleep between every file creation (100 ms). However, I wonder whether there is a better way to do this. Especially when copying multiple files, the sleep would depend on the network latency, which doesn't seem like a good solution to me.
Is there a way to "wait" until the write operation of a file has completed before starting another operation? Or to check whether a file is still blocked by a process and wait until it is free again?

Have you tried running your code as a job? Example:
foreach ($file in $files) {
    $job = Start-Job -ScriptBlock {
        # operation here...
    } | Wait-Job
    # Log the result of the job, e.g. use '$job | Receive-Job' to get its output
}
You could also extend it to create multiple jobs, and then use Get-Job | Wait-Job to wait for them all to finish before you proceed.
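A minimal sketch of that multi-job variant (the $files collection of file paths, the $target path and the Copy-Item operation are placeholder assumptions, not from the original question):
# Start one job per file without waiting in between
$jobs = foreach ($file in $files) {
    Start-Job -ScriptBlock {
        # operation here, e.g. copy one file to the remote store
        Copy-Item -Path $using:file -Destination $using:target
    }
}
# Wait for all of them to finish, then collect their output
$jobs | Wait-Job | Receive-Job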

Related

SFTP upload of files that were created since the last run

I am very new to PowerShell and I am writing a script that performs an SFTP file transfer via WinSCP. I will then create a task in Windows Task Scheduler to run this script every 15 minutes indefinitely. Currently I have this line of code, which gets all files in a directory whose last write time is more than 20 seconds old:
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddSeconds(-20) }
I have been told that this needs to be changed so that it gets all files created since the last time the task ran (15 minutes prior) instead, but I have had very little luck finding the answer.
I have tried using Get-ScheduledTask but that only seems to get me basic information about the task and doesn't seem like it is what I need for my script. Also, I have already downloaded the WinSCP .dll file and unblocked it in PowerShell. Any help is welcome, TIA.
Using the time the task last ran is, in my opinion, not reliable. There is still room to miss some files or to transfer some files repeatedly.
Instead, consider remembering the timestamp of the most recently uploaded file.
Assuming you use Session.PutFiles, you can use code like this:
$transferResult =
    $session.PutFiles($sourcePath, $destPath, $False, $transferOptions)
$transferResult.Check()

# Find the latest uploaded file
$latestTransfer =
    $transferResult.Transfers |
    Sort-Object -Property @{ Expression = { (Get-Item $_.Source).LastWriteTime } } -Descending |
    Select-Object -First 1
Then save the timestamp of $latestTransfer to a file for the next run. Or loop the code with a 15-minute delay instead of scheduling it every 15 minutes.
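A minimal sketch of persisting that timestamp (the lastupload.txt path is an assumed example; the .Source property follows the code above):
# Remember the LastWriteTime of the newest uploaded file
$lastTimestamp = (Get-Item $latestTransfer.Source).LastWriteTime
Set-Content -Path 'C:\Users\lsarm\IHS\lastupload.txt' -Value $lastTimestamp.ToString('o')

# On the next run, only pick up files newer than that timestamp
$lastTimestamp = [datetime]::Parse((Get-Content 'C:\Users\lsarm\IHS\lastupload.txt'))
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $_.LastWriteTime -gt $lastTimestamp }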
Another option is to remember the already transferred files.
Both options are covered in more detail in:
How do I transfer new/modified files only?

How can I speed up a PowerShell foreach loop

I have a PowerShell script that connects to a database and pulls a list of user data. I take this data and use a foreach loop to run a script for each entry.
This works, but it is slow: the result set can contain 1000+ entries, and the loop has to complete Script.bat for user A before it can start user B. Script.bat for a single user is independent of the others and takes ~30 s per user.
Is there a way to speed this up at all? I've been playing with -Parallel, ForEach-Object and workflow but I can't get it to work, likely due to me being a noob in PS.
foreach ($row in $Dataset.tables[0].rows)
{
    $UserID = $row.value
    $DeviceID = $row.value1
    $EmailAddress = $row.email_address
    cmd.exe /c "`"$PSScriptRoot`"\bin\Script.bat -c `" -Switch $UserID`" >> `"$PSScriptRoot`"\${FileName3}_REST_${DateTime}.txt 2> nul";
}
You said it yourself, your bottleneck is with the batch file in your script, not the loop itself. foreach (as opposed to ForEach-Object) is already the faster foreach loop mechanism in PowerShell. Investigate your batch file to find out why it takes 30 seconds to complete, and optimize it where you can.
Using Jobs
Note: Start-Job will run the job under another process. If you have PowerShell Core you can make use of the Start-ThreadJob cmdlet in lieu of Start-Job. This will start your job as part of another thread of the same process instead of starting another process.
If you can't optimize your batch script enough to meet your needs, then you can consider using Start-Job to kick off each execution asynchronously, and then check the result and get any output from it using Receive-Job. For example:
# Master list of jobs you need to check the result of later
$jobs = New-Object System.Collections.Generic.List[System.Management.Automation.Job]

# Run your script for each row
foreach ($row in $Dataset.tables[0].rows)
{
    $UserID = $row.value
    $DeviceID = $row.value1
    $EmailAddress = $row.email_address

    # Use Start-Job here to kick off the script and store the job information
    # for later retrieval.
    # The $using: scope modifier allows you to make use of variables that were
    # defined in the session calling Start-Job
    $job = Start-Job -ScriptBlock { cmd.exe /c "`"${using:PSScriptRoot}`"\bin\Script.bat -c `" -Switch ${using:UserID}`" >> `"${using:PSScriptRoot}`"\${using:FileName3}_REST_${using:DateTime}.txt 2> nul"; }

    # Add the job to the $jobs list to check the result of later.
    # Casting to void here prevents the Add method from returning the object
    # we've added.
    [void]$jobs.Add($job)
}
# Wait for the jobs to be done
Write-Host 'Waiting for all jobs to complete...'
while ( $jobs | Where-Object { $_.State -eq 'Running' } ) {
    Start-Sleep -s 10
}

# Retrieve the output of the jobs
foreach ( $j in $jobs ) {
    Receive-Job $j
}
Note: Since you need to execute this script ~1000 times, you may want to write your logic so that only a certain number of jobs run at a time; one possible throttling sketch is shown below. My example above starts all of the necessary jobs without limiting how many execute at once.
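A minimal throttling sketch, assuming the same $jobs list and Start-Job call as above (the $maxJobs limit of 10 is an arbitrary assumption):
$maxJobs = 10
foreach ($row in $Dataset.tables[0].rows)
{
    # Block until fewer than $maxJobs jobs are still running
    while (@(Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Seconds 2
    }
    $UserID = $row.value
    $job = Start-Job -ScriptBlock { cmd.exe /c "`"${using:PSScriptRoot}`"\bin\Script.bat -c `" -Switch ${using:UserID}`"" }
    [void]$jobs.Add($job)
}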
For more information about jobs and the properties you can inspect on a running/completed job, check the links below:
About Jobs
Job Class
Using Scope*
* The documentation states that the using scope can only be declared when working with remote sessions, but this seems to work fine with Start-Job even if the job is local.

Send mail based on time that previous mail was sent

I have a script that checks if a site is online and sends a mail if it's down.
The script is configured with a scheduled task that runs every 30 minutes.
The problem is the following:
If the site is down during the weekend or in the evening (or on a day when I'm not monitoring the mailbox), the mails keep being sent.
I was wondering what method I could use to only send a mail if the previous mail was sent more than 3 hours earlier.
That way, a mail is sent at most once every 3 hours.
I have researched the use of registry keys but was wondering if this would be the correct approach.
Rather than the registry, I'd use a simple configuration file stored in ProgramData (or AppData, should you need a per-user configuration).
This makes the process of loading/saving parameters and adding new ones very easy.
Also, should you need to save logs and/or other data, you can just put them inside that same folder.
$ConfigFullPath = "$env:APPDATA\My Monitoring solution\config.json"

# This creates the config file if none is present.
if (-not (Test-Path $ConfigFullPath)) {
    New-Item -ItemType File -Path $ConfigFullPath -Value ([PSCustomObject]@{ 'LastEmailSent' = [datetime]::MinValue } | ConvertTo-Json) -Force
}

$ConfigFileParams = ConvertFrom-Json -InputObject (Get-Content $ConfigFullPath -Raw)
$SendEmail = ([datetime]::UtcNow - [datetime]$ConfigFileParams.LastEmailSent).TotalHours -ge 3

if ($SendEmail) {
    try {
        # Send-MailMessage -ErrorAction Stop
        # Once the email is sent, we update the config
        $ConfigFileParams.LastEmailSent = [datetime]::UtcNow
        $ConfigFileParams | ConvertTo-Json | Out-File $ConfigFullPath
    }
    catch {
        # Manage what to do in case of failure
    }
}
That being said, you can definitely use the registry to do the same; a sketch follows below.
For convenience and ease of use, though, I strongly suggest the simpler JSON-file-based approach.
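If you do prefer the registry, a minimal sketch could look like this (the key path under HKCU: is an assumed example, not a required location):
$regPath = 'HKCU:\Software\MyMonitoringSolution'

# Create the key and the value on first run
if (-not (Test-Path $regPath)) {
    New-Item -Path $regPath -Force | Out-Null
    New-ItemProperty -Path $regPath -Name 'LastEmailSent' -Value ([datetime]::MinValue.ToString('o')) | Out-Null
}

$lastSent = [datetime]::Parse((Get-ItemProperty -Path $regPath).LastEmailSent)
if (([datetime]::UtcNow - $lastSent).TotalHours -ge 3) {
    # Send-MailMessage ... then record the new timestamp
    Set-ItemProperty -Path $regPath -Name 'LastEmailSent' -Value ([datetime]::UtcNow.ToString('o'))
}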
I think the best option would be to write the starting hour to a file saved on disk and, every time you run the script, test whether
(currentHour - hourFirstSent) % 3 == 0 && currentMinute < 30
Name the file yyyy-mm-dd; if that file exists, you read the starting hour from it, if not, you create it and save the starting hour in it (see the sketch below).
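A rough sketch of that idea (the C:\Temp folder is an assumption, and the pseudocode condition above is translated into PowerShell syntax):
# One state file per day, named yyyy-MM-dd
$stateFile = Join-Path 'C:\Temp' ((Get-Date).ToString('yyyy-MM-dd') + '.txt')

if (Test-Path $stateFile) {
    $hourFirstSent = [int](Get-Content $stateFile)
} else {
    $hourFirstSent = (Get-Date).Hour
    Set-Content -Path $stateFile -Value $hourFirstSent
}

$now = Get-Date
# Only send during the first half hour of every third hour after the first mail
if ((($now.Hour - $hourFirstSent) % 3 -eq 0) -and ($now.Minute -lt 30)) {
    # Send-MailMessage ...
}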

How to resume PowerShell script after system reboot?

I am using the PowerShell script below to delete SharePoint alerts.
foreach ($site in Get-SPSite -Limit All)
{
    "Site Collection $site"
    foreach ($web in $site.allwebs)
    {
        " Web $web"
        $c = $web.alerts.count
        " Deleting $c alerts"
        for ($i = $c - 1; $i -ge 0; $i--) { $web.alerts.delete($i) }
    }
}
There are around a million alerts in each of the Dev, Test and UAT environments. It takes many hours to delete all the alerts in one go, and since the servers are restarted automatically on a regular schedule, the script never runs to completion.
I am aware that to resume PowerShell scripts after a reboot we can use PowerShell Workflow with Checkpoint-Workflow, but I am not sure where to place the checkpoints and PSPersist.
I need help resuming the deletion of alerts in the above script after a system reboot.
Update: After trying to implement it, I realized that SharePoint PowerShell cmdlets cannot be coupled with PowerShell Workflow. It doesn't allow
Add-PSSnapin "Microsoft.SharePoint.PowerShell"
to be added to workflows:
Workflow SPAlerts
{
    # The tweaks below didn't work
    InlineScript { Add-PSSnapin "Microsoft.SharePoint.PowerShell" }
    Invoke-Command -ScriptBlock { Add-PSSnapin "Microsoft.SharePoint.PowerShell" }
    Invoke-Expression "......"
}
The MSDN documentation states:
You can place checkpoints anywhere in a workflow, including before and after each command or expression, or after each activity in a workflow. Windows PowerShell Workflow allows you to do so.
... Add a checkpoint after each significant part of the workflow completes; a part that you do not want to repeat if the workflow is interrupted.
... When the workflow definition is complete, use the testing process to refine your checkpoints. Add and remove (comment out) checkpoints to achieve both a reasonable running time and a reasonable recovery time.
Looking at your code, I would place the checkpoint right after "Site Collection $site" if deleting the alerts for a given site collection takes a reasonable time on average (a structural sketch is shown below). If there are just a few sites, each containing tons of alerts, then I would place it at the start of the inner foreach instead.
I would definitely not place it inside the worker loop that deletes the individual alerts.
I would also suggest you look at the foreach -Parallel capability of workflows to parallelize the deletion. That way even the last site/web should get its turn, even if the server is restarted often.
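A structural sketch of that checkpoint placement (Remove-SPAlertsWithCheckpoints and the $SiteUrls parameter are made-up names, and the SharePoint-specific deletion is left as a placeholder because of the snap-in limitation noted in the update):
Workflow Remove-SPAlertsWithCheckpoints
{
    param([string[]]$SiteUrls)

    foreach ($siteUrl in $SiteUrls)
    {
        "Site Collection $siteUrl"
        # ... delete the alerts of this site collection here ...

        # Persist progress once a whole site collection is done,
        # so that after a reboot the workflow resumes with the next one
        Checkpoint-Workflow
    }
}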

Starting an exe file with parameters on a remote PC

We have a program running on about 400 PCs (all Windows 7). This program is called Wisa.
We receive regular updates for this program, named something like wisa_update1.0.exe, wisa_update1.1.exe, wisa_update2.0.exe, etc. The users cannot do the update themselves due to account restrictions.
We do the update once and distribute it to all PCs with Copy-Item. Then with Enter-PSSession I can go to each PC and run the update with the following command:
wisa_update3.0 /verysilent
(with the argument /verysilent no questions are asked)
This is already a major time saving, but I want to make the update more automatic.
I have a file "pc.txt" with all 400 PCs in it. I already use this file for the Copy-Item via Get-Content. Now I want to use it to run the updates with the above command, but I can't find a good way to run a remote executable with a parameter in PowerShell.
What you want to do is load the computer list with Get-Content -Path $PClist and then run your script actions in a foreach. You'll want to adapt this example to your own script:
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    # code actions to perform
}
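For the actual update step, a hedged sketch of what could go inside that loop (assuming PowerShell Remoting is enabled on the targets and the installer has already been copied to C:\Temp on each PC; that path is an assumption):
foreach ($Computer in $aComputers)
{
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        # Run the already-copied installer silently on the remote PC
        Start-Process -FilePath 'C:\Temp\wisa_update3.0.exe' -ArgumentList '/verysilent' -Wait
    }
}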
Also, you can use multithreading and get it done in a fraction of the time (provided you have a good machine). The link below explains how to do it well:
http://www.get-blog.com/?p=22