Is there any way to wait for the copy process to finish before running another command?
I tried Start-Job and Wait-Job, but it doesn't work.
$func = {
    function Move-ToZip
    {
        param([string]$filedest)
        $shell = New-Object -com Shell.Application
        $b = $shell.NameSpace($zippath.ToString())   # note: $zippath is not defined inside the job
        $b.CopyHere($filedest.ToString())
        #Remove-Item -Path $filedest
    }
}
Start-Job -InitializationScript $func -ScriptBlock { Move-ToZip $args[0] } -ArgumentList $file
The easiest way to wait for a job to complete is to give it a name and tell Wait-Job to wait on the job with that name. The script below will wait for the job named WaitForMe to complete and only then run the rest of your code.
Using the -Name parameter with your code:
$func =
{
    function Move-ToZip
    {
        Param([string[]]$path, [string]$zipfile)
        if (-not $zipfile.EndsWith('.zip')) { $zipfile += '.zip' }
        # Create an empty zip (just the end-of-central-directory header) if it doesn't exist yet
        if (-not (Test-Path $zipfile))
        {
            Set-Content $zipfile ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        }
        $shell = (New-Object -com shell.application).NameSpace($zipfile)
        foreach ($file in $path)
        {
            $shell.CopyHere($file)
            Start-Sleep -Milliseconds 100
        }
    }
}
Start-Job -Name "WaitForMe" -InitializationScript $func -ScriptBlock {Move-ToZip -path $args[0] -zipfile $args[1]} -ArgumentList "D:\data.log", "D:\datazip.zip"
Write-Host "Waiting for job to complete"
Wait-Job -Name "WaitForMe"
Write-Host "Job has completed :D"
To zip one file or folder
-ArgumentList "D:\testfile.log", "D:\datazip.zip"
To zip multiple files or folders
-ArgumentList @("D:\testfile.log","D:\testFolder1"), "D:\testzip.zip"
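If you also need any output the job produced once it has finished, collect it with Receive-Job and then clean up (an optional addition to the code above):
Receive-Job -Name "WaitForMe"   # print whatever the job wrote to its output stream
Remove-Job -Name "WaitForMe"    # remove the completed job from the session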
EDIT 17/12/2015
I've adapted code from this MSDN blog for the Move-ToZip function, as the previous code didn't work for me at all. I've tested the above code successfully on files and folders. I have not tested the performance of this method; if you wish to compress/zip multiple large files/folders, I would highly suggest looking into a known working library or a third-party utility like 7-Zip.
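As an aside, if you are on PowerShell 5.0 or later, the built-in Compress-Archive cmdlet avoids the Shell.Application COM object entirely and blocks until the archive is written, so no job is needed at all. A minimal sketch using the example paths from above:
# Sketch, PowerShell 5.0+: Compress-Archive is synchronous.
# -Update adds files to an existing archive instead of failing.
Compress-Archive -Path "D:\data.log" -DestinationPath "D:\datazip.zip" -Update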
Related
I am trying to delete every file in a SharePoint list. My org has retention turned on, so I can't just delete the entire list; I must remove every folder/file. My issue is with the connection itself when used with Start-Job.
It's painfully slow, so I wanted to spin up batches of 10+ jobs to delete files simultaneously and reuse the connection, but passing the connection as an argument is a problem because it becomes deserialized. I found this post with the exact same issue and no solution.
If I work around it by connecting inside each Start-Job, I get throttled by SharePoint Online.
function Empty-PnPFiles($SPSiteUrl, $RelativeURL)
{
    $connection = Connect-PnPOnline -Url $SPSiteUrl -UseWebLogin -ReturnConnection
    # Get all files in the folder
    $Files = Get-PnPFolderItem -FolderSiteRelativeUrl $RelativeURL -ItemType File
    # Delete all files in the folder
    $n = 0
    $Jobs = @()
    ForEach ($File in $Files)
    {
        $n++
        Write-Host "Creating job to delete '$($File.ServerRelativeURL)'"
        # Delete file
        $Jobs += Start-Job -ArgumentList @($connection, $File) -ScriptBlock {
            $LocalConnection = $args[0]
            # $LocalConnection = Connect-PnPOnline -Url <My SP URL> -UseWebLogin -ReturnConnection
            $LocalFile = $args[1]
            Remove-PnPFile -ServerRelativeUrl $LocalFile.ServerRelativeURL -Connection $LocalConnection -Force -Recycle
        }
        # Do in batches: every 10 jobs, wait for completion
        if ($n % 10 -eq 0)
        {
            Write-Host "Waiting for batch $n ($($Files.Count)) to complete before next batch" -ForegroundColor Green
            $Jobs | Wait-Job | Receive-Job
            $Jobs = @()
        }
    }
    # If there are leftover jobs, wait for them
    if ($Jobs)
    {
        $Jobs | Wait-Job | Receive-Job
    }
}
$SiteURL = "https://<MySiteCollection>.sharepoint.com/sites/<MySite>"
$ListName = "TempDelete"
Empty-PnPFiles -SPSiteUrl $SiteURL -RelativeURL "/TempDelete" # <My folder to delete all files from>
The error I get is:
Cannot bind parameter 'Connection'. Cannot convert the "PnP.PowerShell.Commands.Base.PnPConnection" value of type "Deserialized.PnP.PowerShell.Commands.Base.PnPConnection" to type "PnP.PowerShell.Commands.Base.PnPConnection".
How can I pass the connection to the script block without the serialization error? Or is there a better way to bulk-delete files from SPO using PowerShell? I have to use PowerShell because it's the only tool available to me currently.
Use Invoke-Command instead of Start-Job. Run locally (without -ComputerName), Invoke-Command executes the script block in the current session, so the connection object never crosses the serialization boundary that Start-Job introduces.
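A minimal sketch of that approach, reusing the variables from the question (note this runs the deletions sequentially):
foreach ($File in $Files) {
    Invoke-Command -ScriptBlock {
        param($Conn, $F)
        # $Conn stays a live PnPConnection here, not a deserialized copy
        Remove-PnPFile -ServerRelativeUrl $F.ServerRelativeURL -Connection $Conn -Force -Recycle
    } -ArgumentList $connection, $File
}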
I'm running an exe from a PowerShell script. This executable writes its logs to a log file. I would like to continuously read and forward the logs from this file to the console while the executable is running.
Currently, I'm starting the exe like this:
$UNITY_JOB = Start-Job `
    -ScriptBlock { & "C:\Program Files\Unity\Hub\Editor\2019.2.11f1\Editor\Unity.exe" $args | Out-Null } `
    -ArgumentList $UNITY_ARGS
If I just do Get-Content $LOG_PATH -Wait at this point, I cannot detect when the exe terminates and the script blocks indefinitely.
If I start a second job for the logs, the output is not sent to the console:
$LOG_JOB = Start-Job -ScriptBlock { Get-Content $using:LOG_PATH -Wait }
(I need "real time" output, so I don't think Receive-Job would work)
I'd use a loop which ends when the job's status is Completed:
# Just to mock the execution
$extProgram = Start-Job -ScriptBlock { Start-Sleep -Seconds 30 }
$file = 'C:\path\to\file.txt'
do {
    Clear-Host
    # Show as many of the newest lines as fit in the console window
    Get-Content $file -Tail $host.UI.RawUI.WindowSize.Height
    Start-Sleep -Seconds 5 # Set any interval you need
} until ((Get-Job -Id $extProgram.Id).State -eq "Completed")
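Once the loop exits, you may also want to flush anything the job itself wrote and show the final state of the log; a small optional addition:
Receive-Job -Job $extProgram   # print any output the job produced
Remove-Job -Job $extProgram    # clean up the finished job
Get-Content $file -Tail $host.UI.RawUI.WindowSize.Height   # final snapshot of the log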
I'm trying to get one master PowerShell script to run all of the others while waiting 30-60 seconds to ensure each task has completed. Everything else I tried wouldn't stop and wait for the first script and its processes to complete; it launched all the other scripts at the same time and triggered the restart automatically.
Main script, run as admin:
$LogStart = 'Log '
$LogDate = Get-Date -Format "dd-MM-yyy-hh-mm-ss"
$FileName = $LogStart + $LogDate + '.txt.'
$scriptList = @(
    'C:\Scripts\1-OneDriveUninstall.ps1'
    'C:\Scripts\2-ComputerRename.ps1'
)
Start-Transcript -Path "C:\Scripts\$FileName"
foreach ($script in $scriptList) {
    Start-Process -FilePath "$PSHOME\powershell.exe" -ArgumentList "-Command '& $script'"
    Write-Output "The $script is running."
    Start-Sleep -Seconds 30
}
Write-Output "Scripts have completed. Computer will restart in 10 seconds."
Start-Sleep -Seconds 10
Stop-Transcript
C:\Scripts\3-Restart.ps1
1-OneDriveUninstall.ps1:
Set-ItemProperty -Path REGISTRY::HKEY_LOCAL_MACHINE\Software\Microsoft\windows\CurrentVersion\Policies\System -Name ConsentPromptBehaviorAdmin -Value 0
taskkill /f /im OneDrive.exe
C:\Windows\SysWOW64\OneDriveSetup.exe /uninstall
2-ComputerRename.ps1:
$computername = Get-Content env:computername
$servicetag = Get-WmiObject Win32_Bios |
Select-Object -ExpandProperty SerialNumber
if ($computername -ne $servicetag) {
Write-Host "Renaming computer to $servicetag..."
Rename-Computer -NewName $servicetag
} else {
Write-Host "Computer name is already set to service tag."
}
The log file shows:
Transcript started, output file is C:\Scripts\Log 13-09-2019-04-28-47.txt.
The C:\Scripts\1-OneDriveUninstall.ps1 is running.
The C:\Scripts\2-ComputerRename.ps1 is running.
Scripts have completed. Computer will restart in 10 seconds.
Windows PowerShell transcript end
End time: 20190913162957
They aren't running correctly at all though. They run fine individually but not when put into one master script.
PowerShell can run PowerShell scripts from other PowerShell scripts directly. The only time you need Start-Process for that is when you want to run the called script with elevated privileges (which isn't necessary here, since your parent script is already running elevated).
This should suffice:
foreach ($script in $scriptList) {
    & $script
}
The above code will run the scripts sequentially (i.e. start the next script only after the previous one terminated). If you want to run the scripts in parallel, the canonical way is to use background jobs:
$jobs = foreach ($script in $scriptList) {
    Start-Job -ScriptBlock { & $using:script }
}
$jobs | Wait-Job | Receive-Job
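On PowerShell 7+, ForEach-Object -Parallel is a lighter-weight alternative to background jobs; a sketch, assuming the scripts are safe to run concurrently:
# Sketch, PowerShell 7+ only: runs up to 4 scripts at a time in parallel runspaces.
$scriptList | ForEach-Object -Parallel { & $_ } -ThrottleLimit 4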
I have the following PowerShell code that tests if a file exists within a script block:
$scriptblock =
{
    Param($filename)
    return "Scriptblock filename $filename Exists? -> $(Test-Path $filename)"
}
$myFilename = "MyFile.xml"
Write-Host "Main filename $myFilename Exists? -> $(Test-Path $myFilename)"
$job = Start-Job -Name "myJob" -ScriptBlock $scriptBlock -ArgumentList $myFilename
$result = Receive-Job -Name "myJob"
Write-Host $result
When I run it, I get the following output, which indicates the file exists in the main execution but not in the script block.
Main filename MyFile.xml Exists? -> True
Scriptblock filename MyFile.xml Exists? -> False
Can someone please indicate what is needed to test for file existence in a script block?
Thanks!
As a best practice, you should include the full path to the file you want tested rather than relying on the current directory, which can vary if you run the script under a different user context; in Windows PowerShell, a background job does not start in the caller's current directory.
$scriptblock = {
    param($filename)
    "Scriptblock filename $filename Exists? -> $(Test-Path $filename)"
}
$myFilename = "C:\Temp\MyFile.xml"
"Main filename $myFilename Exists? -> $(Test-Path $myFilename)"
Start-Job -Name "myJob" -ScriptBlock $scriptblock -ArgumentList $myFilename
Wait-Job -Name "myJob" | Receive-Job   # wait for completion first, or the output may not be ready
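If the job really must resolve a relative path against the caller's directory, one option is to change location inside the job explicitly. A sketch (the $using: scope requires PowerShell 3.0+):
$job = Start-Job -ScriptBlock {
    param($filename)
    Set-Location $using:PWD   # jobs do not start in the caller's directory, so move there first
    "Job sees $filename -> $(Test-Path $filename)"
} -ArgumentList "MyFile.xml"
$job | Wait-Job | Receive-Job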
I am using the following code to build some projects using PowerShell, but it is taking nearly 30 minutes, and sometimes more than that. Here is the PowerShell script; when I build the solutions locally they finish much faster.
$sourceDirectory = "D:\Service"
$SolutionToBuild = @("Solution1.sln","Solution2.sln","Solution3.sln","Solution4.sln")
$projectFiles = Get-ChildItem -Path "$sourceDirectory" -Filter *.sln -Recurse
foreach ($solution in $SolutionToBuild)
{
    foreach ($projectFile in $projectFiles)
    {
        if ($projectFile.Name -eq $solution)
        {
            Write-Host $projectFile.Name
            $SlnFilePath = $projectFile.FullName
            $BuildParameters = """$SlnFilePath"" /Build Release|x86"
            $CleanParameters = """$SlnFilePath"" /Clean Release|x86"
            # $vsPath must point to devenv.exe
            Start-Process -FilePath $vsPath -ArgumentList $CleanParameters -Wait
            Start-Process -FilePath $vsPath -ArgumentList $BuildParameters -Wait
            break
        }
    }
}
So can someone let me know why this is taking so much time?
A minor change to the code may improve the overall performance. Instead of using Start-Process, you can start the process via System.Diagnostics.Process directly:
# Assumption: $buildArgs holds the devenv command line, e.g. "C:\path\Solution1.sln /Build Release|x86"
$buildProcInfo = New-Object -TypeName System.Diagnostics.ProcessStartInfo -ArgumentList "devenv.exe", $buildArgs
try { $buildProcess = [System.Diagnostics.Process]::Start($buildProcInfo) }
catch {
    Write-Host "Error starting devenv.exe: $_"
    # exit?
}
$buildProcess.WaitForExit()
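If build time is the main concern, one alternative worth trying (my suggestion, not part of the original answer) is to drive MSBuild instead of the Visual Studio IDE, since its /m switch builds projects within a solution in parallel. A sketch, assuming msbuild.exe is on the PATH (for example in a Visual Studio Developer prompt):
# Sketch: build every solution found under the source directory with parallel project builds.
foreach ($projectFile in (Get-ChildItem -Path $sourceDirectory -Filter *.sln -Recurse)) {
    & msbuild.exe $projectFile.FullName /t:Rebuild /p:Configuration=Release /p:Platform=x86 /m
}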