PowerShell: pass arguments to Robocopy nested in a Start-Job

After a massive amount of headaches, I was able to get this almost functioning.
Problem: in the error/output, Robocopy appears to be treating $args[4] (ref: $sourcePath) as every single IP in the range instead of just one object.
I'm assuming the rest of the syntax is correct, because if I switch $ip = 101..119 | foreach { "192.168.1.$_" } to $ip = "192.168.1.101" everything works correctly.
Robocopy dumps to the console a -source containing all of the IP addresses in the range from $ip. What am I doing wrong here?
#####################################################
#Purpose: to ping an IP range of network locations in a loop until successful. When successful, copy new files from source onto storage.
#Each ping and transfer needs to be run individually and simultaneously due to time constraints.
#####################################################
#define the IP range, source path & credentials, and storage path
$ip = 101..119 | foreach { "192.168.1.$_" }
#$ip = "192.168.1.101" #everything works if I comment above and uncomment here
$source = "\\$ip"
$sourcePath = "$source\admin\"
$dest = "C:\Storage\"
$destFull = "$dest$ip\"
$username = "user"
$password = "password"
#This is how to test the connection. Once it returns TRUE, copy only new files from source to destination.
#copy all subdirectories & files in restartable mode
foreach ($src in $ip){
    Start-Job -ScriptBlock {
        DO {$ping = Test-Connection $args[0] -BufferSize 16 -Count 4 -quiet}
        until ($ping)
        net use \\$args[1] $args[2] /USER:$args[3]
        robocopy $args[4] $args[5] /E /Z
    } -ArgumentList $src, $source, $password, $username, $sourcePath, $destFull -Name "$src" #pass arguments to Start-Job's scriptblock
}
#get all jobs in session, suppress the command prompt until all jobs are complete. Then once complete, get the results.
Get-Job | Wait-Job
Get-Job | Receive-Job

At the point you create $source, $ip is an array, so $source ends up as a very long string of all the items concatenated:
\\192.168.1.101 192.168.1.102 192.168.1.103 192.168.1.104 ...
You can see this for yourself by running just these two lines, then examining the contents of $source:
$ip = 101..119 | foreach { "192.168.1.$_" }
$source = "\\$ip"
This has a knock-on effect on $sourcePath, which is used as $args[4] in your call to Robocopy. You should build your paths inside your foreach loop, where you have access to each IP address ($src) from the $ip collection.
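A minimal sketch of that fix, keeping the question's variable names and share layout (untested against the asker's environment):
$ip = 101..119 | foreach { "192.168.1.$_" }
$dest = "C:\Storage\"
$username = "user"
$password = "password"
foreach ($src in $ip){
    # build the paths per-IP, inside the loop, so each job sees a single address
    $source = "\\$src"
    $sourcePath = "$source\admin\"
    $destFull = "$dest$src\"
    Start-Job -ScriptBlock {
        DO {$ping = Test-Connection $args[0] -BufferSize 16 -Count 4 -quiet}
        until ($ping)
        net use $args[1] $args[2] /USER:$args[3]
        robocopy $args[4] $args[5] /E /Z
    } -ArgumentList $src, $source, $password, $username, $sourcePath, $destFull -Name "$src"
}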

Some sources/etc. are different, but that's just due to my test environment. I decided to use [io.path]::Combine for the paths, since I was running into problems with $args when defining variables.
Thank you boxdog for the above help. I was completely overlooking that fact.
$ScriptBlock = {
    $source = [io.path]::Combine('\\', $args[0])
    $sourcePath = [io.path]::Combine('\\', $args[0], 'c$', 'admin\')
    $dest = "C:\Storage\"
    $destFull = [io.path]::Combine($dest, $args[0])
    DO {$ping = Test-Connection $args[0] -BufferSize 16 -Count 1 -quiet}
    until ($ping)
    net use $source password /USER:user
    robocopy $sourcePath $destFull /E /Z
}
$ip = 101..119 | foreach { "192.168.1.$_" }
foreach ($dvr in $ip){
    Start-Job $ScriptBlock -ArgumentList $dvr
}
Get-Job | Wait-Job
Get-Job | Receive-Job

Related

Multithreading a PowerShell script and challenges

I have developed a script which does a lot of processing for a front-end tool, and now I am attempting to have the script run with multiple threads. It interacts a lot with SQL databases; this should not be a problem for multithreading, as the database transactions are very short-lived and the queries well optimised.
What is the issue?
#.\tester.ps1 -servers (1,'Server1',3,1),(2,'Server2',3,1) -output_folder 'C:\temp'
param ([array[]]$servers, $output_folder)
for ($i = 0; $i -lt $servers.Count; $i++)
{
    $myserverid = $servers[$i][0]
    $myservername = $servers[$i][1]
    $mylocationid = $servers[$i][2]
    $myappid = $servers[$i][3]
    write-output " $myserverid and $myservername and $mylocationid and $myappid"
    invoke-sqlcmd -ServerInstance "$myservername" -query "select top 10 name from sysobjects" -Database "master"
}
The script above gets passed an array of servers, and currently it loops through the array one by one. A way to make the process faster is to run the script in parallel / with multiple threads.
Research
I have looked at a TechNet script at https://gallery.technet.microsoft.com/scriptcenter/Run-a-PowerShell-script-991c8a42
It's not quite the same, as my array is not just a list of servers; there will be other parameters sent with it.
What am I after
A way, or a pointer, to make the script able to run in parallel, or an example using the provided script above.
Thanks in advance.
Extending my comment: in PowerShell v5, use Jobs and Workflows for parallel use cases.
# Example using parallel jobs
$start = Get-Date
# get all hotfixes
$task1 = { Get-Hotfix }
# get all running services
$task2 = { Get-Service | Where-Object Status -eq Running }
# parse log file
$task3 = { Get-Content -Path $env:windir\windowsupdate.log | Where-Object { $_ -like '*successfully installed*' } }
# run two tasks in the background, and one in the foreground
$job1 = Start-Job -ScriptBlock $task1
$job2 = Start-Job -ScriptBlock $task2
$result3 = Invoke-Command -ScriptBlock $task3
# wait for the remaining tasks to complete (if not done yet)
$null = Wait-Job -Job $job1, $job2
# now they are done, get the results
$result1 = Receive-Job -Job $job1
$result2 = Receive-Job -Job $job2
# discard the jobs
Remove-Job -Job $job1, $job2
$end = Get-Date
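The $start and $end stamps are captured but never reported; one extra line (my addition) prints the elapsed time:
'All tasks finished in {0:n1} seconds' -f ($end - $start).TotalSeconds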
# Example, using WorkFlow
workflow Test-WFConnection
{
    param
    (
        [string[]]$Computers
    )
    foreach -parallel ($computer in $Computers)
    {
        Test-Connection -ComputerName $computer -Count 1 -ErrorAction SilentlyContinue
    }
}
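Invoking the workflow then looks like calling any other command; the computer names below are illustrative:
Test-WFConnection -Computers 'server01', 'server02', 'server03'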

Unable to pass SharePoint (PnP) connection to PowerShell job because it's becoming deserialized?

I am trying to delete every file in a SharePoint list. My org has retention turned on so I can't just delete the entire list, but must remove every folder/file. My issue is around the connection itself when used with Start-Job.
It's painfully slow, so I wanted to spin up batches of 10+ jobs to delete simultaneously and reuse the connection, but there is an issue passing the connection as an argument because it becomes deserialized. I found this post with the exact same issue and no solution.
If I "workaround" it by connecting each Start-Job, I get throttled by SharePoint online.
function Empty-PnPFiles($SPSiteUrl, $RelativeURL)
{
    $connection = Connect-PnPOnline -URL $SPSiteUrl -UseWebLogin -ReturnConnection
    # Get all files in the folder
    $Files = Get-PnPFolderItem -FolderSiteRelativeUrl $RelativeURL -ItemType File
    # Delete all files in the folder
    $n = 0
    $Jobs = @()
    ForEach ($File in $Files)
    {
        $n++
        Write-Host "Creating job to delete '$($File.ServerRelativeURL)'"
        # Delete file
        $Jobs += Start-Job -ArgumentList @($connection, $File) -ScriptBlock {
            $LocalConnection = $args[0]
            # $LocalConnection = Connect-PnPOnline -URL <My SP URL> -UseWebLogin -ReturnConnection
            $LocalFile = $args[1]
            Remove-PnPFile -ServerRelativeUrl $LocalFile.ServerRelativeURL -Connection $LocalConnection -Force -Recycle
        }
        # Do in batches. Every 10 jobs, wait for completion
        if ($n % 10 -eq 0)
        {
            Write-Host "Waiting for batch $n ($($Files.Count)) to complete before next batch" -ForegroundColor Green
            $Jobs | Wait-Job | Receive-Job
            $Jobs = @()
        }
    }
    # If there are leftover jobs, wait for them
    if ($Jobs)
    {
        $Jobs | Wait-Job | Receive-Job
    }
}
$SiteURL = "https://<MySiteCollection>.sharepoint.com/sites/<MySite>"
$ListName = "TempDelete"
Empty-PnPFiles -SPSiteUrl $SiteURL -RelativeURL "/TempDelete" # <My folder to delete all files>
The error I get is:
Cannot bind parameter 'Connection'. Cannot convert the "PnP.PowerShell.Commands.Base.PnPConnection" value of type "Deserialized.PnP.PowerShell.Commands.Base.PnPConnection" to type "PnP.PowerShell.Commands.Base.PnPConnection".
How can I pass the connection to the script block without the serialization error? Or is there a better way to bulk-delete files from SPO using PowerShell? I have to use PowerShell because it's the only tool available to me currently.
Use Invoke-Command instead of Start-Job
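A minimal sketch of that suggestion, as I read it: Invoke-Command without -ComputerName or -Session runs the script block in the current process, so the live PnPConnection can be passed without being serialized. Note this runs sequentially rather than in parallel:
foreach ($File in $Files) {
    Invoke-Command -ScriptBlock {
        param($LocalConnection, $LocalFile)
        # the connection stays a live PnPConnection here, not a deserialized copy
        Remove-PnPFile -ServerRelativeUrl $LocalFile.ServerRelativeURL -Connection $LocalConnection -Force -Recycle
    } -ArgumentList $connection, $File
}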

PowerShell Invoke-Command Returns Blank Data?

Been trying to solve this for a bit and can't seem to figure it out.
I have the following script:
$Servers = Get-Content -Path "C:\Utilities_PowerShell\ServerList.txt"
$IISServiceName1 = 'W3SVC'
$IISServiceName2 = 'IISAdmin'
$IISServiceName3 = 'WAS'
$IISarrService = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3
$IISarrServiceCheck = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 -ErrorAction SilentlyContinue -ErrorVariable NoService
function IISServiceStatus # Checks for status of IIS services
{
    param (
        $IISServiceName1,
        $IISServiceName2,
        $IISServiceName3,
        $IISarrService,
        $IISarrServiceCheck
    )
    if (Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3)
    {
        Write-Host "Status of IIS service(s) on $env:ComputerName :"
        Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 | Select Name,DisplayName,Status | Format-Table -AutoSize
    }
    else
    {
        Write-Host " No IIS service(s) were found..." -foreground "red"
    }
}
$Sessions = New-PSSession -ComputerName $Servers
$EndJobs = $Sessions | ForEach-Object {
    Invoke-Command -Session $_ -ScriptBlock ${function:IISServiceStatus} -AsJob -ArgumentList $IISServiceName1, $IISServiceName2, $IISServiceName3, $IISarrService, $IISarrServiceCheck | Wait-Job | Receive-Job
    Write-Host " "
}
Whenever I run it, all I get is the output of:
Status of IIS service(s) on *PC* :
If I run the function outside of a loop/invoke-command, the results are absolutely perfect. What is wrong with my remote loop?
I've tried putting the variables inside the function, I've tried running invoke-command without the argument list, etc.
Update: 3/17/16
Turns out...if I run my actual script as is, the result of $EndJobs is weird in that it outputs ALL services in one table and then the three IIS services in another table. This would explain why when I run my invoke-command (stopIIS) scriptblock...I had to reboot the whole server because it took all of the services down.
These functions run PERFECTLY when not run via remote/invoke-command.
What the heck...invoke-command is seriously screwing with my stuff!
Anyone have any ideas/tips on how I can run my local script (which works 100%) on a set of servers from a text file without weird issues like this? Is invoke-command the only way?
Do you have the same problem if you wrap it all into the script block, like this?
$Servers = Get-Content 'C:\Utilities_PowerShell\ServerList.txt'
$Sessions = New-PSSession -ComputerName $Servers
$EndJobs = $Sessions | ForEach-Object {
Invoke-Command -Session $_ -ScriptBlock {
$IISServiceName1 = 'W3SVC'
$IISServiceName2 = 'IISAdmin'
$IISServiceName3 = 'WAS'
$IISarrService = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3
$IISarrServiceCheck = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 -ErrorAction SilentlyContinue -ErrorVariable NoService
function IISServiceStatus { # Checks for status of IIS services
param (
$IISServiceName1,
$IISServiceName2,
$IISServiceName3,
$IISarrService,
$IISarrServiceCheck
)
if (Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3) {
Write-Host "Status of IIS service(s) on $env:ComputerName :"
Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 | Select Name,DisplayName,Status | Format-Table -AutoSize
} else {
Write-Host ' No IIS service(s) were found...' -ForegroundColor Red
}
}
IISServiceStatus $IISServiceName1 $IISServiceName2 $IISServiceName3 $IISarrService $IISarrServiceCheck
} -AsJob | Wait-Job | Receive-Job
Write-Host ' '
}
$EndJobs
I'm having a similar issue. I'm using CredSSP to test second-hop authentication for an automation that shuts down a production environment cleanly. My script has three sections: session setup, the invoke, and session teardown. If I run each piece separately, I get output. If I run the whole script, I get blank lines matching the amount of output I get when I run them separately. There's nothing fancy in my invoke (the backtick is line continuation; I prefer Python's formatting paradigm to PowerShell/C#'s):
Invoke-Command `
-Session $workingSession `
-ScriptBlock {
get-service *spool* -ComputerName server01
}

Update IIS 6 website credentials via PowerShell

I am trying to write a script that will loop through all of my local IIS websites and update their physical path credentials whenever I'm forced to update my domain password.
The following works... the first time you run it...
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    $SiteElement.virtualDirectoryDefaults.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
    $SiteElement.virtualDirectoryDefaults.password = $Credentials.Password
    $SiteElement | Set-Item -Force
}
After running this, I noticed that the following properties also get set
$SiteElement.userName #Same as was set earlier on .virtualDirectoryDefaults
$SiteElement.password #Same as was set earlier on .virtualDirectoryDefaults
Subsequently, anytime I try to update the credentials using the code above, these two properties remain unchanged, and the changes don't take effect in IIS.
So the result is:
$SiteElement.userName #Unchanged
$SiteElement.password #Unchanged
$SiteElement.virtualDirectoryDefaults.userName #New value
$SiteElement.virtualDirectoryDefaults.password #New value
And the IIS site still shows the old username in the UI and the credentials fail.
So naturally I tried setting those extra 2 properties in my update function:
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    $SiteElement.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
    $SiteElement.password = $Credentials.Password
    $SiteElement.virtualDirectoryDefaults.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
    $SiteElement.virtualDirectoryDefaults.password = $Credentials.Password
    $SiteElement | Set-Item -Force
}
The code throws no errors or warnings, but the end result is the same, those 2 extra properties remain unchanged.
I am using the following code to get "$SiteElement"
$sites = Get-ChildItem IIS:\Sites
$sites | Foreach-Object { Set-Site-Credentials -SiteElement $_ -Credentials $newCredentials }
Also, at the end of the script I restart IIS using this command:
Restart-Service W3SVC
Ugh, finally found a command that works. All in all, I've tried four different variations of the same thing from different examples around the interwebz, all of which only work the first time. But this command updates properly on subsequent changes:
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    Set-WebConfiguration -Filter "$($SiteElement.ItemXPath)/application[@path='/']/virtualDirectory[@path='/']" -Value @{userName="$($Credentials.Domain)\$($Credentials.UserName)"; password="$($Credentials.Password)"}
}
The full script
param (
    [switch]$All,
    [switch]$AllPools,
    [switch]$AllSites,
    [string]$AppPool,
    [string]$Site
)
Import-Module WebAdministration
function Set-AppPool-Credentials(
    $AppPoolElement,
    $Credentials
){
    Set-ItemProperty $AppPoolElement.PSPath -name processModel -value @{userName="$($Credentials.Domain)\$($Credentials.UserName)";password="$($Credentials.Password)";identitytype=3}
}
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    Set-WebConfiguration -Filter "$($SiteElement.ItemXPath)/application[@path='/']/virtualDirectory[@path='/']" -Value @{userName="$($Credentials.Domain)\$($Credentials.UserName)"; password="$($Credentials.Password)"}
}
$newCredentials = (Get-Credential).GetNetworkCredential()
$appPools = Get-ChildItem IIS:\AppPools
$sites = Get-ChildItem IIS:\Sites
if($All -or $AllPools){
    $appPools | Foreach-Object { Set-AppPool-Credentials -AppPoolElement $_ -Credentials $newCredentials }
}
elseif($AppPool){
    $poolElement = ($appPools | Where-Object { $_.name -eq $AppPool })
    Set-AppPool-Credentials -AppPoolElement $poolElement -Credentials $newCredentials
}
if($All -or $AllSites){
    $sites | Foreach-Object { Set-Site-Credentials -SiteElement $_ -Credentials $newCredentials }
}
elseif($Site){
    $siteElement = ($sites | Where-Object { $_.name -eq $Site })
    Set-Site-Credentials -SiteElement $siteElement -Credentials $newCredentials
}
Restart-Service W3SVC
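A usage sketch, assuming the script above is saved as Update-IISCredentials.ps1 (the file name is my own placeholder):
# update every app pool and site in one pass
.\Update-IISCredentials.ps1 -All
# or target a single site
.\Update-IISCredentials.ps1 -Site "MyWebsite"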

Send files over PSSession

I just burned a couple of hours searching for a solution to send files over an active PSSession. And the result is nada, niente. I'm trying to invoke a command on a remote computer over an active session, which should copy something from a network storage. So, basically this is it:
icm -Session $s {
    Copy-Item $networkLocation $PCLocation
}
Because of the "second hop" problem, I can't do that directly, and because I'm running Windows Server 2003, I can't enable CredSSP. I could first copy the files to my computer and then send/push them to the remote machine, but how? I tried PModem, but as I saw, it can only pull data, not push.
Any help is appreciated.
This is now possible in PowerShell / WMF 5.0
Copy-Item has -FromSession and -ToSession parameters. You can use one of these and pass in a session variable.
e.g.
$cs = New-PSSession -ComputerName 169.254.44.14 -Credential (Get-Credential) -Name SQL
Copy-Item Northwind.* -Destination "C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL2008R2\MSSQL\DATA\" -ToSession $cs
See more examples here, or check out the official documentation.
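Conversely, -FromSession pulls a file from the remote machine down to the local one; reusing the $cs session from above (the paths here are illustrative):
Copy-Item -Path 'C:\remote\log.txt' -Destination 'C:\local\' -FromSession $cs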
If it was a small file, you could send the contents of the file and the filename as parameters.
$f="the filename"
$c=Get-Content $f
invoke-command -session $s -script {param($filename,$contents) `
set-content -path $filename -value $contents} -argumentlist $f,$c
If the file is too long to fit within the session's limits, you could read the file in chunks and use a similar technique to append them together at the target location, as sketched below.
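A rough sketch of that chunked variant (my own illustration; $s is an open PSSession and the paths are placeholders). Each chunk is Base64-encoded so it survives the trip as a string, then decoded and appended on the remote side:
$src = 'C:\local\big.bin'
$dst = 'C:\remote\big.bin'
$stream = [IO.File]::OpenRead($src)
$buffer = New-Object byte[] (64KB)
try {
    while (($read = $stream.Read($buffer, 0, $buffer.Length)) -gt 0) {
        $chunk = [Convert]::ToBase64String($buffer, 0, $read)
        Invoke-Command -Session $s -ScriptBlock {
            param($path, $b64)
            # decode the chunk and append it to the destination file
            $bytes = [Convert]::FromBase64String($b64)
            $fs = [IO.File]::Open($path, 'Append')
            try { $fs.Write($bytes, 0, $bytes.Length) } finally { $fs.Close() }
        } -ArgumentList $dst, $chunk
    }
} finally {
    $stream.Close()
}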
PowerShell 5+ has built-in support for doing this, described in David's answer.
I faced the same problem a while ago and put together a proof-of-concept for sending files over a PS Remoting session. You'll find the script here:
https://gist.github.com/791112
#requires -version 2.0
[CmdletBinding()]
param (
    [Parameter(Mandatory=$true)]
    [string]
    $ComputerName,
    [Parameter(Mandatory=$true)]
    [string]
    $Path,
    [Parameter(Mandatory=$true)]
    [string]
    $Destination,
    [int]
    $TransferChunkSize = 0x10000
)
function Initialize-TempScript ($Path) {
    "<# DATA" | Set-Content -Path $Path
}
function Complete-Chunk () {
@"
DATA #>
`$TransferPath = `$Env:TEMP | Join-Path -ChildPath '$TransferId'
`$InData = `$false
`$WriteStream = [IO.File]::OpenWrite(`$TransferPath)
try {
`$WriteStream.Seek(0, 'End') | Out-Null
`$MyInvocation.MyCommand.Definition -split "``n" | ForEach-Object {
if (`$InData) {
`$InData = -not `$_.StartsWith('DATA #>')
if (`$InData) {
`$WriteBuffer = [Convert]::FromBase64String(`$_)
`$WriteStream.Write(`$WriteBuffer, 0, `$WriteBuffer.Length)
}
} else {
`$InData = `$_.StartsWith('<# DATA')
}
}
} finally {
`$WriteStream.Close()
}
"#
}
function Complete-FinalChunk ($Destination) {
@"
`$TransferPath | Move-Item -Destination '$Destination' -Force
"@
}
$ErrorActionPreference = 'Stop'
Set-StrictMode -Version Latest
$EncodingChunkSize = 57 * 100
if ($EncodingChunkSize % 57 -ne 0) {
    throw "EncodingChunkSize must be a multiple of 57"
}
$TransferId = [Guid]::NewGuid().ToString()
$Path = ($Path | Resolve-Path).ProviderPath
$ReadBuffer = New-Object -TypeName byte[] -ArgumentList $EncodingChunkSize
$TempPath = ([IO.Path]::GetTempFileName() | % { $_ | Move-Item -Destination "$_.ps1" -PassThru}).FullName
$Session = New-PSSession -ComputerName $ComputerName
$ReadStream = [IO.File]::OpenRead($Path)
$ChunkCount = 0
Initialize-TempScript -Path $TempPath
try {
    do {
        $ReadCount = $ReadStream.Read($ReadBuffer, 0, $EncodingChunkSize)
        if ($ReadCount -gt 0) {
            [Convert]::ToBase64String($ReadBuffer, 0, $ReadCount, 'InsertLineBreaks') |
                Add-Content -Path $TempPath
        }
        $ChunkCount += $ReadCount
        if ($ChunkCount -ge $TransferChunkSize -or $ReadCount -eq 0) {
            # send
            Write-Verbose "Sending chunk ($ChunkCount bytes)"
            Complete-Chunk | Add-Content -Path $TempPath
            if ($ReadCount -eq 0) {
                Complete-FinalChunk -Destination $Destination | Add-Content -Path $TempPath
                Write-Verbose "Sending final chunk"
            }
            Invoke-Command -Session $Session -FilePath $TempPath
            # reset
            $ChunkCount = 0
            Initialize-TempScript -Path $TempPath
        }
    } while ($ReadCount -gt 0)
} finally {
    if ($ReadStream) { $ReadStream.Close() }
    $Session | Remove-PSSession
    $TempPath | Remove-Item
}
Some minor changes would allow it to accept a session as a parameter instead of starting a new one. I found the memory consumption of the Remoting service on the destination computer could grow quite large when transferring large files. I suspect PS Remoting wasn't really designed to be used this way.
NET USE allows you to map a local drive letter to the remote system, which then allows you to use the drive letter in your PSSession, or even without a PSSession. This is helpful if you don't have PowerShell v5.0, and it can be convenient even if you do.
You may use the remote machine name or its IP address as part of the remote UNC path and you can specify the username and password credentials on the same line:
NET USE Z: \\192.168.1.50\ShareName /USER:192.168.1.50\UserName UserPassword
Another example:
NET USE Z: \\RemoteSystem\ShareName /USER:RemoteSystem\UserName UserPassword
OR
NET USE Z: \\RemoteSystem\ShareName /USER:Domain\UserName UserPassword
If you don't supply the user credentials on the same line, you will be prompted for them:
>NET USE Z: \\192.168.1.50\ShareName
Enter the user name for '192.168.1.50': 192.168.1.50\UserName
Enter the password for 192.168.1.50: *****
The command completed successfully.
You may remove the drive letter when you're finished with the following:
NET USE Z: /delete
You can get the full syntax with NET USE /?
>net use /?
The syntax of this command is:
NET USE
[devicename | *] [\\computername\sharename[\volume] [password | *]]
[/USER:[domainname\]username]
[/USER:[dotted domain name\]username]
[/USER:[username@dotted domain name]
[/SMARTCARD]
[/SAVECRED]
[[/DELETE] | [/PERSISTENT:{YES | NO}]]
NET USE {devicename | *} [password | *] /HOME
NET USE [/PERSISTENT:{YES | NO}]
NET is a standard external command (net.exe, in the system folder) and works in PowerShell just fine.
# -Encoding Byte is needed for binary content in Windows PowerShell (PowerShell 7+ uses -AsByteStream instead)
$data = Get-Content 'C:\file.exe' -Encoding Byte -Raw
Invoke-Command -ComputerName 'server' -ScriptBlock { Set-Content -Path 'D:\filecopy.exe' -Value $using:data -Encoding Byte }
Don't actually know what the maximum file size limitation is.
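Presumably the ceiling is tied to WinRM's maximum envelope size, since the whole payload travels in a single remoting message; that is an assumption on my part, but you can inspect the relevant setting like so:
# default MaxEnvelopeSizekb is 500 on older systems; larger payloads will fail
Get-Item WSMan:\localhost\MaxEnvelopeSizekb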