Copying files from one source to multiple destinations in parallel - PowerShell

I'm attempting to write a Microsoft PowerShell script which copies files from a single source to multiple destinations in parallel, based on a config file. The config file is a CSV file which looks like this:
Server,Type
server1,Production
server2,Staging
My script is called with one argument (.\myscript.ps1 buildnumber) but it doesn't seem to actually do any deleting or copying of files.
I'm sure my Copy-Item and Remove-Item code works, as I have tested it independently, but I think it's either an issue with how I am using script blocks or perhaps how I am using Start-Job.
Could anyone help me understand why this isn't working?
Thanks
Brad
<#
File Deployment Script
#>
#Requires -Version 2
param($build)

$sourcepath = "\\server\software\$build\*"
$Config = Import-Csv -Path C:\config\serverlist.txt

$scriptblock1 = {
    $server = $args[0]
    $destpath1 = "\\$server\share\Software Wizard\"
    $destpath2 = "\\$server\share\Software Wizard V4.9XQA\"
    Remove-Item "$destpath1\*" -Recurse -Force
    Remove-Item "$destpath2\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath1 -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath2 -Recurse -Force
}

$scriptblock2 = {
    $server = $args[0]
    $destpath = "\\$server\share\Software Wizard\"
    #Remove-Item "$destpath\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}

foreach ($line in $Config) {
    $server = $line.Server
    $type = $line.Type
    if ($type -match "Staging") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server
    }
    if ($type -match "Production") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server
    }
}

Your script block doesn't have access to variables declared outside of it when it's called from start-job. So $scriptblock1 and $scriptblock2 can't see $sourcepath.

To elaborate on Jamey's answer, you can see that the $sourcepath variable declared in the caller scope is not available within the job by comparing the output of the two calls below:
$sourcepath = 'source path'
$scriptblock = { Write-Host "sourcepath = $sourcepath; args = $args" }
& $scriptblock 'server name'
Start-Job $scriptblock -ArgumentList 'server name' | Wait-Job | Receive-Job
To fix this, simply pass the outer variable as part of the argument list:
$scriptblock2 = {
    param($sourcepath, $server)
    $destpath = ...
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}
...
Start-Job -Scriptblock $scriptblock2 -ArgumentList $sourcepath,$server
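For completeness, $scriptblock1 needs the same treatment. A sketch based on your original code (the only change is the param block and the extra argument):
$scriptblock1 = {
    param($sourcepath, $server)
    $destpath1 = "\\$server\share\Software Wizard\"
    $destpath2 = "\\$server\share\Software Wizard V4.9XQA\"
    Remove-Item "$destpath1\*" -Recurse -Force
    Remove-Item "$destpath2\*" -Recurse -Force
    # $sourcepath now arrives as a parameter, so the job can see it
    Copy-Item $sourcepath -Destination $destpath1 -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath2 -Recurse -Force
}
Start-Job -ScriptBlock $scriptblock1 -ArgumentList $sourcepath, $server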

Related

PS Script to uninstall Firefox from multiple locations

I am working on a script to uninstall Firefox from multiple locations. The script I've created works to an extent. I have made changes to my original script based on the answer below, plus some other changes:
$LocalUsers = (Get-ChildItem -Path "C:\Users").Name

# Uninstalling from Program Files
if (Test-Path "${env:ProgramFiles(x86)}\Mozilla Firefox\uninstall\helper.exe") {
    Start-Process -FilePath "${env:ProgramFiles(x86)}\Mozilla Firefox\uninstall\helper.exe" -ArgumentList '/S' -Verbose #-ErrorAction SilentlyContinue
}
if (Test-Path "${env:ProgramFiles}\Mozilla Firefox\uninstall\helper.exe") {
    Start-Process -FilePath "${env:ProgramFiles}\Mozilla Firefox\uninstall\helper.exe" -ArgumentList '/S' -Verbose #-ErrorAction SilentlyContinue
}

# Uninstalling for each user
ForEach ($LocalUser in $LocalUsers) {
    $Userpath = "C:\Users\" + $LocalUser
    if (Test-Path "$Userpath\AppData\Local\Mozilla Firefox\uninstall\helper.exe") {
        Start-Process -FilePath "$Userpath\AppData\Local\Mozilla Firefox\uninstall\helper.exe" -ArgumentList '/S' -Verbose #-ErrorAction SilentlyContinue
    }
    Start-Sleep 20
    # Remove shortcuts from appdata
    Remove-Item -Path "$Userpath\AppData\Local\Mozilla" -Force -Recurse -Verbose #-ErrorAction SilentlyContinue
    Remove-Item -Path "$Userpath\AppData\LocalLow\Mozilla" -Force -Recurse -Verbose #-ErrorAction SilentlyContinue
    Remove-Item -Path "$Userpath\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Firefox.lnk" -Force -Verbose #-ErrorAction SilentlyContinue
    Remove-Item -Path "$Userpath\desktop\firefox.lnk" -Force -Verbose #-ErrorAction SilentlyContinue
}

# Remove related registry keys
$pathToRemove = @(
    'HKLM:\Software\Mozilla'
    'HKLM:\SOFTWARE\mozilla.org'
    'HKLM:\SOFTWARE\MozillaPlugins'
    'HKLM:\SOFTWARE\WOW6432Node\Mozilla'
    'HKLM:\SOFTWARE\WOW6432Node\mozilla.org'
    'HKLM:\SOFTWARE\WOW6432Node\MozillaPlugins'
    'C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Firefox.lnk'
)
foreach ($path in $pathToRemove) {
    if (Test-Path $path) {
        try {
            Remove-Item $path -Recurse -Force -Verbose #-ErrorAction SilentlyContinue
        }
        catch {
            Write-Warning $_.Exception.Message
        }
    }
}
The script has worked on some machines, where it uninstalls the application; however, on others a trace of it is left behind in Windows Program Files. It appears as a dead link. I know it is a dead link because it is missing the Firefox logo. The strange thing is that, per the error, it points to %localappdata%\Mozilla Firefox\uninstall\helper.exe.
I'm assuming the problem is your chained if / elseif / else conditions. What could be happening is that, if the first condition is $true, you only remove the first registry key and then exit the chained conditions (this is by design):
# only results in 'hello if' and then exits the chained conditions
if ($true) {
    'hello if'
}
elseif ($true) {
    'hello elseif'
}
What you can do in this case is store all the paths in an array and then loop over them, testing whether each path exists and, if it does, removing it:
$pathToRemove = @(
    'HKLM:\Software\Mozilla'
    'HKLM:\SOFTWARE\mozilla.org'
    'HKLM:\SOFTWARE\MozillaPlugins'
    'HKLM:\SOFTWARE\WOW6432Node\Mozilla'
    'HKLM:\SOFTWARE\WOW6432Node\mozilla.org'
    'HKLM:\SOFTWARE\WOW6432Node\MozillaPlugins'
    'C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Firefox.lnk'
)
foreach ($path in $pathToRemove) {
    if (Test-Path $path) {
        try {
            Write-Verbose "Attempting to remove: $path" -Verbose
            Remove-Item $path -Recurse -Force
            Write-Verbose "Successfully removed: $path" -Verbose
        }
        catch {
            Write-Warning $_.Exception.Message
        }
    }
}

New-PSSession to create a loop and wait to finish for each line in a text file

I am trying to get files from the servers in a list using the code below:
$server = Get-Content server.txt
$server | ForEach-Object {
    $session = New-PSSession -ComputerName $server -Credential (Import-Clixml "mycredentials.xml")
    Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
    Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
}
If I explicitly supply a name to -ComputerName, it works like a charm.
When there are several names in the list, the execution stops after the first one. I suspect that the session closes after the first execution.
Is there a way to make it work like this:
Get-Content -> for each line execute the Copy-Item -> close the session -> open a new session to the next server -> ... and so on, meaning that $session only ever refers to the current server?
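Roughly, I'm after something like this sketch (using the getfiles function shown below and the same credential file):
$servers = Get-Content server.txt
$cred = Import-Clixml "mycredentials.xml"
foreach ($server in $servers) {
    # one session per server; $server is the current line from the file
    $session = New-PSSession -ComputerName $server -Credential $cred
    Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
    Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
    # close the session before moving on to the next server
    Remove-PSSession $session
}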
$function:getfiles
function getfiles {
    New-Item -Force -Path C:\path\trace.txt
    $remoteserver = $env:computername
    $trace = 'C:\path\trace.txt'
    $Include = @('*.keystore', '*.cer', '*.crt', '*.pfx', '*.jks', '*.ks')
    $exclude = '^C:\\(Windows|Program Files|Documents and Settings|Users|ProgramData)|\bBackup\b|\breleases?\b|\bRECYCLE.BIN\b|\bPerfLogs\b|\bold\b|\bBackups\b|\brelease?\b|'
    Get-ChildItem -Path 'C:\','D:\' -File -Include $Include -Recurse -EA 0 |
        Where-Object { $_.DirectoryName -notmatch $exclude } |
        Select-Object -ExpandProperty FullName |
        Set-Content -Path $trace
    $des = "C:\some\folder\$remoteserver"
    $safe = Get-Content $trace
    $safe | ForEach-Object {
        # find the drive delimiter
        $first = $_.IndexOf(":\")
        if ($first -eq 1) {
            # strip it
            $newdes = Join-Path -Path $des -ChildPath @($_.Substring(0,1) + $_.Substring(2))[0]
        }
        else {
            $newdes = Join-Path -Path $des -ChildPath $_
        }
        $folder = Split-Path -Path $newdes -Parent
        $err = 0
        # check if the folder exists
        $void = Get-Item $folder -ErrorVariable err -ErrorAction SilentlyContinue
        if ($err.Count -ne 0) {
            # create it when it doesn't
            $void = New-Item -Path $folder -ItemType Directory -Force -Verbose
        }
        $void = Copy-Item -Path $_ -Destination $newdes -Recurse -Container -Verbose
    }
}
UPDATE
So I have found out that the file the lines should be redirected to from the script is not populated, which explains why the next Copy-Item step fails. I have tried redirecting in different ways and still can't get it populated. The file itself is created without issues.
Made a workaround - placed the function in a script which is copied to the remote server, executed there, and cleaned up afterwards.
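In rough outline the workaround looks like this (paths are hypothetical, and Copy-Item -ToSession needs PowerShell 5.0+):
# copy the script to the remote server, run it there, then clean up
Copy-Item -Path .\getfiles.ps1 -Destination 'C:\temp\getfiles.ps1' -ToSession $session
Invoke-Command -Session $session -ScriptBlock { & 'C:\temp\getfiles.ps1' }
Invoke-Command -Session $session -ScriptBlock { Remove-Item 'C:\temp\getfiles.ps1' -Force }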

How to log IF statement in PowerShell?

I made a logon PowerShell script to check files and, if they are older than the source, copy the newer one to the PC. I am trying to have the result logged, but my log file is always empty. Where did I go wrong?
# Set Source
$S_P = "\\NETWORK\S_P.exe"
$S_T = "\\NETWORK\S_T.exe"
$S_P_Date = (Get-Item $S_P -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$S_T_Date = (Get-Item $S_T -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")

# Set Destination
$D_P = "C:\TEMP1\S_P.exe"
$D_T = "C:\TEMP2\S_T.exe"
$DF_P = "C:\TEMP1"
$DF_T = "C:\TEMP2"
$D_P_Date = (Get-Item $D_P -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$D_T_Date = (Get-Item $D_T -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$D_Log = "C:\TEMP\updated.txt"

# Compare date and Copy
function Check_Copy {
    if (!(Test-Path $D_Log)) { New-Item $D_Log }
    if ((Test-Path $D_P) -and (Test-Path $D_T)) {
        if ($D_P_Date -le $S_P_Date) { Copy-Item $S_P $DF_P -Force }
        if ($D_T_Date -le $S_T_Date) { Copy-Item $S_T $DF_T -Force }
    } else {
        Copy-Item $S_P $DF_P -Force
        Copy-Item $S_T $DF_T -Force
    }
}
Check_Copy | Out-File $D_Log -Append
You need to use the -PassThru parameter for output to be piped to your file; without it, Copy-Item produces no output for Out-File to capture.
Simply add it to your Copy-Item lines so that it looks like this:
Copy-Item $S_P $DF_P -Force -PassThru
Copy-Item $S_T $DF_T -Force -PassThru
Read this TechNet blog to learn more about using object pass through in PowerShell.
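For example, a trimmed-down sketch of Check_Copy showing just the copy lines with -PassThru:
function Check_Copy {
    # -PassThru makes Copy-Item emit the copied file object,
    # so Check_Copy has output that Out-File can write to the log
    Copy-Item $S_P $DF_P -Force -PassThru
    Copy-Item $S_T $DF_T -Force -PassThru
}
Check_Copy | Out-File $D_Log -Append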

Cannot bind argument to parameter 'Path' it is null and cannot find path

Trying to get these two to work but keep getting errors. Basically, I'm looking to pick up all files with the .txt extension from C:\Temp\Test and copy them to D:\Temp\Test on Server1 and Server2.
Doesn't work...
$servers = "Server1","Server2"
$SourcePath = (Get-ChildItem C:\Temp\Test *.txt).Name
$servers | ForEach {
Invoke-Command $servers -ScriptBlock {
$CompName = (Get-WmiObject -Class Win32_ComputerSystem).Name
$DestPath = "\\$CompName\D$\Temp\Test"
Copy-Item $SourcePath -Destination $DestPath -Recurse
}
}
This is a common mistake actually. When you use Invoke-Command to invoke your scriptblock on the remote server, it creates a new instance of PowerShell on that remote computer. That new instance of PowerShell has no idea what the $SourcePath variable is, since it was never set in that new instance. To work around this, give your scriptblock a parameter, and then supply the value of $SourcePath when you invoke the scriptblock. It can be done like this:
$servers = "Server1","Server2"
$SourcePath = (Get-ChildItem C:\Temp\Test *.txt).Name
$servers | ForEach {
Invoke-Command $servers -ScriptBlock {
Param($SourcePath)
$CompName = (Get-WmiObject -Class Win32_ComputerSystem).Name
$DestPath = "\\$CompName\D$\Temp\Test"
Copy-Item $SourcePath -Destination $DestPath -Recurse
} -ArgumentList $SourcePath
}
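As an aside, on PowerShell 3.0 and later the $using: scope modifier is another way to reference a caller-side variable inside a remote scriptblock. A minimal sketch of that alternative:
$servers | ForEach {
    Invoke-Command -ComputerName $_ -ScriptBlock {
        # $using:SourcePath expands to the caller's $SourcePath value (PowerShell 3.0+)
        Copy-Item $using:SourcePath -Destination "D:\Temp\Test" -Recurse
    }
}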

Moving files asynchronously in powershell

I have the following problem: I am writing a loop that checks whether some files have appeared in a folder and, if so, moves those files to another folder.
The script works nicely now; here is its code:
$BasePath = "C:\From"
$TargetPath = "C:\To"
$files = Get-ChildItem -File -Recurse -Path "$($BasePath)\$($Filename)" -ErrorAction SilentlyContinue
foreach ($file in $files)
{
    $subdirectorypath = Split-Path $file.FullName.Replace($BasePath, "").Trim("\")
    $targetdirectorypath = "$($TargetPath)\$($subdirectorypath)"
    if ((Test-Path $targetdirectorypath) -eq $false)
    {
        Write-Host "Creating directory: $targetdirectorypath"
        md $targetdirectorypath -Force
    }
    Write-Host "Copying file to: $($targetdirectorypath.TrimEnd('\'))\$($file.Name)"
    Move-Item $file.FullName "$($targetdirectorypath.TrimEnd('\'))\$($file.Name)" -Force
}
However, as some of those files can be quite big, I would like to move them asynchronously, in a "fire-and-forget" way. What is the best way to do this with PowerShell? This script will probably be running forever, so any asynchronous jobs would have to dispose of themselves after they are done copying, I think.
Thanks for any suggestions.
I would use a background job:
$scriptblock = {
    $BasePath = $args[0]
    $TargetPath = $args[1]
    $files = Get-ChildItem -File -Recurse -Path "$($BasePath)\$($Filename)" -ErrorAction SilentlyContinue
    foreach ($file in $files)
    {
        $subdirectorypath = Split-Path $file.FullName.Replace($BasePath, "").Trim("\")
        $targetdirectorypath = "$($TargetPath)\$($subdirectorypath)"
        if ((Test-Path $targetdirectorypath) -eq $false)
        {
            Write-Host "Creating directory: $targetdirectorypath"
            md $targetdirectorypath -Force
        }
        Write-Host "Copying file to: $($targetdirectorypath.TrimEnd('\'))\$($file.Name)"
        Move-Item $file.FullName "$($targetdirectorypath.TrimEnd('\'))\$($file.Name)" -Force
    }
}
$arguments = @("C:\From","C:\To")
Start-Job -ScriptBlock $scriptblock -ArgumentList $arguments
If later you want to see any output from the job, you can do the following:
Get-Job | Receive-Job
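Since the loop is meant to run indefinitely, finished jobs can also be cleaned up so they don't accumulate, for example:
# collect any remaining output, then remove jobs that have finished
Get-Job -State Completed | Receive-Job
Get-Job -State Completed | Remove-Job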