How do I copy multiple files from multiple hosts in PowerShell?

I am trying to make a PowerShell script (5.1) that will copy several files and folders from several hosts. These hosts change frequently, so it would be ideal if I could use a list that I can append to when required.
I have this all working using xcopy, so I know the locations exist. I want to ensure that if a change is made while I am not in work, someone can just add or remove a host in the text file and the backup will continue to work.
The code I have is supposed to go through each host in my list of hosts and copy all the files from the list of file paths before moving on to the next host.
But two errors are showing up:
The term '\\REMOTEHOST\c$\Users\Public\desktop\back-up\$Computers' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:8 char:17
and:
copy-item : Cannot find path '\\HOST\C$\LK\Appdata\Cmmcfg C$\LKAppData\Errc C$\LK\Appdata\TCOMP C$\LK\Probes C$\LK\Appdata\CAMIO C$\LK\Appdata\LaunchPad C$\LK\Appdata\Wincmes C$\barlen.dta C$\Caliprogs C$\Cali' because it does not exist.
This does not seem to be reading through the list as I intended; I have also noticed that the host it is reading from is sixth in the list, not first.
# This file contains the list of hosts you want to copy files from
$computers = Get-Content 'Y:\***FILEPATH***\HOSTFILE.txt'
# This is the file/folder(s) you want to copy from the hosts in the $computers variable
$source = Get-Content 'Y:\***FILEPATH***\FilePaths.txt'
# The destination location you want the file/folder(s) to be copied to
$destination = \\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers
foreach ($item in $computers) {
}
foreach ($item in $source) {
}
copy-item \\$computer\$source -Destination $destination -Verbose

Your destination variable needs to be enclosed in quotes. To have it evaluate other variables inside of it, enclose it in double quotes. Otherwise PowerShell thinks it's a command you are trying to run.
$destination = "\\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers"

Cracked it, thank you for your help. I was messing up the foreach command! I had both variables set to $item, so I was confusing things!
foreach ($itemhost in $computers) {
    $destination = "\\Remotehost\c$\Users\xoliver.jeffries\desktop\back-up\$itemhost"
    foreach ($item in $source) {
        copy-item "\\$itemhost\$item*" -Destination $destination -Verbose -recurse
    }
}
It's not the neatest output, but that's just a snag! The code now enables me to use a list of hosts and a list of files and copy them to a remote server!
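For reference, here is a lightly tidied sketch of the working approach (the redacted file paths and REMOTEHOST are kept as placeholders from the question):
$computers = Get-Content 'Y:\***FILEPATH***\HOSTFILE.txt'   # one host name per line
$source = Get-Content 'Y:\***FILEPATH***\FilePaths.txt'     # one path per line, relative to each host
foreach ($itemhost in $computers) {
    # Give each host its own subfolder on the backup server
    $destination = "\\REMOTEHOST\c$\Users\Public\desktop\back-up\$itemhost"
    foreach ($item in $source) {
        Copy-Item "\\$itemhost\$item*" -Destination $destination -Verbose -Recurse
    }
}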

Related

Copy-Item cmdlet only works correctly when the destination folder exists

I want to copy folders with their contents to a remote computer using a PSSession and Copy-Item. When running the script for the first time, it has to create the destination folder; it does so correctly and is then supposed to dump the folders, with their contents, into the destination folder. What it instead does is dump two of the folders correctly and then dump the contents of the third folder, not the folder itself. When I run it a second time, without deleting the destination folder, everything runs fine.
I have tried various parameters, including -Container, but nothing seems to help. Here is where I use the function in my code; I use a lot of environment variables, and variables in general, because this needs to be a script that can be put anywhere and work.
if (Test-Path -Path "$env:TEMP\VMlogs") {
    Write-Host "I'M GONNA SEND IT!"; Pause
    Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination $Destination`_$Source -Force -Recurse
    Write-Host "Logs copied successfully!"
    Remove-Item "$env:TEMP\VMlogs" -Recurse
} else {
    Write-Host "There was an issue copying logs!"
    Pause
    Exit
}
What I expect is that the folders are put into the destination folder with their structure intact; instead, this only happens on the second run of the script, after the destination folder has already been created.
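No answer is quoted for this one, but the symptom (correct only once the destination already exists) points at pre-creating the destination directory in the remote session before copying, so Copy-Item copies the folders into it rather than treating the destination as the name of the copy. A minimal sketch, reusing the question's $Targetsession and destination variables:
# Create the destination folder in the remote session first
Invoke-Command -Session $Targetsession -ScriptBlock {
    param($Path)
    if (-not (Test-Path -Path $Path)) {
        New-Item -Path $Path -ItemType Directory | Out-Null
    }
} -ArgumentList "$Destination`_$Source"
# With the folder in place, each source folder is dumped into it with its structure intact
Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination "$Destination`_$Source" -Force -Recurse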

Is it possible to do a search and replace in file content on multiple network locations?

I need to search for a string and then replace it with another in multiple files. Sounds easy, but the hard part is that it's multiple files on multiple network locations. I've tried connecting to all of the locations at once with VS Code and then using the built-in search and replace function. This almost works, except that big searches seem to hang.
I'm now looking for another, more stable way to do this. Anybody got any ideas? I thought PowerShell could be a good candidate, but unfortunately I'm not that used to working with it.
I found this guide and it's a bit like what I want, except I need to do it on multiple files at multiple locations at once.
https://mcpmag.com/articles/2018/08/08/replace-text-with-powershell.aspx
I would settle for running one script per location, since there are fewer than 20 locations to scan. But it needs to include subfolders.
Any tips are appreciated, thanks! :)
Edit 1:
The folder structure differs from location to location, so it's hard to say how it looks. But I can say that no location has a folder structure deeper than 15 levels. The text I'm replacing consists of certificate thumbprints stored in .config files. The files are between 100 and 1000 characters long, and the thumbprints I'm replacing look something like this: d2e8c58e5b34021671f2121483572f03f54ab9ae
This assumes that the different network locations are in trusted domains, or at least listed in the WinRM TrustedHosts. PowerShell remoting will also need to be enabled on all computers involved; run Enable-PSRemoting -Force (in an elevated PowerShell) to enable it.
$command = {
    Get-ChildItem -Path C:\Test\ -Include *.config -File -Recurse | ForEach-Object {
        # (-Name was dropped here: it returns bare strings, which have no FullName property)
        $configContent = Get-Content -Path $_.FullName -Raw
        $configContent.Replace("Old Value", "New Value") | Out-File -FilePath $_.FullName -Force
    }
}
Invoke-Command -ComputerName "TestServer1", "TestServer2", "etc..." -ScriptBlock $command
If you are not part of the domain but have a domain/server login, you will need to use the -Credential parameter of Invoke-Command. This will basically find all files with the .config extension in any subfolder of the path, get the current content of each .config file, replace your value, and finally overwrite the existing config file. WATCH OUT THOUGH: this will touch EVERY .config file in that path. If a file doesn't contain the string, it will just be rewritten unchanged.
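For example, a minimal sketch prompting for credentials (the server name is a placeholder):
Invoke-Command -ComputerName "TestServer1" -Credential (Get-Credential) -ScriptBlock $command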
Without seeing an example of the folder structures and files, it is quite hard to give a thorough answer. However, I would probably build a series of ForEach segments. For example:
ForEach ($Server in $Servers)
{
    ForEach ($File in $Files)
    {
        Select-String -Path $File -Pattern "$ExampleString"
    }
}
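Note that Select-String only finds matches. A minimal sketch of the full find-and-replace per location, using placeholder UNC paths and the thumbprint format from the question:
$locations = '\\server1\share', '\\server2\share'   # the < 20 network locations
$old = 'd2e8c58e5b34021671f2121483572f03f54ab9ae'   # old thumbprint (from the question)
$new = '0000000000000000000000000000000000000000'   # new thumbprint (placeholder)
foreach ($location in $locations) {
    Get-ChildItem -Path $location -Filter *.config -File -Recurse | ForEach-Object {
        $content = Get-Content -Path $_.FullName -Raw
        if ($content -match $old) {
            # Only rewrite files that actually contain the old thumbprint
            $content -replace $old, $new | Set-Content -Path $_.FullName
        }
    }
}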

How can I use PowerShell to keep two directories updated with the latest files

I have two directories:
C:\G\admin\less
C:\G\user\less
Inside those directories I have multiple .less and .css files. I know all the names of the files, so I would like to hardcode a list of them into the script. Perhaps this could be in an array or something, but I don't know enough PowerShell even to know whether the scripting language has arrays.
C:\G\admin\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
C:\G\user\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
Is there a way I can use PowerShell to check each of these files (which I want to hardcode in my program) one by one, and copy the last-modified version from its directory to the other directory, so that both directories contain the latest versions of these files?
Note that I would need to evaluate the predefined list of files one by one, comparing the modified date in one directory with the other and copying over as needed.
Note up front that this may not be the best solution or approach.
This solution assumes the following:
- When the LastWriteTime of a file in one folder is newer than that of its counterpart, the file is copied to the other folder.
- I'm not doing path validation, for brevity; if you want path validation, just ask.
- I'm assuming that all the files in those folders must be tracked; otherwise, read the comments in the code.
- I suggest you back up your folders before you run the script.
#If there are files you don't want to track in those folders (for example, txt files),
#just write: $userFiles = dir C:\G\user\less\ -Exclude "*.txt"
#If you don't want to track txt files but one of them should be tracked among the other formats:
#$userFiles = dir C:\G\user\less\ -Exclude "*.txt" -Include "C:\G\user\less\Txtadditionaltotrack.txt"
$userFiles = dir C:\G\user\less\
$adminfiles = dir C:\G\admin\less\
foreach ($userfile in $userFiles)
{
    $exactadminfile = $adminfiles | ? {$_.Name -eq $userfile.Name} | Select -First 1
    #My suggestion is to validate that it actually found the file.
    #For brevity I don't call Test-Path here;
    #I'm assuming both directories are exact copies of each other, so the file will be found.
    if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
    {
        Write-Verbose "Copying $($exactadminfile.FullName) to $($userfile.FullName)"
        Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
    }
    else
    {
        Write-Verbose "Copying $($userfile.FullName) to $($exactadminfile.FullName)"
        Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
    }
}
You can improve it: as written, the code always copies a file from one directory to the other. Inside the else you can add a check so that when the LastWriteTime is equal on both sides nothing is copied. You can improve it in many ways; I hope you get the idea.
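For example, a minimal sketch of that check, with an elseif guarding the reverse copy:
if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
{
    Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
}
elseif ($userfile.LastWriteTime -gt $exactadminfile.LastWriteTime)
{
    # Equal timestamps fall through to neither branch: already in sync, nothing is copied
    Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
}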
Below is the modification to the code that achieves this requirement. PLEASE READ THE COMMENTS IN THE CODE. Note that I'm not following best practices (avoiding unexpected errors, naming all variables correctly, ...).
#To make it more dynamic, you can keep all the file names
#(including extensions) in a file, one name per line.
#For example, at path C:\FilesToWatch\watcher.txt:
#$filestowatch = Get-Content C:\FilesToWatch\watcher.txt
$filestowatch = "felicio.txt", "marcos.txt"
$userFiles = dir C:\G\user\less\
$adminfiles = dir C:\G\admin\less\
#Optionally, instead of the if statement below, you can filter up front:
#$userFiles = dir C:\G\user\less\ | ? {$filestowatch -contains $_.Name}
#$adminfiles = dir C:\G\admin\less\ | ? {$filestowatch -contains $_.Name}
#Loaded that way, the first if statement in the code below can be removed, because
#we make sure $userFiles and $adminfiles only contain the files to monitor.
foreach ($userfile in $userFiles)
{
    if ($filestowatch -contains $userfile.Name)
    {
        $exactadminfile = $adminfiles | ? {$_.Name -eq $userfile.Name} | Select -First 1
        #My suggestion is to validate that it actually found the file.
        #For brevity I don't call Test-Path here;
        #I'm assuming both directories are exact copies of each other, so the file will be found.
        if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
        {
            Write-Verbose "Copying $($exactadminfile.FullName) to $($userfile.FullName)"
            Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
        }
        else
        {
            Write-Verbose "Copying $($userfile.FullName) to $($exactadminfile.FullName)"
            Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
        }
    }
}
I think what you need is symbolic links, aka symlinks.
With symlinks, you can define files and folders that will always be in sync; the target file is updated automatically when the original is modified.
To create a symbolic link, enter the following in the console/command prompt:
mklink /[command] [link path] [file or folder path]
Mklink can create several types of links, according to these commands:
/D – creates a directory symbolic link, which behaves much like a standard folder shortcut in Windows. (With no option at all, mklink creates a file symbolic link.)
/H – creates a hard link to a file.
/J – creates a directory junction, which acts like a hard link to a folder.
The syntax is simple. Choose your option, define the path you want for the symlink, and finally the path of the original file/folder.
For example, imagine I'm developing a new project and I want to share it with my client via a Dropbox shared folder. I don't want to move my whole workspace to Dropbox; I just want to share that specific folder with them:
mklink /J C:\Dropbox\clients_shared_folders\project_x C:\my_dev_rootfolder\project_x
Note that the first path is the symbolic folder I want to create, while the second path is the existing directory.
In your case, I'll assume you're working in the admin folder and want to generate a synced copy in the user folder:
mklink /J C:\G\user\less C:\G\admin\less
Here's a nice article for more info:
http://www.howtogeek.com/howto/16226/complete-guide-to-symbolic-links-symlinks-on-windows-or-linux/
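One caveat: mklink is a cmd.exe built-in, so it won't run directly in a PowerShell console. PowerShell 5.0 and later can create the same links natively with New-Item; a sketch using the folders from the question:
# Directory junction, equivalent to mklink /J (no elevation required)
New-Item -ItemType Junction -Path C:\G\user\less -Target C:\G\admin\less
# Or a directory symbolic link, equivalent to mklink /D (usually requires elevation)
# New-Item -ItemType SymbolicLink -Path C:\G\user\less -Target C:\G\admin\less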

How can I use spaces in a fully qualified pathname in Powershell?

I have a script that copies a number of files from different sources to a single directory for backup. The only step of the script that errors out has a space in both the path and the file name: \\server\Network Shares\Transfer\tu3\tu3 Code.mdb
I get the error copy-item : Cannot find path '\\server\Network Shares\Transfer\tu3\tu3 Code.mdb' because it does not exist. and I'm assuming it's because of the spaces in either the path or the file name. Does PowerShell allow spaces in a fully qualified path? If not, how can I get at the file?
Here's the relevant code (my $Dest is defined as a global variable for the script):
$TU3CodeUpdatedPathname = "\\server\Network Shares\Transfer\tu3\"
$TU3CodeUpdatedFilename = "tu3 Code.mdb"
$TU3CodeUpdated = $TU3CodeUpdatedPathname + $TU3CodeUpdatedFilename
#
$Source = $TU3CodeUpdated
$Dest = $VMShareSpacePathname
#
copy-item $Source $Dest
Try being more explicit and wrap the parameter values in quotes. Adding -Verbose might help with debugging. If it's complaining that the file doesn't exist, double-check that the file is accessible under the account your script runs as, if that's not the same as your user account.
Copy-Item -Path "$Source" -Destination "$Dest"
Just to check: might you have mixed up the variable names (TU3/HS3)?
$TU3CodeUpdatedPathname = "\\server\Network Shares\Transfer\tu3\"
$TU3CodeUpdatedFilename = "tu3 Code.mdb"
$TU3CodeUpdated = Join-Path -Path $TU3CodeUpdatedPathname -ChildPath $TU3CodeUpdatedFilename
Otherwise I can't see anything wrong with your code. Spaces are just fine within quotes, as you wrote it.
I would guess the user running the script does not have access rights to the file/share.
This post might help in that case.
This worked for me to copy a folder with a space in its name. I am using PowerShell 4.0:
$Source = "D:\test\Test cases"
$Dest = "D:\bck\Test cases"
Copy-Item -Path "$Source" "$Dest" -Recurse

Read text file and run a copy command for each line in the text file

I am trying to write a PowerShell script that will read a text file on my desktop that is filled with user names, then go out to a specified folder on our network share, let's say u:\data, and copy the contents of that folder to another network share, let's say y:\information, for each user in the text file.
How would this be written?
I have tried several approaches to reading the text file, and then several commands to copy the files, but each failed.
UPDATE:
Below is what I have done so far:
$user = Get-Content "test.txt"
$path = "\\abnas2\abusers\users"
$path2 = "\\abnas2\abdept\dept\testcopy"
$Copy = Copy-Item -Path $path\$user\* -Destination $path2\$user
I had one username, user1, in the test.txt file, and it pulled the name and copied perfectly.
Now if I add more than one name to the test.txt file and run the above, it errors out. The error it returned made it look like the three user names in the list were one user name.
What I need this to do is run the command for each name on the list. I was thinking I could use the foreach command, but I'm not sure how to do it.
UPDATE - 04/09/2014:
I have tried the following and am getting an error back:
$user = Get-Content "test.txt"
$path = "\abnas2\abusers\users"
$path2 = "\abnas2\abdept\dept\testcopy"
$Copy = Copy-Item -path $path\$user* -Destination $path2\$user
foreach($username in $user) {
Copy-Item -path $path\$username* -Destination $path2\$username\
}
When I run it I am getting the following error:
Copy-Item : An object at the specified path \\abnas2\abusers\users\user1 user2 user3 does not exist.
Those are the names in my test.txt file. Is there a way to get it to read one line at a time, execute the copy, and when done move on to the next name on the list and do the same? I'm not sure how to get it to do that.
You can use foreach. In this case:
foreach($username in $user) {
    Copy-Item -path $path\$username\* -Destination $path2\$username\
}
would copy the contents of each named folder in $user under $path to its corresponding folder in $path2.
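Putting it together with the question's variables, a minimal end-to-end sketch; the stray $Copy line is dropped, and New-Item plus -Recurse are additions to make sure each destination folder exists and subfolders come along:
$user = Get-Content "test.txt"               # one user name per line
$path = "\\abnas2\abusers\users"
$path2 = "\\abnas2\abdept\dept\testcopy"
foreach ($username in $user) {
    $dest = "$path2\$username"
    # Ensure the destination folder exists, then copy that user's files into it
    New-Item -Path $dest -ItemType Directory -Force | Out-Null
    Copy-Item -Path "$path\$username\*" -Destination $dest -Recurse -Force
}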