Copy-Item cmdlet only works correctly when the destination folder exists - PowerShell

I want to copy folders with their contents to a remote computer using a PSSession and Copy-Item. When the script runs for the first time it has to create the destination folder; it does so correctly and is then supposed to place the folders, with their contents, inside that destination folder. Instead, it copies two of the folders correctly and then dumps only the contents of the third folder, not the folder itself. When I run it a second time without deleting the destination folder, everything works fine.
I have tried various parameters, including -Container, but nothing seems to help. Here is where I use the function in my code. I use a lot of environment variables, and variables in general, because this script needs to work from any location.
if (Test-Path -Path "$env:TEMP\VMlogs") {
    Write-Host "I'M GONNA SEND IT!"; Pause
    # Copy the collected log folders into the destination folder on the remote session
    Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination "$Destination`_$Source" -Force -Recurse
    Write-Host "Logs copied successfully!"
    Remove-Item "$env:TEMP\VMlogs" -Recurse
} else {
    Write-Host "There was an issue copying logs!"
    Pause
    Exit
}
What I expect is that the folders are put into the destination folder with their structure intact, but this only happens on the second run of the script, after the destination folder has already been created.
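A common explanation for this is that when the destination folder does not exist yet, Copy-Item treats the destination as the name of the item to create rather than as a folder to copy into, so the first item gets flattened. A minimal sketch of a workaround, assuming the same $Targetsession, $Destination and $Source variables as in the script above, is to create the destination folder in the remote session before copying:

# Sketch: pre-create the remote destination folder, then copy into it.
$remoteDest = "$Destination`_$Source"

Invoke-Command -Session $Targetsession -ScriptBlock {
    param($path)
    # -Force makes this a no-op if the folder already exists
    New-Item -Path $path -ItemType Directory -Force | Out-Null
} -ArgumentList $remoteDest

Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination $remoteDest -Force -Recurse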

Related

PowerShell script running, but no action taken

The following two-line script attempts to copy a file from a source folder to a destination folder (where a copy of the file already exists).
Copy-Item -Path "D:\Thunderbird\Profiles\xyz.default\ImapMail\imap.gmail.com\msgFilterRules.dat" -Destination D:\Dropbox\Bkup\Bkp_Thunderbird\ -Force
Write-Host "Script executed"
The script executes without error (!) - i.e., I see "Script executed" in PowerShell - but no files are copied.
I have:
validated that the source and destination paths are correct.
tried the script with and without quotes in the source path.
tried it with Thunderbird both open and closed.
tried it with and without a trailing "\" on -Destination.
tried it with and without -Force at the end of the line.
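One way to surface what is actually happening (a sketch using the same paths from the script above) is to check both paths and force the copy to throw instead of failing silently:

# Sketch: make failures visible instead of silent.
$src = "D:\Thunderbird\Profiles\xyz.default\ImapMail\imap.gmail.com\msgFilterRules.dat"
$dst = "D:\Dropbox\Bkup\Bkp_Thunderbird\"

Write-Host "Source exists: $(Test-Path -LiteralPath $src)"
Write-Host "Destination exists: $(Test-Path -LiteralPath $dst)"

try {
    Copy-Item -LiteralPath $src -Destination $dst -Force -Verbose -ErrorAction Stop
} catch {
    Write-Host "Copy failed: $_"
}
Write-Host "Script executed"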

How do I copy multiple files from multiple hosts in PowerShell?

I am trying to make a PowerShell script (5.1) that will copy several files and folders from several hosts. These hosts change frequently, so ideally I could use a list that I can append to when required.
I have this all working using xcopy, so I know the locations exist. I want to ensure that if a change is made while I am not in work, someone can just add or remove a host in the text file and the backup will continue to work.
The code I have is supposed to go through each host in my list of hosts and copy all the files from the list of file paths before moving on to the next host.
But two errors are showing up:
The term '\REMOTEHOST\c$\Users\Public\desktop\back-up\$Computers' is not recognized as the name of a cmdlet, function, script
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:8 char:17
and:
copy-item : Cannot find path '\HOST\C$\LK\Appdata\Cmmcfg C$\LKAppData\Errc C$\LK\Appdata\TCOMP C$\LK\Probes C$\LK\Appdata\CAMIO C$\LK\Appdata\LaunchPad C$\LK\Appdata\Wincmes
C$\barlen.dta C$\Caliprogs C$\Cali' because it does not exist.
This does not seem to be reading through the list as I intended; I have also noticed that the host it is reading from is sixth in the list, not first.
# This file contains the list of hosts you want to copy files from
$computers = Get-Content 'Y:\***FILEPATH***\HOSTFILE.txt'
# This is the file/folder(s) you want to copy from the hosts in the $computers variable
$source = Get-Content 'Y:\***FILEPATH***\FilePaths.txt'
# The destination location you want the file/folder(s) to be copied to
$destination = \\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers
foreach ($item in $computers) {
}
foreach ($item in $source) {
}
copy-item \\$computer\$source -Destination $destination -Verbose
Your destination variable needs to be enclosed in quotes. To have it evaluate other variables inside of it, enclose it in double quotes. Otherwise PowerShell thinks it's a command you are trying to run.
$destination = "\\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers"
Cracked it, thank you for your help. I was messing up the foreach command! I had both variables set to $item, so I was confusing things!
foreach ($itemhost in $computers) {
    $destination = "\\Remotehost\c$\Users\xoliver.jeffries\desktop\back-up\$itemhost"
    foreach ($item in $source) {
        Copy-Item "\\$itemhost\$item*" -Destination $destination -Verbose -Recurse
    }
}
It's not the neatest output, but that's just a snag! The code now enables me to use a list of hosts and a list of files and copy them to a remote server!
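For reference, a slightly tidier version of the same idea (a sketch, keeping the placeholder paths from the thread; the New-Item call is an addition so the per-host folder always exists before copying):

# Sketch: copy a list of paths from each host in a list into a per-host backup folder.
$computers = Get-Content 'Y:\***FILEPATH***\HOSTFILE.txt'
$source = Get-Content 'Y:\***FILEPATH***\FilePaths.txt'

foreach ($computer in $computers) {
    $destination = "\\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$computer"
    # Create the per-host folder up front so everything nests inside it
    New-Item -Path $destination -ItemType Directory -Force | Out-Null

    foreach ($item in $source) {
        Copy-Item -Path "\\$computer\$item" -Destination $destination -Recurse -Force -Verbose
    }
}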

Copy All Items within a Drive and Export to New Destination

I need to be able to copy all items within a drive and move them to a new destination using PowerShell. At present, I've tried doing this with Copy-Item but am unable to do so with a whole drive. I've looked for solutions elsewhere but have yet to find a working fix. Any suggestions?
Copy-Item -Path 'P:' -Destination 'Destination'
If you're just trying to copy all of the items from one drive to a folder on another drive, you can use Copy-Item, as you are, with a few adjustments (the -Destination path is just an example):
Copy-Item -Path C:\* -Destination F:\destinationPath -Recurse
Using the * in C:\* for the path tells the shell to get all files and directories at that folder, in this case, the C: root. -Recurse tells it to copy all files and directories recursively. Consider adding the -Force parameter if you want to use this in automation, as it will forcibly overwrite any existing files at the destination rather than prompt for input.
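Applied to the drive from the question, that would look something like the following (the destination path is an assumption):

# Sketch: copy everything from the root of P: into a backup folder, overwriting existing files.
Copy-Item -Path 'P:\*' -Destination 'D:\DriveBackup' -Recurse -Force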

Copying a group of folders, first in line gets its contents spilled in the open

I want to copy a group of folders which have various files inside them from one place to a single folder elsewhere. I want to bind said folder group to a variable.
param(
    $folders = ('../folder1', '../folder2')
)
Copy-Item -Path $folders -Destination '../folder3' -Recurse -Force
This works; however, inside folder3, folder1's contents are spilled out, while folder2's contents are placed in a folder of the same name, as intended.
I need them both to be copied intact. If I switch their places, then folder2 gets the same treatment. It's like the script does not treat the first folder in line the same way as the others. Am I missing something?
EDIT:
Managed to find a workaround by running an additional command that creates a folder inside folder3, named the same as the first folder in line, before copying. The script then places the files inside that folder correctly. Still rather messy; I wonder if it's a bug.
Use a loop
foreach ($folder in $folders) {
    Copy-Item -Path $folder -Destination '../folder3' -Recurse -Force
}
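This is the same first-run behavior described in the first question on this page: when '../folder3' does not exist yet, Copy-Item treats the destination as the item to create, so the first source folder gets flattened into it. A sketch that avoids the special case by creating the destination first (folder names as in the question):

# Sketch: make sure the destination exists before copying,
# so every source folder is nested inside it rather than flattened.
New-Item -Path '../folder3' -ItemType Directory -Force | Out-Null

foreach ($folder in $folders) {
    Copy-Item -Path $folder -Destination '../folder3' -Recurse -Force
}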

Excluding a file while using Copy-Item

I'm having a few issues copying desktop files as part of the backup process from a local folder to a server.
My script automatically creates a backup folder on the server, and I just realised that when it creates the backup folder, Windows automatically creates a desktop.ini file in it. Because desktop.ini is a system file, the script then fails to write another desktop.ini from the desktop to the backup folder on the server.
My initial code was :
Copy-Item ("$Global:localBackupRestoreLocation\Desktop\$file") ("$Global:remoteBackupRestorePath\$nameOfbackupDirectory") ($recursive="true")
But I want to use -Exclude on that line, and I know -Exclude does not work recursively, so I have to use Get-ChildItem.
Try something like this
$Exclude = @('desktop.ini')
Copy-Item -Path $Source -Destination $Dest -Exclude $Exclude
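Since -Exclude on Copy-Item only filters the top level, a recursive variant (a sketch, reusing the $Source and $Dest names from the answer above) can do the filtering with Get-ChildItem instead:

# Sketch: copy everything under $Source except desktop.ini, preserving the folder structure.
Get-ChildItem -Path $Source -Recurse -File |
    Where-Object { $_.Name -ne 'desktop.ini' } |
    ForEach-Object {
        # Rebuild the relative path under the destination
        $target = Join-Path $Dest $_.FullName.Substring($Source.Length).TrimStart('\')
        New-Item -Path (Split-Path $target -Parent) -ItemType Directory -Force | Out-Null
        Copy-Item -LiteralPath $_.FullName -Destination $target -Force
    }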