Copy-Item in scriptblock can't find path - powershell

I have implemented a PS script that deploys code to multiple servers at the same time. Here I need to copy some source files from one server to another. See the code below:
for ($i = 1; $i -le 5; $i++) {
    $serverName = "iwflO" + $i
    $sourceFile = "\\iwdflO1\C$\Deploy\bin"
    $destination = "\\$serverName\C$\Program Files (X86)\Shian\MyService\bin\"
    $Myblock = {
        Param{$sourceFile,$destination)
        Copy-Item -Force -Recurse $sourceFile -Destination $destination
    }
    $result = Invoke-Command -ComputerName $serverName -Credential "shian" -ScriptBlock $Myblock -ArgumentList $sourceFile,$destination
    $result
}
cd c:\
It's working fine for iwflO1, which is the root server I'm running the script from, but for the other servers it gives me an error like:
Cannot find path "\\iwdflO1\C$\Deploy\bin" because it does not exist.
But if I log in to iwflO2 or any other server and browse to the path manually, it works fine.

I can see the mistake is in the script block.
Instead of this:
$Myblock={param{$sourceFile,$destination)
copy-Item -Force -Recurse $sourceFile -Destination $destination
}
Do this:
$Myblock={param($sourceFile,$destination)
copy-Item -Force -Recurse $sourceFile -Destination $destination
}
This works fine if I hardcode the server names (tested locally).
Since you are using an admin share, try this directly:
Copy-Item -Path \\serverA\c$\programs\temp\test.txt -Destination \\serverB\c$\programs\temp\test.txt;
Note: you have to specify the file. Otherwise, Get-ChildItem -Recurse through the source folder and copy the items directly into the destination, as sketched below.
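For example, to push the whole bin folder's contents over the admin share, a rough sketch (serverA and serverB are placeholders, as above):
# Copy every file under the source bin folder straight onto the target server's admin share
# Note: files from subfolders land directly in the destination; the folder structure is not preserved
Get-ChildItem -Path '\\serverA\C$\Deploy\bin' -Recurse -File |
    Copy-Item -Destination '\\serverB\C$\Program Files (x86)\Shian\MyService\bin\' -Force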
Hope it helps.

Related

How to schedule a PS script in task scheduler which copies files from remote computer to local machine?

My requirement is to copy certain files from a remote machine to the local machine. I have mapped the remote folder to a local drive letter and use that drive letter in the script, and the script works fine. But when I schedule the task, it throws an error that the drive doesn't exist or isn't mapped.
Please find the script below:
$destination = "D:\Script\"
$source = "T:\"
$datefolders = Get-ChildItem $source
$TodayString = (Get-Date).ToString("yyyy-MM-dd")
foreach($folder in $datefolders)
{
$foldername = $folder.ToString()
if ($foldername -eq $TodayString)
{
$path1 = $source + $foldername + "\*\*.csv"
Copy-Item -Path $path1 -Destination $destination -Force
$allfiles = Get-ChildItem "D:\Script\"
foreach($file in $allfiles)
{
$filename = ([io.fileinfo]$file).basename
$filepath = $destination + $file
$connectorfolder = "D:\Script\" + $filename + "\"
Move-Item -Path $filepath -Destination $connectorfolder -Force
}
break
}
}
Please note that the script works fine when run directly. Please help me with how I can provide the remote server details, which appear as the "T:\" drive in my script.
The error I can see in Event viewer is given below:
Error Message = Could not find the drive 'T:\'. The drive might not be ready or might not be mapped.
Please help.
Regards,
Mitesh Agrawal
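One common direction (a hedged sketch, not from the original thread): drive mappings exist only in the session that created them, so a scheduled task running under a different account or non-interactively won't see T:. The script could reference the UNC path directly, or recreate the mapping itself; \\remoteserver\share below is a hypothetical placeholder for whatever T: was mapped to.
# Hypothetical UNC path that T: was mapped to
$sourceUNC = '\\remoteserver\share\'
# Option 1: use the UNC path directly instead of T:\
$source = $sourceUNC
# Option 2: recreate the mapping for this session before the rest of the script runs
# New-PSDrive -Name T -PSProvider FileSystem -Root '\\remoteserver\share' | Out-Null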

Copying multiple files in parallel through PowerShell without using any third party software?

Problem Statement:
I am trying to copy about 100 files (each more than a GB in size) from a source to a destination directory, and I am automating this with a PowerShell script. While the script executes, the copy operation copies the files in sequence. Is there any way to copy them in parallel to save some time? It takes a long time to copy all the files, and I have a limitation of not using any third-party software.
$DATAFileDir="D:\TEST_FOLDER\DATAFILESFX\*"
$LOGFileDir="D:\TEST_FOLDER\LOGFILESFX\*"
$DestDataDir="D:\TEST_FOLDER\Data\"
$DestLogDir="D:\TEST_FOLDER\Log\"
#Copying the Primary file
Copy-Item -Path $DATAFileDir -Destination $DestDataDir -Recurse -Force -Verbose
#Copying the Audit File
Copy-Item -Path $LOGFileDir -Destination $DestLogDir -Recurse -Force -Verbose
Any suggestions?
You can start an individual job for every file you want to copy.
$Source = Get-ChildItem -Path C:\SourceFolder -Recurse | Select-Object -ExpandProperty FullName
$Destination = 'C:\DestinationFolder'
foreach ($Item in @($Source)) {
    # Start a job for every item in the source list
    Start-Job -ScriptBlock {
        param($Item, $Destination) # parameters for Copy-Item
        # Do the copy
        Copy-Item -Path $Item -Destination $Destination -Recurse -Force
    } -ArgumentList $Item, $Destination # pass the parameters into the job
}
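The snippet above fires off the jobs but never collects them; a small follow-up (my addition, not part of the original answer) waits for them to finish and cleans up:
# Wait for all copy jobs to complete, surface their output, then remove them
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job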
You should be able to achieve this quite easily with a PowerShell workflow. The throttlelimit will throttle how many files are copied in parallel. Remove it to copy all files in parallel (probably not recommended for 100 files).
workflow copyfiles {
    param($files)
    foreach -parallel -throttlelimit 3 ($file in $files) {
        Copy-Item -Path $file -Destination 'C:\destination\' -Force -Verbose
    }
}
$files = Get-ChildItem -Path C:\source -Recurse -File
copyfiles $files.FullName
Or you can use Start-ThreadJob. If you have PS 5, you can get the ThreadJob module from the gallery: https://powershellgallery.com/packages/ThreadJob/2.0.0 Or use ForEach-Object -Parallel in PS 7: https://devblogs.microsoft.com/powershell/powershell-foreach-object-parallel-feature/
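For example, a rough Start-ThreadJob sketch (paths and the throttle value are placeholders):
# Queue one thread job per file, with at most five copies running at a time
$files = Get-ChildItem -Path 'C:\source' -Recurse -File
$jobs = foreach ($file in $files) {
    Start-ThreadJob -ThrottleLimit 5 -ScriptBlock {
        param($src, $dst)
        Copy-Item -Path $src -Destination $dst -Force
    } -ArgumentList $file.FullName, 'C:\destination\'
}
$jobs | Wait-Job | Receive-Job
# PowerShell 7+ equivalent:
# Get-ChildItem -Path 'C:\source' -Recurse -File |
#     ForEach-Object -Parallel { Copy-Item -Path $_.FullName -Destination 'C:\destination\' -Force } -ThrottleLimit 5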
start-bitstransfer? https://learn.microsoft.com/en-us/powershell/module/bitstransfer/start-bitstransfer?view=win10-ps
start-bitstransfer z:\files\*.iso c:
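Applied to the folders from the question, a minimal sketch (note that BITS queues individual files and does not recurse into subfolders):
# One BITS job per folder of files; the wildcard selects the files to queue
Import-Module BitsTransfer
Start-BitsTransfer -Source 'D:\TEST_FOLDER\DATAFILESFX\*' -Destination 'D:\TEST_FOLDER\Data\'
Start-BitsTransfer -Source 'D:\TEST_FOLDER\LOGFILESFX\*' -Destination 'D:\TEST_FOLDER\Log\'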
This PowerShell script uses .NET Framework classes directly and should perform faster, even for lots of files. Use throttlelimit to control how much parallelization you need.
param([String]$argSourceRootDir, [String]$argTargetRootDir)
workflow copyfiles {
    param($sourceRootDir, $targetRootDir)
    $sourcePaths = [System.IO.Directory]::GetFiles($sourceRootDir, "*.*", "AllDirectories")
    foreach -parallel -throttlelimit 8 ($sourcePath in $sourcePaths) {
        $targetPath = $sourcePath.Replace($sourceRootDir, $targetRootDir)
        $targetDir = $targetPath.Substring(0, $targetPath.Length - [System.IO.Path]::GetFileName($targetPath).Length - 1)
        if (-not (Test-Path $targetDir))
        {
            $x = [System.IO.Directory]::CreateDirectory($targetDir)
            $z = [Console]::WriteLine("new directory: $targetDir")
        }
        $z = [Console]::WriteLine("copy file: $sourcePath => $targetPath")
        $x = [System.IO.File]::Copy($sourcePath, $targetPath, $true)
    }
}
copyfiles $argSourceRootDir $argTargetRootDir
Just save this code as ParallelCopy.ps1 and run it like this:
.\ParallelCopy.ps1 "C:\Temp\SourceDir" "C:\Temp\TargetDir"
If all 100 files are being loaded into a single Redshift table, then
Redshift can load multiple files in parallel using a single COPY command.
Check out the Redshift documentation: https://docs.aws.amazon.com/redshift/latest/dg/t_splitting-data-files.html

PowerShell SQL Job Step Move-Item not working on 1 server

This identical code has been used on 3 servers, and only on one of them does it silently fail to move the items (it still REMOVES them, but they do not appear in the share).
Azure-MapShare.ps1
param (
    [string]$DriveLetter,
    [string]$StorageLocation,
    [string]$StorageKey,
    [string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
    cmd.exe /c "net use ${DriveLetter}: ${StorageLocation} /u:${StorageUser} ""${StorageKey}"""
}
Get-Exclusion-Days.ps1
param (
    [datetime]$startDate,
    [int]$daysBack
)
$date = $startDate
$endDate = (Get-Date).AddDays(-$daysBack)
$allDays = do {
    "*" + $date.ToString("yyyyMMdd") + "*"
    $date = $date.AddDays(-1)
} until ($date -lt $endDate)
return $allDays
Migrate-Files.ps1
param(
    [string]$Source,
    [string]$Filter,
    [string]$Destination,
    [switch]$Remove = $False
)
#Test if the source path exists
if ((Test-Path -Path $Source.Trim()) -ne $True) {
    throw 'Source did not exist'
}
#Test if the destination path exists
if ((Test-Path -Path $Destination.Trim()) -ne $True) {
    throw 'Destination did not exist'
}
#Test if there are no files in the source
if ((Get-ChildItem -Path $Source).Length -eq 0) {
    throw 'No files at source'
}
if ($Remove)
{
    #Move-Item removes the source files
    Move-Item -Path $Source -Filter $Filter -Destination $Destination -Force
} else {
    #Copy-Item keeps a local copy
    Copy-Item -Path $Source -Filter $Filter -Destination $Destination -Force
}
return $True
The job step is type "PowerShell" on all 3 servers and contains this identical code:
#Create mapping if missing
D:\Scripts\Azure-MapShare.ps1 -DriveLetter 'M' -StorageKey "[AzureStorageKey]" -StorageLocation "[AzureStorageAccountLocation]\backup" -StorageUser "[AzureStorageUser]"
#Copy files to Archive
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "D:\Databases\BackupArchive"
#Get date range to exclude
$exclusion = D:\Scripts\Get-Exclusion-Days.ps1 -startDate (Get-Date) -DaysBack 7
#Remove items that are not included in exclusion range
Remove-Item -Path "D:\Databases\BackupArchive\*.bak" -exclude $exclusion
#Move files to storage account. They will be destroyed
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "M:\" -Remove
#Remove remote backups that are not from todays backup
Remove-Item -Path "M:\*.bak" -exclude $exclusion
If I run the job step using SQL then the files get removed but do not appear in the storage account. If I run this code block manually, they get moved.
When I start up PowerShell on the server, I get an error message: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed." However, this does not really impact the rest of the operations (copying the backup files to BackupArchive folder, for instance).
I should mention that Copy-Item also fails to copy across to the share, but succeeds in copying to the BackupArchive folder.
Not sure if this will help you, but you could try using the New-PSDrive cmdlet instead of net use to map your shares:
param (
    [string]$DriveLetter,
    [string]$StorageLocation,
    [string]$StorageKey,
    [string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
    $securedKey = $StorageKey | ConvertTo-SecureString -AsPlainText -Force
    $credentials = New-Object System.Management.Automation.PSCredential ($StorageUser, $securedKey)
    New-PSDrive -Name $DriveLetter -PSProvider FileSystem -Root $StorageLocation -Credential $credentials -Persist
}
Apparently I tricked myself on this one. During testing I must have run the net use command in an elevated command prompt. This hid the mapped drive from non-elevated OS features such as Windows Explorer, and from attempts to view it in non-elevated command prompt sessions. I suppose it was also automatically reconnecting after reboots, because rebooting did not fix it.
The solution was as easy as running the net use m: /delete command from an elevated command prompt.
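For reference, checking and clearing the stale mapping comes down to two commands, run from an elevated prompt:
# List current mappings, then remove the one created while elevated
net use
net use m: /delete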

Powershell Remove-Item IF file already exists after Copy-item

I need to add a safety net to my script. I'm doing a copy job based on a list of users provided through a txt file: copy the files from each user's home directory to a new location. Once the files are copied, check if each file exists in the new location. If yes, then Remove-Item.
Can someone help me? I just don't know how to implement the "if file exists" logic.
$username = Get-Content '.\users.txt'
foreach ($un in $username)
{
    $dest = "\\server\homedirs\$un\redirectedfolders"
    $source = "\\server\homedirs\$un"
    New-Item -ItemType Directory -Path $dest\documents, $dest\desktop
    Get-ChildItem $source\documents -Recurse -Exclude '*.msg' | Copy-Item -Destination $dest\documents
    Get-ChildItem $source\desktop -Recurse -Exclude '*.msg' | Copy-Item -Destination $dest\desktop
    Get-ChildItem $source\mydocuments, $source\desktop -Recurse -Exclude '*.msg' | Remove-Item -Recurse
}
The shortest way to delete a file only if it exists is NOT to use Test-Path but:
rm my_file.zip -ea ig
This is short version of
rm my_file.zip -ErrorAction Ignore
which is much more readable and more DRY than
if (Test-Path my_file.zip) { rm my_file.zip }
To answer your question per se, you can do it like this:
Get-ChildItem $source\mydocuments, $source\desktop -Recurse -Exclude '*.msg' | %{
    if (Test-Path ($_.FullName -replace "^$([regex]::escape($source))","$dest")) {
        Remove-Item $_ -Recurse
    }
}
Test-Path returns $true if the file at the given path exists, otherwise $false.
$_.FullName -replace "^$([regex]::escape($source))","$dest" converts the path of each source item you're enumerating to the corresponding destination path, by replacing $source at the beginning of the path with $dest.
The basic regex for the first argument to the -replace operator is ^$source (which means "match the value of $source at the beginning of the string"). However, you need to use [regex]::escape in case $source contains any regex special characters, which is in fact extremely likely with Windows paths, since they contain backslashes. For example, the value you've given here for $source contains \s, which in a regex means "any whitespace character". $([regex]::escape($source)) will interpolate the value of $source with any regex special characters properly escaped, so that you're matching the explicit value.
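To make that concrete, here is what the escaping does for a path shaped like the ones above (jdoe is a made-up user name):
# Illustrative only: escape the UNC prefix so its backslashes are matched literally
$source = '\\server\homedirs\jdoe'
[regex]::Escape($source)
# \\\\server\\homedirs\\jdoe
'\\server\homedirs\jdoe\documents\a.txt' -replace "^$([regex]::Escape($source))", '\\server\homedirs\jdoe\redirectedfolders'
# \\server\homedirs\jdoe\redirectedfolders\documents\a.txt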
That said, if your purpose is to copy each item to a new location, and remove the original only if the copy to the new location is successful, it seems like you're reinventing the wheel. Why not just use Move-Item instead of Copy-Item?
Not directly related to the question, but rather than repeating the same command for each subdirectory, you can use a foreach loop:
foreach ($subdir in (echo documents desktop)) {
# Whatever command you end up using to copy or move the items,
# using "$source\$subdir" and "$dest\$subdir" as the paths
}
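Filled in with the Move-Item suggestion, that could look like this (a sketch reusing the question's paths and exclusion):
foreach ($subdir in 'documents','desktop') {
    # Move-Item leaves nothing behind, so the separate Remove-Item pass is unnecessary
    Get-ChildItem "$source\$subdir" -Recurse -Exclude '*.msg' |
        Move-Item -Destination "$dest\$subdir" -Force
}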
The Test-Path cmdlet will help you check if the file exists:
http://technet.microsoft.com/en-us/library/ee177015.aspx
@Adi Inbar, I need to use an approach like this because I need to move files to a remote session, and Move-Item does not work when I try -ToSession... only Copy-Item does.
The catch is that if the power or internet goes down, the script will delete the file even if it wasn't copied.
$username = "name"
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential -ArgumentList ($username, $password)
$Session = New-PSSession -ComputerName "IPAddress" -Credential $credential
Copy-Item -Path C:\userPC_1\csv-output\*.csv -Destination C:\userPC_2\Documents\Test_Scripts -ToSession $Session -Verbose
Get-PSSession | Remove-PSSession
Get-ChildItem -Path C:\userPC_1\csv-output\*.csv | Remove-Item -Force
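One way to close that gap (my sketch, not from the original post) is to confirm each file arrived on the remote side, through the still-open session, before deleting it locally; this would go between the Copy-Item and Remove-PSSession lines above:
# Verify each file at the destination through the session before removing it locally
Get-ChildItem -Path C:\userPC_1\csv-output\*.csv | ForEach-Object {
    $remotePath = Join-Path 'C:\userPC_2\Documents\Test_Scripts' $_.Name
    if (Invoke-Command -Session $Session -ScriptBlock { param($p) Test-Path $p } -ArgumentList $remotePath) {
        Remove-Item -Path $_.FullName -Force
    }
}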

PowerShell: Copy-Item Cannot find path

I'm trying to get PowerShell to copy files from a remote computer (on which I have admin rights through AD) to the local computer.
It fails in the strangest place. Here's a snippet of the script:
$configs = Get-ChildItem -Recurse -ErrorAction SilentlyContinue -Filter "*.config" $serverUNCPath
foreach ($config in $configs) {
    $config_target_dir = $dest.Path + $config.Directory.FullName.Replace($serverUNCPath, "")
    if (Test-Path -Path $config_target_dir) {
        Copy-Item $config -Destination $config_target_dir
    }
}
It fails with the message
Cannot find path 'D:\ServerDeploy\TestMachine1\website\web.config' because it does not exist.
At :line:39 char:12
+ Copy-Item <<<< $config -Destination $config_target_dir
The path D:\ServerDeploy\TestMachine1\website exists. I'm going mad over this.
What can I do to fix it?
Eeeeh.... OK?
If I replaced the line
Copy-Item $config -Destination $config_target_dir
with
Copy-Item $config.FullName $config_target_dir
it suddenly magically worked....
What gives?
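A likely explanation (my reading, not confirmed in the thread): when the FileInfo object in $config is bound to -Path, Windows PowerShell stringifies it to just the file name, which is then resolved against the current directory (here D:\ServerDeploy\...) rather than the remote folder. Passing the full path, or piping the object so its provider path binds by property name, avoids that:
# Pass the full provider path explicitly...
Copy-Item -Path $config.FullName -Destination $config_target_dir
# ...or pipe the object so the destination cmdlet binds its path from the PSPath property
$config | Copy-Item -Destination $config_target_dir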