I have the following problem: I am writing a loop that checks whether any files have appeared in a folder and, if so, moves them to another folder.
The script works nicely now; here is its code:
$BasePath = "C:\From"
$TargetPath = "C:\To"

$files = Get-ChildItem -File -Recurse -Path "$($BasePath)\$($Filename)" -ErrorAction SilentlyContinue
foreach ($file in $files)
{
    $subdirectorypath = Split-Path $file.FullName.Replace($BasePath, "").Trim("\")
    $targetdirectorypath = "$($TargetPath)\$($subdirectorypath)"
    if ((Test-Path $targetdirectorypath) -eq $false)
    {
        Write-Host "Creating directory: $targetdirectorypath"
        md $targetdirectorypath -Force
    }
    Write-Host "Moving file to: $($targetdirectorypath.TrimEnd('\'))\$($file.Name)"
    Move-Item $file.FullName "$($targetdirectorypath.TrimEnd('\'))\$($file.Name)" -Force
}
However, as some of those files can be quite big, I would like to move them asynchronously, in a "fire-and-forget" way. What is the best way to do that with PowerShell? This script will probably be running forever, so any asynchronous jobs would have to dispose of themselves after they are done copying, I think.
Thanks for any suggestions.
I would use a background job:
$scriptblock = {
    $BasePath = $args[0]
    $TargetPath = $args[1]

    $files = Get-ChildItem -File -Recurse -Path "$($BasePath)\$($Filename)" -ErrorAction SilentlyContinue
    foreach ($file in $files)
    {
        $subdirectorypath = Split-Path $file.FullName.Replace($BasePath, "").Trim("\")
        $targetdirectorypath = "$($TargetPath)\$($subdirectorypath)"
        if ((Test-Path $targetdirectorypath) -eq $false)
        {
            Write-Host "Creating directory: $targetdirectorypath"
            md $targetdirectorypath -Force
        }
        Write-Host "Moving file to: $($targetdirectorypath.TrimEnd('\'))\$($file.Name)"
        Move-Item $file.FullName "$($targetdirectorypath.TrimEnd('\'))\$($file.Name)" -Force
    }
}
$arguments = @("C:\From","C:\To")
Start-Job -ScriptBlock $scriptblock -ArgumentList $arguments
If you later want to see any output from the job, you can run the following:
Get-Job | Receive-Job
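Since your outer script will run forever, you will probably also want to clean up finished jobs so they don't pile up; something like this on each pass of your loop should do (a sketch, assuming you no longer need the job output once you've looked at it):
Get-Job -State Completed | ForEach-Object {
    Receive-Job -Job $_ | Out-Null   # drain any remaining output
    Remove-Job -Job $_               # then dispose of the job
}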
Related
I am looking to move files from various paths to E:\backups\folder1\export, E:\Backups\folder2\backups, and E:\backups\folder3\export using the code below, but strangely it runs twice on the same path instead of moving on to the next path, and it duplicates the output too.
Cls
$date = Get-Date -Format "yyyyMMdd_hhmmss"
#$sourcePath = "C:\Program Files\Atlassian\Application Data\JIRA\export\"
#$destPath = "E:\Backup\Jira\export"
$config = Import-Csv -Path 'E:\Backup\Scripts\Atlassian_Backups.csv'
#Write-Output $config
Start-Transcript -Path E:\Backup\Logs\Atlassian_backupMove.log
foreach ($item in $config)
{
    Write-Host "Moving all files in '$($sourcePath)' to '$($destPath)'"
    $fileList = @(Get-ChildItem -Path "$($sourcePath)" -File)
    #Write-Output $fileList
    if ($fileList.Count -gt 0)
    {
        Write-Host $fileList.Count
        foreach ($file in $fileList)
        {
            try {
                #Move-Item -Path $file.FullName -Destination ((Split-Path $file.FullName).Replace("$($sourcePath)",$destPath)) -Force -ErrorAction Stop
                Copy-Item -Path $file.FullName -Destination $destPath -Verbose -Force -ErrorAction Stop |
                    Format-Table
            }
            catch {
                Write-Warning "Unable to move '$($file.FullName)' to '$(((Split-Path $file.FullName).Replace("$($sourcePath)",$destPath)))': $($_)"
                return
            }
        }
    }
}
Stop-Transcript
Rename-Item E:\Backup\Logs\Atlassian_backupMove.log E:\Backup\Logs\Atlassian_backupMove_$date.log
Write-Host "Log file has been created and renamed to Atlassian_backupMove_$date.log"
The section below is what I changed in my code, and it worked well.
try {
    #Move-Item -Path $file.FullName -Destination ((Split-Path $file.FullName).Replace("$($sourcePath)",$destPath)) -Force -ErrorAction Stop
    Copy-Item -Path $file.FullName -Destination $item.destPath -Verbose -Force -ErrorAction Stop |
        Format-Table
}
catch {
    Write-Warning "Unable to move '$($file.FullName)' to '$(((Split-Path $file.FullName).Replace("$($item.sourcePath)",$item.destPath)))': $($_)"
    return
}
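For completeness, the full loop now looks roughly like this (this assumes the CSV has sourcePath and destPath columns, as the $item.sourcePath / $item.destPath references imply):
foreach ($item in $config)
{
    Write-Host "Moving all files in '$($item.sourcePath)' to '$($item.destPath)'"
    $fileList = @(Get-ChildItem -Path $item.sourcePath -File)
    foreach ($file in $fileList)
    {
        try {
            Copy-Item -Path $file.FullName -Destination $item.destPath -Verbose -Force -ErrorAction Stop
        }
        catch {
            Write-Warning "Unable to copy '$($file.FullName)' to '$($item.destPath)': $($_)"
        }
    }
}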
I am trying to get files from the servers in a list using the code below:
$server = Get-Content server.txt
$server | ForEach-Object {
    $session = New-PSSession -ComputerName $server -Credential (Import-Clixml "mycredentials.xml")
    Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
    Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
}
If I explicitly supply a name to -ComputerName, it works like a charm.
When there are several names in the list, the execution stops after the first one. I suspect that the session closes after the first execution.
Is there a way to make it work like this:
Get-Content -> for each line, execute the Copy-Item -> close the session -> open a new session to the next server -> ... etc., meaning that $session only ever refers to the current server.
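What I have in mind is something along these lines (just a sketch of the flow I'm after; I'm guessing the current name inside ForEach-Object should be $_ rather than the whole $server list):
$cred = Import-Clixml "mycredentials.xml"
Get-Content server.txt | ForEach-Object {
    $session = New-PSSession -ComputerName $_ -Credential $cred
    Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
    Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
    Remove-PSSession $session   # close this server's session before moving on
}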
$function:getfiles
function getfiles {
    New-Item -Force -Path C:\path\trace.txt
    $remoteserver = $env:computername
    $trace = 'C:\path\trace.txt'
    $Include = @('*.keystore', '*.cer', '*.crt', '*.pfx', '*.jks', '*.ks')
    $exclude = '^C:\\(Windows|Program Files|Documents and Settings|Users|ProgramData)|\bBackup\b|\breleases?\b|\bRECYCLE.BIN\b|\bPerfLogs\b|\bold\b|\bBackups\b|\brelease?\b|'
    Get-ChildItem -Path 'C:\','D:\' -File -Include $Include -Recurse -EA 0 |
        Where-Object { $_.DirectoryName -notmatch $exclude } |
        Select-Object -ExpandProperty FullName |
        Set-Content -Path $trace
    $des = "C:\some\folder\$remoteserver"
    $safe = Get-Content $trace
    $safe | ForEach-Object {
        # find the drive delimiter
        $first = $_.IndexOf(":\")
        if ($first -eq 1) {
            # strip it
            $newdes = Join-Path -Path $des -ChildPath @($_.Substring(0,1) + $_.Substring(2))[0]
        }
        else {
            $newdes = Join-Path -Path $des -ChildPath $_
        }
        $folder = Split-Path -Path $newdes -Parent
        $err = 0
        # check if the folder exists
        $void = Get-Item $folder -ErrorVariable err -ErrorAction SilentlyContinue
        if ($err.Count -ne 0) {
            # create it when it doesn't
            $void = New-Item -Path $folder -ItemType Directory -Force -Verbose
        }
        $void = Copy-Item -Path $_ -Destination $newdes -Recurse -Container -Verbose
    }
}
UPDATE
So I have found out that the file the lines should be redirected to from the script is not populated, which explains why the next step with Copy-Item fails. I have tried redirecting in different ways and still can't get it populated. The file itself is created without issues.
I made a workaround: I placed the function in a script that is copied to the remote server, executed there, and cleaned up afterwards.
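Roughly, that workaround looks like this (the getfiles.ps1 name and the C:\path\ location are just placeholders for illustration):
$cred = Import-Clixml "mycredentials.xml"
Get-Content server.txt | ForEach-Object {
    $session = New-PSSession -ComputerName $_ -Credential $cred
    # push the script, run it remotely, pull the results, then clean up
    Copy-Item -Path .\getfiles.ps1 -Destination 'C:\path\getfiles.ps1' -ToSession $session
    Invoke-Command -Session $session -ScriptBlock { & 'C:\path\getfiles.ps1' }
    Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
    Invoke-Command -Session $session -ScriptBlock { Remove-Item 'C:\path\getfiles.ps1' -Force }
    Remove-PSSession $session
}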
I made a logon PowerShell script that checks files and, if they are older than the source, copies the newer one to the PC. I am trying to have the result logged, but my log file is always empty. Where did I go wrong?
# Set Source
$S_P = "\\NETWORK\S_P.exe"
$S_T = "\\NETWORK\S_T.exe"
$S_P_Date = (Get-Item $S_P -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$S_T_Date = (Get-Item $S_T -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
# Set Destination
$D_P = "C:\TEMP1\S_P.exe"
$D_T = "C:\TEMP2\S_T.exe"
$DF_P = "C:\TEMP1"
$DF_T = "C:\TEMP2"
$D_P_Date = (Get-Item $D_P -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$D_T_Date = (Get-Item $D_T -ErrorAction SilentlyContinue).LastWriteTime.ToString("yyyy-MM-dd")
$D_Log = "C:\TEMP\updated.txt"
# Compare date and Copy
function Check_Copy {
    if (!(Test-Path $D_Log)) {New-Item $D_Log}
    if ((Test-Path $D_P) -and (Test-Path $D_T)) {
        if ($D_P_Date -le $S_P_Date) {Copy-Item $S_P $DF_P -Force}
        if ($D_T_Date -le $S_T_Date) {Copy-Item $S_T $DF_T -Force}
    } else {
        Copy-Item $S_P $DF_P -Force
        Copy-Item $S_T $DF_T -Force
    }
}
Check_Copy | Out-File $D_Log -Append
You need to use the -PassThru parameter for the output to be piped out to your file; without it, Copy-Item emits nothing to the pipeline.
Simply add it to your Copy-Item lines so that it looks like this:
Copy-Item $S_P $DF_P -Force -PassThru
Copy-Item $S_T $DF_T -Force -PassThru
Read this TechNet blog to learn more about using object pass through in PowerShell.
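Putting it together, the function would look roughly like this (New-Item's own output is piped to Out-Null here so that only the copy results end up in the log):
function Check_Copy {
    if (!(Test-Path $D_Log)) {New-Item $D_Log | Out-Null}
    if ((Test-Path $D_P) -and (Test-Path $D_T)) {
        if ($D_P_Date -le $S_P_Date) {Copy-Item $S_P $DF_P -Force -PassThru}
        if ($D_T_Date -le $S_T_Date) {Copy-Item $S_T $DF_T -Force -PassThru}
    } else {
        Copy-Item $S_P $DF_P -Force -PassThru
        Copy-Item $S_T $DF_T -Force -PassThru
    }
}
Check_Copy | Out-File $D_Log -Append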
I'm currently writing a simple PowerShell script.
Basically, it should get a list of servers from a text file, then unzip the .zip file on each server and extract it to a new folder.
However, the script is not extracting all of the files in the zip archive.
It only extracts one file, and I'm not sure why the foreach loop isn't working properly.
Please shed some light on this issue. Thanks.
$servers = Get-Content "C:\tmp\script\new_unzip\servers.txt"
$Date = ((Get-Date).ToString('dd-MM-yyyy_HH-mm-ss'))

foreach ($server in $servers) {
    $shell = New-Object -com shell.application
    $target_path = "\\$server\c$\Temp\FFPLUS_Temp"
    $location = $shell.namespace($target_path)
    $ZipFiles = Get-ChildItem -Path $target_path -Filter *.zip
    $ZipFiles | Unblock-File
    foreach ($ZipFile in $ZipFiles) {
        $ZipFile.FullName | Out-Default
        $NewLocation = "\\$server\c$\Temp\FFPLUS_Temp\$Date"
        New-Item $NewLocation -Type Directory -Force -ErrorAction SilentlyContinue
        Move-Item $ZipFile.FullName $NewLocation -Force -ErrorAction SilentlyContinue
        $NewZipFile = Get-ChildItem $NewLocation *.zip
        $NewLocation = $shell.namespace($NewLocation)
        $ZipFolder = $shell.namespace($NewZipFile.FullName)
        $NewLocation.copyhere($ZipFolder.items())
    }
}
$servers = Get-Content "C:\tmp\script\updated\servers.txt"
$Date = ((Get-Date).ToString('dd-MM-yyyy_HH-mm-ss'))

foreach ($server in $servers)
{
    $zipFolder = "\\$server\c$\Temp\FFPLUS_Temp"
    Add-Type -Assembly System.IO.Compression.Filesystem
    $zipFiles = Get-ChildItem -Path $zipFolder -Filter *.zip
    foreach ($zip in $zipFiles)
    {
        $destPath = "\\$server\c$\Temp\FFPLUS_Temp\$Date"
        New-Item -ItemType Directory $destPath
        [IO.Compression.ZipFile]::ExtractToDirectory([string]$zip.FullName, "$destPath")
        Move-Item $zip.FullName $destPath -Force -ErrorAction SilentlyContinue
    }
}
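If the machine running the script has PowerShell 5.0 or later, the built-in Expand-Archive cmdlet should also work (a sketch, assuming the same folder layout as above):
foreach ($server in $servers)
{
    $zipFolder = "\\$server\c$\Temp\FFPLUS_Temp"
    $destPath  = "\\$server\c$\Temp\FFPLUS_Temp\$Date"
    New-Item -ItemType Directory -Path $destPath -Force | Out-Null
    foreach ($zip in (Get-ChildItem -Path $zipFolder -Filter *.zip))
    {
        Expand-Archive -Path $zip.FullName -DestinationPath $destPath -Force
        Move-Item $zip.FullName $destPath -Force -ErrorAction SilentlyContinue
    }
}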
I'm attempting to write a Microsoft PowerShell script that copies files from a single source to multiple destinations in parallel, based on a config file. The config file is a CSV which looks like this:
Server,Type
server1,Production
server2,Staging
My script is called with one argument (.\myscript.ps1 buildnumber), but it doesn't seem to actually do any deleting or copying of files.
I'm sure my Copy-Item and Remove-Item code works, as I have tested it independently, but I think it's either an issue with how I am using script blocks or perhaps with how I am using Start-Job.
Could anyone help me understand why this isn't working?
Thanks
Brad
<#
File Deployment Script
#>
#Requires -Version 2

param($build)

$sourcepath = "\\server\software\$build\*"
$Config = Import-Csv -Path C:\config\serverlist.txt

$scriptblock1 = {
    $server = $args[0]
    $destpath1 = "\\$server\share\Software Wizard\"
    $destpath2 = "\\$server\share\Software Wizard V4.9XQA\"
    Remove-Item "$destpath1\*" -Recurse -Force
    Remove-Item "$destpath2\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath1 -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath2 -Recurse -Force
}

$scriptblock2 = {
    $server = $args[0]
    $destpath = "\\$server\share\Software Wizard\"
    #Remove-Item "$destpath\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}

foreach ($line in $Config) {
    $server = $line.Server
    $type = $line.Type
    if ($type -match "Staging") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server
    }
    if ($type -match "Production") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server
    }
}
Your script block doesn't have access to variables declared outside of it when it's called from start-job. So $scriptblock1 and $scriptblock2 can't see $sourcepath.
To elaborate on Jamey's answer, you can see that the $sourcepath variable declared in the caller scope is not available within the job by comparing the output of the two calls below:
$sourcepath = 'source path'
$scriptblock = { Write-Host "sourcepath = $sourcepath; args = $args" }
& $scriptblock 'server name'
Start-Job $scriptblock -ArgumentList 'server name' | Wait-Job | Receive-Job
To fix this, simply pass the outer variable as part of the argument list:
$scriptblock2 = {
    param($sourcepath, $server)
    $destpath = ...
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}
...
Start-Job -Scriptblock $scriptblock2 -ArgumentList $sourcepath,$server
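As an aside, if you can require PowerShell 3.0 or later instead of 2.0, the $using: scope modifier should be another way to reach caller variables from inside the job:
$scriptblock2 = {
    param($server)
    $destpath = "\\$server\share\Software Wizard\"
    # $using:sourcepath reads the variable from the calling session
    Copy-Item $using:sourcepath -Destination $destpath -Recurse -Force
}
Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server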