I have a PowerShell script which zips up the previous month's logs and names the zip file FILENAME-YYYY-MM.zip.
This works.
What I now want to do is copy these zip files to a network share while keeping some of the folder structure. I currently have a folder structure similar to the following:
C:\Folder1\
C:\Folder1\Folder2\
C:\Folder1\Folder3\
C:\Folder1\Folder4\Folder5\
There are .zip files in every folder below c:\Folder1
What I want is for the script to copy files from C:\Folder1 to \\networkshare while keeping the folder structure, so on the share I should end up with three folders and another subfolder inside Folder4.
Currently I can only get it to copy the whole structure, so I end up with c:\folder1\... in my \\networkshare.
I keep running into issues, such as the new folder structure not existing on the destination, not being able to use the -Recurse switch within the Get-ChildItem command, etc.
The script I have so far is:
# This returns the month number; set the value after AddMonths to choose the archive cut-off (-1 = last month)
$LastWriteMonth = (Get-Date).AddMonths(-3).ToString('MM')
# Set destination for the zip files
$DestinationLoc = "\\networkshare\LogArchive\$env:computername"
# Source files
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | Where-Object { $_.LastWriteTime.Month -le $LastWriteMonth }
Copy-Item $SourceFiles -Destination $DestinationLoc\ZipFiles\
Remove-Item $SourceFiles
Sometimes, you just can't (easily) use a "pure PowerShell" solution. This is one of those times, and that's OK.
Robocopy will mirror directory structures, including any empty directories, and select your files (likely faster than filtering with Get-ChildItem). You can copy anything older than 90 days (about 3 months) like this:
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" *.zip /E /IS /MINAGE:90
You can specify an actual date with /MINAGE too, if you have to be that precise.
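For example, to include only files last written before a specific date (robocopy treats a /MINAGE value of 1900 or greater as a YYYYMMDD date rather than a number of days; the date below is just a placeholder):
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" *.zip /E /IS /MINAGE:20140101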
How about Copy-Item "C:\SourceFiles\" -Destination $DestinationLoc\ZipFiles -Container -Recurse? I have tested this and found that it copies the folder structure intact. If you only need *.zip files, first get them, then for each one call Resolve-Path with the -Relative flag set and append the resulting path to the destination:
$oldLoc = Get-Location
Set-Location "C:\SourceFiles\" # required so the -Relative paths are relative to the source root
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | Where-Object { $_.LastWriteTime.Month -le $LastWriteMonth }
$SourceFiles | ForEach-Object {
    $p = (Resolve-Path $_.FullName -Relative) -replace '^\.\\', ''   # e.g. Folder2\FILENAME-YYYY-MM.zip
    $target = "$DestinationLoc\ZipFiles\$p"
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null   # create the destination subfolder if it doesn't exist yet
    Copy-Item $_.FullName -Destination $target
}
Set-Location $oldLoc # return to the previous location
The folder named "z:\original" has hundreds of sub-folders containing multiple copies of the same .jpg files. I wanted to copy all .jpg files into a folder named "z:\dump" WITHOUT the folder structure and hopefully overwrite most of the copies. I used
copy-Item -path "Z:\original" -filter "*.jpg" -Destination "Z:\dump" -recurse -verbose
but this recreated the structure with the .jpg files. How can I dump all files into a single folder, using PowerShell?
Use a combination of Get-ChildItem (dir) and Copy-Item in a pipeline:
$_.FullName - the full path to the .jpg file
$_.Name - the file name only
Get-ChildItem Z:\original\*.jpg -Recurse | %{Copy-Item $_.FullName "Z:\dump\$($_.Name)"}
You might need to add -Force to overwrite files that already exist in the destination.
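For example (the same command as above, with -Force added so existing copies in Z:\dump are overwritten):
Get-ChildItem Z:\original\*.jpg -Recurse | %{Copy-Item $_.FullName "Z:\dump\$($_.Name)" -Force}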
I am trying to copy all the files in a directory that contains many subfolders into a single separate folder. When the code is run again, rather than replacing each file in the destination folder, it should skip files that have the same timestamp and only replace those that are older.
I have used robocopy to skip copying files that are already current or older in the destination folder. However, robocopy only copies the entire directory along with its folder structure, so I am unable to obtain a single folder containing all of the files from the source.
I have also used Get-ChildItem followed by Copy-Item. Although this gets rid of the folder structure, it overwrites every file on each run and is thus time-consuming.
So what I want is to combine the capabilities of robocopy and Copy-Item. Note that there is no specific pattern to the files I am copying; I simply want to COPY each file in the subdirectories that is EITHER NEWER than the destination copy or NOT yet present, into a single folder.
#For copying and ease of updating destination folder
robocopy /purge /np /S /xo 'source' 'destination'
#To copy items into the destination folder without keeping folder structure
Get-ChildItem -Path 'source' -Recurse -File | Copy-Item -Destination 'destination'
I was unable to combine the two, so I am stuck with the Copy-Item code, which is quite time-consuming when copying/updating large numbers of files.
The purpose of robocopy is to preserve the folder structure. If you want to mangle subfolders robocopy is not the right tool. Use the Get-ChildItem approach, group the results by file name, sort each group by date, pick the most recent file from each group, and copy it if the corresponding destination file either doesn't exist or is older.
Something like this should do what you want:
Get-ChildItem -Path 'C:\source' -Recurse -File |
    Group-Object Name |
    ForEach-Object {
        # Most recently written file among all same-named copies
        $src = $_.Group | Sort-Object LastWriteTime | Select-Object -Last 1
        $dst = Join-Path 'C:\destination' $src.Name
        # Copy only if the destination file is missing or older than the source
        if (-not (Test-Path $dst) -or ($src.LastWriteTime -gt (Get-Item $dst).LastWriteTime)) {
            $src | Copy-Item -Destination $dst
        }
    }
My company has individual folders on a share for each project they are working on, and if no file inside one of those folders or its subfolders has been touched in the last six months, I want to move the folder to an archive location. If any one file within the folder or any of its subfolders has been modified in the last six months, I want to skip the entire parent directory. I'm most of the way there now, but my current iteration only skips the individual files, and I'm not sure how to specify skipping the entire parent. Here is my current script:
$Date = (Get-Date).AddMonths(-6)
$Source = 'C:\Scripts\Source'
$Dest = 'C:\Scripts\Test Target'
Get-ChildItem $Source -File -Recurse | Where-Object { $_.LastWriteTime -lt $Date } | ForEach-Object {
    $actualSource = Split-Path $_.FullName
    $actualDest = Split-Path $_.FullName.Replace($Source, $Dest)
    robocopy $actualSource $actualDest $_.Name /SEC
}
When using my test directories, I have a folder C:\Scripts\Source\Drivers. The script copies that Drivers folder like I want it to, but if I put a newer file anywhere within that Drivers folder, I want the entire folder to be skipped. Currently, the folder and anything older than six months within the folder are still being copied, and it is just skipping the individual files which are newer.
Please let me know if any more information is needed.
Simply pull your copy and recurse statement back one level. First iterate through all the parent folders; then, for each parent folder, recurse and check whether any files have been modified within the last six months. If none have, copy the folder:
$Date = (Get-Date).AddMonths(-6)
$Source = 'C:\Scripts\Source'
$Dest = 'C:\Scripts\Test Target'
$ParentFolders = Get-ChildItem $Source -Directory
Foreach($Folder in $ParentFolders){
    # Files modified within the last six months anywhere under this project folder
    $NewFiles = @(Get-ChildItem $Folder.FullName -File -Recurse | Where-Object { $_.LastWriteTime -ge $Date })
    if($NewFiles.Count -eq 0)
    {
        # Archive the whole folder, including subfolders, under a matching name
        robocopy $Folder.FullName (Join-Path $Dest $Folder.Name) /E /SEC
    }
}
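Since the goal is to move the stale project folders rather than leave a copy behind, you could also let robocopy clean up the source with its /MOVE switch, which deletes files and directories from the source after copying (assuming that is the behaviour you want):
robocopy $Folder.FullName (Join-Path $Dest $Folder.Name) /E /SEC /MOVE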
Hi, I have a folder called "A", and folder "A" has files and subfolders within it. I also have another directory called "Exclusion" containing some files and folders copied from "A". I'm looking for a PowerShell script or command-line option that will COPY & MOVE all the objects from A that are NOT found in the Exclusion directory to a new directory called "Output".
Thanks,
-B
Use Get-ChildItem to get a list of files in your exclusion directory, then take only the names of the files and hold those in an array.
Optionally use New-Item with the -Force parameter to ensure that your output directory exists before sending files there.
Next use Get-ChildItem to iterate through all files in your source (A) directory, use Where-Object and the -notin operator to exclude any files whose names match those gathered from the exclusion directory, then use Move-Item to move the files to your destination (Output) directory.
[string[]]$filenamesToExclude = Get-ChildItem -Path 'c:\somewhere\exclusion' -Recurse -File | Select-Object -ExpandProperty Name
New-Item -Path 'c:\somewhere\output\' -ItemType 'Directory' -Force | Out-Null # ensure the target directory exists; don't output this command's return value to the pipeline
Get-ChildItem -Path 'c:\somewhere\A' -Recurse -File | Where-Object {$_.Name -notin $filenamesToExclude} | Move-Item -Destination 'c:\somewhere\output\'
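If the Exclusion folder mirrors A's folder structure and you need to match on relative path rather than just file name (an assumption about your layout), a sketch along these lines compares paths relative to each root instead:
$exclRoot = 'c:\somewhere\exclusion'
$srcRoot = 'c:\somewhere\A'
# Paths of everything in the exclusion tree, relative to its root
$excludedPaths = Get-ChildItem -Path $exclRoot -Recurse -File |
    ForEach-Object { $_.FullName.Substring($exclRoot.Length).TrimStart('\') }
# Move any file from A whose relative path is not present in the exclusion tree
Get-ChildItem -Path $srcRoot -Recurse -File |
    Where-Object { $_.FullName.Substring($srcRoot.Length).TrimStart('\') -notin $excludedPaths } |
    Move-Item -Destination 'c:\somewhere\output\'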
I'm trying to copy all of the files with the .py extension from one directory to a new one using PowerShell, but I don't want to recreate the directory structure. I want this to work recursively, as the .py files are in numerous subfolders. In my destination directory, I want nothing but the .py files.
Here's what I have now but it copies the directory structure too:
Get-ChildItem "C:\Johns Stuff\Python\" |
Copy -Destination C:\Users\dread\python -Recurse -filter *.py
How about:
Get-ChildItem "C:\Johns Stuff\Python" -File -Filter *.py -Recurse | ForEach-Object {
Copy-Item $_.FullName C:\Users\dread\python -WhatIf
}
The -File parameter (introduced in PowerShell 3.0) returns only files, not directories.
Of course, remove -WhatIf to actually execute the command.