My folder structure looks like this:
SourceFolder
├─file1.txt
├─file1.doc
└─Subfolder1
  ├─file2.txt
  ├─file2.doc
  └─SubSubFolder
    ├─file3.txt
    └─doc3.txt
This script copies all *.txt files that live in folders whose names contain "eng" to a destination folder; only the files themselves are copied, not the folder structure.
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
Get-ChildItem $source -Filter "*.txt" -Recurse |
    Where-Object { $_.DirectoryName -match "eng" } |
    ForEach-Object { Copy-Item $_.FullName $dest }
In my situation the folders are actually .rar archives, and I want the script to search inside those .rar files and copy the *.txt files from the eng folder to the destination. Is that possible with PowerShell?
Rar is a proprietary archive format. Your .rar items aren't folders but compressed archives of that format. PowerShell can't transparently handle such archives, and I'm not aware of a third-party provider that would add this functionality.
Basically, no, what you're asking isn't possible.
What you can do is extract files from the archive with tools like 7-zip, e.g.:
7z.exe e your.rar *.txt -r
To extract only files from a nested subfolder inside the archive, prefix the extraction pattern with that path:
7z.exe e your.rar "nested\subfolder\*.txt" -r
I have a bit of a random task I have created for myself. I have a git repo containing a file structure, and within a specific folder there are several subfolders, each of which nests a config folder with the same name. I am trying to create a PowerShell script that'll comb through the "Target Folder" and copy "Folder 1", "Folder 2", and "Folder 3", but only copy the contents of the three "Config Folder"s, maintaining that file structure while copying only what's needed. Ideally, after that process, I'd love to rename these files with part of the parent folder name to help differentiate them. I do have plans for a second part of the script that parses those config files and exports to an Excel document, but I am not sure how much I need that at the moment. The intended output is below; I have played around with a few miscellaneous file-structure commands but have not found much that helps me achieve the result.
File Structure:
Repo
  TARGET FOLDER
    DATA
      FOLDER1
        CONFIGFOLDER
        MISC
      FOLDER2
        CONFIGFOLDER
        MISC
      FOLDER3
        CONFIGFOLDER
        ETC
Hoping to end up with:
Export Folder
  TARGET FOLDER
    FOLDER1
      CONFIGFOLDER
        List of files with "FOLDER1_ogfilename.yaml"
    FOLDER2
      CONFIGFOLDER
        List of files with "FOLDER2_ogfilename.yaml"
    FOLDER3
      CONFIGFOLDER
        List of files with "FOLDER3_ogfilename.yaml"
I have created the following script to attempt this; it copies the file structure, but it creates a folder for each .yaml file within that folder.
$sourceDir = "C:\Users\hhh\appdev\hhh\data\environments"
$targetDir = "C:\Users\hhh\appdev\targetfolder"
Get-ChildItem $sourceDir -Recurse | % {
    $dest = $targetDir + $_.FullName.SubString($sourceDir.Length)
    if (!($dest.Contains('research,qa,production,global')) -and !(Test-Path $dest)) {
        mkdir $dest
    }
    Copy-Item $_.FullName -Destination $dest -Force
}
There are a few issues with your code:
- you need to add the -File switch to Get-ChildItem so it returns files, not the directories inside $sourceDir
- use Join-Path to construct the destination folder path; by simply concatenating the two strings you end up missing a backslash
- take the substring from the file's DirectoryName property instead of its FullName, otherwise $dest will also include the file name (which is what creates a folder for every file)
- Contains('research,qa,production,global') tests for that single literal string; to exclude paths containing any of the four keywords, match them separately, e.g. with -notmatch and a regex alternation
- since you apparently don't want to copy files from folders having certain keywords in their path, the copy command needs to go inside the test, not below it
Try:
$sourceDir = "C:\Users\hhh\appdev\hhh\data\environments"
$targetDir = "C:\Users\hhh\appdev\targetfolder"
Get-ChildItem $sourceDir -File -Recurse | ForEach-Object {
    # use the file's DirectoryName, not its FullName property, otherwise the path would include the file name as well
    $dest = Join-Path -Path $targetDir -ChildPath $_.DirectoryName.Substring($sourceDir.Length)
    # exclude paths containing these keywords
    if ($dest -notmatch 'research|qa|production|global') {
        # create the destination folder if it does not already exist
        $null = New-Item -Path $dest -ItemType Directory -Force
        $_ | Copy-Item -Destination $dest -Force
    }
}
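As for the renaming part of your question (prefixing each copied file with its FOLDERx name), here is a sketch under the assumption that the .yaml files sit directly inside each CONFIGFOLDER, so the folder two levels up supplies the prefix:
Get-ChildItem $sourceDir -File -Recurse -Filter *.yaml | ForEach-Object {
    # assumption: the parent of the file's CONFIGFOLDER is FOLDER1, FOLDER2, ...
    $prefix = $_.Directory.Parent.Name
    $dest = Join-Path -Path $targetDir -ChildPath $_.DirectoryName.Substring($sourceDir.Length)
    if ($dest -notmatch 'research|qa|production|global') {
        $null = New-Item -Path $dest -ItemType Directory -Force
        # Copy-Item renames on copy when the destination includes a file name
        Copy-Item $_.FullName -Destination (Join-Path $dest "$($prefix)_$($_.Name)")
    }
}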
The folder named "z:\original" has hundreds of sub-folders containing multiple copies of the same .jpg files. I wanted to copy all .jpg files into a folder named "z:\dump" WITHOUT the folder structure and hopefully overwrite most of the copies. I used
Copy-Item -Path "Z:\original" -Filter "*.jpg" -Destination "Z:\dump" -Recurse -Verbose
but this recreated the structure with the .jpg files. How can I dump all files into a single folder, using PowerShell?
Use a combination of Get-ChildItem and Copy-Item in a pipeline, where:
$_.FullName is the full path to the jpg file
$_.Name is the file name only
Get-ChildItem Z:\original\*.jpg -Recurse | ForEach-Object { Copy-Item $_.FullName "Z:\dump\$($_.Name)" }
You might need to add -Force to overwrite files that already exist.
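You can also let Copy-Item bind the piped files directly and overwrite in one go:
Get-ChildItem Z:\original\*.jpg -Recurse | Copy-Item -Destination Z:\dump -Force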
I am trying to copy all the files in a directory that contains many subfolders into a single separate folder. When the code is run again, rather than replacing each file in the destination folder, it should skip files that have the same timestamp and only replace those that are older.
I have used robocopy to skip copying files that are the current version or older in the destination folder. However, robocopy only copies the entire directory along with its folder structure, so I am unable to obtain the desired folder with a flat list of all the files from the source.
I have also used Get-ChildItem followed by Copy-Item. Although this gets rid of the folder structure, it overwrites every file on each run and is thus time-consuming.
So what I want is to combine the capabilities of robocopy and Copy-Item. Note that there is no specific pattern to the files I need to copy; the task is simply to COPY each file in the subdirectories that is EITHER NEWER than or NOT yet existing in the single destination folder.
#For copying and ease of updating destination folder
robocopy /purge /np /S /xo 'source' 'destination'
#To copy items into the destination folder without keeping folder structure
Get-ChildItem -Path 'source' -Recurse -File | Copy-Item -Destination 'destination'
I was unable to combine both, so I am stuck with the Copy-Item code, which is quite time-consuming when copying/updating large numbers of files.
The purpose of robocopy is to preserve the folder structure. If you want to flatten that structure, robocopy is not the right tool. Use the Get-ChildItem approach instead: group the results by file name, sort each group by date, pick the most recent file from each group, and copy it if the corresponding destination file either doesn't exist or is older.
Something like this should do what you want:
Get-ChildItem -Path 'C:\source' -Recurse -File |
    Group-Object Name |
    ForEach-Object {
        # pick the most recent file among all files sharing this name
        $src = $_.Group | Sort-Object LastWriteTime | Select-Object -Last 1
        $dst = Join-Path 'C:\destination' $src.Name
        if (-not (Test-Path $dst) -or ($src.LastWriteTime -gt (Get-Item $dst).LastWriteTime)) {
            $src | Copy-Item -Destination $dst
        }
    }
Inside c:\test\ I have:
.\dir1
.\dir2
.\dir3
and
file1.txt
file2.txt
file3.txt
file4.txt
I want to compress only dir1, dir2, file1.txt and file2.txt
I use the following script to compress selected folders
$YourDirToCompress = "C:\test\"
$ZipFileResult = ".\result.zip"
$DirToInclude = @("dir1", "dir2")
Get-ChildItem $YourDirToCompress -Directory |
    Where-Object { $_.Name -in $DirToInclude } |
    Compress-Archive -DestinationPath $ZipFileResult -Update
How would I be able to add my selected files (file1.txt & file2.txt) to the final compressed result.zip?
Ideally I want the compression to happen in one go, meaning I pass the list of selected files and folders and the compression runs once.
Try something like this:
$DirToInclude = @("dir1", "dir2", "file1.txt", "file2.txt")
Get-ChildItem -Path $YourDirToCompress | Where-Object { $_.Name -in $DirToInclude } | Compress-Archive -DestinationPath $ZipFileResult -Update
Compress-Archive is capable of zipping both files and directories, so this code uses Get-ChildItem without the -Directory switch, which returns all files and directories at the root level of $YourDirToCompress; note that the file names have been added to $DirToInclude so they pass the filter.
Then we pass those files/folders to the Compress-Archive cmdlet just as we did before.
This assumes the files are at the root level of $YourDirToCompress.
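Alternatively, since Compress-Archive accepts an array of paths, the whole selection can go into a single call; a minimal sketch with the names hard-coded:
$items = "C:\test\dir1", "C:\test\dir2", "C:\test\file1.txt", "C:\test\file2.txt"
Compress-Archive -Path $items -DestinationPath .\result.zip -Update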
If you compress a folder, all files and subfolders inside it are compressed as well.
So why compress specific files from a folder that you've already compressed?
Are you saying the files only live in \dir3?
If so, you can just use a for loop and an if statement:
match dir1 and dir2 and compress them, then match file1 and file2 in dir3 and compress those.
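A rough sketch of that loop-and-match idea, assuming (per the layout above) the files actually live at the root of C:\test:
foreach ($item in Get-ChildItem C:\test) {
    # compress only the items whose names are in the selection
    if ($item.Name -in @("dir1", "dir2", "file1.txt", "file2.txt")) {
        Compress-Archive -Path $item.FullName -DestinationPath .\result.zip -Update
    }
}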
I have a PS script which Zips up the previous months logs and names the zip file FILENAME-YYYY-MM.zip
This works
What I now want to do is copy these zip files off to a network share while keeping some of the folder structure. I currently have a folder structure similar to the following:
C:\Folder1\
C:\Folder1\Folder2\
C:\Folder1\Folder3\
C:\Folder1\Folder4\Folder5\
There are .zip files in every folder below c:\Folder1
What I want is for the script to copy files from c:\folder1 to \\networkshare while keeping the folder structure, so I should end up with 3 folders and another subfolder inside Folder4.
Currently I can only get it to copy the whole structure, so I end up with c:\folder1\... in my \\networkshare.
I keep running into issues: the new folder structure doesn't exist at the destination, I can't use the -Recurse switch within the Get-ChildItem command, etc.
The script I have so far is;
# Get last month's date; set the value after AddMonths to choose the archive month (-1 = last month)
$LastWriteMonth = (Get-Date).AddMonths(-3).ToString('MM')
# Set destination for the zip files
$DestinationLoc = "\\networkshare\LogArchive\$env:computername"
# Source files
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | Where-Object { $_.LastWriteTime.Month -le $LastWriteMonth }
Copy-Item $SourceFiles -Destination $DestinationLoc\ZipFiles\
Remove-Item $SourceFiles
Sometimes, you just can't (easily) use a "pure PowerShell" solution. This is one of those times, and that's OK.
Robocopy will mirror directory structures, including any empty directories, and select your files (likely faster than filtering with Get-ChildItem). You can copy anything older than 90 days (about 3 months) like this:
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:90 *.zip
You can specify an actual date with /MINAGE too, if you have to be that precise.
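For example, to exclude everything newer than a given date (robocopy reads /MINAGE values of 1900 or more as a YYYYMMDD date; the date below is just a placeholder):
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:20190101 *.zip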
How about Copy-Item "C:\SourceFiles\" -Destination $DestinationLoc\ZipFiles -Container -Recurse? I have tested this and found that it copies the folder structure intact. If you only need the *.zip files, first get them, then for each one call Resolve-Path with the -Relative flag and append the resulting path to the Destination parameter.
$oldloc = Get-Location
Set-Location "C:\SourceFiles\" # required for Resolve-Path -Relative
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | Where-Object { $_.LastWriteTime.Month -le $LastWriteMonth }
$SourceFiles | ForEach-Object {
    $p = Resolve-Path $_.FullName -Relative
    $target = "$DestinationLoc\ZipFiles\$p"
    # make sure the destination subfolder exists before copying into it
    $null = New-Item -Path (Split-Path $target) -ItemType Directory -Force
    Copy-Item $_.FullName -Destination $target
}
Set-Location $oldloc # return to the original location