Send only selected files and folders to Compress-Archive - PowerShell

Inside c:\test\ I have:
.\dir1
.\dir2
.\dir3
and
file1.txt
file2.txt
file3.txt
file4.txt
I want to compress only dir1, dir2, file1.txt and file2.txt
I use the following script to compress selected folders
$YourDirToCompress = "C:\test\"
$ZipFileResult = ".\result.zip"
$DirToInclude = @("dir1", "dir2")
Get-ChildItem $YourDirToCompress -Directory |
    Where-Object { $_.Name -in $DirToInclude } |
    Compress-Archive -DestinationPath $ZipFileResult -Update
How would I be able to add my selected files (file1.txt & file2.txt) to final compressed result.zip?
Ideally I want the compression to happen in one go, meaning I pass the list of selected files and folders and then do the compression.

Try something like this:
Get-ChildItem -Path $YourDirToCompress | Where-Object { $_.Name -in $DirToInclude } | Compress-Archive -DestinationPath $ZipFileResult -Update
Compress-Archive is capable of zipping both files and directories. So in this code we are using Get-ChildItem without the -Directory flag, which will return all files and directories at the root level of $YourDirToCompress.
Then we pass those files/folders to the Compress-Archive cmdlet just as we did before.
This assumes that the files are at the root level of $YourDirToCompress. Note that you also need to add the file names to $DirToInclude, otherwise the Where-Object filter will exclude them.
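For example, a minimal sketch reusing the names from the question (with the list renamed to $ItemsToInclude), so the whole selection compresses in a single pipeline:
# One include list covering both the folders and the files
$YourDirToCompress = "C:\test\"
$ZipFileResult = ".\result.zip"
$ItemsToInclude = @("dir1", "dir2", "file1.txt", "file2.txt")
Get-ChildItem -Path $YourDirToCompress |
    Where-Object { $_.Name -in $ItemsToInclude } |
    Compress-Archive -DestinationPath $ZipFileResult -Update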

If you compress a folder, all files and subfolders inside it are compressed as well.
So why go to the effort of compressing specific files in a folder you've already compressed?
Are you saying the files only live in \dir3?
If so, you just use a foreach loop and an if statement.
Match dir1 and dir2 and compress them, then match file1 and file2 in dir3 and compress those; see the sketch below.
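A minimal sketch of that loop-and-if approach (for simplicity it assumes everything sits directly under C:\test, as the question describes; adjust the path if the files really live in dir3):
$include = @("dir1", "dir2", "file1.txt", "file2.txt")
foreach ($item in Get-ChildItem "C:\test") {
    # Compress only the items whose names match the include list
    if ($item.Name -in $include) {
        Compress-Archive -Path $item.FullName -DestinationPath ".\result.zip" -Update
    }
}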

Related

PowerShell copy-item folder structure is NOT wanted

The folder named "z:\original" has hundreds of sub-folders containing multiple copies of the same .jpg files. I wanted to copy all .jpg files into a folder named "z:\dump" WITHOUT the folder structure and hopefully overwrite most of the copies. I used
Copy-Item -Path "Z:\original" -Filter "*.jpg" -Destination "Z:\dump" -Recurse -Verbose
but this recreated the structure with the .jpg files. How can I dump all files into a single folder, using PowerShell?
Use a combination of Get-ChildItem and Copy-Item in a pipeline, where:
$_.FullName is the full path to the .jpg file
$_.Name is the file name only
Get-ChildItem Z:\original\*.jpg -Recurse | %{Copy-Item $_.FullName "Z:\dump\$($_.Name)"}
You might need to add -Force to overwrite existing files, e.g.:
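Get-ChildItem Z:\original\*.jpg -Recurse | %{Copy-Item $_.FullName "Z:\dump\$($_.Name)" -Force}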

Copying all Files in Subdirectories to a Single Folder with Robocopy Updating Capabilities

I am trying to copy all the files in a directory that contains many subfolders into a single separate folder. When the code is run again, rather than replacing each file in the destination folder, it should skip files that have the same timestamp and only replace those that are older.
I have used robocopy to skip the copying of files that are of the current version/older in the destination folder. However, robocopy only copies the entire directory along with its folder structure so I am unable to obtain the desired folder with a list of all the files from the source.
I have also used Get-ChildItem and then Copy-Item. However, although this gets rid of the folder structure, it overwrites each file on every iteration and is thus time-consuming.
So what I want is to combine the capabilities of robocopy and Copy-Item. Note that there is no specific pattern to the files I need to copy. It is simply to COPY each file in the subdirectories that is EITHER of a NEWER version or NON-existing into a single folder.
#For copying and ease of updating destination folder
robocopy 'source' 'destination' /PURGE /NP /S /XO
#To copy items into the destination folder without keeping folder structure
Get-ChildItem -Path 'source' -Recurse -File | Copy-Item -Destination 'destination'
I was unable to combine both, so I am stuck with the Copy-Item code, which is quite time-consuming when copying/updating large numbers of files.
The purpose of robocopy is to preserve the folder structure. If you want to flatten subfolders, robocopy is not the right tool. Use the Get-ChildItem approach: group the results by file name, sort each group by date, pick the most recent file from each group, and copy it if the corresponding destination file either doesn't exist or is older.
Something like this should do what you want:
Get-ChildItem -Path 'C:\source' -Recurse -File |
    Group-Object Name |
    ForEach-Object {
        # Most recently written file among those sharing this name
        $src = $_.Group | Sort-Object LastWriteTime | Select-Object -Last 1
        $dst = Join-Path 'C:\destination' $src.Name
        # Copy only if the destination is missing or older than the source
        if (-not (Test-Path $dst) -or ($src.LastWriteTime -gt (Get-Item $dst).LastWriteTime)) {
            $src | Copy-Item -Destination $dst
        }
    }

Copy specific files from specific folders (folders in .rar)

My folder structure looks like this:
SourceFolder
├─file1.txt
├─file1.doc
└─Subfolder1
  ├─file2.txt
  ├─file2.doc
  └─SubSubFolder
    ├─file3.txt
    └─doc3.txt
This script copies all *.txt files from folders whose names contain "eng" to a destination folder (only the files inside the folder, not the folder itself).
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
Get-ChildItem $source -Filter "*.txt" -Recurse |
Where-Object { $_.DirectoryName -match "eng" } |
ForEach-Object { Copy-Item $_.fullname $dest }
In my situation the folders are actually in .rar format, and I want the script to search those .rar archives and copy the *.txt files from the eng folder to the destination. Is that possible with PowerShell?
Rar is a proprietary archive format. Your .rar items aren't folders but compressed archives of that format. PowerShell can't transparently handle such archives, and I'm not aware of a third-party provider that would add this functionality.
Basically, no, what you're asking isn't possible.
What you can do is extract files from the archive with tools like 7-zip, e.g.:
7z.exe e your.rar *.txt -r
To extract only files from a nested subfolder inside the archive, prefix the extraction pattern with that path:
7z.exe e your.rar "nested\subfolder\*.txt" -r
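If you want to drive that from PowerShell, here is a rough sketch (the 7z.exe install path, the eng folder name inside the archive, and the destination path are assumptions based on the question; adjust them to your setup):
# Assumed 7-Zip install path; adjust as needed
$7z = "C:\Program Files\7-Zip\7z.exe"
$dest = "C:\Users\username\Desktop\Final"
# e = extract without archive paths, -r = recurse into archive subfolders, -o<dir> = output directory
& $7z e "your.rar" "eng\*.txt" -r "-o$dest"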

PowerShell Copy files and folders

I have a PS script which zips up the previous month's logs and names the zip file FILENAME-YYYY-MM.zip.
This works.
What I now want to do is copy these zip files off to a network share while keeping some of the folder structure. I currently have a folder structure similar to the following:
C:\Folder1\
C:\Folder1\Folder2\
C:\Folder1\Folder3\
C:\Folder1\Folder4\Folder5\
There are .zip files in every folder below c:\Folder1
What I want is for the script to copy files from c:\folder1 to \\networkshare but keeping the folder structure, so I should have 3 folders and another subfolder in folder4.
Currently I can only get it to copy the whole structure so I get c:\folder1\... in my \\networkshare
I keep running into issues, such as the new folder structure not existing, not being able to use the -Recurse switch within the Get-ChildItem command, etc.
The script I have so far is:
# Returns the formatted month; set the value after AddMonths to choose the archive date (-1 = last month)
$LastWriteMonth = (Get-Date).AddMonths(-3).ToString('MM')
#Set destination for Zip Files
$DestinationLoc = "\\networkshare\LogArchive\$env:computername"
#Source files
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | where-object {$_.lastwritetime.month -le $LastWriteMonth}
Copy-Item $SourceFiles -Destination $DestinationLoc\ZipFiles\
Remove-Item $SourceFiles
Sometimes, you just can't (easily) use a "pure PowerShell" solution. This is one of those times, and that's OK.
Robocopy will mirror directory structures, including any empty directories, and select your files (likely faster than filtering with Get-ChildItem). You can copy anything older than 90 days (about 3 months) like this:
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:90 *.zip
You can specify an actual date with /MINAGE too, if you have to be that precise.
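For example, /MINAGE also accepts a YYYYMMDD date (the date below is purely illustrative):
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:20190101 *.zip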
How about Copy-Item "C:\SourceFiles\" -Destination $DestinationLoc\ZipFiles -Container -Recurse? I have tested this and found that it copies the folder structure intact. If you only need *.zip files, first get them, then for each one call Resolve-Path with the -Relative flag and append the resulting path to the Destination parameter.
$oldloc = Get-Location
Set-Location "C:\SourceFiles\" # required so Resolve-Path -Relative works
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | Where-Object { $_.LastWriteTime.Month -le $LastWriteMonth }
$SourceFiles | ForEach-Object {
    $p = Resolve-Path $_.FullName -Relative
    $target = "$DestinationLoc\ZipFiles\$p"
    # Create the destination subfolder first, since Copy-Item won't create it
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    Copy-Item $_.FullName -Destination $target
}
Set-Location $oldloc # return to the original location

Combine content of several files in folder

I have around 30 directories with .log files in them. I want to go into each folder and combine the text of all the files in each sub-directory separately. I do not want to combine the text of all the files from all the sub-directories into one.
Example
I have a directory called Machines
in Machines\ I have
Machine2\
Machine3\
Machine4\
Within each Machine* folder, I have:
1.log
2.log
3.log
etc..
I want to create a script that will do:
First: go into the Machine2 directory and combine the text of all text files in that directory.
Second: go into the Machine3 directory and combine the text of all text files in that directory.
I could use the line below if I only had one folder, but I need it to loop through several subfolders so I do not have to enter each sub-directory in the command.
Get-ChildItem -path "W:\Machines\Machine2" -recurse |?{ ! $_.PSIsContainer } |?{($_.name).contains(".log")} | %{ Out-File -filepath c:\machine1.txt -inputobject (get-content $_.fullname) -Append}
I think a recursive solution would work well. Given a directory, grab the content of all *.log files and dump into COMBINED.txt. Then pull the names of all subdirectories, and repeat for each.
function CombineLogs
{
    param([string] $startingDir)

    # Combine every *.log directly in this directory into COMBINED.txt
    dir $startingDir -Filter *.log | Get-Content | Out-File (Join-Path $startingDir COMBINED.txt)

    # Then recurse into each subdirectory and do the same there
    dir $startingDir | ?{ $_.PsIsContainer } | %{ CombineLogs $_.FullName }
}
CombineLogs 'c:\logs'
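For the Machines layout from the question, the call would be:
CombineLogs 'W:\Machines'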