PowerShell folder archiving

I have a PowerShell script that simply uses the Get-ChildItem command to search through a directory for a folder matching a keyword. When found, I need it to zip that folder and leave the archive right in that same directory.
Here's what I've tried, piping the command into both 7-Zip and the native Compress-Archive:
set-alias zip "$env:ProgramFiles\7-Zip\7z.exe"
Get-ChildItem $path "keyword" -Recurse -Directory | zip a
AND
Get-ChildItem $path"keyword" -Recurse -Directory | compress-archive
Both times it prompts for a source and destination, which is hard to define since I'm having it search through a drive with many sub-folders. I thought using the pipe would imply the source.
Any ideas? Thanks!
EDIT:
I suppose I could assign the Get-ChildItem results to a variable and use that as the "source", with a generic destination location for them all, but then I'd have to name them differently, no?
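Roughly what I'm picturing (just a sketch - the root, keyword and destination folder are placeholders):
$found = Get-ChildItem "D:\SomeDrive" -Recurse -Directory -Filter "keyword"
foreach ($folder in $found) {
    # name each zip after the folder so they don't overwrite each other in the shared destination
    Compress-Archive -Path $folder.FullName -DestinationPath "C:\Archives\$($folder.Name).zip"
}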

Give this a try:
$path = "INSERT SOURCE ROOT"
foreach ($directory in Get-ChildItem $path -Recurse -Directory -Filter "keyword"| Select-Object FullName | foreach { $_.FullName}) {
$destination = Split-Path -Path $directory -Parent
Compress-Archive -Path $directory -DestinationPath $destination
}
This looks inside the path for any folder matching the "keyword" and zips each one into an archive that sits alongside it, i.e. in its parent folder, named after the folder itself.
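If you'd rather keep using 7-Zip, as in your first attempt, roughly the same loop works by invoking 7z.exe once per matched folder (a sketch; it assumes 7z.exe sits in the default install location):
$7zip = "$env:ProgramFiles\7-Zip\7z.exe"
Get-ChildItem $path -Recurse -Directory -Filter "keyword" | ForEach-Object {
    # "a" is 7-Zip's add-to-archive command; the zip lands next to the matched folder
    & $7zip a "$($_.FullName).zip" $_.FullName
}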

With temp and temp2 directories under C:\ this is working for me (note that the directories must have content):
Get-ChildItem "C:\" "temp*" -directory | compress-archive -DestinationPath "C:\tempzip.zip"
It zips all directories found to C:\tempzip.zip.
I believe what you actually want is:
$dirs = Get-ChildItem "C:\" "temp*" -Directory
foreach ($dir in $dirs) {
    Compress-Archive $dir.FullName -DestinationPath "$($dir.FullName).zip"
}
Note that I omitted -recurse for my testing.

You can do this:
Get-ChildItem -Path "The Source Path" -Recurse -Directory | Where-Object { $_.Name -match "Keyword" } | ForEach-Object {
    $parent = Split-Path -Path $_.FullName -Parent
    # Give the archive a file name so it ends up next to the matched folder
    Compress-Archive -Path $_.FullName -DestinationPath (Join-Path $parent "$($_.Name).zip")
}

Related

Using powershell to move sub-folder contents up one level; different parent folders

I've looked through various questions on SO but can't get the script to work. I'm new to PowerShell and I'm sure this is easy - any help would be great.
I have a directory structure of 170 folders:
G:\Data\
folder1\
DCIM1\*jpgs
DCIM2\*jpgs
folder2\
DCIM1\*jpgs
DCIM2\*jpgs
I would like to move all the jpgs from each DCIM subfolder into the parent folder one level up:
G:\Data\
folder1\*jpgs
folder2\*jpgs
Each parent folder has a different name, and potentially differently named DCIM subfolders, but all the jpgs are named with their folder prefix (e.g., DCIM1_001.jpg).
I have tried Powershell:
G:\> $files = Get-ChildItem "G:\data"
>> Get-ChildItem $files | Move-Item -Destination { $_.Directory.Parent.FullName }
>> $files | Remove-Item -Recurse
but I get a "destination is null" error. I have tried a wildcard too:
G:\> $files = Get-ChildItem "G:\data\*"
>> Get-ChildItem $files | Move-Item -Destination { $_.Directory.Parent.FullName }
>> $files | Remove-Item -Recurse
But I take it I have that completely wrong. What am I missing?
You can use Split-Path to get the parent directory:
$JPGs = Get-ChildItem -Path "C:\brief\datefolder" -Recurse -Filter "*.jpg"
foreach ($JPG in $JPGs) {
    $Parent_Directory = Split-Path -Path $JPG.FullName -Parent
    $Destination_Path = Split-Path -Path $Parent_Directory -Parent
    Move-Item -Path $JPG.FullName -Destination $Destination_Path
    if ($null -eq (Get-ChildItem -Path $Parent_Directory)) {
        Remove-Item -Path $Parent_Directory
    }
}
It just assigns the results to a variable, moves each file up one level, and removes the DCIM folder once it is empty.
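For what it's worth, the delay-bind pattern from your own attempt also works once the input is restricted to files rather than folders (a sketch, assuming the G:\Data layout above; the emptied DCIM folders would still need to be removed afterwards):
Get-ChildItem -Path "G:\Data" -Recurse -Filter "*.jpg" -File |
    Move-Item -Destination { $_.Directory.Parent.FullName }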

How can I pipe multiple files into Copy-Item and keep the directory structure?

I have a very large directory located at D:\Stuff and I want to create a copy of it at D:\CopyStuff, but I only want to take files with a certain extension as well as keep the folder structure.
Getting the files I want seems simple enough:
$from = "D:\stuff"
$to = "D:\CopyStuff"
$files = Get-ChildItem -Recurse -Path $from -Include *.config, *.txt, *.ini
However, copying the files and keeping the structure is a bit more challenging. I could use a for-loop, but that seems to go against the very nature of PowerShell. The answer at https://stackoverflow.com/a/25780745/782880 suggests doing it this way:
Get-ChildItem -Path $sourceDir | Copy-Item -Destination $targetDir -Recurse -Container
But that copies files to D:\CopyStuff with no folders, much less my original structure. What am I doing wrong? I'm using Powershell 5.
Try this:
$Source="C:\temp\test1"
$Dest="C:\temp\test2"
$EnableExt=".config", ".txt" , ".ini"
Get-ChildItem $Source -Recurse | % {
$NewPath=$_.FullName.Replace($Source, $Dest)
if ($_.psiscontainer)
{
New-Item -Path $NewPath -ItemType Directory -Force
}
elseif ($_.Extension -in $EnableExt)
{
Copy-Item $_.FullName $NewPath -Force
}
}
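One optional tweak: New-Item writes the DirectoryInfo objects it creates to the output, so the loop prints a line for every folder; piping it to Out-Null keeps the run quiet:
New-Item -Path $NewPath -ItemType Directory -Force | Out-Null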
First of all, Copy-Item can do it on its own like:
$fromFolder = "C:\Temp\Source"
$toFolder = "C:\Temp\Dest"
Copy-Item -Path $fromFolder -Destination $toFolder -Recurse -Filter *.txt
But you may not like the result: it will create a "Source" folder inside the "Dest" folder and then copy the structure into it. I reckon you want the files/folders from inside the "Source" folder to be copied into the "Dest" folder. Well, it's a bit more complex, but here it is:
$fromFolder = "C:\Temp\Source"
$toFolder = "C:\Temp\Dest"
Get-ChildItem -Path $fromFolder -Directory -Recurse | Select-Object FullName, @{N="NewPath";E={$_.FullName.Replace($fromFolder, $toFolder)}} | ForEach-Object { New-Item -Path $_.NewPath -ItemType "Directory" }
Get-ChildItem -Path $fromFolder -Include "*.txt" -Recurse | Select-Object FullName, @{N="NewPath";E={$_.FullName.Replace($fromFolder, $toFolder)}} | ForEach-Object { Copy-Item -Path $_.FullName -Destination $_.NewPath }
It copies folder structure first, then files.
NB! I strongly recommend using absolute paths only. Otherwise, the Replace method may give unexpected results.
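To illustrate the pitfall with hypothetical paths: if $fromFolder is relative, $_.FullName never contains that literal text, so Replace() silently returns the path unchanged:
$fromFolder = ".\Source"                             # relative path (hypothetical)
$item = Get-Item "$fromFolder\file.txt"              # hypothetical file
$item.FullName.Replace($fromFolder, "C:\Temp\Dest")  # FullName is "C:\...\Source\file.txt", which does not
                                                     # contain ".\Source", so nothing gets replaced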
Note: The solution below creates analogous target folders only for those source folders that contain files matching the -Include filter, not for all source folders.
You can get away with a single-pipeline solution by combining Get-ChildItem -Name with delay-bind script blocks:
$from = 'D:\stuff'
$to = 'D:\CopyStuff'
Get-ChildItem -Name -Recurse -LiteralPath $from -Include *.config, *.txt, *.ini |
    Copy-Item `
        -LiteralPath { Join-Path $from $_ } `
        -Destination { New-Item -Type Directory -Force (Split-Path (Join-Path $to $_)) }
-Name emits paths relative to the input directory as strings.
Delay-bind script block { Join-Path $from $_ } builds the full input file name from each relative input path.
Delay-bind script block { New-Item -Type Directory -Force (Split-Path (Join-Path $to $_)) } builds the full path of the target directory from the target root path and the relative input path and creates that directory on demand, using a preexisting one if present (-Force).
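To see what those delay-bind blocks receive, the Get-ChildItem stage can be run on its own, mapping each relative path to its source and target (a quick sketch):
Get-ChildItem -Name -Recurse -LiteralPath $from -Include *.config, *.txt, *.ini |
    ForEach-Object { '{0}  ->  {1}' -f (Join-Path $from $_), (Join-Path $to $_) }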

PowerShell - Copy specific files from specific folders

So, the folder structure looks like this:
SourceFolder
file1.txt
file1.doc
Subfolder1
file2.txt
file2.doc
SubSubFolder
file3.txt
doc3.txt
What I want to do is copy all .txt files from folders whose names contain eng to a destination folder - just the files themselves, not the folder structure.
What I used is this:
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
Copy-Item $source\eng*\*.txt $dest -Recurse
The problem is that it copies the .txt files only from each parent folder but not the sub-folders.
How can I include all the sub-folders in this script and keep the eng name check as well? Can you please help me?
I am talking about PowerShell commands. Should I use robocopy instead?
Yet another PowerShell solution :)
# Setup variables
$Dst = 'C:\Users\username\Desktop\Final'
$Src = 'C:\Users\username\Desktop\Test1'
$FolderName = 'eng*'
$FileType = '*.txt'
# Get list of 'eng*' file objects
Get-ChildItem -Path $Src -Filter $FolderName -Recurse -Force |
    # Those 'eng*' file objects should be folders
    Where-Object {$_.PSIsContainer} |
    # For each 'eng*' folder
    ForEach-Object {
        # Copy all '*.txt' files in it to the destination folder
        Copy-Item -Path (Join-Path -Path $_.FullName -ChildPath '\*') -Filter $FileType -Destination $Dst -Force
    }
You can do this:
$dest = "C:\NewFolder"
$source = "C:\TestFolder"
$files = Get-ChildItem $source -File -include "*.txt" -Recurse | Where-Object { $_.DirectoryName -like "*eng*" }
Copy-Item -Path $files -Destination $dest
Another take:
$SourceRoot = <Source folder path>
$TargetFolder = <Target folder path>
@(Get-ChildItem $SourceRoot -Recurse -File -Filter *.txt | Select -ExpandProperty FullName) -like '*\eng*\*' |
    foreach { Copy-Item $_ -Destination $TargetFolder }
It may be easier to first get a list of all the folders that contain eng in the name.
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
$engFolders = Get-ChildItem $source -Directory -Recurse | Where { $_.BaseName -match "^eng" }
Foreach ($folder In $engFolders) {
    Copy-Item ($folder.FullName + "\*.txt") $dest
}
That's fine to do with PowerShell. Try:
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
Get-ChildItem $source -filter "*.txt" -Recurse | Where-Object { $_.DirectoryName -match "eng"} | ForEach-Object { Copy-Item $_.fullname $dest }

Powershell script to search for file types from a list of machines, then copy results into directories named after the machine

Here's what I have working so far. It searches a list of computers and places all the files from all the machines in one folder.
I'm trying to get the files it finds placed in a folder named after the machine they came from. Any ideas, anyone?
Get-Content C:\computers.txt | Foreach-Object {
    $ComputerName = $_
    Get-Childitem "\\$ComputerName\c$\Documents and Settings\**\desktop","\\$ComputerName\c$\Documents and Settings\**\My Documents" -Include *.xls*, *.doc*, *.txt, *.pdf, *.jpg -Recurse -Force
} | Copy-Item -Destination \\destination\share
Move the Copy-Item inside the loop and add a statement to create the destination folder:
$extensions = '*.xls*', '*.doc*', '*.txt', '*.pdf', '*.jpg', '*.pub'
Get-Content C:\computers.txt | % {
    $ComputerName = $_
    $dst = "\\destination\share\$ComputerName"
    $src = "\\$ComputerName\c$\Documents and Settings\**\desktop",
           "\\$ComputerName\c$\Documents and Settings\**\My Documents"
    New-Item -ItemType Directory $dst
    Get-Childitem $src -Include $extensions -Recurse -Force |
        Copy-Item -Destination $dst\
}
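If the script may be re-run, New-Item will complain about destination folders that already exist; one way to avoid that is to test first and discard the output:
if (-not (Test-Path $dst)) { New-Item -ItemType Directory $dst | Out-Null }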

Copy a file including its relative path

I need to copy a large number of files to a backup folder but I want to maintain their relative paths. I only need specific files; i.e.
C:\scripts\folder\File.ext1
C:\scripts\folder2\file2.ext2
C:\scripts\file3.ext1
But I only need to copy the ext1 files like so:
C:\backup\folder\File.ext1.bak
C:\backup\file3.ext1.bak
The source paths are of multiple depths.
This is what I have to copy the files:
$files = gci -path C:\scripts\ -recurse -include *.ext1
$files | % { Copy-Item $_ "$($_).bak"; move-item $_ -destination C:\backup\ }
This just dumps all the files into C:\backup\ and does not appear to get any of the paths. Not sure how that part would be done.
Something like this could work:
gci -path C:\scripts\ -recurse -include *.ext1 |
    % { Copy-Item $_.FullName "$($_.FullName).bak"
        move-item $_.FullName -destination ($_.FullName -replace 'C:\\scripts\\','C:\backup\') }
It is not clever, but it's quick & dirty and works without a lot of effort.
get-childitem returns absolute paths, but you can make them relative to the current working directory as follows:
resolve-path -relative
So to copy a filtered set of files from the current directory recursively to a destination directory:
$dest = "c:\dest"
$filter = "*.txt"
get-childitem -recurse -include $filter |
    where-object { !$_.PSIsContainer } |
    resolve-path -relative |
    % { $destFile = join-path $dest $_
        new-item -type file $destFile -force | out-null
        copy-item $_ $destFile
        get-item $destfile }
new-item is needed to create the parent directories
get-item provides a display of all the new files it created
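Because resolve-path -relative resolves against the current location, the pipeline above assumes you have changed into the source tree first; with the question's paths that would look roughly like:
Set-Location C:\scripts
$dest = "c:\backup"
$filter = "*.ext1"
# ...then run the pipeline above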
Of course robocopy does all this, but there will be times when you want to do more special filtering or filename mangling...
Use robocopy.
robocopy c:\scripts c:\backup *.ext1 /s
Oops. I failed to notice you wanted to add the .bak extension too. I still think it is a good idea to use robocopy to copy the files then:
dir c:\backup -recurse -include *.ext1 | % { ren $_ "$_.bak" }
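Putting the two steps together (a sketch using the question's paths; Rename-Item accepts a delay-bind script block for -NewName):
robocopy C:\scripts C:\backup *.ext1 /s
Get-ChildItem C:\backup -Recurse -Include *.ext1 |
    Rename-Item -NewName { $_.Name + '.bak' }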
You can try this
Clear-Host
$from = "C:\scripts\"
$to = "C:\backup\"
$inc = @('*.ext1', '*.extx')
$files = Get-ChildItem -Path $from -Include $inc -Recurse
$files | % {
    $dest = (Join-Path $to $($_.FullName + ".bak").SubString($from.Length))
    $dum = New-Item -ItemType file $dest -Force
    Copy-Item -Path $_ -Destination $dest -Recurse -Force
}
The New-Item is there in order to force creation of the destination path.
Jean Paul