I can get this to work if I set up an array for my $destination, but that creates a separate .zip for each directory selected.
How can I select every directory and zip them all into a single destination file instead of multiple destination files? Here is my code:
$type = "*.txt"
$destination = "LogsBackup.zip"
Add-Type -Assembly "System.IO.Compression.FileSystem" ;
$_sources = dir c:\Logs\$type -Recurse | Select Directory -Unique |
Out-GridView -OutputMode Multiple |
Select @{Name="Path";Expression={$_.Directory -As [string]}}
for ($i = 0; $i -lt $_sources.Length; $i++)
{
$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
[System.IO.Compression.ZipFile]::CreateFromDirectory( $_sources[$i].Path , $destination)
}
I want to keep my Out-GridView option the way it is, so whether it's a single directory or multiple, I can select them and store them as an array.
The -Update parameter will make sure that your existing archive is updated with the new entries that you select from Out-GridView.
$destination = "C:\logs\LogsBackup.zip"
#Create a single archive
Get-ChildItem -Path C:\logs -Filter *.txt -Recurse |
Select @{Name="Path";Expression={$_.DirectoryName}} -Unique |
Out-GridView -PassThru | Compress-Archive -CompressionLevel Optimal -DestinationPath $destination -Update
I am trying to copy the latest file from every folder/sub-folder into the same file structure on a different drive, so the latest file from each source folder is copied into the correspondingly named destination folder.
The destination folder hierarchy already exists and cannot be copied over or recreated. Neither this version nor the others I have tried behaves as expected. Can anyone help?
$sourceDir = 'test F Drive\Shares\SSRSFileExtract\'
$destDir = 'test X Drive\SSRSFileExtract\'
$date = Get-Date
$list = Get-ChildItem -Path $sourceDir | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
foreach ($item in $list)
{
Copy-Item -Verbose -LiteralPath $item.FullName -Destination $destDir -Force |
Get-Acl -Path $item.FullName | Set-Acl -Path $destDir\$(Split-Path -Path $item.FullName -Leaf)
}
Get-ChildItem -Path $destDir -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-5))} | Remove-Item -Verbose -Recurse -Force
I found a solution for this which copies/moves all the files from all subfolders into all corresponding subfolders:
Powershell: Move all files from folders and subfolders into single folder
The way your code retrieves the list of files will only ever return a single object because of Select-Object -First 1. Also, because you don't specify the -File switch, Get-ChildItem will return DirectoryInfo objects as well, not just FileInfo objects.
What you could do is get an array of FileInfo objects recursively from the source folder and group them by their DirectoryName property.
Then loop over these groups and, from each group, select the most recent file and copy it over to the destination folder.
Try:
$sourceDir = 'F:\Shares\SSRSFileExtract'
$destDir = 'X:\SSRSFileExtract'
Get-ChildItem -Path $sourceDir -File -Recurse | Group-Object DirectoryName | ForEach-Object {
# the $_ automatic variable represents one group at a time inside the loop
$newestFile = $_.Group | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
# construct the target sub directory
# you could also use $_.Name (the name of this group) instead of $newestFile.DirectoryName here, because
# we grouped on that DirectoryName file property.
$targetDir = Join-Path -Path $destDir -ChildPath $newestFile.DirectoryName.Substring($sourceDir.Length)
# if you're not sure the target path exists, uncomment the next line to have it created first
# if (!(Test-Path -Path $targetDir -PathType Container)) { $null = New-Item -Path $targetDir -ItemType Directory }
# copy the file
$newestFile | Copy-Item -Destination $targetDir -Force -Verbose
# copy the file's ACL
$targetFile = Join-Path -Path $targetDir -ChildPath $newestFile.Name
$newestFile | Get-Acl | Set-Acl -Path $targetFile
}
Apparently, you would also like to clean up older files in the destination folder:
Get-ChildItem -Path $destDir -File -Recurse |
Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-5).Date} |
Remove-Item -Verbose -Recurse -Force
Be aware that this final code to remove older files could potentially remove all files from a subfolder if they all happen to be older than 5 days.
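If you want to guard against that, here is a rough sketch (an assumption on my part, not something the question asks for) that always keeps the newest file in each destination subfolder and only removes the older ones:
# rough sketch: keep the newest file per destination subfolder, remove the rest if older than 5 days
Get-ChildItem -Path $destDir -File -Recurse | Group-Object DirectoryName | ForEach-Object {
    $_.Group | Sort-Object LastWriteTime -Descending |
        Select-Object -Skip 1 |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-5).Date } |
        Remove-Item -Verbose -Force
}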
I need some assistance with PowerShell. I would like to search within all subfolders of a particular folder and copy the latest file from each subfolder to a new folder, every day at 9:00 AM. So I want to search within folder A's subfolders a, b and c, pick out the latest file in each of a, b and c, and move all three files into an outside folder B (a single folder). I am new to PowerShell; any help is appreciated. I've basically tried to use this, but it creates a backup: Copy most recent file from folder to destination
Clear-Host
$ChildFolders = @('In_a', 'In_b', 'In_c')
for($i = 0; $i -lt $ChildFolders.Count; $i++){
$FolderPath = "C:\FolderA\" + $ChildFolders[$i]
$DestinationPath = "C:\FolderB\" [$i]
gci -Path $FolderPath -File | Sort-Object -Property LastWriteTime -Descending | Select FullName -First 1 | %($_){
$_.FullName
Copy-Item $_.FullName -Destination $DestinationPath
}
The function below:
Gets the subfolders
Gets all files in the subfolders
Sorts the files by creation date into an array
Gets the first entry
Moves that file to the destination directory
*It will overwrite files with the same name in the destination folder
Function Get-LatestFiles($SourceFolder, $Destination) {
    # get every subfolder of the source folder
    $SubFolders = Get-ChildItem $SourceFolder -Directory
    Foreach ($SubFolder in $SubFolders) {
        # get the files in this subfolder, newest creation date first
        $SubFolderExpanded = Get-ChildItem $SubFolder.FullName -File -Depth 1 |
            Sort-Object -Property CreationTime -Descending
        if ($SubFolderExpanded.Count -gt 0) {
            # move the newest one to the destination (overwrites any existing file with the same name)
            Move-Item $SubFolderExpanded[0].FullName -Destination $Destination -Force
        }
    }
}
Get-LatestFiles -SourceFolder C:\test -Destination C:\test01
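Neither the question's code nor the function above covers the "every day at 9:00 AM" part. If you also want to schedule the script, a rough sketch using the built-in ScheduledTasks cmdlets could look like this (the script path and task name are hypothetical, adjust them to your environment):
# rough sketch: run the script daily at 9:00 AM (C:\Scripts\CopyLatestFiles.ps1 is a hypothetical path)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File "C:\Scripts\CopyLatestFiles.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 9am
Register-ScheduledTask -TaskName 'CopyLatestFiles' -Action $action -Trigger $trigger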
I am trying to use PowerShell to:
scan folder D://Mediafolder for the names of media files
create a folder for each media file scanned, with the same name
insert each media file into the matching folder name.
I can find no documentation or thread on this, and I am more fluent in Linux than Windows. I've tried many times to piece this together, but to no avail.
Hope this will help :)
This will create a folder for each file with the same name, so if you have a file called xyz.txt, it will create a folder called xyz and move the file to this folder.
$path = "D:\MediaFolder"
$items = Get-ChildItem $path
Foreach ($item in $items)
{
    # use the part of the file name before the first dot as the folder name
    $folderName = $item.Name.Split('.')[0]
    New-Item "$path\$folderName" -ItemType Directory
    Move-Item -Path "$path\$item" -Destination "$path\$folderName"
}
File Sorting based on extension should do the job:
$folder_path = Read-Host "Enter the folder path without space"
$file = Get-ChildItem $folder_path -Recurse | Where-Object { -not $_.PSIsContainer }
# create one folder per extension (upper-cased, without the dot) if it doesn't exist yet
$file | Group-Object -Property Extension | ForEach-Object {
    if (!(Test-Path (Join-Path $folder_path -ChildPath $_.Name.Replace('.', '')))) {
        New-Item -Type Directory $(Join-Path $folder_path -ChildPath $_.Name.Replace('.', '')).ToUpper()
    }
}
# move each file into the folder matching its extension, then remove any folders left empty
$file | ForEach-Object { Move-Item $_.FullName -Destination $(Join-Path $folder_path -ChildPath $_.Extension.Replace('.', '')) }
$a = Get-ChildItem $folder_path -Recurse | Where-Object { $_.PSIsContainer -eq $True }
$a | Where-Object { $_.GetFiles().Count -eq 0 } | Remove-Item -Force
This will iterate over the files in $media_dir and move those with an extension listed in $media_types to a folder with the same base name. When you are satisfied that the files will be moved to the correct directory, remove the -WhatIf from the Move-Item statement.
PS C:\src\t> type .\ms.ps1
$media_dir = 'C:\src\t\media'
$new_dir = 'C:\src\t\newmedia'
$media_types = @('.mp3', '.mp4', '.jpeg')
Get-ChildItem -Path $media_dir |
ForEach-Object {
$base_name = $_.BaseName
if ($media_types -contains $_.Extension) {
if (-not (Test-Path $new_dir\$base_name)) {
New-Item -Path $new_dir\$base_name -ItemType Directory | Out-Null
}
Move-Item $_.FullName $new_dir\$base_name -WhatIf
}
}
I have a PowerShell script that takes the list of folders in a directory, zips the latest .bak file in each folder, and copies the zip into another directory.
There are two folders that I do not want it to look in for the .bak files. How do I exclude those folders? I have tried multiple ways of using -Exclude statements and I haven't had any luck.
The folders I would like to ignore are "New folder" and "New folder1"
$source = "C:\DigiHDBlah"
$filetype = "bak"
$list=Get-ChildItem -Path $source -ErrorAction SilentlyContinue
foreach ($element in $list) {
$fn = Get-ChildItem "$source\$element\*" -Include "*.$filetype" | sort LastWriteTime | select -last 1
$bn=(Get-Item $fn).Basename
$CompressedFile=$bn + ".zip"
$fn| Compress-Archive -DestinationPath "$source\$element\$bn.zip"
Copy-Item -path "$source\$element\$CompressedFile" -Destination "C:\DigiHDBlah2"
}
Thank you!
What I would do is use the Directory property on the files that you find, and the -NotLike operator to do a simple match for the folders you don't want. I would also simplify the search by using a wildcard:
$Dest = "C:\DigiHDBlah2"
$files = Get-ChildItem "$source\*\*.$filetype" |
    Where{$_.Directory -NotLike '*\New Folder' -and $_.Directory -NotLike '*\New Folder1'} |
    Sort LastWriteTime -Descending | Group Directory | ForEach{$_.Group[0]}
ForEach($file in $Files){
$CompressedFilePath = $File.FullName + ".zip"
$file | Compress-Archive -DestinationPath $CompressedFilePath
Copy-Item $CompressedFilePath -Dest $Dest
}
Or, if you want to just supply a list of folders to exclude, you could do a little string manipulation on the DirectoryName property to get just the last folder and see if it is in a list of excludes, like:
$Excludes = @('New Folder','New Folder1')
$Dest = "C:\DigiHDBlah2"
$files = Get-ChildItem "$source\*\*.$filetype" |
    Where{$_.DirectoryName.Split('\')[-1] -NotIn $Excludes} |
    Sort LastWriteTime -Descending | Group Directory | ForEach{$_.Group[0]}
ForEach($file in $Files){
$CompressedFilePath = $File.FullName + ".zip"
$file | Compress-Archive -DestinationPath $CompressedFilePath
Copy-Item $CompressedFilePath -Dest $Dest
}
I created a PowerShell script to copy only the files in a folder structure (recursively):
cp $source.Text -Recurse -Container:$false -destination $destination.Text
$dirs = gci $destination.Text -directory -recurse | Where { (gci $_.fullName).count -eq 0 } | select -expandproperty FullName
$dirs | Foreach-Object { Remove-Item $_ }
It is working fine, but the problem is that I have files with the same names and it is not copying the duplicated files. I need to rename a file if it already exists.
source:
folderA --> xxx.txt, yyy.txt
folderB --> xxx.txt, yyy.txt, zzz.txt
folderC --> xxx.txt
destination (requirement):
xxx.txt
xxx1.txt
xxx2.txt
yyy.txt
yyy1.txt
zzz.txt
Here is a solution where I use the Group-Object cmdlet to group all items by the file name. I then iterate over each group and, if a group contains more than one file, I append _$i to the name, where $i starts at one and gets incremented:
$source = $source.Text
$destination = $destination.Text
Get-ChildItem $source -File -Recurse | Group-Object Name | ForEach-Object {
if ($_.Count -gt 1) { # rename duplicated files
$_.Group | ForEach-Object -Begin {$i = 1} -Process {
$newFileName = $_.Name -replace '(.*)\.(.*)', "`$1_$i.`$2"
$i++
Copy-Item -Path $_.FullName -Destination (Join-Path $destination $newFileName)
}
}
else # the filename is unique, just copy it.
{
$_.Group | Copy-Item -Destination $destination
}
}
Note:
You may change the -File switch to -Container:$false if your PowerShell version doesn't support it. Also note that the script doesn't look into the destination folder to check whether a file with the same name already exists there.
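If you need that as well, here is a rough sketch (my own addition, not part of the answer above) that checks the destination for an existing file with the same name and picks a free _n suffix before copying. It assumes the same $source and $destination path strings used above:
# rough sketch: avoid overwriting files that already exist in $destination (my assumption, not covered above)
Get-ChildItem $source -File -Recurse | ForEach-Object {
    $targetPath = Join-Path $destination $_.Name
    $n = 1
    while (Test-Path -LiteralPath $targetPath) {
        # try name_1.ext, name_2.ext, ... until a free name is found
        $targetPath = Join-Path $destination ('{0}_{1}{2}' -f $_.BaseName, $n, $_.Extension)
        $n++
    }
    Copy-Item -LiteralPath $_.FullName -Destination $targetPath
}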