Moving files to appropriate locations based on their extension - PowerShell

I have a cleanup script that moves files to preset locations based on their extension.
For example, a file with the extension .xls will be moved to the ~\XLS folder, .sql to ~\SQL, and so on. Here is my script:
$dirtyfolder = "\\server\c$\Documents and Settings\user\Desktop\"
$org = "\\BACKUPS\users\"
dir $dirtyfolder -fil *.doc | mv -dest "$($org)ORG\doc"
dir $dirtyfolder -fil *.txt | mv -dest "$($org)ORG\txt"
dir $dirtyfolder -fil *.sql | mv -dest "$($org)ORG\sql"
dir $dirtyfolder -fil *.log | mv -dest "$($org)ORG\log"
dir $dirtyfolder -fil *.zip | mv -dest "$($org)ORG\zip"
dir $dirtyfolder -fil *.7z | mv -dest "$($org)ORG\zip"
dir $dirtyfolder -fil *.png | mv -dest "$($org)ORG\img"
dir $dirtyfolder -fil *.jpg | mv -dest "$($org)ORG\img"
dir $dirtyfolder -fil *.mp3 | mv -dest "$($org)ORG\mp3"
I am fully aware that this is an inelegant way to achieve my objective, so I would like to know how I can modify the script so that I can:
reuse the repetitive code
create the destination folder if it does not exist
group similar extensions, like png and jpg
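All three requirements can be handled by one loop driven by a lookup table. This is only a sketch that reuses the $dirtyfolder and $org variables from the script above; the $groups map and its folder names are assumptions, and -File needs PowerShell 3+ (older versions can use ? {!$_.PSIsContainer} instead).

```powershell
# One loop replaces all the repeated dir | mv lines. The $groups map decides
# the target subfolder; unmapped extensions fall back to their own name.
$groups = @{ '.png' = 'img'; '.jpg' = 'img'; '.zip' = 'zip'; '.7z' = 'zip' }
Get-ChildItem $dirtyfolder -File | ForEach-Object {
    $folder = if ($groups.ContainsKey($_.Extension)) { $groups[$_.Extension] }
              else { $_.Extension.TrimStart('.') }
    $dest = Join-Path $org "ORG\$folder"
    if (-not (Test-Path $dest)) {
        # Create the destination folder on first use
        $null = New-Item -ItemType Directory -Path $dest
    }
    Move-Item -Path $_.FullName -Destination $dest
}
```

Hashtables created with @{} are case-insensitive by default, so .JPG and .jpg land in the same folder.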

Tested. A non-recursive solution that does not handle grouping:
ls $dirtyfolder/* | ? {!$_.PSIsContainer} | % {
    $dest = "$($org)ORG\$($_.Extension.TrimStart('.'))"
    if (!(Test-Path -Path $dest)) {
        new-item $dest -type directory
    }
    mv -path $_.FullName -destination $dest
}
Solution with grouping:
ls $dirtyfolder/* | ? {!$_.PSIsContainer} | % {
    $dest = "$($org)ORG\$(get-destbytype $_.Extension)"
    if (!(Test-Path -Path $dest)) {
        new-item $dest -type directory
    }
    mv -path $_.FullName -destination $dest
}
where get-destbytype is the following function:
function get-destbytype($ext) {
    Switch ($ext)
    {
        {$ext -match '(jpg|png|gif)'} { "images" }
        {$ext -match '(sql|ps1)'} { "scripts" }
        default { $ext.TrimStart('.') }
    }
}

This is my working test:
$source = "e:\source"
$dest = "e:\dest"
$file = gci $source | ? {-not $_.psiscontainer}
$file | group -property extension | % {
    $subfolder = join-path $dest -child $_.name.replace('.','')
    if (!(test-path $subfolder)) {
        new-item -type directory $subfolder.toupper()
    }
}
$file | % {
    move-item $_.fullname -destination (join-path $dest -child $_.extension.replace('.',''))
}
The script will find all the different extensions within the source folder. For each extension, if a matching folder doesn't already exist in the destination, it will be created.
The last line loops over each file in the source and moves it to the right destination subfolder.
If you want to put images with different extensions in the same folder, you will need an additional check, using an if or a switch statement.
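That extra check might look like the following helper, a minimal sketch in which the group names 'img' and 'archive' are assumptions:

```powershell
# Map related extensions onto one shared folder name.
# The group names below are examples, not part of the original script.
function Get-GroupFolder($extension) {
    switch -Regex ($extension.TrimStart('.')) {
        '^(png|jpg|jpeg|gif)$' { 'img' }
        '^(zip|7z)$'           { 'archive' }
        default                { $extension.TrimStart('.') }
    }
}
```

Calling it with '.jpg' or '.png' yields img, while an unmapped extension such as '.pdf' falls through to pdf, so the per-extension behaviour above is preserved.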


Move subdirectories up one level for folder names that contain a certain string

The folder structure is as follows:
-User1folder
---->User1upload
---------->Upload Subfolder
---->miscfolder
-User2folder
---->User2upload
---->othermisc folder
There are root user folders (e.g., User1folder, User2folder).
Under each root folder there are miscellaneous folders; the important one is the folder that contains the text 'upload' in its name. How can I move any subfolders under the 'upload' folders up to the main user folder?
Example:
-User1Folder
---->Upload subfolder\Client1\Sub1
The script should move the subfolders under the upload folder up to the main root account folder so it looks like this:
->User1Folder
---->Upload subfolder
---->Client1\Sub1\
The current code I am using, which only moves files up a single level:
Get-ChildItem -Directory -Path 'C:\inetpub\EFTRoot\NEIS Site\Usr\*\*upload*\' |
ForEach-Object {
    Get-ChildItem -File -Path $_.FullName |
    ForEach-Object {
        $nextName = $_.FullName
        $num = 1
        while (Test-Path -Path $nextName)
        {
            $nextName = Join-Path (Split-Path -Parent (Split-Path -Parent $_.FullName)) ($_.BaseName + "_$num" + $_.Extension)
            $num++
        }
        Move-Item -Path $_.FullName -Destination $nextName
    }
}
I figured it out. First I cd into the parent folder and run the following:
dir -Directory | % {Push-Location $_.FullName; dir './*upload*/*' | % {Move-Item $_.FullName .}; del './*upload*/*'; Pop-Location}
Found info here: https://superuser.com/questions/1251260/move-subfolders-up-one-level
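For readability, the same one-liner spread over several lines (same commands, same order):

```powershell
# For each user folder: pull the contents of any '*upload*' subfolder up one
# level, then delete whatever is left inside the '*upload*' folders.
Get-ChildItem -Directory | ForEach-Object {
    Push-Location $_.FullName
    Get-ChildItem './*upload*/*' | ForEach-Object {
        Move-Item $_.FullName .
    }
    Remove-Item './*upload*/*'
    Pop-Location
}
```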

PowerShell - List all folders in a directory, pull the latest .bak file in each folder, zip it, copy it to a directory

I have a PowerShell script that takes the list of folders in a directory, zips the latest .bak file in each, and copies it into another directory.
There are two folders that I do not want it to look in for .bak files. How do I exclude these folders? I have tried multiple variations of -Exclude and haven't had any luck.
The folders I would like to ignore are "New folder" and "New folder1"
$source = "C:\DigiHDBlah"
$filetype = "bak"
$list = Get-ChildItem -Path $source -ErrorAction SilentlyContinue
foreach ($element in $list) {
    $fn = Get-ChildItem "$source\$element\*" -Include "*.$filetype" | sort LastWriteTime | select -last 1
    $bn = (Get-Item $fn).Basename
    $CompressedFile = $bn + ".zip"
    $fn | Compress-Archive -DestinationPath "$source\$element\$bn.zip"
    Copy-Item -Path "$source\$element\$CompressedFile" -Destination "C:\DigiHDBlah2"
}
Thank you!
What I would do is use the Directory property of the files that you find, and the -NotLike operator to do a simple match against the folders you don't want. I would also simplify the search by using a wildcard:
$Dest = "C:\DigiHDBlah2"
$files = Get-ChildItem "$source\*\*.$filetype" |
    Where{$_.Directory -NotLike '*\New Folder' -and $_.Directory -NotLike '*\New Folder1'} |
    Sort LastWriteTime -Descending | Group Directory | ForEach{$_.Group[0]}
ForEach($file in $Files){
    $CompressedFilePath = $File.FullName + ".zip"
    $file | Compress-Archive -DestinationPath $CompressedFilePath
    Copy-Item $CompressedFilePath -Dest $Dest
}
Or, if you want to supply a list of folders to exclude, you could do a little string manipulation on the DirectoryName property to get just the last folder, and see if it is in a list of excludes:
$Excludes = @('New Folder','New Folder1')
$Dest = "C:\DigiHDBlah2"
$files = Get-ChildItem "$source\*\*.$filetype" |
    Where{$_.DirectoryName.Split('\')[-1] -NotIn $Excludes} |
    Sort LastWriteTime -Descending | Group Directory | ForEach{$_.Group[0]}
ForEach($file in $Files){
    $CompressedFilePath = $File.FullName + ".zip"
    $file | Compress-Archive -DestinationPath $CompressedFilePath
    Copy-Item $CompressedFilePath -Dest $Dest
}

PowerShell - Copy specific files from specific folders

So, the folder structure looks like this:
SourceFolder
file1.txt
file1.doc
Subfolder1
file2.txt
file2.doc
SubSubFolder
file3.txt
doc3.txt
What I want to do is copy all .txt files from folders whose names contain eng to a destination folder. Just the files inside those folders - not the folder structure.
What I used is this:
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
Copy-Item $source\eng*\*.txt $dest -Recurse
The problem is that it copies the .txt files only from each matching parent folder but not from its sub-folders.
How can I include all the sub-folders in this script and keep the eng name check as well?
I am talking about PowerShell commands. Should I use robocopy instead?
Yet another PowerShell solution :)
# Setup variables
$Dst = 'C:\Users\username\Desktop\Final'
$Src = 'C:\Users\username\Desktop\Test1'
$FolderName = 'eng*'
$FileType = '*.txt'
# Get the list of 'eng*' items
Get-ChildItem -Path $Src -Filter $FolderName -Recurse -Force |
    # Keep only the 'eng*' items that are folders
    Where-Object {$_.PSIsContainer} |
    # For each 'eng*' folder
    ForEach-Object {
        # Copy all '*.txt' files in it to the destination folder
        Copy-Item -Path (Join-Path -Path $_.FullName -ChildPath '\*') -Filter $FileType -Destination $Dst -Force
    }
You can do this:
$dest = "C:\NewFolder"
$source = "C:\TestFolder"
$files = Get-ChildItem $source -File -Include "*.txt" -Recurse | Where-Object { $_.DirectoryName -like "*eng*" }
$files | Copy-Item -Destination $dest
Another take:
$SourceRoot = <Source folder path>
$TargetFolder = <Target folder path>
@(Get-ChildItem $SourceRoot -Recurse -File -Filter *.txt | Select -ExpandProperty Fullname) -like '*\eng*\*' |
    foreach {Copy-Item $_ -Destination $TargetFolder}
It may be easier to first get a list of all the folders that contain eng in the name.
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
$engFolders = Get-ChildItem $source -Directory -Recurse | Where { $_.BaseName -match "^eng" }
Foreach ($folder In $engFolders) {
    Copy-Item ($folder.FullName + "\*.txt") $dest
}
It's fine to do that with PowerShell. Try:
$dest = "C:\Users\username\Desktop\Final"
$source = "C:\Users\username\Desktop\Test1"
Get-ChildItem $source -filter "*.txt" -Recurse |
    Where-Object { $_.DirectoryName -match "eng" } |
    ForEach-Object { Copy-Item $_.fullname $dest }

make copy of folder tree without files

I need to copy a folder tree with all its subfolders, but without any files, except for the files inside subfolders named "Project".
So the new folder tree should contain only the folders, plus the files that were present in subfolders named "Project".
OK, my solution:
$folder = dir D:\ -r
foreach ($f in $folder)
{
    switch ($f.name)
    {
        "project"
        {
            Copy-Item -Include *.* $f.FullName D:\test2
        }
        default
        {
            Copy-Item -Exclude *.* $f.FullName D:\test2
        }
    }
}
Use xcopy /t to copy only the folder structure, then copy the Project folders separately. Something like this:
'test2\' | Out-File D:\exclude -Encoding ASCII
xcopy /t /exclude:d:\exclude D:\ D:\test2
gci -r -filter Project | ?{$_.PSIsContainer} | %{ copy -r $_.FullName d:\test2}
ri d:\exclude
Another solution:
$source = "c:\dev"
$destination = "c:\temp\copydev"
Get-ChildItem -Path $source -Recurse -Force |
Where-Object { $_.psIsContainer } |
ForEach-Object { $_.FullName -replace [regex]::Escape($source), $destination } |
ForEach-Object { $null = New-Item -ItemType Container -Path $_ -Force }
Get-ChildItem -Path $source -Recurse -Force |
Where-Object { -not $_.psIsContainer -and (Split-Path $_.PSParentPath -Leaf) -eq "Project"} |
Copy-Item -Force -Destination { $_.FullName -replace [regex]::Escape($source), $destination }
Use Get-ChildItem to recurse over the folders and recreate the structure with New-Item. Within the recursion, you can easily check for "Project".
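A minimal sketch of that approach; the $source and $destination paths are placeholders:

```powershell
# Walk the tree once: recreate every folder, and copy a file only when its
# immediate parent folder is named "Project". Paths below are examples.
$source      = 'D:\src'
$destination = 'D:\test2'
Get-ChildItem -Path $source -Recurse | ForEach-Object {
    $target = $_.FullName -replace [regex]::Escape($source), $destination
    if ($_.PSIsContainer) {
        # Folders are enumerated before their contents, so parents exist in time
        $null = New-Item -ItemType Directory -Path $target -Force
    } elseif ($_.Directory.Name -eq 'Project') {
        Copy-Item -Path $_.FullName -Destination $target -Force
    }
}
```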
First, create the directory structure:
xcopy D:\source D:\destination /t /e
Now, iterate through the source directory, copying every file in a Project directory:
Get-ChildItem D:\Source * -Recurse |
    # filter out directories
    Where-Object { -not $_.PsIsContainer } |
    # grab files that are in Project directories
    Where-Object { (Split-Path -Leaf (Split-Path -Parent $_.FullName)) -eq 'Project' } |
    # copy the files from source to destination
    Copy-Item -Destination { $_.FullName.Replace('D:\source', 'D:\destination') }

Copy a file including its relative path

I need to copy a large number of files to a backup folder but I want to maintain their relative paths. I only need specific files; i.e.
C:\scripts\folder\File.ext1
C:\scripts\folder2\file2.ext2
C:\scripts\file3.ext1
But I only need to copy the .ext1 files, like so:
C:\backup\folder\File.ext1.bak
C:\backup\file3.ext1.bak
The source paths are of varying depths.
This is what I have to copy the files:
$files = gci -path C:\scripts\ -recurse -include *.ext1
$files | % { Copy-Item $_ "$($_).bak"; move-item $_ -destination C:\backup\ }
This just dumps all the files into C:\backup\ and does not appear to get any of the paths. Not sure how that part would be done.
Something like this could work:
gci -path C:\scripts\ -recurse -include *.ext1 | % {
    Copy-Item $_.FullName "$($_.FullName).bak"
    move-item $_.FullName -destination ($_.FullName -replace 'C:\\scripts\\','C:\backup\')
}
It is not clever, but it's quick & dirty and works without a lot of effort.
get-childitem returns absolute paths, but you can make them relative to the current working directory as follows:
resolve-path -relative
So to copy a filtered set of files from the current directory recursively to a destination directory:
$dest = "c:\dest"
$filter = "*.txt"
get-childitem -recurse -include $filter | `
where-object { !$_.PSIsContainer } | `
resolve-path -relative | `
% {
    $destFile = join-path $dest $_
    new-item -type f $destFile -force | out-null
    copy-item $_ $destFile
    get-item $destfile
}
new-item is needed to create the parent directories, and get-item displays each new file as it is created.
Of course robocopy does all this, but there will be times when you want to do more special filtering or filename mangling...
Use robocopy.
robocopy c:\scripts c:\backup *.ext1 /s
Oops, I failed to notice you wanted to add the .bak extension too. I still think it is a good idea to use robocopy to copy the files; then:
dir c:\backup -recurse -include *.ext1 | % { ren $_ "$_.bak" }
You can try this:
Clear-Host
$from = "C:\scripts\"
$to = "C:\backup\"
$inc = @('*.ext1', '*.extx')
$files = Get-ChildItem -Path $from -Include $inc -Recurse
$files | % {
    $dest = Join-Path $to ($_.FullName + ".bak").SubString($from.Length)
    $dum = New-Item -ItemType File $dest -Force
    Copy-Item -Path $_ -Destination $dest -Recurse -Force
}
The New-Item call is there in order to force creation of the parent path.
Jean Paul