Creating a PowerShell script to back up a file and append the date - powershell

Currently I have a one-line batch file to back up a file, which I run manually whenever I need a backup. The only thing I would like to add to it is the current date. Here is what I have:
xcopy /W /Y ACTIVE.DB ACTIVE.DB.BACKUP
The destination file should simply be ACTIVE.DB.BACKUP.YYYYMMDD. How would I go about creating a script that lets me double-click it in Windows Explorer and make the xcopy happen?

Just to point out that you can do this with Copy-Item, e.g.:
Set-Location $path
Copy-Item ACTIVE.DB "ACTIVE.DB.$(get-date -f yyyyMMdd)" -Force -Confirm
If you're going for robustness, I'd use robocopy.exe.
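Note that robocopy copies into a destination folder and can't rename a file in flight, so the date would have to go on the folder rather than the file. A minimal sketch, assuming the database lives in C:\data and backups go under C:\backup (both paths are placeholders):
robocopy C:\data "C:\backup\$(Get-Date -Format yyyyMMdd)" ACTIVE.DB /R:3 /W:5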

You can customize your filename by embedding a formatted [datetime]::now in the file name in PowerShell like so:
xcopy /W /Y ACTIVE.DB "ACTIVE.DB.BACKUP.$([datetime]::now.ToString('yyyy-MM-dd'))"
If the line feels busy and unmaintainable, you can refactor it to multiple lines:
$now = [datetime]::now.ToString('yyyy-MM-dd')
xcopy /W /Y ACTIVE.DB "ACTIVE.DB.BACKUP.$now"
To get double-click execution, I usually make a batch file that runs the PowerShell command as described here:
Set up PowerShell Script for Automatic Execution
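For instance, a minimal wrapper batch file might look like this (a sketch; backup.ps1 is a hypothetical name for a script containing the commands above, saved next to the batch file):
@echo off
rem %~dp0 expands to the folder this batch file lives in
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "%~dp0backup.ps1"
pause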

I just made a Daily/Weekly/Monthly/Quarterly/Yearly backup script in PowerShell this month, and I hope it helps.
This DWMQY backup scenario zips a source folder to a date-named zip file, then keeps the zips of:
last 7 days
4 weeks (each Friday's zip)
6 months (the last Friday of each month)
4 quarters (the last month of each quarter)
2 years (the last quarter of each year).
Running as a scheduled task on a daily basis, it puts the zips into a target folder that is also a local Microsoft OneDrive folder, so the zips are also synced up to the OneDrive server. Outdated zips (non-Friday dailies, or zips that are no longer the last of their DWMQY period) are moved to a folder that is not remotely synced.
It's March 5, 2016 today, so the following zips should be in the target folder:
7 days: 160304 back through 160227
4 weeks: 160304, 160226, 160219, 160212
6 months: 160226, 160129, 151225, 151127, 151030, 150925
4 quarters: 151225, 150925, 150626, 150327
2 years: 151225, 141226
So there will be 23 zips (actually fewer, since some dates overlap across the DWMQY sets). Our files are 250 text documents, about 0.4 GB after zipping, so the total is 23 * 0.4 = 9.2 GB, which is under OneDrive's free 15 GB quota.
For larger source data, 7-Zip can be used, which supports archives up to 16 million TB. I haven't tried backing up folders directly instead of zipping them; I'd guess the procedure transfers from the current zip-based approach.
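As a sketch of that 7-Zip variant (assuming 7z.exe in its default install location, and reusing the $TempStorage and $zipFilename variables defined in the script below):
$sevenZip = 'C:\Program Files\7-Zip\7z.exe' # assumption: default install path
& $sevenZip a -tzip $zipFilename "$TempStorage\*"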
# Note: the script uses the following paths:
# 1. source path: the folder to be backed up.
# 2. target path: where the current zips are stored; also the local half of the remote-sync pair.
# 3. moved-to path: outdated zips are moved to this non-synced location.
# 4. temp path: the source files are copied here first, so zipping doesn't fail when the originals are locked by another process.
# Function declaration
. C:\Source\zipSaveDated\Functions.ps1
# <1> Zip data
$sourcePath = '\\remoteMachine1\c$\SourceDocs\*'
$TempStorage = 'C:\Source\TempStorage'
$enddate = (Get-Date).tostring("yyyyMMdd")
$zipFilename = '\\remoteMachine2\d$\DailyBackupRemote\OneDrive\DailyBackupRemote_OneDrive\' + $enddate + '_CompanyDoc.zip'
Remove-Item ($TempStorage + '\*') -recurse -Force
Copy-Item $sourcePath $TempStorage -recurse -Force
Add-Type -AssemblyName System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::CreateFromDirectory($TempStorage, $zipFilename)
# <2> Move old files
$SourceDir = "\\remoteMachine2\d$\DailyBackupRemote\OneDrive\DailyBackupRemote_OneDrive"
$DestinationDir = "\\remoteMachine2\d$\DailyBackupRemote\bak" # to store files moved out of the working folder (OneDrive)
$KeepDays = 7
$KeepWeeks = 4
$KeepMonths = 6
$KeepQuarters = 4
$KeepYears = 2
# <2.1>: Loop files
$Directory = $DestinationDir
if (!(Test-Path $Directory))
{
New-Item $directory -type directory -Force
}
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# daily removal will not remove weekly copy, 7
If($file.LastWriteTime -lt (Get-Date).adddays(-$KeepDays).date `
-and $file.LastWriteTime.DayOfWeek -NotMatch "Friday" `
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# weekly removal will not remove monthly copy, 4
If($file.LastWriteTime -lt (Get-Date).adddays(-$KeepWeeks * 7).date `
-and (Get-LastFridayOfMonth ($file.LastWriteTime)).Date.ToString("yyyyMMdd") -NotMatch $file.LastWriteTime.Date.ToString("yyyyMMdd")
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# monthly removal will not remove quarterly copy, 6
If($file.LastWriteTime.Month -lt ((Get-Date).Year - $file.LastWriteTime.Year) * 12 + (Get-Date).Month - $KeepMonths `
-and $file.LastWriteTime.Month -NotIn 3, 6, 9, 12
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# quarterly removal will not remove yearly copy, 4
If($file.LastWriteTime.Month -lt ( (Get-Date).Year - $file.LastWriteTime.Year) * 12 + (Get-Date).Month - $KeepQuarters * 3 `
-and $file.LastWriteTime.Month -NotIn 12
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# yearly removal will just go straight ahead. 2
If($file.LastWriteTime.Year -lt (Get-Date).Year - $KeepYears )
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
<Functions.ps1>
function Get-TimesResult3
{
Param ([int]$a,[int]$b)
$c = $a * $b
Write-Output $c
}
function Get-Weekday {
param(
$Month = $(Get-Date -format 'MM'),
$Year = $(Get-Date -format 'yyyy'),
$Days = 1..5
)
$MaxDays = [System.DateTime]::DaysInMonth($Year, $Month)
1..$MaxDays | ForEach-Object {
Get-Date -day $_ -Month $Month -Year $Year |
Where-Object { $Days -contains $_.DayOfWeek }
}
}
function Get-LastFridayOfMonth([DateTime] $d) {
$lastDay = new-object DateTime($d.Year, $d.Month, [DateTime]::DaysInMonth($d.Year, $d.Month))
$diff = ([int] [DayOfWeek]::Friday) - ([int] $lastDay.DayOfWeek)
if ($diff -gt 0) { # strictly greater: if the last day is itself a Friday ($diff -eq 0), the else branch returns it unchanged
return $lastDay.AddDays(- (7-$diff))
}
else {
return $lastDay.AddDays($diff)
}
}
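A quick sanity check of the corrected Get-LastFridayOfMonth (March 31, 2016 is a Thursday, which exercises the $diff -gt 0 branch):
Get-LastFridayOfMonth (Get-Date '2016-03-05') # returns Friday, March 25, 2016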

Related

Move multiple files from folder A to folder B

We need to move .csv files from the folder where they are stored down to an external server using PowerShell.
This is what I've tried so far, but for some reason I only get the "Not copying" message and the names of the files:
$DestinationFolder = "C:\d1\"
$SourceFolder = "C:\s1\"
If (-not (Test-Path $DestinationFolder) ) {
New-Item -ItemType Directory -Force -Path $DestinationFolder
}
$EarliestModifiedTime = (Get-Date).AddMinutes(200).Date # current time + 200 minutes, then .Date truncates to midnight
$LatestModifiedTime = (Get-Date).Date
echo($DestinationFolder); # sanity check: print the destination folder
Get-ChildItem "C:\s1\*.*" |
ForEach-Object {
if ( ($_.CreationTime -ge $EarliestModifiedTime) -and ($_.CreationTime -lt $LatestModifiedTime) ){ # i.e., all day yesterday
Copy-Item $_ -Destination $DestinationFolder -Force
Write-Host "Copied $_" }
else {
Write-Host "Not copying $_"
}
}
Does it work if you simplify it and just run a test (e.g. pick one file rather than trying to run for a delta of every 30 minutes / the last day)? I'm just thinking that first you need to see whether the problem is with accessing (or the formatting of) your source/destination directories, or with the delta logic itself. Perhaps some more error handling would help.
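For reference, here is a minimal sketch of the "all day yesterday" window the code seems to be aiming for; note that .Date truncates to midnight, which is why the original bounds can never match anything:
$LatestModifiedTime = (Get-Date).Date # midnight today
$EarliestModifiedTime = $LatestModifiedTime.AddDays(-1) # midnight yesterday
Get-ChildItem "C:\s1\*.*" |
    Where-Object { $_.CreationTime -ge $EarliestModifiedTime -and $_.CreationTime -lt $LatestModifiedTime } |
    Copy-Item -Destination $DestinationFolder -Force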

PowerShell create a duplicate folder with zero-size files

I want to create a zero-filesize mirror image of a set of folders, and while robocopy is really good, it doesn't save all of the information that I would like:
robocopy D:\documents E:\backups\documents_$(Get-Date -format "yyyyMMdd_HHmm")\ /mir /create
The /create switch makes each file in the duplicate folder zero-size, and that is good, but I would also like each file in the duplicate folder to have [size] appended to the end of its name (in KB, MB, or GB), and the create/last-modified times on every file to exactly match the original. This way I will have a zero-size duplicate of the folder that I can archive, but which contains all of the relevant information for the files in that directory: the size of each, and the exact create/last-modified times.
Are there good / simple ways to iterate through a tree in PowerShell, and for each item create a zero size file with all relevant information like this?
This would be one way to implement the copy using the approach I mentioned in the comments, and it should give you something to pull ideas from. I didn't intend to spend as much time on it as I did, but I ran it on several directories and debugged each problem I encountered, so it's a pretty solid example at this point.
function Copy-FolderZeroSizeFiles {
[CmdletBinding()]
param( [Parameter(Mandatory)] [string] $FolderPath,
[Parameter(Mandatory)] [string] $DestinationPath )
$dest = New-Item $DestinationPath -Type Directory -Force
Push-Location -LiteralPath $FolderPath
try {
foreach ($item in Get-ChildItem '.' -Recurse) {
$relPath = Resolve-Path -LiteralPath $item -Relative
$type = if ($item.Attributes -match 'Directory')
{ 'Directory' }
else { 'File' }
$destItem = New-Item "$dest\$relPath" -Type $type -Force
$destItem.Attributes = $item.Attributes
$destItem.LastWriteTime = $item.LastWriteTime
}
} finally {
Pop-Location
}
}
Note: the above implementation is simplistic and represents anything that isn't a directory as a file. That means symbolic links et al. will appear as plain files with no record of what they link to.
Here's a function to convert a number of bytes to N.N B/K/M/G format. To get more decimal places, just add 0's to the end of the format strings.
function ConvertTo-FriendlySize($NumBytes) {
switch ($NumBytes) {
{$_ -lt 1024} { "{0,7:0.0}B" -f ($NumBytes) ; break }
{$_ -lt 1048576} { "{0,7:0.0}K" -f ($NumBytes / 1024) ; break }
{$_ -lt 1073741824} { "{0,7:0.0}M" -f ($NumBytes / 1048576) ; break }
default { "{0,7:0.0}G" -f ($NumBytes / 1073741824); break }
}
}
Often, people get these conversions wrong. For instance, it's a common error to use 1024 * 1000 to get Megabytes (which is mixing the base10 value for 1K with the base2 value for 1K) and follow that same logic to get GB and TB.
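Incidentally, PowerShell has built-in binary multiplier suffixes that evaluate to exactly the magic numbers used above, which makes it harder to get the constants wrong:
1KB # 1024
1MB # 1048576
1GB # 1073741824
"{0:0.0}M" -f (123456789 / 1MB) # 117.7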
Here is what I came up with, incorporating the additional parts from the question; change $src / $dst as required (D:\VMs is where I keep a lot of virtual machines). I have included setting all of CreationTime, LastWriteTime, and LastAccessTime, so that the backup location with zero-size files is a perfect representation of the source. Since I want to use this for archival purposes, I have finally zipped things up and included a date-time stamp in the zip file name.
# Copy-FolderZeroSizeFiles
$src = "D:\VMs"
$dst = "D:\VMs-Backup"
function ConvertTo-FriendlySize($NumBytes) {
switch ($NumBytes) {
{$_ -lt 1024} { "{0:0.0}B" -f ($NumBytes) ; break } # Change {0: to {0,7: to align to 7 characters
{$_ -lt 1048576} { "{0:0.0}K" -f ($NumBytes / 1024) ; break }
{$_ -lt 1073741824} { "{0:0.0}M" -f ($NumBytes / 1048576) ; break }
default { "{0:0.0}G" -f ($NumBytes / 1073741824); break }
}
}
function Copy-FolderZeroSizeFiles($FolderPath, $DestinationPath) {
Push-Location $FolderPath
if (!(Test-Path $DestinationPath)) { New-Item $DestinationPath -Type Directory }
foreach ($item in Get-ChildItem $FolderPath -Recurse -Force) {
$relPath = Resolve-Path $item.FullName -Relative
if ($item.Attributes -match 'Directory') {
$new = New-Item "$DestinationPath\$relPath" -ItemType Directory -Force -ErrorAction SilentlyContinue
}
else {
$fileBaseName = [System.IO.Path]::GetFileNameWithoutExtension($item.Name)
$fileExt = [System.IO.Path]::GetExtension($item.Name)
$fileSize = ConvertTo-FriendlySize($item.Length)
$new = New-Item "$DestinationPath\$(Split-Path $relPath)\$fileBaseName ($fileSize)$fileExt" -ItemType File
}
"$($new.Name) : creation $($item.CreationTime), lastwrite $($item.LastWriteTime), lastaccess $($item.LastAccessTime)"
$new.CreationTime = $item.CreationTime
$new.LastWriteTime = $item.LastWriteTime
$new.LastAccessTime = $item.LastAccessTime
$new.Attributes = $item.Attributes # Must set this after the creation/write/access times, or read-only files throw an error
}
Pop-Location
}
Copy-FolderZeroSizeFiles $src $dst
$dateTime = Get-Date -Format "yyyyMMdd_HHmm"
$zipName = "$([System.IO.Path]::GetPathRoot($dst))\$([System.IO.Path]::GetFileName($dst))_$dateTime.zip"
Add-Type -AssemblyName System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::CreateFromDirectory($dst, $zipName)

Powershell script to execute different actions on multiple servers based on their ending domain

I've been trying for a while to write a script that executes a few actions on ALL hosts given in text form (coming from a .txt or .csv). Demo:
C:\hosts.txt:
- Machine4.Int.ecom.domain
- Machine3.emea.domain.com
- Machine1.production.domain.com
- Machine2.quality.domain.com
The task of the script, ideally built around Get-Date:
Clear-Host
Set-ExecutionPolicy RemoteSigned
$Now = Get-Date
$Days = @("90","30","14","7") ## THIS VALUE SHOULD DEPEND ON THE DOMAIN IN THE HOSTNAME
$TargetFolder = @("C:\Windows\SoftwareDistribution\Downloads","C:\Windows\Temp","C:\Windows\CCMCache")
$Extension = @("*.vhk*","*.txt*")
$LastWrite = $Now.AddDays(-$Days)
$Files = Get-ChildItem $TargetFolder -Include $Extension -Recurse | Where {$_.LastWriteTime -le "$LastWrite"}
For example, let's say I have the following hostnames:
Machine1.production.domain.com
Machine2.quality.domain.com
Machine3.emea.domain.com
Machine4.Int.ecom.domain
Now, based on the domains ".production" or ".quality" or ".emea" or ".int" I would like to perform the following actions.
In Production servers -> Delete files older than 90 days.
In Quality servers -> Delete files older than 30 days.
In emea servers ->Delete files older than 14 days.
In INT servers -> Delete files older than 7 days.
After deleting, it would also save the file paths in a CSV file, so that I can double-check and restore them if necessary.
Can you please help me with this?
Thanks in advance.
If you are using a text file containing the servers, you could use a switch statement inside the loop where you go through the servers, and set the cleanup reference date there:
$servers = Get-Content -Path 'C:\hosts.txt'
$targetFolders = "C:\Windows\SoftwareDistribution\Downloads","C:\Windows\Temp","C:\Windows\CCMCache"
$extensions = "*.vhk","*.txt"
$today = (Get-Date).Date
foreach ($machine in $servers) {
# determine the reference date by the machine's name
$refDate = switch -Regex ($machine) {
'\.production\.' { $today.AddDays(-90); break }
'\.quality\.' { $today.AddDays(-30); break }
'\.emea\.' { $today.AddDays(-14); break }
'\.int\.' { $today.AddDays(-7) }
}
Invoke-Command -ComputerName $machine -ScriptBlock {
$Files = Get-ChildItem -Path $using:targetFolders -File -Include $using:extensions -Recurse |
Where-Object {$_.LastWriteTime -le $using:refDate}
# do your clean-up here on the files you have gathered
# maybe write a log first or simply delete these files?
}
}
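To also cover the CSV requirement from the question, the placeholder in the script block could be filled in along these lines (a sketch; C:\Temp\cleanup_log.csv is an assumed path, and Export-Csv -Append needs PowerShell 3.0+):
$Files | Select-Object FullName, LastWriteTime, Length |
    Export-Csv -Path 'C:\Temp\cleanup_log.csv' -NoTypeInformation -Append # assumed log path
$Files | Remove-Item -Force
It may also be worth giving the switch a default branch, since a hostname that matches none of the patterns currently leaves $refDate as $null.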

Powershell to Split huge folder in multiple folders

I have a folder that contains many huge files. I want to split these files into 3 folders. The requirement is to get the count of files in the main folder and then equally split those files into 3 child folders.
Example: the main folder has 100 files. When I run the PowerShell script, 3 child folders should be created with 33, 33 and 34 files respectively.
How can we do this using PowerShell?
I've tried the following:
$FileCount = (Get-ChildItem C:\MainFolder).count
Get-ChildItem C:\MainFolder -r | Foreach -Begin {$i = $j = 0} -Process {
if ($i++ % $FileCount -eq 0) {
$dest = "C:\Child$j"
md $dest
$j++
}
Move-Item $_ $dest
}
Here is another solution. This one accounts for the subfolders not existing.
# Number of groups to support
$groupCount = 3
$path = "D:\temp\testing"
$files = Get-ChildItem $path -File
For($fileIndex = 0; $fileIndex -lt $files.Count; $fileIndex++){
$targetIndex = $fileIndex % $groupCount
$targetPath = Join-Path $path $targetIndex
If(!(Test-Path $targetPath -PathType Container)){[void](new-item -Path $path -name $targetIndex -Type Directory)}
$files[$fileIndex] | Move-Item -Destination $targetPath -Force
}
If you need to split the files into a different number of groups, use a $groupCount higher than 3. You could also work in logic with a switch that changes $groupCount to something else, for example if the count is greater than 500.
Loop through the files one by one. Using $fileIndex as a tracker, we determine the folder (0, 1, or 2 in my case) that the file will be put into. Then, using that value, we check to be sure the target folder exists. Yes, this logic could easily be placed outside the loop, but if you have file and folder changes while the script is running, you could argue it is more resilient this way.
Ensure the folder exists; if not, make it. Then move that one item. Using the modulus operator, like in the other answers, we don't have to worry about how many files there are. Let PowerShell do the math.
This is super quick and dirty, but it does the job.
#Get the collection of files
$files = get-childitem "c:\MainFolder"
#initialize a counter to 0 or 1 depending on if there is a
#remainder after dividing the number of files by 3.
if($files.count % 3 -eq 0){
$counter = 0
} else {
$counter = 1
}
#Iterate through the files
Foreach($file in $files){
#Determine which subdirectory to put the file in
If($counter -lt $files.count / 3){
$d = "Dir1"
} elseif($counter -ge $files.count / 3 * 2){
$d = "Dir3"
} else {
$d = "Dir2"
}
#Create the subdirectory if it doesn't exist
#You could just create the three subdirectories
#before the loop starts and skip this
if(-Not (test-path c:\Child\$d)){
md c:\Child\$d
}
#Move the file and increment the counter
move-item $file.FullName -Destination c:\Child\$d
$counter ++
}
I think it's possible to do this without doing the counting and allocating yourself. This solution:
Lists all the files
Adds a counter property which cycles 0,1,2,0,1,2,0,1,2 to each file
groups them into buckets based on the counter
moves each bucket in one command
There's scope for rewriting it in a lot of ways to make it nicer, but this saves doing the math, handles uneven allocations, avoids iterating over the files and moving them one at a time, and would easily adjust to different numbers of groups.
$files = (gci -recurse).FullName
$buckets = $files | % { $_ | Add-Member NoteProperty "B" ($i++ % 3) -PassThru } | group B
$buckets.Name | % {
    md "c:\temp$_"
    Move-Item $buckets[$_].Group "c:\temp$_" # group names "0","1","2" coerce to array indexes here
}

Non-Terminating Exception in Folder Deletion Script

I've written a Powershell script that would periodically delete folders on my machine.
The algorithm is as follows:
Drill down into each directory structure to the lowest subfolders
Check the creation date of the subfolder
If it's 14 days old, or older, delete it
LOG EVERYTHING (not part of the algorithm, just good practice)
When running, it operates exactly as expected...
... Except it throws the following, non-terminating exception:
Get-ChildItem : Could not find a part of the path 'C:\foo\baz'.
At C:\src\CoreDev\Trunk\Tools\BuildClean script\buildclean.ps1:55 char:15
+ Get-ChildItem <<<< -recurse -force |
+ CategoryInfo          : ReadError: (C:\foo\baz:String) [Get-ChildItem], DirectoryNotFoundException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Why is this happening? More importantly, how can I remove it, and will it cause an issue?
The script is as follows:
# folderclean.ps1
# This script will remove each leaf node of a directory, provided that leaf is over
# 14 days old.
# CONSTANT DECLARATIONS
# testing (run on my local machine)
$proj_loc = "C:\foo", "C:\bar"
$logpath = "C:\Logs\BuildClean\$(Get-Date -format yyyyMMdd).log"
function Write-ToLogFile {
param ([string]$stringToWrite)
Add-Content $logpath -value $stringToWrite
}
# Function to check if a folder is a leaf folder.
# First, retrieve the directory $item is pointing to
# Then, create a list of children of $item that are folders
# If this list is either empty or null, return $true
# Otherwise, return $false
function Folder-IsLeaf($item) {
$ary = Get-ChildItem $item -force | ?{ $_.PSIsContainer }
if (($ary.length) -eq 0 -or $ary -eq $null) {
return $true
}
return $false
}
# Deletes leaf folders that are older than a certain threshhold.
# Get a list of children of the folder, where each child is a folder itself and
# was created over 14 days ago and the folder is a leaf
# For each of these children, delete them and increment $folderCount
# Get a list of children of the folder, where each child is a folder itself and
# was last modified over 14 days ago and the folder is a leaf
# For each of these children, delete them and increment $folderCount
function Remove-LeafFolders($path) {
$createdCount = 0
$modifiedCount = 0
Write-ToLogFile "Operation started at $(Get-Date -format "dd/MM/yyyy hh:mm:ss.fff")"
Write-ToLogFile "Looking in $proj_loc"
Write-ToLogFile ""
$start = $(Get-Date)
$proj_loc |
Get-ChildItem -recurse -force |
?{
$_.PSIsContainer -and ($_.CreationTime).AddDays(15) -lt $(Get-Date) -and $(Folder-IsLeaf $_.FullName) -eq $true
} | %{
$formattedDate = $($_.CreationTime).ToString("dd/MM/yyyy hh:mm:ss");
Write-ToLogFile "Folder $($_.FullName) is being removed; created: $formattedDate"
Remove-Item $_.FullName -recurse;
$createdCount += 1
}
$end = $(Get-Date)
$elapsed = $end - $start
Write-ToLogFile "Operation completed at $(Get-Date -format "dd/MM/yyyy hh:mm:ss.fff")."
Write-ToLogFile "Folders removed: $createdCount"
Write-ToLogFile "Time elapsed: $(($elapsed).TotalMilliseconds) ms"
Write-ToLogFile "-------------------------------"
}
Remove-LeafFolders($proj_loc)
I found this other StackOverflow question and, after looking through the answer, realised that the problem was the pipeline: Remove-Item was deleting folders while Get-ChildItem was still enumerating them, so the enumeration hit paths that no longer existed. So I changed my code as follows:
...
$leafList = $proj_loc |
Get-ChildItem -recurse -force |
?{
$_.PSIsContainer -and ($_.CreationTime).AddDays(15) -lt $(Get-Date) -and $(Folder-IsLeaf $_.FullName) -eq $true
}
Foreach ($folder in $leafList)
{
$formattedDate = $($folder.CreationTime).ToString("dd/MM/yyyy hh:mm:ss");
Write-ToLogFile "Folder $($folder.FullName) is being removed; created: $formattedDate"
Remove-Item $folder.FullName -recurse;
$createdCount += 1
}
...
I created a few local folders and screwed around with them. No exceptions cropped up, so this appears to have worked:
Operation started at 10/12/2012 05:16:18.631
Looking in C:\foo C:\bar
Folder C:\foo\baz is being removed; created: 09/01/2010 02:00:00
Folder C:\bar\baz3\recursion is being removed; created: 01/01/2008 01:00:00
Operation completed at 10/12/2012 05:16:18.748.
Folders removed: 2
Time elapsed: 33.0033 ms
-------------------------------
Operation started at 10/12/2012 05:41:59.246
Looking in C:\foo C:\bar
Folder C:\foo\baz2\NewFolder is being removed; created: 10/10/2010 10:10:10
Folder C:\bar\baz3\barbar is being removed; created: 20/11/2012 05:37:38
Operation completed at 10/12/2012 05:41:59.279.
Folders removed: 2
Time elapsed: 21.0021 ms
-------------------------------
I think it would be easier to accomplish this sort of recursive deletion using a bottom-up approach rather than the top-down order Get-ChildItem gives you by default. Try this instead:
$proj_loc |
Get-ChildItem -recurse -force -Name | sort -desc | Get-Item |
? { ....
}
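Filled out, that bottom-up version might look like the following. This is a sketch, not tested against the original setup: it keeps the 15-day creation-time rule and the existing helper functions, and uses -Directory, which assumes PowerShell 3.0 or later:
$proj_loc | ForEach-Object {
    Get-ChildItem $_ -Recurse -Force -Directory |
        Sort-Object FullName -Descending | # deeper paths sort first, so children are processed before parents
        Where-Object { $_.CreationTime.AddDays(15) -lt (Get-Date) -and (Folder-IsLeaf $_.FullName) } |
        ForEach-Object {
            Write-ToLogFile "Folder $($_.FullName) is being removed; created: $($_.CreationTime)"
            Remove-Item $_.FullName -Recurse
        }
}
Because Sort-Object has to collect every directory before releasing any of them, the deletion no longer races the enumeration, which is what caused the original exception.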