I have been working on a script of late and have come across a snag. I am in the process of removing folders which are automatically created. I want to delete the older versions of those folders whilst keeping the newest one untouched, for example:
18.212.1021.0008 //Created on the 19/11/2018 12:12
18.212.1021.0008_1 //Created on the 19/11/2018 12:23
18.212.1021.0008_2 //Created on the 19/11/2018 12:27
18.212.1021.0008_3 //Created on the 19/11/2018 12:32
I would want to keep 18.212.1021.0008_3, so I guess I would need to keep the folder with the most recent creation date.
Please see the code below:
$Versionarray = 13..20
Get-ChildItem "$env:LOCALAPPDATA\Microsoft\OneDrive" -Recurse | Where-Object {
    # Recursively deletes OneDrive version folders within
    # AppData\Local, which build up every time OneDrive
    # is installed / the script is run
    $item = $_
    $item -is [System.IO.DirectoryInfo] -and (
        $Versionarray | Where-Object { $item.Name.Contains("$_") }
    )
} | Remove-Item -Recurse -Confirm:$false -ErrorAction SilentlyContinue
If the newest folder you want to keep is also the one with the newest creation time, you can use this simple one-liner:
Get-ChildItem "$env:LOCALAPPDATA\Microsoft\OneDrive" -Directory | sort CreationTime | select -SkipLast 1 | Remove-Item -Recurse -Force
If you want to filter only a specific type of folder by name, you could use a simple regex match. I cannot give you the exact regex (since I would have to know your folder naming pattern), but it would look something like this:
Get-ChildItem "$env:LOCALAPPDATA\Microsoft\OneDrive" -Directory | where Name -match '\d\d+' | sort CreationTime | select -SkipLast 1 | Remove-Item -Recurse -Force
(Note that this syntax might not work if you use an old PowerShell version. If that's the case, let me know and I will provide a compatible fallback solution.)
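For reference, a fallback sketch for older versions (Select-Object -SkipLast needs PowerShell 5.0 and the simplified Where-Object syntax needs 3.0); the path and the placeholder regex are carried over from above:

$dirs = @(Get-ChildItem "$env:LOCALAPPDATA\Microsoft\OneDrive" |
    Where-Object { $_.PSIsContainer -and $_.Name -match '\d\d+' } |
    Sort-Object CreationTime)
if ($dirs.Count -gt 1) {
    # everything except the last (newest) element
    $dirs[0..($dirs.Count - 2)] | Remove-Item -Recurse -Force
}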
UPDATE
In response to your comment: Your requirements are still a bit unclear, but here is something to get you started:
If you want to make sure to only delete folders that "look like" version folders, you can adjust the regex in the where-filter. _\d+$ will match anything with an underscore and numbers at the end:
where { $_.Name -match '_\d+$' }
If you also want to make sure, that this is actually a versioned copy of another existing folder, you could check that too:
where { $_.FullName -match '^(?<OriginalPath>.+)_\d+$' -and (Test-Path $Matches.OriginalPath) }
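Putting the pieces together, a sketch of the full cleanup under the assumptions above (the _\d+$ pattern and the OneDrive path); keep -WhatIf until the output looks right:

Get-ChildItem "$env:LOCALAPPDATA\Microsoft\OneDrive" -Directory |
    Where-Object { $_.Name -match '_\d+$' } |
    Sort-Object CreationTime |
    Select-Object -SkipLast 1 |
    Remove-Item -Recurse -Force -WhatIf # drop -WhatIf to actually delete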
Extreme PowerShell newbie here, so please be gentle...
I have a filing system where files in folders are generated semi-automatically, with multiple versions being kept as redundancy (we really do revert regularly).
Files within the folder are named with the first 13 characters as the identifier, with various dates or initials afterwards.
12345-A-01-01_XYZ_20191026.pdf
i.e. the file is 12345-A-01-01 and everything past the first 13 characters is "unpredictable"
FILE000000001xxxxxxx.pdf
FILE000000001yyyy.pdf
FILE000000001zzzzzz.pdf
FILE000000002xxxx.pdf
FILE000000002yyy.pdf
FILE000000002zz.pdf
FILE000000003xx.pdf
FILE000000003yyy.pdf
FILE000000003zzzz.pdf
I'm trying to write a script that can determine the newest version (by date modified file property) of each file "group"
i.e. the newest FILE000000001*.pdf etc
and slide all the others into the .\Superseded subfolder
All I've managed to get so far is a "list" sorting to show the newest at the top of "each" group... now I need to know how to keep that file and move the others. Any direction or help would be great, thanks :)
$_SourcePath = "C:\testfiles"
$_DestinationPath = "C:\testfiles\Superseded"
Get-ChildItem $_SourcePath |
    Where-Object {-not $_.PSIsContainer} |
    Group-Object { $_.Basename.Substring(0,12) } |
    foreach {
        $_.Group |
            sort LastWriteTime -Descending
    } | Move-Item -Destination $_DestinationPath
I think you are pretty close. Since you sorted in descending order, you should just skip the first file:
$SourcePath = "C:\testfiles"
$DestinationPath = "C:\testfiles\Superseded"
Get-ChildItem $SourcePath -File |
    Group-Object { $_.BaseName.Substring(0,13) } |
    ForEach-Object {
        $_.Group |
            Sort-Object LastWriteTime -Descending |
            Select-Object -Skip 1 |
            Move-Item -Destination $DestinationPath -WhatIf
        # Note: the move has to happen inside each iteration of the loop
        # so that we skip the first (newest) file of each group.
    }
You don't need Where-Object {-not $_.PSIsContainer}; use the -File parameter instead. Also note Substring(0,13) rather than your Substring(0,12): the identifier is the first 13 characters, and grouping on only 12 would merge different file groups.
Also, I wouldn't name your variables $_***; that's bound to get confused with the pipeline variable $_.
I added -WhatIf to the move command so you can test without causing any damage ...
I didn't test it, but it looks about right.
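If you want to double-check the grouping before the real run, here is a quick preview sketch (same 13-character assumption) that moves nothing:

Get-ChildItem $SourcePath -File |
    Group-Object { $_.BaseName.Substring(0,13) } |
    ForEach-Object {
        $newest = $_.Group | Sort-Object LastWriteTime -Descending | Select-Object -First 1
        "{0}: {1} file(s), keeping {2}" -f $_.Name, $_.Count, $newest.Name
    }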
I have a folder that contains a lot of files, multiple files per day.
I would like to script something that deletes all but the latest file per day.
I have seen a lot of scripts that delete files over X days old, but this is slightly different, and having written no PowerShell before yesterday (I'm exclusively T-SQL), I'm not really sure how to go about it.
I'm not asking anyone to write the code for me, but maybe describing the methods of achieving this would be good, and I can go off and research how to put it into practice.
All files are in a single directory, no subfolders. There are files I don't want to delete; the files I want to delete have file names in the format constant_filename_prefix_YYYYMMDDHHMMSS.zip
Is PowerShell the right tool? Should I instead be looking at Python (which I also don't know)? PowerShell is more convenient since other code we have is written in PS.
PowerShell has easy to use cmdlets for this kind of thing.
The question to me is whether you want to use the dates in the file names, or the actual LastWriteTime dates of the files themselves (as shown in File Explorer).
Below are two ways of handling this. I've put in a lot of code comments to help you get the picture.
If you want to remove the files based on their actual last write times:
$sourceFolder = 'D:\test' # put the path to the folder where your files are here
$filePrefix = 'constant_filename_prefix'
Get-ChildItem -Path $sourceFolder -Filter "$filePrefix*.zip" -File | # get files that start with the prefix and have the extension '.zip'
Where-Object { $_.BaseName -match '_\d{14}$' } | # that end with an underscore followed by 14 digits
Sort-Object -Property LastWriteTime -Descending | # sort on the LastWriteTime property
Select-Object -Skip 1 | # select them all except for the first (most recent) one
Remove-Item -Force -WhatIf # delete these files
OR
If you want to remove the files based on the dates in the file names:
Because the date format you used is sortable, you can safely sort on the last 14 digits of the file BaseName:
$sourceFolder = 'D:\test'
$filePrefix = 'constant_filename_prefix'
Get-ChildItem -Path $sourceFolder -Filter "$filePrefix*.zip" -File | # get files that start with the prefix and have the extension '.zip'
Where-Object { $_.BaseName -match '_\d{14}$' } | # that end with an underscore followed by 14 digits
Sort-Object -Property @{Expression = {$_.BaseName.Substring($_.BaseName.Length - 14)}} -Descending | # sort on the last 14 digits descending
Select-Object -Skip 1 | # select them all except for the first (most recent) one
Remove-Item -Force -WhatIf # delete these files
In both alternatives you will find a -WhatIf switch at the end of the Remove-Item cmdlet. This is for testing the code: no files will actually be deleted. Instead, with this switch, the code writes out in the console what would happen.
Once you are satisfied with this output, you can remove or comment out the -WhatIf switch to have the code delete the files.
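With the switch in place, the console output looks roughly like this (the path shown is just an example):

What if: Performing the operation "Remove File" on target "D:\test\constant_filename_prefix_20190605130245.zip".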
Update
As I now understand, there are multiple files for several days in that folder and you want to keep the newest file for each day, deleting the others.
In that case, we have to create 'day' groups of the files and, within each group, sort by date and delete the older files.
This is where the Group-Object comes in.
Method 1) using the LastWriteTime property of the files
$sourceFolder = 'D:\test' # put the path to the folder where your files are here
$filePrefix = 'constant_filename_prefix'
Get-ChildItem -Path $sourceFolder -Filter "$filePrefix*.zip" -File | # get files that start with the prefix and have the extension '.zip'
Where-Object { $_.BaseName -match '_\d{14}$' } | # that end with an underscore followed by 14 digits
Group-Object -Property @{Expression = { $_.LastWriteTime.Date }} | # create groups based on the date part without the time part
ForEach-Object {
    $_.Group |
        Sort-Object -Property LastWriteTime -Descending | # sort on the LastWriteTime property
        Select-Object -Skip 1 | # select them all except for the first (most recent) one
        Remove-Item -Force -WhatIf # delete these files
}
Method 2) using the date taken from the file names:
$sourceFolder = 'D:\test' # put the path to the folder where your files are here
$filePrefix = 'constant_filename_prefix'
Get-ChildItem -Path $sourceFolder -Filter "$filePrefix*.zip" -File | # get files that start with the prefix and have the extension '.zip'
Where-Object { $_.BaseName -match '_\d{14}$' } | # that end with an underscore followed by 14 digits
Group-Object -Property @{Expression = { ($_.BaseName -split '_')[-1].Substring(0,8) }} | # create groups based on the date part without the time part
ForEach-Object {
    $_.Group |
        Sort-Object -Property @{Expression = {$_.BaseName.Substring($_.BaseName.Length - 14)}} -Descending | # sort on the last 14 digits descending
        Select-Object -Skip 1 | # select them all except for the first (most recent) one
        Remove-Item -Force -WhatIf # delete these files
}
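If you ever need a real [datetime] instead of string sorting (for example, to also apply an age cut-off), the 14-digit suffix can be parsed explicitly. A sketch, assuming the same yyyyMMddHHmmss format (the file path is just an example):

$file  = Get-Item 'D:\test\constant_filename_prefix_20190605130245.zip'
$stamp = ($file.BaseName -split '_')[-1]
$date  = [datetime]::ParseExact($stamp, 'yyyyMMddHHmmss', $null)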
I have a few QlikView servers with a lot of QVD files I need to back up.
The idea is to back up three generations, so let's say the app is named tesla.qvd.
Backing it up, naming it like tesla.qvd.2019-06-05, if the file was modified today.
Then it would back up a new one the next time it's modified/written to.
In total I would like to save two generations before the first one is removed.
This is for a Windows 2012 server, using PS 4.0.
#$RemotePath = "C:\qlikview Storage\privatedata\backup\"
$LocalPath = "C:\qlikview Storage\privatedata"
$nomatch = "*\backup\*"
$predetermined = [System.DateTime](Get-Date)
$date = $predetermined.AddDays(-1).ToString("MM/dd/yyyy:")

foreach ($file in (Get-ChildItem -File $LocalPath -Recurse | Where-Object { $_.FullName -notlike $nomatch }))
{
    Copy-Item -Path $file.FullName -Destination "C:\qlikview Storage\privatedata\backup\$file.$(Get-Date -Format yyyy-MM-dd)"
}
The code above would back the files up with the dates as described in the text before the code.
It's how to proceed from here that's my problem.
I tried Google and searching the forum.
I'm not asking for someone to solve the whole issue I have.
But if you can help me out with which functions / what I should look at to get my end result, it would help a lot so I can proceed.
In the picture you can see an example of how the library looks after a backup has been done. (The LastWriteTime on the files would be the same as the date in the name, though; this example was created artificially for this question.)
You can use the BaseName property of the files in the backup folder, since you add a new extension to the files. It would look something like this:
# Group by BaseName and find groups with more than 2 backups
$Groups = Get-ChildItem -Path "C:\qlikview Storage\privatedata\backup" -File |
    Group-Object BaseName |
    Where-Object { $_.Count -gt 2 }

foreach ($g in $Groups) {
    # keep the 2 newest backups of each file, delete the rest
    $g.Group | Sort-Object LastWriteTime -Descending | Select-Object -Skip 2 | Remove-Item -Force
}
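To connect this with your copy loop: a rough sketch (assuming the Name.yyyy-MM-dd naming scheme from your Copy-Item line, and a hypothetical $BackupPath variable) that only creates a backup when the file has changed since the last one:

$BackupPath = "C:\qlikview Storage\privatedata\backup"
foreach ($file in Get-ChildItem -File $LocalPath -Recurse | Where-Object { $_.FullName -notlike $nomatch }) {
    # name the backup after the file's own last-write date, and skip it if it already exists
    $dest = Join-Path $BackupPath "$($file.Name).$($file.LastWriteTime.ToString('yyyy-MM-dd'))"
    if (-not (Test-Path $dest)) {
        Copy-Item -Path $file.FullName -Destination $dest
    }
}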
I am trying to incorporate PowerShell into my everyday workflow so I can move up from a Desktop Support guy to a Systems Admin. One question I encountered when helping a coworker was how to search for a lost or forgotten file saved in an unknown directory. The pipeline I came up with was:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Out-File pdfs.txt
This code performed exactly how I wanted, but now I want to extend this command and make it more efficient, especially since my company has clients with very messy file management.
What I want to do with this pipeline:
Recursively search for a specific file type that was created in a specified time frame. Let's say the oldest file allowed in this search is a file from two days ago.
Save the output to a text file with columns containing the Name and FullName (path), sorted by creation time in descending order.
What I have so far:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Select-Object Name, FullName | Out-File pdfs.txt
I really need help with how to create a filter for the time the file was created. I think I need to use the Where-Object cmdlet right after the dir pipe and before the Select-Object pipe, but I don't know how to set that up. This is what I wrote: Where-Object {$_.CreationTime <
You're on the right track. To get the files from a specific file creation date range, you can pipe the dir command results to:
Where-Object {$_.CreationTime -ge "06/20/2017" -and $_.CreationTime -le "06/22/2017"}
If you want something more repeatable, where you don't have to hard-code the dates every time and just want to search for files from up to 2 days ago, you can set variables:
$today = (Get-Date)
$daysago = (Get-Date).AddDays(-2)
then plug in the variables:
Where-Object {$_.CreationTime -ge $daysago -and $_.CreationTime -le $today}
I'm not near my Windows PC to test this but I think it should work!
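A sketch that puts it together with the original pipeline, including the requested columns and descending sort (the output file name pdfs.txt is carried over from the question):

$daysago = (Get-Date).AddDays(-2)
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force |
    Where-Object { $_.CreationTime -ge $daysago } |
    Sort-Object CreationTime -Descending |
    Select-Object Name, FullName, CreationTime |
    Out-File pdfs.txt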
See if this helps
dir c:\ -Recurse -Filter *.ps1 -ErrorAction SilentlyContinue -Force | Where-Object {$_.LastWriteTime -ge [DateTime]::Now.AddDays(-2)} | select LastWriteTime, Name | Out-File Temp.txt
I have a situation where I have 3000 vendors in a folder structure. Each vendor then has folders for each year (2001, ..., 2014) and other folders as well. Is there a way to list all the files that are in the latest year (whichever year that is)?
Basically, I need to upload all the latest agreement files from file-share to SharePoint.
One Liner
Get-ChildItem -Directory | ForEach-Object { Get-ChildItem -Path $_.FullName -Directory | Where-Object { $_.Name -match '^[1-9][0-9]{3}$' } | Sort-Object Name -Descending | Select-Object -First 1 | Get-ChildItem }
You start from the root folder; for each vendor folder you get all the subfolders whose names look like a year, sort them, take the first (latest) one, and list its contents.
Of course, there are plenty of issues with this, e.g. there has to be at least one year folder, no 'year' files, etc. I will leave you to tackle those kinds of problems.
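As one example of hardening it, a slightly more defensive sketch that simply skips vendors without any year folders:

Get-ChildItem -Directory | ForEach-Object {
    $latest = Get-ChildItem -Path $_.FullName -Directory |
        Where-Object { $_.Name -match '^[1-9][0-9]{3}$' } | # only four-digit 'year' folders
        Sort-Object Name -Descending |
        Select-Object -First 1
    if ($latest) { Get-ChildItem -Path $latest.FullName }
}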
First I would recursively iterate through all the directories, matching the ones that are equivalent to the current year:
$thisYearDirs = get-childitem -Directory -Recurse | where {$_.Name -eq (get-date).year}
then you would just get the files in each of those:
$thisYearDirs | get-childitem
You could also do it all in one line:
get-childitem -Directory -Recurse | where {$_.Name -eq (get-date).year} | get-childitem
Note that the -Directory switch needs PowerShell v3; you could filter out directories in earlier versions by modifying the where clause condition to do it:
get-childitem -Recurse | where {$_.PSIsContainer -and $_.Name -eq (get-date).year} | get-childitem