My need is to delete all files older than 14 days in a public folder. I have cobbled together a PowerShell script that just about does the trick as I need it. The only problem is, if the user moves a file into the folder - as opposed to copying it - my script will delete that file if it was last accessed more than 14 days ago, even if it was moved into the public folder the same day. The same thing happens with cut and paste. So this is a pretty serious problem.
Here is my script:
# Delete all files older than "file_age" days, at "path".
$path = "C:\Users\emcguire\Desktop\Test"
$file_age = "-14"
$current_date = Get-Date
$date_to_delete = $current_date.AddDays($file_age)
Get-ChildItem $path -Recurse | Where-Object { $_.LastAccessTime -lt $date_to_delete } | Remove-Item
I am pretty new to PowerShell, so I may be missing something very obvious. Is there an easy way to check for files that were moved into the folder but do not have their access timestamp changed? Is there a better way to approach this?
I appreciate any help!
The property LastAccessTime is notoriously unreliable (NTFS often doesn't update it at all, for performance reasons), so try to use LastWriteTime wherever possible first. Additionally, all those properties are stale: they reflect the file system as it was when the object was created, not when you read the property. Use this code to call the Refresh() method and guarantee you've got fresh file system info before you query the property:
$file = 'C:\somefile.txt'
$fileObj = New-Object System.IO.FileInfo $file
$fileObj.Refresh()   # re-read the file's attributes from disk
As you're wanting to base your actions on how long the file has been in the archive directory, you may want to use the CreationTime property instead. The FileInfo class also exposes LastWriteTime and LastAccessTime, in case one of those better fits your needs.
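As a minimal sketch, here is your original script adapted to CreationTime; whether that timestamp survives a move depends on how the file arrived, so test it against a few of your real files first:
# Delete all files created more than 14 days ago, at $path
$path = 'C:\Users\emcguire\Desktop\Test'
$date_to_delete = (Get-Date).AddDays(-14)
Get-ChildItem $path -Recurse -File |
    Where-Object { $_.CreationTime -lt $date_to_delete } |
    Remove-Item -WhatIf   # drop -WhatIf once the preview looks right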
I get a CSV every week that our finance team puts in a shared drive. I have a script for that CSV that I run once I get it.
The first command of the script is of course Import-Csv.
The problem is, the finance team insists on naming the file differently each time plus they don't always put it in the same location within the drive.
As a result, I have to first hunt for the file, put it into the directory that the script points to and then rename the file.
I've tried talking to the team about putting it in the same location and making sure the filename is the same but they only follow the instructions for a couple of weeks before just doing whatever.
Ideally, I'd like it so that when I run the script, a popup asks me to pick a CSV (similar to how it looks when you do "Save As" on an Office document).
Is there any way to do this within PowerShell?
You can access .Net classes and interface with the forms library to instantiate and take input from the standard FileOpen dialog. Something like below:
using namespace System.Windows.Forms

# Load the WinForms assembly so the OpenFileDialog type is available
Add-Type -AssemblyName System.Windows.Forms

$FileBrowser = [OpenFileDialog]::new()
$FileBrowser.InitialDirectory = 'C:\temp'
$FileBrowser.Filter = 'Comma Separated Values (*.csv)|*.csv'
[Void]$FileBrowser.ShowDialog()
$CsvFile = $FileBrowser.FileName
Then use $CsvFile in the Import-Csv command.
You can change the .InitialDirectory property to make navigating a little more convenient.
Use the .Filter property to limit the file open display to CSV files, to make things that much more convenient.
Also, the [Void] cast prevents the status return (usually 'OK' or 'Cancel') from echoing to the screen.
Note: A simple Google search will turn up many examples, and the OpenFileDialog documentation describes the other properties if you want to explore further.
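If you want to guard against the user cancelling the dialog, you can test the return value instead of discarding it; a small sketch:
Add-Type -AssemblyName System.Windows.Forms

$FileBrowser = [System.Windows.Forms.OpenFileDialog]::new()
$FileBrowser.Filter = 'Comma Separated Values (*.csv)|*.csv'

# ShowDialog() returns a DialogResult; only import if the user clicked Open
if ($FileBrowser.ShowDialog() -eq [System.Windows.Forms.DialogResult]::OK) {
    $data = Import-Csv -Path $FileBrowser.FileName
} else {
    Write-Warning 'No file selected.'
}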
If you are willing to settle for a selection box that doesn't look as nice as the Save As dialog, you can use Out-Gridview. Something along these lines might help.
$filenames = @(Get-ChildItem -Path C:\temp -Recurse -Filter *.csv |
    Sort-Object LastWriteTime -Descending |
    Out-GridView -Title 'Choose a file' -PassThru)
$csvfile = $filenames[0].FullName
Import-Csv $csvfile | More
The -Path specifies a directory that contains all the locations where your CSV file might be delivered. The sort just puts the recently written files at the top of the grid, which should make selection easier. The @() wrapper merely makes sure the result stored in $filenames is an array.
You would do something else with the results of Import-Csv.
Steven's response certainly satisfies your original question, but an alternative would be to let PowerShell do the work. If you know the drive and you know the name of the file this week, you can pass the name to your script and let it search the drive, filtering on the specific CSV file you need. Make it recursive, and open the only file that matches. Sorry I didn't have time yesterday to include code. Here's a function that returns the full file path when provided with a top-level search path and a filename with possible wildcards.
function gfp {
    $result = Get-ChildItem $args[0] -Recurse -Include $args[1]
    return ($result.DirectoryName + '\' + $result.Name)
}
Example: gfp "d:\rootfolder" "thisweeksfilename.csv"
So right now I have a program that automatically moves files from one folder to another, only once. If a file gets into that folder again, it shouldn't be moved. The application is executed every 30 minutes, so right now what I have is: if LastWriteTime is older than 30 minutes, don't move the file.
# Check if the file is older than 30 minutes
$olderthan = @(Get-ChildItem -Path $src\$_.pdf | ? { $_.LastWriteTime -ge $date } -ov olderthan)
if (-not $olderthan) {
    # If it's older than 30 minutes, don't move it
    $timesall = @(Get-ChildItem -Path $src\$_.pdf | Select-Object -Property BaseName)
    Write-LogRecord -Typ WARNING "'$($timesall.BaseName)' file(s) are not being moved because they're older than 30 minutes"
    $timesall = 0
} else {
    # Move the file
}
And yes it works, but are there other, better ways to do it?
Thanks in advance!
The other alternative to inspecting file attributes is to do file tracking. I'll assume that the files do not continue to live in the destination folder (otherwise you could use Test-Path to see if a file exists before moving).
To me, the most straightforward tracking system would be to create a parallel folder where you put files with the same names. Assuming the file has not been submitted before, you would copy A.txt into your destination and create an A.txt in your tracking path (which could be an empty file, or not; see below). Your test is then to see whether a file of the same name exists in your tracking folder.
Note: this method allows you to easily reprocess a file by removing it from your tracking folder. It also just works when the scheduler does not fire, for whatever reason.
If you need more complex options, like accommodating a file that has changed, you could store fingerprint information, like size and a hash, in your tracking file. Your test could then also inspect those as part of its check.
Lastly, at some point you'd probably want to groom your tracking folder. Using LastWriteTime and removing everything past, say, 1 month (or whatever is right for your circumstances) would keep your tracking folder from getting too big. You could run this every time after the transfers, or on a separate schedule. A sketch of the whole approach follows.
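Here is a minimal sketch of the tracking-folder idea; the $src, $dst and $track paths are placeholders for your own:
$src   = 'C:\Inbox'        # where new files arrive
$dst   = 'C:\Processed'    # where files get moved to
$track = 'C:\Tracking'     # parallel folder of (empty) marker files

Get-ChildItem -Path $src -Filter *.pdf | ForEach-Object {
    $marker = Join-Path $track $_.Name
    # only move files we haven't seen before
    if (-not (Test-Path $marker)) {
        Move-Item -Path $_.FullName -Destination $dst
        New-Item -Path $marker -ItemType File | Out-Null
    }
}

# Groom markers older than a month so the tracking folder stays small
Get-ChildItem -Path $track |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-1) } |
    Remove-Item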
Let's say I have 10 PDF files in a folder named c:\Temp
1440_021662_54268396_1.pdf
1440_028116_19126420_1.pdf
1440_028116_19676803_1.pdf
1440_028116_19697944_1.pdf
1440_028116_19948492_1.pdf
1440_028116_19977334_1.pdf
1440_028116_20500866_1.pdf
1440_028116_20562027_1.pdf
1440_028116_20566871_1.pdf
1440_028116_20573350_1.pdf
In my search, I know I am looking for a file that will match a specific number, for example 19676803 (I'm getting the number to search for from a SQL Query I'm running in my script)
I know how to find that specific file, but what I need to be able to do is move all the files after the searched file has been found to another pre-defined folder. So using the 10 PDFs above as the example files, I need to move all the files "after" the file named 1440_028116_19676803_1.pdf to another folder. I know how to move files using PowerShell, just do not know how to do it after/from a specific file name. Hope that makes sense.
$batchNumCompleted = 'c:\Temp\'
$lastLoanPrinted = $nameQuery.LoanNumber
$fileIndex = Get-ChildItem -path $batchNumCompleted | where {$_.name -match $lastLoanPrinted}
Can anyone provide suggestions/help on accomplishing my goal? I'm not able to provide all code written so far as it contains confidential information. Thank you.
Use the .Where() extension method in SkipUntil mode:
$allFiles = Get-ChildItem -Path $batchNumCompleted
$filesToMove = $allFiles.Where({ $_.Name -like '*19676803_1.pdf' }, 'SkipUntil') | Select-Object -Skip 1
Remove the Select-Object -Skip 1 command if you want to move the file with 19676803 in the name as well.
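From there, moving the selection is one more pipeline step; a sketch, where the destination folder name is an assumption:
$destination = 'C:\Temp\Completed'   # hypothetical target folder
$filesToMove | Move-Item -Destination $destination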
I'm still fairly new to powershell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in powershell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to first check whether a file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
What I found from looking around and googling:
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the path to the file.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information about files that were modified between two dates. I can tweak this so that "less than" (-lt) checks for files that were not modified past a certain date:
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means of checking whether or not a file has been modified past a certain date.
I saw this for changing the LastWriteTime value:
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is putting it all together. I know how to recurse through files/folders with Copy-Item or even Get-ChildItem by adding the -Recurse parameter, but I'm having difficulty wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the rootfolder of the originals and the rootfolder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
# create the full path where the copied file or folder is to be found
$copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
# test if this object can be found
if (Test-Path -Path $copy) {
$item = Get-Item -Path $copy
# test if the item has not been updated since the last transfer date
if ($item.LastWriteTime -le $lastTransferDate) {
# set the timestamp the same as the original
$item.LastWriteTime = $_.LastWriteTime
}
}
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
foreach ($item in (Get-ChildItem 'C:\Users\usernamehere\Desktop\folder123' -Recurse)) {
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach statement to traverse. $item is the current item in the loop. We want the .FullName property to get the full path of the current item, so you will use $item.FullName for the files you are going to set the date on.
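If you want each copy to take its date from its own original rather than from a single fixed file, you can combine this loop with the Join-Path/Substring technique from the previous answer; a sketch with hypothetical folder names:
$oldRoot = 'C:\Users\usernamehere\Desktop\folderabc'   # originals with correct dates
$newRoot = 'C:\Users\usernamehere\Desktop\folder123'   # copies with wrong dates

foreach ($item in (Get-ChildItem $newRoot -Recurse)) {
    # derive the original's path from the copy's relative path
    $original = Join-Path $oldRoot $item.FullName.Substring($newRoot.Length)
    if (Test-Path $original) {
        (Get-Item $item.FullName).LastWriteTime = (Get-Item $original).LastWriteTime
    }
}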
I know this question was already asked by someone but I will ask again.
Can someone tell me how to rename files in bulk, and in ascending order if possible, in CMD? I already tried renaming in PowerShell, but it only worked once: when I tried to rename the files in another folder, it wouldn't rename them. This is the code I used in PowerShell:
$i = 1
Get-ChildItem *.mkv | %{Rename-Item $_ -NewName ('Haikyuu - {0:D2}.mkv' -f $i++)}
I'm renaming my anime series per folder, and some of my series have 100+ videos. Could you also teach me what each part of the code means (the code I'd have to use in CMD)? The examples I've found either aren't explained in layman's terms or don't say how they're supposed to work. Thank you in advance. By the way, the folder is on an external drive.
So, from the beginning:
$i = the variable storing the initial value 1.
Get-ChildItem = like "dir"; it lists the files and folders under a certain path.
In this case it lists all files whose name is anything and whose extension is .mkv.
* indicates a wildcard.
| = the pipeline, which passes the output of the first command as input to the next command.
% = alias for ForEach-Object, which iterates over the objects coming from the pipeline one by one.
$_ = the current pipeline object. Here each file is taken one by one and renamed using Rename-Item.
-NewName = the parameter of Rename-Item that takes the new name; 'Haikyuu - {0:D2}.mkv' -f $i++ formats the counter padded to two digits, then increments it.
Hope it clarifies your need.
The reason why I couldn't rename my video files is that there were [brackets] in the filenames.
So I used this:
Get-ChildItem -Recurse -Include *.mkv | Rename-Item -NewName { $_.Name.replace("[","").replace("]","").replace("(","").replace(")","") }
Run from the same directory, this reaches into the subfolders too and strips the brackets and parentheses; then I proceeded with the code above in the question to rename my files in every folder. The reason I'm doing the renaming per folder is that each folder is a different anime series. The code above works.
If anyone can give me something shorter than repeating and chaining the 'replace' calls, I will gladly accept and choose that as the best answer. :)
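For what it's worth, a single regex -replace can strip all four characters at once; a sketch of the same rename:
# '[\[\]()]' is a regex character class matching [, ], ( and )
Get-ChildItem -Recurse -Include *.mkv |
    Rename-Item -NewName { $_.Name -replace '[\[\]()]', '' }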
If you use the -LiteralPath parameter for the source, no prior renaming is necessary: square brackets are wildcard characters in PowerShell, and -LiteralPath keeps them from being interpreted.
$i = 1
Get-ChildItem *.mkv |
    ForEach-Object { Rename-Item -LiteralPath $_.FullName -NewName ('Haikyuu - {0:D2}.mkv' -f $i++) }
A hint on sorting: I hope the present numbering of the source files has a constant width; otherwise the result will be mixed up, as the alphabetic sort (which is how an NTFS-formatted drive returns directory listings) will sort the number 10 in front of 2.
To check this, append the -WhatIf parameter to the Rename-Item command.
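If the numbering width does vary, one way to force a numeric sort before renaming is sketched below; it assumes the episode number is the only run of digits in each name:
$i = 1
# Strip non-digits from the base name and sort numerically so 2 comes before 10
Get-ChildItem *.mkv |
    Sort-Object { [int]($_.BaseName -replace '\D') } |
    ForEach-Object { Rename-Item -LiteralPath $_.FullName -NewName ('Haikyuu - {0:D2}.mkv' -f $i++) -WhatIf }
Remove -WhatIf once the preview looks right.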