Exporting results from PowerShell into a CSV document not working

I wrote a script in PowerShell that will pull all files in a directory from the past seven years (this is my first time using PowerShell).
I am trying to export my results from the script below into a TXT or CSV document.
get-childitem -Path P:\ -recurse | where-object {$_.LastWriteTime -lt (get-date).AddYears(-7)}
What should I add to the end of this script to get this data written to a file?

First things first: your script is finding files older than 7 years, not files from the last 7 years. You need to change your -lt to a -gt.
get-childitem -Path P:\ -recurse | where-object {$_.LastWriteTime -gt (get-date).AddYears(-7)}
As you have written it, your script finds the date/time that the file was last written to, for example 3/24/2015 2:45 PM. Then it checks whether that is less than right now minus 7 years (at the time of writing, that is 12/5/2010 3:22 PM). Looking at the years alone we can see that 2015 is not less than 2010, so that file would be excluded.
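To see the difference in isolation, here is a small sketch (the file path is just a placeholder):
$cutoff = (Get-Date).AddYears(-7)
$file = Get-Item P:\example.txt       # hypothetical file
$file.LastWriteTime -lt $cutoff       # True only if the file is OLDER than 7 years
$file.LastWriteTime -gt $cutoff       # True only if the file was written within the last 7 years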
You can output to a text file using the Out-File or (my preference) Set-Content cmdlets.
get-childitem -Path P:\ -recurse | where-object {$_.LastWriteTime -gt (get-date).AddYears(-7)} | Set-Content C:\Path\To\File.txt
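Note that Out-File captures the same formatted directory listing you would see on screen, while Set-Content writes each object's plain string form. If what you actually want is one full path per line, one way to be explicit about it is a sketch like this (output path is a placeholder):
get-childitem -Path P:\ -recurse | where-object {$_.LastWriteTime -gt (get-date).AddYears(-7)} | Select-Object -ExpandProperty FullName | Set-Content C:\Path\To\File.txt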
Alternatively, if you want to capture the data as well, or display it on screen, you can use the Tee-Object cmdlet.
get-childitem -Path P:\ -recurse | where-object {$_.LastWriteTime -gt (get-date).AddYears(-7)} | Tee-Object -FilePath C:\Path\To\File.txt
If you would like a CSV file, use the Export-Csv cmdlet. When using this cmdlet it is very common to add the -NoTypeInformation switch (shortened to -NoType in my example) to avoid getting a first line that specifies the type of the objects being output.
get-childitem -Path P:\ -recurse | where-object {$_.LastWriteTime -gt (get-date).AddYears(-7)} | Export-Csv C:\Path\To\File.csv -NoType
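One thing to be aware of: exporting FileInfo objects directly gives you a column for every property. If you only want a few columns, a Select-Object before Export-Csv keeps the CSV manageable; the column choice here is just an example:
get-childitem -Path P:\ -recurse | where-object {$_.LastWriteTime -gt (get-date).AddYears(-7)} | Select-Object Name, FullName, Length, LastWriteTime | Export-Csv C:\Path\To\File.csv -NoType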

get-childitem -path p:\ -recurse | where-object {$_.lastwritetime -lt ((get-date).addyears(-7))} | export-csv c:\temp\test.csv

Related

Sorting by date PowerShell

I'm trying to find all files that include some string and are not older than "x" days. Those files have to be sorted and then sent in .txt format.
The code seemed fine to me, but it doesn't filter files by date. All other cmdlets work as intended, but the part Where-Object {$_.LastWriteTime -ge (Get-Date).AddDays(-$days)} doesn't seem to be working at all.
Do you have any recommendations how to fix it?
Get-ChildItem -Path $path -Recurse |
Where-Object {$_.LastWriteTime -ge (Get-Date).AddDays(-$days)} |
Select-String -Pattern $searched |
Group-Object -Property Path, Filename |
select Name |
Out-File -Filepath C:\tmp\output.txt
Use the Sort-Object cmdlet to sort by a property value:
| Sort-Object -Property propertyName -Descending
Reference link: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/sort-object?view=powershell-6
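To show where it could slot into the pipeline from the question, here is a sketch that sorts the grouped results by their Name (the property to sort on is a placeholder you would adjust):
Get-ChildItem -Path $path -Recurse |
    Where-Object {$_.LastWriteTime -ge (Get-Date).AddDays(-$days)} |
    Select-String -Pattern $searched |
    Group-Object -Property Path, Filename |
    Sort-Object -Property Name -Descending |
    Select-Object Name |
    Out-File -FilePath C:\tmp\output.txt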

Powershell: Recursively search a drive or directory for a file type in a specific time frame of creation

I am trying to incorporate Powershell into my everyday workflow so I can move up from a Desktop Support guy to a Systems Admin. One question that I encountered when helping a coworker was how to search for a lost or forgotten file saved in an unknown directory. The pipeline I came up with was:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Out-File pdfs.txt
This code performed exactly how I wanted, but now I want to extend this command and make it more efficient, especially since my company has clients with very messy file management.
What I want to do with this pipeline:
Recursively search for a specific file type that was created in a specified time frame. Let's say the oldest file allowed in this search is a file from two days ago.
Save the results to a text file with columns containing the Name and FullName (path), sorted by creation time in descending order.
What I have so far:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Select-Object Name, FullName | Out-File pdfs.txt
I really need help on how to create a filter for the time that the file was created. I think I need to use the Where-Object cmdlet right after the dir pipe and before the Select-Object pipe, but I don't know how to set that up. This is what I wrote: Where-Object {$_.CreationTime <
You're on the right track. To get the files from a specific creation date range, you can pipe the dir command results to:
Where-Object {$_.CreationTime -ge "06/20/2017" -and $_.CreationTime -le "06/22/2017"}
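As a side note, the quoted dates work because PowerShell converts the right-hand operand to the type of the left-hand one (a DateTime here); written out explicitly, the filter is equivalent to:
Where-Object {$_.CreationTime -ge [datetime]"06/20/2017" -and $_.CreationTime -le [datetime]"06/22/2017"}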
If you want something more repeatable, where you don't have to hard-code the dates every time and just want to search for files from up to 2 days ago, you can set variables:
$today = (Get-Date)
$daysago = (Get-Date).AddDays(-2)
then plugin the variables:
Where-Object {$_.CreationTime -ge $daysago -and $_.CreationTime -le $today}
I'm not near my Windows PC to test this but I think it should work!
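Putting the pieces together for the original goal (PDFs created in the last two days, Name and FullName columns, newest first), something along these lines should work; the output path is just a placeholder:
$daysago = (Get-Date).AddDays(-2)
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force |
    Where-Object {$_.CreationTime -ge $daysago} |
    Sort-Object CreationTime -Descending |
    Select-Object Name, FullName, CreationTime |
    Out-File C:\Temp\pdfs.txt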
See if this helps:
dir c:\ -Recurse -Filter *.ps1 -ErrorAction SilentlyContinue -Force | Where-Object {$_.LastWriteTime -ge [DateTime]::Now.AddDays(-2)} | select LastWriteTime,Name | Out-File Temp.txt

Powershell Create CSV of files with "_lowRes.jpg" of a certain file size

I am trying to create a CSV file of all jpgs in a directory and its sub-directories that are above 100 KB and have the suffix "_lowRes.jpg".
I want to use PowerShell.
Any help please?
This is pretty easy actually!
You'll do this with two separate filters, which PowerShell achieves via the Where-Object cmdlet. This cmdlet accepts comparisons in the format of {$_.PropertyName -eq "Something"} or PropertyName -eq "Something". The latter format is only available on PowerShell v3 and up.
First, to filter to only files above 100KB.
Where-Object Length -ge 100KB
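The equivalent script-block form, which also works on older versions such as PowerShell v2, would be:
Where-Object {$_.Length -ge 100KB}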
The second part, where the filename contains something.
Where-object Name -like "*lowRes.jpg*"
You could join them, but I would just pipe one into the other, like this.
dir *.jpg -Recurse | Where-Object Length -ge 100KB | Where-object Name -like "*lowRes.jpg*"
You might want to put the Name filtering first, because fewer files will match a certain name than will be above or below a certain size. It depends on how your files are laid out.
Finally, pipe all of that into the Export-Csv cmdlet and bam, you're done!
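Stitched together it might look like this; I've tightened the name pattern to the exact "_lowRes.jpg" suffix from the question, and the output path is just a placeholder:
dir *.jpg -Recurse |
    Where-Object Name -like "*_lowRes.jpg" |
    Where-Object Length -ge 100KB |
    Select-Object Name, FullName, Length |
    Export-Csv C:\Path\To\lowres.csv -NoTypeInformation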
You can do it simply like this:
Get-ChildItem "C:\temp" -Recurse -file -filter "*_lowRes.jpg" |
Where Length -ge 100KB | select fullname, Length |
export-csv "c:\temp\result.csv" -NoType
Short version (not for purists):
gci "C:\temp" -Rec -file -filter "*_lowRes.jpg" | ? Length -ge 100KB | select fu*, le* | epcsv "c:\temp\result.csv" -Not

Windows PowerShell - Delete Files Older than X Days

I am currently new to PowerShell, and I have created a script, based on information gathered on the net, that will perform a delete operation for found items within a folder whose LastWriteTime is more than 1 day old.
Currently the script is as follows:
$timeLimit = (Get-Date).AddDays(-1)
$oldBackups = Get-ChildItem -Path $dest -Recurse -Force -Filter "backup_cap_*" |
Where-Object {$_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit}
foreach($backup in $oldBackups)
{
Remove-Item $dest\$backup -Recurse -Force -WhatIf
}
As far as I know, the -WhatIf switch will output to the console what the command "should" do in a real-life scenario. The problem is that -WhatIf does not output anything, and even if I remove it the files are not getting deleted as expected.
The server is Windows 2012 R2 and the command is being run within PowerShell ISE v3.
Once the command works, it will be "translated" into a task that will run each night after another task has finished backing up some stuff.
I did it in the pipeline:
Get-ChildItem C:\temp | ? { $_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit } | Remove-Item -WhatIf
This worked for me (it assumes $timeLimit is defined as in your script), and this way you don't have to take care of building the right path to each file.
Another solution:
$timeLimit = (Get-Date).AddDays(-1)
Get-ChildItem C:\temp2 -Directory | where LastWriteTime -lt $timeLimit | Remove-Item -Force -Recurse
The original issue was that $dest\$backup assumes each item sits directly in the root folder. By using the FullName property on $backup instead, you don't need to statically build the path.
One other note: Remove-Item accepts an array of paths, so you can also get rid of the foreach loop.
Here's the fix to your script, without using the pipeline. Note that since I used the .Where() method, this requires at least PowerShell version 4.
$timeLimit = (Get-Date).AddDays(-1)
$Backups = Get-ChildItem -Path $dest -Directory -Recurse -Force -Filter "backup_cap_*"
$oldBackups = $backups.where{$_.LastWriteTime -lt $timeLimit}
Remove-Item $oldBackups.fullname -Recurse -Force -WhatIf

Outputting Remove-Item to a log file

I am scanning a directory for a specific set of files and sorting them by date, keeping the 7 latest copies of the file regardless of date, and removing the oldest if there are more than 7. I am having a hard time producing a log file showing the deletions, since Remove-Item has no output.
Below is a copy of my code:
$path = "C:\- Deploy to Production -\Previous Deploys\*_*_BOAWeb.rar" #BOA
$files = Get-ChildItem -Path $path -Recurse | Where-Object {-not $_.PsIsContainer}
$keep = 7
if ($files.Count -gt $keep) {
$files | Sort-Object CreationTime |
Select-Object -First ($files.Count - $keep) |
Remove-Item -Force
}
First off, you are overcomplicating things: add -Descending to your Sort command, and then change your Select to -Skip $keep. It's simpler that way. Then you have options for outputting your deleted files.
Remove-Item -Force -Verbose 4>&1 | Add-Content C:\Path\To\DeletedFiles.log
or (keeping with your current code above)
Select-Object -First ($files.Count - $keep) |Tee-Object -filepath C:\Path\To\DeletedFiles.log -append
The first will take the verbose output of Remove-Item and append it to whatever log file you specify the path for (use Set-Content if you want to replace the log instead). The second option will append the [FileInfo] objects onto a log that you specify.
Edit: As pointed out by Ansgar Wiechers, I had forgotten to combine my verbose and stdout streams, so 4>&1 was added to the above code to correct that issue.
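For reference, the whole revised script with the -Descending / -Skip change and the verbose logging folded in could look like this sketch (the log path is a placeholder):
$path = "C:\- Deploy to Production -\Previous Deploys\*_*_BOAWeb.rar" #BOA
$keep = 7
Get-ChildItem -Path $path -Recurse |
    Where-Object {-not $_.PsIsContainer} |
    Sort-Object CreationTime -Descending |
    Select-Object -Skip $keep |
    Remove-Item -Force -Verbose 4>&1 |
    Add-Content C:\Path\To\DeletedFiles.log
Because -Skip $keep passes nothing through when there are 7 or fewer files, the original if ($files.Count -gt $keep) check is no longer needed.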