I am trying to read a CSV containing a list of files (including their folder paths) and then delete them if they are older than x days.
I can do this if I list folders in the CSV, but I cannot get it to work for individual files.
CSV
FullName,,,,,,,,,,,,,
E:\$RECYCLE.BIN\S-1-5-21-352280589-691296097-1232828436-9414\$RCUCS3H.txt,,,,,,,,,,,,,
E:\$RECYCLE.BIN\S-1-5-21-352280589-691296097-1232828436-9414\$RWF5KKJ.txt,,,,,,,,,,,,,
E:\Account Lockout Files\Alockout.zip,,,,,,,,,,,,,
E:\Account Lockout Files\AlockoutXP.zip,,,,,,,,,,,,,
My PowerShell script contains:
$DatetoDelete = (Get-Date).AddDays(-3650)
Get-Content 'C:\delete\1.csv' | ForEach-Object { $_.Trim() } | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item -Force
When I run the script above, it just deletes all the files, even ones that are not older than the cutoff. Please can anyone assist? Thanks.
You are almost there with your code, but you are missing the actual check of the last write time. If you use Get-Item to get the properties of the file, you will then have the LastWriteTime property to use in your Where-Object. (Get-Content alone gives you plain strings, which have no LastWriteTime, so your comparison never filtered anything.)
The below should do what you need with just one extra command and a pipe:
Get-Content 'C:\delete\1.csv' | ForEach-Object {
    # Trim the line, then resolve it to a FileInfo object so LastWriteTime is available
    Get-Item -Path $_.Trim() | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item -Force
}
Thanks for sharing an example of the input CSV file. We can now be sure it actually is a CSV with headers, and therefore Get-Content is the wrong cmdlet.
Try Import-Csv instead:
$DatetoDelete = (Get-Date).AddDays(-3650).Date  # set to midnight
Import-Csv -Path 'C:\delete\1.csv' | ForEach-Object {
    $file = Get-Item -Path $_.FullName -ErrorAction SilentlyContinue
    if ($file -and $file.LastWriteTime -lt $DatetoDelete) {
        $file | Remove-Item -Force
    }
}
You may not have permission to delete some of the files in the CSV, like perhaps the ones inside E:\$RECYCLE.BIN\S-1-5-21-352280589-691296097-1232828436-9414.
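If you are unsure what the script will touch, a dry run first is cheap insurance. This is a minimal sketch of the same loop with Remove-Item's built-in -WhatIf switch, which prints what would be deleted without deleting anything:
# Dry run: -WhatIf reports each file Remove-Item would delete, but removes nothing
Import-Csv -Path 'C:\delete\1.csv' | ForEach-Object {
    $file = Get-Item -Path $_.FullName -ErrorAction SilentlyContinue
    if ($file -and $file.LastWriteTime -lt $DatetoDelete) {
        $file | Remove-Item -Force -WhatIf
    }
}
Once the output looks right, drop -WhatIf to delete for real.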
I am trying to build a PowerShell command to extract all files from multiple folders and combine them into a single file.
Script:
Get-ChildItem -Path $(Pipeline.Workspace)/Common_All/Drop_$(DropFolder)_Migrations -Filter "Release*" -Directory |
    Get-ChildItem -File -Filter *.sql |
    ForEach-Object { $_ | Get-Content; "GO" } |
    Out-File $(System.DefaultWorkingDirectory)/combined-script.sql
This returns a combined script with "GO" appended after each file's content.
How do I prepend text before each file's content?
ForEach-Object { "prepended text`n" + ($_ | Get-Content -raw) + "`nGO" }
It can also be done the same way you appended GO:
ForEach-Object { "prepended text"; $_ | Get-Content; "GO" }
I am processing a batch of word documents.
I've successfully been able to extract the data that I wanted to a text file using this code:
$info = gci 'C:\Users\xxx\xxx' -Recurse -File -Filter *.doc -Include *lol -Exclude *poo* | ForEach-Object {
    Get-Content $_.FullName | Where-Object { $_.Contains("Date:") }
    Get-Content $_.FullName | Where-Object { $_.Contains("Name:") }
}
$info > C:\Users\xxx.txt
This creates a text file like this-
Date: 1/11/2011
Name: Joe Shmoe
This repeats for each found document.
I would like to remove the "Date:" and "Name:" parts of the output for later extraction to an Excel file.
I've tried multiple methods, such as $name.Split(':') followed by $name2 = $name.Substring($name.IndexOf(':') + 1) and returning $name2. Heck, I've tried a ton of things. The best I could get was a complete iteration through each of the 100 files (with different names/dates), but only one name and date returned 100 times. Could someone please help me out with this? Thank you!
This should accomplish your goal:
#Requires -Version 3
$Params = @{
    Path    = 'C:\Users\xx\xx'
    Filter  = '*.doc'
    Include = '*lol'
    Exclude = '*poo*'
    File    = $True
    Recurse = $True
}
Get-ChildItem @Params |
    ForEach-Object {
        (Get-Content -Path $_.FullName |
            Where-Object { $_ -match '(date|name):' }) -replace '(date|name):'
    } |
    Out-File -FilePath 'C:\Users\xxx.txt'
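Since the stated end goal is an Excel file, a variant of the same loop could emit objects and use Export-Csv instead of plain text. This is only a sketch, and it assumes each document yields one Date: line and one Name: line (the @Params splat is reused from above; the output path is a placeholder):
Get-ChildItem @Params |
    ForEach-Object {
        $lines = Get-Content -Path $_.FullName
        # Take the first matching line for each field and strip the label
        [PSCustomObject]@{
            Date = (($lines -match 'Date:') | Select-Object -First 1) -replace '.*Date:\s*'
            Name = (($lines -match 'Name:') | Select-Object -First 1) -replace '.*Name:\s*'
        }
    } |
    Export-Csv -Path 'C:\Users\xxx.csv' -NoTypeInformation
Excel opens the resulting CSV directly, with Date and Name as columns.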
I want to be able to use my function like:
Get-Process | export-xsl;
Right now I'm manually calling Get-Process inside my function:
function export-xsl() {
    $path = "{0}\Downloads\test.csv" -f $home;
    Get-Process | export-csv -Path $path -NoTypeInformation
    Invoke-item $path;
}
The examples I found iterate on each item, which I believe will create multiple .csv files.
I tried the following, but it writes the CSV dozens of times, once per iteration. I'm trying to get the entire input as one CSV file.
function export-xsl() {
    process {
        $path = "{0}\Downloads\test.csv" -f $home;
        $_ | export-csv -Path $path -NoTypeInformation
        Invoke-item $path;
    }
}
function export-xsl {
    $path = "{0}\Downloads\test.csv" -f $home;
    $input | export-csv -Path $path -NoTypeInformation;
    Invoke-item $path;
}
$input will allow you to pipe all the data at once, instead of many iterations.
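If you prefer the explicit advanced-function pattern over the automatic $input variable, a sketch like the following (assuming PowerShell 5+ for the ::new() syntax) collects pipeline input in the process block and exports once in the end block:
function export-xsl {
    param(
        [Parameter(ValueFromPipeline)] $InputObject
    )
    begin   { $items = [System.Collections.Generic.List[object]]::new() }
    process { $items.Add($InputObject) }  # runs once per pipeline item
    end     {
        # Export everything in one go, then open the result
        $path = "{0}\Downloads\test.csv" -f $home
        $items | Export-Csv -Path $path -NoTypeInformation
        Invoke-Item $path
    }
}
Collecting in process and exporting in end avoids the per-item overwrite that the process-block attempt above ran into.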
I want to do something like this-
"Error array cleared." | Out-File $ErrorLog $InfoLog -Append
However it's not working. Is this possible without writing another line to output it to the other file?
You can also use Tee-Object to accomplish the same thing; see Example 3 in the Tee-Object help (Get-Help Tee-Object -Examples). Here is a quick sample that grabs the contents of the current directory and saves it to two files:
Get-ChildItem | Tee-Object -FilePath teetest.txt | Out-File teetest2.txt
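Applied to the example from the question, it could look like the line below. Note that -Append on Tee-Object is an assumption about your PowerShell version: it exists in newer releases but not in older Windows PowerShell, so check Get-Help Tee-Object first:
# Append the message to $ErrorLog via Tee-Object, then to $InfoLog via Out-File
"Error array cleared." | Tee-Object -FilePath $ErrorLog -Append | Out-File $InfoLog -Append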
One way is with a short function like this:
function Out-FileMulti {
param(
[String[]] $filePath
)
process {
$text = $_
$filePath | foreach-object {
$text | out-file $_ -append
}
}
}
Example:
"Out-FileMultiTest" | Out-FileMulti "test1.log","test2.log"
(Writes the string "Out-FileMultiTest" to both test1.log and test2.log)
Found this snippet of code which does what I need-
"Test" | %{write-host $; out-file -filepath $ErrorLog -inputobject $ -append}