How can I compare the current month with a file's last-modified date in a PowerShell script? My code is working fine, but it reads all the CSV files in the given directory. I only want to read the files modified in the current month, i.e. October 2018.
Please help me out. I have six files in my directory: three were modified in October 2018 and the remaining three in September 2018.
I want my script to check the current month and then read only the CSVs from that month, i.e. October 2018.
Code:
$files = Get-ChildItem 'C:\Users\212515181\Desktop\Logsheet\*.csv'
$targetPath = 'C:\Users\212515181\Desktop\Logsheet'
$result = "$targetPath\Final.csv"
$curr_month = (Get-Date).Month
$curr_year = (Get-Date).Year
# Adding header to output file
[System.IO.File]::WriteAllLines($result,[System.IO.File]::ReadAllLines($files[1])[1])
foreach ($file in $files)
{
    $month = $file.LastWriteTime.ToString()
    $curr_month=(Get-Date).Month
    if ($month= $curr_month)
    {
        $firstLine = [System.IO.File]::ReadAllLines($file) | Select-Object -first 1
        [System.IO.File]::AppendAllText($result, ($firstLine | Out-String))
        $lines = [System.IO.File]::ReadAllLines($file)
        [System.IO.File]::AppendAllText($result, ($lines[2..$lines.Length] | Out-String))
    }
}
# Change output file name to reflect month and year in MMYYYY format
Rename-Item $result "Final_$curr_month$curr_year.csv"
Your comparison is wrong: `=` is an assignment, not a comparison, so the condition always evaluates to $true and every file is read.
It should be
if ($month -eq $curr_month)
Note also that $month holds the full date string from LastWriteTime.ToString(), so even with -eq it will never equal a month number; compare the numeric $file.LastWriteTime.Month instead.
Also, I would remove the second
$curr_month = (Get-Date).Month
inside the loop; you already set it before the loop, so repeating it only adds overhead.
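Putting both fixes together, a minimal sketch of the corrected loop (reusing the paths from the question) could look like this; it compares the numeric month and year of LastWriteTime rather than its string form:

```powershell
$targetPath = 'C:\Users\212515181\Desktop\Logsheet'
$now = Get-Date

# Keep only the CSVs last modified in the current month and year
$files = Get-ChildItem "$targetPath\*.csv" | Where-Object {
    $_.LastWriteTime.Month -eq $now.Month -and
    $_.LastWriteTime.Year  -eq $now.Year
}

foreach ($file in $files) {
    # process only this month's files here
    $lines = [System.IO.File]::ReadAllLines($file.FullName)
    # ...
}
```

Checking the year as well prevents files from October of a previous year from slipping through.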
Related
I have the share and folders below. There is a folder for every month, nested inside a year folder, as shown.
On the last Friday of every month an Excel sheet is generated, as shown below.
The end piece of the file name, "-14-26-48", stays the same, but the earlier pieces change with the year and date. I would like to build this Excel sheet's path and name in PowerShell, so that I get the following path without manually entering the complete path and filename. Whenever I run the code, it should give me the path and file name of the sheet generated on the last Friday of the previous month.
$oldbk = Import-Excel -Path '\\hcohesity05\cohesity_reports\2022\7\07-29\Cohesity_FETB_Report-2022-07-29-14-26-48.xlsx'
If you just want the file from the last Friday of the previous month, the easiest way to find it is not to build the path, but to search the previous month's folders: filter for folders created on a Friday, sort by creation time newest-first, and take the first one (once sorted descending, the last folder created is the first item).
$LastMonth = [datetime]::Now.AddMonths(-1).ToString('yyyy\\M')
$TargetFolder = Get-ChildItem (Join-Path '\\hcohesity05\cohesity_reports' $LastMonth) | Where{$_.CreationTime.DayOfWeek -eq 'Friday'} | Sort CreationTime -Descending | Select -First 1
$FilePath = (Resolve-Path (Join-Path $TargetFolder.FullName 'Cohesity_FETB_Report-*-14-26-48.xlsx')).Path
$oldbk = Import-Excel -Path $FilePath
You can use the [DateTime] type to help you out here.
# To do this, we create a new datetime
# (using the current month and year, but the first day)
# and subtract a day, which gives us the last day of the
# previous month (the question asks for last month's Friday).
$endOfMonth = [datetime]::new(
[DateTime]::Now.Year,
[DateTime]::Now.Month,
1
).AddDays(-1)
# Now we create a variable for the last Friday
$lastFriday = $endOfMonth
# and we keep moving backward thru the calendar until it's right.
while ($lastFriday.DayOfWeek -ne 'Friday') {
$lastFriday = $lastFriday.AddDays(-1)
}
# At this point, we can make the ReportPath, piece by piece
# ($BasePath would be the path until this point)
$ReportPrefix = "Cohesity_FETB_Report"
$ReportSuffix = "14-26-48.xlsx"
$ReportPath = Join-Path $BasePath $lastFriday.Year |
Join-Path -ChildPath $lastFriday.Month |
# A Custom format string will be needed for the month/day
Join-Path -ChildPath $lastFriday.ToString("MM-dd") |
Join-Path -ChildPath "$ReportPrefix-$($lastFriday.ToString('yyyy-MM-dd'))-$ReportSuffix"
$reportPath
I have a text file called HelplineSpecialRoster.txt that looks like this
01/01/2019,6AM,0400012345,Kurt,kurt@outlook.com
02/01/2019,6AM,0412345676,Bill,bill@outlook.com
03/01/2019,6AM,0400012345,Sam,Sam@outlook.com
04/01/2019,6AM,0412345676,Barry,barry@outlook.com
05/01/2019,6AM,0400012345,Kurt,kurt@outlook.com
I'm in Australia so the dates are day/month/year.
I have some code that creates a listbox displaying the lines from the text file, but I want to edit the text file before it is displayed so that older dates are removed. A helpful person gave me the code below and it worked once, but then it stopped working for some reason. When I deleted the whole text file and recreated it, it started working again, but only once.
If there is a future shift in the file, say
05/02/2019,6AM,0400012345,Kurt,kurt@outlook.com
and today's date is 29/01/2019, it deletes the older shifts correctly. But if the file contains only old shifts, as above, it doesn't delete them. As soon as I add a date that is in the future, it deletes the older ones and keeps only the future one.
$SpecialRosterPath = "C:\Helpline Dialer\HelplineSpecialRoster.txt"
$CurrentDate2 = (Get-Date).Date # to have a datetime starting today at 00:00:00
Function DeleteOlderShifts {
$CurrentAndFutureShifts = Get-Content $SpecialRosterPath | Where-Object {
$_ -match "^(?<day>\d{2})\/(?<mon>\d{2})\/(?<year>\d{4})" -and
(Get-Date -Year $Matches.year -Month $Matches.mon -Day $Matches.day) -ge $CurrentDate2
}
$CurrentAndFutureShifts
$CurrentAndFutureShifts | Set-Content $SpecialRosterPath
}
DeleteOlderShifts;
Any ideas?
When there are only older dates in your input file, the result in $CurrentAndFutureShifts will be empty. Empty values in a pipeline are skipped over, meaning Set-Content never processes any input, so the output file remains unchanged.
You can avoid this issue by passing the variable to the parameter -Value. Change
$CurrentAndFutureShifts | Set-Content $SpecialRosterPath
into
Set-Content -Value $CurrentAndFutureShifts -Path $SpecialRosterPath
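To see the difference, here is a small sketch (using a hypothetical temp file) of both forms when the filtered result is empty:

```powershell
$path = Join-Path $env:TEMP 'demo.txt'
Set-Content -Path $path -Value 'old shift line'

$empty = @()   # simulates 'no current or future shifts found'

# Pipeline form: no objects flow down the pipe, Set-Content never
# processes anything, and the file still contains the old line.
$empty | Set-Content -Path $path

# Parameter form: the cmdlet runs with an empty -Value,
# so the file is truncated as intended.
Set-Content -Value $empty -Path $path
```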
Rather than using a text file, use a CSV file with headers. This is essentially just a text file saved with a .csv file extension that includes headers for each column:
Note that I have added an additional row at the bottom with a date later than today's, for testing.
HelplineSpecialRoster.csv content:
Date,Time,Number,Name,Email
01/01/2019,6AM,400012345,Kurt,kurt@outlook.com
02/01/2019,6AM,412345676,Bill,bill@outlook.com
03/01/2019,6AM,400012345,Sam,Sam@outlook.com
04/01/2019,6AM,412345676,Barry,barry@outlook.com
05/01/2019,6AM,400012345,Kurt,kurt@outlook.com
01/02/2019,6AM,400012345,Dan,dan@outlook.com
Set the path of the CSV:
$csvPath = "C:\HelplineSpecialRoster.csv"
Import CSV from file:
$csvData = Import-CSV $csvPath
Get today's date at 00:00:
$date = (Get-Date).Date
Filter the CSV data to keep only the rows where the date is today or later, deleting the older shifts (note that Get-Date parses the dd/MM/yyyy strings according to the current culture, so this assumes an Australian locale):
$csvData = $csvData | ? { (Get-Date $_.Date) -ge $date }
Export the CSV data back over the original CSV:
$csvData | Export-CSV $csvPath -NoTypeInformation -Force
I have two CSV files. One is a report from AD containing accounts created during the last month; the other is a manually maintained database that should in theory contain the same information, but covering our company's entire history, with some additional data needed for accounting. I imported the AD report into PowerShell; now I need to import specific rows of the database. The rows I need are defined by the value in the column "Date added": I need to import only the rows where that date exceeds a specific value. I have this code:
$Report = Read-Host "File name" #AD report, last ten chars are date of report creation, in format yyyy-MM-dd
$Date_text = $Report.Substring($Report.get_Length()-10)
$Date = Get-Date -Date $Date_text
$Date_limit = (($Date).AddDays(-$Date.Day)).Date
$Date_start = $Date_limit.AddMonths(-1)
$CSVlicence = Import-Csv $Database -Encoding UTF8 |
where {(Where-Object {![string]::IsNullOrWhiteSpace($_.'Date added')} |
ForEach-Object{$_.'Date added' = $_.'datum Pridani' -as [datetime] $_}) -gt $Date_start}
When run like this, nothing is imported. Without the condition the database is imported successfully, but it's extremely large and the rest of the script takes forever, so I need to work only with the relevant data. I don't care that when Date_limit is 30th Sep, Date_start would be 30th Aug instead of 31st Aug; that's just a few more rows. But importing all ten years or so really takes forever.
Based on your current logic, the filter runs against the imported PSCustomObjects, and the nested Where-Object inside the condition means the comparison never evaluates the way you expect, so every row is filtered out. You want to filter the source file instead, before Import-Csv ever sees it:
$Report = Read-Host -Prompt 'Filename'
## Grabs the datestamp at the end
$Date = Get-Date -Date $Report.Substring($Report.Length - 10)
## Grabs last day of previous month
$Limit = $Date.AddDays(-$Date.Day)
## Grabs last day of two months ago, inaccuracy of one day
$Start = $Limit.AddMonths(-1)
Get-Content -Path $Database -TotalCount 1 | Set-Content -Path 'tempfile'
Get-Content -Path $Database -Encoding 'UTF8' |
## Checks that the entry has a valid date entry within limits
ForEach-Object {
## For m/d/yy or d/m/yy variants, try
## '\d{1,2}\/\d{1,2}\/\d{2,4}'
If ($_ -match '\d{4}-\d{2}-\d{2}')
{
[DateTime]$Date = $Matches[0]
If ($Date -gt $Start -and $Date -lt $Limit) { $_ }
}
} |
Add-Content -Path 'tempfile'
$CSV = Import-Csv -Path 'tempfile' -Encoding 'UTF8'
I have several laptops that are generating daily activity logs for a process into txt files. I've figured out how to write a script to append the logs into one master file on a daily basis, but now I'm concerned about file size. I'd like to keep a rolling 60 days of data in my master file.
Here is my data format:
2016-06-23T04:02:33,JE5030UA,88011702312014569339,0000000034626,01451560610600980
Using (Get-Date).AddDays(-60) I can get the cutoff date, but it's in MM/dd/yyyy format.
If I set up a variable to get the date in the same format as my file (Get-Date -Format 'yyyyMMdd'), I can't use the .AddDays() method with it to get the cutoff date, because the result is a string rather than a DateTime.
That's as far as I've got. I'd include code, but there's not much there; the script to append the files was easy. I can't believe it's this difficult to purge old records.
My questions:
What am I missing on the date issue?
What is the best cmdlet to purge records > 60 days? There doesn't appear to be a 'delete' cmdlet for records in a file. I was expecting a 'if date > 60 days, then delete record' kind of function.
Do I need to add a header to the text file?
Take a look at the following code to read from your combined log and then filter out rows that are within your date range. You get a DateTime object from (get-date).AddDays(); you get a DateTime object from the time stamp in the file, then you can compare them. This is one way of doing it anyway.
$cutoffDate = (get-date).AddDays(-60);
$fileContents = get-content C:\your\path\combinedLog.txt
foreach($line in $fileContents)
{
write-host "Current line = $line"
$words = $line.Split(',')
$date=get-date $words[0];
write-host "Date of line = $date"
if($date -gt $cutoffDate)
{
# Append $line to your trimmed log
}
}
Since you're using an ISO date format you can remove records older than a given cutoff date by formatting the cutoff date accordingly and comparing the first field of each line to it:
$file = 'C:\path\to\your.log'
$cutoff = (Get-Date).AddDays(-60).ToString('yyyy-MM-dd\THH:mm:ss')
(Get-Content $file) |
Where-Object { $_.Split(',')[0] -ge $cutoff } |
Set-Content $file
However, rotating logs is usually a better approach than clearing out a single file. Write your logs to a different file each day, e.g. like this:
... | Set-Content "C:\path\to\master_$(Get-Date -f 'yyyyMMdd').log"
so you can simply remove logs by their last modification date:
$cutoff = (Get-Date).AddDays(-60)
Get-ChildItem 'C:\log\folder\master_*.log' |
Where-Object { $_.LastWriteTime -lt $cutoff } |
Remove-Item
This is in reference to the post here: How to delete date-based lines from files using PowerShell
Using the below code (contributed by 'mjolinor') I can take a monolithic (pipe "|" delimited) CSV file and create a trimmed CSV file with only lines containing dates less than $date:
$date = '09/29/2011'
foreach ($file in gci *.csv) {
(gc $file) |
? { [datetime]$_.split('|')[1] -lt $date } |
set-content $file
}
The above code works great! What I need to do now is create additional CSV files from the monolithic CSV file with lines containing dates >= $date, and each file needs to be in 1-week chunks going forward from $date.
For example, I need the following 'trimmed' CSV files (all created from original CSV):
All dates less than 09/30/2011 (already done with code above)
File with date range 09/30 - 10/6
File with date range 10/7 - 10/14
Etc, etc, until I reach the most recent date
You can use the GetWeekOfYear method of Calendar like this
$date = (Get-Date)
$di = [Globalization.DateTimeFormatInfo]::CurrentInfo
$week = $di.Calendar.GetWeekOfYear($date, $di.CalendarWeekRule, $di.FirstDayOfWeek)
to determine the week number of a given date.
This is NOT tested with your input (it was adapted from some script I use for time-slicing and counting Windows log events), but should be close to working. It can create files on any arbitrary time span you designate in $span:
$StartString = '09/29/2011'
$inputfile = 'c:\somedir\somefile.csv'
$Span = new-timespan -days 7
$lines = @{}
$TimeBase = [DateTime]::MinValue
$StartTicks = ([datetime]$startString).Ticks
$SpanTicks = $Span.Ticks
get-content $inputfile |
foreach {
$dt = [datetime]$_.split('|')[1]
$Time_Slice = [int][math]::truncate(($dt.Ticks - $StartTicks) / $SpanTicks)
$lines[$Time_Slice] += @($_)
}
$lines.GetEnumerator() |
foreach {
$filename = ([datetime]$StartString + [TimeSpan]::FromTicks($SpanTicks * $_.Name)).tostring("yyyyMMdd") + '.csv'
$_.Value | Set-Content $filename  # the lines are already raw CSV text, so write them verbatim
}