PowerShell Script to clean up logs on a Monthly Basis

PowerShell newbie here again with another question.
We currently have a log folder accumulating text files. My supervisor would like a script that keeps only the current month's logs visible in that folder; all previous months should be moved to an archive folder. He would like a PowerShell script we can run once a month to achieve this.
For example, our log folder should only have logs from January 2021. Anything older than January 2021 should be archived. Once February 1, 2021 hits, all the logs from January 2021 should be moved to the archive folder, and so on.
How can I write a script that looks at the folder and keeps only the current month's logs? Any guidance, resources, videos, etc. are greatly appreciated! I've been scouring the internet for resources, but I haven't quite found anything that suits my needs.
Update: Was able to find a wonderful script here: PowerShell: Sort and Move Files by Date to month and year provided by Thomas Maurer (all credit to him!)
# Get the files which should be moved, without folders
$files = Get-ChildItem 'C:\Users\testuser\Desktop\log' -Recurse | Where-Object { !$_.PsIsContainer }

# List the files which will be moved
$files

# Target folder where the files should be moved to. The script will automatically create a folder for the year and month.
$targetPath = 'C:\Users\testuser\Desktop\log\archive'

foreach ($file in $files)
{
    # Get year and month of the file.
    # I used LastWriteTime since these are synced files and the creation date would be the date they were synced.
    $year = $file.LastWriteTime.Year.ToString()
    $month = $file.LastWriteTime.Month.ToString()

    # Output file name, year and month
    $file.Name
    $year
    $month

    # Set the directory path
    $Directory = $targetPath + "\" + $year + "\" + $month

    # Create the directory if it doesn't exist
    if (!(Test-Path $Directory))
    {
        New-Item $Directory -Type Directory
    }

    # Move the file to its new location
    $file | Move-Item -Destination $Directory
}
What I would like to achieve now: This script works great, but I am trying to tinker with it so I can move everything EXCEPT the current month. I'm still researching and investigating, so I will make sure to update my post if I am able to figure out this missing piece. Thank you all for your help!

One approach to leaving the files that were last modified in the current month is to use a small helper function that formats the LastWriteTime date into a string like yyyy\MM.
function Format-YearMonth ([datetime]$date) {
    # simply output a string like "2021\01"
    return '{0:yyyy\\MM}' -f $date
}

$sourcePath = 'C:\Users\testuser\Desktop\log'
$targetPath = 'C:\Users\testuser\Desktop\log\archive'
$thisMonth = Format-YearMonth (Get-Date)

# Get the files which should be moved, without folders.
# This can be more efficient if all files have the same extension, in which
# case you can use -Filter '*.log' for instance.
Get-ChildItem -Path $sourcePath -File -Recurse |
    # filter out the files that have a LastWriteTime in this year and month
    Where-Object { (Format-YearMonth $_.LastWriteTime) -ne $thisMonth } |
    ForEach-Object {
        # Set the destination path
        $Directory = Join-Path -Path $targetPath -ChildPath (Format-YearMonth $_.LastWriteTime)
        # Create the directory if it doesn't exist
        if (!(Test-Path $Directory)) {
            $null = New-Item $Directory -Type Directory
        }
        Write-Host "Moving file '$($_.FullName)' to '$Directory'"
        # Move the file to its new location
        $_ | Move-Item -Destination $Directory -Force
    }
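If you'd like to dry-run this first, Move-Item supports the standard -WhatIf switch; changing the last line like this prints the would-be moves without performing them:

$_ | Move-Item -Destination $Directory -Force -WhatIf

Remove -WhatIf once the output looks right.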

Related

Zipping using PowerShell

I've written code to zip files older than 7 days from a source folder to a subfolder and then delete the original files. My code works fine with the Compress-Archive and Remove-Item cmdlets for smaller numbers of files, but it takes much more time and system memory for a large volume of files.
So, I'm working on a solution using 7-Zip instead, as it's faster.
The code below does the zipping correctly, but it does not limit itself to files older than 7 days; it zips and deletes all the files in the source folder. It should zip and delete only files older than 7 days.
Is there anything wrong with the code?
if (-not (test-path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
set-alias 7z "$env:ProgramFiles\7-Zip\7z.exe"
$Days = "7"
$Date = Get-Date -format yyyy-MM-dd_HH-mm
$limit = (Get-Date).AddDays(-$Days)
$filePath = "C:\Users\529817\New folder1\New folder_2"
Where LastWriteTime -lt $limit | 7z a -t7z -sdel "C:\Users\529817\New folder1\New folder_2\ARCHIVE\$Date.7z" "$filePath"
I don't think you are running the 7-Zip command correctly. You are simply telling it to add all the files from the directory $filePath to the archive and then delete all the files. That, and I have serious doubts that 7-Zip can take pipeline input the way your sample suggests.
Look at the examples from the 7-Zip command-line help:
7z a archive1.zip subdir\
Adds all files and subfolders from folder subdir to archive archive1.zip. The filenames in archive will contain subdir\ prefix.
7z a archive2.zip .\subdir\*
Adds all files and subfolders from folder subdir to archive archive2.zip. The filenames in archive will not contain subdir\ prefix.
I'd have to download 7-Zip to test, but I think you need a loop to process the files you isolated with the Where clause. It might look something like:
if (-not (Test-Path "$env:ProgramFiles\7-Zip\7z.exe")) { throw "$env:ProgramFiles\7-Zip\7z.exe needed" }
Set-Alias 7z "$env:ProgramFiles\7-Zip\7z.exe"

$Days = "7"
$Date = Get-Date -Format yyyy-MM-dd_HH-mm
$limit = (Get-Date).AddDays(-$Days)
$filePath = "C:\Users\529817\New folder1\New folder_2"

Get-ChildItem $filePath |
    Where-Object { $_.LastWriteTime -lt $limit } |
    ForEach-Object {
        7z a -t7z -sdel "C:\Users\529817\New folder1\New folder_2\ARCHIVE\$Date.7z" $_.FullName
    }
Note: at least in your sample, you are missing the Get-ChildItem command. Also, you don't need to reference the .Date property of the [DateTime] object returned by the .AddDays() method unless you want the boundary to be midnight of that date; otherwise .AddDays() returns a [DateTime] naturally.
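As a side note, if invoking 7z once per file turns out to be slow, 7-Zip also accepts a list file via the @listfile syntax, so a single invocation could handle the whole batch. An untested sketch (the temp-file name is just an example):

# select the old files, write their full paths to a list file, and call 7z once
$old = Get-ChildItem $filePath -File | Where-Object { $_.LastWriteTime -lt $limit }
$listFile = Join-Path $env:TEMP 'files-to-zip.txt'
$old.FullName | Set-Content $listFile
# -sdel deletes the source files after they are archived
7z a -t7z -sdel "C:\Users\529817\New folder1\New folder_2\ARCHIVE\$Date.7z" "@$listFile"
Remove-Item $listFile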

PowerShell - Adding a Month to all files in a folder

A complete newbie to PowerShell here.
What I am attempting to do, using PowerShell, is add a month to the CreationTime of the files in a folder, based on each file's current creation date, as opposed to the Get-Date system date.
Edit showing how I got it to work for single documents:
I managed to get it to work for a specific file using this command:
$(Get-Item test.txt).CreationTime = $(Get-Item test.txt).CreationTime.AddMonths(1)
Rather than specify each file individually, I want to do the same as above but for all documents in a folder.
If the question is to add 1 month to the LastWriteTime date of each file, instead of setting the LastWriteTime date to a date 1 month away from the current system date, you can do
foreach ($file in (Get-ChildItem -Path 'PathToWhereTheFilesAre' -File)) {
    $file.LastWriteTime = $file.LastWriteTime.AddMonths(1)
}
If it is the current system date plus 1 month you want, you can use any of the answers given already by both Klausen and Venkataraman R
You can use a foreach loop to iterate over the files and update the LastWriteTime:
$FolderName = "c:\dev"
foreach($file in (GEt-childitem -Path $FolderName ))
{
$file.lastwritetime=$(Get-Date).AddMonths(1)
}
First you need to define which files you want to apply this AddMonths to. In the code sample I added two parameters (-Recurse and -File) that you may want to remove, but included just in case you need them.
Thanks.
# Folder where your files are located
$folder = 'C:\Temp\'

# Create the list of files to go through.
# The -Recurse option navigates the folders recursively.
# -File can be used so no folders are picked up by your script.
$files = Get-ChildItem -Path $folder -Recurse -File

# Iterate file by file over the results
foreach ($file in $files)
{
    $file.LastWriteTime = (Get-Date).AddMonths(1)
}
Thanks all for your help!
I've used your guidance and created the following code which does exactly what I want it to do:
$folder = "C:\Users\ajames\Powershell"
$files = Get-ChildItem -Path $folder -Recurse -File
foreach($file in $files)
{$file.creationtime=$($file.CreationTime.AddDays(1)) }
Cheers again.
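For what it's worth, if you ever want the originally stated goal (adding a month to each file's own CreationTime rather than a day), the same loop pattern should work with AddMonths:

# add one month to each file's own creation date
$folder = "C:\Users\ajames\Powershell"
Get-ChildItem -Path $folder -Recurse -File | ForEach-Object {
    $_.CreationTime = $_.CreationTime.AddMonths(1)
}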

PowerShell - Copy directory and files from source folder to destination folder

I am working out a scenario using PowerShell.
As of now, I am trying to work out a scenario where I have a batch job which, upon successful execution, first creates a date folder for that particular day and then creates a .CSV file under it. The folder structure then looks like below.
\\Server\SourceA\Processed\20200120\TestA.CSV
When the job runs the next day, it will create another folder and file like below.
\\Server\SourceA\Processed\20200121\TestB.CSV
This is how many folders have already been created in the past.
My PS script is good to run daily after the batch job completes. I read a date, append it to the path, and it copies the file from the source to the destination folder. But I want to enable my PS script to read all the previous date folders created under
\\Server\SourceA\Processed\
Another tricky part: under the date folder there are a few other subfolders, i.e.
\\Server\SourceA\Processed\20191010\Log
\\Server\SourceA\Processed\20191010\Charlie
\\Server\SourceA\Processed\20191010\Alpha
\\Server\SourceA\Processed\20191010\Delta
Among them, I need to read files from the Log folder only.
Hence, my actual source path becomes
\\Server\SourceA\Processed\20191010\Log\TestA.CSV
Here is my script (which is static right now and unable to read past existing date folders).
$fullSourceFileName = "\\Server\SourceA\Processed\"
$date = Get-Date -format "yyyyMMdd"
$fullSourceFileName = "$($fullSourceFileName)$($date)\Log
$destination = "\\Server\DestA\Processed\"
$destination = "$($destination)$($date)\"
get-childitem -path $fullSourceFileName -recurse | copy-item -destination "$($destinationFolder)$($date)\"
Your help is highly appreciated.
I did not know I could use a foreach loop in PowerShell.
So, here is the answer for reading all the dynamic date folders under my given path.
I hope this helps the community.
$fullSourceFileName = "\\Server\SourceA\Processed\"
$DirToRead = "\Log\"
$dates = Get-ChildItem -Path $fullSourceFileName -Directory
$destination = "\\Server\DestA\Processed\"
foreach ($date in $dates){
$ActualPath = "$($fullSourceFileName)$($date)$($DirToRead)"
if(!(Test-Path $ActualPath))
{
Write-Output $($ActualPath)$(" source path does not exist")
}
else
{
get-childitem -path $ActualPath -recurse | copy-item -destination "$($destinationFolder)$($date)\"
}
$ActualPath = ""
}
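One small robustness note on the above: interpolating the DirectoryInfo objects from Get-ChildItem relies on their ToString() output, which has not been identical across PowerShell versions (name vs. full path). Using the Name property explicitly, e.g. with Join-Path, avoids the ambiguity:

$ActualPath = Join-Path $fullSourceFileName (Join-Path $date.Name 'Log')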

Copy file from directories with today's date to another location

AD Manager Plus generates reports hourly to a time-stamped file path, and I would like to copy these files to another location, overwriting the existing file. I will then schedule the script to run hourly after the files have been generated. Unfortunately, the location the reports are extracted to cannot be modified, and it creates date- and time-stamped folders.
Example:
C:\ADManager Plus\audit-data\reports\16042019\DailyTrue-Up01-55-07\Real Last Logon.xls
C:\ADManager Plus\audit-data\reports\ddmmyyyy\DailyTrue-Uphh-mm-ss\Real Last Logon.xls
I thought the easiest approach would be to:
Get the last modified folder in the Reports Folder - eg Apr162019
Get the last modified folder in the Apr162019 Folder - eg DailyTrue-Up01-55-07
Filter for the Real Last Logon.xls spreadsheet in folder DailyTrue-Up01-55-07
$Path = "C:\ADManager Plus\audit-data\reports"
$DestinationPath = "\\domain\networkshare\Reports\"
Get-ChildItem -Path $Path -Directory | ForEach-Object {
    Get-ChildItem -Path "$Path\$_" -File -Filter "Real Last Logon.xlsx" |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1 |
        Copy-Item -Force -Destination (New-Item -Force -Type Directory -Path (Join-Path $DestinationPath ($_.FullName.Replace("$Path\", ''))))
}
The code we have seems to copy all the folders to the location and can't look in multiple directories.
I've got a feeling we are approaching this wrong. Can anyone suggest the best way to achieve this? There are few posts online that explain how to retrieve files from time-stamped folders.
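A minimal sketch of the three-step approach described above (untested, assuming the paths from the question) could sort each level by LastWriteTime and take the newest:

$Path = 'C:\ADManager Plus\audit-data\reports'
$DestinationPath = '\\domain\networkshare\Reports\'

# 1. newest date-stamped folder under reports
$latestDate = Get-ChildItem -Path $Path -Directory | Sort-Object LastWriteTime -Descending | Select-Object -First 1
# 2. newest DailyTrue-Up folder inside it
$latestRun = Get-ChildItem -Path $latestDate.FullName -Directory | Sort-Object LastWriteTime -Descending | Select-Object -First 1
# 3. copy the report, overwriting any existing copy
Get-ChildItem -Path $latestRun.FullName -File -Filter 'Real Last Logon.xls' |
    Copy-Item -Destination $DestinationPath -Force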

Assign current time to PowerShell variable

I pieced together a PowerShell script that is triggered (via Task Scheduler) every time the computer restarts.
The script will do the following:
Find a csv file located in a specific directory
Rename the file by appending a time/date stamp to the end
Move that file into an archive folder
During the day a software application automatically creates a new csv file. So next time the computer reboots, it repeats the steps above.
Final step - the script also looks in the archive folder and deletes any files which are > 7 days old.
Sometimes (not all the time) when the computer restarts and the script runs, it completes steps 1 and 2 but not step 3.
And so what this means is the csv file is renamed but the script did NOT move it into the archive folder.
Why?
I open the script in PowerShell ISE and run the script manually and I see the reason why:
A file with that name already exists in the archive folder.
How can that happen if the file name is always dynamically generated using a date/time stamp (down to the second)?
Turns out the variable which is assigned the value of Get-Date is not updated.
It still contains the old time.
Why does this happen if the very first thing I do in my PowerShell script is this:
$rightNow = Get-Date
I know it's not best practice to assign the current date and time to a variable and obviously the variable is not going to update itself as every second goes by. That's fine. I don't need it to. What I DO expect it to do is grab the current date and time (at the time this line of code runs) and assign it to my variable called $rightNow.
For some reason the variable is not getting updated.
Why does this happen? What's the best way for me to quickly grab the current date and time (down to the second) and use it as part of a file name?
Here is my current script:
$source = "C:\Logs"
$destination = "C:\Logs\archive"
$old = 7
$rightNow = Get-Date
# delete all files in the archive folder that are > 7 days old
Get-ChildItem $destination -Recurse |
    Where-Object { -not $_.PSIsContainer -and
                   $rightNow.Subtract($_.CreationTime).Days -gt $old } |
    Remove-Item

# rename all csv files in the Log folder by appending currentDate_currentTime
Get-ChildItem -Path $source\* -Include *.csv | % {
    $name = $_.Name.Split(".")[0] + "_" + ($_.CreationTime | Get-Date -Format yyyyMMdd) + "_" + ($_.CreationTime | Get-Date -Format hhmmss) + ".csv"
    Rename-Item $_ -NewName $name
    Move-Item "$($_.Directory)\$name" -Destination $destination
}
You don't use the current date in the rename; you use the file's CreationTime property. If you want the current datetime, try:
$name = $_.BaseName + [datetime]::now.ToString('_yyyyMMdd_hhmmss') + $_.Extension
Or, better yet, just perform the rename as part of the move process:
$source = "C:\Logs"
$destination = "C:\Logs\archive"
$old = 7
# delete all files in the archive folder that are > 7 days old
Get-ChildItem $destination -Recurse -File |
    Where-Object { $_.CreationTime -lt [datetime]::Today.AddDays(-$old) } |
    Remove-Item

# rename all csv files in the Log folder by appending currentDate_currentTime
Get-ChildItem -Path $source\* -Include *.csv | % {
    $_ | Move-Item -Destination ("$destination\" + $_.BaseName + [datetime]::now.ToString('_yyyyMMdd_hhmmss') + $_.Extension)
}
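One caveat with these format strings: hh is the 12-hour clock in .NET date formatting, so stamps taken twelve hours apart can collide. If the names need to be unique across a whole day, HH (24-hour) is the safer choice:

[datetime]::now.ToString('_yyyyMMdd_HHmmss')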