I am fairly new to PowerShell and still learning. I have completed my first script and am now trying to add some logging to it. I can append to the log file OK, but I'm stuck on backing up the log and rotating it. This is what I have so far:
$CheckFile = Test-Path $Logfilepath
if ($CheckFile -eq $false) {
    $Date = (Get-Date).ToString()
    $Date + ' - Automonitor log created - INFO' | Out-File -Append -Force $Logfilepath
}
else {
    if ((Get-Item $Logfilepath).Length -gt $Size) {
        Move-Item $Logfilepath -Destination $LogsOldFolder -Force
    }
}
This is where I am stuck. If the file is bigger than 5 MB I need it to move to another folder (which I have in the script), but once it has been moved into that folder I only want to keep the 5 newest files to avoid storage issues. I need the files named like the below:
Automonitor.log.1
Automonitor.log.2
Automonitor.log.3
Automonitor.log.4
Automonitor.log.5
Automonitor.log.1 would be the newest file. I am really baffled about the process I would take: how to rename the files to match the above format, how to rename them all again (based on creation date) when a new file is copied over, and how to delete the oldest so only 5 files ever exist.
I hope that makes sense; if anyone has any ideas, that would be great.
You can go this way:
$a = gci $destfolder
if ($a.count -gt 5) {
    $a | sort lastwritetime | select -first ($a.count - 5) | remove-item
}
This selects every file except the 5 newest (by last write time) and removes it, so only the 5 most recent remain.
Note that this snippet doesn't care about the filenames. If you want to limit it to particular files, change the $a = gci $destfolder part to use a wildcard filter, e.g. gci $destfolder\Automonitor.log.*.
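If you also want the numbered Automonitor.log.1 to Automonitor.log.5 scheme from the question, here is a rough, untested sketch of one way to rotate. It reuses $Logfilepath and $LogsOldFolder from the question; $maxLogs is just an illustrative name:
# Shift the existing numbered logs up by one: .4 becomes .5, .3 becomes .4, and so on.
# Because .5 gets overwritten (-Force), only the 5 newest backups ever exist.
$maxLogs = 5
for ($i = $maxLogs - 1; $i -ge 1; $i--) {
    $older = Join-Path $LogsOldFolder "Automonitor.log.$i"
    $newer = Join-Path $LogsOldFolder ("Automonitor.log." + ($i + 1))
    if (Test-Path $older) {
        Move-Item $older -Destination $newer -Force
    }
}
# The log that just went over the size limit becomes the new Automonitor.log.1.
Move-Item $Logfilepath -Destination (Join-Path $LogsOldFolder 'Automonitor.log.1') -Force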
I'm trying to write a .ps1 script that deletes files older than 2 days but leaves the most recent files even if they are old. For the delete part, the internet is full of code snippets to copy/paste.
For the "leave recent files" part I'm in trouble.
The structure of the backup folder is the following:
--Db.yyyy.MM.dd.Native.bak.zip
--Files.yyyy.MM.dd.zip
--Log.yyyy.MM.dd.txt
--and so on with the older files
I want to keep the most recent trio of these files even if they are older than 2 days.
If anyone has a suggestion for the right approach, or a solution, I'm here to learn.
Thanks to all.
P.S. This is the first time I'm using PowerShell, and I have to write this script for work.
I would like to get you started so you have an idea of how to approach this. It's not too hard, actually, if you approach it logically. First, you need to obtain the correct files from the backup folder. Then you have to examine each file by parsing the filename.
I wonder if you couldn't just take the file date and sort on the oldest? But if you really need to parse the filename, I wrote a very rough script showing how such an approach could look. Keep in mind, I just did some quick and dirty replaces to make it work:
#Get files
$zipFilesInFolder = Get-ChildItem -Path "C:\Temp" | Where-Object { !$_.PSIsContainer -and ($_.Name -like "*Files*") } | Sort-Object -Property Name -Descending
Write-Host 'Files found:' $zipFilesInFolder
# Check files found
[datetime]$oldestDate = Get-Date
[string]$oldestFile = ''
# Check each file by parsing the filename
foreach ($i in $zipFilesInFolder) {
    $fileDate = $i -replace 'Files.' -replace '.zip', ''
    $parsedDate = [datetime]::ParseExact($fileDate, 'yyyy-MM-dd', $null)
    # If we find an older file than the one we currently have in memory, re-assign
    if ($parsedDate -lt $oldestDate) {
        Write-Host 'Older file found than:' $oldestDate ', oldest is now: ' $i
        $oldestDate = $parsedDate
        $oldestFile = $i
    }
}
# Display and copy
Write-Host 'Oldest file found:' $oldestFile
I created a directory: C:\Temp with the files:
Files.2021-04-21.zip up to Files.2021-04-26.zip
The output looks like this:
Files found: Files.2021-04-26.zip Files.2021-04-25.zip Files.2021-04-23.zip Files.2021-04-22.zip Files.2021-04-21.zip Files.2021-04-21.zip
Older file found than: 26-4-2021 10:17:01 , oldest is now: Files.2021-04-26.zip
Older file found than: 26-4-2021 00:00:00 , oldest is now: Files.2021-04-25.zip
Older file found than: 25-4-2021 00:00:00 , oldest is now: Files.2021-04-23.zip
Older file found than: 23-4-2021 00:00:00 , oldest is now: Files.2021-04-22.zip
Older file found than: 22-4-2021 00:00:00 , oldest is now: Files.2021-04-21.zip
Oldest file found: Files.2021-04-21.zip
This should be enough to get your assignment done :) Good luck!
Again, I want to stress that you are probably better off looking at the file's last modified date instead of the filename.
In that case, do this:
# Get files
$zipFilesInFolder = Get-ChildItem -Path "C:\Temp" | Where-Object { !$_.PSIsContainer -and ($_.Name -like "*Files*") } | Sort-Object -Property LastWriteTime -Descending
Write-Host 'Files found:' $zipFilesInFolder
# Check each file
foreach ($i in $zipFilesInFolder) {
    $i # Shows files from top to bottom, based on last modified date
}
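For the original ask (delete everything older than 2 days but always keep the newest Db/Files/Log trio), a rough, untested sketch building on the last-modified approach could look like this. The folder path and the idea of grouping on the part of the name before the first dot are assumptions:
# Keep the newest file of each prefix (Db, Files, Log, ...) even if old; delete everything else older than 2 days.
$backupFolder = 'C:\Temp'   # assumed path
$cutoff = (Get-Date).AddDays(-2)
$files = Get-ChildItem -Path $backupFolder -File

# Newest file per prefix (the part of the name before the first dot)
$keep = $files |
    Group-Object { $_.Name.Split('.')[0] } |
    ForEach-Object { $_.Group | Sort-Object LastWriteTime -Descending | Select-Object -First 1 }

$files |
    Where-Object { $_.LastWriteTime -lt $cutoff -and $keep.FullName -notcontains $_.FullName } |
    Remove-Item -WhatIf   # drop -WhatIf once the output looks right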
I want to merge many CSV files into one (a few hundred files), removing the header row of the added CSVs.
As the files sit in several subfolders, I need to start from the root, traversing all the subfolders and processing all CSVs in there. Before merging I want to archive them with zip, deleting the old CSVs. The new merged CSV file and the zip archive should be named like their parent folder.
In case the script is started again for the same folder, none of the already processed files should be damaged or removed accidentally.
I am not a PowerShell guy, so I have been copying and pasting from several resources on the web and came up with the following solution (sorry, I don't remember the resources; feel free to put references in the comments if you know them).
This patchwork code does the job, but it doesn't feel very bulletproof. For now it processes the CSV files in the subfolders only; processing the files within the given $targDir as well would also be nice.
I am wondering if it could be more compact. Suggestions for improvement are appreciated.
$targDir = "\\Servername\folder\"; #path
Get-ChildItem "$targDir" -Recurse -Directory |
ForEach-Object { #walk through all subfolder paths
    Set-Location -Path $_.FullName

    #remove existing AllInOne.csv (target name for the merged file) in case it has been left over from a previous execution.
    $FileName = ".\AllInOne.csv"
    if (Test-Path $FileName) {
        Remove-Item $FileName
    }

    #remove existing AllInOne.zip (target name for the archived files) in case it has been left over from a previous execution.
    $FileName = ".\AllInOne.zip"
    if (Test-Path $FileName) {
        Remove-Item $FileName
    }

    #compress all csv files in the current path into a file temporarily named AllInOne.zip, adding one file at a time to the archive (with -Update).
    # I wonder if there is a more efficient way to do that.
    dir $_.FullName | where { $_.Extension -eq ".csv" } | foreach { Compress-Archive $_.FullName -DestinationPath "AllInOne.zip" -Update }

    ##########################################################
    # This code is basically merging all the CSV files,
    # skipping the header of added files
    ##########################################################
    $getFirstLine = $true
    Get-ChildItem ".\*.csv" | foreach {
        $filePath = $_
        $lines = Get-Content $filePath
        $linesToWrite = switch ($getFirstLine) {
            $true  { $lines }
            $false { $lines | Select -Skip 1 }
        }
        $getFirstLine = $false
        Add-Content ".\AllInOne.csv" $linesToWrite
        # Output file is named AllInOne.csv temporarily - this is not a requirement.
        # It was simply easier for me to come up with this temp file in the first place (symptomatic of copy & paste).
    }
    #########################################################

    #delete the old csv files (everything except the merged AllInOne.csv)
    dir $_.FullName | where { $_.Extension -eq ".csv" -and $_.Name -ne "AllInOne.csv" } | foreach { Remove-Item $_.FullName }

    # Rename the AllInOne files with the parent folder name
    Get-ChildItem -Path $_.FullName -Filter *.csv | Rename-Item -NewName { $_.BaseName.Replace("AllInOne", $_.Directory.Name) + $_.Extension }
    Get-ChildItem -Path $_.FullName -Filter *.zip | Rename-Item -NewName { $_.BaseName.Replace("AllInOne", $_.Directory.Name) + $_.Extension }
}
I have been executing it in the PowerShell ISE. The script is for housekeeping only, executed casually and not on a regular basis, so performance doesn't matter so much.
I prefer to stick with a script that doesn't depend on additional libraries if possible (e.g. for Zip).
It may not be bulletproof, but I have seen worse cobbled-together scripts. It'll definitely do the job you want it to, but here are some small changes that will make it a bit shorter and harder to break.
Since all your files are CSVs and all would have the same headers, you can use Import-CSV to compile all of the files into an array. You won't have to worry about stripping the headers or accidentally removing a row.
$csvArray = @()
Get-ChildItem "*.csv" | ForEach-Object {
    $csvArray += Import-Csv $_.FullName
}
Then you can just use Export-Csv -Path $_.FullName -NoTypeInformation to output it all into a new CSV file.
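Put together, the merge step can be as short as this (the AllInOne.csv name is just carried over from the question):
# Merge every CSV in the current folder; Import-Csv/Export-Csv take care of the headers.
$csvArray = Get-ChildItem "*.csv" | ForEach-Object { Import-Csv $_.FullName }
$csvArray | Export-Csv -Path ".\AllInOne.csv" -NoTypeInformation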
To have it check the root folder and all the subfolders, I would throw all of the lines in the main ForEach loop into a function and then call it once for the root folder and keep the existing loop for all the subfolders.
function CompileCompressCSV {
param (
[string] $Path
)
# Code from inside the ForEach Loop
}
# Main Script
CompileCompressCSV -Path $targetDir
Get-ChildItem -Path $targetDir -Recurse -Directory | ForEach-Object {
CompileCompressCSV -Path $_.FullName
}
This is more of a stylistic choice, but I would do the steps of this script in a slightly different order:
Get Parent Folder Name
Remove old compiled CSVs and ZIPs
Compile CSVs into an array and output with Parent Folder Name
ZIP together CSVs into a file with the Parent Folder Name
Remove all CSV files
Personally, I'd rather name the created files properly the first time instead of having to go back and rename them, unless there is absolutely no way around it. That doesn't seem to be the case for your situation, so you should be able to create them with the right name on the first go; a rough sketch along those lines follows.
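Here is a minimal, untested sketch of what the function body could look like with the steps in that order. Compress-Archive replaces the per-file -Update loop and Import-Csv/Export-Csv replace the Get-Content header juggling; the variable names are only illustrative:
function CompileCompressCSV {
    param (
        [string] $Path
    )

    # Name the outputs after the folder itself so no rename pass is needed
    $parentName = Split-Path -Path $Path -Leaf
    $mergedCsv  = Join-Path $Path "$parentName.csv"
    $zipFile    = Join-Path $Path "$parentName.zip"

    # Remove leftovers from a previous run so re-running is safe
    Remove-Item $mergedCsv, $zipFile -ErrorAction SilentlyContinue

    # Collect the source CSVs before the merged file exists
    $sourceCsvs = Get-ChildItem -Path $Path -Filter *.csv -File
    if (-not $sourceCsvs) { return }

    # Merge: Import-Csv drops the extra header rows for us
    $sourceCsvs | ForEach-Object { Import-Csv $_.FullName } |
        Export-Csv -Path $mergedCsv -NoTypeInformation

    # Zip the originals in one call, then delete them
    Compress-Archive -Path $sourceCsvs.FullName -DestinationPath $zipFile
    $sourceCsvs | Remove-Item
}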
A friend of mine asked me to write a short script for him. The script should check a specific folder, find all files and subfolders older than X days and remove them. Simple so far; I wrote the script, successfully tested it on my own system and sent it to him. Here's the thing - it doesn't work on his system. To be more specific, the Get-ChildItem cmdlet does not return anything for the provided path, but it gets even weirder; more on that later. I'm using the following code to first find the files and folders (and log them before deleting them later on):
$Folder = "D:\Data\Drive_B$\General\ExchangeFolder"
$CurrentDate = Get-Date
$TimeSpan = "-1"
$DatetoDelete = $CurrentDate.AddDays($TimeSpan)
$FilesInFolder = (Get-ChildItem -Path $Folder -Recurse -ErrorAction SilentlyContinue | Where-Object {$_.LastWriteTime -lt $DatetoDelete})
All variables are filled and we both know that the folder is filled to the brim with files and subfolders older than one day, which was our timespan we chose for the test. Now, the interesting part is that not only does Get-ChildItem not return anything - going to the folder itself and typing in "dir" does not return anything either. Never seen behaviour like this. I've checked everything I could think of - is it DFS?, typos, folder permissions, share permissions, hidden files, ExecutionPolicy. Everything is as it should be to allow this script to work properly as it did on my own system when initially testing it. The script does not return any errors whatsoever.
So for some reason, the content of the folder cannot be found by powershell. Does anyone know of a reason why this could be happening? I'm at a loss here :-/
Thanks for your time & help,
Fred
.AddDays() takes a double, so I would pass it a number rather than a string.
Filter first, then act.
This code will work for you.
Add-Type -AssemblyName System.Windows.Forms   # needed for MessageBox outside the ISE
$folder = Read-Host -Prompt 'File path'
$datetodel = (Get-Date).AddDays(-1)
$results = try { gci -Path $folder -Recurse | select FullName, LastWriteTime | ? { $_.LastWriteTime -lt $datetodel } } catch { $Error[-1] }
$info = "{0} files older than: {1} deleting ...." -f $results.Count, $datetodel
if ($results | ogv -PassThru) {
    [System.Windows.Forms.MessageBox]::Show($info)
    # Put your code here for the removal of the files, e.g.:
    # $results | ForEach-Object { Remove-Item $_.FullName -Force }
} else {
    [System.Windows.Forms.MessageBox]::Show("Exiting")
}
So I have been tasked with writing a script that will move files from one folder to another, which is easy enough. The problem I am having is that the files are for accounts, so there will be a file called DEA05292020.pdf and another called TENSJ05292020, and each file needs to go to a specific folder (e.g. the DEA05292020.pdf file needs to be moved to a folder called DEA, and the TENSJ05292020 file moves to the TENSJ folder). There are over a hundred different accounts that each have their own specific folder. The files all start off in our Recon folder and need to be moved at the end of each month to their respective account folders. So my question is how I could go about creating a PowerShell script to make that happen. I am very new to PowerShell, have been studying "Learn PowerShell in a Month of Lunches", and have a basic grasp of it. What I have so far is very simple; I can copy the file over to the new folder:
copy-item -path "\Sageshare\share\Reconciliation\PDF Recon Center\DEA RECON 05292020" -destination "Sageshare\share\Account Rec. Sheets\Seperate Accounts\DEA"
This works, but I need a lot more automation with regard to separating all the different account names in the PDF Recon Center folder. How do I make a script that can filter out the account name (e.g. DEA) and also the month and year from the name of the file (e.g. 052020 pulled out of the 05292020 part of the filename)?
Thanks!
If #Lee_Dailey wants to write the code and post it here, I'll delete my answer. He solved the problem, I just code monkeyed it.
Please don't test it on everything at once; run it in batches so you can monitor its behaviour and not mess up your environment. It moves files in ways you may not want, i.e. if there is a folder named "a" it'll move everything that matches that folder name into it. If you want to prevent this, you can pre-scan for a folder that more closely matches the name before the script actually creates the folder itself. I'm pretty sure it does everything you want, though, in the simplest way to understand. :)
$names = $(gci -af).Name |
    ForEach-Object {
        if (-not ($_.Contains(".git"))) {
            $_
        }
    }

if ($null -eq $names) {
    Write-Host "No files to move!"
    Start-Sleep 5
    Exit
}

$removedNames = $names |
    ForEach-Object {
        $_ = $_.Substring(0, $_.IndexOf('.')) # Remove extension
        $_ -replace '[^a-zA-Z-]', ''          # Strip everything but letters and hyphens (drops the digits)
    }

$removedNames = $removedNames |
    Select-Object -Unique # Get unique folder names (Get-Unique only de-duplicates sorted input)

$names |
    ForEach-Object {
        $name = $_
        $removedNames |
            ForEach-Object {
                if ($name.Contains($_)) { # If the file matches a folder name
                    if (-not (Test-Path ".\$_")) { # Create the folder if it doesn't exist yet
                        New-Item -Path ".\" `
                                 -Name "$_" `
                                 -ItemType "directory"
                    }
                    Move-Item -Path ".\$name" `
                              -Destination ".\$_" # Move file to folder
                }
            }
    }
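The answer above groups files by account prefix but doesn't pull out the month and year the question also asks about. Here is a small illustrative sketch of that piece, assuming the digits follow the MMddyyyy layout of the example filename:
# Illustrative only: extract the account prefix and month+year from a name like DEA05292020.pdf
$fileName = 'DEA05292020.pdf'
if ($fileName -match '^(?<account>[A-Za-z]+)(?<month>\d{2})\d{2}(?<year>\d{4})') {
    $account   = $Matches['account']                  # DEA
    $monthYear = $Matches['month'] + $Matches['year'] # 052020
    Write-Host "Account: $account  Month/Year: $monthYear"
}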
I pieced together a PowerShell script that is triggered (via Task Scheduler) every time the computer restarts.
The script will do the following:
Find a csv file located in a specific directory
Rename the file by appending a time/date stamp to the end
Move that file into an archive folder
During the day a software application automatically creates a new csv file. So next time the computer reboots, it repeats the steps above.
Final step - the script also looks in the archive folder and deletes any files which are > 7 days old.
Sometimes (not all the time) when the computer restarts and the script runs, it completes steps 1 and 2 but not step 3.
And so what this means is the csv file is renamed but the script did NOT move it into the archive folder.
Why?
I open the script in PowerShell ISE and run the script manually and I see the reason why:
A file with that name already exists in the archive folder.
How can that happen if the file name is always dynamically renamed using a date/time stamp (down to the second)?
Turns out the variable which is assigned the value of Get-Date is not updated.
It still contains the old time.
Why does this happen if the very first thing I do in my PowerShell script is this:
$rightNow = Get-Date
I know it's not best practice to assign the current date and time to a variable and obviously the variable is not going to update itself as every second goes by. That's fine. I don't need it to. What I DO expect it to do is grab the current date and time (at the time this line of code runs) and assign it to my variable called $rightNow.
For some reason the variable is not getting updated.
Why does this happen? What's the best way for me to quickly grab the current date and time (down to the second) and use it as part of a file name?
Here is my current script:
$source = "C:\Logs"
$destination = "C:\Logs\archive"
$old = 7
$rightNow = Get-Date
# delete all files in the archive folder that are > 7 days old
Get-ChildItem $destination -Recurse |
Where-Object {-not $_.PSIsContainer -and
$rightNow.Subtract($_.CreationTime).Days -gt $old } |
Remove-Item
# rename all csv files in the Log folder by appending currentDate_currentTime
Get-ChildItem -Path $source\* -Include *.csv | % {
$name = $_.Name.Split(".")[0] + "_" + ($_.CreationTime | Get-Date -Format yyyyMMdd) + "_" + ($_.CreationTime | Get-Date -Format hhmmss) + ".csv"
Rename-Item $_ -NewName $name
Move-Item "$($_.Directory)\$name" -Destination $destination
}
You don't use the current date in the rename; you use the file's CreationTime property. If you want the current datetime, try:
$name = $_.BaseName + [datetime]::Now.ToString('_yyyyMMdd_HHmmss') + $_.Extension # HH = 24-hour clock, so AM and PM names can't collide
Or better yet just perform the rename as part of the move process.
$source = "C:\Logs"
$destination = "C:\Logs\archive"
$old = 7
# delete all files in the archive folder that are > 7 days old
Get-ChildItem $destination -Recurse -File |
Where-Object { $_.CreationTime -lt [datetime]::Today.AddDays(-$old) } |
Remove-Item
# rename all csv files in the Log folder by appending currentDate_currentTime
Get-ChildItem -Path $source\* -Include *.csv | % {
    $_ | Move-Item -Destination ("$destination\" + $_.BaseName + [datetime]::Now.ToString('_yyyyMMdd_HHmmss') + $_.Extension)
}
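As a side note on the original confusion: the value stored in a variable like $rightNow is frozen at the moment of assignment, while calling Get-Date (or [datetime]::Now) again returns a fresh timestamp. A tiny illustration, separate from the script above:
$rightNow = Get-Date
Start-Sleep -Seconds 2
$rightNow.ToString('yyyyMMdd_HHmmss')   # still the value captured 2 seconds ago
(Get-Date).ToString('yyyyMMdd_HHmmss')  # the current time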