Finding errors in several folders full of log files with a PowerShell script - powershell

I want to find "error" in log files. Each folder is named with a date and time; you can see below what that looks like. Inside each of these folders there are further folders named "mail1", "mail2", and so on, and the log files are inside mail1, mail2, mail3, etc. The path to one of the log files is c:\2019-05-24 00.00.09\mail1\mail.log
2019-05-24 00.00.09
2019-05-23 00.00.08
2019-05-22 00.00.05
2019-05-21 00.00.06
2019-05-20 00.00.09
My example below only shows how to find errors in a single log file.
Get-Content C:\Users\123\Desktop\log\mail.log | Select-Object -First 10000 | Select-String "Error" | Out-File C:\Users\1234\Desktop\leave\outputerror.txt
Can someone please give me a simple example of how to find errors across several folders full of log files?

Let's say your 2019-05-24 00.00.09 folders are located in C:\LogFolder. Then you can use something like this:
Get-ChildItem C:\LogFolder -Recurse -Filter *.log | ForEach-Object { Get-Content $_.FullName | Select-Object -First 10000 | Select-String "Error" } | Out-File C:\Users\1234\Desktop\leave\outputerror.txt -Append
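Note that because Get-Content feeds Select-String plain strings, the output won't record which file each match came from. If you want that, a variant like the following could work (a sketch reusing the question's paths; it drops the 10000-line cap because Select-String reads each file itself):
Get-ChildItem C:\LogFolder -Recurse -Filter *.log |
    # Select-String opens each file itself, so every match carries Path and LineNumber
    Select-String -Pattern "Error" |
    ForEach-Object { "{0}:{1}: {2}" -f $_.Path, $_.LineNumber, $_.Line } |
    Out-File C:\Users\1234\Desktop\leave\outputerror.txt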

Related

File Size with PowerShell

What I am trying to do is create a PS script to see when a certain folder has a file over 1GB. If it finds a file over 1GB, I want it to write a log file with info giving the name of that file and its size.
This works, but not fully: if the file is less than 1GB I don't want a log file. (Right now it displays the file info for files over 1GB, but if a file is less than 1GB it still creates a log file with no data.) I don't want it to create a log for anything less than 1GB.
Any idea on how to do that?
Thanks!
Ryan
Get-ChildItem -Path C:\Tomcat6.0.20\logs -File -Recurse -ErrorAction SilentlyContinue |
Where-Object {$_.Length -gt 1GB} |
Sort-Object Length -Descending |
Select-Object Name,@{n='GB';e={"{0:N2}" -f ($_.Length / 1GB)}} |
Format-List Name,Directory,GB > C:\Users\jensen\Desktop\FolderSize\filesize.log
First, set a variable with the term/filter you're after and store the results
$items = Get-ChildItem -Path C:\Tomcat6.0.20\logs -File -Recurse -ErrorAction SilentlyContinue |
Where-Object {$_.Length -gt 1GB} |
Sort-Object Length -Descending |
Select-Object Name,@{n='GB';e={"{0:N2}" -f ($_.Length / 1GB)}}
Then pipe that to Out-File to your desired output path. This example will output a file to the Desktop of the user running the script, change as needed:
$items | Out-File -FilePath $ENV:USERPROFILE\Desktop\filesize.log -Force
The -Force parameter will overwrite an existing filesize.log file if one already exists.
To make sure you don't write blank files, you should collect the minimal starting results that match your filter and test whether they contain anything at all.
If they don't, you can end the script; if they do, you can go on to sort and select the data and output it to a log file.
# Collect Matching Files
$Matched = GCI -Path "C:\Tomcat6.0.20\logs" -File -Recurse -ErrorA "SilentlyContinue" | ? {
$_.Length -gt 1GB
}
# Check if $Matched contains results before further processing; otherwise, we're done!
IF ([bool]$Matched) {
# If here, we have Data so select what we need and output to the log file:
$Matched | Sort Length -Descending | FT Name,Directory,@{
n="GB";e={"{0:N2}" -F ($_.Length / 1GB)}
} -Auto | Out-File "C:\Users\jensen\Desktop\FolderSize\filesize_$(Get-Date -F "yyyy-MM-dd_HH-mm-ss").log"
}
In the above script, I fixed the $. to be $_., and separated matching the 1GB files from manipulating them and outputting them to a file.
We simply test whether any files were matched at 1GB by checking whether the variable has any results or is $null/undefined.
If not, there is no need to take any further action.
Only when 1GB files are matched do we sort them and select the details you wanted; instead of Select-Object, we just use Format-Table (FT) with -AutoSize to get nice-looking output that is much easier to review for this sort of data.
(Note that Format-Table selects and formats the info into a table in one step, saving the unnecessary step of using Select-Object to get the data and then piping (|) it to Format-List or Format-Table, which just adds a redundant step. Select-Object is best used when you need to do further work with that data that requires object-oriented operations in later steps.)
Then we pipe that output to Out-File to save it all to a log file, and I also changed the log file name to contain the current date and time in ISO format: filesize_$(Get-Date -F "yyyy-MM-dd_HH-mm-ss").log. That way you can save each run and review them later, rather than having one gigantic file or no history of runs.
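For reference, this is what that timestamp expression evaluates to on its own (output shown is illustrative):
Get-Date -Format "yyyy-MM-dd_HH-mm-ss"   # e.g. 2019-06-05_14-30-00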

Backup QVD files every day & save three versions before removing the first generation

I have a few QlikView servers with a lot of QVD files I need to back up.
The idea is to back up three generations. So let's say the app is named tesla.qvd:
Back it up as tesla.qvd.2019-06-05 if the file was modified today.
Then back up a new copy the next time it's modified/written to.
In total I would like to save two generations before the first one is removed.
This is for a Windows 2012 server, using PS 4.0.
#$RemotePath = "C:\qlikview Storage\privatedata\backup\"
$LocalPath = "C:\qlikview Storage\privatedata"
$nomatch = "*\backup\*"
$predetermined = [System.DateTime](Get-Date)
$date = ($predetermined).AddDays(-1).ToString("MM/dd/yyyy:")
foreach ($file in (Get-ChildItem -File $LocalPath -Recurse | Where {$_.FullName -notlike $nomatch} -Verbose)) {
Copy-Item -Path $file.FullName -Destination "C:\qlikview Storage\privatedata\backup\$file.$(Get-Date -f yyyy-MM-dd)"
}
The code above backs the files up with the dates as described in the text before the code.
It's how to proceed from here that's my problem.
I tried Google and searched the forum.
I'm not asking for someone to solve the whole issue for me.
But if you could point me to which functions / what I should look at to get my end result, it would help a lot so I can proceed.
In the picture (not reproduced here) you can see an example of how the library looks after a backup has been done. The LastWriteTime on the files would be the same as the date, though; the listing was created just for this question.
You can use the BaseName property of the files in the backup folder, since you add a new extension to the files. It would look something like this:
# Group by BaseName and find groups with more than 2 backups
$Groups = Get-ChildItem -Path "C:\qlikview Storage\privatedata\backup" | Group-Object BaseName | Where-Object {$_.Count -gt 2}
foreach ($g in $Groups) {
$g.Group | sort LastWriteTime -Descending | select -Skip 2 | foreach {del $_.FullName -Force}
}
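To see why grouping on BaseName works here: the appended date becomes the file's extension, so BaseName gives back the original file name. A quick illustrative check (the file name is hypothetical):
# The date suffix counts as the extension, so stripping it recovers the original name
[System.IO.Path]::GetFileNameWithoutExtension("tesla.qvd.2019-06-05")   # returns: tesla.qvd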

Outputting Remove-Item to a log file

I'm scanning a directory for a specific set of files and sorting them by date, keeping the 7 latest copies of the file regardless of date and removing the oldest if there are more than 7. I am having a hard time producing a log file showing the deletes, since Remove-Item has no output.
Below is a copy of my code:
$path = "C:\- Deploy to Production -\Previous Deploys\*_*_BOAWeb.rar" #BOA
$files = Get-ChildItem -Path $path -Recurse | Where-Object {-not $_.PsIsContainer}
$keep = 7
if ($files.Count -gt $keep) {
$files | Sort-Object CreationTime |
Select-Object -First ($files.Count - $keep) |
Remove-Item -Force
}
First off, you are overcomplicating things. Add -Descending to your Sort-Object command, and then change your Select-Object to -Skip $keep; it's simpler that way. Then you have options for outputting your deleted files.
Remove-Item -Force -Verbose 4>&1 | Add-Content C:\Path\To\DeletedFiles.log
or (keeping with your current code above)
Select-Object -First ($files.Count - $keep) | Tee-Object -FilePath C:\Path\To\DeletedFiles.log -Append
The first will take the verbose output of Remove-Item and append it to whatever log file you specify the path for (use Set-Content if you want to replace the log instead). The second option will append the [FileInfo] objects to a log that you specify.
Edit: As pointed out by Ansgar Wiechers, I had forgotten to combine my verbose and stdout streams, so 4>&1 was added to the above code to correct that issue.
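Put together with the question's code, the first option might look like this (a sketch; the log path is a placeholder):
$path = "C:\- Deploy to Production -\Previous Deploys\*_*_BOAWeb.rar" #BOA
$keep = 7
Get-ChildItem -Path $path -Recurse |
    Where-Object { -not $_.PsIsContainer } |
    # Newest first; skip the 7 to keep, delete the rest and log the verbose output
    Sort-Object CreationTime -Descending |
    Select-Object -Skip $keep |
    Remove-Item -Force -Verbose 4>&1 |
    Add-Content C:\Path\To\DeletedFiles.log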

Keep x versions of a file in Folder - delete rest

I have a backup process that makes a copy of files, appending the system time at the end. The date stamp indicates when the file was received via FTP (ddMMyyyyHHmmss).
fileName1.ZIP02062015090653
fileName1.ZIP01062015090653
fileName1.ZIP31052015090653
fileName1.ZIP29052015090653
fileName1.ZIP28052015090653
fileName1.ZIP21052015090653
fileName2.ZIP02062015090653
fileName3.ZIP02062015090653
reportName1.PDF02062015090653
reportNameX.TXT02062015090653
etc..
I need the script to keep the 5 most recent versions of each file.
I.e. fileName1.ZIP21052015090653 <- this one should get deleted.
I was trying to work from the script below, but it deletes everything after the 5th file overall:
gci C:\temp\ -Recurse | where {-not $_.PsIsContainer} | sort CreationTime -desc |
select -Skip 5 | Remove-Item -Force
I don't mind whether the script uses fileName, dateModified, creationTime or the date stamp; I'd just like to keep 5 versions of each file and blow away the oldest.
Try this:
gci C:\temp\ -Recurse |
where{-not $_.PsIsContainer} |
Group-Object basename |
foreach {
$_.Group |
sort CreationTime -Descending |
Select -Skip 5 |
foreach { Remove-Item $_.fullname -Force -WhatIf }
}
That will group the files by their basename, then sort and delete within each group.
If the results look right, remove the -WhatIf and re-run it to actually delete the files.
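Since the question allows sorting by the date stamp as well: if you'd rather rank versions by the embedded ddMMyyyyHHmmss stamp instead of CreationTime, a variant could parse the trailing 14 digits out of the name. This is an untested sketch along the same lines:
gci C:\temp\ -Recurse |
    where { -not $_.PsIsContainer } |
    Group-Object basename |
    foreach {
        $_.Group |
            # Parse the ddMMyyyyHHmmss stamp at the end of the name, newest first
            sort { [datetime]::ParseExact($_.Name.Substring($_.Name.Length - 14), 'ddMMyyyyHHmmss', $null) } -Descending |
            select -Skip 5 |
            foreach { Remove-Item $_.fullname -Force -WhatIf }
    }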

Delete files containing string

How can I delete all files in a directory that contain a string using powershell?
I've tried something like
$list = get-childitem *.milk | select-string -pattern "fRating=2" | Format-Table Path
$list | foreach { rm $_.Path }
And that worked for some files, but it did not remove everything. I've tried various other things but nothing is working.
I can easily get the list of file names, and I can create an array with just the paths using
$lista = @(); foreach ($f in $list) { $lista += $f.Path }
but I can't seem to get any command (del, rm, or Remove-Item) to do anything. They just return immediately without deleting the files or giving errors.
Thanks
First we can simplify your code as:
Get-ChildItem "*.milk" | Select-String -Pattern "fRating=2" | Select-Object -ExcludeProperty path | Remove-Item -Force -Confirm
The lack of action and errors might be addressed by one of two things. The first is the -Force parameter, which:
Allows the cmdlet to remove items that cannot otherwise be changed,
such as hidden or read-only files or read-only aliases or variables.
I would also suggest that you run this script as administrator. Depending on where these files are located, you might not have permissions. If this is not the case or it does not work, please include the error you are getting.
I'm going to guess the error is:
remove-item : Cannot remove item C:\temp\somefile.txt: The process cannot access the file 'C:\temp\somefile.txt'
because it is being used by another process.
Update
In testing, I was also getting a similar error. Upon research, it looks like the Select-String cmdlet was holding onto the file, preventing its deletion; this is an assumption on my part, based on never having seen Get-ChildItem do this before. The solution in that case would be to enclose the first part in parentheses as a subexpression, so it processes all the files before going through the pipe.
(Get-ChildItem | Select-String -Pattern "tes" | Select-Object -ExpandProperty path) | Remove-Item -Force -Confirm
Remove -Confirm if deemed necessary. It exists as a precaution so that you don't open a new PowerShell in c:\windows\system32 and paste a remove-item cmdlet in there.
Another Update
[ and ] are wildcard characters in PowerShell; to escape them in some cmdlets you use -LiteralPath. Also, Select-String can return multiple hits per file, so we should use -Unique:
(Get-ChildItem *.milk | Select-String -Pattern "fRating=2" | Select-Object -ExpandProperty path -Unique) | ForEach-Object{Remove-Item -Force -LiteralPath $_}
Why do you use select-string -pattern "fRating=2"? Do you want to select all files with this name?
I don't think Format-Table Path works here; Get-ChildItem output doesn't have a property called "Path".
Does this snippet work for you?
$list = get-childitem *.milk | Where-Object -FilterScript {$_.Name -match "fRating=2"}
$list | foreach { rm $_.FullName }
The following code gets all files of type *.milk and puts them in $listA, then uses that list to get all the files that contain the string fRating=[01] and stores them in $listB. The files in $listB are deleted, and then the number of files deleted versus the number of files that contained the match is displayed (they should be equal).
sv -Name listA -Value (Get-ChildItem *.milk)
sv -Name listB -Value ($listA | Select-String -Pattern "fRating=[01]")
($listB | Select-Object -ExpandProperty Path) | ForEach-Object {Remove-Item -Force -LiteralPath $_}
sv -Name FCount -Value ((Get-ChildItem *.milk).Count)
Write-Host -NoNewline "Files Deleted $($listA.Count - $FCount)/$($listB.Count)`n"
No need to complicate things:
$sourcePath = "\\path\to\the\file\"
Remove-Item "$sourcePath*whatever*"
I tried the answers above, but errors always seemed to come up; however, I managed to create a solution that gets this done:
Without using Get-ChildItem, you can use Select-String directly to search for files matching a certain string. Yes, this returns filename:count:content etc., but internally these are named properties you can choose or omit; the one you need is "Filename". To get it, pipe the result into Select-Object, choosing Filename from the output.
So, to select all *.MSG files that have the pattern "Subject: WebServices Restarted", you can do the following:
Select-String -Path .\*.MSG -Pattern 'Subject: WebServices Restarted' -List | Select-Object Filename
Also, to remove these files on the fly, you can pipe into a foreach statement with the rm command, as follows:
Select-String -Path .\*.MSG -Pattern 'Subject: WebServices Restarted' -List | Select-Object Filename | foreach { rm $_.FileName }
I tried this myself; it works 100%.
I hope this helps.
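One caveat to add (my note, not part of the answer above): Select-String's FileName property holds only the bare file name, so the rm call works only if you run it from the directory that contains the files. Expanding the Path property instead makes it location-independent; a sketch, wrapped in parentheses as suggested earlier so all files are read before any deletion:
# Hypothetical variant: use the full Path so it works from any current directory
(Select-String -Path .\*.MSG -Pattern 'Subject: WebServices Restarted' -List |
    Select-Object -ExpandProperty Path) |
    ForEach-Object { Remove-Item $_ }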