Hope someone can help me out:
I'm trying to add an exclusion to $arrFiles in a Log4j detection script.
Original: $arrFiles | ? {$_ -match '.jar$'} | % {
My try:
$arrFiles | ? { ($_ -match '.jar$') -and ($_ -notmatch 'jbcdserver.jar')} | % {
I would like to make several exclusions, but getting even one working would be fine.
$arrFiles gives me output like c:\program files\application\jbcdserver.jar
This is a piece of the script:
#get a list of all files-of-interest on the device (depending on scope) :: GCI is broken; permissions errors when traversing root dirs cause aborts (!!!)
$arrFiles=@()
foreach ($drive in $varDrives) {
gci "$drive\" -force | ? {$_.PSIsContainer} | % {
gci -path "$drive\$_\" -rec -force -include *.jar,*.log,*.txt -ErrorAction 0 | % {
$arrFiles+=$_.FullName
}
}
}
#scan i: JARs containing vulnerable Log4j code
write-host "====================================================="
write-host "- Scanning for JAR files containing potentially insecure Log4j code..."
$arrFiles | ? {$_ -match '.jar$'} | % {
if (select-string -Quiet -Path $_ "JndiLookup.class") {
write-DRMMAlert "! ALERT: Potentially vulnerable file at $($_)!"
write-DRMMDiag @($arrFiles | out-string)
if (!(test-path "$env:PROGRAMDATA\CentraStage\L4Jdetections.txt" -ErrorAction SilentlyContinue)) {set-content -path "$env:PROGRAMDATA\CentraStage\L4Jdetections.txt" -Value "! CAUTION !`r`n$(get-date)"}
Add-Content "$env:PROGRAMDATA\CentraStage\L4Jdetections.txt" -Value "POTENTIALLY VULNERABLE JAR: $($_)"
$script:varDetection=1
Write-EventLog -LogName "Application" -Source "Log4J" -EventID 1004 -EntryType Error -Message "ALERT!: Potentially vulnerable files found, check Programdata\centrastage\L4Jdetections.txt"
exit 1
}
}
My attempt doesn't exclude anything when the script runs.
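For reference, a minimal sketch of the kind of filter being attempted, with the regex dot escaped and the exclusions kept in a list (the second entry is purely hypothetical):
$arrExclusions = @('jbcdserver\.jar$', 'someotherserver\.jar$')  # hypothetical exclusion patterns
$notPattern = $arrExclusions -join '|'                           # combine into one alternation regex
$arrFiles | ? { ($_ -match '\.jar$') -and ($_ -notmatch $notPattern) } | % {
    $_  # only JARs that match no exclusion pattern reach this block
}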
New to PowerShell.
Some experience with Linux, bash, and programming (Java, C, HTML/CSS/JS). I just started an internship.
I was given a PowerShell script to do basic disk clean-up; part of it is pasted below. It writes to both the console and a logfile. Some of the servers that I am cleaning have hundreds of thousands of files, so I want to improve the script's performance by writing only to the logfile. It usually starts out pretty strong, but once the console output gets large enough, things slow down drastically.
I attempted to simply remove the -Verbose flags, but then it doesn't write to either. My understanding was that 'SilentlyContinue' would allow printing to the log but not the console, yet the code already has SilentlyContinue flags. I then tried adding -Verbose to some of the ForEach statements and that didn't work either.
I'm just kind of running in circles now.
Any ideas or pointers?
function global:Write-Verbose
(
[string]$Message
)
{ # check $VerbosePreference variable
if ( $VerbosePreference -ne 'SilentlyContinue' )
{ Write-Host " $Message" -ForegroundColor 'Yellow' }
}
Write-Verbose
$DaysToDelete = 7
$LogDate = get-date -format "MM-d-yy-HH"
$objShell = New-Object -ComObject Shell.Application
$objFolder = $objShell.Namespace(0xA)
Start-Transcript -Path C:\Windows\Temp\$LogDate.log
#Cleans all code off of the screen.
Clear-Host
$Before = Get-WmiObject Win32_LogicalDisk | Where-Object { $_.DriveType -eq "3" } | Select-Object SystemName,
@{ Name = "Drive" ; Expression = { ( $_.DeviceID ) } },
@{ Name = "Size (GB)" ; Expression = { "{0:N1}" -f ( $_.Size / 1gb) } },
@{ Name = "FreeSpace (GB)" ; Expression = { "{0:N1}" -f ( $_.Freespace / 1gb ) } },
@{ Name = "PercentFree" ; Expression = { "{0:P1}" -f ( $_.FreeSpace / $_.Size ) } } |
Format-Table -AutoSize | Out-String
## Stops the windows update service.
Get-Service -Name wuauserv | Stop-Service -Force -Verbose -ErrorAction SilentlyContinue
## Windows Update Service has been stopped successfully!
## Deletes the contents of windows software distribution.
Get-ChildItem "C:\Windows\SoftwareDistribution\*" -Recurse -Force -Verbose -ErrorAction SilentlyContinue |
Where-Object { ($_.CreationTime -lt $(Get-Date).AddDays(-$DaysToDelete)) } |
remove-item -force -Verbose -recurse -ErrorAction SilentlyContinue
## The Contents of Windows SoftwareDistribution have been removed successfully!
## Deletes the contents of the Windows Temp folder.
Get-ChildItem "C:\Windows\Temp\*" -Recurse -Force -Verbose -ErrorAction SilentlyContinue |
Where-Object { ($_.CreationTime -lt $(Get-Date).AddDays(-$DaysToDelete)) } |
remove-item -force -Verbose -recurse -ErrorAction SilentlyContinue
## The Contents of Windows Temp have been removed successfully!
## Deletes all files and folders in user's Temp folder.
Get-ChildItem "C:\users\$env:USERNAME\AppData\Local\Temp\*" -Recurse -Force -ErrorAction SilentlyContinue |
Where-Object { ($_.CreationTime -lt $(Get-Date).AddDays(-$DaysToDelete)) } |
remove-item -force -Verbose -recurse -ErrorAction SilentlyContinue
## The contents of C:\users\$env:USERNAME\AppData\Local\Temp\ have been removed successfully!
## Remove all files and folders in user's Temporary Internet Files.
Get-ChildItem "C:\users\$env:USERNAME\AppData\Local\Microsoft\Windows\Temporary Internet Files\*" -Recurse -Force -Verbose -ErrorAction SilentlyContinue |
Where-Object { ($_.CreationTime -le $(Get-Date).AddDays(-$DaysToDelete)) } |
remove-item -force -recurse -ErrorAction SilentlyContinue
## All Temporary Internet Files have been removed successfully!
## Cleans IIS Logs if applicable.
Get-ChildItem "C:\inetpub\logs\LogFiles\*" -Recurse -Force -ErrorAction SilentlyContinue |
Where-Object { ($_.CreationTime -le $(Get-Date).AddDays(-60)) } |
Remove-Item -Force -Verbose -Recurse -ErrorAction SilentlyContinue
## All IIS Logfiles over x days old have been removed Successfully!
## deletes the contents of the recycling Bin.
$objFolder.items() | ForEach-Object { Remove-Item $_.path -ErrorAction Ignore -Force -Verbose -Recurse }
## The Recycling Bin has been emptied!
## Starts the Windows Update Service
Get-Service -Name wuauserv | Start-Service -Verbose
$After = Get-WmiObject Win32_LogicalDisk | Where-Object { $_.DriveType -eq "3" } | Select-Object SystemName,
@{ Name = "Drive" ; Expression = { ( $_.DeviceID ) } },
@{ Name = "Size (GB)" ; Expression = { "{0:N1}" -f ( $_.Size / 1gb) } },
@{ Name = "FreeSpace (GB)" ; Expression = { "{0:N1}" -f ( $_.Freespace / 1gb ) } },
@{ Name = "PercentFree" ; Expression = { "{0:P1}" -f ( $_.FreeSpace / $_.Size ) } } |
Format-Table -AutoSize | Out-String
## Sends some before and after info for ticketing purposes
Hostname ; Get-Date | Select-Object DateTime
Write-Host "Before: $Before"
Write-Host "After: $After"
Write-Verbose ( Get-ChildItem -Path C:\* -Include *.iso, *.vhd, *.vhdx -Recurse -ErrorAction SilentlyContinue |
Sort Length -Descending | Select-Object Name, Directory,
@{Name = "Size (GB)"; Expression = { "{0:N2}" -f ($_.Length / 1GB) } } | Format-Table |
Out-String )
## Completed Successfully!
Stop-Transcript
You want to look into Redirection: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_redirection?view=powershell-7.1
It's useful for logging your Catch output: wrap the commands in &{ } and specify your stream redirections. Ex: &{ Write-Verbose ... } 4>&1 3>&1 2>&1 >> $logFile
Most of what you have, though, looks like you're trying to log data for reference, like your Before/After and final Get-ChildItem statements. If you're logging as you go, you can pipe to Out-File -Append. But since you have it in a block towards the end, you can simply wrap and redirect:
&{
Write-Host "Before: " $Before
Write-Host "After: " $After
Get-ChildItem -Path C:\* -Include *.iso, *.vhd, *.vhdx -Recurse -ErrorAction SilentlyContinue |
Sort Length -Descending | Select-Object Name, Directory,
@{Name = "Size (GB)"; Expression = { "{0:N2}" -f ($_.Length / 1GB) } } | Format-Table |
Out-String
} *> C:\FilePath\File.txt
Notice you don't even need to wrap your Get-ChildItem in a Write-Verbose statement. Get cmdlets that write to the console will simply send their output to the file when you use redirection. The only time you should need a Write statement is when you're adding a string of text / interpolating data.
On an unrelated note, I see a DRY opportunity. Your before and after statements are identical. Put a function towards the top of your file that returns the report; that way you can just assign your $Before and $After vars from its output, as sketched below.
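A minimal sketch of that idea (the function name Get-DiskReport is just an illustration, not part of the original script):
function Get-DiskReport {
    Get-WmiObject Win32_LogicalDisk | Where-Object { $_.DriveType -eq "3" } |
        Select-Object SystemName,
        @{ Name = "Drive" ; Expression = { ( $_.DeviceID ) } },
        @{ Name = "Size (GB)" ; Expression = { "{0:N1}" -f ( $_.Size / 1gb) } },
        @{ Name = "FreeSpace (GB)" ; Expression = { "{0:N1}" -f ( $_.Freespace / 1gb ) } },
        @{ Name = "PercentFree" ; Expression = { "{0:P1}" -f ( $_.FreeSpace / $_.Size ) } } |
        Format-Table -AutoSize | Out-String
}
$Before = Get-DiskReport   # capture the report before cleanup
# ... cleanup steps ...
$After = Get-DiskReport    # capture it again afterwards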
I found a PowerShell script on TechNet to help locate duplicate files in folders. However, when I run it, I get an error on what appears to be every folder\file. I'm not sure what switch is supposed to be used here.
$Path = '\\servername\Share\Folders' #define path to folders to find duplicate files
$Files=gci -File -Recurse -path $Path | Select-Object -property FullName,Length
$Count=1
$TotalFiles=$Files.Count
$MatchedSourceFiles=@()
ForEach ($SourceFile in $Files)
{
Write-Progress -Activity "Processing Files" -status "Processing File $Count / $TotalFiles" -PercentComplete ($Count / $TotalFiles * 100)
$MatchingFiles=@()
$MatchingFiles=$Files |Where-Object {$_.Length -eq $SourceFile.Length}
Foreach ($TargetFile in $MatchingFiles)
{
if (($SourceFile.FullName -ne $TargetFile.FullName) -and !(($MatchedSourceFiles |
Select-Object -ExpandProperty File) -contains $TargetFile.FullName))
{
Write-Verbose "Matching $($SourceFile.FullName) and $($TargetFile.FullName)"
Write-Verbose "File sizes match."
if ((fc.exe /A $SourceFile.FullName $TargetFile.FullName) -contains "FC: no differences encountered")
{
Write-Verbose "Match found."
$MatchingFiles+=$TargetFile.FullName
}
}
}
if ($MatchingFiles.Count -gt 0)
{
$NewObject=[pscustomobject][ordered]@{
File=$SourceFile.FullName
MatchingFiles=$MatchingFiles
}
$MatchedSourceFiles+=$NewObject
}
$Count+=1
}
$MatchedSourceFiles
Errors
FC: Insufficient number of file specifications
fc.exe : FC: Invalid Switch
At line:18 char:12
gci : Could not find a part of the path
At line:2 char:8
fc.exe : FC: Invalid Switch
At line:18 char:12
To fix your fc.exe error and optimize your script, I also recommend @rich-moss's solution.
But if you only want to find duplicates, you can accomplish that easily by comparing file hashes.
Example:
$Duplicates = Get-ChildItem -File -Recurse | Get-FileHash | Group-Object -Property Hash | Where-Object Count -gt 1
If ($Duplicates.Count -lt 1) {
    $null # 'No duplicates found. Do stuff ...'
} else {
    $result = foreach ($d in $Duplicates) {
        $d.Group | Select-Object -Property Path, Hash
    }
}
The script you provided is very inefficient and produced false positives in my tests. It's inefficient because it compares every file twice (Source->Target and Target->Source) and because it iterates through all files regardless of size. Here's a quicker version that groups files of the same size and executes FC.EXE only once per pair of files:
$Path = 'C:\Temp'
$SameSizeFiles = gci -Path $Path -File -Recurse | Select FullName, Length | Group-Object Length | ? {$_.Count -gt 1} #the list of files with same size
$MatchingFiles=@()
$GroupNdx=1
Foreach($SizeGroup in ($SameSizeFiles | Select Group)){
For($FromNdx = 0; $FromNdx -lt $SizeGroup.Group.Count - 1; $FromNdx++){
For($ToNdx = $FromNdx + 1; $ToNdx -lt $SizeGroup.Group.Count; $ToNdx++){
If( (fc.exe /A $SizeGroup.Group[$FromNdx].FullName $SizeGroup.Group[$ToNdx].FullName) -contains "FC: no differences encountered"){
$MatchingFiles += [pscustomobject]@{File=$SizeGroup.Group[$FromNdx].FullName; Match = $SizeGroup.Group[$ToNdx].FullName }
}
}
}
Write-Progress -Activity "Finding Duplicates" -status "Processing group $GroupNdx of $($SameSizeFiles.Count)" -PercentComplete ($GroupNdx / $SameSizeFiles.Count * 100)
$GroupNdx += 1
}
$MatchingFiles
Efficiency will be even more important if you're running it over the network. You may find it quicker to execute the script on the server itself, rather than from a share. There is some discussion here about the fastest way to compare files in .Net.
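If you want to avoid shelling out to fc.exe entirely, here is a rough sketch of a .NET-based byte comparison (my own illustration, not code from the linked discussion): it reads both files in 64 KB chunks and stops at the first difference.
function Test-FilesEqual {
    param([string]$PathA, [string]$PathB)
    $fsA = [System.IO.File]::OpenRead($PathA)
    $fsB = [System.IO.File]::OpenRead($PathB)
    try {
        if ($fsA.Length -ne $fsB.Length) { return $false }   # different sizes can never match
        $bufA = New-Object byte[] 65536
        $bufB = New-Object byte[] 65536
        while ($true) {
            $readA = $fsA.Read($bufA, 0, $bufA.Length)
            $readB = $fsB.Read($bufB, 0, $bufB.Length)
            if ($readA -eq 0 -and $readB -eq 0) { return $true }   # both streams exhausted: identical
            if ($readA -ne $readB) { return $false }               # defensive; rare for disk files
            for ($i = 0; $i -lt $readA; $i++) {
                if ($bufA[$i] -ne $bufB[$i]) { return $false }
            }
        }
    } finally {
        $fsA.Dispose()
        $fsB.Dispose()
    }
}
It would drop in where the fc.exe call sits above, e.g. If (Test-FilesEqual $SizeGroup.Group[$FromNdx].FullName $SizeGroup.Group[$ToNdx].FullName) { ... }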
My company had an old batch script that ran every 15 minutes and collected the event log from two servers and put it in a location where devs can access it for triage purposes.
It created and then maintained an event log file (.evtx) for each server for an entire day and then started a new one the next day, etc.
I recently rewrote it from batch to PowerShell and added some folder cleanup, file zipping, etc., but it is only partially working.
Issue:
The script should keep the newest 7 files, based on creation date, and remove anything in the 8th, 9th, 10th, etc. spot. Instead it leaves only 5 files in one folder (for one server) and 4 files in another (for the other server), and I don't know why. Another issue I've noticed once or twice is that it deletes the file in the 4th spot of the list but ignores the 5th and then deletes the 6th, and so on.
I'm not sure about the zips part, which is set to 60 days, as my script has only been running for about 20-25 days.
Code:
# Cleaning up LogFile folder
Write-Output "Cleaning up existing *.evtx, *.zip and *.7z files to ensure efficient disk space usage..." | Out-File $HistFile -append
Write-Output " " | Out-File $HistFile -append
# Now cleaning up event logs at an interval of 7 days
$EventLogsCount = Get-ChildItem $Path -Recurse -Exclude *.zip, *.7z, *.ps1, *.txt | Where {-not $_.PsIsContainer} | Sort CreationTime -desc | Select -Skip 7 | %{$_.Count}
if ($EventLogsCount -eq $null)
{
Write-Output "No event logs to remove..." | Out-File $HistFile -append
Write-Output " " | Out-File $HistFile -append
}
else
{
Write-Output "Removing the following event log files:" | Out-File $HistFile -append
Write-Output " " | Out-File $HistFile -append
Get-ChildItem $Path -Recurse -Exclude *.zip, *.7z, *.ps1, *.txt | Where {-not $_.PsIsContainer} | Sort CreationTime -desc | Select -Skip 7 | foreach {
$_ | Remove-Item -Recurse
if (Test-Path $_)
{
"Failed to remove $_"
}
else
{
"$_"
}
} | Out-File $HistFile -append
Write-Output " " | Out-File $HistFile -append
}
# Cleaning up zips at a greater interval of 60 days
$ZipsCount = Get-ChildItem $Path -Recurse -Exclude *.evtx, *.ps1, *.txt | Where {-not $_.PsIsContainer} | Sort CreationTime -desc | Select -Skip 60 | %{$_.Count}
if ($ZipsCount -eq $null)
{
Write-Output "No zipped files to remove..." | Out-File $HistFile -append
}
else
{
Write-Output "Removing the following zipped files:" | Out-File $HistFile -append
Write-Output " " | Out-File $HistFile -append
Get-ChildItem $Path -Recurse -Exclude *.evtx, *.ps1, *.txt | Where {-not $_.PsIsContainer} | Sort CreationTime -desc | Select -Skip 60 | foreach {
$_ | Remove-Item -Recurse
if (Test-Path $_)
{
"Failed to remove $_"
}
else
{
"$_"
}
} | Out-File $HistFile -append
Write-Output " " | Out-File $HistFile -append
}
Your logic is a little wonky. Currently you're gathering a list of files and skipping x number of them as a whole, based on their sorted creation time. You can use Get-ChildItem's -Include flag instead of excluding everything else. I've rewritten the script below to be more easily read and to do what you intended: it looks at each file's last write time and filters on your threshold (7 days for event logs, 60 days for zip files).
Script rewritten for intended functionality:
# Log script functionality
"Cleaning up existing *.evtx, *.zip and *.7z files to ensure efficient disk space usage...`r`n" >> $HistFile
# Now cleaning up event logs at an interval of 7 days
$EventLogs = GCI $Path -Include *.evtx -Recurse |
? { $_.LastWriteTime -lt (Get-Date).AddDays(-7) }
If (!$EventLogs) {
"No event logs to remove...`r`n" >> $HistFile
} Else {
"Removing the following event log files:`r`n" >> $HistFile
$EventLogs |
% {
Try {
Remove-Item $_ -EA Stop
$_.FullName >> $HistFile
} Catch {
"Failed to remove $($_.FullName)" >> $HistFile
}
}
}
# Cleaning up zips at a greater interval of 60 days
$ZipFiles = GCI $Path -Include *.zip,*.7z -Recurse |
? { $_.LastWriteTime -lt (Get-Date).AddDays(-60) }
If (!$ZipFiles) {
"No zipped files to remove..." >> $HistFile
} Else {
"Removing the following zipped files:`r`n" >> $HistFile
$ZipFiles |
% {
Try {
Remove-Item $_ -EA Stop
$_.FullName >> $HistFile
} Catch {
"Failed to remove $($_.FullName)" >> $HistFile
}
}
}
There is a COTS app we have that creates reports and never deletes them, so we need to start cleaning up. I started by doing a foreach over the collected results and would run out of memory on the server (36 GB) when it got up to 50-ish million files. After searching, it seemed you could change it like so:
Get-ChildItem -path $Path -recurse | foreach {
so that it doesn't build the whole list in memory but processes each item as it streams through the pipeline. That way I can get to 140 million files before I run out of memory.
Clear-Host
#Set Age to look for
$TimeLimit = (Get-Date).AddMonths(-4)
$Path = "D:\CC\LocalStorage"
$TotalFileCount = 0
$TotalDeletedCount = 0
Get-ChildItem -Path $Path -Recurse | foreach {
if ($_.LastWriteTime -le $TimeLimit) {
$TotalDeletedCount += 1
$_.Delete()
}
$TotalFileCount += 1
$FileDiv = $TotalFileCount % 10000
if ($FileDiv -eq 0 -and $TotalFileCount -ne 0) {
$TF = [string]::Format('{0:N0}', $TotalFileCount)
$TD = [string]::Format('{0:N0}', $TotalDeletedCount)
Write-Host "Files Scanned : " -ForegroundColor Green -NoNewline
Write-Host "$TF" -ForegroundColor Yellow -NoNewline
Write-Host " Deleted: " -ForegroundColor Green -NoNewline
Write-Host "$TD" -ForegroundColor Yellow
}
}
Is there a better way to do this? My only next thought was not to use the -Recurse switch but to write my own function that calls itself for each directory.
EDIT:
I used the code provided in the first answer and it does not solve the issue. Memory is still growing.
$limit = (Get-Date).Date.AddMonths(-3)
$totalcount = 0
$deletecount = 0
$Path = "D:\CC\"
Get-ChildItem -Path $Path -Recurse -File | Where-Object { $_.LastWriteTime -lt $limit } | Remove-Item -Force
Using the ForEach-Object and the pipeline should actually prevent the code from running out of memory. If you're still getting OOM exceptions I suspect that you're doing something in your code that counters this effect, which you didn't tell us about.
With that said, you should be able to clean up your data directory with something like this:
$limit = (Get-Date).Date.AddMonths(-4)
Get-ChildItem -Path $Path -Recurse -File |
Where-Object { $_.LastWriteTime -lt $limit } |
Remove-Item -Force -WhatIf
Remove the -WhatIf switch after you verified that everything is working.
If you need the total file count and the number of deleted files, add counters like this:
$totalcount = 0
$deletecount = 0
Get-ChildItem -Path $Path -Recurse -File |
ForEach-Object { $totalcount++; $_ } |
Where-Object { $_.LastWriteTime -lt $limit } |
ForEach-Object { $deletecount++; $_ } |
Remove-Item -Force -WhatIf
I don't recommend printing status information to the console when you're bulk-processing large numbers of files; the output can significantly slow down the processing. If you must have that information, write it to a log file and tail that file separately, along the lines of the sketch below.
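A minimal sketch of that approach (the log path and the 10,000-file interval are arbitrary choices, and -WhatIf is kept for safety):
$limit = (Get-Date).Date.AddMonths(-4)
$logFile = 'C:\Temp\cleanup-progress.log'   # hypothetical log location
$totalcount = 0
$deletecount = 0
Get-ChildItem -Path $Path -Recurse -File |
    ForEach-Object {
        $totalcount++
        if ($_.LastWriteTime -lt $limit) {
            $deletecount++
            Remove-Item -LiteralPath $_.FullName -Force -WhatIf
        }
        # append a progress line every 10,000 files instead of writing to the console
        if ($totalcount % 10000 -eq 0) {
            "$(Get-Date -Format s)  scanned: $totalcount  deleted: $deletecount" | Add-Content -Path $logFile
        }
    }
# Tail the log from a second console with:
#   Get-Content $logFile -Wait -Tail 10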
I am trying to handle errors when scanning through folders. Let's say I have something like:
Get-ChildItem $somepath -Directory | ForEach-Object {
if(error occurred due to too long path) {
skip this folder then
} else {
Write-Host $_.BaseName
}
}
When I do this, it prints the folders in $somepath until one of them is too long, and then the loop stops, even when using SilentlyContinue. I want it to keep printing after it reaches a folder whose path is too long.
If you can install a non-ancient PowerShell version (3.0 or newer), simply prepend the path with \\?\ to overcome the 260-character limit on the full path:
Get-ChildItem "\\?\$somepath" | ForEach {
# ............
}
You could try ignoring paths longer than 260 characters by using the Where-Object cmdlet.
Get-ChildItem $somepath -Directory -ErrorAction SilentlyContinue `
| Where-Object {$_.FullName.Length -lt 261} `
| ForEach-Object { Write-Host $_.BaseName }
Or you could use the following (Ref).
cmd /c dir $somepath /s /b | Where-Object {$_.length -lt 261}
I will add my solution, since neither of the answers on this page worked for me. I am using relative paths, so I can't use the \\?\ prefix.
$TestFiles = Get-ChildItem $pwd "*Tests.dll" -Recurse | Where-Object {$_.FullName.length -lt 261} | Select-Object FullName -ExpandProperty FullName | Where-Object FullName -like "*bin\Release*"
Write-Host "Found TestFiles:"
foreach ($TestFile in $TestFiles) {
Write-Host " $TestFile"
}