File Size with PowerShell

What I am trying to do is create a PowerShell script that checks a certain folder for any file over 1GB. If it finds a file over 1GB, I want it to write a log file with the name of that file and its size.
This works, but not fully: if the file is less than 1GB I don't want a log file. (Right now this will log the file info for anything over 1GB, but if it's less than 1GB it still creates a log file with no data.) I don't want it to create a log for anything less than 1GB.
Any idea on how to do that?
Thanks!
Ryan
Get-ChildItem -Path C:\Tomcat6.0.20\logs -File -Recurse -ErrorAction SilentlyContinue |
Where-Object {$_.Length -gt 1GB} |
Sort-Object Length -Descending |
Select-Object Name,@{n='GB';e={"{0:N2}" -F ($_.Length / 1GB)}} |
Format-List Name,Directory,GB > C:\Users\jensen\Desktop\FolderSize\filesize.log

First, set a variable with the term/filter you're after and store the results
$items = Get-ChildItem -Path C:\Tomcat6.0.20\logs -File -Recurse -ErrorAction SilentlyContinue |
Where-Object {$_.Length -gt 1GB} |
Sort-Object Length -Descending |
Select-Object Name,@{n='GB';e={"{0:N2}" -F ($_.Length / 1GB)}}
Then pipe that to Out-File with your desired output path. This example will output a file to the Desktop of the user running the script; change as needed:
$items | Out-File -FilePath $ENV:USERPROFILE\Desktop\filesize.log -Force
The -Force parameter will overwrite an existing filesize.log file if one already exists.
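If you would rather keep the results of earlier runs instead of overwriting them, Out-File also has an -Append switch; the same line, just appending (same path assumed):
$items | Out-File -FilePath $ENV:USERPROFILE\Desktop\filesize.log -Append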

To make sure you don't write blank files, you should collect the minimal starting results that match your filter and test whether they contain anything at all.
If they don't, you can end the script; if they do, you can go on to sort and select the data and output it to a log file.
# Collect Matching Files
$Matched = GCI -Path "C:\Tomcat6.0.20\logs" -File -Recurse -ErrorA "SilentlyContinue" | ? {
$_.Length -gt 1GB
}
# Check if $Matched contains results before further processing, otherwise we're done!
IF ([bool]$Matched) {
# If here, we have Data so select what we need and output to the log file:
$Matched | Sort Length -Descending | FT Name,Directory,@{
n="GB";e={"{0:N2}" -F ($_.Length / 1GB)}
} -Auto | Out-File "C:\Users\jensen\Desktop\FolderSize\filesize_$(Get-Date -F "yyyy-MM-dd_HH-mm-ss").log"
}
In the above script, I fixed the $. to be $_., and separated matching the 1GB files from manipulating them and outputting them to a file.
We simply test whether any files were matched at 1GB by checking whether the variable has any results or is $null/undefined.
If it is empty, there is no need to take any further action.
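For reference, this is how the [bool] cast behaves on the values $Matched can end up holding (a quick illustration, not part of the script):
[bool]$null          # False - nothing matched, so the assignment left the variable empty
[bool]@()            # False - an empty collection
[bool](Get-Item .)   # True  - at least one object came back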
Only when 1GB files are matched do we sort them and pull out the details you wanted; instead of Select-Object we just use Format-Table (FT) with -AutoSize to get nice-looking output that is much easier to review for this sort of data.
(Note that Format-Table selects and formats the info into a table in one step, saving the unnecessary step of using Select-Object to get the data and then piping (|) it to Format-List or Format-Table, which just adds a redundant step. Select-Object is best used when you need to do further "object-oriented" operations with that data in later steps.)
Then we pipe that output to Out-File to save it all to a log file, and I also changed the log file name to contain the current date and time in ISO format, filesize_$(Get-Date -F "yyyy-MM-dd_HH-mm-ss").log, so you can save each run and review it later, rather than having one gigantic file or no history of runs.
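As a quick illustration of that Format-Table versus Select-Object trade-off, using the same $Matched variable (for demonstration only, not part of the script):
# Format-Table produces display text - fine when the log file is the final destination
$Matched | Format-Table Name,Length -AutoSize

# Select-Object keeps real objects you can continue to work with in later steps
$selected = $Matched | Select-Object Name,Length
$selected | Measure-Object -Property Length -Sum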

Related

Foreach/copy-item based on name contains

I'm trying to create a list of file name criteria (MS Hotfixes), then find each file in a directory whose name contains one of those criteria and copy it to another directory. I think I'm close here but missing something simple.
Here is my current attempt:
#Create a list of the current Hotfixes.
Get-HotFix | Select-Object HotFixID | Out-File "C:\Scripts\CurrentHotfixList.txt"
#
#Read the list into an Array (dropping the first 3 lines).
$HotfixList = Get-Content "C:\Scripts\CurrentHotfixList.txt" | Select-Object -Skip 3
#
#Use the Hotfix names and copy the individual hotfixes to a folder
ForEach ($Hotfix in $HotfixList) {
Copy-Item -Path "C:\KBtest\*" -Include *$hotfix* -Destination "C:\KBtarget"
}
If I do a Write-Host $Hotfix and comment out my Copy-Item line I get the list of hotfixes as expected.
If I run just the copy command and input the file name I am looking for - it works.
Copy-Item -Path "C:\KBtest\*" -Include *kb5016693* -Destination "C:\KBtarget"
But when I run my script it copies all the files in the folder and not just the one file I am looking for. I have several hotfixes in that KBtest folder but only one that is correct for testing.
What am I messing up here?
The simplest solution to your problem, taking advantage of the fact that -Include can accept an array of patterns:
# Construct an array of include patterns by enclosing each hotfix ID
# in *...*
$includePatterns = (Get-HotFix).HotfixID.ForEach({ "*$_*" })
# Pass all patterns to a single Copy-Item call.
Copy-Item -Path C:\KBtest\* -Include $includePatterns -Destination C:\KBtarget
As for what you tried:
To save just the hotfix IDs to a plain-text file, each on its own line, don't use Select-Object -Property HotfixID (-Property is implied if you omit it); use Select-Object -ExpandProperty HotfixID instead:
Get-HotFix | Select-Object -ExpandProperty HotFixID | Out-File "C:\Scripts\CurrentHotfixList.txt"
Or, more simply, using member-access enumeration:
(Get-HotFix).HotFixID > C:\Scripts\CurrentHotfixList.txt
Using Select-Object -ExpandProperty HotfixID or (...).HotfixID returns only the values of the .HotfixID property, whereas Select-Object -Property HotfixID, despite only asking for one property, returns an object that has a .HotfixID property. This is a common pitfall; see this answer for more information.
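A quick way to see the difference (the output comments are just illustrative):
Get-HotFix | Select-Object -First 1 -Property HotFixID       # e.g. @{HotFixID=KB5012170} - an object with one property
Get-HotFix | Select-Object -First 1 -ExpandProperty HotFixID # e.g. KB5012170 - just the string value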
Then you can use a Get-Content call alone to read the IDs (as strings) back into an array (no need for Select-Object -Skip 3):
$HotfixList = Get-Content "C:\Scripts\CurrentHotfixList.txt"
(Note that, as the solution at the top demonstrates, for use in the same script you don't need to save the IDs to a file in order to capture them.)
This will likely fix your primary problem, which stems from how Out-File creates for-display string representations of the objects ([pscustomobject] instances) that Select-Object -Property HotfixID created:
Not only is there an empty line followed by a table header at the start of the output (which your Select-Object -Skip 3 call skips), there are also two empty lines at the end.
When these empty lines were read into $hotfix in your foreach loop, -Include *$hotfix* effectively became -Include **, which therefore copied all files.
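If you want to see those extra lines for yourself, wrapping each line in brackets makes the blank ones visible (same path as in your script):
Get-Content "C:\Scripts\CurrentHotfixList.txt" | ForEach-Object { "[$_]" }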
First, you do not need to create and import those text files:
get-hotfix | ?{$_.hotfixid -notmatch 'KB5016594|KB5004567|KB5012170'} | %{
copy-item -path "C:\kbtest\$($_.HotfixId).exe" -Destination "C:\kbTarget"
}
This filters out the hotfixes you do not want; if you do not need that, remove:
?{$_.hotfixid -notmatch 'KB5016594|KB5004567|KB5012170'}
I assume that those files are exe files in my example.

Finding errors in several folders full of logfiles powershell script

I want to find "error" in log files. Every folder is named with a time and date; you can see below what that looks like. Inside each of these folders there are other folders named "mail1", "mail2" and so on. The log files are inside mail1, mail2, mail3 and so on. The path to one of the log files is c:\2019-05-24 00.00.09\mail1\mail.log
2019-05-24 00.00.09
2019-05-23 00.00.08
2019-05-22 00.00.05
2019-05-21 00.00.06
2019-05-20 00.00.09
My example below only shows how to find errors in one log file.
Get-Content C:\Users\123\Desktop\log\mail.log | Select-Object -first 10000 | Select-String ("Error") | Out-file C:\Users\1234\Desktop\leave\ouputerror.txt
Can someone please give me an easy example of how to find errors in several folders full of log files?
Let's say your 2019-05-24 00.00.09 folders are located in C:\LogFolder. Then you can use something like this:
Get-ChildItem C:\LogFolder -Recurse -Filter *.log | ForEach-Object { Get-content $_.Fullname | Select-Object -first 10000 | Select-String ("Error") } | Out-file C:\Users\1234\Desktop\leave\ouputerror.txt -Append
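If you also want to know which log file and line each error came from, Select-String can take the files directly and records that for you; a variant of the same idea (output path taken from the question, and without the 10000-line limit):
Get-ChildItem C:\LogFolder -Recurse -Filter *.log |
Select-String -Pattern "Error" |
Select-Object Path, LineNumber, Line |
Out-File C:\Users\1234\Desktop\leave\ouputerror.txt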

Increase speed of PowerShell Get-ChildItem large directory

I have a script that references a .csv document of filenames and then runs Get-ChildItem over a large directory to find each file and pull the 'owner'. Finally, the info is output to another .csv document. We use this to find who created files. Additionally, I have it create .txt files with the filename and a timestamp to see how fast the script is finding the data. The code is as follows:
Get-ChildItem -Path $serverPath -Filter $properFilename -Recurse -ErrorAction 'SilentlyContinue' |
Where-Object {$_.LastWriteTime -lt (get-date).AddDays(30) -and
$_.Extension -eq ".jpg"} |
Select-Object -Property @{
Name='Owner'
Expression={(Get-Acl -Path $_.FullName).Owner}
},'*' |
Export-Csv -Path "$desktopPath\Owner_Reports\Owners.csv" -NoTypeInformation -Append
$time = (get-date -f 'hhmm')
out-file "$desktopPath\Owner_Reports\${fileName}_$time.txt"
}
This script serves its purpose but is extremely slow due to the large size of the parent directory. Currently it takes 12 minutes per filename. We query approx 150 files at a time and this long wait time is hindering production.
Does anyone have better logic that could increase the speed? I assume that each time the script runs Get-ChildItem it recreates the index of the parent directory, but I am not sure. Is there a way we can create the index one time instead of for each filename?
I am open to any and all suggestions! If more data is required (such as the variable naming etc) I will provide upon request.
Thanks!
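For what it's worth, one way to act on the "index once" idea is to enumerate the directory a single time, keep that listing in memory, and match every CSV filename against it instead of re-scanning the share per file. A rough sketch only, where $fileNames is a hypothetical array holding the names read from the CSV and the other variable names are taken from the question:
# Walk the directory tree once and cache the candidate files in memory
$allFiles = Get-ChildItem -Path $serverPath -File -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(30) -and $_.Extension -eq '.jpg' }

# Look each filename up against the cached listing instead of running Get-ChildItem again
foreach ($properFilename in $fileNames) {
    $allFiles |
        Where-Object { $_.Name -like $properFilename } |
        Select-Object -Property @{Name='Owner';Expression={(Get-Acl -Path $_.FullName).Owner}},'*' |
        Export-Csv -Path "$desktopPath\Owner_Reports\Owners.csv" -NoTypeInformation -Append
}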

Combine TXT files in a directory to one file with a column added at the end for the file name

I have got a set of txt files in a directory that I want to merge together.
The contents of all the txt files are in the same format as follows:
IPAddress Description DNSDomain
--------- ----------- ---------
{192.168.1.2} Microsoft Hyper-V Network Adapter
{192.168.1.30} Microsoft Hyper-V Network Adapter #2
I have the below code that combines all the txt files into one txt file called all.txt.
copy *.txt all.txt
In this all.txt I can't see which lines came from which txt file. Any ideas on code that would add an extra column to the combined file with the name of the file each row came from?
As per the comments above, you've put the output of Format-Table into a text file. Note that Format-Table output might look visually structured on screen, but it is just lines of text. By doing that you have made it harder to work with the data.
If you just want a few properties from the results of the Get-WMIObject cmdlet, use Select-Object which (in the use given here) will effectively filter the data for just the properties you want.
Instead of writing text to a simple file, you can preserve the tabular nature of the data by writing to a structured file (i.e. CSV):
Get-WmiObject -Class Win32_NetworkAdapterConfiguration -Filter IPEnabled=TRUE -ComputerName SERVERNAMEHERE |
Select-Object PSComputerName, IPAddress, Description, DNSDomain |
Export-Csv 'C:\temp\server.csv'
Note that we were able to include the PScomputerName property in each line of data, effectively giving you the extra column of data you wanted.
So much for getting the raw data. One way you could read in all the CSV data and write it out again might look like this:
Get-ChildItem *.csv -Exclude all.csv |
Foreach-Object {Import-Csv $_} |
Export-Csv all.csv
Note that we exclude the output file in the initial cmdlet to avoid reading from and writing to the same file endlessly.
If you don't have the luxury of collecting the data again, you'll need to spool the files together. Spooling files together is done with Get-Content, something like this:
Get-ChildItem *.txt -Exclude all.txt |
Foreach-Object {Get-Content $_ -Raw} |
Out-File all.txt
In your case, you wanted to suffix each line, which is trickier as you need to process the files line by line:
$files = Get-ChildItem *.txt
foreach($file in $files) {
$lines = Get-Content $file
foreach($line in $lines) {
"$line $($file.Name)" | Out-File all.txt -Append
}
}
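If appending line by line with Out-File -Append turns out to be slow on bigger files, the same idea can be written with one append per file; a small variation (note the -Exclude so a re-run doesn't read the output file back in):
$files = Get-ChildItem *.txt -Exclude all.txt
foreach($file in $files) {
    Get-Content $file |
        ForEach-Object { "$_ $($file.Name)" } |
        Add-Content all.txt
}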

Delete duplicate files with Powershell except the file specified

I am using the following code to delete duplicate files in one folder:
ls *.wav -recurse | get-filehash | group -property hash | where { $_.count -gt 1 } | % { $_.group | select -skip 1 } | del
I have two issues. I want to limit this to only one filehash at a time and I need to specify the file name I want to keep.
As an example, I have a folder named Recordings. The first five files listed all have the same filehash but only one has the filename that has already been entered in my database.
It would be great if I could use the -Exclude parameter for the del cmdlet but that parameter does not accept pipeline input.
I also considered using the code above and then renaming the remaining file afterward but the code is not limited to one filehash.
It all depends on how you want it to work. For example, if you know the file name you want to keep in advance, you could do it this way:
$fileName = 'file1.txt'
$fileHash = Get-FileHash .\$filename
$duplicates = ls -Recurse | Get-FileHash | Where-Object {$_.Hash -eq $fileHash.Hash -and ($_.Path | Split-Path -Leaf) -ne $fileName }
$duplicates | del
This sequence sets the filename, gets the hash of that file, and then the main command checks for other files with that same hash that don't have the same filename.
Note: Test first to make sure this will do what you expect before you execute the del command.
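One easy way to do that test run is Remove-Item's built-in -WhatIf switch, which reports what would be deleted without actually removing anything:
$duplicates | Remove-Item -WhatIf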
Update: It appears that Get-FileHash puts some sort of lock on the files being hashed, so you cannot immediately pipe to the del (Remove-Item) command. I modified the results to save the array of duplicates to a variable and then pass that to the delete command, which works.