I'm trying to create a script that will find the most recent build_info files from multiple install locations in a server's directory, select the "version: " text from each file, and compare them to see if they're all the same (which is what we hope for), or if certain install locations have different versions. As a bonus, it would also be nice to have each path's install version have its own variable so that if I have to output any differences, I can say which specific paths have which versions. For example, if something is installed in Path1, Path2, and Path3, I want to be able to say, "all paths are on version 3.5," or "Path1 is version 1.2, Path2 is version 3.5, Path3 is version 4.8."
Here's a neater list of what I'm trying to do:
Loop through folders in a directory.
For each folder, sort the txt files with a specific name in that path by Creation Date descending and select the most recent.
Once it has the most recent files from each path, Select-String a specific phrase from each of them. Specifically, "version: ".
Compare the version from each path and see if all are the same or there are differences, then output the result.
This is what I've been able to write so far:
$Directory = dir D:\Directory\Path* | ?{$_.PSIsContainer};
$Version = @();
foreach ($d in $Directory) {
    $Version = (Select-String -Path D:\Directory\Path*\build_info_v12.txt -Pattern "Version: " | Select-Object -ExpandProperty Line) -replace "Version: ";
}
if (@($Version | Select -Unique).Count -eq 1) {
    Write-Host 'The middle tiers are all on version' ($Version | Select -Unique);
}
else {
    Write-Host 'One or more middle tiers has a different version.';
}
I've had to hard code in the most recent build_info files because I'm not sure how to incorporate the sorting aspect into this. I'm also not sure how to effectively assign each path's result to a variable and output them if there are differences. This is what I've been messing around with as far as the sorting aspect, but I don't know how to incorporate it and I'm not even sure if it's the right way to approach this:
$Recent = Get-ChildItem -Path D:\Directory\Path*\build_info*.txt | Sort-Object CreationTime -Descending | Select-Object -Index 0;
You can use Sort-Object and Select-Object to determine the most recent file. Here is a function that you can give a collection of files to and it will return the most recent one:
function Get-MostRecentFile {
    param(
        $fileList
    )

    $mostRecent = $fileList | Sort-Object LastWriteTime | Select-Object -Last 1
    $mostRecent
}
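For example (a hedged sketch reusing the D:\Directory\Path* layout and build_info file names from the question), it could be called once per install folder:

# Hypothetical usage: report the newest build_info file in each install folder.
$directories = Get-ChildItem 'D:\Directory\Path*' | Where-Object { $_.PSIsContainer }

foreach ($dir in $directories) {
    $buildInfoFiles = Get-ChildItem (Join-Path $dir.FullName 'build_info*.txt')
    $newest = Get-MostRecentFile -fileList $buildInfoFiles
    Write-Host "$($dir.Name): $($newest.Name)"
}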
Here is one possible solution:
Get-ChildItem "D:\Directory\Path" -Include "build_info*.txt" -File -Recurse |
Group-Object -Property DirectoryName |
ForEach-Object {
$_.Group |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1 |
ForEach-Object {
New-Object -TypeName PsCustomObject |
Add-Member -MemberType NoteProperty -Name Directory -Value $_.DirectoryName -PassThru |
Add-Member -MemberType NoteProperty -Name FileName -Value $_.Name -PassThru |
Add-Member -MemberType NoteProperty -Name MaxVersion -Value ((Select-String -Path $_.FullName -Pattern "Version: ").Line.Replace("Version: ","")) -PassThru
}
}
This will produce a collection of objects, one for each directory in the tree, with properties for the directory name, most recent version and the file we found the version number in. You can pipe these to further cmdlets for filtering, etc.
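For example (a rough sketch, assuming the output of the pipeline above has been captured in a variable named $results), the versions can then be compared the same way as in the question:

# Sketch: compare the MaxVersion values collected per directory.
$unique = @($results | Select-Object -ExpandProperty MaxVersion -Unique)

if ($unique.Count -eq 1) {
    Write-Host "All paths are on version $($unique[0])"
}
else {
    foreach ($r in $results) {
        Write-Host "$($r.Directory) is on version $($r.MaxVersion)"
    }
}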
Related
I'm not well-versed with PowerShell and I have been trying to sort the output for the code below:
function Get-DirSize ($path) {
    BEGIN {}
    PROCESS {
        $colItems = Get-ChildItem $path | Where-Object {$_.PSIsContainer -eq $true} | Sort-Object
        foreach ($folder in $colItems)
        {
            $object = New-Object -TypeName PSObject
            $subFolderItems = Get-ChildItem $folder.FullName -recurse -force -ErrorAction SilentlyContinue | Where-Object {$_.PSIsContainer -eq $false} | Measure-Object -property Length -sum | Select-Object Sum
            $sizeGB="{0:N4}" -f ($subFolderItems.sum/1GB)
            $object | Add-Member -MemberType NoteProperty -Name "Folder" -Value $folder.FullName
            $object | Add-Member -MemberType NoteProperty -Name "Size(GB)" -Value $sizeGB
            $object
        }
    }
    END {}
}
Get-DirSize -path 'C:\' |
Sort-Object 'Size(GB)' -Descending
I am able to get a sorted output on my desktop, but somehow the same code does not give me a sorted output on my laptop.
[screenshot of the laptop output]
Does anyone know why this may be happening, or is there something I should change in the code itself?
Thank you.
This line:
$sizeGB="{0:N4}" -f ($subFolderItems.sum/1GB)
converts ($subFolderItems.sum/1GB) into a string and stores it in a variable named "sizeGB".
Then on this line:
Sort-Object 'Size(GB)' -Descending
You are sorting that string in descending order, so you are performing a lexical sort (a.k.a. alphabetical order) on strings of numbers. If you look at your results you will see they are sorted alphabetically in descending order (e.g. the string starting with '6' comes before the string starting with '5', which comes before the string starting with '4', etc.).
So don't prematurely convert your numbers to strings for the purposes of formatting. Formatting objects returned from your function is antithetical to the idea of PowerShell anyway. You want to deal with objects all the way through, from start to finish. The final consumer should be the one that decides how the output should be formatted.
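For example (just a sketch of one possible fix), the function could emit the size as a number and leave the "{0:N4}" formatting to the final consumer:

# Inside the foreach, keep the size numeric so Sort-Object compares numbers, not strings:
$sizeGB = [math]::Round($subFolderItems.Sum / 1GB, 4)
$object | Add-Member -MemberType NoteProperty -Name "Size(GB)" -Value $sizeGB

# At the call site, sort numerically and format only for display:
Get-DirSize -path 'C:\' |
    Sort-Object 'Size(GB)' -Descending |
    Format-Table Folder, @{ n = 'Size(GB)'; e = { '{0:N4}' -f $_.'Size(GB)' } } -AutoSize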
I have a series of files in a directory with the following format:
file_ddMMyyyyhhttss.csv
eg:
myfile_151220171038.csv
myfile_301120171445.csv
myfile_121020161114.csv
I know how to select the latest by LastWriteTime:
gci "$pathtofile" | Sort LastWriteTime | Select -Last 1
but I'm unsure how to split the "datestamp" out of the file name and then sort by year, month, and day in order to determine the latest file. Any suggestions?
Well, the specific format in your file names prevents any useful sorting without parsing it, but you can do just that:
Get-ChildItem $pathtofile |
    ForEach-Object {
        # isolate the timestamp
        $time = $_ -replace '.*(\d{12}).*','$1'
        # parse
        $timestamp = [DateTime]::ParseExact($time, 'ddMMyyyyHHmm', $null)
        # add it to the objects so we can sort on it
        $_ | Add-Member -PassThru NoteProperty Timestamp $timestamp
    } |
    Sort-Object Timestamp
Adjust to fit your exact date/time format, because the one you specified in your question does not match the one on your files.
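To then keep only the newest file, the sorted output can be capped with Select-Object, mirroring the Select -Last 1 from the question (a sketch):

# Sketch: same idea as above, reduced to the single newest file.
$newest = Get-ChildItem $pathtofile |
    ForEach-Object {
        $timestamp = [DateTime]::ParseExact(($_.BaseName -replace '.*(\d{12}).*', '$1'), 'ddMMyyyyHHmm', $null)
        $_ | Add-Member -PassThru NoteProperty Timestamp $timestamp
    } |
    Sort-Object Timestamp |
    Select-Object -Last 1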
My proposition ;)
Take only files (not directories), only those whose name matches the expected format, and only output a file when the embedded date actually parses as a valid date.
[System.DateTime]$parsedDate = Get-Date
Get-ChildItem "c:\temp\" -File -Filter "*.csv" | Where-Object BaseName -match ".*(\d{12}).*" | ForEach-Object {
    # the last 12 characters of the base name are the timestamp
    $DtString = $_.BaseName.Substring($_.BaseName.Length - 12)
    # HH (24-hour clock) is needed so times such as 1445 parse correctly
    if ([DateTime]::TryParseExact($DtString, "ddMMyyyyHHmm", $null, [System.Globalization.DateTimeStyles]::None, [ref]$parsedDate))
    {
        $_ | Add-Member -Name TimeInName -Value $parsedDate -MemberType NoteProperty -PassThru
    }
} | Sort-Object TimeInName -Descending | Select-Object -First 1
The Sort-Object cmdlet can work not only with regular but also with calculated properties. That allows you to sort the files without passing them through a ForEach-Object loop first. Like this:
$pattern = '.*_(\d{14}).*'
$datefmt = 'ddMMyyyyHHmmss'
$culture = [Globalization.CultureInfo]::InvariantCulture

Get-ChildItem $pathtofile | Sort-Object @{n='Timestamp'; e={
    $datestr = $_.Basename -replace $pattern, '$1'
    [DateTime]::ParseExact($datestr, $datefmt, $culture)
}} | Select-Object -Last 1
I usually recommend using the InvariantCulture constant rather than $null as the third argument for ParseExact(), because using $null gave me errors in some cases.
If your source directory contains files that don't match your filename pattern you may want to exclude them with a Where-Object filter before sorting the rest:
Get-ChildItem $pathtofile | Where-Object {
    $_.Basename -match $pattern
} | Sort-Object @{n='Timestamp'; e={
    $datestr = $_.Basename -replace $pattern, '$1'
    [DateTime]::ParseExact($datestr, $datefmt, $culture)
}} | Select-Object -Last 1
I'm trying to list the PDF files from a directory in a CSV file.
I'm using the PowerShell script below to add the PDFs, but I also need to split the numbers from each filename into separate columns (while leaving the original PDF path on each row as is).
Is there any way to accomplish this?
Get-ChildItem -Recurse "C:\Users\alon\Desktop\MYpdf\" |
ForEach-Object {
$_ | Add-Member -Name "Owner" -MemberType NoteProperty -Value (Get-Acl $_.FullName).Owner -PassThru
} |
Sort-Object FullName |
Select FullName, CreationTime, LastWriteTime, Length, Owner |
Export-Csv -Force -NoTypeInformation "C:\Users\alon\Desktop\MYpdf\directory.csv"
The output should look like this: [expected output screenshot omitted]
You mean you want to add the numbers from the filename as additional fields in your output? I would split each file's basename and construct custom objects from that information and the file metadata:
Get-ChildItem ... |
    ForEach-Object {
        $numbers = $_.BaseName -split '_'
        [PSCustomObject]@{
            FullName      = $_.FullName
            Company       = [int]$numbers[0]
            Invoice       = [int]$numbers[1]
            Deal          = [int]$numbers[2]
            Customer      = [int]$numbers[3]
            Autonumber    = [int]$numbers[4]
            CreationTime  = $_.CreationTime
            LastWriteTime = $_.LastWriteTime
            Length        = $_.Length
            Owner         = (Get-Acl $_.FullName).Owner
        }
    } |
    Sort-Object FullName |
    Export-Csv ...
Using calculated properties would also work, but in this case constructing new objects is arguably the simpler approach.
Note that the [PSCustomObject] type accelerator requires PowerShell v3 or newer. On older versions you can create the objects via New-Object and then use Select-Object to get the properties in a defined order.
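A rough v2-compatible sketch (same property names as above; hashtable key order is not guaranteed, so Select-Object fixes the column order before exporting):

Get-ChildItem ... |
    ForEach-Object {
        $numbers = $_.BaseName -split '_'
        New-Object PSObject -Property @{
            FullName      = $_.FullName
            Company       = [int]$numbers[0]
            Invoice       = [int]$numbers[1]
            Deal          = [int]$numbers[2]
            Customer      = [int]$numbers[3]
            Autonumber    = [int]$numbers[4]
            CreationTime  = $_.CreationTime
            LastWriteTime = $_.LastWriteTime
            Length        = $_.Length
            Owner         = (Get-Acl $_.FullName).Owner
        }
    } |
    Select-Object FullName, Company, Invoice, Deal, Customer, Autonumber, CreationTime, LastWriteTime, Length, Owner |
    Sort-Object FullName |
    Export-Csv ...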
I have multiple folders across a number of SQL Servers that contain hundreds/thousands of databases. Each database comprises of three elements:
<dbname>.MDF
<dbname>.LDF
<dbname>files (Folder that contains db files/attachments)
I need to marry these files together and add up their total size. Does anyone have any advice on how to do this?
EDIT: Just to clarify, I'm currently able to output the file sizes of the MDF/LDF files, and I have a separate script that summarises the folder sizes. I need a method of adding together the .MDF/.LDF files and the DBFiles folder when their names match, bearing in mind all of the files are prefixed with the database name.
EDIT #2: The two options given so far sum the .mdf/.ldf files with no problem, but do not add the folder size of the DBFiles folder. Does anyone have any input on how to amend these scripts to include a folder beginning with the same name?
First provided script:
$root = 'C:\db\folder'
Get-ChildItem "$root\*.mdf" | Select-Object -Expand BaseName |
    ForEach-Object {
        New-Object -Type PSObject -Property @{
            Database = $_
            Size     = Get-ChildItem "$root\$_*" -Recurse |
                Measure-Object Length -Sum |
                Select-Object -Expand Sum
        }
    }
Second provided script:
gci "c:\temp" -file -Include "*.mdf", "*.ldf" -Recurse |
group BaseName, DirectoryName |
%{new-object psobject -Property #{FilesAndPath=$_.Name; Size=($_.Group | gci | Measure-Object Length -Sum).Sum } }
EDIT #3:
Thanks to Ansgar (below), the updated solution has done the trick perfectly. Updating question with final solution:
$root = 'C:\db\folder'
Get-ChildItem "$root\*.mdf" | Select-Object -Expand BaseName |
    ForEach-Object {
        New-Object -Type PSObject -Property @{
            Database = $_
            Size     = Get-ChildItem "$root\$_*\*" -Recurse |
                Measure-Object Length -Sum |
                Select-Object -Expand Sum
        }
    }
Enumerate just the .mdf files from your database folder, then enumerate the files and folders for each basename.
$root = 'C:\db\folder'
Get-ChildItem "$root\*.mdf" | Select-Object -Expand BaseName |
    ForEach-Object {
        New-Object -Type PSObject -Property @{
            Database = $_
            Size     = Get-ChildItem "$root\$_*\*" -Recurse |
                Measure-Object Length -Sum |
                Select-Object -Expand Sum
        }
    }
If you want the sum of the database file sizes grouped by directory and file name (without extension), try this:
gci "c:\temp" -file -Include "*.mdf", "*.ldf" -Recurse |
group BaseName, DirectoryName |
%{new-object psobject -Property #{FilesAndPath=$_.Name; Size=($_.Group | gci | Measure-Object Length -Sum).Sum } }
Modify the Get-ChildItem -Include filter a little if necessary.
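For the DBFiles folder from EDIT #2, an untested sketch along the same lines might explicitly add the contents of the matching "<dbname>files" folder (folder name taken from the question) next to each .mdf:

# Sketch: sum <dbname>.mdf, <dbname>.ldf and everything inside <dbname>files.
Get-ChildItem "c:\temp" -Filter "*.mdf" -File -Recurse | ForEach-Object {
    $dir  = $_.DirectoryName
    $name = $_.BaseName

    # loose database files: <dbname>.mdf, <dbname>.ldf, ...
    $files = @(Get-ChildItem "$dir\$name.*" -File)

    # attachments folder, if present
    $attachDir = Join-Path $dir ($name + 'files')
    if (Test-Path $attachDir) {
        $files += Get-ChildItem $attachDir -Recurse -File
    }

    New-Object psobject -Property @{
        Database = Join-Path $dir $name
        Size     = ($files | Measure-Object Length -Sum).Sum
    }
}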
How can I list size of each folder in a directory by the sum of all files in each folder/subfolders?
My latest attempt:
ls | foreach-object { select-object Name, @{Name = "Size"; Expression = { ls $_ -recurse | measure-object -property length -sum } }
I've made other attempts but nothing successful yet. Any suggestions or solutions are very welcome. I feel like I'm missing something very obvious.
The output should look as follows:
Name Size
And it should list each folder in the root folder and the size of the folder counting subfolders of that folder.
I was able to resolve the issue with the following:
param([String]$path)

ls $path | Add-Member -Force -Passthru -Type ScriptProperty -Name Size -Value {
        ls $path\$this -recurse | Measure -Sum Length | Select-Object -Expand Sum } |
    Select-Object Name, @{Name = "Size(MB)"; Expression = {"{0:N0}" -f ($_.Size / 1Mb)} } |
    sort "Size(MB)" -descending
I think you've basically got it, honestly.
You could be a bit more elegant by using Add-Member:
ls | Add-Member -Force -Passthru -Type ScriptProperty -Name Length -Value {
ls $this -recurse | Measure -Sum Length | Select -Expand Sum }
PSCX messes with the formatting and will output "" for the size even though you've actually got a size. If you're using PSCX you'll have to add an explicit | Format-Table Mode, LastWriteTime, Length, Name -Auto
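Put together, that would be roughly (only needed if PSCX is loaded):

# Same Add-Member approach as above, with an explicit Format-Table for PSCX users.
ls | Add-Member -Force -Passthru -Type ScriptProperty -Name Length -Value {
        ls $this -recurse | Measure -Sum Length | Select -Expand Sum } |
    Format-Table Mode, LastWriteTime, Length, Name -Auto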
It's not particularly elegant but should get the job done:
gci . -force | ?{$_.PSIsContainer} |
    %{
        $res = @{}
        $res.Name = $_.Name
        $res.Size = (gci $_ -r | ?{!$_.PSIsContainer} | measure Length -sum).Sum
        new-object psobject -prop $res
    }
Note the use of -Force to make sure you're summing up hidden files. Also note the aliases I have used (convenient when typing interactively). There's ? for Where-Object and % for Foreach-Object. Saves the wrists. :-)
Here is a handy PowerShell example script that may be adapted to fit what you are looking for.