Counting empty elements in a foreach loop over XML - PowerShell

So I have an XML file which I read and iterate through with a foreach loop.
So far so good.
Now I need to check whether an object is empty so I can prompt the user to enter the info.
Right now I have to check every object by name to see whether it is empty.
foreach($var in $xml){
    if(!$var.Object1){$count++}
    if(!$var.Object2){$count++}
    do{
        # more stuff depending on the count value
    }
    while($i -ne $count)
}
As you can see, this will become a long list fast, depending on how big your XML file is.
I don't want to write out every object name.
I am wondering if I can just get the number of objects and then loop through them, checking whether each one is empty,
somewhat like this:
foreach($var in $xml){
    $c = $var.count
    for($i = 1; $i -le $c; $i++){
        if(!$var.object$i){$count++}
    }
    do{
        # more stuff depending on the count value
    }
    while($i -ne $count)
}
I could save a lot of lines, but I can't figure out how to do this, or if it is even possible...
At the moment I refuse to believe that you HAVE to ask every single object whether it is empty.
I dumbed this script down to a few lines. The actual script is a bit larger (the do part has around 25 lines), but I wanted to keep it as small as possible.
If necessary, I can post my whole script.
Thanks in advance, and regards

For this, you're going to want to first load your XML into an XML object:
$path = "$env:temp\myFile.xml"
$xml = New-Object -TypeName XML
$xml.Load($Path)
From here you can traverse the nodes and select whatever information you need from them, e.g.:
$xml.Devices.Servers | Select-Object -Property Name, SerialNumber, IP
What you'd specifically be looking for in your case, though, is any object with a $null or empty value. So you would do something like:
$xml.Devices.Servers | Where-Object {[string]::IsNullOrEmpty($_.Name)}
The IsNullOrEmpty() check returns $true or $false for each object, so Where-Object only passes through the entries whose Name is empty. Then we do our conditionals from there.
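If the goal is to avoid naming every property, one option is to walk each record's child elements instead. A minimal sketch, assuming each record is an element whose fields are child elements (the Root/Record node names here are placeholders for your actual structure):
foreach ($var in $xml.Root.Record) {
    # collect the child elements whose text is missing
    $empty = @($var.ChildNodes | Where-Object { [string]::IsNullOrEmpty($_.InnerText) })
    $count = $empty.Count
    # prompt the user once per missing field, e.g. by looping over $empty.Name here
}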

Related

Delete last used line in txt file

First time poster here, but you've helped me a lot before. I don't know how to ask Google this question.
I have a PowerShell script where, with a foreach command, I check every computer in a .txt file that contains computer names. (Short explanation: I check BitLocker status, connection availability, etc.) Everything works fine, but since I fell in love with PowerShell recently and try to automate more and more things, I thought I should upgrade this script a little more.
I have foreach ($DestinationComputer in $DestinationComputers), and after I check everything I wanted, I want to delete that row in the .txt file.
Can someone help? I am still learning this and got stuck.
Continuing from my comment, I suggest creating a list of computer names that did not process correctly, while discarding the ones that succeeded.
By doing so, you will effectively remove the successfully processed items from the text file.
Something like this:
$DestinationComputers = Get-Content -Path 'X:\somewhere\computers.txt'
# create a list variable to store computer names in
$list = [System.Collections.Generic.List[string]]::new()
# loop over the computer names
foreach ($DestinationComputer in $DestinationComputers) {
    # first check: is the machine available?
    $success = Test-Connection -ComputerName $DestinationComputer -Count 1 -Quiet
    if ($success) {
        # do whatever you need to do with that $DestinationComputer
        # if anything there fails, set variable $success to $false
        <YOUR CODE HERE>
    }
    # test if we processed the computer successfully and if not,
    # add the computer name to the list. If all went OK, we do not
    # add it to the list, thus removing it from the input text file
    if (-not $success) {
        $list.Add($DestinationComputer)
    }
}
# now, write out the computer names we collected in $list
# computer names that were processed OK will not be in there anymore.
# I'm using a new filename so we don't overwrite the original, but if that is
# what you want, you can set the same filename as the original here.
$list | Set-Content -Path 'X:\somewhere\computers_2.txt'

Suppress Array List Add method pipeline output

I am using an ArrayList to build a sequence of log items to log later. It works a treat, but the Add method emits the current index to the pipeline. I can address this by sending it to $null, like this:
$strings.Add('junk') > $null
but I wonder if there is some mechanism to globally change the behavior of the Add method. Right now I have literally hundreds of > $null repetitions, which is just ugly, especially when I forget one.
I really would like to see some sort of global variable that suppresses all automatic output to the pipeline. When writing a large script, I only want to send things to the pipeline intentionally; unexpected automatic output to the pipeline is a VERY large fraction of my total bugs, and the hardest to find.
You could wrap your ArrayList in a custom object with a custom Add() method.
$log = New-Object -Type PSObject -Property @{
    Log = New-Object Collections.ArrayList
}
$log | Add-Member -Type ScriptMethod -Name Add -Value {
    Param(
        [Parameter(Mandatory=$true)]
        [string]$Message
    )
    $this.Log.Add($Message) | Out-Null
}
$log.Add('some message') # output on this is suppressed
So, this thread getting resurrected has led me to "answer" my own question, since I discovered long ago that ArrayLists are effectively deprecated and Collections.Generic.List<T> is the preferred solution, as pointed out by @santiago-squarzon today.
So, for anyone wondering
$log = [System.Collections.Generic.List[String]]::new()
or the older New-Object way
$log = New-Object System.Collections.Generic.List[String]
to instantiate the collection, then happily
$log.Add('Message')
with no pipeline pollution to worry about. You can also add multiple items at once with
$log.AddRange()
with the range being another list of strings, or an array (cast to [string[]] if needed).
And you can insert a message with something like
$log.Insert(0, 'Message')
So yeah, lots of flexibility and no pollution. Winning.
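For completeness, a quick demonstration of those calls together (nothing here writes to the pipeline except the final line that dumps the list):
$log = [System.Collections.Generic.List[string]]::new()
$log.Add('first message')                       # no index emitted
$log.AddRange([string[]]('second', 'third'))    # several items at once
$log.Insert(0, 'goes to the front')
$log                                            # -> goes to the front, first message, second, third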

Pass parameter from one Powershell function to another

I'm new to PowerShell and development in general. I'm trying to write a script that will email a contact once a file exceeds a certain size. I have two individual functions, both working separately (one to check the file size and one to generate a file for sendmail to use), but I can't get them to interact.
I want to execute the function CheckSize, and if the variable $ExceedsSize gets set to 1, call the function SendMail; otherwise the script should finish with no other action.
I’ve searched through the forums but couldn’t find anything to apply to what I’m doing.
##Check file to see if it is over a particular size and then send email once threshold is reached.
param(
    [string]$SiteName = "TestSite", #Name of Site (No Spaces)
    [string]$Path = "\\NetworkPath\Directory", #Path of directory to check
    [int]$FileSizeThreshold = 10, #Size in MB that will trigger a notification email
    [string]$Contacts = "MyEmail@email.com"
)
CLS
##Creates variable $ExceedsSize based on newest file in folder.
Function CheckSize {
    IF ((GCI $Path -Filter *.txt | Sort LastWriteTime -Descending | Select-Object -First 1 | Measure-Object -Property Length -Sum).Sum / 1000000 -gt $FileSizeThreshold) {$ExceedsSize = 1}
    ELSE {$ExceedsSize = 0}
    Write-Host $ExceedsSize
}
Function SendMail {
    Param([string]$Template, [string]$Contacts, [string]$WarnTime)
    $EmailLocation = "\\NetworkPath\Scripts\File_$SiteName.txt"
    #Will Generate email from params
    New-Item $EmailLocation -type file -force -value "From: JMSIssue@emails.com`r
To: $Contacts`r
Subject: $SiteName file has exceeded the maximum file size threshold of $FileSizeThreshold MB`r`n"
    #Send Email
    #CMD /C "$SendMail\sendmail.exe -t < $EmailLocation"
}
Add this before or after your Write-Host $ExceedsSize:
return $ExceedsSize
Add this to the bottom:
$var = CheckSize
if ($var -eq 1){
    SendMail
}
Explanation
You have two functions, but don't actually run them. The part at the bottom does that.
Your CheckSize function does not return $ExceedsSize to the rest of the script; by default the variable remains within the scope of the function. return $ExceedsSize means the value is passed back to the main script, and $var = means it is assigned to that variable.
Per the other answer, you need to return $ExceedsSize instead of using Write-Host (see here for why Write-Host is considered harmful: http://www.jsnover.com/blog/2013/12/07/write-host-considered-harmful/).
You could alternatively call the SendMail function from within the CheckSize function, e.g:
if ($ExceedsSize -eq 1){SendMail}
You will still need to call the CheckSize function somewhere also:
CheckSize
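Putting both pieces together, a minimal sketch of how the flow ends up (function bodies trimmed to the relevant lines, using the parameters already defined in your script):
Function CheckSize {
    # size of the newest .txt file in $Path, in bytes
    $newestSize = (GCI $Path -Filter *.txt | Sort LastWriteTime -Descending |
        Select-Object -First 1 | Measure-Object -Property Length -Sum).Sum
    if (($newestSize / 1000000) -gt $FileSizeThreshold) { return 1 }
    return 0
}

if ((CheckSize) -eq 1) {
    SendMail
}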
You might also want to give consideration to naming your functions in the verb-noun style of the built-in cmdlets. This really helps make their use more explicit to you and others. When choosing a verb, it's best to stick to the approved list: https://msdn.microsoft.com/en-us/library/ms714428(v=vs.85).aspx
And also to use names that are fairly unique to avoid possible conflicts.
I'd suggest something along the lines of:
Get-NewestFileSize
(although that's what it should then return)
and
Send-CCSMail

Create variable from CSV

I want to make variables from a particular column in a CSV.
CSV will have the following headers:
FolderName,FolderManager,RoleGroup,ManagerEmail
Under FolderName will be a list of rows with the respective folder names, such as Accounts, HR, Projects, etc. (each of these names is a separate row in the FolderName column).
So I would like to create a list of variables to call on at a later stage. They would be something like the following:
$Accounts,
$HR,
$Projects,
I have done a few different scripts based on searching here and Google, but have been unable to produce the desired results. I am hoping someone can lead me in the right direction to create this script.
Versions of this question ("dynamic variables" or "variable variables" or "create variables at runtime") come up a lot, and in almost all cases they are not the right answer.
This is often asked by people who don't know a better way to approach their problem, but there is a better way: collections. Arrays, lists, hashtables, etc.
Here's the problem: You want to read a username and print it out. You can't write Hello Alice because you don't know what their name is to put in your code. That's why variables exist:
$name = Read-Host "Enter your name"
Write-Host "Hello $name"
Great, you can write $name in your source code, something which never changes. And it references their name, which does change. But that's OK.
But you're stuck - how can you have two people's names, if all you have is $name? How can you make many variables like $name2, $name3? How can you make $Alice, $Bob?
And you can...
New-Variable -Name (Read-Host "Enter your name") -Value (Read-Host "Enter your name again")
Write-Host "Hello
wait
What do you put there to write their name? You're straight back to the original problem that variables were meant to solve. You had a fixed thing to put in your source code, which allowed you to work with a changing value.
and now you have a varying thing that you can't use in your source code because you don't know what it is again.
It's worthless.
And the fix is that one variable with a fixed name can reference multiple values in a collection.
Arrays (Get-Help about_Arrays):
$names = @()
do {
    $name = Read-Host "Enter your name"
    if ($name -ne '')
    {
        $names += $name
    }
} while ($name -ne '')
# $names is now a list, as many items long as it needs to be. And you still
# work with it by one name.
foreach ($name in $names)
{
    Write-Host "Hello $name"
}
# or
$names.Count
or
$names | foreach { $_ }
And more collections, like
Hashtables (Get-Help about_Hash_Tables): key -> value pairs. Let's pair each file in a folder with its size:
$FileSizes = @{} # empty hashtable (aka Dictionary)
Get-ChildItem *.txt | ForEach {
    $FileSizes[$_.BaseName] = $_.Length
}
# It doesn't matter how many files there are, the code is just one block.
# $FileSizes now looks like:
# @{
#     'readme'      = 1024;
#     'test'        = 20;
#     'WarAndPeace' = 1048576;
# }
# You can list them with
$FileSizes.Keys
and
foreach ($file in $FileSizes.Keys)
{
    $size = $FileSizes[$file]
    Write-Host "$file has size $size"
}
No need for a dynamic variable for each file, or each filename. One fixed name, a variable which works for any number of values. All you need to do is "add however many there are" and "process however many there are" without explicitly caring how many there are.
And you never need to ask "now I've created variable names for all my things ... how do I find them?" because you find these values in the collection you put them in. By listing all of them, by searching from the start until you find one, by filtering them, by using -match and -in and -contains.
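Applied to the CSV in this question, a minimal sketch (the file path is hypothetical; headers as described in the question):
$folders = @{}
Import-Csv -Path 'C:\temp\folders.csv' | ForEach-Object {
    $folders[$_.FolderName] = $_    # whole row object, keyed by folder name
}

# later, instead of needing $Accounts, $HR, $Projects ...
$folders['Accounts'].ManagerEmail
$folders.Keys                       # all the folder names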
And yes, New-Variable and Get-Variable have their uses, and if you know about collections and want to use them, maybe you do have a use for them.
But I submit that a lot of people on StackOverflow ask this question solely because they don't yet know about collections.
Dynamic variables in Powershell
Incrementing a Dynamic Variable in Powershell
Dynamic variable and value assignment in powershell
Dynamically use variable in PowerShell
How to create and populate an array in Powershell based on a dynamic variable?
And many more, in Python too:
https://stackoverflow.com/a/5036775/478656
How can you dynamically create variables via a while loop?
Basically, you want to create folders based on the values you are getting from the CSV file (the file has headers such as FolderName, FolderManager, RoleGroup, ManagerEmail).
$File = Import-Csv "FileName"
$Path = "C:\Sample"
foreach ($item in $File){
    $FolderName = $item.FolderName
    $NewPath = $Path + "\$FolderName"
    if(!(Test-Path $NewPath))
    {
        New-Item $NewPath -ItemType Directory
    }
}
Hope this helps.
In PowerShell, you can import a CSV file and get back custom objects. The code snippet below shows how to import a CSV to generate objects from it, and then dot-reference the properties on each object in a pipeline to create the new variables (your specific use case here).
PS>cat .\dummy.csv
"foldername","FolderManager","RoleGroup"
"Accounts","UserA","ManagerA"
"HR","UserB","ManagerB"
PS>$objectsFromCSV = Import-CSV -Path .\dummy.csv
PS>$objectsFromCSV | Foreach-Object -Process {New-Variable -Name $PSItem.FolderName }
PS>Get-Variable -name Accounts
Name Value
---- -----
Accounts
PS>Get-Variable -name HR
Name Value
---- -----
HR
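Note that as written the variables are created without a value, which is why the Value column above is blank. If each variable should also carry the rest of its row, you could pass the object itself as the value, e.g. (a sketch):
PS>$objectsFromCSV | Foreach-Object -Process {New-Variable -Name $PSItem.FolderName -Value $PSItem}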

Aligning corrupted data records in a text file using PowerShell

My data file (.txt) has records of 31 fields/columns each, and the fields are pipe-delimited. Somehow, a few records are corrupted (the record is split into multiple lines).
Can anyone guide me in writing a script that reads this input data file and reshapes it into a file containing exactly 31 fields in each record?
PS: I am new to PowerShell.
Sample data:
Good data - the whole record shows up in a single line.
Bad data - the record is broken into multiple lines.
Below is the structure of a record:
11/16/2007||0007327| 3904|1000|M1||CCM|12/31/2009|000|East 89th Street|01CM1| 11073|DONALD INC|001|Project 077|14481623.8100|0.0000|1.00000|1|EA|September 2007 Invoice|Project 027||000000000000|1330|11/16/2007|X||11/29/2007|2144.57
Here is what I have tried, and the script hangs:
#Setup paths
$Input = "Path\Input.txt"
$Output = "Path\Output.txt"
#Create empty variables to set types
$Record = ""
$Collection = @()
#Loop through text file
gc Path\Input.txt | %{
    $Record = "$Record$_"
    If($Record -Match "(\d{1,2}/\d{1,2}/\d{4}(?:\|.*?){31})(\d{1,2}/\d{1,2}/\d{4}\|.*?\|.*)"){
        $Collection += $Matches[1]
        $Record = $Matches[2]
    }
}
#Add last record to the collection
$Collection += $Record
$Collection | Out-File $Output
I see some issues that need to be clarified or addressed. First, I noticed the line $Record=$Matches[2] does not appear to serve a purpose. Second, your regex string appears to have some flaws. When I test your regex against your test data here: http://regex101.com/r/yA9tZ1/1
At least on that site, the forward slashes needed to be escaped. Once I escaped them, the tester threw this error at me:
Your expression took too long to evaluate.
I know the root of that issue comes from this portion of your regex, which is trying to match your passive group with a non-greedy quantifier 31 times: (?:\|.*?){31}
So, taking a guess as to your true intention, I have the following regex string:
(\d{1,2}\/\d{1,2}\/\d{4}.{31}).*?(\d{1,2}\/\d{1,2}\/\d{4}\|.*?\|.*)
You can see the results here: http://regex101.com/r/qY1jZ7/2
While I doubt it is exactly what you wanted, I hope this leads you in the right direction.
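If you want to try that pattern directly in PowerShell before wiring it into your loop, a quick sketch (here $joined stands in for one glued-together pair of records):
$pattern = '(\d{1,2}\/\d{1,2}\/\d{4}.{31}).*?(\d{1,2}\/\d{1,2}\/\d{4}\|.*?\|.*)'
if ($joined -match $pattern) {
    $completedRecord = $Matches[1]   # the record that is now whole
    $remainder       = $Matches[2]   # carry this forward as the new $Record
}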
I just tried this, and while that solution worked for an extremely similar issue where the user only had 11 fields per record, apparently it's just no good for your 31-field records. I'd like to suggest an alternative using -Split alongside a couple of regex matches. This should work faster for you, I think.
#Create regex objects to match against
[RegEx]$Regex = "(.*?)(\d{2}/\d{2}/\d{4})$"
[RegEx]$Regex2 = "(\d{2}/\d{2}/\d{4}.*)"
#Setup paths
$Input = "Path\Input.txt"
$Output = "Path\Output.txt"
#Create empty variables to set types
$Record = ""
$Collection = @()
#Loop through text file
gc $Input | %{
    If($_ -match "^\d{1,2}/\d{1,2}/\d{4}" -and $record.split("|").count -eq 31){
        $collection += $record
        $record = $_
    }
    else{
        $record = "$record$_"
        if($record.split("|").count -gt 31){
            $collection += $regex.matches(($record.split("|")[0..30]) -join "|").groups[1].value
            $record = $regex2.matches(($record.split("|")[30..($record.split("|").count)]) -join "|").groups[1].value
        }
    }
}
#Add last record to the collection
$collection += $record
#Output everything to a file
$collection | out-file $Output