Export to CSV PowerShell script using multiple foreach statements

I have the following PowerShell script that reads from a CSV file and exports to another CSV. It works in terms of basic functionality. The script below currently exports output like this:
USERS
jdoe
mprice
tsmith
Add-PSSnapin microsoft.sharepoint.powershell -ErrorAction SilentlyContinue
# csv file name
[parameter(Mandatory=$false)][string]$CsvFilePath = ".\AllSiteCollectionsLocal.csv"
$csvItems = Import-Csv $CsvFilePath
$resultsarray = @()
$firstObject = New-Object PSObject
# iterate lines in csv
foreach($Item in $csvItems)
{
$site = new-object Microsoft.SharePoint.SPSite($Item.SiteCollection)
$web = $site.openweb()
$siteUsers = $web.SiteUsers
Write-Host $Item.SiteCollection -ForegroundColor Green
foreach($user in $siteUsers)
{
Write-Host $user.LoginName
$loginnames = @{
USERS = $user.LoginName
}
$resultsarray += New-Object PSObject -Property $loginnames
}
$web.Dispose()
$site.Dispose()
$resultsarray | export-csv -Path c:\temp\sitesandusers.csv -NoTypeInformation
}
I need to export as shown below. Note that I don't even need a header, but I do need the $Item.SiteCollection value to print out between each iteration of users under each site, so the outer foreach needs to print $Item.SiteCollection and then the inner foreach would print $user.LoginName:
http://test1.com
jdoe
mprice
http://test2.com
tsmith

I'm guessing you wanted to declare parameters so your script can be called from elsewhere? As it stands, the parameter attribute on $CsvFilePath is redundant to what PowerShell already does for you.
As for your question, you would just have to output $Item.SiteCollection along with your user names. The += pattern also isn't needed, as PowerShell's streaming capabilities let you assign the loop output directly to a variable; += can be computationally expensive on larger lists and slows overall performance. Now we end up with:
Param (
[parameter(Mandatory=$false)]
[string]$CsvFilePath = ".\AllSiteCollectionsLocal.csv"
)
Add-PSSnapin microsoft.sharepoint.powershell -ErrorAction SilentlyContinue
$csvItems = Import-Csv $CsvFilePath
$variable = foreach($Item in $csvItems)
{
$site = new-object Microsoft.SharePoint.SPSite($Item.SiteCollection)
$web = $site.openweb()
$siteUsers = $web.SiteUsers
Write-Host -Object $Item.SiteCollection -ForegroundColor Green
Write-Output -InputObject $Item.SiteCollection
foreach($user in $siteUsers)
{
Write-Host -Object $user.LoginName
Write-Output -InputObject $user.LoginName
}
$null = $web.Dispose()
$null = $site.Dispose()
}
$variable | Out-File -FilePath 'c:\temp\sitesandusers.csv'
Bypassing $variable, you can send the output directly to the file by placing the export outside the outer foreach statement.
This requires the use of the subexpression operator $() to wrap around the loop.
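For example (a minimal sketch of the same loop, with the Write-Host calls omitted):
# Sketch: wrap the loop in the $() subexpression so its output streams straight into the file
$(
    foreach ($Item in $csvItems)
    {
        $site = New-Object Microsoft.SharePoint.SPSite($Item.SiteCollection)
        $web  = $site.OpenWeb()
        Write-Output -InputObject $Item.SiteCollection
        foreach ($user in $web.SiteUsers)
        {
            Write-Output -InputObject $user.LoginName
        }
        $null = $web.Dispose()
        $null = $site.Dispose()
    }
) | Out-File -FilePath 'c:\temp\sitesandusers.csv'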
I also added a Param ( ) block for your parameter declaration.
I didn't touch the parameter attributes, as they can show the author's intentions regardless of whether they're needed.
I should probably add that Write-Output explicitly writes to the success stream, allowing the values to be assigned to the variable, whereas Write-Host writes to the information stream, so no object pollution (duplicates) occurs.
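A quick way to see the difference between the two streams (a small, self-contained sketch):
# Only the success-stream value is captured; the Write-Host text goes to the console (information stream)
$captured = & {
    Write-Host   'printed to the console only'
    Write-Output 'captured into the variable'
}
$captured   # -> 'captured into the variable'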

Speed Powershell Script Up

I am looking for a way to speed up my PowerShell script. I have a script that returns the manager Employee ID and manager name based on a .txt file that has the samaccountnames for each user under that manager. The problem is the list is very long, about 1400+ names, and the script is taking forever to run. Here is my script. It works; I'm just looking for a way to speed it up:
cls
If (!(Get-Module -Name activerolesmanagementshell -ErrorAction SilentlyContinue))
{
Import-Module activerolesmanagementshell
}
Write-host $("*" * 75)
Write-host "*"
Write-host "* Input file should contain just a list of samaccountnames - no header row."
Write-host "*"
Write-host $("*" * 75)
$File = Read-Host -Prompt "Please supply a file name"
If (!(test-path $File))
{
Write-host "Sorry couldn't find the file...buh bye`n`n"
exit
}
get-content $File | %{
$EmpInfo = get-qaduser -proxy -Identity $_ -IncludedProperties employeeid,edsva_SSCOOP_managerEmployeeID
# Check if we received back a Manager ID - if yes, get the Manager's name
# If not, set the Manager Name to "NONE" for output
If ($($EmpInfo.edsva_SSCOOP_managerEmployeeID).length -gt 2)
{
# Get the Manager's name from AD
$($EmpInfo.edsva_SSCOOP_managerEmployeeID)
$ManagerName = $(Get-QADUser -SearchAttributes @{employeeid=$($EmpInfo.edsva_SSCOOP_managerEmployeeID)} | select name).name
If (!$ManagerName)
{
$ManagerName = "NONE"
}
# Add the Manager name determined above (or NONE) to the properties we'll eventually output
$EmpInfo | Add-Member -MemberType NoteProperty -Name ManagerName -Value $ManagerName
}
Else
{
$EmpInfo.edsva_SSCOOP_managerEmployeeID = "NONE"
}
# Output user samaccountname edsva_SSCOOP_managerEmployeeID and ManagerName to a file
$EmpInfo | select samaccountname,edsva_SSCOOP_managerEmployeeID,ManagerName | export-csv "C:\Users\sfp01\Documents\Data_Deletion_Testing\Script_DisaUser_MgrEmpID\Disabled_Users_With_Manager.txt" -NoTypeInformation -Append
} # End of file processing loop
OK, first things first... you're asking your user to type in a file name. Give them a nice friendly dialog box instead; it takes little effort. Here's a function I keep on hand:
Function Get-FilePath{
[CmdletBinding()]
Param(
[String]$Filter = "All Files (*.*)|*.*|Comma Separated Values (*.csv)|*.csv|Text Files (*.txt)|*.txt",
[String]$InitialDirectory = $home,
[String]$Title)
[void][System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms")
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $InitialDirectory
$OpenFileDialog.filter = $Filter
$OpenFileDialog.Title = $Title
[void]$OpenFileDialog.ShowDialog()
$OpenFileDialog.filename
}
Then you can do:
$File = Get-FilePath -Filter 'Text Files (*.txt)|*.txt|All Files (*.*)|*.*' -InitialDirectory "$home\Desktop" -Title 'Select user list'
That doesn't speed things up, it's just a quality of life improvement.
Secondly, your 'can't find the file' message will appear as the window closes, so the person that ran your script probably won't see it. Towards that end I have a function that I use to pause a script with a message.
Function Invoke-Pause ($Text){
[reflection.assembly]::LoadWithPartialName('Windows.Forms')|out-null
If($psISE){
[Windows.Forms.MessageBox]::Show("$Text", "Script Paused", [Windows.Forms.MessageBoxButtons]"OK", [Windows.Forms.MessageBoxIcon]"Information") | ?{(!($_ -eq "OK"))}
}Else{
Write-Host $Text
Write-Host "Press any key to continue ..."
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
}
}
With that you can get a message to the user, and then close the script so the user knows what happened. This function works in both the PowerShell console, as well as in the PowerShell ISE. In the console you get a text message that you define, and then a 'Press any key to continue...' message, and it waits for the user to press a key. In the ISE it pops up a window with your message, and waits for the user to click the OK button before proceeding. You could do something like:
If(!(Test-Path $File)){Invoke-Pause "Sorry couldn't find the file...buh bye";exit}
Now to get on to speeding things up!
You have more than one employee per manager, right? So why look up the manager more than once? Set up a hashtable to keep track of your manager info, and then only look a manager up if you can't find them in the hashtable. Before your loop, declare $Managers as a hashtable that just maps 'NONE' to 'NONE', then inside the loop populate it as needed, and reference it later.
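The caching pattern itself looks roughly like this (a sketch; Get-ManagerName and $managerIds are hypothetical placeholders, not cmdlets from this thread):
# Sketch of the cache-on-first-use pattern
$Managers = @{ 'NONE' = 'NONE' }
foreach ($id in $managerIds) {
    if (-not $Managers.ContainsKey($id)) {
        # the expensive directory lookup happens only once per unique manager id
        $Managers[$id] = Get-ManagerName -EmployeeId $id
    }
    $Managers[$id]
}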
Also, you are appending to a file for each user. That means PowerShell has to get a file lock on the file, write to it, close the file, and release the lock on it... over and over and over and over... Just pipe your users down the pipeline and write to the file once at the end.
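In pipeline form the change looks roughly like this (a sketch only; the output path is a placeholder and the manager lookup is omitted for brevity):
# Sketch: emit one object per user and export once, instead of Export-Csv -Append per user
Get-Content $File | ForEach-Object {
    Get-QADUser -Proxy -Identity $_ -IncludedProperties employeeid,edsva_SSCOOP_managerEmployeeID
} | Select-Object samaccountname, edsva_SSCOOP_managerEmployeeID |
    Export-Csv 'C:\temp\Disabled_Users_With_Manager.csv' -NoTypeInformation
Putting it all together with the dialog, the manager cache, and the single export, the revised script becomes: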
Function Get-FilePath{
[CmdletBinding()]
Param(
[String]$Filter = "All Files (*.*)|*.*|Comma Separated Values (*.csv)|*.csv|Text Files (*.txt)|*.txt",
[String]$InitialDirectory = $home,
[String]$Title)
[void][System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms")
$OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
$OpenFileDialog.initialDirectory = $InitialDirectory
$OpenFileDialog.filter = $Filter
$OpenFileDialog.Title = $Title
[void]$OpenFileDialog.ShowDialog()
$OpenFileDialog.filename
}
cls
If (!(Get-Module -Name activerolesmanagementshell -ErrorAction SilentlyContinue))
{
Import-Module activerolesmanagementshell
}
Write-host $("*" * 75)
Write-host "*"
Write-host "* Input file should contain just a list of samaccountnames - no header row."
Write-host "*"
Write-host $("*" * 75)
$File = Get-FilePath -Filter 'Text Files (*.txt)|*.txt|All Files (*.*)|*.*' -InitialDirectory "$home\Desktop" -Title 'Select user list'
If (!(test-path $File))
{
Write-host "Sorry couldn't find the file...buh bye`n`n"
exit
}
$Managers = @{'NONE'='NONE'}
Get-Content $File | %{
$EmpInfo = get-qaduser -proxy -Identity $_ -IncludedProperties employeeid,edsva_SSCOOP_managerEmployeeID
Switch($EmpInfo.edsva_SSCOOP_managerEmployeeID){
{$_.Length -lt 2} {$EmpInfo.edsva_SSCOOP_managerEmployeeID = 'NONE'}
{$_ -notin $Managers.Keys} {
$MgrLookup = Get-QADUser -SearchAttributes @{employeeid=$EmpInfo.edsva_SSCOOP_managerEmployeeID} |% Name
If(!$MgrLookup){$MgrLookup = 'NONE'}
$Managers.add($EmpInfo.edsva_SSCOOP_managerEmployeeID,$MgrLookup)
}
}
Add-Member -InputObject $EmpInfo -NotePropertyName 'ManagerName' -NotePropertyValue $Managers[$EmpInfo.edsva_SSCOOP_managerEmployeeID] -PassThru
} | select samaccountname,edsva_SSCOOP_managerEmployeeID,ManagerName | Export-Csv "C:\Users\sfp01\Documents\Data_Deletion_Testing\Script_DisaUser_MgrEmpID\Disabled_Users_With_Manager.txt" -NoTypeInformation -Append

Cannot bind argument to parameter 'InputObject' because it is null

I have a powershell script that measures download time on some pages, however I get the error above, I am unsure what I am doing wrong
error is
Cannot bind argument to parameter 'InputObject' because it is null.
function ResponseTime($CommonName,$URL, $environment)
{
$Times = 5
$i = 0
$TotalResponseTime = 0
Write-HOst $URL
While ($i -lt $Times) {
$Request = New-Object System.Net.WebClient
$Request.UseDefaultCredentials = $true
$Start = Get-Date
Write-HOst $URL
$PageRequest = $Request.DownloadString($URL)
$TimeTaken = ((Get-Date) - $Start).TotalMilliseconds
$Request.Dispose()
$i ++
$TotalResponseTime += $TimeTaken
}
$AverageResponseTime = $TotalResponseTime / $i
Write-Host Request to $CommonName took $AverageResponseTime ms in average -ForegroundColor Green
$details = @{
Date = get-date
AverageResponseTime = $AverageResponseTime
ResponseTime = $Destination
Environment = $environment
}
$results += New-Object PSObject -Property $details
$random = Get-Random -minimum 1 -maximum 30
Start-Sleep -s $random
}
#PRODUCTION
ResponseTime -commonname 'app homepage' -URL 'https://url1' -environment 'PRODUCTION'
ResponseTime -commonname 'department homepage' -URL 'https://url2' -environment 'PRODUCTION'
$results | export-csv -Path c:\so.csv -NoTypeInformation
Reviewing your last edit, it seems that $results is simply $null (as your error says).
The only line setting $results is $results += New-Object PSObject -Property $details.
It is not in the scope of your Export-Csv call, and even if it were, $results could still be empty if that line is never reached.
You should, IMHO, set it to e.g. an ArrayList as follows:
$results = New-Object -TypeName System.Collections.ArrayList
And add items to it via
$times = ResponseTime -commonname '' #etc
$results.Add($times) | Out-Null
This gives you an ArrayList - even if there are no items in it - which can easily be transformed to CSV and other formats.
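Put together, that could look like this (a minimal sketch, assuming ResponseTime is updated to emit the PSObject it builds instead of appending to a local $results):
# Sketch: collect each function call's output in an ArrayList, then export once
$results = New-Object -TypeName System.Collections.ArrayList
$null = $results.Add((ResponseTime -CommonName 'app homepage'        -URL 'https://url1' -Environment 'PRODUCTION'))
$null = $results.Add((ResponseTime -CommonName 'department homepage' -URL 'https://url2' -Environment 'PRODUCTION'))
$results | Export-Csv -Path 'c:\so.csv' -NoTypeInformation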
@Clijsters has given the correct answer; i.e. the issue being the scope of your $results variable.
This answer just provides a bit of a code review to help you with other bits going forwards...
function Get-ResponseTime {
[CmdletBinding()]
param (
[Parameter(Mandatory = $true)]
[string]$CommonName
,
[Parameter(Mandatory = $true)]
[string]$URL
,
[Parameter(Mandatory = $true)]
[string]$Environment
,
[Parameter(Mandatory = $false)]
[int]$Times = 5
)
[System.Int64]$TotalResponseTime = 0
[System.Diagnostics.Stopwatch]$stopwatch = New-Object 'System.Diagnostics.Stopwatch'
Write-Verbose "Processing URL: $URL"
1..$times | foreach-object {
[System.Net.WebClient]$Request = New-Object 'System.Net.WebClient'
$Request.UseDefaultCredentials = $true
Write-Verbose "Call $_ to URL: $URL"
$stopwatch.Restart()
$PageRequest = $Request.DownloadString($URL)
$stopwatch.Stop()
$TimeTaken = $stopwatch.Elapsed.TotalMilliseconds
$Request.Dispose()
$TotalResponseTime += $TimeTaken
}
$AverageResponseTime = $TotalResponseTime / $Times
Write-Verbose "Request to $CommonName took $AverageResponseTime ms on average"
$details = @{
Date = get-date
AverageResponseTime = $AverageResponseTime
#ResponseTime = $Destination #this is not declared anywhere / don't know what this field's for
Environment = $environment
}
Write-Output (New-Object 'PSObject' -Property $details)
#do you really want a delay here? Doesn't make much sense... may make sense to include a delay in the above loop; i.e. to stagger your tests?
#$random = Get-Random -minimum 1 -maximum 30
#Start-Sleep -s $random
}
#PRODUCTION
[PSObject[]]$results = @(
(Get-ResponseTime -commonname 'app homepage' -URL 'https://url1' -environment 'PRODUCTION' -Verbose)
,(Get-ResponseTime -commonname 'department homepage' -URL 'https://url2' -environment 'PRODUCTION' -Verbose)
)
$results | Export-Csv -LiteralPath 'c:\so.csv' -NoTypeInformation
Use verb-noun function names (e.g. Get-Item). What is the naming convention for Powershell functions with regard to upper/lower case usage?
Use "Cmdlets" (Advanced Functions) instead of (Basic) Functions; they're basically the same thing, only tagged with [Cmdletbinding()]. The reason for this you get support for functionality such as verbose output. http://www.lazywinadmin.com/2015/03/standard-and-advanced-powershell.html
Use a stopwatch to time processes (you could also use measure-command; but any output would be suppressed / consumed by the measure-command function). Timing a command's execution in PowerShell
Have your cmdlet output its values to the pipeline via Write-Output (or you can leave off the function name; any output caused by placing a variable with nothing to process it will be fed to the pipeline; i.e. write-object $a is the same as a line solely consisting of $a).
Capture the output into your $results variable outside of the function, and handle the results there.
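To illustrate the Measure-Command point above (a sketch; the URL is the placeholder from the question):
# Measure-Command returns a TimeSpan, but the measured command's own output is consumed
$elapsed = Measure-Command { (New-Object System.Net.WebClient).DownloadString('https://url1') }
$elapsed.TotalMilliseconds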

How do I use PowerShell to pull headers from a OneNote document

Background:
In my work environment, we have a transitional location for our knowledgebase notes. These reside in a number of OneNote 2016 workbooks which have been maintained over years. I am currently in the middle of delegating content update efforts to our staff and part of this work involves importing all our OneNote notebook names and section names into an excel spreadsheet for hierarchy management.
Task: I spent ages looking online for an easy and quick way to export hierarchy information from OneNote to csv using PowerShell and could not for the life of me find an easy way that worked. The following code resonated through the interwebs but each time I tried to run the code, I kept getting errors.
$onenote = New-Object -ComObject OneNote.Application
$scope = [Microsoft.Office.Interop.OneNote.HierarchyScope]::hsPages
[ref]$xml = $null
$onenote.GetHierarchy($null, $scope, $xml)
$schema = @{one="http://schemas.microsoft.com/office/onenote/2013/onenote"}
$xpath = "//one:Notebook/one:Section"
Select-Xml -Xml (
$xml.Value) -Namespace $schema -XPath $xpath |
foreach {
$node = $psitem.Node
$npath = Split-Path -Path $node.Path -Parent
$props = [ordered]@{
Workbook = Split-Path -Path $npath -Leaf
Section = $node.Name
}
New-Object -TypeName PSObject -Property $props
}
Error:
The error I would get from executing this code was as follows:
value of type "System.String" to type "System.Xml.XmlNode".
At line:10 char:17
+ Select-Xml -Xml (
Solution:
In the end I had to break down the established connection to the OneNote application and found a workable solution for OneNote 2016. I've provided my solution below but am keen to hear of any other possible ways to manipulate this data effectively in the future:
Function Get-OneNoteHeaders{
[CmdletBinding()]
Param()
Begin
{
$onenote = New-Object -ComObject OneNote.Application
$scope = [Microsoft.Office.Interop.OneNote.HierarchyScope]::hsPages
[ref]$xml = $null
$csvOutput = "c:\temp\onenote-headers.csv"
}
Process
{
$onenote.GetHierarchy($null, $scope, $xml)
[xml]$result = ($xml.Value)
Foreach($notebook in $($result.DocumentElement.Notebook)){
Add-content -Path $csvOutput -Value "$($notebook.name)"
Foreach($section in $($notebook.section)){
Add-content -Path $csvOutput -Value ",$($section.name)"
Foreach($page in $section.page){
Add-content -Path $csvOutput -Value ",,$($page.name)"
}
}
}
}
End{}
}
#Get-OneNoteHeaders
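Uncommenting that last line (or calling the function directly) runs the export; you can then inspect the file it writes (a small sketch):
# Sketch: run the export, then peek at the first few lines of the output file
Get-OneNoteHeaders
Get-Content -Path 'c:\temp\onenote-headers.csv' -TotalCount 10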

PowerShell return collection object as duplicates

Hi, apologies if this has been asked before.
I have a function that builds an object array of group members. I can see it works fine inside the function, but the returned object has exactly double the members - I tried an ArrayList and that was even worse. Can somebody please explain what is going on?
function Get-MsolGroupMembers
{
[CmdletBinding()]
param
(
[Parameter(Mandatory=$true, Position=0)]
[string]
$SearchString
)
$groups = Get-MsolGroup -SearchString $SearchString -MaxResults 1
$retObjs = @()
Write-Host -fore Yellow $groups.Count 'Group(s) found'
foreach ($group in $groups)
{
$groupGUID = $group.ObjectId
$groupDisplayName = $group.DisplayName
$groupEmail = $group.EmailAddress
$groupType = $group.GroupType
$groupMembers = Get-MsolGroupMember -GroupObjectId $groupGUID -All
foreach ($groupMember in $groupMembers)
{
$Properties = @{"GroupDisplayName"=$groupDisplayName;
"GroupEmail"=$groupEmail;
"GroupType"=$groupType;
"MemberDisplayName"=$groupMember.DisplayName;
"MemberEmail"=$groupMember.EmailAddress;
"MemberType"=$groupMember.GroupMemberType}
$Obj = New-Object -TypeName PSObject -Property $Properties
Write-Output $Obj | select GroupDisplayName,GroupEmail,GroupType,MemberDisplayName,MemberEmail,MemberType
$retObjs += $Obj
}
return $retObjs;
}
}
$members = Get-MsolGroupMembers -SearchString 'My Test Group'
$members.Count
Sure, this is easy. You're outputting everything twice. Once with the Write-Output line, and then again with the return line. PowerShell functions return anything to the pipeline that is not specifically redirected (such as with Write-Host or Export-Csv), so both of those commands essentially do the same thing, which is where your doubling comes from. Remove one or the other and you'll be all set.
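A stripped-down demonstration of the effect (a hypothetical function, not your original code):
# Each value reaches the pipeline twice: once via Write-Output, once via return
function Test-Doubling {
    $items = @()
    foreach ($i in 1..3) {
        Write-Output $i     # emitted immediately
        $items += $i
    }
    return $items           # the accumulated values are emitted again
}
(Test-Doubling).Count       # 6, not 3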

How to extract metadata using a specific filename (get-childitem) rather than looping through ComObject namespace items

I have found multiple code snippets to scroll through a folder and display the metadata of each item in the folder, like this:
function funLine($strIN)
{
$strLine = "=" * $strIn.length
Write-Host -ForegroundColor Yellow "`n$strIN"
Write-Host -ForegroundColor Cyan $strLine
}
$sfolder = "S:\Temp"
$objShell = New-Object -ComObject Shell.Application
$objFolder = $objShell.namespace($sFolder)
foreach ($strFileName in $objFolder.items())
{funline "$($strFileName.name)"
for ($a ; $a -le 266; $a++)
{
$a
if($objFolder.getDetailsOf($strFileName, $a))
{
$hash += @{ $($objFolder.getDetailsOf($objFolder.items, $a)) = $a.tostring() + $($objFolder.getDetailsOf($strFileName, $a)) }
$hash | out-file c:\temp\output.txt -Append
$hash.clear()
}
}
$a=0
}
But in my script, I would like to loop through the folder(s) using Get-ChildItem and for selected files, I would like to use the getDetailsOf() to extract the authors of MS Office documents.
So, knowing the filename (for example, $strFileName), can I skip looping through each $strFileName in $objFolder.items() and just access the metadata details (where $a = 20) for the authors of $strFileName?
I have seen it done using "New-Object -ComObject word.application" but I believe that opens the document, so on a large file system with many files locked by users, this could be slow and painful.
Can I just jump to the index of $objFolder.items() for my selected filename?
I was curious how it'd be done too, so I looked it up and made a function that'll add that property to your [FileInfo] object (what's normally passed for a file by the Get-ChildItem cmdlet).
Function Get-CreatedBy{
[cmdletbinding()]
Param(
[Parameter(ValueFromPipelineByPropertyName=$true)]
[Alias("Path")]
[string[]]$FullName
)
Begin{
$Shell = New-Object -ComObject Shell.Application
}
Process{
ForEach($FilePath in $FullName){
$NameSpace = $Shell.NameSpace((Split-Path $FilePath))
$File = $NameSpace.ParseName((Split-Path $FilePath -Leaf))
$CreatedBy = $NameSpace.GetDetailsOf($File,20)
[System.IO.FileInfo]$FilePath|Add-Member 'CreatedBy' $CreatedBy -PassThru
}
}
}
Then you can just pipe things to that, or specify a path directly like:
Get-ChildItem *.docx | Get-CreatedBy | FT Name,CreatedBy
or
Get-CreatedBy 'C:\Temp\File.docx' | Select -Expand CreatedBy
Edit: Fixed for arrays of files! Sorry about the previous error.
Thanks Matt! Although that question was different, it had the one piece I was looking for - how to reference $objFolder.items().item($_.Name)
So this makes a quick little snippet to display the Authors (or any other metadata field):
$FullName = "S:\Temp\filename.xlsx"
$Folder = Split-Path $FullName
$File = Split-Path $FullName -Leaf
$objShell = New-Object -ComObject Shell.Application
$objFolder = $objShell.namespace($Folder)
$Item = $objFolder.items().item($File)
$Author = $objFolder.getDetailsOf($Item, 20)
Write-Host "$FullName is owned by $Author"
Where Author is the 20th metadata item.
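If you're not sure which index holds the field you want, you can list the column names the same way (a small sketch built on the question's own GetDetailsOf loop):
# Sketch: enumerate the extended-property column names for a folder to find other metadata indexes
$shell  = New-Object -ComObject Shell.Application
$folder = $shell.NameSpace('S:\Temp')
0..266 | ForEach-Object {
    $name = $folder.GetDetailsOf($folder.Items(), $_)
    if ($name) { '{0,3}  {1}' -f $_, $name }
}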