PowerShell function - get multiple values

I have a file, 1.csv:
number, name # column name
1,john
2,mike
3,test
4,test2
...
I created a function to return all values from this CSV (number, name):
Function Get-CSV {
[CmdletBinding()]
param (
# CSV File path
[Parameter(Mandatory=$true)][string]$path
)
#Create a hashtable variable
[hashtable]$return = @{}
Import-Csv $path |
ForEach-Object {
$number = $_.number
$name = $_.name
$return.name = $name
$return.number = $number
return $return
}
# return $return (un-commenting this line doesn't change the output)
}
# calling function
$a = Get-CSV "C:\Users\1.csv"
$a.number
$a.name
I get only one result (the last row from the CSV): $a.name = test2 and $a.number = 4.
How do I get all rows from the CSV when calling this function?

You need to construct an array of hashtables for this to work. Even better, I would create an array of objects because it gives you control over the property names. You can change the function definition to:
Function Get-CSV {
[CmdletBinding()]
param (
# CSV File path
[Parameter(Mandatory=$true)][string]$path
)
#Create an empty array variable
$return = @()
Import-Csv $path |
ForEach-Object {
$return += ,(New-Object PSObject -Property @{'name'= $_.name; 'number'= $_.number})
}
return $return
}
This gives you:
$a = Get-CSV "C:\Users\1.csv"
$a
name number
---- ------
john 1
mike 2
test 3
test2 4
Note
I'm not sure on your exact use case, but Import-Csv already gives you the information back as an object, so there may not be a need to create a separate one.
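For instance, a minimal sketch along those lines (assuming the CSV headers are exactly number and name, as in the question) could simply return what Import-Csv emits:
Function Get-CSV {
    [CmdletBinding()]
    param (
        # CSV file path
        [Parameter(Mandatory=$true)][string]$path
    )
    # Import-Csv already produces one object per row, with .number and .name properties
    Import-Csv -Path $path
}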

After a lot of googling and trial and error, I found a solution:
Function Get-CSV {
[CmdletBinding()]
param (
# CSV File path
[Parameter(Mandatory=$true)][string]$path
)
Import-Csv $path |
ForEach-Object {
$number = $_.number
$name = $_.name
[pscustomobject]@{
name = $name
number = $number
}
}
}
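Calling it then returns one object per row; a quick usage sketch:
$rows = Get-CSV "C:\Users\1.csv"
foreach ($row in $rows) {
    # each row exposes the CSV columns as properties
    "$($row.number): $($row.name)"
}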

Related

Powershell script to search and replace text in a file using two columns in a separate reference file

I want a script that can help me check for the name of the keyset (column A) in Sample.csv and then replace the current command (column B) with the new command (column C) in the Source text file.
CSV file: Sample.csv
A. | B. | C.
Manock | 2B | 2ab
Sterling | 3F | 3sf
Source file text: Source.txt
keyset "Manock"
(
key("SELECT")
command ("display/app=%disapp% "2B")
);
So desired output:
keyset "Manock"
(
key("SELECT")
command ("display/app=%disapp% "2ab")
);
Powershell Script:
New-Item -Path "C:\Users\e076200\Desktop\ks_update\source.txt" -ItemType File -Force
$data = Get-Content C:\Users\e076200\Desktop\ks_update\source.ddl
Add-Content -Value $data -Path "C:\Users\e076200\Desktop\ks_update\source.txt"
$foundline = $false
$a = 0
$Etxt = foreach($line in Get-Content C:\Users\e076200\Desktop\ks_update\source.txt)
{
if ($line -match 'keyset "Manock"' )
{
$a = 0
$foundline = $true
}
$a= $a + 1
if($line -match "display/app" -and $a -eq 5 -and $foundline -eq $true)
{
$line = $line.replace('2b' , '2ab')
$line
}
else
{
$line
}
}
$Etxt | Set-Content C:\Users\e076200\Desktop\ks_update\source.txt -Force
$users = Import-Csv -Path C:\Users\e076200\Desktop\ks_update\sample.csv
I've figured out how to find and replace one line in the file directly. I've also figured out how to import the CSV. I need help making the logic parameterized, using column A of the CSV as the match piece and column C as the replacement piece.
Script Explanation.
New-Item -Path "C:\Users\e076200\Desktop\ks_update\source.txt" -ItemType File -Force
New-Item creates a new text file at the location defined by -Path, using the name specified at the end (source.txt).
-ItemType defines the type of item being created; -Force creates the file even if one already exists.
$data = Get-Content C:\Users\e076200\Desktop\ks_update\source.ddl
Retrieves the content of the .ddl file and stores it in a variable.
Add-Content -Value $data -Path "C:\Users\e076200\Desktop\ks_update\source.txt"
Writes the content from the variable into the newly created text file.
$foundline = $false
Flag variable that is set once the keyset identifier is found.
$a = 0
Line counter used by the if statement below.
$Etxt = foreach($line in Get-Content C:\Users\e076200\Desktop\ks_update\source.txt)
$Etxt collects the output of the foreach loop; $line holds each line of the text file in turn.
{
if ($line -match 'keyset "Manock"' )
{
$a = 0
$foundline = $true
}
If keyset identifier is found, set counter to 0 and set conditional variable to true
$a= $a + 1
if($line -match "display/app" -and $a -eq 5 -and $foundline -eq $true)
{
$line = $line.replace('2b' , '2ab')
$line
Once the keyset identifier is matched, the counter restarts at 0 and increments with each line until it reaches 5, the line where the item to be replaced is expected.
For redundancy, that line is also checked for the identifier "display/app".
If the redundant check passes and the counter is 5, the word is replaced with the String.Replace method.
The overwritten data is returned in $line.
}
else
{
$line
}
Else retain line
}
$Etxt | Set-Content C:\Users\e076200\Desktop\ks_update\source.txt -Force
Writes the updated lines back to the text file.
$users = Import-Csv -Path C:\Users\e076200\Desktop\ks_update\sample.csv
Imports the reference CSV file.
Please make the explanation as dumbed down as possible. Thank you.
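One way to parameterize that logic with the CSV (a rough sketch, not tested against your real files; it assumes the reference file's column headers are literally A, B, and C as in the sample above):
$users = Import-Csv -Path C:\Users\e076200\Desktop\ks_update\sample.csv
$lines = Get-Content C:\Users\e076200\Desktop\ks_update\source.txt
$current = $null   # CSV row for the keyset block currently being read
$updated = foreach ($line in $lines) {
    if ($line -match 'keyset\s+"([^"]+)"') {
        $keysetName = $matches[1]
        # look up the keyset name (column A) in the reference CSV
        $current = $users | Where-Object { $_.A -eq $keysetName }
    }
    if ($current -and $line -match 'display/app') {
        # swap the current command (column B) for the new command (column C)
        $line = $line.Replace($current.B, $current.C)
        $current = $null
    }
    $line
}
$updated | Set-Content -Path C:\Users\e076200\Desktop\ks_update\source.txt -Force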

Powershell : Find a group of text in a file then extract a specific line in that group of text

I've been on this for few days now, I'm trying to parse multiple text files containing data like this :
[Cluster1]
GatewayIp=xx.xxx.xxx.xx
IpAddress=xx.xxx.xxx.x
MTU=0000
NetMask=xxx.xxx.xxx.0
Port=xxx
Protocol=xxxx/xxxxx
Sessions=xxxxxx
Bands=xxx, xxx, x
Binding=xxxxx
GroupNumber=x
InitQueue=xxxxxx
Interface=xxxxxx
Process=xxx
SupportsCar=No
SupportsCom=Yes
SupportsPos=Yes
SupportsXvd=No
[Cluster2]
GatewayIp=xx.xxx.xxx.xx
IpAddress=xx.xxx.xxx.x
MTU=0000
NetMask=xxx.xxx.xxx.0
Port=xxx
Protocol=xxxx/xxxxx
Sessions=xxxxxx
Bands=xxx, xxx, x
Binding=xxxxx
GroupNumber=x
InitQueue=xxxxxx
Interface=xxxxxx
Process=xxx
SupportsCar=No
SupportsCom=No
SupportsPos=No
SupportsXvd=Yes
I want to extract the "IpAddress" from the section where those lines are present:
SupportsCom=Yes
SupportsPos=Yes
The thing is, I've tried using -context to grab the nth line after the section name "[Cluster1]", but that section name is different from file to file ...
$ip = Select-String -Path "$location" -Pattern "\[Cluster1\]" -Context 0,2 |
Foreach-Object {$_.Context.PostContext}
I've tried using the Precontext to grab the Nth line before SupportsCom=Yes, but the line position of "IpAddress=" is different from file to file ...
$ip = Select-String -Path "$location" -Pattern " SupportsCom=Yes" -Context 14,0 |
Foreach-Object { $_.Line,$_.Context.PreContext[0].Trim()}
Is there a way to grab the section containing "SupportsCom=Yes", knowing that the section is delimited by a blank line above and below, then search that section for a string that contains "IpAddress=" and return the value after the "="?
OK, since you are not allowed to use a module (perhaps later..), this should get you what you want:
# change the extension in the Filter to match that of your files
$configFiles = Get-ChildItem -Path 'X:\somewhere' -Filter '*.ini' -File
$result = foreach ($file in $configFiles) {
# initialize these variables to $null
$IpAddress = $supportsCom = $supportsPos = $null
# loop through the file line by line and try regex matches on them
switch -Regex -File $file {
'^\[([^\]]+)]' {
# did we get all wanted entries from the previous cluster?
if ($IpAddress -and $supportsCom -and $supportsPos) {
if ($supportsCom -eq 'Yes' -and $supportsPos -eq 'Yes') {
# just output the IpAddress so it gets collected in variable $result
$IpAddress
}
# reset the variables to $null
$IpAddress = $supportsCom = $supportsPos = $null
}
# start a new cluster
$cluster = $matches[1]
}
'^\s*IpAddress\s*=\s*(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})' { $IpAddress = $matches[1] }
'^\s*SupportsCom\s*=\s*(Yes|No)' { $supportsCom = $matches[1] }
'^\s*SupportsPos\s*=\s*(Yes|No)' { $supportsPos = $matches[1] }
}
# the switch only emits when it hits the next section header, so also check the last cluster in the file
if ($IpAddress -and $supportsCom -eq 'Yes' -and $supportsPos -eq 'Yes') {
$IpAddress
}
}
# show results on screen
$result
# or save as text file
$result | Set-Content -Path 'X:\somewhere\IpAddresses.txt'
Updated answer:
If you don't care about the name of the section(s) that IpAddress is found in, you can use this "one-liner" (broken into multiple lines for readability):
$ip = (Get-Content $location -Raw) -split '\[.+?\]' |
ConvertFrom-StringData |
Where-Object { $_.SupportsCom -eq 'Yes' -and $_.SupportsPos -eq 'Yes' } |
ForEach-Object IpAddress
The Get-Content line reads the input file as a single multi-line string and splits it at the section headers (e.g. [Cluster1]).
ConvertFrom-StringData converts the Key = Value lines into one hashtable per section.
For each hashtable, Where-Object checks whether it contains SupportsCom=Yes and SupportsPos=Yes
ForEach-Object IpAddress is shorthand for writing Select-Object -ExpandProperty IpAddress which gives you the actual value of IpAddress instead of an object that contains a member named IpAddress.
Note that $ip can be either a single string value or an array of strings (if there are multiple matching sections).
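If the code that consumes $ip always expects a collection, a simple sketch is to wrap it in the array subexpression operator:
# @() yields an array even when only a single section matched
foreach ($address in @($ip)) {
    "Found IpAddress: $address"
}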
Original answer:
You could also write a general-purpose function that converts INI sections into objects. This enables you to use the pipeline with a simple Where-Object statement to get the data you are interested in.
Generic function to output INI sections as objects, one by one:
Function Read-IniObjects {
[CmdletBinding()]
param (
[Parameter(Mandatory, ValueFromPipeline)] [String] $Path
)
process {
$section = @{} # A hashtable that stores all properties of the currently processed INI section.
# Read file line by line and match each line by the given regular expressions.
switch -File $Path -RegEx {
'^\s*\[(.+?)\]\s*$' { # [SECTION]
# Output all data of previous section
if( $section.Count ) { [PSCustomObject] $section }
# Create new section data
$section = [ordered] @{ IniSection = $matches[ 1 ] }
}
'^\s*(.+?)\s*=\s*(.+?)\s*$' { # KEY = VALUE
$key, $value = $matches[ 1..2 ]
$section.$key = $value
}
}
# Output all data of last section
if( $section.Count ) { [PSCustomObject] $section }
}
}
Usage:
$ip = Read-IniObjects 'test.ini' |
Where-Object { $_.SupportsCom -eq 'Yes' -and $_.SupportsPos -eq 'Yes' } |
ForEach-Object IpAddress
Notes:
The INI file is parsed using the switch statement, which can directly use a file as input. This is much faster than using a Get-Content loop.
As we are using -RegEx parameter, the switch statement matches each line of the file to the given regular expressions, entering the case branches only if the current line matches.
Get a detailed explanation of how the RegExes work:
match lines like [Section] -> RegEx101
match lines like Key = Value -> RegEx101
ForEach-Object IpAddress is shorthand for writing Select-Object -ExpandProperty IpAddress which gives you the actual value of IpAddress instead of an object that contains a member named IpAddress.
Note that $ip can be either a single string value or an array of strings (if there are multiple matching sections).
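As a quick illustration of what those two patterns capture, run against sample lines from the files above:
# [SECTION] header: capture group 1 is the section name
'[Cluster1]' -match '^\s*\[(.+?)\]\s*$'                # True; $matches[1] -> 'Cluster1'
# KEY = VALUE line: groups 1 and 2 are the key and the value
'SupportsCom=Yes' -match '^\s*(.+?)\s*=\s*(.+?)\s*$'   # True; $matches[1..2] -> 'SupportsCom', 'Yes'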

Remove the need to use out-file only to import the file immediately using PowerShell just to convert the base type

I am attempting to turn the file below into one that contains no comments ('#'), no blank lines, no unneeded spaces, and only one entry per line. I'm unsure how to do this without writing the file out and then re-importing it. There should be a way that doesn't require that step, but I can't find it. The way I wrote my script also doesn't look right to me, even though it works, as if there were a more elegant way of doing what I'm attempting that I just don't see.
Before File Change: TNSNames.ora
#Created 9_27_16
#Updated 8_30_19
AAAA.world=(DESCRIPTION =(ADDRESS_LIST =
(ADDRESS =
(COMMUNITY = tcp.world)
(PROTOCOL = TCP)
(Host = www.url1111.com)
(Port = 1111)
)
)
(CONNECT_DATA = (SID = SID1111)
)
)
#Created 9_27_16
BBBB.world=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url2222.COM)(Port=2222))(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url22222.COM)(Port=22222)))(CONNECT_DATA=(SID=SID2222)))
CCCC.world=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=url3333.COM)(Port=3333))(CONNECT_DATA=(SID=SID3333)))
DDDD.url =(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=URL4444 )(Port=4444))(ADDRESS=(COMMUNITY=TCP.world)(PROTOCOL=TCP)(Host=URL44444 )(Port=44444)))(CONNECT_DATA=(SID=SID4444 )(GLOBAL_NAME=ASDF.URL)))
#Created 9_27_16
#Updated 8_30_19
After File Change:
AAAA.world=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=www.url1111.com)(Port=1111)))(CONNECT_DATA=(SID=SID1111)))
BBBB.world=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url2222.COM)(Port=2222))(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url22222.COM)(Port=22222)))(CONNECT_DATA=(SID=SID2222)))
CCCC.world=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=url3333.COM)(Port=3333))(CONNECT_DATA=(SID=SID3333)))
DDDD.url=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=URL4444)(Port=4444))(ADDRESS=(COMMUNITY=TCP.world)(PROTOCOL=TCP)(Host=URL44444)(Port=44444)))(CONNECT_DATA=(SID=SID4444)(GLOBAL_NAME=ASDF.URL)))
Code:
# Get the file
[System.IO.FileInfo] $File = 'C:\temp\TNSNames.ora'
[string] $data = (Get-Content $File.FullName | Where-Object { !$_.StartsWith('#') }).ToUpper()
# Convert the data. This part is where any (CONNECT_DATA entry ends up on its own line.
$Results = $data.Replace(" ", "").Replace("`t", "").Replace(")))", ")))`n")
# Convert $Results from BaseType of System.Object to System.Array
$Path = '.\.vscode\StringResults.txt'
$Results | Out-File -FilePath $Path
$Results = Get-Content $Path
# Find all lines that start with '(CONNECT_DATA'
for ($i = 0; $i -lt $Results.Length - 1; $i++) {
if ($Results[$i + 1].StartsWith("(CONNECT_DATA")) {
# Add the '(CONNECT_DATA' line to the previous line
$Results[$i] = $Results[$i] + $Results[$i + 1]
# Blank out the '(CONNECT_DATA' line
$Results[$i + 1] = ''
}
}
# Remove all blank lines
$FinalForm = $null
foreach ($Line in $Results) {
if ($Line -ne "") {
$FinalForm += "$Line`n"
}
}
$FinalForm
So the crux of your problem is that you have declared $data as a [string], which is fine, because some of your replace operations probably work better on a single string. It's just that $Results then also ends up being a string, so when you try to index into $Results near the bottom, those operations fail. However, you can easily turn your $Results variable into a string array using the -split operator, which eliminates the need to save the string to disk and import it back in just to accomplish the same thing. See comments below.
# Get the file
[System.IO.FileInfo] $File = 'C:\temp\TNSNames.ora'
[string] $data = (Get-Content $File.FullName | Where-Object { !$_.StartsWith('#') }).ToUpper()
# Convert the data. This part is where any (CONNECT_DATA entry ends up on its own line.
$Results = $data.Replace(' ', '').Replace("`t", '').Replace(')))', ")))`n")
# You do not need to do this next section. Essentially this is just saving your multiline string
# to a file and then using Get-Content to read it back in as a string array
# Convert $Results from BaseType of System.Object to System.Array
# $Path = 'c:\temp\StringResults.txt'
# $Results | Out-File -FilePath $Path
# $Results = Get-Content $Path
# Instead split your $Results string into multiple lines using -split
# this will do the same thing as above without writing to file
$Results = $Results -split "\r?\n"
# Find all lines that start with '(CONNECT_DATA'
for ($i = 0; $i -lt $Results.Length - 1; $i++) {
if ($Results[$i + 1].StartsWith('(CONNECT_DATA')) {
# Add the '(CONNECT_DATA' line to the previous line
$Results[$i] = $Results[$i] + $Results[$i + 1]
# Blank out the '(CONNECT_DATA' line
$Results[$i + 1] = ''
}
}
# Remove all blank lines
$FinalForm = $null
foreach ($Line in $Results) {
if ($Line -ne '') {
$FinalForm += "$Line`n"
}
}
$FinalForm
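The -split "\r?\n" idiom used above works on any multi-line string, whether the line endings are CRLF or LF; a tiny sketch:
$text = "line1`r`nline2`nline3"
$lines = $text -split "\r?\n"
$lines.Count   # 3
$lines[1]      # line2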
Also, for fun, try this out
((Get-Content 'C:\temp\tnsnames.ora' |
Where-Object {!$_.StartsWith('#') -and ![string]::IsNullOrWhiteSpace($_)}) -join '' -replace '\s' -replace '\)\s?\)\s?\)', ")))`n" -replace '\r?\n\(Connect_data','(connect_data').ToUpper()

Import Excel data into PowerShell variables

I have an Excel File which has an unknown number of records in it, and these 3 columns:
Variable Name, Store Number, Email Address
I use this in QlikView to import data for certain stores and then create a separate report for each store in the list. I then need to email each report to each individual store (store number will be in the report file name).
So in PowerShell I would like to read the Excel File and set variables for each store:
$Store1 = The Store Number in Row 2 of the Excel File
$Store1Email = The Store Email in Row 2 of the Excel File
$Store2 = The Store Number in Row 3 of the Excel File
$Store2Email = The Store Email in Row 3 of the Excel File
etc., for each Store in the file (there can be any number of stores).
Please note the "Variable Name" column in the Excel file must be ignored (that is for QlikView) and the PowerShell variables must be named as per my examples above, each time incrementing the number.
Check out my PowerShell Excel Module on Github. You can also grab it from the PowerShell Gallery.
$stores = Import-Excel C:\Temp\store.xlsx
$stores[2].Name
$stores[2].StoreNumber
$stores[2].EmailAddress
''
'All stores'
'----------'
$stores
Ok, first off, if you are going to be working with actual .XLS, .XLSX, or .XLSM files, I would highly suggest using the Import-XLS function from the TechNet gallery (found here).
After that, just reference the object it imports to send the emails instead of making objects for each store. Such as:
$StoreList = Import-XLS <path to Excel file>
GC <report folder> | %{
$Current = $_
$Store = $StoreList|?{$_.StoreNumber -match $Current.BaseName}|Select -ExpandProperty StoreNumber
$Email = $StoreList|?{$_.StoreNumber -match $Current.BaseName}|Select -ExpandProperty StoreEmail
<code to send $Current to $Email>
}
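For the <code to send $Current to $Email> placeholder, one possibility is Send-MailMessage; note that the sender address, subject, and SMTP server below are placeholders I've assumed, not anything from the question:
# all values except $Email and $Current are assumed placeholders
$mailParams = @{
    To          = $Email
    From        = 'reports@example.com'
    Subject     = "Store report $($Current.BaseName)"
    Attachments = $Current.FullName
    SmtpServer  = 'smtp.example.com'
}
Send-MailMessage @mailParams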
My preference is to Save As the Excel file to a '.csv' type. A comma-separated values file can easily be imported into PowerShell.
$csvFile = Import-Csv -Path c:\scripts\temp\excelFile.csv
#now the entire Excel '.csv' file is saved into csvFile variable
$csvFile |Get-Member
#look at the properties
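Each row is then an object whose properties match the column headers from the question, so you can address them directly:
# first data row (index 0); property names come straight from the CSV header row
$csvFile[0].'Store Number'
$csvFile[0].'Email Address'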
Remember to study the greats so your PowerShell script looks great. Jeffrey Snover, Jason Hicks, Don Jones, Ashley McGlone, and anyone on their friends list ha ha
The above answers usually work, but I just had a project with Excel datasheets that caused some problems.
edit: Here's a much more advanced version that will pull it into an object, can handle blank and duplicate column names, and can skip human information at the beginning of the worksheet by looking for something in the header row. I've also included some example usages.
Your example:
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.from_excel("c:\folder\file.xls")
$Store1 = $file.data[0]."Store Number" #first row, column named "Store Number"
$Store1Email = $file.data[0]."Store Email" #first row, column named "Store Email"
foreach ($row in $file.data)
{
write-host "Store: $($row."Store Number")"
write-host "Store Email: $($row."Store Email")"
}
Example 1:
# Simplest example
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.from_excel("c:\folder\file.xls")
$file.data[0]
Example 2:
#advanced usage
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.header_contains="First Name" # if included it will drop everything before the first line that contains this, useful if there are instructions for humans in the worksheet
$file.indexer_column = 5 # Default: 1 (first column); This column's contents will set the minimum number of rows, use if there are blank rows in your file but more data after them
$file.worksheet_index = "January" # Default: 1; can be a sheet index or sheet name
$file.filename = "c:\folder\file.xls" #can set this independently, useful for validation and troubleshooting
$file.from_excel() #This is where we actually pull from excel
$collected = $file.data|ogv -PassThru #this is a neat way to select some rows you want
$file.headers.count # It stores an array of the headers here, useful for troubleshooting and advanced logic
Excel Reader pseudoclass
$file_template = {
# -- universal --
$filename = ""
$delimiter = ","
$headers = @()
$data = @()
# -- used by some functions --
# we put these here to allow assigning them before calling functions, which improves readability and auditability
$header_contains=""
$indexer_column=1
$worksheet_index=1
function from_excel(
$filename=$this.filename,
$worksheet_index=$this.worksheet_index
)`
{
$this.filename = $filename
$this.worksheet_index = $worksheet_index
$data_by_row = $this.from_excel_as_csv() # $data_by_row = $file.from_excel_as_csv($test_file)
$data_by_row = $data_by_row -split"`n"
#if ($this.headers.count -lt 1) {$this.headers = $data_by_row[0] -split $this.delimiter} #this would let us set headers elsewhere which is more flexible but less adaptive, Because columns change unpredicably we need something more adaptive
$temp_headers = $data_by_row[0] -split $this.delimiter
$temp_headers = $this.fix_blank_headers($temp_headers)
$this.headers = $this.dedupe_headers($temp_headers)
$this.data = $data_by_row|select -Skip 1|ConvertFrom-Csv -Header $this.headers -Delimiter $this.delimiter
}
function from_csv($filename=$this.filename)`
{
$this.filename = $filename
$this.headers = (Get-Content $this.filename -ReadCount 1|select -first 1) -split $this.delimiter
$this.data = Get-Content $this.filename|ConvertFrom-Csv -Delimiter $this.delimiter
}
function from_excel_as_csv(
$filename=$this.filename,
$worksheet_index=$this.worksheet_index
)`
{
$this.filename = $filename
$this.worksheet_index = $worksheet_index
#set up excel
Write-Host "Importing from excel, this may take a little while..."
$excel = New-Object -ComObject Excel.Application
$excel.DisplayAlerts = $false
$excel.Visible = $false
$workbook = $excel.workbooks.open($this.filename)
$worksheet = $workbook.Worksheets.Item($this.worksheet_index)
#import from excel
try{
$data_by_row = ""
$indexed_column = $worksheet.columns.item($this.indexer_column).value2 #we use this to work around some files having headers with blank space
$minimum_rows = (($indexed_column -join "◘").TrimEnd("◘") -split "◘").count # This Strips the million or so extra blank rows excel appends to get a realistic column length.
[bool]$header_found = 0
$i=1
do `
{
$row = $worksheet.rows.item($i).value2
$row_as_text = $row -join "◘" # ◘ (alt+8) is just a placeholder that's unlikely to show up in the text
$row_as_text = $row_as_text -replace $this.delimiter,"."
$row_as_text = $row_as_text.TrimEnd("◘")
$row_as_text = $row_as_text -replace "◘",$this.delimiter
if ($row_as_text -like "*$($this.header_contains)*"){[bool]$header_found=1}
if ($header_found) {$data_by_row+="$row_as_text`n"}
$i++
}
while ( ($row_as_text.Length -gt 1) -or ($i -lt $minimum_rows) )
}
catch {Write-Warning "ERROR Importing from excel"}
#close excel
$workbook.Close()
$excel.Quit()
write-host "Done importing from excel"
return $data_by_row
}
function dedupe_headers($headers){
$dupes = ($headers|group)|?{$_.count -gt 1}
if ($dupes.count -ge 1)
{
foreach ($dupe in $dupes)
{ #$dupe = $dupes[0]
$i=1
$new_headers = @()
foreach ($header in $headers)
{ #$header = $headers[0]
if ($header -eq $dupe.name)
{
$header = "$($header)_$($i)" # "header_#"
$i++
}
$new_headers += $header
}
}
}
else {$new_headers = $headers} # no duplicates found
return $new_headers
}
function fix_blank_headers($headers)
{
$replace_blanks_with = "_"
$new_headers = @()
foreach ($header in $headers)
{
if ($header -eq "") {$new_headers += $replace_blanks_with}
else {$new_headers += $header}
}
if ($new_headers.count -ne $headers.count)
{
$error_json = @($headers),@($new_headers)|ConvertTo-Json -Compress
Write-Error "Error when fixing blank headers, original and new counts are different $($error_json)"
}
return $new_headers
}
<# function some_function($some_parameter){return $some_parameter} #>
Export-ModuleMember -Function * -Variable *
}
Forgive the ugliness here. I am not a programmer, so there are undoubtedly more optimized ways to do this, as well as better formatting. It will work, however, if I understand your requirements correctly.
$excelfile = import-csv "c:\myfile.csv"
$i = 1
$excelfile | ForEach-Object {
New-Variable "Store$i" $_."Store Number"
$iemail = $i.ToString() + "Email"
New-Variable "Store$iemail" $_."Email Address"
$i ++
}
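For a file with two data rows, the loop above creates $Store1, $Store1Email, $Store2, and $Store2Email; a quick way to check what was created:
# list every variable the loop created, with its value
Get-Variable -Name 'Store*' | Format-Table Name, Value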
edit: as per the reply to your original post, this works with a csv file. Just save it to csv first if necessary.
$excelfile = import-csv "C:\Temp\store.csv"
$i = 1
$excelfile | ForEach-Object {
$NA= $_."Name"
$SN= $_."StoreNumber"
Write-Output "row $i"
$NA
$SN
$i++ }

Script to export to excel

I have the below script and am looking for help converting its output to Excel format:
$servers = get-content "c:\list.txt"
foreach ($server in $servers)
{
$server
$command = "quser /server:" + $server
invoke-expression $command
}
When executed, I get the output in the below format.
server1
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
Vdw231 ica-tcp#8 7 Active . 11/5/2012 10:40 AM
Vdw232 ica-tcp#60 16 Active 16:18 11/5/2012 2:22 PM
Vdw233 ica-tcp#71 3 Active . 11/6/2012 6:10 AM
Vdw234 ica-tcp#72 1 Active 3 11/6/2012 6:59 AM
Vdw235 ica-tcp#73 5 Active . 11/6/2012 6:59 AM
Vdw236 rdp-tcp#74 2 Active . 11/6/2012 7:07 AM
server2
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
Vdw210 ica-tcp#44 14 Active 13:50 11/5/2012 9:03 AM
Vdw211 ica-tcp#67 6 Active . 11/6/2012 1:56 AM
Vdw212 ica-tcp#70 1 Active 45 11/6/2012 6:34 AM
Vdw213 ica-tcp#72 9 Active 25 11/6/2012 6:53 AM
Vdw214
server3
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
Vdw215 rdp-tcp#131 1 Active 19 11/5/2012 1:42 AM
Vdw216 rdp-tcp#132 4 Active 17 11/5/2012 2:06 AM
Vdw217 rdp-tcp#143 6 Active . 11/6/2012 3:31 AM
My requirement is to convert this output to Excel format for submitting to management. Below is the Excel format that I am thinking of having from the above script...
I've rewritten this, but I didn't test the full script and it's not optimized. If you encounter any problems, feel free to contact me.
$statuses = @()
$servers = get-content "c:\list.txt"
$splitter = [regex]"\s+"
foreach ($server in $servers)
{
$command = "quser /server:$server"
$lines = @((invoke-expression $command | Out-String) -split "`n")
#remove header
$lines = $lines[1..$lines.count]
foreach ($line in $lines)
{
$attrs = @($splitter.Split($line.Trim(),6))
if ( $attrs.Count -eq 6 )
{
$status = New-Object PSCustomObject -Property @{
"SERVER"=$server;
"USERNAME"=$attrs[0];
"SESSIONNAME"=$attrs[1];
"ID"=$attrs[2];
"STATE"=$attrs[3];
"IDLE_TIME"=$attrs[4];
"LOGON_TIME"=[datetime]$attrs[5]}
$statuses += $status
}
}
}
#your filter here
#$statuses = $statuses | where{ XXXXX }
$statuses | Export-Csv G:/test.csv -NoTypeInformation
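For instance, the filter placeholder could keep only active sessions (just one example; any of the properties built above would work):
# keep only rows whose STATE column is 'Active'
$statuses = $statuses | Where-Object { $_.STATE -eq 'Active' }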
You need to convert the PSObject to an Excel-compatible array, and then you can write this array into an Excel sheet.
Include this code in your *.PS1 script and use it like this: get-process | Export-Excel
#=============================================================================
# Convert powershell Object to Array for Excel
#=============================================================================
function ConvertTo-MultiArray {
<#
.Notes
NAME: ConvertTo-MultiArray
AUTHOR: Tome Tanasovski
Website: http://powertoe.wordpress.com
Twitter: http://twitter.com/toenuff
Version: 1.2
.Synopsis
Converts a collection of PowerShell objects into a multi-dimensional array
.Description
Converts a collection of PowerShell objects into a multi-dimensional array. The first row of the array contains the property names. Each additional row contains the values for each object.
This cmdlet was created to act as an intermediary for importing PowerShell objects into a range of cells in Excel. By using a multi-dimensional array you can greatly speed up the process of adding data to Excel through the Excel COM objects.
.Parameter InputObject
Specifies the objects to export into the multi dimensional array. Enter a variable that contains the objects or type a command or expression that gets the objects. You can also pipe objects to ConvertTo-MultiArray.
.Inputs
System.Management.Automation.PSObject
You can pipe any .NET Framework object to ConvertTo-MultiArray
.Outputs
[ref]
The cmdlet will return a reference to the multi-dimensional array. To access the array itself you will need to use the Value property of the reference
.Example
$arrayref = get-process |Convertto-MultiArray
.Example
$dir = Get-ChildItem c:\
$arrayref = Convertto-MultiArray -InputObject $dir
.Example
$range.value2 = (ConvertTo-MultiArray (get-process)).value
.LINK
http://powertoe.wordpress.com
#>
param(
[Parameter(Mandatory=$true, Position=1, ValueFromPipeline=$true)]
[PSObject[]]$InputObject
)
BEGIN {
$objects = @()
[ref]$array = [ref]$null
}
Process {
$objects += $InputObject
}
END {
$properties = $objects[0].psobject.properties |%{$_.name}
$array.Value = New-Object 'object[,]' ($objects.Count+1),$properties.count
# i = row and j = column
$j = 0
$properties |%{
$array.Value[0,$j] = $_.tostring()
$j++
}
$i = 1
$objects |% {
$item = $_
$j = 0
$properties | % {
if ($item.($_) -eq $null) {
$array.value[$i,$j] = ""
}
else {
$array.value[$i,$j] = $item.($_).tostring()
}
$j++
}
$i++
}
$array
}
}
#=============================================================================
# Export pipe in Excel file
#=============================================================================
function Export-Excel {
[cmdletBinding()]
Param(
[Parameter(Mandatory=$true, Position=1, ValueFromPipeline=$true)]
[PSObject[]]$InputObject
)
begin{
$header=$null
$row=1
$xl=New-Object -ComObject Excel.Application
$wb=$xl.WorkBooks.add(1)
$ws=$wb.WorkSheets.item(1)
$xl.Visible=$false
$xl.DisplayAlerts = $false
$xl.ScreenUpdating = $False
$objects = @()
}
process{
$objects += $InputObject
}
end{
$array4XL = ($objects | ConvertTo-MultiArray).value
$starta = [int][char]'a' - 1
if ($array4XL.GetLength(1) -gt 26) {
$col = [char]([int][math]::Floor($array4XL.GetLength(1)/26) + $starta) + [char](($array4XL.GetLength(1)%26) + $Starta)
} else {
$col = [char]($array4XL.GetLength(1) + $starta)
}
$ws.Range("a1","$col$($array4XL.GetLength(0))").value2=$array4XL
$wb.SaveAs("$([Environment]::GetFolderPath('desktop'))\Export-Excel ($(Get-Date -Format u)).xlsx")
$xl.Quit()
Remove-Variable xl
}
}
You then get an "Export-Excel (<timestamp>).xlsx" file saved to your desktop.