In PowerShell -- Export object to text file in custom format

Since I am a beginner, I don't have much hands-on experience with PowerShell programming. However,
I have a script that inserts data from an array into a CSV file, as follows:
#Following is the array
$InventoryReport = New-Object -TypeName PSObject -Property @{
ComputerName = "1myComputerName"
DomainName = "2myComputerDomain"
Manufacturer = "3myComputerManufacturer"
}
#Now, to export the data to CSV, I am using the following:
$InventoryReport |Select-Object -Property ComputerName, DomainName, Manufacturer | Export-Csv -Path "c:\abc.csv" -NoTypeInformation -ErrorAction Stop
#This works fine
The output of the above is:
"ComputerName","DomainName","Manufacturer"
"1myComputerName","2myComputerDomain","3myComputerManufacturer"
....
Now, I don't want this; I want the output to appear in a columnar fashion, i.e.
"ComputerName","1myComputerName"
"DomainName","2myComputerDomain"
"Manufacturer","3myComputerManufacturer"
What code changes are needed to achieve this?

Either you want CSV, which you already have, or you want a custom text file. If you want the latter, try this:
$comp = gwmi win32_computersystem
#"
"ComputerName","$($comp.Name)"
"DomainName","$($comp.Domain)"
"Manufacturer","$($comp.Manufacturer)"
"# | Out-File test.txt
A sample of the test.txt output is below. I've got a non-domain, custom-built PC, so don't worry about the values.
"ComputerName","GRAIMER-PC"
"DomainName","WORKGROUP"
"Manufacturer","System manufacturer"
EDIT: I suggest you learn what CSV is. Remember that CSV is not a file format; it's a formatting style used in a normal text file. The .csv extension is just cosmetic, to let people know that the text file uses the CSV style. Check out Wikipedia and TechNet:
In the CSV file, each object is represented by a comma-separated list
of the property values of the object. The property values are
converted to strings (by using the ToString() method of the object),
so they are generally represented by the name of the property value.
Export-CSV does not export the methods of the object.
The format of an exported file is as follows:
-- The first line of the CSV file contains the string '#TYPE ' followed by the fully qualified name of the object, such as #TYPE
System.Diagnostics.Process. To suppress this line, use the
NoTypeInformation parameter.
-- The next line of the CSV file represents the column headers. It contains a comma-separated list of the names of all the properties of
the first object.
-- Additional lines of the file consist of comma-separated lists of the property values of each object.
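As a quick illustration of that format (an added example, not from the quoted documentation; note that in PowerShell 7+ the type line is omitted by default):
Get-Process -Id $PID | Select-Object Name, Id | Export-Csv proc.csv
Get-Content proc.csv | Select-Object -First 2
# In Windows PowerShell this prints something like:
# #TYPE Selected.System.Diagnostics.Process
# "Name","Id"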

You could try something like this:
$InventoryReport | Format-List ComputerName, DomainName, Manufacturer `
| Out-String -Stream `
| ? { $_ -ne '' } `
| % { $_ -replace '\s+:\s+', '","' -replace '(^|$)', '"' }

Related

How can I replace a string inside a pipe?

I'm trying to replace some specific parts of a selected string but am only returning the length property. Here's my code:
Get-ChildItem "StartPath/Something/Files" -Recurse -File | Select "FullName | Foreach {$_.FullName -replace "StartPath",""} | Export-Csv "ResultPath.csv"
If I omit the foreach bit, this works in that it spits out the full path. I'd like to trim the full path as I'm iterating over tons of files. I'm trying to replace a bit of the path in the beginning of the string but my code above just spits out a CSV file with just string lengths.
Looks like:
"123"
"12"
"52"
and so forth.
The intended result would be a CSV file where, instead of:
StartPath/Something/Files1
StartPath/Something/Files2
I'd have
Something/Files1
Something/Files2
I've tried a number of things and can't seem to figure it out. Any help is appreciated.
If you pass a string to select / Select-Object (to its positionally implied -Property parameter), it must be a property name.[1]
If you want to perform open-ended operations and/or produce open-ended output for each input object, you must use the ForEach-Object cmdlet:
Get-ChildItem "StartPath/Something/Files" -Recurse -File |
ForEach-Object {
[pscustomobject] @{ FullName = $_.FullName -replace 'StartPath' }
} |
Export-Csv "ResultPath.csv"
Note the use of a [pscustomobject] wrapper that defines a FullName property, so that Export-Csv creates a CSV with that property as its (only) column.
If you pipe [string] instances directly to Export-Csv, their properties are serialized to the output file - and a [string]'s only (public) property is its length (.Length), which is what you saw.
[1] There's also a way to create properties dynamically, using so-called calculated properties, which are defined via hash tables.
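For completeness, a minimal sketch of that calculated-property variant, reusing the paths and property name from the question:
Get-ChildItem "StartPath/Something/Files" -Recurse -File |
    Select-Object @{ Name = 'FullName'; Expression = { $_.FullName -replace 'StartPath' } } |
    Export-Csv "ResultPath.csv" -NoTypeInformation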

How to clean up some data in a csv file using Powershell scripting and save the result as a new csv file?

I have a CSV file (employees.csv) with 3 columns containing 'n' number of employee details. In my first column I have employeeid in a format like 11_22$ (containing integer and non-integer characters), and here I want to remove all special characters and keep only the integers, 1122.
In my second column I have their website address in the format www.website.com, and here I want to replace www with http, i.e. I need http.website.com. In my third column I have their DOB in the format YYYY:MM:DD and I want to change it to DD:MM:YYYY format.
Finally, I want to save/export the result to a new CSV file. How can I achieve all of this using PowerShell scripting?
Although I have no idea why you would want websites to become something like 'http.website.com' instead of 'http://website.com', you can do that using the code below.
########################################################################
# your input file 'employees.csv" looks like this
########################################################################
"employeeid","website","dob"
"11_22$","www.website.com","2000:04:12"
"22_33$","www.stackoverflow.com","1990:04:12"
"33_44$","www.somothersite.org","1970:04:12"
########################################################################
# after running the code the new file 'newemployees.csv' looks like this
########################################################################
"employeeid","website","dob"
"1122","http.website.com","12:04:2000"
"2233","http.stackoverflow.com","12:04:1990"
"3344","http.somothersite.org","12:04:1970"
$newcsv = @()
Import-Csv -Path $PSScriptRoot\employees.csv | ForEach-Object {
$newcsv += New-Object -TypeName PSObject -Property ([ordered]@{
employeeid = $_.employeeid -replace '\D+', ''
website = $_.website -replace 'www', 'http'
dob = ([datetime]::ParseExact($_.dob, 'yyyy:MM:dd', [System.Globalization.CultureInfo]::InvariantCulture)).toString('dd:MM:yyyy')
})
}
$newcsv | Export-Csv -Path $PSScriptRoot\newemployees.csv -Force -NoTypeInformation
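As a side note, the growing array ($newcsv +=) can be avoided by letting the pipeline collect the output; a rough sketch of the same transformation using calculated properties (same file names assumed):
Import-Csv -Path $PSScriptRoot\employees.csv |
    Select-Object @{ n = 'employeeid'; e = { $_.employeeid -replace '\D+', '' } },
                  @{ n = 'website';    e = { $_.website -replace 'www', 'http' } },
                  @{ n = 'dob';        e = { [datetime]::ParseExact($_.dob, 'yyyy:MM:dd', [cultureinfo]::InvariantCulture).ToString('dd:MM:yyyy') } } |
    Export-Csv -Path $PSScriptRoot\newemployees.csv -Force -NoTypeInformation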

Adding columns and manipulating existing column values in csv file using powershell

I have a lot of csv files with values arranged like so:
X1,Y1
X2,Y2
...,...
Xn,Yn
I find it very tedious processing these with Excel, so I want to set up a batch script to process these files such that they appear like this:
#where N is a specified value like 65536
X1,N-Y1,1
X2,N-Y2,2
...,...,...
Xn,N-Yn,n
I have only recently started using PowerShell for image processing (really simple scripts) and file name appending, so I am not certain how to go about this. A lot of the scripts I have encountered while looking to answer this question use CSV files with titles per column, whereas my files are just arrays of values without column titles in the first row. I would like to avoid running multiple scripts to add titles.
My bonus question is something I have yet to find a good answer to at all, and it is the most tedious part of processing. Using Excel's sort function, I usually change the order of the Yn values in Col2 such that they are sorted in the exported CSV like so:
X1,N-Yn,n
...,...,...
Xn-1,N-Y2,2
Xn,N-Y1,1
I use the Col3 values as the sorting order (largest to smallest), then I delete this column so that the final saved CSV only contains the first two columns (a crucial step). Any help at all would be greatly appreciated; I apologize for the long-windedness of this question.
A lot of the scripts I have encountered while looking to answer this question use CSV files with titles per column, whereas my files are just arrays of values without column titles in the first row.
The -Header parameter of Import-Csv is for adding column headers when the file does not contain them. It takes an array of strings, of however many columns there are.
I would like to avoid running multiple scripts to add titles.
If you couldn't use -Header, you could read the lines with Get-Content into memory, add a header in memory, and then use ConvertFrom-CSV all in one script.
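A rough sketch of that approach, assuming the same test.csv path used below:
# Read the raw lines, prepend a header line, then parse the result as CSV
$lines = Get-Content -LiteralPath 'c:\test\test.csv'
$rows = (@('ColX,ColY') + $lines) | ConvertFrom-Csv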
That said, if I'm reading it rightly, you want:
No headers in the input file, and I imagine no headers in the output file
The whole point of adding the third column and sorting and removing it is just to reverse the lines?
The only column you keep is column 1?
I wouldn't use Import-Csv for this; it won't make it much nicer.
$n = 65536
# Read lines into a list, and reverse it
$lines = [Collections.Generic.List[String]](Get-Content -LiteralPath 'c:\test\test.csv')
$lines.Reverse()
# Split each line into two, create a new line with X and N-Y
# write new lines to an output file
$lines | ForEach-Object {
$x, $y = $_.split(',')
"$x,$($n - [int]$y)"
} | Set-Content -LiteralPath 'c:\test\output.csv' -Encoding Ascii
If you do want to use CSV handling, then:
$n = 65536
$counter = 1
Import-Csv -LiteralPath 'C:\test\test.csv' -Header 'ColX', 'ColY' |
Add-Member -MemberType ScriptProperty -Name 'ColN-Y' -Value {$n - $this.ColY} -PassThru |
Add-Member -MemberType ScriptProperty -Name 'N' -Value {$script:counter++} -PassThru |
Sort-Object -Property 'N' -Descending |
Select-Object -Property 'ColX', 'ColN-Y' |
Export-Csv -LiteralPath 'c:\test\output.csv' -NoTypeInformation
But the output will have CSV headers and double-quoted values.
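If you need the output without the header row and quotes, one possible workaround (an assumption on my part, not shown in the original answer) is to convert to CSV text and post-process it before writing the file:
# Sketch: strip the header line and the double quotes that the CSV cmdlets add.
# Caution: this simple -replace would also mangle values that legitimately contain quotes.
$rows = Import-Csv -LiteralPath 'C:\test\test.csv' -Header 'ColX', 'ColY'
$rows | ConvertTo-Csv -NoTypeInformation |
    Select-Object -Skip 1 |
    ForEach-Object { $_ -replace '"' } |
    Set-Content -LiteralPath 'c:\test\output.csv'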
I would try something like the following, extending the original table with a calculated script property as a new column:
#Your N number
$N = 65536
# Import CSV file without header columns
$table = Import-Csv -Header @("colX","colY") `
-Delimiter ',' `
-Path './numbers.csv'
Write-Host "Original table"
$table | Format-Table
# Manipulate table
$newtable = $table |
Add-Member -MemberType ScriptProperty -Name colNX -Value { $N-$this.colX } -PassThru
Write-Host "New table"
$newtable | Format-Table

Import-Csv: target a cell in a CSV

I have a CSV with a list of usernames.
I want to import just one cell of the CSV file, e.g. A2.
Is it possible to be that specific? I have tried googling for this but don't see an exact solution. I tried PowerShell help also.
Can this be done?
Thanks,
Confuseis
The below example will select and output only 'cell' A2 from test.csv
In this example, the column header for column A is 'username'.
$inFile = Import-Csv c:\Temp\test.csv
$targetCell = $inFile.username[0]
Write-Output $targetCell
This snippet is doing the following:
Import the csv file, yielding a PowerShell object.
Select the column you want to work with; the items from that column can be treated as an array. Select the desired item in that column by referring to its zero-based index value.
Output the results.
Import-CSV creates an array of objects from the input file. The column labels in the first row of the CSV become the property names. The other rows are objects in the array. Like any array you can call one element using brackets.
$arrUsers = Import-CSV c:\temp\users.csv
$arrUsers[1]
The second command, above, prints the second object, since counting starts with 0. This object came from the third line of the CSV file, since the first was used as column headers.
If you use Get-Member, it will show you the members (properties and methods) of an object.
$arrUsers | Get-Member
Assuming one of the members is username, combine this with array indexing, you can use:
$arrUsers[1].username
Import-CSV is a very flexible tool, especially combined with Foreach and Export-CSV. Between Get-Help and Get-Member, you can explore PowerShell with ease. Good luck.
When you use Import-Csv, you convert the content into PSCustomObjects.
Examples using the following table:
PS> $csv = Import-Csv .\test.csv
PS> $csv
ProcessName Id WS CPU
----------- -- -- ---
sihost 5996 30015488 44.640625
pia_nw 11064 10620928 52.921875
pia_nw 2344 7933952 104.0625
RuntimeBroker 6500 77500416 177.34375
SettingSyncHost 6736 5074944 202.796875
explorer 6600 284934144 272.140625
ipoint 920 3162112 372.78125
rubyw 10648 18026496 389.46875
pia_nw 3108 31330304 1640.5625
OneDrive 10208 33206272 6422.4375
So you will need a NoteProperty name to access the value you're looking for.
PS> $csv.ProcessName[0]
sihost
Another way is to make a header array and use that to slice the data.
If working with an object:
PS> $header = ($csv | ConvertTo-Csv -NoTypeInfo)[0] -replace '"' -split ",";
>>
PS> $header
ProcessName
Id
WS
CPU
Or if working with the file:
PS> $header = (gc .\test.csv)[0] -replace '"' -split ',';
PS> $header
ProcessName
Id
WS
CPU
Then just use the appropriate index:
PS> $csv[0]."$($header[0])"
sihost
Finally, there is the Excel.Application ComObject method on an .xlsx file. This will let you select cells and ranges.
PS> $file = "C:\Some\Path\IMade\Up\test.xlsx"
PS> $objExcel = New-Object -ComObject Excel.Application
PS> $objExcel.Visible = $false
PS> $wb = $objExcel.Workbooks.Open($file)
PS> $ws = $wb.Sheets.Item(1)
PS> $ws.Range("A2").Text
sihost
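One caveat with the COM approach (not shown above): Excel keeps running in the background unless you close the workbook and quit the application, roughly like this:
PS> $wb.Close($false)
PS> $objExcel.Quit()
PS> [void][Runtime.InteropServices.Marshal]::ReleaseComObject($objExcel)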
More info on using the ComObjects can be found here:
Application Object (Excel)

Export csv spits out length only

I only get the length when exporting this to CSV; how should it be done properly?
$redo = Import-CSV c:\temp\testimport.txt | Group-Object email |
foreach { "{0} ,{1}" -f $_.Name, (($_.Group | foreach { $_.group }) -join ', ')
}
$redo | Export-CSV c:\temp\test.csv -NoTypeInformation
The resulting test.csv contains only:
"Length" "46" "59" "110" "47" "149" "38" "69" "32" "62" "29" "49" "31"
"27" "48" "55" "42"
Export-Csv expects an object (or a list of objects) with properties, whereas your command pipeline produces an array of strings. If you feed this array into Export-Csv the cmdlet takes the properties of each given item (which is only Length for strings) and writes those properties to the output file.
You need to build a list of objects with the desired properties instead, e.g.:
Import-CSV c:\temp\testimport.txt `
| Group-Object email `
| select @{n="Name";e={$_.Name}},@{n="Group";e={($_.Group | %{$_.group}) -join ', '}} `
| Export-CSV c:\temp\test.csv -NoTypeInformation
This was exactly what I was looking for, so I'm just going to add my comment to be sure everyone understands.
This does NOT work:
$str_list = @('Mark','Henry','John')
$str_list | Export-Csv .\ExportStrList.csv -NoType
This is because Export-Csv takes objects and outputs their properties. The only property for a String[ ] is Length, so the CSV file only contains Lengths.
To fix this, like the last guy said, we need to change the String[ ] into an Object[ ]. The simplest way is with Select-Object.
Put each String into the Name property of a new Object[ ], like this:
$str_list = @('Mark','Henry','John')
$str_list = $str_list | Select-Object @{Name='Name';Expression={$_}}
$str_list | Export-Csv .\ExportStrList.csv -NoType
Just to re-iterate, Select-Object outputs a custom PSObject that can easily be manipulated. This is very powerful information, use it wisely.
The method shown for converting data to objects is a very long-hand approach, at least in my circumstance.
I'm developing reports that integrate with an IBM V7000 SAN storage subsystem, and its CLI, which I can call from PS using PuTTY plink, returns either tabular output (which can be CSV) or list output, depending upon the query.
From either, I want to export the data as CSV.
From the list output, I'm looping through the result set (one row = one field) and assembling the fields into a string, separating the values with commas. (For the tabular output I get to skip this tedious step.)
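That assembly step might look roughly like the sketch below ($lsOutput is a made-up variable standing in for the CLI's list output, one "field value" pair per line):
$names  = @()
$values = @()
foreach ($line in $lsOutput) {
    # split each "field value" line into the field name and the rest of the line
    $name, $value = $line -split '\s+', 2
    $names  += $name
    $values += $value
}
$stColumnHeadings = $names -join ','
$stColumnValues   = $values -join ','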
The following works to write the output to a CSV file which I can then open as a spreadsheet.
$fhStream = [System.IO.StreamWriter] "20150527_QALUNTable.csv"
$fhStream.WriteLine($stColumnHeadings)
$fhStream.WriteLine($stColumnValues)
$fhStream.Close()
Import-Csv works to return the input as an object that I can easily use to prepare my reports (which are assembled from many such files of output, each gathered at a separate point in time -- hence the datestamp prefix).
There are 57 columns of data here so by converting to a CSV I avoid preparing 57 object statements.
(Found .Net technique for writing output (fastest) at http://blogs.technet.com/b/gbordier/archive/2009/05/05/powershell-and-writing-files-how-fast-can-you-write-to-a-file.aspx)