Write output data row to pipe-delimited text file in PowerShell

I am a PowerShell newbie and I need a script that writes output to a delimited text file in PowerShell.
In sequence, this is what I would like to do in PowerShell:
1. read data from an Excel file and store it in variables
2. read data from a table in MSSQL
3. write each row from (2) to a text file (pipe-delimited), appending the value from (1)
I was able to figure out steps (1) and (2), but I am stumped on (3).
Here is a snippet of what I am trying to do:
# Iterate through the dataset
foreach ($row in $ds.Tables["location"].Rows)
{
    Out-File ?
}
Please help!

$myData | Export-Csv -Delimiter '|' -Path $MyFileName
Note that the Export-Csv cmdlet handles looping over the data, so there is no need for your own loop.
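For example, assuming the value read from Excel in step (1) is stored in $excelValue and the SQL rows from step (2) are in $ds (the column names LocationId and LocationName are just placeholders for your own schema), a minimal sketch would be:
# build one output object per SQL row, appending the Excel value, then export pipe-delimited
$ds.Tables["location"].Rows |
    ForEach-Object {
        [pscustomobject] @{
            LocationId   = $_.LocationId     # placeholder column
            LocationName = $_.LocationName   # placeholder column
            ExcelValue   = $excelValue       # value from step (1) appended to every row
        }
    } |
    Export-Csv -Delimiter '|' -Path $MyFileName -NoTypeInformation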

Related

PowerShell: how to retrieve PowerShell commands from a CSV and execute them one by one, then output the results to a new CSV

I have a Commands.csv file like:
| Command |
| ------- |
| (Get-FileHash C:\Users\UserA\Desktop\File1).Hash |
| (Get-FileHash C:\Users\UserA\Desktop\File2).Hash |
| (Get-FileHash C:\Users\UserA\Desktop\File3).Hash |
Header name is "Command"
My idea is to:
Use ForEach ($line in Get-Content C:\Users\UserA\Desktop\Commands.csv ) {echo $line}
Execute $line one by one via powershell.exe, then output a result to a new .csv file - "result.csv"
Can you give me some directions and suggestions to implement this idea? Thanks!
Important:
Only use the technique below with input files you either fully control or implicitly trust to not contain malicious commands.
To execute arbitrary PowerShell statements stored in strings, you can use Invoke-Expression, but note that it should typically be avoided, as there are usually better alternatives - see this answer.
There are advanced techniques that let you analyze the statements before executing them and/or let you use a separate runspace with a restrictive language mode that limits what kinds of statements are allowed to execute, but that is beyond the scope of this answer.
Given that your input file is a .csv file with a Command column, import it with Import-Csv and access the .Command property on the resulting objects.
Use Get-Content only if your input file is a plain-text file without a header row, in which case the extension should really be .txt. (If it has a header row but there's only one column, you could get away with Get-Content Commands.csv | Select-Object -Skip 1 | ...). If that is the case, use $_ instead of $_.Command below.
To also use the CSV format for the output file, all commands must produce objects of the same type or at least with the same set of properties. The sample commands in your question output strings (the value of the .Hash property), which cannot meaningfully be passed to Export-Csv directly, so a [pscustomobject] wrapper with a Result property is used, which will result in a CSV file with a single column named Result.
Import-Csv Commands.csv |
    ForEach-Object {
        [pscustomobject] @{
            # !! SEE CAVEAT AT THE TOP.
            Result = Invoke-Expression $_.Command
        }
    } |
    Export-Csv -NoTypeInformation Results.csv
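If you do end up with a header-less plain-text input file as described above, the same pattern would look like this (Commands.txt is just an example name; the same security caveat applies):
Get-Content Commands.txt |
    ForEach-Object {
        [pscustomobject] @{
            # !! SEE CAVEAT AT THE TOP.
            Result = Invoke-Expression $_
        }
    } |
    Export-Csv -NoTypeInformation Results.csv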

PowerShell and CSV: Stop CSV from turning text data into Scientific Notation

I have a CSV file with alphanumeric combinations in one of its columns.
I am later going to use this CSV file in a PowerShell script by importing the data.
Examples: 1A01, 1C66, 1E53.
Before putting these values in, I made sure to format the column as text.
At first it works: I input the data, save, import it in PowerShell, and
all data shows up valid, including 1E53. But if I later edit the file again to add data, then save and close, re-importing into PowerShell gives me 1.00E+53 instead of 1E53. How can I prevent this permanently? Note that the column is filled with codes and there are lots of the form #E##.
Your issue is not with PowerShell, it's with Excel. For a demonstration, take 1E53, enter it into Excel and then save that Excel file as a CSV file. You will see that the value has now changed to 1.00E+53.
How to fix this?
There are a few ways of disabling scientific notation:
https://superuser.com/questions/452832/turn-off-scientific-notation-in-excel
https://www.logicbroker.com/excel-scientific-notation-disable-prevent/
I hope some of them work for you.
I think you can rename the file to .txt instead of .csv and Excel may treat it differently.
Good luck!
As commented:
You will probably load the csv from file:
$csv = Import-Csv -Path 'X:\original.csv' -UseCulture
The code below uses a dummy CSV in a here-string instead:
$csv = @'
"Column1","Column2","ValueThatMatters"
"Something","SomethingElse","1E53"
"AnotherItem","Whatever","4E12"
'@ | ConvertFrom-Csv

# in order to make Excel see the values as text and not convert them into scientific numbers
$csv | ForEach-Object {
    # add a TAB character in front of the values in the column
    $_.ValueThatMatters = "`t{0}" -f $_.ValueThatMatters
}
$csv | Export-Csv -Path 'X:\ExcelFriendly.csv' -UseCulture -NoTypeInformation
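Note that the TAB becomes part of the stored value, so when you later import the Excel-friendly file back into PowerShell you will probably want to strip it again; a small sketch using the same column name:
$csv = Import-Csv -Path 'X:\ExcelFriendly.csv' -UseCulture |
    ForEach-Object {
        # remove the leading TAB (and any other leading whitespace) added for Excel
        $_.ValueThatMatters = $_.ValueThatMatters.TrimStart()
        $_
    }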

Filtering data from CSV file with PowerShell

I have a huge CSV file whose first line contains the headers of the data. Because of the file size I can't open it with Excel or similar tools. I need to filter the rows down to only what I need: I want to create a new CSV file that contains only the rows where Header3 = "TextHere". Everything else is filtered away.
I have tried Get-Content | Select-String | Out-File 'newfile.csv' in PowerShell, but it lost the header row and also messed up the data, putting values into the wrong fields. There are empty fields in the data and I believe that is what breaks it. When I tried Get-Content with -First or -Last, the data seemed to be in order.
I have no prior experience with handling big data files or with PowerShell. Options other than PowerShell are also possible, as long as they are free for non-commercial use.
Try it like this (modify the delimiter if necessary):
Import-Csv "c:\temp\yourfile.csv" -Delimiter ";" | Where-Object Header3 -eq "TextHere" | Export-Csv "c:\temp\result.csv" -Delimiter ";" -NoTypeInformation

Process a CSV using PowerShell with different columns per row

I have a CSV file with no usable header row (the first row is information about the file, i.e. creation date).
There are set record types in the CSV file, held in the first column, i.e. column 1 could be PRA, ASA or POA.
The value in column 1 determines what's in the rest of the fields. From this file I need to blank out all data that I'm not going to require, for security, before the file is sent out to a third party. As we have different record types, I can't do a simple loop and blank out columns 3, 4 and 6, for example.
My plan was to go through the CSV line by line, look at the first column, then output each record type to a separate file so they can be processed individually.
Import-Csv -Delimiter ~ -Encoding UTF8 -Path tinysample.dat | ForEach-Object {
    foreach ($property in $_.PSObject.Properties) {
        if ($property.Name -eq 'HDR') {
            if ($property.Value -eq 'PRA' -or $property.Value -eq 'POA') {
                Export-Csv -InputObject $_ -Append -Delimiter ~ -Encoding UTF8 -LiteralPath "$($property.Value).dat"
            }
        }
    }
}
At the moment, the records are being output along with the header row, which I don't want. I wanted to set my own header in each respective file, which I can then easily use to determine which columns should be hidden.
Also, some of the records are being truncated in the new CSV.
I was able to achieve this with bash using grep and awk and was hoping I'd be able to do the same with PowerShell.
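For reference, this is roughly the splitting step I have in mind, reading the file as plain text so the informational first line is simply skipped (a rough sketch; the record types, delimiter and output file names are just examples from my data):
# split the raw file into one file per record type, skipping the first informational line
Get-Content -Path tinysample.dat -Encoding UTF8 |
    Select-Object -Skip 1 |
    ForEach-Object {
        $recordType = ($_ -split '~')[0]
        if ($recordType -in 'PRA', 'POA', 'ASA') {
            # append the raw line, unchanged, to a per-record-type file such as PRA.dat
            Add-Content -Path "$recordType.dat" -Value $_ -Encoding UTF8
        }
    }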

Reducing the contents of one CSV column down to a single character with PowerShell

I have a CSV file with headers as follows:
physicalDeliveryOfficeName,sn,middleName,givenName,info,Company,employeeID,Description
I want to reduce the contents of the middleName column to just the first character, then save it out as another CSV file with all of the other columns unchanged.
I'm not sure where to start with this.
The CSV file is over 12,000 rows and I want to do this the most efficient way possible with PowerShell.
I am new to using PowerShell, so advice is greatly appreciated.
You should show some effort. A couple of google searches would go a long way. Here's one way:
Import-Csv myfile.csv |
    ForEach-Object {
        $_.middleName = $_.middleName.Substring(0,1)
        $_
    } |
    Select-Object physicalDeliveryOfficeName,sn,middleName,givenName,info,Company,employeeID,Description |
    Export-Csv myupdatedfile.csv -NoTypeInformation
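One caveat, assuming some rows might have an empty middleName: Substring(0,1) throws on an empty string, so a slightly more defensive version would be:
Import-Csv myfile.csv |
    ForEach-Object {
        # only shorten the middle name when one is present
        if ($_.middleName) {
            $_.middleName = $_.middleName.Substring(0,1)
        }
        $_
    } |
    Select-Object physicalDeliveryOfficeName,sn,middleName,givenName,info,Company,employeeID,Description |
    Export-Csv myupdatedfile.csv -NoTypeInformation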