Windows PowerShell Export-Csv column/row value as filename

I'm having a problem with PowerShell. I need to import a CSV file, format it, and export it.
Now one of the fields has the same value on every line, so I want to name the export file after that cell value.
I am importing the CSV with
Import-Csv c:\tmp\200114.csv
Formatting the output with
Select-Object @{expression={$_.code}; label='ID NUMBER'} etc. etc., with one of the fields being DESCRIPTION
Then exporting with:
Export-Csv -NoTypeInformation c:\tmp\test1.csv
So basically I want to name the file something like the following where DESCRIPTION is the Field name and the row is 1(as they are all the same):
Export-Csv -NoTypeInformation c:\tmp\@{expression={$_.DESCRIPTION[1]};}.csv
But I just get:
Export-Csv : Cannot validate argument on parameter 'Delimiter'. The argument is null. Supply a non-null argument and try the command again.
Ideally I would like the file name to be:
Todays Date - DESCRIPTION Column ROW 1 Value.csv
Many thanks for any input....
SAMPLE DATA:
ID NUMBER,NAME,ADDRESS 1,ADDRESS 2,CITY,STATE,ZIP,Spare,PHONE #,DESCRIPTION,Spare,Spare,VALUE
1,Name 1,address 1,address 2,city 1,state 1,zip 1,,Phone 1,CC098-1,,,NCV
2,Name 2,address 2,address 3,city 2,state 2,zip 2,,Phone 2,CC098-1,,,NCV
Thanks, I am very grateful for this. I am struggling to get this working though. It must be to do with:
Select @{n='ID NUMBER';e={$_.code}}, @{n='DESCRIPTION';e={...}}
I don't know the Select command and can't find anything on it; I can't even see what the n= and e= parts are doing. I trimmed it down to:
$csv = Import-Csv c:\tmp\200114.csv | Select-Object @{expression={$_.code}; label='ID NUMBER'} | $filename = "$(Get-Date -f 'yyyy-MM-dd') - $($csv[0].DESCRIPTION).csv" | $csv | Export-Csv $filename -NoTypeInformation
but I just get errors:
Expressions are only allowed as the first element of a pipeline.
At line:1 char:108
+ $csv = Import-Csv c:\tmp\200114.csv | Select-Object @{expression={$_.code}; label='ID NUMBER'} | $filename <<<< = "$
(Get-Date -f 'yyyy-MM-dd') - $($csv[0].DESCRIPTION).csv" | $csv | Export-Csv $filename -NoTypeInformation
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : ExpressionsMustBeFirstInPipeline
Thanks again.

Capture the processed CSV in a variable:
$csv = Import-Csv c:\tmp\200114.csv | Select @{n='ID NUMBER';e={$_.code}},
@{n='DESCRIPTION';e={...}},
...
Construct your filename:
$filename = "$(Get-Date -f 'yyyy-MM-dd') - $($csv[0].DESCRIPTION).csv"
Then export the data:
$csv | Export-Csv $filename -NoTypeInformation
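Putting the pieces together with the sample data above (a sketch only: the source column name code and the passthrough of DESCRIPTION are assumptions based on the snippets in the question):
$csv = Import-Csv c:\tmp\200114.csv |
    Select-Object @{n='ID NUMBER';e={$_.code}},
                  @{n='DESCRIPTION';e={$_.DESCRIPTION}}
$filename = "c:\tmp\$(Get-Date -f 'yyyy-MM-dd') - $($csv[0].DESCRIPTION).csv"
$csv | Export-Csv $filename -NoTypeInformation
With the sample rows this produces a name like "yyyy-MM-dd - CC098-1.csv".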

Related

Powershell Replace Regex Import CSV File

I have a CSV file named test.csv (C:\testing\test.csv) in this format:
File Name,Location,Added (GMT),Created (GMT),Last Modified (GMT),File Size (Bytes),File Size,Extension,Incident Type
10-MB-Test (1).docx,\\blah\Test 3,10/8/2020 21:13,10/8/2020 19:33,10/8/2020 16:26,10723331,10.23 (MB),docx,low_data_discover
10-MB-Test (1).xlsx,\\blah2\Test 3\,10/8/2020 21:14,10/8/2020 19:33,10/8/2020 16:25,9566567,9.12 (MB),xlsx,high_data_discover
1-MB-Test.docx,\\blah3\Test 3\,10/8/2020 21:13,10/8/2020 19:33,10/8/2020 16:37,1045970,1021.46 (KB),docx,medium_data_discover
I'm trying to replace trailing "\" characters (if they exist) in the values of the Location column with nothing, using this PowerShell code:
$file1 = import-csv -path "C:\testing\test.csv" | % {$_."Location" -replace "\\$",""} | Select-Object * | export-csv -NoTypeInformation "C:\testing\blah.csv"
However, when I run the code, the only output I get is a column named "Length" with a numerical value. Can you assist?
You're only sending the new string (updated location) down the pipeline. You can update each location and then export it at the end.
$file1 = import-csv -path "C:\testing\test.csv"
$file1 | ForEach-Object {$_.location = $_.location -replace '\\$'}
$file1 | export-csv -NoTypeInformation "C:\testing\blah.csv"
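If you prefer a single pipeline, the same fix can be written in one pass; note that the ForEach-Object block has to emit $_ so the modified object continues down the pipeline (same paths as above, just a sketch):
Import-Csv -Path "C:\testing\test.csv" |
    ForEach-Object { $_.Location = $_.Location -replace '\\$'; $_ } |
    Export-Csv -NoTypeInformation "C:\testing\blah.csv"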

Copy altered CSV Data to new CSV

The whole point of this issue is: how to copy data from one CSV to another without knowing/listing the headers of the original CSV.
The cmdlet I'm building is meant to convert a report from CSV to a spreadsheet eventually. And if I write the column headers to the code, each time somebody changes the report, the code will break and it would have to be updated.
The steps I would take right now:
# Import the Source CSV. Gonna pull data from this later.
$SourceCSV = Import-Csv -Path $reportSourceCSV -Delimiter ";"
# Remove NULL characters, white spaces and change comma separator to semicolon.
(Get-Content -Path $reportSourceCSV | Where-Object {-not [string]::IsNullOrWhiteSpace($PSItem)}).Replace('","',";") | Out-File -FilePath $TMP1
# Import the modified new temp CSV.
$Input = Import-Csv -Path $TMP1 -Delimiter ";"
# Take existing CSV file headers and append some new ones. Rename a long column name.
((($GetHeaders = foreach ($Header in $SourceCSV[0].PSObject.Properties.Name) {
"`"$Header`""
}) + '"column4"','"column5"','"column6"') -join ";").Replace("VerylongOldColumnName","ShortName") | Out-File -FilePath $TMP2
foreach ($Item in $Input) {
"`"$($Item.column1)`";`"$($Item.'column2')`";`"$($Item.column3)`"" | Out-File -FilePath $TMP2 -Append
}
$exportToXLSX = Import-Csv -Path $TMP2 -Delimiter ";" | Export-Excel -Path $Target -WorkSheetname "reportname" -TableName "tablename" -TableStyle Medium2 -FreezeTopRow -AutoSize -PassThru
$exportToXLSX.Save()
$exportToXLSX.Dispose()
Remove-Item -Path $TMP1, $TMP2
This works! But I don't want to create an infinite number of different reports and just as many different logic blocks to process all these reports.
So this is as far as I was able to get trying a more dynamic way of processing the report CSVs:
(Get-Content -Path $reportSourceCSV | Where-Object {-not [string]::IsNullOrWhiteSpace($PSItem)}).Replace('","',";") | Out-File -FilePath $TMP1
$import = Import-Csv -Path $TMP1 -Delimiter ";"
$headers = ($import[0].PSObject.Properties.Name).Replace("VerylongOldColumnName","ShortName")
$headers | Out-File -FilePath "C:\TEMP\test.csv"
foreach ($item in $import) {
for ($h = 0; $h -le ($headers).Count; $h++) {
$($item.$($headers[$h]))
}
}
Now, this works... kind of. If I run the script like this, it shows me the output I want, but I was NOT able to export this to CSV.
I added Export-Csv to this line: $($item.$($headers[$h])) so this particular line would look like this:
$($item.$($headers[$h])) | Export-Csv -Path $Output -Delimiter ";" -Append -NoTypeInformation
And this is the error I get:
Export-Csv : Cannot append CSV content to the following file: C:\TEMP\test.csv.
The appended object does not have a property that corresponds to the following
column: column1. To continue with mismatched properties, add the -Force parameter,
and then retry the command.
At line:11 char:36
+ ... ers[$h])) | Export-Csv -Path $Output -Delimiter ";" -Append -NoTypeIn ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (column1:String) [Export-Csv], InvalidOperationException
+ FullyQualifiedErrorId : CannotAppendCsvWithMismatchedPropertyNames,Microsoft.PowerShell.Commands.ExportCsvCommand
If I add -Force parameter, the output will be the headers and a bunch of empty lines.
As far as I understand, the output is for some reason a string? To my knowledge everything should be an object in PS unless converted to a string (the Write-Host cmdlet being an exception), and I don't really know how to force the output back to being objects.
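One way to get objects back out of that loop (a sketch only, reusing the $import, $headers and $Output variables from the snippets above) is to rebuild each row as a PSCustomObject keyed by the renamed headers:
$rows = foreach ($item in $import) {
    $row = [ordered]@{}
    $originalNames = $item.PSObject.Properties.Name
    for ($h = 0; $h -lt $headers.Count; $h++) {
        # read via the original property name, write under the renamed header
        $row[$headers[$h]] = $item.($originalNames[$h])
    }
    [pscustomobject]$row
}
$rows | Export-Csv -Path $Output -Delimiter ";" -NoTypeInformation
Export-Csv then sees objects with named properties again instead of bare strings.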
Edit: Added sample source CSV
"Plugin","Plugin Name","Family","Severity","IP Address","Protocol","Port","Exploit?","Repository","DNS Name","NetBIOS Name","Plugin Text","Synopsis","Description","Solution","See Also","Vulnerability Priority Rating","CVSS V3 Base Score","CVSS V3 Temporal Score","CVSS V3 Vector","CPE","CVE","Cross References","First Discovered","Last Observed","Vuln Publication Date","Patch Publication Date","Exploit Ease","Exploit Frameworks"
"65057","Insecure Windows Service Permissions","Windows","High","127.0.0.1","TCP","445","No","Individual Scan","computer.domain.tld","NetBIOS Name","Plugin Output:
Path : c:\program files (x86)\application\folder\service.exe
Used by services : application
File write allowed for groups : Users, Authenticated Users
Full control of directory allowed for groups : Users, Authenticated Users","At least one improperly configured Windows service may have a privilege escalation vulnerability.","At least one Windows service executable with insecure permissions was detected on the remote host. Services configured to use an executable with weak permissions are vulnerable to privilege escalation attacks.
An unprivileged user could modify or overwrite the executable with arbitrary code, which would be executed the next time the service is started. Depending on the user that the service runs as, this could result in privilege escalation.
This plugin checks if any of the following groups have permissions to modify executable files that are started by Windows services :
- Everyone
- Users
- Domain Users
- Authenticated Users","Ensure the groups listed above do not have permissions to modify or write service executables. Additionally, ensure these groups do not have Full Control permission to any directories that contain service executables.","http://www.nessus.org/u?e4e766b2","","8.4","","AV:L/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H","cpe:/o:microsoft:windows","","","Jul 11, 2029 06:48:20 CEST","Jul 11, 2029 06:48:20 CEST","N/A","N/A","",""
Edit: I think I found another way to accomplish this, and looking at it, it looks like I tried to overdo it quite a bit.
# Doing cleanup, changing delimiters, renaming that one known column. All in one line.
$importCSV = 'C:\TEMP\sourceReport.csv'
(Get-Content -Path $importCSV | Where-Object {-not [string]::IsNullOrWhiteSpace($PSItem)}).Replace('","','";"').Replace("VerylongOldColumnName","ShortName") | Out-File -FilePath C:\TEMP\tmp1.csv
# Adding additional columns and exporting it all to result CSV.
Import-Csv -Path C:\TEMP\tmp1.csv -Delimiter ";" | Select-Object *, "Column1", "Column2" | Export-Csv -Path C:\TEMP\result.csv -NoTypeInformation -Delimiter ";"
You should not simply replace , with ; because the fields themselves contain commas, as in "..Additionally, ensure these groups ..". By replacing like that, the field gets separated from the rest of its content and you end up with a misaligned CSV.
The below approach will do what you want, leaving the structure of the csv file intact:
$importCSV = 'C:\TEMP\sourceReport.csv'
$exportCSV = 'C:\TEMP\result.csv'
$columnsToAdd = "Column1", "Column2"
# read the file as string array, not including empty lines
$content = Get-Content -Path $importCSV | Where-Object { $_ -match '\S' }
# replace the column header in the top line only
$content[0] = $content[0].Replace("VerylongOldColumnName", "ShortName")
# join the string array with newlines and convert that to an object with ConvertFrom-Csv
# add the columns to the object and export it using the semi-colon as delimiter
($content -join [Environment]::NewLine) | ConvertFrom-Csv |
Select-Object *, $columnsToAdd |
Export-Csv -Path $exportCSV -NoTypeInformation -Delimiter ";"
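As a quick illustration of why this matters, ConvertFrom-Csv keeps commas inside quoted fields intact, which a blind string replace would not (hypothetical two-line sample):
'"Name","Description"','"svc1","Users, Authenticated Users"' | ConvertFrom-Csv |
    Select-Object -ExpandProperty Description
# -> Users, Authenticated Users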

Powershell Select Expanded Property and Custom Expression

I want to get the first value of a specific column from Import-Csv output, displayed as a table alongside the name of the file.
I can do:
$File = '\\webserver\Data_20190626.csv'
Import-Csv -Path $File -Delimiter ',' | select 'Effective Date' -First 1
Which gives me my expected output:
Effective Date
--------------
25-May-2019
What I want to see is:
Effective Date FileName
-------------- ---------
25-May-2019 Data_20190626.csv
I have tried this:
$File = '\\webserver\Data_20190626.csv'
Import-Csv -Path $File -Delimiter ',' | select 'Effective Date', @{N='FileName';E={$_.Name}} -First 1
Which resulted in:
Effective Date FileName
-------------- ---------
25-May-2019
How should I proceed?
You can achieve this with a calculated property (hash table) in Select-Object and by splitting the file path on \ with the .Split() method. The [-1] index takes the last item of the split result.
$File = '\\webserver\Data_20190626.csv'
Import-Csv -Path $File |
Select-Object 'Effective Date',@{n='FileName';e={$File.Split('\')[-1]}} -First 1
I'm surprised there's no PSPath or anything.
import-csv $file -delimiter ',' |
select 'Effective Date', @{n='Filename';e={split-path -leaf $file}} -first 1
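An equivalent spelling, assuming the same $File variable from the snippets above, lets .NET extract the leaf name instead of splitting the string by hand:
Import-Csv -Path $File |
    Select-Object 'Effective Date', @{n='FileName';e={[System.IO.Path]::GetFileName($File)}} -First 1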
this is a slightly different way to do things. [grin] the #region >>> create a file to work with section can be ignored - it is there to provide the data file you didn't provide.
it gets the EffectiveDate info by using the way that PoSh can dot-address one property of an entire collection and then grabbing the 1st item from the resulting array.
it also avoids the often confusing side effects of allowing spaces or other special characters in property names by using EffectiveDate instead of Effective Date. if you need to keep using the poorly thought out embedded space, then change that string as needed.
$File = "$env:TEMP\Data_20190626.csv"
#region >>> create a file to work with
@'
EffectiveDate
2019-06-25
2006-06-06
2005-05-05
'@ |
ConvertFrom-Csv |
Export-Csv -LiteralPath $File -NoTypeInformation
#endregion >>> create a file to work with
$Results = [PSCustomObject]@{
EffectiveDate = (Import-Csv -LiteralPath $File).EffectiveDate[0]
FileName = $File
}
$Results
output ...
EffectiveDate FileName
------------- --------
2019-06-25 C:\Temp\Data_20190626.csv

Extract Columns based on Row data from .CSV

Total Newbie with PowerShell, but used to use WSH with .vbs back in the day - so hopefully can structure this question correctly.
I would like to extract x number of columns from a .csv file, only if the row data equals a certain value - and then send the filtered data to a new .csv in another destination.
So taking a saved Windows event log as an example, I would like to extract Columns A-F but only on rows where column 'A' equals 'Error' - and then send that output to a new .csv in a child directory.
I think I am pretty close, but can only get it to save columns A-F and not just the rows with the data I need!
Can anyone help me figure this out or show me where I am going wrong please?
$folderPath = 'C:\DLA\'
$folderPathDest = 'C:\DLA\OUT\'
$desiredColumns = 'A','B','C','D','E','F'
$topics.Where({$desiredColumns.play -eq 'Error'}).topic
Get-ChildItem $folderPath -Name |
ForEach-Object {
$filePath = $folderPath + $_
$filePathdest = $folderPathDest + $_
Import-Csv $filePath | Select $desiredColumns | Select $topics |
Export-Csv -Path $filePathDest -NoTypeInformation
}
Just put the filter directly in your pipeline:
Import-Csv $filePath | Select $desiredColumns | where {$_.A -eq 'Error'} | Export-Csv -Path $filePathDest -NoTypeInformation
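A fuller sketch of the loop with that filter in place (reusing the folder variables and column names from the question; restricting to *.csv files is an assumption):
$folderPath     = 'C:\DLA\'
$folderPathDest = 'C:\DLA\OUT\'
$desiredColumns = 'A','B','C','D','E','F'
Get-ChildItem $folderPath -File -Filter *.csv | ForEach-Object {
    Import-Csv $_.FullName |
        Where-Object { $_.A -eq 'Error' } |
        Select-Object $desiredColumns |
        Export-Csv -Path (Join-Path $folderPathDest $_.Name) -NoTypeInformation
}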
Below command worked for me to extract all the columns to a new file. This can be modified to select the desired columns:
import-csv $filePath | ? { $_.columnName -eq 'Error' } | export-csv $filePathDest -NoTypeInformation
columnName is the header title of the column

Powershell import-csv with empty headers

I'm using PowerShell to import a TAB-separated file with headers. The generated file has a few empty strings "" at the end of the first line of headers. PowerShell fails with an error:
"Cannot process argument because the
value of argument "name" is invalid.
Change the value of the "name"
argument and run the operation again"
because the header's require a name.
I'm wondering if anyone has any ideas on how to manipulate the file to either remove the double quotes or enumerate them with a "1" "2" "3" ... "10" etc.
Ideally I would like to not modify my original file. I was thinking something like this
$fileContents = Get-Content -Path $tsvFileName
$firstLine = $fileContents[0].ToString().Replace("`t`"`"","")
$fileContents[0] = $firstLine
Import-Csv $fileContents -Delimiter "`t"
But Import-Csv is expecting $fileContents to be a path. Can I get it to use Content as a source?
You can either provide your own headers and ignore the first line of the csv, or you can use convertfrom-csv on the end like Keith says.
ps> import-csv -Header a,b,c,d,e foo.csv
Now the invalid headers in the file are just a row that you can skip.
-Oisin
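To illustrate that (file name and headers are placeholders), the original header line simply imports as the first data row and can be skipped:
Import-Csv -Header a,b,c,d,e foo.csv | Select-Object -Skip 1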
If you want to work with strings instead use ConvertFrom-Csv e.g.:
PS> 'FName,LName','John,Doe','Jane,Doe' | ConvertFrom-Csv | Format-Table -Auto
FName LName
----- -----
John Doe
Jane Doe
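Combining the two suggestions for the original TAB-separated case (a sketch; the pattern assumes the stray header cells are a tab followed by empty quotes), the cleaned content can be fed straight to ConvertFrom-Csv without modifying the file on disk:
$fileContents    = Get-Content -Path $tsvFileName
$fileContents[0] = $fileContents[0].Replace("`t`"`"", "")   # drop the trailing tab + "" header cells
$data            = $fileContents | ConvertFrom-Csv -Delimiter "`t"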
I ended up needing to handle multiple instances of this issue. Rather than use -Header and manually set up each import instance, I wrote a more generic method to apply to all of them. I cull out all of the `t"" instances in the first line, save the file as a $filename + _bak copy, and import that one.
$fileContents = Get-Content -Path $tsvFileName
if( ([string]$fileContents[0]).ToString().Contains('""') )
{
[string]$fixedFirstLine = $fileContents[0].ToString().Replace("`t`"`"","")
$fileContents[0] = $fixedFirstLine
$tsvFileName = [string]::Format("{0}_bak",$tsvFileName)
$fileContents | Out-File -FilePath $tsvFileName
}
Import-Csv $tsvFileName -Delimiter "`t"
My solution if you have many columns:
$index=0
$ColumnsName=(Get-Content "C:\temp\yourCSCFile.csv" | select -First 1) -split ";" | %{
$index++
"Header_{0:d5}" -f $index
}
import-csv "C:\temp\yourCSCFile.csvv" -Header $ColumnsName