Converting a txt file to csv in PowerShell

I have a tab-separated text file produced after executing a command in PowerShell. How can I convert this tab-separated file into a CSV file?

A tab-separated text file can be treated as a CSV file that simply uses a tab instead of a comma as its delimiter.
To convert:
Import-Csv tabdelimitedfile.txt -Delimiter "`t" | Export-Csv csvfile.csv
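If you are on Windows PowerShell 5.1, you will probably also want -NoTypeInformation, since Export-Csv otherwise prepends a "#TYPE ..." header line to the output (PowerShell 7+ omits it by default). A minimal sketch, keeping the same placeholder file names:
Import-Csv tabdelimitedfile.txt -Delimiter "`t" | Export-Csv csvfile.csv -NoTypeInformation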

Related

PowerShell: Converting Tab-Delimited CSV to Comma-Delimited CSV without Quotes

We get a tab-delimited CSV from an external COGNOS system in a public folder. This fails to upload to Salesforce via the Data Loader CLI.
com.salesforce.dataloader.exception.DataAccessRowException: Error
reading row #0: the number of data columns (98) exceeds the number of
columns in the header (97)
But if you open the CSV in MS Excel, save it as a new CSV (UTF-8), and then pass that file to the Data Loader CLI, it works without any issue.
The difference in the Excel-converted file seems to be that it is comma-separated instead of tab-separated.
Then I tried to convert the original tab-delimited CSV to a comma-separated CSV using the command below:
import-csv source.csv -delimiter "`t" | export-csv target.csv -notype
But the output of this has quotes. Data Loader now runs with the file but imports nothing into Salesforce; it seems it is not able to identify the field names properly.
Then I tried the commands below to remove the double quotes:
import-csv source.csv -delimiter "`t" | export-csv target.csv -notype
(Get-Content target.csv) | Foreach-Object {$_ -replace '"', ''}|Out-File target.csv
But this resulted in an "Index out of range" error, the cause of which is not clear to me.
What would be the best approach to do this conversion for Data Loader CLI?
What would make this conversion behave the same as Excel's conversion?
Any suggestions, thoughts, or help to achieve this would be highly appreciated.
Thanks!
Salesforce has strict rules for CSV files. The Salesforce documentation also says that no more than 50,000 records can be imported at one time.
The main thing here is that the file MUST be in UTF-8 format.
The quotes around the values are needed.
This should do it (provided you do not have more than 50,000 records in the CSV):
Import-Csv -Path 'source.csv' -Delimiter "`t" | Export-Csv -Path 'target.csv' -Encoding UTF8 -NoTypeInformation
(source.csv is the TAB-delimited file you receive from COGNOS)
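As the answer notes, the encoding is the main issue, but if you want output closer to Excel's (quotes only where required), Export-Csv on PowerShell 7+ has a -UseQuotes parameter. This is only a sketch and assumes PowerShell 7 or later; the parameter does not exist in Windows PowerShell 5.1:
Import-Csv -Path 'source.csv' -Delimiter "`t" | Export-Csv -Path 'target.csv' -Encoding UTF8 -NoTypeInformation -UseQuotes AsNeeded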

How to convert from a newline-delimited text file to CSV with PowerShell

I have a text file containing the below test data:
1234
\\test
QATest
Silk
Chrome
I have a requirement to convert this text file into a CSV file using PowerShell, so that it looks like this:
1234,\\test,QATest,Silk,Chrome
Could anybody please suggest the right way?
Read all the lines in, then concatenate them using the -join operator:
$originalLines = Get-Content .\input.txt
$originalLines -join ',' | Set-Content .\output.csv
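If any of the input lines could themselves contain commas or double quotes, a slightly more defensive variant (just a sketch, reusing the same example file names) wraps each value in quotes and doubles any embedded quotes, per the usual CSV rules:
(Get-Content .\input.txt | ForEach-Object { '"' + ($_ -replace '"', '""') + '"' }) -join ',' | Set-Content .\output.csv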

Code to Open File, Select All and Copy

I am trying to create a script that will open a .txt file and then select and copy the data in it. I am able to open the .txt file, but what I am not able to do is create a script that opens, selects, and copies at the same time and in the same script line.
Here are some code examples that I have tried:
Invoke-Item adddata.txt; object.SendKeys "^(a)"
Invoke-Item adddata.txt; WshShell.SendKeys "^"; WshShell.SendKeys "a"
Invoke-Item adddata.txt; WshShell.SendKeys "{^}a"
What happens is that the file does open up, but nothing is selected or copied in any of the examples.
I think the Get-Content cmdlet is what you should be using:
Get-Content -Path "C:\ExampleFolder\adddata.txt" | clip
I think php123's answer does the job; however, because your paste target is Excel, if the data in your file is in delimited columns you can improve the final result. Use the Import-Csv cmdlet to convert the file into objects, then use ConvertTo-Csv to turn those objects into a tab-delimited string, which will paste nicely into Excel. Something like this, assuming your file is comma-separated:
Import-Csv "C:\ExampleFolder\adddata.txt" -Delimiter ',' | ConvertTo-Csv -Delimiter "`t" | clip

PowerShell .txt to CSV Formatting Irregularities

I have a large number of .txt files pulled from pdf and formatted with comma delimiters.
I'm trying to append these text files to one another with a new line between each. Earlier in the formatting process I took multi-line input and formatted it into one line with entries separated by commas.
Yet when appending one txt file to another into a CSV, the earlier formatting with many line breaks returns. So my final output is valid CSV, but each text file does not come out as a single line of CSV entries. How can I ensure the transition from txt to CSV retains the one-line formatting of the txt files?
I've used Export-CSV, Add-Content, and the >> operator with similar outcomes.
To summarize, individual .txt files with the following format:
,927,Dance like Misty"," shine like Lupita"," slay like Serena. speak like Viola"," fight like Rosa! ,United States ,16 - 65+
Turn into the following when appended together in a csv file:
,927
,Dance like Misty"," shine like Lupita"," slay like Serena. speak like Viola"," fight like Rosa!
,United States
,16 - 65+
How the data was prepped:
Removing new lines
Foreach($f in $FILES){(Get-Content $f -Raw).Replace("`n","") | Set-Content $f -Force}
Adding one new line to the end of each txt file
foreach($f in $FILES){Add-Content -Path $f -value "`n" |Set-Content $f -Force}
Trying to Convert to CSV, one text file per line with comma delimiter:
cat $FILES | sc csv.csv
Or
foreach($f in $FILES){import-csv $f -delimiter "," | export-csv $f}
Or
foreach($f in $FILES){ Export-Csv -InputObject $f -append -path "test.csv"}
Return csv with each comma separated value on a new line, instead of each txt file as one line.
This was resolved by realizing that even though Notepad was showing no newlines, there were still hidden carriage return characters. On loading the apparently one-line CSV files into Notepad++ and toggling "show hidden characters", this oversight became evident.
By replacing both carriage return and newline characters before converting to CSV:
Foreach($f in $FILES){(Get-Content $f -Raw).Replace("`r","").Replace("`n","") | Set-Content $f -Force}
The CSV conversion process worked as planned using the following
cat $FILES | sc final.csv
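Since the -replace operator takes a regular expression, an equivalent variant (a sketch; $FILES is assumed to be the same collection of text-file paths as above) can strip both characters in a single pass:
Foreach($f in $FILES){(Get-Content $f -Raw) -replace '\r|\n','' | Set-Content $f -Force}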
Final verdict --
The text file that appeared to be a one-line entry ready to become CSV
,927,Dance like Misty"," shine like Lupita"," slay like Serena. speak like Viola"," fight like Rosa! ,United States ,16 - 65+
Still had carriage return characters between each value. This was made evident by trying another text editor with a "show hidden characters" feature.

How to remove the first 3 symbols in a text file?

How to remove the first 3 symbols in a text file with PowerShell and keep the file with the same name?
Just read the file using the Get-Content cmdlet, remove the first three characters using a regex that replaces them with nothing, and finally write the result back using the Set-Content cmdlet:
(Get-Content 'yourfilePath.txt' -raw) -replace '^...' | Set-Content 'yourfilePath.txt'
Note: You probably want to specify the encoding using the -Encoding parameter when writing the content back to the file.
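For example, to force UTF-8 when rewriting the file (a sketch using the same placeholder path):
(Get-Content 'yourfilePath.txt' -Raw) -replace '^...' | Set-Content 'yourfilePath.txt' -Encoding UTF8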