I have the following code:
$databaseContents = "col1,col2,col3,col4"
$theDatabaseFile = "C:\NewFolder\Database.csv"
$databaseContents | Out-File $theDatabaseFile
However, when I open the CSV file in Excel, it has col1,col2,col3,col4 in cell A1 rather than col1 in cell A1, col2 in cell B1, etc.
Something strange I've noticed:
If I open the file in Notepad and copy the text into another Notepad instance and save it as Database1.csv, then open it in Excel, it displays as expected.
How can I get the Out-File cmdlet to save it as a .csv file with the contents in 4 columns as expected?
EDIT:
I've noticed if I use Set-Content rather than Out-File, the csv file is then displayed correctly in Excel.
Does anyone know why this is?
Why it makes a difference to Excel is unclear to me, but it comes down to the encoding of the resulting output file: Unicode (in the cases that do not work) vs. ASCII (in the cases that do).
@JPBlanc's alternate approach works because it sets the encoding of the output file to ASCII, where your original example (implicitly) set the encoding of the output file to Unicode.
Just adding -Encoding ascii to your original example works fine too:
$databaseContents = "col1,col2,col3,col4"
$theDatabaseFile = "C:\NewFolder\Database.csv"
$databaseContents | Out-File $theDatabaseFile -Encoding ascii
And adding an explicit -Encoding unicode to your original example yields the same broken result you encountered:
$databaseContents = "col1,col2,col3,col4"
$theDatabaseFile = "C:\NewFolder\Database.csv"
$databaseContents | Out-File $theDatabaseFile -Encoding unicode
This is basically what was happening implicitly.
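If you want to confirm which encoding you actually ended up with, one option is to inspect the first few bytes of the output file; the following is just a sketch using the path from the example above. A Unicode (UTF-16 LE) file written by Out-File starts with the byte order mark FF FE, while the ASCII version does not:
# Dump the first few bytes of the file; FF FE at the start means UTF-16 LE ("Unicode")
[System.IO.File]::ReadAllBytes('C:\NewFolder\Database.csv')[0..3] | ForEach-Object { '{0:X2}' -f $_ }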
This also works:
$databaseContents = "col1;col2;col3;col4"
$theDatabaseFile = "C:\temp\Database.csv"
$databaseContents | Out-File $theDatabaseFile -Encoding ascii
By default, the CSV separator in Excel seems to be ';', and Out-File saves as Unicode; forcing ASCII seems to give the result you are looking for.
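Note that the separator Excel expects depends on the system's regional settings rather than on Excel itself (an assumption based on how Excel usually behaves); you can check the list separator your machine uses with:
# Shows the list separator for the current culture, e.g. ',' for en-US or ';' for fr-FR
(Get-Culture).TextInfo.ListSeparator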
I was having the same problem as Backwards_Dave, and just like him, using the Set-Content command instead of Out-File worked for me:
#Build $temp String
$temp = "$scriptApplication;$applicationName;$type;$description;$maxSessions;$enabled;$tempParams`n"
#Write $temp String to $fichierCsv file
"$temp" | Set-Content $fichierCsv
I tried using JPBlanc's and J0e3gan's solutions (the -Encoding ascii option), but they did not work for me; I wonder why.
In a powershell script, I am creating a txt file:
#Create a logfile
$logfile = "E:\scripts\Password expiration\PasswordExpiryLog.txt"
#Initialize the log file with the date
$date = Get-Date -Format "dd.MM.yyyy HH:mm:ss"
$message= "********************" + $date + "********************"
Add-Content $logfile $message
After that, I do some other things to get some info. When I try to send that info to the TXT file, formatted as a table, the information is not written in columns: the line breaks are missing. If I manually press Enter where the next line should start, it then displays in columns as expected.
$Expired | Format-Table @{Name='UserPrincipalName';Expression={"$($_.UserPrincipalName)"};align='left'}, @{Name='rapasswordexpiring';Expression={"$($_.rapasswordexpiring)"};align='center'}, @{Name='PasswordLastSet';Expression={"$($_.PasswordLastSet)"};align='left'} -AutoSize | Out-File -Append 'E:\scripts\Password expiration\PasswordExpiryLog.txt'
I have also realized, while writing this message, that between the characters there is always a space.
How it looks
How it should look
If, instead of using Out-File on the same log file, I use Out-File on a new txt file, the information is written correctly.
$Expired | Format-Table @{Name='UserPrincipalName';Expression={"$($_.UserPrincipalName)"};align='left'}, @{Name='rapasswordexpiring';Expression={"$($_.rapasswordexpiring)"};align='center'}, @{Name='PasswordLastSet';Expression={"$($_.PasswordLastSet)"};align='left'} -AutoSize | Out-File -Append 'E:\scripts\Password expiration\table.txt'
Correct
However, I cannot get the date and time in there.
Can you tell me what I am doing wrong or how to improve the output? I would be able to use a csv or html file if that would be better.
Different default encodings:
Add-Content:
-Encoding    Specifies the type of encoding for the target file. The default value is Default.
    Default    Uses the encoding that corresponds to the system's active code page (usually ANSI).
Out-File:
-Encoding    Specifies the type of encoding for the target file. The default value is unicode.
    unicode    Uses UTF-16 with the little-endian byte order.
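So one fix is simply to make Out-File use the same encoding Add-Content used for the date line; here is a sketch based on the command in the question (Default, i.e. the system's ANSI code page, is just one choice; any single encoding used for every write to the file works):
# Append the table with the same (ANSI) encoding Add-Content used for the date header
$Expired | Format-Table @{Name='UserPrincipalName';Expression={"$($_.UserPrincipalName)"};align='left'}, @{Name='rapasswordexpiring';Expression={"$($_.rapasswordexpiring)"};align='center'}, @{Name='PasswordLastSet';Expression={"$($_.PasswordLastSet)"};align='left'} -AutoSize | Out-File -Append -Encoding Default 'E:\scripts\Password expiration\PasswordExpiryLog.txt'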
I have an array of objects in PowerShell. It was working, but now when I do an Export-Csv on the array, the property and value names are transformed like this:
Account_No -> +ACI-Account+AF8-No+ACI-
Does anyone know why it is doing this?
Thanks
I am using PS 5.1, and the command is:
$rowsWithErrs | Export-Csv -Path $rowErrCsvPath -NoTypeInformation -Encoding UTF7
It looks like there isn't anything wrong with what you are doing. Everything is getting sent out in the format that you are expecting.
The only problem is that the application that you are using to view your data is not using the same encoding that was used to write the data.
The extra characters are what you see when text that was encoded as UTF-7 is interpreted as UTF-8 (or something similar and compatible with UTF-8, which is the standard for most systems).
Example:
> "Account_No" | Out-File -FilePath test.txt -Encoding UTF7
> Get-Content test.txt -Encoding UTF8
Account+AF8-No
> Get-Content test.txt -Encoding UTF7
Account_No
If reading CSV data in PowerShell, you can do the following:
> $csv = Import-Csv -Path $filepath -Encoding UTF7
If reading CSV data in Excel, on the Data tab select From Text/CSV; at the top of the import window, select File Origin: 65000: Unicode (UTF-7).
For other applications like VS Code or Notepad++, you may be out of luck if you want to view the data there, because it looks like they do not support UTF-7 encoding.
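If fighting UTF-7 in every downstream tool gets old, another option (just a sketch, assuming you can regenerate the file) is to read the existing file as UTF-7 once and write it back out as UTF-8:
# Read the UTF-7 csv fully, then re-export it as UTF-8 so ordinary editors can open it
$rows = Import-Csv -Path $rowErrCsvPath -Encoding UTF7
$rows | Export-Csv -Path $rowErrCsvPath -NoTypeInformation -Encoding UTF8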
I'm trying to understand some weird behaviour with this cmdlet.
If I use "Out-File -append Filename.txt" on a text file that I created and entered text into via the windows context menu, the string will append to the last line in that file as a series of space separated characters.
So:
"This is a test" | out-file -append textfile.txt
Will produce:
T h i s i s a t e s t
This won't happen if Out-File creates the file, or if the text file has no text in it prior to appending. Why does this happen?
I will also note that repeating the command just appends in the same way to the same line. I guess it doesn't recognise the newline or line-break terminator, or something, due to the changed encoding?
Out-File defaults to Unicode encoding, which is why you are seeing this behavior. Use -Encoding Ascii to change it. In your case:
"This is a test" | Out-File -Encoding Ascii -Append textfile.txt
Add-Content uses the system's default (ANSI) encoding and also appends by default:
"This is a test" | Add-Content textfile.txt
As for the lack of newline: You did not send a newline so it will not write one to file.
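If you want to see the raw bytes behind the spaced-out text, here is a minimal repro sketch (it assumes a scratch file named textfile.txt in the current directory):
# Create a single-byte (ANSI) file, then append with Out-File's default UTF-16 LE encoding
Set-Content textfile.txt 'existing line'
'This is a test' | Out-File -Append textfile.txt
# Every other byte of the appended text is 0x00, which Notepad shows as the extra spaces
[System.IO.File]::ReadAllBytes((Resolve-Path textfile.txt).Path) | ForEach-Object { '{0:X2}' -f $_ }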
Add-Content defaults to the system's ANSI encoding and adds a new line; however, Add-Content brings file-locking issues too.
When I use this:
"$LogTime $CurrentDate Invalid Archive Grouping selected. You selected '$ArchiveGrouping'" | Out-File $LogFile -Append -Force
To write to a file I generated with this code:
$Logdate = Get-Date
$Logname = $Logdate.ToString("yyyyMMdd") + ".txt"
Add-Content -Path C:\ArchiveStorage\$Logname "Log" -force
$LogFile = "C:\ArchiveStorage\$Logname"
The text in the file looks weird; it looks like this:
If I change the code to just write to an existing file, like:
$LogFile = "C:\ArchiveStorage\Log.txt"
The text file is as it should be:
Why does it make spaces and all that random crap?
You've been hit by double-byte Unicode. When, say, the letter A is written into the file, there is usually a single byte value 0x41. In double-byte Unicode, the byte value for A is 0x0041 or 0x4100, depending on endianness.
When Notepad opens a file that has no Unicode byte order mark, it assumes all the extra 00 bytes in the file contents are blank spaces. That's why you see w e i r d   s p a c i n g.
For a fix, try using -Encoding ASCII parameter with Add-Content.
The "Log" entries look different because they were written with Add-Content, which uses a default encoding of ASCII, and Out-File uses a default encoding of Unicode.
Either specify the encoding type on Out-File, or switch to using Add-Content for both the "Log" heading and the detail lines.
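Here is a sketch of the first option applied to the code from the question: keep Add-Content for the "Log" heading and give Out-File a matching single-byte encoding for the appended entries (Default, the system's ANSI code page, is the value assumed here):
$Logdate = Get-Date
$Logname = $Logdate.ToString("yyyyMMdd") + ".txt"
Add-Content -Path C:\ArchiveStorage\$Logname "Log" -Force
$LogFile = "C:\ArchiveStorage\$Logname"
# -Encoding Default matches Add-Content's default (ANSI), so no stray 0x00 bytes get appended
"$LogTime $CurrentDate Invalid Archive Grouping selected. You selected '$ArchiveGrouping'" | Out-File $LogFile -Append -Force -Encoding Default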
Background
I have a pipe delimited csv file that looks like this:
ColA|ColB|ColC|ColD|ColE|ColF|ColG|ColH|ColI|ColJ|ColK
00000000|000000507|0000000|STATUS|0000|000000|000|0000|00|0000|00000
00000000|000000500|0000000|STATUS|0000|000000|000|0000|00|0000|00000
00000000|000007077|0000000|STATUS|0000|000000|000|0000|00|0000|00000
I want to take ColB on lines with a certain status and put it in a headless csv file like this:
000000507,0000000001,0,0
000000500,0000000001,0,0
000007077,0000000001,0,0
The values 0000000001,0,0 on the end of the line are identical for every item.
The Script
The trimmed down/generic version of the script looks like this:
$infile = Import-Csv ".\incsv.txt" -Delimiter '|'
$outfile = ".\outcsv.txt"
Foreach($inline in $infile){
    If($inline.Status -ne 'Some Status'){
        $outline = $inline.'ColB' + ',0000000001,0,0'
        Add-Content $outfile $outline -Encoding ASCII
    }
}
The Problem
The problem is that the new file that is created is about twice the size it should be, and I know it has to do with the encoding. Unfortunately, I can't find an encoding that works. I've tried -Encoding ASCII, -Encoding Default, -Encoding UTF8, but they all are too large.
This wouldn't be an issue, but the program that reads the created text file won't read it correctly.
What I can do is copy the text from the new file in Notepad, save it as ANSI, and it works fine. ANSI isn't an option in the -Encoding parameter though.
How do I get PowerShell to output the correct file type? Is there a better way to approach this?
I've found this Google Groups conversation and this Social TechNet post, but neither one actually worked for me.
If the output file already exists and is in Unicode format, the parameter -Encoding ASCII is ignored when appending to the file. Change your loop to this:
$infile | % {
    if ($_.Status -ne 'Some Status') {
        $_.'ColB' + ',0000000001,0,0'
    }
} | Set-Content $outfile -Encoding ASCII
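If you would rather keep the original Add-Content loop, deleting any pre-existing Unicode copy of the output file before the first append should also avoid the problem; a minimal sketch:
# Start from a clean file so -Encoding ASCII is not appending to an existing Unicode file
if (Test-Path $outfile) { Remove-Item $outfile }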