Why are leading zeros dropped when exporting a polars dataframe to CSV? - python-polars

I have a polars dataframe with a number stored as a string datatype.
On exporting it to CSV using the write_csv method, the leading zeros appear to be ignored in the output.
How can I retain the zeros when exporting the file to CSV?

Excel assumes those strings are numbers and promptly eliminates the leading zeros (notice that they are right-aligned in Excel, as numbers are). Instead of saving the file with a .csv extension, pick .txt and open it in Excel. You should then be presented with the import dialogue, where you can choose the column format: pick "Text" and voilà.
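write_csv itself does not strip anything: a CSV is plain text, and a string column keeps its characters as written. A minimal sketch using Python's stdlib csv module (as a stand-in for polars, whose write_csv behaves the same way for string columns):

```python
import csv
import io

# Write a string value with leading zeros to CSV and read it back.
buf = io.StringIO()
csv.writer(buf).writerows([["code"], ["00123"]])

buf.seek(0)
rows = list(csv.reader(buf))
print(rows)  # the leading zeros survive in the file itself
```

The zeros are only lost later, when Excel opens the file and guesses a numeric type for the column.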

Related

How to export large numbers from HeidiSQL to CSV

I use HeidiSQL to manage my database.
When I export grid rows to a CSV file, the large number 89610185002145111111 becomes 8.96102E+19.
How can I keep the number without the scientific-notation conversion?
HeidiSQL does not do such a conversion. I tried to reproduce this, but I get the unformatted number:
id;name
89610185002145111111;hey
That was checked with a text editor, by the way. If you use Excel, you may have to apply a different cell format.
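The E+19 display is what happens when a tool converts the value to double-precision floating point; integers this large cannot be represented exactly in a double. A quick Python illustration of the precision loss:

```python
big = 89610185002145111111  # the 20-digit value from the question

# Python ints are arbitrary precision, so the value round-trips exactly...
exact = str(big)

# ...but a double cannot hold 20 significant digits, so going through
# float (as a spreadsheet does) mangles the trailing digits.
mangled = int(float(big))
print(exact, mangled)
```

This is why keeping the column as text end-to-end is the only safe option.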

How to use numbers as attributes in PostgreSQL?

I have a .csv file that has numbers as column names. I want to import that file to a table in PostgreSQL, but it gives an error.
I have 1024 columns so I can't manually change it in my file. Is there a way around that?
This is the Excel file that I got:
If you want a table with 1024 columns you are doing something wrong.
You should choose a different data model.
But it is possible to use numbers as column names, as long as you surround them with double quotes.
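With 1024 columns you would not quote them by hand; the DDL can be generated from the CSV header instead. A hypothetical sketch (the table name and headers here are made up):

```python
# Hypothetical: quote each numeric header so PostgreSQL accepts it
# as an identifier in a generated CREATE TABLE statement.
headers = ["1", "2", "3"]  # stand-in for the real 1024 numeric headers

columns = ", ".join(f'"{h}" text' for h in headers)
ddl = f"CREATE TABLE csv_import ({columns});"
print(ddl)
```

Each name is double-quoted, which is what makes a purely numeric identifier legal in PostgreSQL.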

Keep leading zeros when joining data sources in tableau

I am trying to create a data source in Tableau (10.0) where I am joining a table from SQL with an Excel file. The join happens on a site id, but when reading the id from the Excel source, Tableau strips the leading zeros (while SQL keeps them). I have seen this example
of adding the leading zeros back as a new, calculated field, but the join still drops rows because the id is not properly formatted when the join is made.
How do I get the Excel data source to read the column with the leading zeros so I can do the join?
1. Launch Excel and choose to open a new blank workbook.
2. Click the Data tab and select From Text.
3. Browse to the saved CSV file and select Import.
4. Ensure that Delimited is selected and click Next.
5. Leave Tab as the delimiter and click Next.
6. Select the column containing the data with leading zeros and click Text.
7. Repeat for each column which contains leading zeros.
8. Click Finish.
9. Click OK.
I've never heard of or used Tableau, but it sounds as though something (the Jet/ACE database driver used to read the Excel file?) is determining the column to be numeric and parsing the data as numbers, losing the leading zeroes.
If your attempts at putting them back are giving you grief, I'd recommend trying the other direction instead: get SQL Server to convert its strings to numbers. Number matching should be more reliable than string matching, so long as the two systems don't handle rounding differently :)
If your Excel file was read in from a CSV and the Site ID is showing "Number Stored as Text", I think you can solve your problem by telling Tableau in the Data Source entry that the field is actually a string. In the data source preview, change the "#" (designating a number) to string, so that both the SQL source and the Excel source are strings before doing the join.
This typically has to do with the way Excel stores values, as mentioned above. I would play around with the number formatting for the Site ID column in Excel itself, not Tableau, and change it to "Text" in Excel. You can verify whether Tableau will read it properly with the leading 0s by exporting your Excel file to CSV and looking in the CSV file to see if the leading 0s are still there.
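If the ids have a known fixed width, another option is to pad the stripped zeros back on the Excel side before joining. A hedged Python sketch of that idea (the width of 6 and the sample ids are assumptions):

```python
# Restore leading zeros on ids that lost them, then match against
# the ids that kept their zeros (as SQL did).
sql_ids = {"000042", "000007"}   # SQL side: zeros preserved
excel_ids = ["42", "7"]          # Excel side: zeros stripped

restored = [i.zfill(6) for i in excel_ids]  # pad back to fixed width
matches = [i for i in restored if i in sql_ids]
print(matches)
```

This mirrors the "calculated field" approach: both sides must be formatted identically before the join will keep all rows.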

Read csv file excluding first column and first line

I have a CSV file containing 8 lines and 1777 columns.
I need to read all the contents in MATLAB, excluding the first line and first column. The first line and first column contain strings, which MATLAB can't parse.
Do you have any ideas?
data = csvread(filepath, 1, 1);
csvread takes zero-based row and column offsets, so the call above skips the first line and first column and reads only the numeric contents.
As suggested, csvread with a range will read in the numeric data. If you would like to read in the strings as well (which are presumably column headers), you can use readtable:
t = readtable(filepath);
This will create a table with the column headers in your file as variable names of the columns of the table. This way you can keep the strings associated with the data, if need be.
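For comparison, the same "skip the first row and first column" slicing can be done in Python with the stdlib csv module (the sample data below is made up):

```python
import csv
import io

sample = "name,c1,c2\nrowA,1.5,2.5\nrowB,3.0,4.0\n"  # made-up sample file

rows = list(csv.reader(io.StringIO(sample)))
# Drop the header line and the first (string) column,
# then parse the remaining cells as numbers.
data = [[float(v) for v in row[1:]] for row in rows[1:]]
print(data)
```

The slicing `rows[1:]` / `row[1:]` plays the same role as the row/column offsets passed to csvread.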

Prevent thousand separator in TSQL export to CSV

When exporting a TSQL select result to CSV, the values show a strange thousands separator. Partial code:
CONVERT(DECIMAL(10,2), i.UserNumber_04) AS CAP
The query results show perfect values, for example 1470.00, but the CSV or txt file shows strange values like 1,470,00. How can I prevent the first comma?
At first I thought it was just the formatting style in Excel, but it does the same in txt files.
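One common cause of such commas is locale-aware (grouping) number formatting applied somewhere in the export path rather than by CONVERT itself; that is only a guess here. A Python illustration of grouped versus plain rendering of the same value:

```python
value = 1470.00

grouped = f"{value:,.2f}"  # grouping separator inserted: "1,470.00"
plain = f"{value:.2f}"     # plain rendering, safe for CSV
print(grouped, plain)
```

If the exporting tool formats values like `grouped` above, the commas end up in the file; forcing a plain, locale-independent format avoids them.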