pgadmin automatically display thousand separator - postgresql

SELECT TO_CHAR(76543210.98, '999G999G990D00');
This line of code works well in pgAdmin 4 and psql.
But it converts the numeric to text, which is obviously not what I want.
Is there any way to change the configuration settings so that pgAdmin 4 will automatically display the thousand separator?
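For reference, a quick way to confirm the type change (a minimal check, runnable in psql or pgAdmin 4's Query Tool):

SELECT pg_typeof(76543210.98);                              -- numeric
SELECT pg_typeof(TO_CHAR(76543210.98, '999G999G990D00'));   -- text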

Related

postgres 'only' shows numbers with comma separator, even if data is saved without

My Postgres shows me only comma-separated number values, even when the original number comes without the separator.
lc_numeric shows me German_Germany.1252, which seems to be right.
So when the data is saved correctly, where can I change the format of the shown output?
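As a minimal check (assuming a standard PostgreSQL setup): lc_numeric only affects locale-aware formatting functions such as to_char, while plain numeric values are always sent with a period, so any comma in an unformatted column is being added by the client that displays the data.

SHOW lc_numeric;                       -- e.g. German_Germany.1252
SELECT 1234.56;                        -- plain numeric output: 1234.56
SELECT TO_CHAR(1234.56, '9G999D99');   -- locale-aware: 1.234,56 under a German locale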

Decimals less than 1 appear as ",x" in the output file while they appear correctly in the results window

I am having difficulty with my decimal columns. I have defined a view in which I convert my decimal values like this:
SELECT CONVERT(decimal(8,2), [ps_index]) AS PriceSensitivityIndex
When I query my view, the numbers appear correctly in the results window, e.g. 0,50 and 0,35.
However, when I export my view to a file using the Tasks > Export Data ... feature of SSMS, the decimals less than one appear as ,5 and ,35.
How can I get the same output as in the results window?
Change your query to this:
SELECT CAST( CONVERT(decimal(8,2), [ps_index]) AS VARCHAR( 20 ) ) AS PriceSensitivityIndex
Not sure why, but bcp is dropping the leading zero. My guess is it's something in the transition from SQL storage to a text file, similar to how empty strings and nulls are exchanged on BCP in or out; or perhaps some deeper configuration (Windows, SQL Server?) where a SQL Server setting differs from an OS setting. Not sure yet. But since you are going to text/character data anyway when you BCP to a text file, it's safe (and likely better in most cases) to first cast/convert your data to a character data type.
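A hypothetical before/after sketch (the view name dbo.SomeView is made up; [ps_index] is from the question):

-- Before: bcp exports the decimal 0.35 as ,35 (leading zero lost)
SELECT CONVERT(decimal(8,2), [ps_index]) AS PriceSensitivityIndex FROM dbo.SomeView;
-- After: casting to character data first keeps the leading zero in the export
SELECT CAST(CONVERT(decimal(8,2), [ps_index]) AS VARCHAR(20)) AS PriceSensitivityIndex FROM dbo.SomeView;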

Date in table is dd.mm.yyyy - Can't import to postgres via csv

I'm trying to import a .csv into a table in my database.
All dates in the .csv are in the format dd.mm.yyyy (e.g. 18.10.2017).
I'm importing via pgadmin and always get an invalid input error.
I've tried to use almost all date formatting options for the column but without any luck.
I would rather not change the csv manually.
Can anyone help me with this?
I almost always import data into a staging table where all the columns are strings.
Then I use queries to load the final table.
This has several advantages:
It gives me much more control over how the data is transformed.
It makes it easier to debug problems -- the entire staging table can be queried to find all rows with a particular issue (for instance).
Additional validations can be performed before loading into the final table.
This is just a suggestion, but you might find that overall this takes less time.
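A minimal sketch of that approach (table and column names are made up for illustration):

-- 1. Stage everything as text:
CREATE TABLE staging_import (
    id_raw   text,
    date_raw text
);
-- 2. Load the CSV into the staging table, e.g. with pgAdmin's import dialog or psql:
-- \copy staging_import FROM 'data.csv' WITH (FORMAT csv, HEADER true)
-- 3. Transform into the final table, parsing dd.mm.yyyy explicitly:
INSERT INTO final_table (id, the_date)
SELECT id_raw::int,
       to_date(date_raw, 'DD.MM.YYYY')
FROM staging_import;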
The DateStyle setting is probably set to MDY. You can check this by running:
show datestyle;
Although dd.mm.yyyy isn't listed as a standard input format, if you expect it to work, you will need the DateStyle to line up with the ordering here (DMY).
The date/time style can be selected by the user using the SET datestyle command, the DateStyle parameter in the postgresql.conf configuration file, or the PGDATESTYLE environment variable on the server or client.
See section "Date Order Conventions":
https://www.postgresql.org/docs/current/static/datatype-datetime.html
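For example, a minimal check (whether the dotted form is accepted can vary, but DMY ordering is what makes 18.10.2017 unambiguous):

show datestyle;                  -- e.g. ISO, MDY
SET datestyle = 'ISO, DMY';
SELECT '18.10.2017'::date;       -- 2017-10-18 with DMY ordering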

Keep leading zeros when joining data sources in tableau

I am trying to create a data source in Tableau (10.0) where I am joining a table from SQL with an Excel file. The join happens on a site id, but when reading the id from the Excel source, Tableau strips the leading zeros (while SQL keeps them). I see this example
to add the leading zeros back as a new calculated field, but the join is still dropping rows because the id is not properly formatted when making the join.
How do I get the excel data source to read the column with the leading zeros so I can do the join?
Launch Excel and choose to open a new blank workbook.
Click the Data tab and select From Text.
Browse to the saved CSV file and select Import.
Ensure that Delimited is selected and click Next.
Leave Tab as the delimiter and click Next.
Select the column containing the data with leading zeros and click Text.
Repeat for each column which contains leading zeros.
Click Finish.
Click OK.
Never heard of or used Tableau, but it sounds as though something (the Jet/ACE database driver being used to read the Excel file?) is determining the column to be numeric and parsing the data as numbers, losing the leading zeroes.
If your attempts at putting them back are giving you grief, I'd recommend trying the other direction instead: get SQL Server to convert its strings to numbers. Number matching should be more reliable than string matching, so long as the two systems don't handle rounding differently :)
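For instance, a rough sketch on the SQL Server side (table and column names are made up):

-- Convert the zero-padded string id to a number so it matches Excel's numeric ids:
SELECT CAST(site_id AS int) AS site_id_num
FROM dbo.sites;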
If your Excel file was read in from a CSV and the Site ID is showing "Number Stored as Text", I think you can solve your problem by telling Tableau on the Data Source entry that the field is actually a string. On the preview data source view, change the "#" (designating number) to string so that both the SQL source and the Excel source are both strings before doing the join.
This typically has to do with the way Excel stores values, as mentioned above. I would play around with the number formatting for the Site ID column in Excel itself, not Tableau, and change it to "Text" in Excel. You can verify whether Tableau will read it properly with the leading 0s by exporting your Excel file to csv and looking in the csv file to see if the leading 0s are still there.

How to increase display length in pg admin tool [duplicate]

This question already has answers here:
pgAdmin III Why query results are shortened?
I have a dumb problem. Basically I just upgraded from pgsql 8.4 to 9.1 and upgraded to pgAdmin 1.20.
I have some tables that have large text fields and in the previous query tool I could query a row and copy-paste the data out of it to modify. In this case, I had a table that stored queries that I could run.
Once I upgraded to the new pgAdmin version, when I use the tool and query a row to pull out the text from a field in that row, it truncates the result and ends with an ellipsis (...).
I tried figuring out how to increase the limit so it doesn't truncate after 100 characters or so, but couldn't.
Anybody have any ideas??
In pgAdmin options you can change the length of the field. Do the following:
Go to:
File > Options > Query Tool > Max. characters per column
By default it is 256; you can increase it as needed.
Hope this helps
Marlon Abeykoon's answer is good, but if you want a one-off output and don't want to change settings, then simply output to a file (two buttons along from the usual green 'go' arrow). This saves the entire output in a csv file.