Decimals in DB2

I use iNavigator to query data from DB2. During calculations, is there a way to get the output in the following format? Currently the output is exported to Excel and the decimals are assigned there.
1235*3.24 = 4,001.40

This resolves the issue; the format string adds the grouping commas:
VARCHAR_FORMAT( SUM(BAL_BOOK_AMT), '999,999,999,999.99') AS SUM
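For example, in a complete query (a sketch; the ACCOUNTS table name is hypothetical):
SELECT VARCHAR_FORMAT(SUM(BAL_BOOK_AMT), '999,999,999,999.99') AS SUM_BAL
FROM ACCOUNTS
Note that VARCHAR_FORMAT returns a character string, so the commas survive the export to Excel as-is.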

Related

Import CSV file into PostgreSQL while rounding from decimal to integer

I am loading a 10 GB CSV file into an AWS Aurora PostgreSQL database. This file has a few fields where the values are decimals within +/- 0.1 of a whole number, but in reality they are supposed to be integers. When I loaded this data into Oracle using SQLLDR, I was able to round the fields from decimals to integers. I would like to do the same in the PostgreSQL database using the \copy command, but I can't find any options that allow this.
Is there a way to import this data and round the values during a \copy, without going through a multistep process like creating a temporary table?
There doesn't seem to be a built-in way to do this, as there is in some other database tools.
I didn't use an external program as suggested in the comments, but I did preprocess the data with an awk script that read each line and reformatted the incorrect field with the printf function, rounding the output with the "%.0f" parameter.
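For reference, the staging-table route the question hoped to avoid would look roughly like this (a sketch; the table and column names are hypothetical):
CREATE TEMP TABLE staging (id bigint, qty numeric);
\copy staging FROM 'data.csv' WITH (FORMAT csv)
INSERT INTO measurements (id, qty)
SELECT id, round(qty)::integer FROM staging;
Since the values are within 0.1 of a whole number, any standard rounding rule produces the same integer here.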

Crystal Reports Export to .csv comma delimited string inserting blank columns

I have an issue and need some expert help! I'm trying to export a .csv file directly from Crystal Reports, and I keep getting blank columns in between my datasets. The report is a single formula in the details section that contains a comma-separated string. Any help or suggestions are greatly appreciated!
*Side note: the requirement is an export directly to .csv, so exporting to "data only" and then saving as .csv won't work.
Your approach implicitly converts numbers to text. This might result in extra commas due to thousands separators.
Instead, use explicit conversion such as
ToText({your number}, 2, "")
to avoid the extra commas. The second argument is the number of decimal places; the empty string as the third argument suppresses the thousands separator.

How to export large numbers from HeidiSQL to CSV

I use HeidiSQL to manage my database.
When I export grid rows to a CSV file, the large number 89610185002145111111 becomes 8.96102E+19.
How can I keep the number without the scientific-notation conversion?
HeidiSQL does not do such a conversion. I tried to reproduce it, but I get the unformatted number:
id;name
89610185002145111111;hey
That is in a text editor, by the way. If you open the file in Excel, you may have to format the cells as text first: Excel only keeps 15 significant digits of a number and displays large values in scientific notation, so the conversion happens in Excel, not in HeidiSQL.

DataStage decimal separator, how can I modify it?

I am using DataStage to generate a CSV file from a Teradata source. When I modify the job properties and set comma as the decimal separator in the locale categories, nothing changes. What is the correct way to make this change?
My issue: the data source is Teradata, and the job needs to transform the data and output a CSV file, but the data has more than 8 decimal fields. My DataStage configuration has a point as the decimal separator, and I need a comma. In Teradata, stacking more than 8 OREPLACEs gives back a row-over-size error.
Solution: cast the source fields to VARCHAR, then in a DataStage Transformer use the Change function to replace the point with a comma, and output the fields as VARCHAR with the correct decimal separator.
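A sketch of the two pieces (hypothetical table and column names). On the Teradata side only the cast is done, which avoids the stack of OREPLACEs that triggered the row-size error:
SELECT CAST(amount AS VARCHAR(32)) AS amount_txt
FROM source_table;
In the DataStage Transformer, a derivation along the lines of Change(lnk.amount_txt, '.', ',') then swaps the point for a comma before the field is written to the CSV.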

SAP HANA Decimal to timestamp or seconddate SLT

I am using SLT to load tables into our HANA DB. SLT uses the ABAP dictionary and sends timestamps as DECIMAL (15,0) to the HANA DB. Once they are in the HANA DB, I am trying to convert the decimals to timestamps or seconddates via a calculated column in a calculation view. The timestamp columns are columns 27-30 of the table.
I run a small SLT transformation to populate columns 27-30; the ABAP layer in SLT populates the columns based on the database transactions.
The problem comes when I try to convert columns 28-30 to timestamps or seconddates, using syntax like this:
Select to_timestamp(DELETE_TIME)
FROM SLT_REP.AUSP
Select to_seconddate(DELETE_TIME)
FROM SLT_REP.AUSP
I get errors from these statements; the odd part is that they sometimes work as well. Using the same syntax in a calculated column also fails, with an error from the calculation view.
Has anyone found a good way to convert ABAP timestamps (Decimal (15,0)) to Timestamp or Seconddate in HANA?
There are conversion functions available that you can use here (unfortunately they are not very well documented).
select tstmp_to_seconddate(TO_DECIMAL(20110518082403, 15, 0)) from dummy;
TSTMP_TO_SECONDDATE(TO_DECIMAL(20110518082403,15,0))
2011-05-18 08:24:03.0
The problem was with the ABAP data type. I was declaring the target variable as DEC(15,0). The ABAP code extracting the data was, in some instances, rounding the timestamp up to the 60th second. Once in the target HANA system, to_timestamp(target_field) would come back invalid when a time looked like "20150101121060", the last two digits being the 60th second; this is invalid and would fail. The base HANA layer did not care, as it was merely putting a length-14 value into a field. I changed the source variable to DEC(21,0). This eliminated the ABAP rounding and fixed my problem.
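The failure mode is easy to reproduce directly; a minimal sketch, using one of the rounded values described above with an explicit format string:
select to_timestamp('20150101121060', 'YYYYMMDDHH24MISS') from dummy;
This fails because a seconds value of 60 is invalid, while the same call with '20150101121059' succeeds. Widening the ABAP source variable to DEC(21,0) stops the rounding from producing such values in the first place.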