DataStage decimal separator: how can I modify it? - datastage

I am using DataStage to generate a CSV file from a Teradata source. When I modify the job properties and set the comma as the decimal separator in the locale categories, nothing changes. What is the correct way to make this change?

My issue: the data source is Teradata, and the job needs to transform the data and write out a CSV file, but some fields have more than 8 decimal places. My DataStage configuration has the point as the decimal separator and I need the comma. In Teradata, chaining more than 8 OREPLACE calls gives back a row size overflow error.
Solution: take the source fields and cast them to VARCHAR, then in a DataStage Transformer use the Change function to replace the point with a comma, and write out the fields as VARCHAR with the correct decimal separator.
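A minimal sketch of that approach, with hypothetical table, link, and column names: cast on the Teradata side, then swap the separator in a Transformer derivation.

SELECT CAST(amount AS VARCHAR(32)) AS amount_str FROM mydb.mytable;

Transformer derivation for the output VarChar column (Change replaces every occurrence of the second argument with the third):

Change(lnk_in.amount_str, ".", ",")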

Related

Keep leading zeros when joining data sources in Tableau

I am trying to create a data source in Tableau (10.0) where I am joining a SQL table with an Excel file. The join happens on a site id, but when reading the id from the Excel source, Tableau strips the leading zeros (while SQL keeps them). I have seen an example of adding the leading zeros back as a new calculated field, but the join still drops rows because the id is not properly formatted when the join is made.
How do I get the Excel data source to read the column with the leading zeros so I can do the join?
Launch Excel and choose to open a new blank workbook.
Click the Data tab and select From Text.
Browse to the saved CSV file and select Import.
Ensure that Delimited is selected and click Next.
Leave Tab as the delimiter and click Next.
Select the column containing the data with leading zeros and click Text.
Repeat for each column which contains leading zeros.
Click Finish.
Click OK.
Never heard of or used Tableau, but it sounds as though something (the Jet/ACE database driver being used to read the Excel file?) is determining the column to be numeric and parsing the data as numbers, losing the leading zeroes.
If your attempts at putting them back are giving you grief, I'd recommend trying the other direction instead: get SQL Server to convert its strings to numbers. Number matching should be more reliable than string matching, so long as the two systems don't handle rounding differently :)
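For example, a hedged sketch of the SQL Server side (table and column names are hypothetical; TRY_CAST returns NULL rather than erroring on non-numeric ids):

SELECT TRY_CAST(site_id AS INT) AS site_id_num FROM dbo.sites;

Joining on site_id_num against Excel's numeric id sidesteps the leading-zero problem entirely.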
If your Excel file was read in from a CSV and the Site ID shows as "Number Stored as Text", I think you can solve your problem by telling Tableau in the data source entry that the field is actually a string. In the data source preview, change the "#" (designating a number) to string so that both the SQL source and the Excel source are strings before doing the join.
This typically has to do with the way Excel stores values, as mentioned above. I would play around with the number formatting for the Site ID column in Excel itself, not Tableau, and change it to "Text" in Excel. You can verify whether Tableau will read it properly with the leading 0s by exporting your Excel file to CSV and checking whether the leading 0s are still there in the CSV.

sqlldr test for number format

I am loading data into Oracle 12c using sqlldr using a CTL file as below :
OPTIONS (rows=1000, bindsize=100000, readsize=100000, silent=(header,feedback))
load data
CHARACTERSET UTF8
insert into table TABLEA
fields terminated by '^' optionally enclosed by ','
trailing nullcols
(
NAME,
VOLUME "decode(:VOLUME,null,0,to_number(:VOLUME,'9999999999D999'))",
TEXT
)
I am facing difficulty when the number field VOLUME, defined in the table as NUMBER(13,3), comes in different formats:
ABCD^1089.830^CIQ
ABCD^1,089.830^CIQ
ABCD^1.089,830^CIQ
Is there a way to load all three formats of the number in field 2 above using sqlldr? The expected value in the table is 1089.830 in all three cases.
Thanks.
This is a tad ugly, but it should work. It assumes your volume data will always have 3 decimal digits and that the decimal symbol will be a period (based on your number format). The first pass removes all region-specific separator characters, then the second pass puts the period back in, 3 places from the end:
...
VOLUME decimal external "regexp_replace(regexp_replace(:VOLUME, '[\.,]', ''), '([0-9]+)([0-9]{3})', '\\1.\\2')",
...
You may not need the "decimal external"; try it without and see.
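You can preview the transformation in plain SQL before running the load, for instance against one of the sample values:

SELECT regexp_replace(regexp_replace('1.089,830', '[.,]', ''), '([0-9]+)([0-9]{3})', '\1.\2') AS volume FROM dual;

This returns 1089.830, matching the expected value (the backslashes are single here; they are doubled in the control file only because of its quoting rules).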

Microsoft Word Mail Merge - Percentage format when cell contains both percent and text

I am performing a mail merge and have an issue when trying to correct the percentage format. The problem is that the source column contains both percent values and text. If I map the field, percents display as decimals in Word. If I use the following, it displays correctly:
{=«Percent»*100 \# 0%}
However, when the row contains text I now receive an error.
Is there another way I can do this?
Here is the formula you need:
{={MERGEFIELD XYZ}*100 \# 0.00%}
No, Word has no way to do string manipulation in its fields. Add another field/column to your data source for the text, or format the percentage in Excel before performing the merge.
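For instance, a helper column in the Excel source could render numbers as percent text and pass text values through unchanged (the cell reference is hypothetical):

=IF(ISNUMBER(A2), TEXT(A2, "0.00%"), A2)

You would then merge the helper column with a plain MERGEFIELD and drop the formula field entirely.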

PostgreSQL: Execute query, write results to CSV file - money datatype gets broken into two columns because of the comma

After running "Execute query, write results to file", the money columns in my output file get broken in two. E.g., if my revenue is $500 it is displayed correctly, but if my revenue is $1,500.00 there is an issue: it gets broken into two columns, $1 and $500.00.
Can you please help me get my results into a single CSV column for the money datatype?
What is this command, "execute query write results to file"? Do you mean COPY? If so, have a look at the FORCE QUOTE option: http://www.postgresql.org/docs/current/static/sql-copy.html
Eg.
COPY yourtable to '/some/path/and/file.csv' CSV HEADER FORCE QUOTE *;
Note: if the application consuming the CSV files still fails because of the comma, you can change the delimiter from "," to whatever works for you (e.g. "|").
Additionally, if you want TSV rather than CSV, you can omit the CSV HEADER keywords and the results will be output in tab-separated format.
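For example, a pipe-delimited variant using the modern option syntax (table and path are placeholders):

COPY yourtable TO '/some/path/and/file.csv' WITH (FORMAT csv, HEADER, DELIMITER '|', FORCE_QUOTE *);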
The comma is the list separator on computers in some regions; in other regions the semicolon is the list separator. So I think you need to replace the comma when you write it to CSV.

Converting / Casting an nVarChar with Comma Separator to Decimal

I am supporting an ETL process that transforms flat-file inputs into a SQL Server database table. The code is almost 100% T-SQL and runs inside the DB. I do not own the code and cannot change the workflow. I can only help configure the "translation" SQL that takes the file data and converts it to table data (more on this later).
Now that the disclaimers are out of the way...
One of our file providers recently changed how they represent a monetary amount, from '12345.67' to '12,345.67'. Our SQL that transforms the value looks like SELECT FLOOR( CAST([inputValue] AS DECIMAL(24,10))) and no longer works; i.e., the comma breaks the cast.
Given that I have to store the final value as the DECIMAL(24,10) datatype (yes, I realize the FLOOR wipes out all post-decimal-point precision; the designer was not in sync with the customer), what can I do to cast this string efficiently?
Thank you for your ideas.
Try using REPLACE (Transact-SQL):
SELECT REPLACE('12,345.67',',','')
OUTPUT:
12345.67
so it would be:
SELECT FLOOR( CAST(REPLACE([inputValue],',','') AS DECIMAL(24,10)))
This works for me:
DECLARE @foo NVARCHAR(100)
SET @foo='12,345.67'
SELECT FLOOR(CAST(REPLACE(@foo,',','') AS DECIMAL(24,10)))
This is probably only valid for collations/cultures where the comma is not the decimal separator (it would misbehave for, e.g., Spanish, where the comma is the decimal separator).
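As a locale-aware alternative (SQL Server 2012+), TRY_PARSE accepts an explicit culture; the literal below is illustrative:

SELECT TRY_PARSE('12.345,67' AS DECIMAL(24,10) USING 'es-ES');

This returns 12345.6700000000, treating the period as a grouping separator and the comma as the decimal point.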
While not necessarily the best approach for my situation, I wanted to leave a potential solution for future use that we uncovered while researching this problem.
It appears that the SQL Server datatype MONEY can be used as a direct cast for strings with a comma separating the non-decimal portion. So, where SELECT CAST('12,345.56' AS DECIMAL(24,10)) fails, SELECT CAST('12,345.56' AS MONEY) will succeed.
One caveat is that the MONEY datatype has a fixed scale of 4 decimal places and would require an explicit cast to get it to DECIMAL, should you need that.
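Putting the two casts together with the original expression (literal value for illustration):

SELECT FLOOR(CAST(CAST('12,345.56' AS MONEY) AS DECIMAL(24,10)));

This returns 12345, since FLOOR discards the fractional part after the comma-tolerant MONEY cast.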