I am working in Talend and have managed to get data from a CSV file and filter it before loading it into a table. The problem I am having now has to do with the length of the data.
This is the red text I get:
For input string: "null"
For input string: "null"
For input string: "null"
Data truncation: Data too long for column 'Street' at row 58
For input string: "null"
For input string: "null"
For input string: "null"
For input string: "null"
For input string: "null"
Data truncation: Data too long for column 'Street' at row 23
...
etc.
I have changed the length of the field in my schema so that the change is propagated everywhere. I really don't know what to do to resolve this.
As pointed out above, this problem occurs because the output data is longer than the maximum length defined for the column in the database table. Check the table definition and alter it if possible.
Another solution is to output a shortened version of the data using an expression like row1.column.substring(0,50) in the output flow.
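If altering the table is an option, widening the offending column is usually the cleaner fix. A minimal sketch, assuming a MySQL-style target (the "Data truncation" wording is MySQL's) and a hypothetical table name:

-- Widen the Street column so longer values fit (table name is hypothetical)
ALTER TABLE address_table MODIFY COLUMN Street VARCHAR(255);

After that, re-sync the schema in Talend so the new length is propagated to the output component.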
Related
I have a JSON file that contains some empty strings.
I replaced them with null using a regex expression in ReplaceText.
Then I transformed the JSON to CSV using ConvertRecord.
The final step was loading the CSV into the Postgres database.
The result I got was empty strings in the database, but I need a NULL value instead.
Does anyone have an idea that can help me solve this issue?
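For context, Postgres stores an empty string and NULL as distinct values, so a field that arrives as "" in the CSV is loaded as an empty string rather than NULL. One possible post-load cleanup, sketched with hypothetical table and column names:

-- Replace empty strings with NULL after the load (names are hypothetical)
UPDATE my_table SET my_column = NULL WHERE my_column = '';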
I am trying to extract the value stored after the key threeDSEci, i.e. "01". I am unable to understand what these s:5/s:2 etc. mean here.
Sample column value:
a:8:{s:18:"additional_charges";N;s:12:"salt_version";s:7:"NEWSALT";s:5:"sbemi";N;s:2:"si";s:1:"0";s:15:"mcpBaseCurrency";s:3:"INR";s:10:"threeDSEci";s:2:"01";s:15:"threeDSEnrolled";s:1:"Y";s:13:"threeDSStatus";s:1:"A";}"
I am using a .CSV file with SSIS 2008.
Source: .CSV
Destination: Database
SSIS sees all the columns in my .CSV as strings.
But I need to push one column into a database column of type INT.
Nothing seems to work, though. I have placed a Data Conversion component in my SSIS package between the source and the destination, and none of the integer data types work.
Error message:
The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data."
Any input on what's going on here? It does not even work using the Wizard.
Edit: Sample data as requested:
Name|ID
Mike|1266
John|NULL
OK... your issue is the fact that you have the literal value "NULL" in your CSV. These fields should be left blank. By having "NULL" there, SSIS is trying to convert the string "NULL" to an int, which it cannot do.
Just make sure all "NULL" values are blank (a zero-length string).
That should sort you out.
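With the sample above, the corrected file would simply leave those fields empty, for example:

Name|ID
Mike|1266
John|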
I run this:
db2 "IMPORT FROM C:\my.csv OF DEL MODIFIED BY COLDEL, LOBSINFILE DATEFORMAT=\"D/MM/YYYY\" SKIPCOUNT 1 REPLACE INTO scratch.table_name"
However, some of my rows have an empty date field, so I get this error:
SQL3191N which begins with """" does not match the user specified DATEFORMAT, TIMEFORMAT, or TIMESTAMPFORMAT. The row will be rejected.
My CSV file looks like this:
"XX","25/10/1985"
"YY",""
"ZZ","25/10/1985"
I realise that if I inserted a character instead of a blank string I could use the NULL INDICATORS parameter.
However, I do not have access to change the CSV file. Is there a way to import a blank string as a null?
This is an error in your input file. DB2 differentiates between a NULL and a zero-length string. If you need to have NULL dates, a NULL would have no quotes at all, like:
"AA",
If you can't change the format of the input file, you have 2 options:
Insert your data into a staging table (changing the DATE column to a char) and then use SQL to populate the ultimate target table.
Write a program to parse ("fix") the input file and then import the resulting fixed data. You can often do this without having to write the entire file out to disk – your program could write to a named pipe, and the DB2 IMPORT (and LOAD) utility is capable of reading from named pipes.
I'm not aware of anything. Yes, ideally that date field should be null.
Probably the best thing to do would be to load the data into a scratch/temp table where that isn't a date column - just leave it as character data (it looks like you're already using a scratch table anyway). It should be trivial after that to use a CASE statement to transform the information into a null date if the value is blank, when doing your INSERT into the real table.
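A minimal sketch of that approach, assuming a DB2 version with TO_DATE/TIMESTAMP_FORMAT available; the staging and target table/column names below are hypothetical:

-- Staging table holds the date as character data; convert blanks to NULL on insert
INSERT INTO scratch.target_table (code, event_date)
SELECT code,
       CASE WHEN date_txt = '' OR date_txt IS NULL THEN NULL
            ELSE DATE(TO_DATE(date_txt, 'DD/MM/YYYY'))
       END
FROM scratch.staging_table;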
Has anyone had experience with DB2 stored procedures that take a CLOB input parameter, and with calling such a stored procedure from BizTalk?
I've tried changing the schema type to string, base64binary, hexbinary, and byte, but no matter what, I get this error:
Error details: The parameter value for parameter 1 could not be converted to a native data type. Parameter Name: P_EML_BODY, Data Type: Long strings of input text More long strings of input text More long strings of input text, Value : CharForBit
It might be the way the parameters are being created and in what order. Take a look here. Are any of them null or empty?