How to change date format in Talend tXMLMap component

Unable to parse the date with parseDate.
This is the error I am getting in the Talend tXMLMap component.
How can I parse the date from 2022-11-28T02:24:20.755-05:00 to 2022-11-28 02:24:20 in the tXMLMap component in Talend?
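One way to do this, assuming the value arrives as a String, is to parse it with the source pattern and re-format it with the target pattern. The following is a minimal plain-Java sketch of that conversion; the class and variable names are placeholders, not taken from the question.

import java.text.SimpleDateFormat;
import java.util.Date;

public class IsoToPlainDate {
    public static void main(String[] args) throws Exception {
        String isoDate = "2022-11-28T02:24:20.755-05:00";

        // Parse the ISO-8601 value; XXX handles the -05:00 offset on Java 7+.
        SimpleDateFormat in = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSXXX");
        Date parsed = in.parse(isoDate);

        // Re-format without the milliseconds and the offset.
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        // Prints 2022-11-28 02:24:20 when the JVM timezone is -05:00;
        // otherwise the time is shifted to the local zone.
        System.out.println(out.format(parsed));
    }
}

In a tXMLMap output expression the same idea can usually be written with Talend's date routines, for example TalendDate.formatDate("yyyy-MM-dd HH:mm:ss", TalendDate.parseDate("yyyy-MM-dd'T'HH:mm:ss.SSSXXX", row1.isoDate)), where row1.isoDate stands in for the actual input column.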

Related

How to speed up writing into Impala from Talend

I'm using Talend Open Studio for Big Data (7.3.1), and I write files from various sources to Cloudera Impala (Cloudera QuickStart 5.13) but that takes too much time and writes only ~3300 rows/s (take a look at the pictures).
Is there a way to raise the write rate to ~10000-100000 rows/s or even greater?
Am I using the wrong approach for the load?
Or do I need to configure Impala/Talend better?
Any advice is welcome!
UPDATE
I installed the Impala JDBC driver:
But OutputFile looks like it is not configured for Impala:
Error:
Exception in component tDBOutput_1 (db_2_impala)
org.talend.components.api.exception.ComponentException: UNEXPECTED_EXCEPTION:{message=[Cloudera]ImpalaJDBCDriver ERROR processing query/statement. Error Code: 0, SQL state: TStatus(statusCode:ERROR_STATUS, sqlState:HY000, errorMessage:AnalysisException: Impala does not support modifying a non-Kudu table: algebra_db.source_data_textfile_2
), Query: DELETE FROM algebra_db.source_data_textfile_2.} at org.talend.components.jdbc.CommonUtils.newComponentException(CommonUtils.java:583)
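The AnalysisException itself comes from the DELETE FROM statement: Impala only supports DELETE on Kudu tables, so a "clear table" style action in tDBOutput fails against a plain text-file table. On the throughput side, row-by-row INSERTs are slow in Impala because each statement writes its own small data file; grouping many rows per statement, or bulk loading files into HDFS and using LOAD DATA, is generally faster. Below is a rough plain-JDBC sketch of the idea; the connection URL, table and values are placeholders, not taken from the job.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ImpalaLoadSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL: adjust host, port (21050) and database.
        // Depending on the driver version, Class.forName("com.cloudera.impala.jdbc41.Driver")
        // may be needed before connecting.
        String url = "jdbc:impala://quickstart.cloudera:21050/algebra_db";

        try (Connection conn = DriverManager.getConnection(url);
             Statement st = conn.createStatement()) {

            // This is what fails in the job above: DELETE is only valid on Kudu tables.
            // st.executeUpdate("DELETE FROM source_data_textfile_2");

            // Many rows per INSERT statement instead of one statement per row.
            st.executeUpdate(
                "INSERT INTO source_data_textfile_2 VALUES (1, 'a'), (2, 'b'), (3, 'c')");
        }
    }
}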

Issues while using the Snowflake component in Talend

To transfer data from MS SQL Server 2008 to Snowflake I used Talend, but every time I get an error such as:
java.io.IOException: net.snowflake.client.loader.Loader$ConnectionError: State: CREATE_TEMP_TABLE, SQL compilation error: error line 1 at position 68
invalid identifier '"columnname"'
at org.talend.components.snowflake.runtime.SnowflakeWriter.close(SnowflakeWriter.java:397)
at org.talend.components.snowflake.runtime.SnowflakeWriter.close(SnowflakeWriter.java:52)
at local_project.load_jobnotes_0_1.Load_Jobnotes.tMSSqlInput_1Process(Load_Jobnotes.java:2684)
at local_project.load_jobnotes_0_1.Load_Jobnotes.runJobInTOS(Load_Jobnotes.java:3435)
at local_project.load_jobnotes_0_1.Load_Jobnotes.main(Load_Jobnotes.java:2978)
Caused by: net.snowflake.client.loader.Loader$ConnectionError: State: CREATE_TEMP_TABLE, SQL compilation error: error line 1 at position 68
invalid identifier '"ID"'
at net.snowflake.client.loader.ProcessQueue.run(ProcessQueue.java:349)
at java.lang.Thread.run(Thread.java:748)
Caused by: net.snowflake.client.jdbc.SnowflakeSQLException: SQL compilation error: error line 1 at position 68
The column does exist in my Snowflake DB, yet I still get an error saying the column does not exist.
On analysing what query Talend executes in Snowflake, I found that it tries to create a temporary table to store the data, but in doing so it selects all columns from the table wrapped in double quotes, hence the error invalid identifier '"columnname"'.
If I execute the same query manually without double quotes it works fine. Can you please let us know what the workaround for this issue is?
The query executed by Talend in Snowflake, for your reference:
CREATE TEMPORARY TABLE "Tablename_20171024_115736_814_1"
AS SELECT "column1","column2","column3"
FROM "database"."schema"."table" WHERE FALSE
The issue is most likely due to a case mismatch between the object names in Snowflake and what is being sent through the connector. On the Snowflake side, all object names are stored as UPPER CASE. Suggest you try passing COLUMN1, COLUMN2, etc and see if that works.
You can also try setting QUOTED_IDENTIFIERS_IGNORE_CASE to true; it might help.
I found that this issue is due to mixed-case database or schema names not being properly applied by Talend. I discovered a hack: updating the Snowflake connector role parameter and adding something like in this screenshot:
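For reference, quoted identifiers in Snowflake are case-sensitive, while unquoted ones are stored upper-case, which is why the generated "columnname" does not match a column created as COLUMNNAME. QUOTED_IDENTIFIERS_IGNORE_CASE can be set at account, user or session level; for a Talend job it has to apply to the session the connector opens, so setting it on the loading user is one option. A rough sketch through the Snowflake JDBC driver follows; account, credentials and user names are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.Properties;

public class SnowflakeIgnoreCaseSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical account and credentials.
        String url = "jdbc:snowflake://myaccount.snowflakecomputing.com/";
        Properties props = new Properties();
        props.put("user", "ADMIN_USER");
        props.put("password", "***");

        try (Connection conn = DriverManager.getConnection(url, props);
             Statement st = conn.createStatement()) {

            // Applies only to the current session:
            st.execute("ALTER SESSION SET QUOTED_IDENTIFIERS_IGNORE_CASE = TRUE");

            // Applies to every session the Talend loading user opens (hypothetical user name):
            st.execute("ALTER USER TALEND_USER SET QUOTED_IDENTIFIERS_IGNORE_CASE = TRUE");
        }
    }
}

Note that this only helps when the underlying objects were created with unquoted (upper-case) names; if they were created with quoted mixed-case names, ignoring case would stop those from resolving.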

Error while trying to create a new .edmx file - VS 2013, Entity Framework 5.0

As per my requirement, I need to create an .edmx file and establish a connection to a SQL Server that resides on a remote server. I am using VS 2013 and, with the wizard model, I am trying to connect to the DB.
But it is throwing the error:
What am I missing? Do I need to set any settings or install something?
Firstly I was getting the SQL CLR Types error, so I installed that, and the other error was the SharedManagementObjects error, so I installed that MSI as well.
My SQL Server is SQL Server 2012 STD - SP2, version number 11.0.5058.
Loading metadata from the database took 00:00:03.2592313.
Generating the model took 00:00:03.1761556.
Could not save the XML to the configuration file 'D:\PoCSolutions\EDMXTEST_WINFORMS1\EDMXTEST_WINFORMS1\App.config' because of the error 'Access to the path 'D:\PoCSolutions\EDMXTEST_WINFORMS1\EDMXTEST_WINFORMS1\App.config' is denied.'.
Unable to update the App.Config file because of the following exception: 'Access to the path 'D:\PoCSolutions\EDMXTEST_WINFORMS1\EDMXTEST_WINFORMS1\App.config' is denied.'
Writing the .edmx file took 00:00:00.0009981.
I am getting this error when trying to create a new .edmx file using VS 2013 with Entity Framework 5.0

How to convert an ISO date to UTC in Talend

I am getting
"Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main""
in Talend for Big Data, version 6.2.2, while trying to convert a string formatted in UTC time format. The error occurred during the conversion of the UTC string to a long.
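For the conversion itself, a minimal sketch of turning a UTC-formatted string into a long (epoch milliseconds) in plain Java is shown below; the input value and pattern are assumptions and must match the actual string. The JNI error reported alongside it usually points at a Java version or classpath problem when launching the job rather than at the conversion code itself.

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class UtcStringToLong {
    public static void main(String[] args) throws Exception {
        // Hypothetical input; the pattern has to match the incoming string exactly.
        String utcString = "2016-07-01T10:15:30Z";

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // interpret the value as UTC

        Date parsed = fmt.parse(utcString);
        long epochMillis = parsed.getTime(); // milliseconds since 1970-01-01T00:00:00Z
        System.out.println(epochMillis);
    }
}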

Solr 4.5 not saving time correctly

I have defined a date field in Solr, and I'm using DIH to populate its value from the DB into Solr. The InsertTs value in Solr always stores either 4:00:00 or 5:00:00, but the date part is stored properly.
Solr Value: 2013-11-07T05:00:00Z or 2015-05-13T04:00:00Z
DB Value: 07-11-13 02:29:53.00 PM or 07-11-13 12:00:00.00 AM
Schema.xml: INSERTTS is defined as type "date"
DIH: name="INSERTTS" column="INSERTTS"
DIH Query:
SELECT TO_DATE(TO_CHAR(INSERTTS, 'yyyy-mm-dd hh24:mi:ss'), 'yyyy-mm-dd hh24:mi:ss') AS INSERTTS FROM EMPLOYEE
InsertTs is defined as a TimeStamp in the DB.
Solr is running on a Tomcat server on a Linux machine. The Linux machine is in the EDT timezone.
The DB is Oracle 11g and is in the UTC timezone.
The issue was with the JDBC driver: it was not fetching the time part from the date field.
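One common way a JDBC driver "loses" the time part is when the column is read as java.sql.Date (date only) instead of java.sql.Timestamp. The sketch below illustrates the difference against the query from the question; the connection string and credentials are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class InsertTsReadSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle connection string and credentials.
        String url = "jdbc:oracle:thin:@dbhost:1521/ORCL";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT INSERTTS FROM EMPLOYEE")) {

            while (rs.next()) {
                // java.sql.Date carries only the date portion, so reading the
                // column this way silently drops the time of day.
                System.out.println(rs.getDate("INSERTTS"));

                // java.sql.Timestamp keeps date, time and fractional seconds.
                System.out.println(rs.getTimestamp("INSERTTS"));
            }
        }
    }
}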