Importing data to SQL from Excel using Talend

I am trying to import data to SQL from Excel. I have created a successful connection with the database, but when I try to retrieve the schema I do not get my table; instead I get the schema of the database itself (type CATALOG).
How do I get the schema of the table to which I will export the Excel data?
I have referred to this video to do the import.
http://www.youtube.com/watch?v=JDBYU9f1p-I

What you can use is a tFileInputExcel component to read the sheet, map what you need with tMap, and send the result to the t[DB]Output component for your database.
http://www.talendbyexample.com/talend-tdbinput-reference.html

Related

How to transform data type in Azure Data Factory

I would like to copy data from a local CSV file to SQL Server using Azure Data Factory. The table in SQL Server is already created. The local CSV file was exported from MySQL.
When I use Copy Data in Azure Data Factory, I get the error: "Exception occurred when converting value 'NULL' for column name 'deleted' from type 'String' to type 'DateTime'. The string was not recognized as a valid DateTime."
What I have done:
I checked that the original value in the column named 'deleted' is NULL, without quotes (i.e. not the string 'NULL').
I cannot change the data type during the file format settings. The data type for all columns is preset to string by default.
I tried to create a data flow instead of Copy Data. I can change the data type in the source projection, but I cannot select SQL Server as the sink dataset.
What can I do to copy data from the CSV file to SQL Server via Azure Data Factory?
Data Flow doesn't support on-premises SQL Server, so you can't create the source and sink there.
You can use the Copy activity or the Copy Data tool to do that. I made some example data in which the deleted column is NULL.
As you said, the deleted column is NULL, and everything in the CSV is treated as a string. The key is whether your sink SQL Server table schema allows NULL in that column.
I tested this many times and it all works well.
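For illustration, here is a minimal sketch of what the sink table definition could look like on SQL Server. The table name and the extra columns are hypothetical; only the column name deleted and its DateTime type come from the error above. The key point is that deleted is declared nullable, so the copy can write the NULL values coming from the CSV.

-- Hypothetical sink table; only the "deleted" column is taken from the question.
CREATE TABLE dbo.my_table (
    id      INT           NOT NULL PRIMARY KEY,
    name    NVARCHAR(100) NULL,
    deleted DATETIME      NULL   -- must allow NULL for the values exported from MySQL
);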

Can't use Data Explorer as a sink in Data Flow

I'm trying to do a Data Flow using ADL1 as the source and Data Explorer as the sink; I can create the source, but when I select Dataset for the Sink Type, the only available options in the Dataset pulldown are my ADL1 datasets. If I use Copy Data instead, I can choose Data Explorer as a sink, but this won't work because Copy Data won't allow null values into Data Explorer number data types. Any insight on how to fix this?
I figured out a workaround. First I use Copy Data to load the CSV file into a staging table where all columns are strings. Then I Copy Data from the staging table to the production table using a KQL query that converts the strings to their destination data types.

How to create a star schema from CSV files of different schemas using Talend ETL

I need to apply an ETL process to many CSV files having different schemas, to load them in the end into a single star schema.
How can I do that using the Talend ETL tool?
First, import the CSV file into Talend. Create a job to move the CSV data to the database. In the output component (for Oracle, tOracleOutput) you have an option to create the table if it does not exist. Likewise you can import all the CSV files and create the schema in the database.
Thank you
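As a rough illustration of the target, here is a hedged sketch of a small star schema that the individual CSV jobs (for example tFileInputDelimited -> tMap -> tOracleOutput) could populate; all table and column names are made up for the example.

-- Hypothetical dimension and fact tables for a star schema.
CREATE TABLE dim_customer (
    customer_key  INT PRIMARY KEY,
    customer_name VARCHAR(100)
);

CREATE TABLE dim_product (
    product_key  INT PRIMARY KEY,
    product_name VARCHAR(100)
);

CREATE TABLE fact_sales (
    sale_id      INT PRIMARY KEY,
    customer_key INT REFERENCES dim_customer (customer_key),
    product_key  INT REFERENCES dim_product (product_key),
    quantity     INT,
    amount       DECIMAL(12, 2)
);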

Lift load date format issue from CSV file

We are migrating DB2 data to Db2 on Cloud. We are using the below lift CLI operations for the migration:
Extracting a database table to a CSV file using lift extract from the source database.
Then loading the extracted CSV file to Db2 on Cloud using lift load.
ISSUE:
We have created some tables using DDL on the target Db2 on Cloud which have some columns with data type TIMESTAMP.
During the load operation (lift load), we are getting the below error:
"MESSAGE": "The field in row \"2\", column \"8\" which begins with
\"\"2018-08-08-04.35.58.597660\"\" does not match the user specified
DATEFORMAT, TIMEFORMAT, or TIMESTAMPFORMAT. The row will be
rejected.", "SQLCODE": "SQL3191W"
If you use DB2 as the source database, then use either:
the following property during export (to export dates, times, and timestamps as usual for DB2 utilities, without double quotes):
source-database-type=db2
or the following property during load, if you have already exported timestamps surrounded by double quotes:
timestamp-format="YYYY-MM-DD-HH24.MI.SS.FFFFFF"
If the data was extracted using lift extract, then you should definitely load it with source-database-type=db2. Using this parameter preconfigures all the necessary load details automatically.

Import a CSV file into a new table dynamically in MySQL Workbench

I can import CSV file data into an existing table in MySQL Workbench using LOAD DATA INFILE, but if I have a file with 25 columns, it becomes a pain to create the structure for such a table before importing.
Is there a way to import CSV files without creating the structure first, like PROC IMPORT in SAS?
Yes, there is. Try the new 6.3 release (currently in RC, soon to be GA), which comes with a new table data import/export feature that supports CSV and JSON data. It creates the table on the fly during import.
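For comparison, this is roughly what the LOAD DATA INFILE route mentioned in the question looks like; the file path, table name, and delimiter settings are hypothetical, and the target table must already exist with all 25 columns defined, which is exactly the step the 6.3 import wizard removes.

-- Hypothetical example: load a CSV into an existing table, skipping the header row.
LOAD DATA INFILE '/tmp/my_data.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;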