I have a job that flows like this.
tAccessDatabase_1 ---> tFileOutputXML_1.
Now, my database schema has username and userid columns. My task is to send data from the database to XML files, with each file named after the username, i.e., one file has to be created for every user with his/her name.
I tried creating a context variable, but how can I set the username into that context variable from the database?
Select the distinct usernames from the table: SELECT DISTINCT username FROM table.
Use tFlowToIterate to iterate over each username (connect the table component to this component using a Main link).
Use an Iterate link to connect to a tJava component.
Assign the username to the context variable in the tJava component. For example, if the output row from the table component is row1, then context.username = row1.username.
Connect the tJava to a second table component using an 'OnComponentOk' trigger, and select data from the table based on the WHERE condition: username = '" + context.username + "'.
Write the data into the file, giving the filename as "<path>\" + context.username (a minimal sketch of these pieces follows).
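A minimal sketch of the code these steps involve, assuming a String context variable named username and the illustrative names above (the flow name row1, the users table, and the query are assumptions):

    // tJava body: tFlowToIterate publishes each column of the current row
    // into globalMap under "<flow name>.<column>", here "row1.username"
    context.username = (String) globalMap.get("row1.username");

    // Query of the second DB input component (Basic settings > Query):
    "SELECT * FROM users WHERE username = '" + context.username + "'"

    // File Name of tFileOutputXML (Basic settings > File Name):
    "<path>/" + context.username + ".xml"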
tYOURDBInput -> [row1] -> tFlowToIterate -> [iterate] -> tJava -> "globalMap.put("DESC", (String)row1.column);"
If you have just the one row, you can then pick the value up elsewhere via
(String) globalMap.get("DESC")
I used this setup to retrieve passwords for foreign systems that are stored in a table and refreshed regularly. This avoids code rebuilds every time a password changes. Naturally, do protect that table.
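An end-to-end sketch of that pattern with illustrative names (the pwd column and the EXT_SYS_PWD key are assumptions):

    // tJava body, reached via the iterate link: cache the fetched password
    // (tFlowToIterate exposes the row's columns via globalMap, e.g. "row1.pwd")
    globalMap.put("EXT_SYS_PWD", (String) globalMap.get("row1.pwd"));

    // later, e.g. in the Password field of the component that connects to
    // the foreign system, read it back instead of hard-coding it:
    (String) globalMap.get("EXT_SYS_PWD")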
I have a dataset based on a CSV file. This exposes data as follows:
Name,Age
John,23
I have an Azure SQL Server instance with a table named: [People]
This has columns
Name, Age
I am using the Copy Data activity and trying to copy data from the CSV dataset into the Azure table.
There is no option to indicate the table name for the target. Instead, I have a space to input a stored procedure name?
How does this work? Where do I put the target table name?
You should definitely have a table to write to; if you don't, something is wrong with your setup. Make sure you have a table to write to, and make sure the field names in your table match the fields in the CSV file. Then follow the steps outlined in the walkthrough below. There are several steps to click through, but they are all fairly intuitive, so follow them one by one and you should be fine.
http://normalian.hatenablog.com/entry/2017/09/04/233320
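If the target table does not exist yet, a minimal sketch of one matching the CSV above (the column types are assumptions):

    -- hypothetical target table matching the CSV header Name,Age
    CREATE TABLE [dbo].[People] (
        [Name] NVARCHAR(100) NOT NULL,
        [Age]  INT NOT NULL
    );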
You can add records into the SQL Database table directly, without stored procedures, by configuring the table name on the sink dataset rather than on the Copy activity (which is what is happening in your case).
The Table field sits within the dataset definition; that is where the target table name goes.
I'm designing a Copy Data task where the Sink SQL Server table contains an Identity column. The Copy Data task always wants me to map that column when, in my opinion, it should just not include the column in the list of columns to map. Does anyone know how I can get the ADF Copy Data task to ignore Sink Identity columns?
If you are using the Copy Data tool, and the ID column in your SQL Server table is set as an identity (auto-increment) column, then it should not show up at the mapping step. Please tell us if that is not the case.
If you are authoring the pipeline/dataset yourself, you can go to the sink dataset's Schema tab and remove the ID column, then go to the Copy activity's Mapping tab and click Import schemas again. The ID column should have disappeared.
You could run a SET IDENTITY_INSERT <table> ON statement for the given table before executing the copy step. After it completes, set it back to OFF.
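For example, assuming the sink table is [dbo].[People] with an identity column [Id] (both names are illustrative):

    -- before the copy: allow explicit values in the identity column
    SET IDENTITY_INSERT [dbo].[People] ON;

    -- the copy step then inserts rows that include the identity value, e.g.
    -- INSERT INTO [dbo].[People] ([Id], [Name], [Age]) VALUES (1, 'John', 23);

    -- after the copy completes: restore the default behaviour
    SET IDENTITY_INSERT [dbo].[People] OFF;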
I have an existing Talend Open Studio tMySQLInput component with some SQL code inside it that retrieves some joined columns, linked to a tMySQLOutput component (pointing to an already existing MySQL table) with a few records.
QUESTION:
Will the "tMySQLInput" component overwrite the already existing table data that the tMySQLOutput component relates to? I mean is there an option to check in the tMySQLInput our output in order to say, overwrite each time this job is executed ?
Thank you all.
Yes, there is an option in tMySQLOutput where you can specify what action to take on your table. Follow these steps:
Go to the Component tab of tMySQLOutput; it will open the component's Basic settings.
If you look closer you will find Action on table. This is the action performed on the table that tMySQLOutput points to; it has options such as Default, Drop and create table, etc.
Then you have Action on data. These are the operations performed on the data, such as Insert, Update, etc.
In your case I suppose you can choose Action on table = Default and Action on data = Insert. Default does nothing to the table, and Insert appends the records to the end of it. Note that with Insert, if you have duplicate rows (for example, ones violating a unique key), the job will stop the moment it hits one.
In a few solutions I've worked with, I've created temp tables or history tables. Normally I script it to take the handful of fields needed from a main table and copy them over to the other table by setting a variable, then using Set Field to write the variable into each field of the new record.
I now have a situation where I'm building a history table that needs to copy the current record as-is: a snapshot where all fields from that instance of the record are copied to the history table.
Rather than setting a variable and then Set Field for every field, I'd like some input on a quicker way to get this done at the record level, without typing it out field by field. Also, if fields are added to both tables, I have to make sure my script gets updated.
I'll keep hunting around; appreciate any help.
-Rich
Do you have a sample of copying a record from one table to another, including all fields and setting some fields?
As I suggested in comments, use the Import Records[] script step, and select the same file as the source. If you choose Arrange by: [ matching names ] in the Import Field Mapping dialog, it will automatically map all source fields to their similarly named counterparts.
Note that you must establish a found set in the source table before importing.
For "setting some fields", you can define auto-enter options and activate them during the import, or run Replace Field Contents[] immediately after the import.
I would like to use a parameter for the table name. I have an application that creates several new tables each month, so I need the table name to be passed into CR via a parameter. The fields for the tables are always identical. I can present a list (view) from the database to the end user that displays a user-friendly name for each table; when the user selects the instance they want, I then have the table name I want to report from.
I'm not sure if that is possible. Even if your tables have the same structure, the fully qualified field names will ultimately be different. Let's say you have {table1.field1} in your report. Now you want to run the report from table2 instead of table1, so your field would have to become {table2.field1}. Does that make sense? I think a better approach might be a stored procedure that returns the fields you need, so the field names never change (a rough sketch follows).
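A rough sketch of such a procedure using dynamic SQL (the procedure and column names are hypothetical; QUOTENAME guards the table-name parameter against injection):

    CREATE PROCEDURE dbo.GetMonthlyData
        @TableName SYSNAME
    AS
    BEGIN
        DECLARE @sql NVARCHAR(MAX);
        -- the column list is fixed, so the report's field names never change
        SET @sql = N'SELECT field1, field2 FROM ' + QUOTENAME(@TableName) + N';';
        EXEC sp_executesql @sql;
    END;

Crystal then binds to the procedure's fixed result set, and the user's table choice is simply passed in as the parameter value.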