How to get multiple result sets from a single tDBInput in Talend?

In Talend Open Studio, which component should I use to call my stored procedure, which returns multiple result sets?
Most components have only a single output.

Related

ADF - what's the best way to execute one from a list of Data Flow activities based on a condition

I have 20 file formats and 1 Data Flow activity that maps to each one of them. Based on the file name, I know which data flow activity to execute. Is the only way to handle this through a "Switch" activity? Is there another way? E.g. can I parameterize the data flow to execute by a variable name?
Unfortunately, there is no option to run one out of a list of data flows based on an input condition.
To perform data migration and transformation for multiple tables, you can use the same data flow and parameterize it, either by providing the table names at runtime or by using a control table to hold all the table names and calling the data flow activity inside a ForEach. In the sink settings, use the merge schema option.

Exporting the results of an Anylogic experiment

I have built my model and run the experiment, but I cannot seem to find where the data is stored.
I now need to conduct several runs and compare the results. I am using normally distributed repair times, so the results should vary between runs without modifying parameters.
How can I keep the results of each run and then present them all in the same data set?
There are two main options for getting data out of your simulation:
Using the internal AnyLogic database
Using external files like Excel or txt
Step 1: Setup your objects
Internal Database
Create an empty table with the columns you require
External object
Setup either an Excel or text file using the objects provided by AnyLogic in the Connectivity palette
Step 2: Saving your data
For both cases you need to write your data to the object of your choosing, either as the data gets generated or at the end of the simulation run.
Using the internal DB
The best option is to write data using the following command:
insertInto(table_name)
    .columns(table_name.column_name)
    .values(value)
    .execute();
This will insert a new row into the database table that you created; you can save multiple values to multiple columns by adding comma-separated entries to the columns and values parameters.
e.g.
insertInto(temperature_output_table)
    .columns(temperature_output_table.scenario_name, temperature_output_table.time, temperature_output_table.temperature)
    .values("scenario1", 10.5, 102)
    .execute();
External files
2.1) Using Excel
filename.setCellValue(value, sheetName, row, column);
or, even better, you can write out an entire dataset:
excelFile.writeDataSet(dataset, sheetName, row, column);
2.2) Using a text file
fileName.println("value1" + "\t" + "value2");
You can use whatever separator you want: "\t" for tab-separated values, "," for comma-separated, and so on.
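For example, to log one line per event as the model runs, here is a minimal sketch (assuming a text File object named outputFile and a hypothetical variable repairTime):
// Called whenever a repair completes, e.g. from a statechart action.
// time() is the current model time; repairTime is a hypothetical variable.
outputFile.println(time() + "\t" + repairTime);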
Step 3: Finish and export data
Internal Database
At the end of a simulation run, you can simply export the data.
See the help here: https://anylogic.help/anylogic/connectivity/export-excel.html#exporting-data-to-ms-excel-workbook
P.S. It is possible to automate this with some effort
External file
For Excel you need to call .writeFile() to finish.
On both objects, you need to call .close() for them to be closed and saved to disk.
FYI: the Excel object has an option to save on termination.
Read more on using Excel here: https://anylogic.help/anylogic/connectivity/excel-file.html#writing-to-excel-file
And on text files here: https://anylogic.help/anylogic/connectivity/text-file.html#replicated
There is also an example model.
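Putting the finishing steps together, a minimal sketch of the end-of-run code (e.g. in "On destroy", assuming an ExcelFile object named excelFile and a dataset named temperatureData, both hypothetical names):
// Write the whole dataset to the "Results" sheet, starting at row 1, column 1.
excelFile.writeDataSet(temperatureData, "Results", 1, 1);
// Flush the in-memory workbook to disk, then release the object.
excelFile.writeFile();
excelFile.close();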

SSAS Tabular - Deploy to multiple models

We are trying to create an SSAS tabular model for 60-100 customers.
Creating a single model and processing all customers' data is time consuming (until the data refresh is finished, each and every customer needs to wait for the latest data; we update every 15 min).
Creating multiple tabular models, however, makes reprocessing and troubleshooting easy, but is difficult to maintain and deploy changes to. If I need to add new measures or tables, I would like to apply them to all the models.
I was wondering if anyone can suggest the best way to deploy changes/additions across different tabular models.
If you've worked with SSIS, it can be used to deploy across multiple sites. An overview of this is below. What this will do is take a list of server names that you supply, iterate through them, and execute the DDL for the updated Tabular model on each one. This same method can also be used for cube processing, with the create DDL replaced with a processing script. If the model is being deployed to a server for the first time, ensure that it's processed before it's queried or used by any client tools, and make sure the processing of changed objects is handled accordingly as well.
When connected to SSAS in SSMS, right-click the model, select Script > Script Database As > Create or Replace To >, then choose where to output the script. Note that for security purposes this will not include the password, which will need to be handled accordingly.
Create an SSIS package. In the package, create an Analysis Services connection manager. This can be set to a server where this Tabular database currently exists.
Create a String variable and leave it blank. This can be called DeployServerName. Also create an Object variable, which can be called ServerList. On the SSAS connection manager, go to the Properties window (press F4), then select the Expressions ellipsis. On the window that comes up, choose the ServerName property and set the DeployServerName variable as the expression. This will allow the server name to change to multiple servers for deployment.
Add an Execute SQL Task in the control flow. This is where you will get the server names to deploy to. If they're stored in a master/lookup table, just select the column holding the server names in the SQL statement. You can also add the destination server names individually as plain-text literals combined with UNIONs.
Example
SELECT 'Server1' AS DestServer
UNION
SELECT 'Server2' AS DestServer
On the Execute SQL Task, set the ResultSet property to Full Result Set. Then on the Result Set pane, enter 0 for the Result Name and the object variable created earlier (ServerList) for the Variable Name field.
Next, create a Foreach Loop Container after the Execute SQL Task and connect it to the task. Use the Foreach ADO Enumerator enumerator type and select the object variable (ServerList) as the ADO Object Source Variable. On the Variable Mappings pane, place the string variable (DeployServerName) at Index 0.
Inside the Foreach loop, add an Analysis Services Execute DDL Task. Use the SSAS connection manager you created as the connection, Direct Input as the SourceType, and enter the script generated in SSMS as the SourceDirect statement.

Tracking the job progress in Talend

I have to copy data from Excel sheets to SQL Server tables.
I want to track my job's progress; I would like an output message saying 'data has been loaded in tableX' after each table's completion.
I tried to use tLogRow, but it outputs each row being copied.
Which component should I use, and how do I do it?
I want my messages to be printed when running from the command line as well.
You can do this by logging to the console in a tJava component after each of your tMSSqlOutput components, linking them with an OnComponentOk trigger.
To print to the console you can use System.out.println("data has been loaded in tableX");.
You'll then see the output of this in your Run tab, and also in any logs produced when the job is run, just as you would with a tLogRow component.
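If you also want the row count in the message, here is a minimal sketch of the tJava code (assuming your output component is named tMSSqlOutput_1; Talend DB output components publish their counts into globalMap):
// NB_LINE_INSERTED is populated by tMSSqlOutput_1 once it finishes.
Integer inserted = (Integer) globalMap.get("tMSSqlOutput_1_NB_LINE_INSERTED");
System.out.println("data has been loaded in tableX (" + inserted + " rows)");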
A slightly more lengthy approach, but one that avoids writing this small snippet of Java code, would be to link your database output component with an OnComponentOk to a tFixedFlowInput. In it you could specify a single row of data with a single column, "message" (or whatever you want to call it), and then put your message in the tFixedFlowInput component. From there just link it to a tLogRow as normal.

SSIS Access DB data into TSQL with a stored procedure

I want to use a Data Flow Task that has two sources, one from an Access DB and one from a SQL Server DB. I then want to manipulate the data and finally call a stored procedure on the same SQL Server that was my second source.
So I can't use an Execute SQL Task, since I want to manipulate the data and call the stored procedure at the end.
What toolbox component would I use, and what format would the stored procedure call take?
I tried an OLE DB Destination with a stored procedure call, something like this:
Exec sppUpdateAATable1 ?,?
SSIS uses the concept of a pipeline to organise a Data Flow task. Data flows from source to destination, and the processing of this data happens in between. Since you want to use the result of processing as parameters to your stored procedure, it cannot be done under the pipeline concept. The SP is not really a destination for your data, as the SP will do something with it too.
You can populate an in-memory recordset (destination), and use a Foreach Loop container to execute your SP for each row of the recordset.
Update
Your package should look something like this:
Data Flow task:
OLE DB connection to Access
OLE DB connection to SQL Server
To combine the 2 data streams, use a Union All task
Recordset Destination; in its properties, you name a variable of type Object (MyRecordsetVar). It will hold the recordset data.
Foreach Loop Container. In its properties, select the enumerator type - Foreach ADO Enumerator - and specify the MyRecordsetVar variable in the loop properties.
Assign two more (or as many as needed) variables to hold data from each column of the recordset. Data from each row of the recordset will be passed to these variables, one row at a time.
Inside the loop, put an Execute SQL Task. On the Parameter Mapping page of the task, specify your input variables - those that hold the data from the columns of the recordset. I would assume that you know how to do this.
Put your query into the task as execute sp_MyProc ?,?.
This should be it.
You can save yourself the trouble of the Recordset Destination and Foreach Loop route and instead use an OLE DB Command in your Data Flow. It will fire that stored proc once for each row that flows through it.