Sending multiple emails with data from rows in Talend Open Studio

I'm working on an enterprise application architecture project using Talend.
I have this table: User(Id_user, name_user, Email).
What I want to do is select data from this table and send an email to each user using the tSendMail component.
So far I have managed to connect to the database using tMSSqlInput and to send a single email using tSendMail, but I don't know how to pick up the values of each row and use them as the "To" address for tSendMail.
Can someone help me, please? Thank you.

Since tSendMail is not a processing component (i.e., it cannot handle a flow of multiple rows as input) but a starting component, the best way to do this is to use the good ol' tFlowToIterate, as we did here. Your job will look roughly like:
tMSSqlInput --row--> tFlowToIterate --iterate--> tSendMail
Inside the tFlowToIterate instance you put everything you need from the row into the globalMap. Every data-processing operation should be done before that, on the row context (for example, filtering out the users you don't want the mail to be sent to, etc.).
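By default, tFlowToIterate stores each column of the incoming row in the globalMap under the key "<rowName>.<columnName>". Assuming the incoming flow is named row1 and the column is named Email, the "To" field of tSendMail would then hold a Talend (Java) expression like:
(String)globalMap.get("row1.Email")
The same pattern works for any other column you need, e.g. the user's name for the mail body.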

Related

Avoid performing a server request per row group in the server-side model for ag-Grid

I have implemented a server-side table with row grouping using ag-Grid. The table makes a request to the server the first time it loads the groups, but when you manually open a group it makes another request. Is there a way to send the rows for each group in the first request?
In my application I need the ability to expand or collapse all groups manually, and it is a bit annoying to perform a server request for each possible group.
If I implement a client-side model and set the row data with gridOptions.api.setRowData(data), it works; but if I change rowModelType to 'serverSide' in gridOptions and configure a serverSideDatasource that sends the same data used in the client-side model, it doesn't work.
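What I'm after is a datasource that fetches everything once and answers group expansions from that cached payload. A rough sketch of the idea (the endpoint is made up, and depending on the ag-Grid version the callback is params.success or params.successCallback):
// hypothetical endpoint returning every leaf row up front
const allRowsPromise = fetch('/api/rows').then(res => res.json());

const serverSideDatasource = {
  getRows(params) {
    allRowsPromise.then(rows => {
      const { rowGroupCols, groupKeys } = params.request;
      // narrow down to the group path the grid is asking for
      let filtered = rows;
      groupKeys.forEach((key, i) => {
        filtered = filtered.filter(row => row[rowGroupCols[i].field] === key);
      });
      if (groupKeys.length < rowGroupCols.length) {
        // group level: return one row per distinct value of the next group column
        const field = rowGroupCols[groupKeys.length].field;
        const groupRows = [...new Set(filtered.map(row => row[field]))]
          .map(value => ({ [field]: value }));
        params.success({ rowData: groupRows, rowCount: groupRows.length });
      } else {
        // leaf level: return the actual rows
        params.success({ rowData: filtered, rowCount: filtered.length });
      }
    });
  }
};
(On older ag-Grid versions the callback is params.successCallback(rows, lastRow) instead of params.success.)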
I have found a partial solution here.
Thank you for your help.

Is there any way to check for a duplicate message control ID (MSH-10) in the MSH segment using Mirth Connect?

MSH|^~\&|sss|xxx|INSTANCE2|KKLIU 0063/2021|20190905162034||ADT^A28^ADT_A05|Zx20190905162034|P|2.4|||NE|NE|||||
Whenever a message comes in, it needs to be validated: has a message with control ID Zx20190905162034 already been processed?
Mirth will not do this for you, but you can write your own JavaScript transformer to check a database or your own set of previously encountered control ids.
Your JavaScript can make use of any appropriate Java classes.
The database check (which you can implement using a code template) is the easier way out. You might want to make the column storing MSH-10 values a primary key, or define a unique index on it, so that lookups stay fast. Alternatives include reading all the MSH-10 values already in the database into a global map variable whenever the channel is redeployed, or maintaining them behind an API that you issue a GET request to while processing each message. Which option is appropriate depends on the number of records you are dealing with.
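For illustration, a rough sketch of such a check as a JavaScript filter step, assuming a table processed_messages with a unique index on its control_id column (driver, URL, credentials and names are placeholders):
var controlId = msg['MSH']['MSH.10']['MSH.10.1'].toString();
// placeholders: swap in your real driver/URL/credentials (or use a code template)
var conn = DatabaseConnectionFactory.createDatabaseConnection(
    'com.mysql.cj.jdbc.Driver', 'jdbc:mysql://localhost:3306/mirthdb', 'user', 'pass');
try {
    // the INSERT fails on a duplicate thanks to the unique index;
    // prefer the parameterized executeUpdate(sql, paramList) in real code
    conn.executeUpdate("INSERT INTO processed_messages (control_id) VALUES ('" + controlId + "')");
    return true;   // first time this control id is seen: accept the message
} catch (e) {
    return false;  // duplicate: filter the message out
} finally {
    conn.close();
}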

Talend - Stats and Logs - On database - error

I have a job that inserts data from SQL Server into MySQL. I have set the project settings as follows:
Checked the check boxes for Use statistics (tStatCatcher), Use logs (tLogCatcher) and Use volumetrics (tFlowMeterCatcher).
Selected 'On Databases', and put in the table names (stats_table, logs_table, flowmeter_table) as well. These tables were created beforehand; their schemas were determined using the tCreateTable component.
The problem is that when I run the job, data is inserted into stats_table but not into flowmeter_table.
My job is as follows
tMSSqlInput --> tMap --> tMysqlOutput
I have not included tStatCatcher, tLogCatcher or tFlowMeterCatcher; the stats and logs for this job are taken from the project settings.
My question: why is no data entered in flowmeter_table? Should I include tStatCatcher, tLogCatcher and tFlowMeterCatcher explicitly in the job for it to work correctly?
I am using TOS
Thanks in advance
Rathi
Using the flow meter requires you to manually configure the flows you want to monitor.
On every flow you want to monitor, right-click the row > Parameters > Advanced settings > Monitor this connection.
Then you should be able to see data in your flowmeter table.
If you are using the project settings, you don't need to add the *Catcher components to your job.
You need to use the tStatCatcher, tLogCatcher and tFlowMeterCatcher components in the job directly.
These components already have their schemas defined, so you just need to add a tMap and redirect the data into the table you want.
Moreover, in order to use tLogCatcher you need to put some tDie or tWarn components in your job.

Talend two DB inputs from single REST

I'm trying to implement the following scheme in Talend ESB:
tRESTRequest ---- tXMLMap -- split to two DB inputs -- tXMLMap --- tRESTResponse
Get the ID from the REST request,
then fetch some info from two different DBs by this ID,
and finally combine the results from both DB inputs into a single REST response.
Talend does not allow me to combine both DB inputs into a single tXMLMap;
as I understand it, there may be only one main flow.
Is there any other way to do it?
Found a workaround by serializing the process, using global variables and tHash components.
Not sure if it is the best solution, but at least it is working.
Attached image with flow.
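Roughly, the serialized job looks like this (a sketch; component names are illustrative):
tRESTRequest --> tXMLMap --> tFlowToIterate   (the ID goes into the globalMap)
  |onSubjobOk
tDBInput1 --row--> tHashOutput1
  |onSubjobOk
tDBInput2 --row--> tHashOutput2
  |onSubjobOk
tHashInput1 + tHashInput2 --> tXMLMap --> tRESTResponse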

How to get a list of aggregates using JOliver's CommonDomain and EventStore?

The repository in the CommonDomain only exposes GetById(). So what do I do if my handler needs a list of Customers, for example?
Taking your question at face value: if you needed to perform operations on multiple aggregates, you would just provide the IDs of each aggregate in your command (which the client would obtain from the query side), then get each aggregate from the repository.
However, looking at one of your comments in response to another answer I see what you are actually referring to is set based validation.
This very question has raised quite a lot of debate about how to do this, and Greg Young has written a blog post on it.
The classic question is 'how do I check that the username hasn't already been used when processing my CreateUserCommand?'. I believe the suggested approach is to assume that the client has already done this check by asking the query side before issuing the command. When the user aggregate is created, the UserCreatedEvent will be raised and handled by the query side. Here, the insert query will fail (either because of a check or a unique constraint in the DB), and a compensating command would be issued, which would delete the newly created aggregate and perhaps email the user telling them the username is already taken.
The main point is, you assume that the client has done the check. I know this approach is difficult to grasp at first, but it's the nature of eventual consistency.
Also you might want to read this other question which is similar, and contains some wise words from Udi Dahan.
In the classic event sourcing model, queries like 'get all customers' would be carried out by a separate query handler which listens to all events in the domain and builds a query model to satisfy the relevant questions.
If you need to query customers by last name, for instance, you could listen to all customer created and customer name change events and just update one table of last-name to customer-id pairs. You could hold other information relevant to the UI that is showing the data, or you could simply hold IDs and go to the repository for the relevant customers in order to work further with them.
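A minimal sketch of such a projection, shown here as pseudocode in JavaScript (the store and event names are assumptions, and `lookup` stands for whatever table or collection backs the read model; the original stack would be C# with JOliver's EventStore):
// read-model projector: keeps a last-name -> customer-id lookup table current
function project(event) {
  switch (event.type) {
    case 'CustomerCreated':
      lookup.insert({ customerId: event.customerId, lastName: event.lastName });
      break;
    case 'CustomerNameChanged':
      lookup.update({ customerId: event.customerId }, { lastName: event.newLastName });
      break;
  }
}
// queries such as "all customers with last name X" then read from `lookup`
// directly, never from the event-sourced repository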
You don't need a list of customers in your handler. Each aggregate MUST be processed in its own transaction. If you want to show this list to the user, just build an appropriate view.
Your command needs to contain the ID of the aggregate root it should operate on.
This ID will be looked up by the client sending the command, using a view in your read model. That view is populated with data from the events that your AR emits.