Expose a Talend job as a REST web service

I have a Talend job whose finished product is a zip package. Now I want to expose this job as a REST service with a GET request, so that each time a client calls the service the package is built and made available for download. I know there is a thread with the exact same name, "expose job as web service", but the accepted answer has links that are no longer valid.
Currently my job looks like this:
The idea behind this design is that I have one column named "userfile", of type byte array, in the output flow of tRESTRequest. When the GET request arrives, tJavaRow runs and wraps the file into a byte array, which I then transfer through tMap to the tRESTResponse body as a byte array. What am I missing?
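A minimal sketch of the tJavaRow body for wrapping the zip, assuming a file path and the "userfile" column name (adjust both to your actual schema), might look like this:

```java
// Hypothetical tJavaRow body: read the generated zip into the "userfile"
// byte-array column of the output row. The path is an assumption.
java.nio.file.Path zip = java.nio.file.Paths.get("/tmp/output/package.zip");
output_row.userfile = java.nio.file.Files.readAllBytes(zip);
```

A binary-friendly Content-Type (e.g. application/zip) on the tRESTResponse is also worth checking, since the default is usually XML/JSON.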

Related

Problems creating a SOAP connection with IBM SDI

I have recently been working with IBM SDI software for identity and governance.
To get started I was given the exercise of building a calculator using a SOAP request to this WSDL server.
Given a CSV file with ID, number1, number2 and operation attributes, I need to create a CSV output file with the ID attribute and the result of the operation.
Some of the advice I was given:
use the "invoke soap" connector to make a request to the service
use the FormEntry connector to take the operations calculated by the SOAP service, setting a parameter of this connector called "EntryRawData"
Up to now, the only thing I have been able to do is create a file connector that reads the input CSV file.
The problems start with the SOAP connector. Any help is kindly appreciated.
On top of that, I have some trouble understanding what a WSDL server is, what it does, and what a SOAP request is. Thank you in advance.
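For what it's worth, a SOAP request is just an XML envelope POSTed over HTTP to the service endpoint described by the WSDL. A minimal plain-Java sketch of such a call (the endpoint URL, namespace, and operation/parameter names below are assumptions, not taken from the exercise's actual WSDL) might look like this:

```java
// Minimal SAAJ (javax.xml.soap) sketch of a SOAP call to a calculator-style service.
// Endpoint, namespace, and element names are hypothetical; read them from your WSDL.
import javax.xml.soap.*;

public class SoapCalculatorClient {
    public static void main(String[] args) throws Exception {
        SOAPConnection conn = SOAPConnectionFactory.newInstance().createConnection();
        SOAPMessage request = MessageFactory.newInstance().createMessage();

        // Build <tns:Add><intA>2</intA><intB>3</intB></tns:Add> inside the body.
        SOAPElement add = request.getSOAPBody()
                .addChildElement("Add", "tns", "http://tempuri.org/");
        add.addChildElement("intA").addTextNode("2");
        add.addChildElement("intB").addTextNode("3");
        request.saveChanges();

        // Send the envelope and print the response envelope carrying the result.
        SOAPMessage response = conn.call(request, "http://example.com/calculator.asmx");
        response.writeTo(System.out);
        conn.close();
    }
}
```

The SDI "invoke soap" connector is essentially doing the same thing for you: it builds that envelope from the WSDL and maps your input attributes onto the operation's parameters.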

PeopleSoft Integration Broker REST services POST - how to set it up in PeopleSoft

This is the first time I am setting up a POST in PeopleSoft, so I need some input. The Service Operation needs to be set up with the parameters in the BODY, as I understand it. I was fine with setting up GET, since all the parameters come in via the URL/URI.
How do I configure the Service Operation for the POST? I have set up the document and the message already.
We receive the data in JSON format.
Since it is a very high-level question, I'm going to refer you to PeopleBooks. Refer to page 219 in the PeopleBook below. You can also find a lot of blog posts on the same topic.
https://docs.oracle.com/cd/F52214_01/psft/pdf/pt859tibr-b032022.pdf
At a high level, to give you a head start:
create document definition
attach it to message definition
create service operation definition and use this message definition for request
create a message definition for response based on the expected structure
populate the document body and invoke the service operation from PeopleCode (refer to the attached PeopleBook for a code reference)
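For illustration only, the JSON request body that such a document definition might describe could look like the sketch below; every field name here is hypothetical, since the actual structure comes from your document definition.

```json
{
  "employee": {
    "emplid": "KU0001",
    "effdt": "2024-01-15",
    "action": "HIRE"
  }
}
```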
Please post a question with specifics if you get stuck somewhere.

Asynchronous bulk data validation service - GET or POST?

Here is a different scenario for the GET-or-POST confusion. I am working on a web application built with a Spring Boot microservice architecture, where there is a need to validate and update some bulk data from an Excel sheet.
There can be 500-1000 records in the Excel sheet, with 6 different columns, for bulk processing. Once the UI submits the Excel sheet to the server, the whole process from then on is asynchronous. There are microservice-to-microservice calls for which I am confused about whether to use GET or POST.
Here is the problem: I have 4 microservices (let's say orchestra-service, A-service, B-service and C-service).
OrchestraService creates a DTO list from the Excel sheet, which is used in the further calls. Orchestra calls 'A'. 'A' validates the data against the DB, marks success and failure records in the DTO list, and returns the list to orchestra. Orchestra then calls 'B', which does a similar job to 'A' and returns the list back to orchestra.
Now orchestra calls 'C', which updates the success records in the database, updates the file status in the database, and also creates a new resultant Excel sheet with error messages per row, which is emailed to the user later (a small report of sorts).
In the above microservice-to-microservice calls, only C updates the database and creates a resource on the server. For all of the above calls I used the POST method, because I need the request body to pass my input list to all the services.
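To make the shape of those calls concrete, here is a minimal Spring Boot sketch of what one of the validation endpoints might look like; the class, path, and field names are all assumptions for illustration.

```java
// Hypothetical A-service endpoint: POST is used because the DTO list travels in
// the request body, even though the call only validates and creates no resource.
import java.util.List;
import org.springframework.web.bind.annotation.*;

@RestController
public class ValidationController {

    public static class RecordDto {
        public String id;
        public String value;
        public boolean valid;
    }

    @PostMapping("/validate")
    public List<RecordDto> validate(@RequestBody List<RecordDto> records) {
        // Mark success/failure per record; a real check would query the database.
        for (RecordDto r : records) {
            r.valid = r.value != null && !r.value.isEmpty();
        }
        return records;
    }
}
```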
According to the HTTP standards, am I doing this right?
https://www.rfc-editor.org/rfc/rfc7231#section-4.3.3
It says that providing a block of data, such as the fields entered into an HTML form, to a data-handling process should be a POST call.
Please advise me whether:
I should use POST for only 'C' and GET for the others, or
it should be POST for all, since the other services take part in the data-filtering process.
NOTE: services A, B and C do not all use all of the columns of the Excel sheet, only some of them in combinations. One column holds data 18 characters long, so I think the URL/header size limit could be a problem for GET in a bulk operation.
HTTP Protocol
There is no actual violation in passing information with GET, and as long as identical requests don't mutate anything, it's fine.
Microservice-wise
Now for clarification: are Service A and Service B actually needed?
Aren't they in the same domain as Service C, and couldn't they reside inside it?
It's more than good practice for a microservice to validate its own domain and return a collection of successes and failures with the relevant messages.
I had a similar question a few years back, and here is a possible solution for the first part of your question.
As mentioned by @Oreal Eraki in his answer, I would also question whether you need services A and B. If it's just validation and data transformation, it can be done in the same domain where the data is actually stored.

Tarantool shiny dashboard

I want to use the Tarantool database for logging user activity.
Are there any out-of-the-box solutions for creating a web dashboard with nice charts based on the collected data?
A long time ago, using an old, old version of Tarantool, I created a draft of tarbon, a time-series database with an interface identical to carbon-cache.
Since that time the protocol has changed, but the general idea is still the same: use spaces to store the data, with a compact data organization and the right indexes to access the spaces as time-series rows, and use Lua to prepare the resulting JSON.
That solution had excellent performance (on both reads and writes), but that old version lacked disk storage, and without disk I was very limited in metrics capacity.
Tarantool has an embedded Lua language, so you could generate JSON from your data and use any charting library. For example, D3.js has a method to load JSON directly from a URL:
d3.json(url[, callback])
Creates a request for the JSON file at the specified url with the mime type "application/json". If a callback is specified, the request is immediately issued with the GET method, and the callback will be invoked asynchronously when the file is loaded or the request fails; the callback is invoked with two arguments: the error, if any, and the parsed JSON. The parsed JSON is undefined if an error occurs. If no callback is specified, the returned request can be issued using xhr.get or similar, and handled using xhr.on.
You could also look at c3.js, a simple facade for D3.

How do I set up a mock queue using mockrunner to test an xml filter?

I'm using the mockrunner package from http://mockrunner.sourceforge.net/ to set up a mock queue for JUnit testing an XML filter which operates like this:
sets recognized properties for an FTP server to put and get XML input, and for a JMS queue server that keeps track of jobs. Remotely, a server waits to actually parse the XML once a queue message is received.
creates a remote directory using FTP and starts a queue connection, using MQConnectionFactory, to the given address of the queue server.
once the new queue entry is made in 2), the filter waits for a new queue message to appear, signifying that the job has been completed by the remote server. The filter then grabs the modified XML file from the FTP server and passes it along to the next filter.
The JUnit test I am working on simply needs to emulate this environment: start a local FTP server and a mock queue server for the filter to connect to, wait for the filter to connect to the queue and put the new XML input file in a local directory via the local FTP server, then wait for the queue message, modify the XML input slightly, put the modified XML in a new directory, and post another message to the queue signifying that the job has completed.
All of the tutorials I have found on the net use EJB and JNDI to look up the queue server once it has been made. If possible, I'd like to sidestep that route by just creating a mock queue on my local machine and connecting to it in the simplest manner possible, without EJB and JNDI.
Thanks in advance!
I'm using MockEjb, and there are some examples, among them one for using mock queues, so take a look at the info and the example.
Hopefully it helps.
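Mockrunner itself can also do this without any JNDI. The sketch below assumes mockrunner's JMS module (mockrunner-jms) and its JMSMockObjectFactory/DestinationManager API; the queue name and message are arbitrary.

```java
// Standalone mock queue with mockrunner's JMS module - no broker, no JNDI.
import com.mockrunner.jms.JMSMockObjectFactory;
import com.mockrunner.mock.jms.MockQueue;
import com.mockrunner.mock.jms.MockQueueConnectionFactory;
import javax.jms.*;

public class MockQueueExample {
    public static void main(String[] args) throws JMSException {
        JMSMockObjectFactory factory = new JMSMockObjectFactory();
        MockQueue queue = factory.getDestinationManager().createQueue("jobQueue");

        // Hand this factory to the code under test instead of a JNDI lookup.
        MockQueueConnectionFactory cf = factory.getMockQueueConnectionFactory();
        QueueConnection connection = cf.createQueueConnection();
        QueueSession session = connection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);

        // Simulate the remote server posting a "job completed" message.
        session.createSender(queue).send(session.createTextMessage("job done"));

        // The test can then inspect the queue contents directly.
        TextMessage received = (TextMessage) queue.getMessage();
        System.out.println(received.getText());
        connection.close();
    }
}
```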
I'd recommend having a look at using Apache Camel to create your test case. It's then really easy to switch your test case between any of the available components, and most importantly, Camel comes with some really handy Mock Endpoints, which make it super easy to test complex routing logic, particularly with asynchronous operations.
If you also use Spring, then maybe start by trying out the Spring unit tests with mock endpoints in Camel, which let you inject mock endpoints to perform assertions on, together with the ProducerTemplate object, to make it really easy to fire messages for your test case (e.g. see the last example on that page).
Start off using simple endpoints like the SEDA endpoint - then, once you've got your head around the core Spring/mock framework, try using the JMS or FTP endpoints etc.
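To give a feel for what that looks like, here is a small sketch of a Camel mock-endpoint test; it assumes camel-test's CamelTestSupport (JUnit 4 flavor), and the route and message body are purely illustrative.

```java
// Camel route test: send to a SEDA endpoint, assert on the mock endpoint.
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.mock.MockEndpoint;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.junit.Test;

public class XmlFilterRouteTest extends CamelTestSupport {

    @Override
    protected RouteBuilder createRouteBuilder() {
        return new RouteBuilder() {
            public void configure() {
                // Stand-in for the real filter's route; illustrative only.
                from("seda:jobs").to("mock:done");
            }
        };
    }

    @Test
    public void jobMessageReachesQueue() throws Exception {
        MockEndpoint done = getMockEndpoint("mock:done");
        done.expectedBodiesReceived("<job status='complete'/>");

        // The inherited ProducerTemplate fires the message into the route.
        template.sendBody("seda:jobs", "<job status='complete'/>");

        done.assertIsSatisfied();
    }
}
```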