I am very new to the Pentaho Data Integration tool.
I want to consume a RESTful service (a POST web service) from Pentaho. I found that I should use the REST Client step for this, but when I supply the URL and body (since I want to call a POST service), it does not make the expected changes in the database.
Can anyone tell me how to supply the body in the REST Client step in Pentaho? I suspect there is a particular way the body has to be given in the Body field of the REST Client.
I am attaching a screenshot of the REST Client step I am configuring.
Also, can anyone suggest some good links where I can find Pentaho tutorials?
In PDI, a step needs to receive rows so it can process them and pass the results to the next step.
In this case, as stated in the forum, you first need a "Generate Rows" step so that at least one call is made to the REST web service.
Remember to generate only one row; otherwise you will make one call to the REST web service per generated row.
Then, in the "REST Client" step, enable "Accept URL from field" and choose the field from the generated row that contains the URL.
Hope it helps.
Use Generate Rows for your URL, then pass that to the REST Client step and use 'Accept URL from field'. That's the only way I found to make it work.
I don't mind if you use an example from another API that is not Adobe Analytics'. I just need to know the pattern I have to follow in order to successfully convert a Postman request into a NiFi request.
After successfully creating requests to pull reports from Adobe Analytics via Postman, I'm having difficulty migrating these Postman requests to NiFi. I haven't been able to find concrete use cases that explicitly explain how to do this kind of task step by step.
I'm trying to build a backend on top of NiFi to handle multiple data extracts from Adobe Analytics in an efficient and robust way, instead of having to write all the required scripts myself. However, there is more documentation about REST APIs with Postman than about REST APIs with NiFi.
In the screenshot below you can see what the Postman request looks like. It takes 3 headers plus 1 temporary header that includes the authorization value (the Bearer token). This temporary header is generated automatically after filling in the OAuth 2.0 authorization form in the Authorization tab, as shown here.
Then we have the body of the request. This JSON text is generated automatically by debugging Adobe Analytics workspaces, as shown here.
I'd like to know the following in a step-by-step manner with screenshots if possible:
Which processor(s) should I use in NiFi to obtain a response similar to the one I got in Postman?
Which properties should I add/remove from the processor to make this work?
How should I name these properties?
Is there a default property whose value/name I should modify?
As you can see, the question is mainly about property setup in NiFi, as well as processor selection. I have already tried to configure some processors, but I don't seem to get the property setup right, or maybe I'm selecting the wrong processors.
I'm using NiFi v1.6.0 and Postman v7.8.0.
This is most likely an easy task for users already familiar with NiFi and API requests, but it has proven challenging to me. Hopefully this will help other users looking to build more robust pipelines by using NiFi instead of doing it manually.
Thanks.
It only takes 3 NiFi processors to replicate a REST API request that works in Postman. In this solution we use a request that contains nested JSON. The advantage of this simple approach is that it reduces the amount of configuration required to obtain a successful response from the API, even if you are using a complex JSON request. Here the body of the JSON request is passed through the GenerateFlowFile processor, with no other processor needed to parse or format the request.
Step #1. Add a GenerateFlowFile processor. The only property you will have to modify is Custom Text: paste your whole JSON request into it, exactly as it was in Postman. In this case I'm using the very same JSON shown in the question above. It's a good idea to set the Yield Duration to 10 seconds or more.
Step #2. Add an InvokeHTTP processor, then modify the 6 properties shown in the screenshots below. Use the same Authorization details you used in Postman; make sure to copy the Bearer token from Postman after it has been tested. Don't forget to set the HTTP Method, Remote URL and Content-Type as well.
Step #3. Finally, add a couple of LogAttribute processors to capture the output of InvokeHTTP. One of them should receive successful responses; the other can handle the Failure, Original, Retry and No-Retry relationships. Alternatively, you can create one LogAttribute processor for each of these outputs.
Step #4. Now, connect the processors and Start your data flow! You should start seeing data populate the Successful LogAttribute. Then you can use the Data Provenance option to review the incoming data and confirm that this is exactly the same result you previously obtained from Postman.
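If it helps to see what InvokeHTTP is effectively sending, here is a rough Python sketch of the same call. The endpoint URL, header names and token below are placeholders (assumptions, not values from the question); substitute whatever you already use in Postman.

```python
# Rough sketch of the HTTP request the flow reproduces, using Python's requests
# library. URL, headers, token and body are placeholders, not real values.
import requests

url = "https://analytics.adobe.io/api/<COMPANY_ID>/reports"  # placeholder endpoint
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",  # the Bearer token generated in Postman
    "x-api-key": "<CLIENT_ID>",                # placeholder for your API key header
    "Content-Type": "application/json",
}
body = "{ ... }"  # the same nested JSON pasted into GenerateFlowFile's Custom Text

response = requests.post(url, headers=headers, data=body, timeout=30)
print(response.status_code)
print(response.text)
```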
Note: This is a simple, straightforward, "for starters" solution for replicating a Postman API request using nested static JSON. There are other solutions on Stack Overflow that tackle more complex cases, like dynamic JSON. Here is a list of some other posts:
nifi invokehttp post complex json
In NiFi processor 'InvokeHTTP' where do you write body of POST request?
Configuring HTTP POST request from Nifi
I am a bit confused. The requirement is that we need to create a REST API in Salesforce (an Apex class) that has one POST method. Right now, I have been testing it with the Postman tool in 2 steps:
First, making a POST request with username, password, client_id, client_secret (which come from the connected app in Salesforce) and grant_type, to receive an access token.
Then making another POST request in Postman to create a lead in Salesforce, using the access token received before and the request body.
However, the REST API that I have in Salesforce would be called from various different web forms. So once someone fills out a web form, the backend would call this REST API in Salesforce and submit the lead request.
I am wondering how that would happen, since we can't use Postman for that.
Thanks
These "various different web forms" would have to send requests to Salesforce just like Postman does. You'd need two POST calls (one for login, one to call the service you've created). It'll be bit out of your control, you provided the SF code and proven it works, now it's for these website developers to pick it up.
What's exactly your question? There are tons of libraries to connect to SF from Java, Python, .NET, PHP... Or they could hand-craft these HTTP messages, just Google for "PHP HTTP POST" or something...
https://developer.salesforce.com/index.php?title=Getting_Started_with_the_Force.com_Toolkit_for_PHP&oldid=51397
https://github.com/developerforce/Force.com-Toolkit-for-NET
https://pypi.org/project/simple-salesforce/ / https://pypi.org/project/salesforce-python/
Depending on how much time they have, they can:
cache the session ID (so they don't call login every time), try to reuse it, and call login again only if the session ID is blank or a "session expired or invalid" error comes back
try to batch it somehow (do they need to save these leads to SF ASAP, or are, say, hourly intervals OK? How did YOU write the service: does it accept 1 lead or a list of records?)
be smart about storing the SF credentials (in some secure way, not hardcoded), ideally so that it's easy to point the integration at sandbox or production by changing just one config file, environment variable, or something like that
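To make the two-call pattern concrete, here is a minimal Python sketch using the username-password OAuth flow. The Apex REST path and the lead fields are assumptions for illustration; use the urlMapping and fields you defined in your @RestResource class.

```python
# Minimal sketch of the two POST calls: 1) log in, 2) call the custom Apex REST
# service. URLs, credentials and field names below are placeholders.
import requests

# 1) Exchange the connected-app credentials for an access token.
auth = requests.post(
    "https://login.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "password",
        "client_id": "<CONSUMER_KEY>",
        "client_secret": "<CONSUMER_SECRET>",
        "username": "<USERNAME>",
        "password": "<PASSWORD_PLUS_SECURITY_TOKEN>",
    },
)
auth.raise_for_status()
token = auth.json()["access_token"]
instance_url = auth.json()["instance_url"]

# 2) Call the custom Apex REST service with the token.
#    "/services/apexrest/Lead/" is a placeholder; use your @RestResource urlMapping.
lead = requests.post(
    f"{instance_url}/services/apexrest/Lead/",
    headers={"Authorization": f"Bearer {token}"},
    json={"LastName": "Doe", "Company": "Acme"},  # example body; match your service
)
print(lead.status_code, lead.text)
```

The web form backends would run these same two steps, just with the credentials read from configuration rather than hardcoded.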
I want to use the RESTful API of a web service that I have. However, I really don't know how the web service would know how to "give" the data to the stand-alone application, since the application does not have a URL. Is there a mechanism that makes URLs unnecessary in this case?
I think you need to read up a little more on what REST is and does. By its nature, REST is a mechanism for requesting data, i.e. it is a "pull", not a "push". REST is typically used over HTTP, hence the need for a URL, in the same way that you request/pull data every time you visit a web page.
If you wish to notify one system from another as soon as a change happens, then you need to look at something other than REST. Alternatively, your client can poll the REST service continually and check its response.
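If polling turns out to be acceptable, the client side can be as simple as this sketch (the endpoint URL and interval are made-up placeholders):

```python
# Minimal polling sketch: the stand-alone app repeatedly pulls from the REST
# service instead of waiting to be pushed to. URL and interval are placeholders.
import time
import requests

last_seen = None
while True:
    resp = requests.get("https://example.com/api/status")  # placeholder endpoint
    resp.raise_for_status()
    data = resp.json()
    if data != last_seen:
        print("Change detected:", data)  # react to the change here
        last_seen = data
    time.sleep(30)  # wait 30 seconds between polls
```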
I have created a quite simple RESTful web service. It only supports the GET (=read) method, e.g.:
http://localhost/application/id/xyz
The corresponding information for this ID is queried from a data source and returned as JSON.
Now my question: (How) should I implement HATEOAS in this case? Does it even make sense? I understand that HATEOAS is reasonable when you have a more complex structure, but in this case there are no other resources I could link to. The client calls the web service with a certain ID and the server returns the information.
Thank you!
As you've said "The client calls the web service with a certain ID", it sounds like your client constructs the URL it visits itself, i.e. your client application already knows it can visit http://localhost/application/id/xyz for the xyz ID.
If you'd like to leverage some of the power of HATEOAS and decouple yourself from this (slight) dependency, you could instead query http://localhost/application/id?query=xyz, which could return a list of valid links (if any exist). That way you could change the format or structure of the linked URL without breaking your client (of course, you'd still depend on the query URL in some way).
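For illustration, such a lookup response might look something like this (the structure and field names are just an example, shown here as the Python dict the service would serialize to JSON):

```python
# Purely illustrative: one possible shape for the lookup response. The client
# follows the advertised links instead of building the item URL itself.
example_response = {
    "query": "xyz",
    "links": [
        {"rel": "self", "href": "http://localhost/application/id?query=xyz"},
        {"rel": "item", "href": "http://localhost/application/id/xyz"},
    ],
}
```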
However, as your usage is so simple, this sounds like overkill and unnecessary work, so I'd suggest you don't need to worry about HATEOAS until you have a more complex system or clients :)
In HATEOAS, your return value is not an ID but a URL. Invoking that URL takes you to the next resource in the web, just like a web page containing links to other web pages.
I'm new to REST APIs and am trying to understand the basics. So let's begin by saying that I have created a simple CMS web application using PHP (you create a user, post an entry, maybe assign some categories, etc.).
That being said, if I wanted to create a mobile app that does the same, I'd have to write some PHP functions in order to send data as JSON or XML and also to process POST or PUT requests.
Is a REST API the collection of those functions I'd use to handle the mobile app's POST, PUT and GET requests, using JSON or XML as the data format? If not, can I get an example, not a definition, please?
To answer your question: yes, the REST API is the collection of those functions, for any client you wish to expose it to, for creating a user, posting an entry, etc. The accepted data format is something you decide for your API; it may be JSON, XML or even both.
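As a rough illustration (sketched in Python with Flask just to show the shape; the Slim tutorials linked below follow the same pattern in PHP), one such endpoint function might look like this. The routes and fields are made up for the example.

```python
# Hypothetical "users" endpoints: the mobile app POSTs JSON to create a user and
# GETs JSON back to read one. Routes, fields and storage are made up.
from flask import Flask, jsonify, request

app = Flask(__name__)
users = []  # stand-in for the CMS database

@app.route("/api/users", methods=["POST"])
def create_user():
    payload = request.get_json()                      # body sent by the mobile app
    user = {"id": len(users) + 1, "name": payload["name"]}
    users.append(user)
    return jsonify(user), 201                         # JSON back to the client

@app.route("/api/users/<int:user_id>", methods=["GET"])
def get_user(user_id):
    user = next((u for u in users if u["id"] == user_id), None)
    if user is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(user), 200
```

The collection of such functions, one per resource and HTTP method, is what people mean by "the REST API".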
Some examples:
http://coenraets.org/blog/2011/12/restful-services-with-jquery-php-and-the-slim-framework/
http://peter.neish.net/building-a-mobile-app-backend-using-mongodb-and-slim-a-php-rest-framework/