Can I pass JSON to this REST API call? Apache Ignite - rest

We are trying to put data into an Apache Ignite cache using the REST API provided by Ignite: https://apacheignite.readme.io/docs/rest-api.
I want to know if I can pass JSON data to it from a Spring Boot application. I tried the basic GET and PUT and they work fine, but how do I pass a larger amount of data from JSON?
For example, JSON like this:
{
  "Name": "CYZ",
  "Id": 12345,
  "Dept": "xyz"
}
P.S. The JSON is for illustration only; I will adapt the answer to my requirements.
Thanks.

You can use a ConnectorMessageInterceptor to convert the JSON representation into a Java object.
You can specify it in the Ignite configuration via the ConnectorConfiguration#messageInterceptor property; the ConnectorConfiguration itself is set via the IgniteConfiguration#connectorConfiguration property.
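A minimal sketch of what this could look like (the Employee class and the use of Jackson for JSON parsing are my own assumptions, not part of the Ignite API):

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.ignite.configuration.ConnectorConfiguration;
import org.apache.ignite.configuration.ConnectorMessageInterceptor;
import org.apache.ignite.configuration.IgniteConfiguration;

public class JsonRestConfig {
    // Hypothetical value class matching the JSON fields from the question.
    public static class Employee {
        public String Name;
        public long Id;
        public String Dept;
    }

    public static IgniteConfiguration configuration() {
        ConnectorConfiguration connCfg = new ConnectorConfiguration();
        connCfg.setMessageInterceptor(new ConnectorMessageInterceptor() {
            private final ObjectMapper mapper = new ObjectMapper();

            @Override public Object onReceive(Object obj) {
                // Values arrive from the REST connector as strings; try to parse JSON.
                if (obj instanceof String) {
                    try {
                        return mapper.readValue((String) obj, Employee.class);
                    } catch (Exception e) {
                        return obj; // not JSON, pass through unchanged
                    }
                }
                return obj;
            }

            @Override public Object onSend(Object obj) {
                // Convert cached objects back to JSON strings on the way out.
                if (obj instanceof Employee) {
                    try {
                        return mapper.writeValueAsString(obj);
                    } catch (Exception e) {
                        return obj;
                    }
                }
                return obj;
            }
        });

        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setConnectorConfiguration(connCfg);
        return cfg;
    }
}

With a configuration like this, a REST put can send the JSON document as the value string, and the cache would store the deserialized Employee object.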

Related

How to list data in one datastore with the GeoServer REST interfaces?

I have tried to use "rest/layers.json" to get all data,but only got the services have been published.
I thought it might be
"rest/workspaces/{workspaceName}/datastores/{datastoreName}/featuretypes.json" ,bu i always get the emty object.
it should look like:"rest/workspaces/{workspaceName}/datastores/{datastoreName}/data.json.
so,how to get the data list with a datastore exactly.
The GeoServer REST services only manage configuration; there is no data access. If you want to get data, use WFS, it's really simple, e.g.:
http://demo.geo-solutions.it/geoserver/wfs?service=WFS&version=1.0.0&request=GetFeature&typeName=topp:states&outputformat=application/json
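For illustration, a small client-side sketch that issues that GetFeature request and prints the GeoJSON response (using the demo URL above):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class WfsGetFeature {
    public static void main(String[] args) throws Exception {
        String url = "http://demo.geo-solutions.it/geoserver/wfs"
            + "?service=WFS&version=1.0.0&request=GetFeature"
            + "&typeName=topp:states&outputformat=application/json";

        // Plain HTTP GET; the response body is a GeoJSON FeatureCollection.
        HttpResponse<String> response = HttpClient.newHttpClient().send(
            HttpRequest.newBuilder(URI.create(url)).GET().build(),
            HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body());
    }
}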

How to create an H2OFrame using the H2O REST API

Is it possible to create an H2OFrame using H2O's REST API, and if so, how?
My main objective is to utilize models stored inside H2O so as to make predictions on external H2OFrames.
I need to be able to generate those H2OFrames externally from JSON (I suppose by calling an endpoint).
I read the API documentation but couldn't find any clear explanation.
I believe that the closest endpoints are
/3/CreateFrame, which creates random data, and /3/ParseSetup,
but I couldn't find any reliable tutorial.
Currently there is no REST API endpoint to directly convert some JSON record into a Frame object. Thus, the only way forward for you would be to first write the data to a CSV file, then upload it to h2o using POST /3/PostFile, and then parse using POST /3/Parse.
(Note that POST /3/PostFile endpoint is not in the documentation. This is because it is handled separately from the other endpoints. Basically, it's an endpoint that takes an arbitrary file in the body of the post request, and saves it as "raw data file").
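For illustration, a rough sketch of the upload step from Java (the destination_frame parameter name and the multipart layout are my assumptions; check the REST API docs for the exact parameters). The subsequent /3/ParseSetup and /3/Parse calls follow the same request pattern:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class H2oPostFile {
    public static void main(String[] args) throws Exception {
        Path csv = Path.of("data.csv");
        String boundary = "----h2o" + System.currentTimeMillis();

        // Build a minimal multipart/form-data body containing the CSV file.
        byte[] head = ("--" + boundary + "\r\n"
                + "Content-Disposition: form-data; name=\"file\"; filename=\"" + csv.getFileName() + "\"\r\n"
                + "Content-Type: text/csv\r\n\r\n").getBytes();
        byte[] tail = ("\r\n--" + boundary + "--\r\n").getBytes();

        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://localhost:54321/3/PostFile?destination_frame=data.csv"))
            .header("Content-Type", "multipart/form-data; boundary=" + boundary)
            .POST(HttpRequest.BodyPublishers.ofByteArrays(
                List.of(head, Files.readAllBytes(csv), tail)))
            .build();

        // The response is JSON describing the raw (unparsed) frame.
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}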
The same job is much easier to do in Python or in R: for example in order to upload some dataset into h2o for scoring, you only need to say
df = h2o.H2OFrame(plaindata)
I am already doing something similar in my project. Since there is no REST API endpoint to directly convert a JSON record into a Frame object, I am doing the following:
1) For model building: first transfer and write the data to a CSV file on the machine where the h2o server or cluster is running, then import the data into h2o using POST /3/ImportFiles, and then parse it and build a model, etc. I am using the h2o-bindings APIs (RESTful APIs) for this. Since I have large data (hundreds of MBs to a few GBs), I use /3/ImportFiles instead of POST /3/PostFile, as the latter is slow for uploading large data.
2) For model scoring or prediction: I am using the model MOJO and POJO. In your case, use POST /3/PostFile as suggested by @Pasha if your data is not large. But, as per the h2o documentation, it's advisable to use the MOJO or POJO for model scoring or prediction in a production environment rather than calling the h2o server/cluster directly. MOJOs and POJOs are thread-safe, so you can scale scoring with multithreading for concurrent requests.
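To illustrate the MOJO route, a minimal scoring sketch (assuming a downloaded binomial classification MOJO saved as model.zip and the h2o-genmodel dependency on the classpath; the column names are made up):

import hex.genmodel.MojoModel;
import hex.genmodel.easy.EasyPredictModelWrapper;
import hex.genmodel.easy.RowData;
import hex.genmodel.easy.prediction.BinomialModelPrediction;

public class MojoScoring {
    public static void main(String[] args) throws Exception {
        // Load the MOJO once; the wrapper is thread-safe and can be shared.
        EasyPredictModelWrapper model =
            new EasyPredictModelWrapper(MojoModel.load("model.zip"));

        // Fill a row with the feature values to score (names are hypothetical).
        RowData row = new RowData();
        row.put("Dept", "xyz");
        row.put("Id", "12345");

        BinomialModelPrediction p = model.predictBinomial(row);
        System.out.println("Predicted label: " + p.label);
    }
}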

Conditional routing in Apache NiFi

I'm using NiFi to get data from an Oracle database and put some of this data in Kafka (using the PutKafka processor).
For example: only route the data to Kafka if the attribute "id" contains "aaabb".
Is that possible in Apache NiFi? How can I do it?
This should definitely be possible, the flow might be something like this...
1) ExecuteSQL or QueryDatabaseTable to get the data from the database, these produce Avro
3) ConvertAvroToJSON processor to convert the Avro to JSON
3) EvaluateJsonPath to extract the id field into an attribute
4) RouteOnAttribute to route flow files where the id attribute contains "aaabb" (see the example property after this list)
5) PutKafka to deliver any of the matching results from RouteOnAttribute
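For step 4, the RouteOnAttribute processor would get a dynamic property whose value is a NiFi Expression Language expression, along these lines (the property/relationship name "matched" is arbitrary; "id" is the attribute extracted in step 3):

matched : ${id:contains('aaabb')}

Flow files for which the expression evaluates to true are routed to the "matched" relationship, which you would connect to PutKafka.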
To add on to Bryan's example flow, I wanted to point you to some great documentation that should help introduce you to Apache NiFi.
Firstly, I would suggest checking out the NiFi documentation. It is very good and should help a lot. In addition to providing details on each of the processors Bryan mentioned, it also has general documentation for every type of user.
For a basic introduction to building a NiFi flow, check out this video.
For example templates, check out this repo. It has an Excel file at its root level with a description and a list of processors for each template.

jBPM: populating a data object using the REST API

In jBPM I have a process that contains a human task. This human task is used to populate a custom data object.
With the jBPM REST API, you can complete a task with parameters like so:
localhost:8080/jbpm-console/rest/task/93/complete?map_price=1800
And the process will have a process variable "price" with value 1800.
But how can you send a custom data object?
My object is called "expense" and if I complete the task manually in jbpm-console using the form, the variable expense in the process has the value "expensetest.Expense#33d6ffc0"
My guess is I'll have to provide this data object in the body of my POST but I can't seem to get it working. Perhaps I'm missing a step?
The task/{id}/complete REST URL only supports simple data types. To use custom data types, you should use the /execute operation, which supports (de)serializing Java objects to XML using JAXB.
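For illustration, a rough sketch of how the custom object could be prepared on the Java side (assuming the jBPM 6 remote Java client, which issues the /execute calls for you; the Expense fields are made up):

import java.util.HashMap;
import java.util.Map;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlRootElement;
import org.kie.api.task.TaskService;

public class CompleteTaskWithExpense {

    // The custom type must be JAXB-annotated so /execute can (de)serialize it
    // to XML, and it has to be on the classpath of both the client and the server.
    @XmlRootElement
    @XmlAccessorType(XmlAccessType.FIELD)
    public static class Expense {
        private String description;
        private double amount;

        public Expense() { } // JAXB requires a no-arg constructor

        public Expense(String description, double amount) {
            this.description = description;
            this.amount = amount;
        }
    }

    // taskService would come from the remote runtime engine built with the
    // kie-remote-client library, after registering Expense as an extra JAXB class.
    public static void complete(TaskService taskService, long taskId, String user) {
        Map<String, Object> results = new HashMap<>();
        results.put("expense", new Expense("Hotel", 1800.0));
        taskService.complete(taskId, user, results);
    }
}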

Is it correct to use POST instead of GET to fetch data in Web API

I am currently creating a RESTful API with ASP.NET Web API. I have two questions related to Web API.
Here is what I have done:
Created the following method in the controller class:
public HttpResponseMessage PostOrderData(OrderParam OrderInfo)
Based on the OrderInfo parameter, I query SQL Server and get the list of orders.
Then I set the response content with the collection object:
List<Orders> ordList = new List<Orders>();
//filled the ordList from SQL query result
var response = Request.CreateResponse<List<Orders>>(HttpStatusCode.OK, ordList);
On the client side:
OrderParam ordparam = new OrderParam();
response = client.PostAsJsonAsync("api/order", ordparam).Result;
if (response.IsSuccessStatusCode)
{
List<Orders> mydata = response.Content.ReadAsAsync<List<Orders>>().Result;
}
So my question is: is it fine to POST data to the server in order to GET data back, i.e. is using POST instead of GET correct here? Is there any disadvantage to this approach? (One disadvantage is that I will not be able to query directly from the browser.) I have used POST here because the "OrderParam" parameter might grow in the future, and the URL might then become too long.
My second question is: I have used classes for the parameter and for the returned objects, i.e. OrderParam and Orders. The consumers (clients) of this Web API are different customers, and they will consume the API through .NET (C#) or through jQuery/JS. Do we need to pass the class file containing the definitions of the OrderParam and Orders classes manually to each client, and send it again whenever these classes change?
Thanks in advance
Shah
Typically no.
POST is neither safe nor idempotent, and as such cannot be cached. It is meant to be used for cases where you are changing state on the server.
If you have big criteria, you need to redesign, but in most cases URL fragments or querystring parameters work. Have a look at OData, which uses the querystring for very complex queries and still uses GET.
With regard to the second question, also no. The server can expose a schema (similar to WSDL) or docs, but should not know about the client.
Yes, you can. REST has nothing to do with security; it is just a convention, and for a Web API you can use POST if you do not need any caching.