How to submit Hive jobs programmatically from JSP - REST

We are trying to build a wrapper system for business users, and we want to explore the option of submitting a Hive query from a JSP page. I could not find a good example or suggested mechanism for this. Has anyone tried this before? If so, can you share your best ideas? We are looking for a REST API mechanism; if that won't work, we can use Java from JSP/servlets.
Appreciate your support.
Kiran

You can use JDBC. I don't think there is a REST API for Hive.
Since most developers and applications typically use JDBC anyway, this should be the preferred mode.
More details can be found here (assuming you are using a recent Hive version): Hive 2 Clients
Sample code:
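As a rough sketch (not an official snippet), submitting a query through the HiveServer2 JDBC driver from server-side Java, e.g. the servlet behind your JSP, could look like this; the connection URL, credentials, and table name are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcClient {
    // Placeholder connection URL: jdbc:hive2://<hiveserver2-host>:<port>/<database>
    private static final String JDBC_URL = "jdbc:hive2://hiveserver2-host:10000/default";

    public static void main(String[] args) throws Exception {
        // Requires the hive-jdbc driver (and its Hadoop dependencies) on the classpath.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        try (Connection con = DriverManager.getConnection(JDBC_URL, "hiveuser", "");
             Statement stmt = con.createStatement();
             // The query would normally come from the JSP form submitted by the business user.
             ResultSet rs = stmt.executeQuery("SELECT * FROM sample_table LIMIT 10")) {

            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

In a servlet you would render the ResultSet into the response instead of printing it; DDL statements can go through execute() on the same Statement.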

Related

Ionic Creator exported, how to make database and server?

So I am making a hybrid app and have used Ionic Creator to design the UI, which I have now exported. I now have an Ionic wrapper (I think) with all the pages and tabs linking nicely. However, there is no functionality yet, e.g. a database. Could someone please tell me what to do next, as I am very confused? Do I need to set up a server as well? And which is the best database solution to work with Ionic?
Thanks in advance.
Not just a C dummy, also a web dummy
Yes, you have to set up the server yourself.
You can use any back-end language for this, e.g.:
1) Node.js
2) PHP
3) C#
4) Java
5) Python
For Node.js, see this example: Nodejs Example Server
You can use any database of your choice, like MySQL, SQL Server, MongoDB, or Firebase. A minimal Java sketch of such a back-end is shown below.
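As one possibility for option 4 (Java), here is a minimal sketch of a JSON endpoint the Ionic app could call over HTTP, using the JDK's built-in HTTP server; the port, route, and payload are placeholders, and in a real back-end the data would come from whichever database you pick:

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class SimpleApiServer {
    public static void main(String[] args) throws Exception {
        // Placeholder port; the Ionic app would call http://<server>:8080/api/items
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/api/items", exchange -> {
            // Hard-coded JSON for illustration; replace with a query against your database.
            String json = "[{\"id\":1,\"name\":\"example item\"}]";
            byte[] body = json.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            // Let the Ionic app (served from another origin) read the response.
            exchange.getResponseHeaders().add("Access-Control-Allow-Origin", "*");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        server.start();
        System.out.println("API listening on http://localhost:8080/api/items");
    }
}

The Ionic app then just issues an HTTP GET to that route; the same pattern applies whichever back-end language you choose.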

How do I query an external GraphQL endpoint in Gatsby JS?

I don't see any clear way to query an outside GraphQL endpoint (e.g. https://somewebsite.com/graphql) for data. Has anyone else done this, and how did you do it?
I assume it requires you to build a custom plugin, but that seems overkill for such a simple need. I have searched the docs and this issue doesn't really ever get addressed. 🤔
In Gatsby V2 you don't need to create your own plugin.
What you need is a plugin called gatsby-source-graphql.
gatsby-source-graphql uses schema stitching to combine the schema of a third-party API with the Gatsby schema.
You can find more info here.
The answer is, as you mentioned, writing a new source plugin. This is how Gatsby gets the data into its internal GraphQL structure to then query.
Plugins are, at their core, just additions to the gatsby-node, gatsby-browser, and gatsby-ssr files. So you could write the logic needed at the top of your gatsby-node file to avoid abstracting it out into its own plugin.
If you're not so into writing plugins for Gatsby, like me, have a look here.
It explains in detail how to query any GraphQL server via Gatsby's sourceNodes API and the graphql-request library.
It helped me get data from, e.g., graph.cool as well as GraphCMS.
The downside, though, is that you always have to write two different kinds of GraphQL queries, as they are usually not compatible with Gatsby's Relay-style queries. But it is still easier than building a whole plugin.
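For what it's worth, both approaches above ultimately send an HTTP POST with a JSON body containing the query, which any HTTP client can do outside of Gatsby as well. A rough illustration of that raw request in Java (the endpoint URL and query are placeholders):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GraphqlPostExample {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint and query; the response comes back as JSON under a "data" key.
        String endpoint = "https://somewebsite.com/graphql";
        String body = "{\"query\": \"{ posts { id title } }\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}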

How to connect ontology and web application

I'm working on a project in which we are using semantic web technologies and creating a web application that allows users to get recommendations in order to make the right decision (I won't get into the details).
For me and my team it's our first experience working with an ontology.
We've already created the ontology (we have RDF- and OWL-formatted files, which we keep in Eclipse).
Separately, we've created the web application. My question is how to connect the web page with the OWL/RDF-formatted data; more precisely, how to pass input from the web page to the dataset and get output back onto the page.
I've found some info (on old forums) that EasyRdf can be used embedded in a PHP script, but it's not clear to me.
Based on YouTube tutorials, I've downloaded Jena Fuseki, but I don't know what the next step is.
I would be glad to get any advice or suggestion :)
In my view, there is no single way to do this.
I usually set up a back-end application to pre-process this kind of information (build SPARQL queries, execute them, and parse the results) and then return it to the front end in a form that side can understand.
So, you could have all your data in RDF format stored, for example, in a TDB dataset exposed by Fuseki, and interact with that data through a back-end that consumes, updates, and parses the results found there.
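For instance, a minimal sketch of that back-end piece using Apache Jena's ARQ API against a Fuseki SPARQL endpoint (the endpoint URL and query are placeholders):

import org.apache.jena.query.Query;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;

public class FusekiQueryExample {
    public static void main(String[] args) {
        // Placeholder endpoint; point it at your Fuseki dataset's query service.
        String endpoint = "http://localhost:3030/mydataset/sparql";
        Query query = QueryFactory.create(
                "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10");

        // Send the SELECT query to the Fuseki SPARQL endpoint and iterate the bindings.
        try (QueryExecution qexec = QueryExecutionFactory.sparqlService(endpoint, query)) {
            ResultSet results = qexec.execSelect();
            while (results.hasNext()) {
                QuerySolution row = results.nextSolution();
                System.out.println(row.get("s") + " " + row.get("p") + " " + row.get("o"));
            }
        }
    }
}

The back-end can then serialize those bindings to JSON for the web page, and updates can be sent to the dataset's SPARQL Update endpoint in the same way.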
That's my advice; I hope it is useful for you.
Good luck!

How to extract data from web api with Talend Open Studio

How can I extract data using Talend from websites such as the ones below to do some data analysis:
Airbnb
change.org
monster.com
eBay
I am new to TOS and not familiar with the internet components. I think I may be confused regarding which connectors to use (tREST, tSOAP, ...). If anyone could help me understand which kind of connectors are needed, that would be great.
You can use the following architecture:
tREST --> tExtractJSONFields or tExtractXMLField, depending on whether the API returns JSON or XML.

How do I query LDAP in GWT?

I am fairly new to GWT and have built an application to manage working times. So far it has been convenient to register/create the users in the datastore manually, which made testing a little easier. Now, to verify a login and automatically create the users, I want to retrieve the usernames and the passwords by querying an LDAP server.
My problem is that I was not able to find any examples, tutorials, or forum entries that helped me complete this task. So far my only hope of success was an example using the javax.naming package, which turned out not to be supported by GWT on the client side.
My question now is: is a database connection using LDAP supported by GWT at all? And if it is, how do I retrieve the data I need?
Thank you in advance!
You will need to construct a remote service that the GWT client talks to and that in turn communicates with your back ends. There are several frameworks to choose from, depending on your needs. See the Communications DevGuide and the RequestFactory DevGuide for introductory material.
GWT client <-- XHR request --> Remote Service -+--> LDAP
                                               |
                                               +--> Other Database(s)
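As a rough sketch of what the server side of that remote service might do, the LDAP lookup itself can use plain JNDI (javax.naming), which works fine in server-side Java even though GWT cannot emulate it in the browser; the host, base DN, and bind credentials below are placeholders:

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;

public class LdapUserLookup {

    // Host, base DN, and bind credentials are placeholders for your directory.
    private static final String LDAP_URL = "ldap://ldap.example.com:389";
    private static final String BASE_DN = "ou=people,dc=example,dc=com";

    public static boolean userExists(String username) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, LDAP_URL);
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "cn=reader,dc=example,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret");

        DirContext ctx = new InitialDirContext(env);
        try {
            // Search the subtree for an entry whose uid matches the given username.
            SearchControls controls = new SearchControls();
            controls.setSearchScope(SearchControls.SUBTREE_SCOPE);
            NamingEnumeration<SearchResult> results =
                    ctx.search(BASE_DN, "(uid=" + username + ")", controls);
            return results.hasMore();
        } finally {
            ctx.close();
        }
    }
}

The GWT client only ever calls the remote service. To verify a password, the usual approach is to attempt an LDAP bind as the user's DN rather than reading a password attribute.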