Create/Retrieve/Update/Delete using Java and Rexster - titan

I want to manage a graph (create, delete and update vertices and edges) remotely using Java.
I have all my DAO layer implemented, using Blueprints and a TitanGraph object to access and manipulate graph information.
I thought that installing Rexster and replacing the TitanGraph factory with a remote connection would be enough, but after installing Rexster it seems that it is not possible to create a TitanGraph object connected to Rexster.
The documentation describes two options for working with Rexster:
Using RexsterClient: it is only possible to use Gremlin, so it is not possible to create/delete/update information. (It is possible to create/delete/update by executing Groovy scripts, as stephen-mallette pointed out.)
Using the Rexster HTTP REST API: it is possible, but I would need to rewrite my whole DAO layer.
My question is: how do I create/update/delete vertices/edges using Java and a remote Rexster server? Where can I find examples?
Regards and Thanks.

I'm not sure what you mean by this:
Using RexsterClient: it is only possible to use Gremlin, so it is not possible to create/delete/update information.
RexsterClient issues Gremlin scripts to Rexster. While we typically think of Gremlin as a traversal language for querying the graph, it is perfectly capable of mutating a graph as well. The following is perfectly valid Gremlin:
v = g.addVertex()
v.setProperty("name","bill")
So, in that sense, you could certainly issue remote Gremlin from RexsterClient that modifies the graph. However, in your case, you already have DAO code. Why not re-use that code and simply host it in Rexster?
Copy the jar containing your DAO code to Rexster's /ext directory
Modify rexster.xml to include your package imports from your jar for the classes you want to have available to you: <imports>com.myco.dao.*</imports>
Start Rexster
At that point, when you use RexsterClient, you should be able to access those DAO classes in the Gremlin scripts that you send.
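For example, continuing the RexsterClient sketch above (PersonDao and createPerson are hypothetical stand-ins for your own DAO classes), the script you send could simply delegate to the hosted code:

// PersonDao comes from the jar dropped into /ext and imported via rexster.xml;
// 'g' is the server-side graph variable Rexster binds for the configured graph
Map<String, Object> params = new HashMap<String, Object>();
params.put("n", "bill");
List<Object> created = client.execute("new PersonDao(g).createPerson(n)", params);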

Related

Generated classes

I am trying to sync data via GraphQL from my backend. For that I use the artemis package to generate my classes. However, I then want to cache the data locally, and for that I use the sqfentity_gen package to generate classes to save my data via SQL. I can use a json constructor with each framework to convert the data between them. However, I want to encapsulate certain functionality, since I don't want to just save changed data locally but also sync it to the backend and handle certain errors like merge conflicts or missing network. So I am thinking about wrapping the generated classes with another one, since I can't change the generated code. Is this a good idea, or are there other solutions which work better? Would you use a completely different setup? Glad for any suggestions.
Instead of generating and/or re-generating (updating) classes based on db tables (I assume), you can use an out-of-the-box solution, for example NReco.GraphQL.
It allows you to define the db schema in a json file and just set the db connection. If you have a lot of tables, you can generate that json file from the metatable info.
Storing and updating generated classes is, from my point of view, useless.

Microsoft Dynamics predefined query

We are writing a web service that needs to query a somewhat complex dataset from MS Dynamics. By "somewhat complex", I mean it needs to query multiple levels of the entity hierarchy. The Dynamics OData implementation limits the depth of OData queries to two (i.e. you can query a parent and a child, but not that child's child).
One way of moving forward would be to walk the entity hierarchy by making thousands of OData calls, but this is not at all appealing (it kind of defeats the entire purpose of server-side databases).
Another wrinkle is that we need to interact with many different Dynamics systems, none of which will allow us to load a custom .NET plugin. It probably doesn't matter, but we are writing our client-side code in Java, and really do need to use regular REST type calls.
I was thinking that if there were a way to predefine a data report in Dynamics that hits the complex query, that we could then maybe query that report via the REST interface. So far, I haven't found a way to do this, and I thought I'd toss this out to the community to see if anyone has any suggestions.
You can create a Query object in NAV and publish it as a web service.
If the Query object is not available in your NAV version (< 2013), you can create a codeunit, generate the dataset there, and publish the codeunit as a web service.
The third option, if NAV is running on SQL Server, is to create a view/temp table/stored procedure there to build the dataset, or to use SQL Server Analysis Services.
Cheers!
We wound up dropping the REST API entirely and using SOAP instead. From there, we can do FetchXML and get what we need.
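For reference, FetchXML can join across more than two levels of the hierarchy in a single request via nested link-entity elements, which is exactly what the OData endpoint would not allow; the entity, attribute and relationship names below are purely illustrative:

<fetch>
  <entity name="account">
    <attribute name="name" />
    <link-entity name="contact" from="parentcustomerid" to="accountid" alias="c">
      <attribute name="fullname" />
      <!-- third level in the same query -->
      <link-entity name="opportunity" from="parentcontactid" to="contactid" alias="o">
        <attribute name="name" />
      </link-entity>
    </link-entity>
  </entity>
</fetch>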

Telosys : How can i get database table records in template?

I am using Telosys Tools for code generation. It's a very nice tool and has helped me a lot.
But there is one problem: it provides database schema information that I can access in the templates (the templates are Velocity templates), which is good, but how can I get the selected entity's data from the database? I can't find any way to get the data of the selected table.
Please provide a solution if there is one, or suggest an alternative way to do it.
Thanking you!
Telosys Tools is designed to retrieve the model from the database,
not the data stored in the tables.
But it allows you to create your own specific tooling classes usable
in the templates, so it's possible to create a specific Java class that retrieves the data from the database.
There's an example of this kind of specific class in the "database-doc" bundle:
https://github.com/telosys-tools/database-doc-bundle-TT210 (in the classes folder)
To simplify loading, the simplest way is to create the class in the "default package" (no Java package).
NB:
The problem is that the jar containing the JDBC driver
is not accessible to the generator's class-loader, so you will have to use a specific class-loader and connect directly through the JDBC driver.
Here is an example: https://gist.github.com/l-gu/ed0c8726807e5e8dd83a
Don't use it as is (the connection is never closed), but it can easily be adapted.
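As a rough illustration of the idea (the driver class, URL and paths are just examples, and this is not the exact code from the gist), such a tooling class could look like this:

import java.net.URL;
import java.net.URLClassLoader;
import java.sql.Connection;
import java.sql.Driver;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

// Example tooling class in the default package, usable from a template,
// e.g. $dbdata.getRows("CUSTOMER")
public class DbData {

    public List<Map<String, Object>> getRows(String table) throws Exception {
        // load the JDBC driver with a dedicated class-loader, since the driver jar
        // is not visible to the generator's own class-loader
        URLClassLoader loader = new URLClassLoader(
                new URL[] { new URL("file:///path/to/jdbc-driver.jar") }); // illustrative path
        Driver driver = (Driver) Class.forName("org.h2.Driver", true, loader).newInstance();

        Properties props = new Properties();
        props.put("user", "sa");
        props.put("password", "");

        // connect directly through the Driver: DriverManager would refuse a driver
        // loaded by a foreign class-loader
        Connection con = driver.connect("jdbc:h2:mem:demo", props);
        try {
            Statement stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT * FROM " + table);
            ResultSetMetaData meta = rs.getMetaData();
            List<Map<String, Object>> rows = new ArrayList<Map<String, Object>>();
            while (rs.next()) {
                Map<String, Object> row = new LinkedHashMap<String, Object>();
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    row.put(meta.getColumnName(i), rs.getObject(i));
                }
                rows.add(row);
            }
            return rows;
        } finally {
            con.close(); // unlike the gist, the connection is closed here
        }
    }
}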

Generating DAO with Hibernate Tools on Eclipse

How can I use the tools to generate DAOs?
In fact, instead of going through the hbm files, I need to configure Hibernate Tools to generate the DAOs and the annotations.
See Hibernate Tools - DAO generation and How generate DAO with Hibernate Tools in Eclipse?
First, let me assume that by DAO you mean POJO/entity beans. Basically, you can accomplish your task through either forward or reverse engineering. For forward engineering, you can probably look into the AndroMDA tool. If you wish to accomplish it through reverse engineering, click here.
Hope this will be helpful.
Welcome. You have to write all your data access logic by hand (if I'm not wrong). Hibernate lets you interact with the database in three ways.
Native SQL, which is nothing but a plain SQL query. This is used very rarely in Hibernate projects, even though it is faster than the options mentioned below. The reason is simple: one of the key advantages of Hibernate (or any other popular ORM framework) over JDBC is that you can get rid of database-specific queries in your application code.
HQL stands for Hibernate Query Language, which is Hibernate's own query language. It looks similar to native SQL, but the key difference is that the class name is used instead of the table name and property names are used instead of column names. This is a more object-oriented approach. Some interesting things happen in the background; check them out if you are keen!
The Criteria API is a more object-oriented and elegant alternative to Hibernate Query Language (HQL). It is always a good solution for an application which has many optional search criteria (see the sketch below).
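For illustration only (Person is a hypothetical mapped entity, not something generated for you by that name), here is the same lookup written in HQL and with the classic Criteria API:

import java.util.List;

import org.hibernate.Session;
import org.hibernate.criterion.Restrictions;

public class QueryStyles {

    // HQL: class and property names instead of table and column names
    @SuppressWarnings("unchecked")
    public List<Person> findByNameHql(Session session, String name) {
        return session.createQuery("from Person p where p.name = :name")
                .setParameter("name", name)
                .list();
    }

    // Criteria API: the query is assembled from objects, which is handy
    // when optional search criteria are combined at runtime
    @SuppressWarnings("unchecked")
    public List<Person> findByNameCriteria(Session session, String name) {
        return session.createCriteria(Person.class)
                .add(Restrictions.eq("name", name))
                .list();
    }
}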
You can find lots of examples on the internet. Please post your specific requirements for further clarification of your problem.
Cheers!

TG2.1: Proper location to store a database session instance?

I am using a custom database (MongoDB) with TG 2.1, and I am wondering where the proper place to store the PyMongo connection/database instances would be.
E.g., at the moment they are getting created inside my inherited instance of AppConfig. Is there a standard location to store this? Would putting the variables into project.model.__init__ be the best location, given that under SQLAlchemy the database seems to commonly be retrieved via:
from project.model import DBSession, metadata
Anyway, just curious what the best practice is.
As of TurboGears 2.1.3, MongoDB support is integrated via the Ming ORM. I would look at a quickstarted project using the --ming option to get best practices if you want to do some customization: http://turbogears.org/2.1/docs/main/Ming.html