Connecting multiple databases (instances) in Openbravo - PostgreSQL

I'm new to Openbravo development and I have a scenario where we have two Openbravo instances, say OB1 and OB2. I need to write a DAL process that runs on the OB1 instance and can connect to the OB2 instance, retrieve data from OB2, and insert/update data in OB1. So I need to manage two connections in my process.
I have no clue how to go ahead with this. It would be helpful if folks who have worked on Openbravo development could share their ideas and suggestions.
Regards,
Raghu

I think you can fulfil your requirement using web services. Openbravo provides a web service API that you can use to create endpoints for data exchange through RESTful web services on both instances.
Say OB1 is the receiver and OB2 is the producer. You need to create an endpoint on OB2 (which is basically a URI) that takes some input parameters through GET/POST, queries the OB2 database, and sends the result in JSON format to OB1.
Take a look at this, it might help you with the Openbravo REST API - Openbravo Rest
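For illustration, here is a minimal Java sketch of the OB1 side of such a process. It assumes OB2 exposes a JSON REST endpoint (the URL, entity name, and credentials below are placeholders to adapt); the records fetched this way would then be written into OB1 through the local DAL:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Ob2Fetcher {

    // Hypothetical endpoint on the OB2 instance; adjust host, port and entity.
    private static final String OB2_URL =
        "http://ob2.example.com/openbravo/org.openbravo.service.json.jsonrest/Product";

    public static String fetchFromOb2(String user, String password) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(OB2_URL).openConnection();
        String auth = Base64.getEncoder()
            .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setRequestProperty("Accept", "application/json");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            // Parse the JSON here and write the records into OB1 with the
            // local DAL (e.g. OBDal.getInstance().save(...)) inside your process.
            return body.toString();
        }
    }
}
```

This keeps the OB1 process itself single-connection: it only talks to its own database through the DAL, and reaches OB2 over HTTP instead of managing a second JDBC connection.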

Data synchronization between primary and redundant servers

I want to synchronize data among a set of REST API servers (a Spring Boot based API cluster) periodically. Any instance in the cluster should be able to broadcast new information to all the others.
I don't want to use a DB here. I am trying to find a lightweight library that can be used inside the API for this. Is it possible to use Atomix/Hazelcast/ZooKeeper for this purpose? If so, it would be really helpful if someone could post some sample code.
My thanks in advance.
In Hazelcast you can do it through WAN Replication. It is an enterprise feature, though, so you have to buy a license.
Hazelcast can be used for this use case. Each of the REST instances will create an embedded Hazelcast member within its JVM. The Hazelcast members then discover each other and form the cluster. Your REST apps will use the IMap or ReplicatedMap service - a distributed key-value store (IMap can store more data, ReplicatedMap is faster). Once you write data to the IMap, all other instances see it right away.
See the code sample here: https://docs.hazelcast.com/hazelcast/latest/getting-started/get-started-java.html#complete-code-samples
This feature and the Spring integration are open-source.
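A minimal embedded-member sketch, assuming Hazelcast 5.x on the classpath (the map name is arbitrary):

```java
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

public class ClusterCache {

    public static void main(String[] args) {
        // Each REST instance starts an embedded member; members on the same
        // network discover each other and form a cluster.
        HazelcastInstance hz = Hazelcast.newHazelcastInstance();

        // A distributed key-value map shared by every member of the cluster.
        IMap<String, String> shared = hz.getMap("shared-data");

        // A write on any instance is visible to all the others right away.
        shared.put("greeting", "hello from this node");
        System.out.println(shared.get("greeting"));

        // hz.shutdown() would leave the cluster gracefully when the node stops.
    }
}
```

Start two copies on the same network and, with the default discovery mechanism, they form one cluster and see each other's writes.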

PostgreSQL in Node-RED

I use Node-RED to create an API on a server. I want to read and send data via HTTP, using the browser-based programming method. I want to send data from a PostgreSQL database, so I installed the package node-red-contrib-postgres-multi. But I don't know how to put data into the database or how to read data from it, because I cannot find any examples.
Does anybody know how I can do that?
You can use the postgrestor package, where you can define a query or data-modification statement directly in the node window. It also allows parametrized queries via a template engine.
The problem I've encountered is working with more than one DB instance in the same flow, but if you only need a single DB connection it should work for you.

How to set up a REST service on Hadoop that returns its result as an HTTP response

Actually I'm using a big data environment where I ingest and process data with Apache NiFi. The results are saved in an HBase table, and I want to access the stored data with Hive. Now I want to set up a REST service to read from the HBase table. Example:
GET http://localhost:50111/userid/42 --> HTTP response: { "userid": 42, "name": "foobar" }
I thought this was a standard problem with an existing solution, but it isn't. The problem is that I cannot send the result as an HTTP response.
First I tried it with NiFi, and the web service worked, but only with static content such as "200 OK" or "404 Not Found". So I tried it without NiFi. Everything I read said that WebHCat is the tool to go with, because it is the REST API for Hive. Great! But... the same problem: I can execute queries over a REST service with WebHCat and Hive on HBase, but there is no option to retrieve the result.
With my current knowledge there is no ready-made solution and I have to develop my own REST service. Right? Really?!
Do I have to develop my own REST service, or what is the best practice in this case? NiFi? HiveServer2? Additionally, I want to secure the REST service with Knox and Ranger.
I hope someone can help me and show me the right(!) way to go, because I don't want to build something new and special if there is a better or best-practice solution.
thanks
~n3
#n-3 - You have a number of options here.
Have a look at http://hortonworks.com/blog/hbase-via-hive-part-1/ for basic HBase interaction with Hive. You are also probably better off looking at Apache Phoenix for this sort of use case: http://phoenix.apache.org/.
If you do go in the direction of WebHCat and/or HiveServer2, you can already proxy access to both through Apache Knox, with access control provided by Apache Ranger.
In addition, Apache HBase has a REST server of its own, which you can also access through Apache Knox and protect with Apache Ranger: http://hbase.apache.org/book.html#_rest
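For example, a minimal Java sketch of reading one row through the HBase REST server; it assumes the server runs on localhost:8080 and that the table and row key below exist (behind Knox you would point the URL at the gateway instead):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class HbaseRestClient {

    public static String getRow(String table, String rowKey) throws Exception {
        // GET /<table>/<row> on the HBase REST server; with the JSON Accept
        // header the response carries the row's cells (values base64-encoded).
        URL url = new URL("http://localhost:8080/" + table + "/" + rowKey);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Accept", "application/json");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            return body.toString();
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical table and row key, matching the userid/42 example above.
        System.out.println(getRow("users", "42"));
    }
}
```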
Hope this provides some help.

How to integrate Oracle APEX and Alfresco via CMIS

A question regarding the integration of the document management system Alfresco into Oracle Application Express (APEX), based on a CMIS repository:
The aim is to use APEX as the portal page, with Alfresco showing its results (document lists) based on search parameters coming from APEX.
A search result from a CMIS query should be displayed in an APEX page region.
Unfortunately I have no experience in this area (REST, CMIS), so any advice would be welcome!
A related question also arises regarding user authentication and authorization via CMIS.
Has anyone out there implemented something like this or used these components together yet?
The first thing that pops into my mind is the choice of where you want your communication with the repository to take place: client side or server side?
Alfresco supports Web Scripts, so it would be possible to create a JavaScript-heavy thick client which connects to your repository, gets information about your files, and redirects to their download links.
The alternative would be to design some way to connect to the repository from the database server. Again, there are many ways to do this. You can connect to the repository during your page load and use PL/SQL regions to fire scripts that connect to your repository, get the data you want, and render your region with that information.
Another way would be to periodically check the repository for changes and maintain a 'shadow copy' of the repository within your Oracle database tables.
Of course all of these solutions have their own drawbacks.
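Since the question mentions CMIS specifically: whichever side you put the communication on, a hedged sketch of a CMIS query with the Apache Chemistry OpenCMIS Java client could look like this (the Alfresco URL and credentials are placeholders); the result set is what you would serialize and render in your APEX region:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.chemistry.opencmis.client.api.ItemIterable;
import org.apache.chemistry.opencmis.client.api.QueryResult;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.client.api.SessionFactory;
import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
import org.apache.chemistry.opencmis.commons.SessionParameter;
import org.apache.chemistry.opencmis.commons.enums.BindingType;

public class AlfrescoCmisQuery {

    public static void main(String[] args) {
        Map<String, String> params = new HashMap<>();
        // Placeholder Alfresco host and credentials; adjust to your install.
        params.put(SessionParameter.ATOMPUB_URL,
            "http://alfresco.example.com/alfresco/api/-default-/public/cmis/versions/1.1/atom");
        params.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());
        params.put(SessionParameter.USER, "admin");
        params.put(SessionParameter.PASSWORD, "secret");

        SessionFactory factory = SessionFactoryImpl.newInstance();
        Session session = factory.getRepositories(params).get(0).createSession();

        // CMIS query for documents matching a search term coming from APEX.
        ItemIterable<QueryResult> results = session.query(
            "SELECT cmis:name, cmis:objectId FROM cmis:document "
                + "WHERE CONTAINS('invoice')", false);

        for (QueryResult r : results) {
            Object name = r.getPropertyValueByQueryName("cmis:name");
            System.out.println(name);
        }
    }
}
```

Authentication here is per-session, so the user/password pair (or a ticket) is also the natural place to hook in whatever authorization scheme you settle on.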

Using Local storage and REST adapter at the same time?

I'm pretty new to Ember, so for a start I have a noob question - is it possible to use the Local Storage and REST adapters at the same time?
For example, I want to do a login via the API; if the login succeeds, the server returns an API key which is used for later communication with the service. Is it possible to store that information locally on the client and retrieve it when necessary, while still using the REST adapter for the other models?
If this is not a good way to handle such a case, which one would you propose, and is there any kind of example that would lead me in the right direction?
Thanks to the people from #emberjs, I found out that there is a wonderful authentication framework for Ember.js called ember-auth, which does what I need.