Using OrientDB in production with JDBC

I am planning to use OrientDB in production through the JDBC driver, so I need to confirm some points:
Can the JDBC driver give access to all the OrientDB features (transactions, links, etc.), or is using the Java API the better choice?
I also noticed that there is a Spring Data implementation on the OrientDB GitHub. Is it ready to use in production?

At this link there is a discussion of the issue you raised.
In general, the JDBC driver supports only a subset of OrientDB: only the part you can drive with commands.
If you're a Java developer, I suggest you use the Java Graph API: http://orientdb.com/docs/last/Graph-Database-Tinkerpop.html
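For example, a minimal sketch of that Graph API (TinkerPop 2.x style, as documented at the link above); the connection URL, database name, and credentials are placeholders:

import com.tinkerpop.blueprints.Vertex;
import com.tinkerpop.blueprints.impls.orient.OrientGraph;
import com.tinkerpop.blueprints.impls.orient.OrientGraphFactory;

public class GraphApiSketch {
    public static void main(String[] args) {
        // Placeholder URL and credentials -- point these at your own server/database.
        OrientGraphFactory factory =
                new OrientGraphFactory("remote:localhost/mydb", "admin", "admin");
        OrientGraph graph = factory.getTx(); // transactional graph instance
        try {
            // Vertices, edges (links) and transactions are all first-class here.
            Vertex alice = graph.addVertex("class:Person");
            alice.setProperty("name", "Alice");
            Vertex bob = graph.addVertex("class:Person");
            bob.setProperty("name", "Bob");
            graph.addEdge(null, alice, bob, "Knows");
            graph.commit();
        } catch (RuntimeException e) {
            graph.rollback();
            throw e;
        } finally {
            graph.shutdown();
            factory.close();
        }
    }
}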

Related

Automatically pulling REST API data to visualize it in Apache Superset

I work in a large enterprise and have a project to build some custom automated dashboards for our IT department; the small amount of data needs to be fetched only from REST API endpoints. The process needs to be fully automated, and there is not enough time to build a custom API wrapper. For this I was going to use the Apache Airflow + Apache Superset tools. I have been googling for a couple of days for an open-source solution that is simpler than Apache Airflow for moving data from the REST API endpoints into Superset for visualization. Please share your experience: what would you choose instead of Apache Airflow?
I chose to go with the following solution:
Apache Airflow + PostgreSQL + Grafana (instead of Superset, because in Grafana you can actually create a drill-down option using a workaround).

Apache Ignite or any other in-memory cache for Postgres

I am looking for a caching layer that sits on top of Postgres, like Redis. If we change anything in memory, it should get written back to Postgres, ideally with open-source, out-of-the-box integration between the in-memory store and Postgres. I believe we can do this with Apache Ignite; can you please point me to how to do it, or to any other in-memory solution that works with Postgres?
Also, what is the difference between GridGain and Apache Ignite?
Yes, the desired behavior can be achieved with Apache Ignite: you need to use its CacheStore integration for 3rd-party databases. You can even generate an Apache Ignite configuration that handles the mapping by parsing your existing schema through a JDBC connection. As for the second question, GridGain Community Edition is a source-available fork of Apache Ignite maintained by GridGain Systems (the original author of Apache Ignite).
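A minimal sketch of what the read-through/write-through wiring can look like in Java; the cache name, table, POJO, and PostgreSQL connection details are all placeholders (in practice the type mappings are usually generated from the existing schema rather than written by hand):

import java.sql.Types;
import javax.cache.configuration.Factory;
import javax.sql.DataSource;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.store.jdbc.CacheJdbcPojoStoreFactory;
import org.apache.ignite.cache.store.jdbc.JdbcType;
import org.apache.ignite.cache.store.jdbc.JdbcTypeField;
import org.apache.ignite.cache.store.jdbc.dialect.BasicJdbcDialect;
import org.apache.ignite.configuration.CacheConfiguration;
import org.postgresql.ds.PGSimpleDataSource;

public class IgnitePostgresCacheStoreSketch {

    public static void main(String[] args) {
        // Mapping between the cache and a hypothetical public.person table in Postgres.
        JdbcType personType = new JdbcType();
        personType.setCacheName("personCache");
        personType.setDatabaseSchema("public");
        personType.setDatabaseTable("person");
        personType.setKeyType(Long.class);
        personType.setValueType(Person.class);
        personType.setKeyFields(new JdbcTypeField(Types.BIGINT, "id", Long.class, "id"));
        personType.setValueFields(new JdbcTypeField(Types.VARCHAR, "name", String.class, "name"));

        CacheJdbcPojoStoreFactory<Long, Person> storeFactory = new CacheJdbcPojoStoreFactory<>();
        storeFactory.setDialect(new BasicJdbcDialect());
        storeFactory.setTypes(personType);
        storeFactory.setDataSourceFactory(new PgDataSourceFactory());

        CacheConfiguration<Long, Person> cacheCfg = new CacheConfiguration<>("personCache");
        cacheCfg.setCacheStoreFactory(storeFactory);
        cacheCfg.setReadThrough(true);   // cache misses are loaded from Postgres
        cacheCfg.setWriteThrough(true);  // cache updates are written back to Postgres

        try (Ignite ignite = Ignition.start()) {
            IgniteCache<Long, Person> cache = ignite.getOrCreateCache(cacheCfg);
            cache.put(1L, new Person(1L, "Alice")); // also inserts/updates the person row
            System.out.println(cache.get(1L).getName());
        }
    }

    /** Data source factory; a named class keeps the configuration serializable. */
    public static class PgDataSourceFactory implements Factory<DataSource> {
        @Override public DataSource create() {
            PGSimpleDataSource ds = new PGSimpleDataSource(); // placeholder connection details
            ds.setServerName("localhost");
            ds.setPortNumber(5432);
            ds.setDatabaseName("mydb");
            ds.setUser("postgres");
            ds.setPassword("postgres");
            return ds;
        }
    }

    /** Simple POJO mirroring the hypothetical person table. */
    public static class Person {
        private Long id;
        private String name;
        public Person() { }
        public Person(Long id, String name) { this.id = id; this.name = name; }
        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }
}

If synchronous writes to Postgres become a bottleneck, write-behind batching can be enabled on the same CacheConfiguration (setWriteBehindEnabled(true)).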

How to set up Apache Sling to use a relational DB

I am on Sling 11, which uses Jackrabbit Oak as its content repository. I was wondering how to set up Sling to store the JCR repository on an RDBMS (DB2, to be specific).
I found this link on Jackrabbit persistence, but it looks like it does not apply to Oak, and the Oak documentation is mostly about MongoDB.
I also found an implementation of a Cassandra Resource Provider, although that seems designed to access specific paths mapped to Cassandra without using Oak.
Thanks,
Answering here, but credit goes to the Sling users mailing list:
Package the DB driver in an OSGi bundle
Download Sling's starter project
In boot.txt, add a new run mode (in my case, oak_db2):
[settings]
sling.run.mode.install.options=oak_tar,oak_mongo,oak_db2
Download Sling's datasource project and compile it.
In oak.txt, configure the run mode (this will load the bundles for you in Felix):
[artifacts startLevel=15 runModes=oak_db2]
com.h2database/h2-mvstore/1.4.196
com.ibm.db2/jcc4/11.1
org.apache.sling/org.apache.sling.datasource/1.0.3-SNAPSHOT
And set up the services that will manage persistence:
[configurations runModes=oak_db2]
org.apache.jackrabbit.oak.plugins.document.DocumentNodeStoreService
documentStoreType="RDB"
org.apache.sling.datasource.DataSourceFactory
url="jdbc:db2://10.1.2.3:50000/sling"
driverClassName="com.ibm.db2.jcc.DB2Driver"
username="****"
password="****"
datasource.name="oak"
Create a database named 'sling'.
Run with: java -jar -Dsling.run.modes=oak_db2 sling-starter.jar
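For reference, the RDB-backed DocumentNodeStore that the DocumentNodeStoreService assembles through this OSGi configuration can also be built programmatically with Oak's Java API outside of Sling. A minimal sketch (Oak 1.x class names; the DB2 URL and the masked credentials mirror the placeholders above):

import javax.jcr.Repository;
import javax.jcr.Session;
import javax.jcr.SimpleCredentials;
import javax.sql.DataSource;

import org.apache.jackrabbit.oak.Oak;
import org.apache.jackrabbit.oak.jcr.Jcr;
import org.apache.jackrabbit.oak.plugins.document.DocumentMK;
import org.apache.jackrabbit.oak.plugins.document.DocumentNodeStore;
import org.apache.jackrabbit.oak.plugins.document.rdb.RDBDataSourceFactory;

public class OakRdbSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder DB2 connection details -- same values as the DataSourceFactory config above.
        DataSource ds = RDBDataSourceFactory.forJdbcUrl(
                "jdbc:db2://10.1.2.3:50000/sling", "****", "****");

        // DocumentNodeStore persisted in the relational database (RDB document store).
        DocumentNodeStore ns = new DocumentMK.Builder()
                .setRDBConnection(ds)
                .getNodeStore();
        try {
            Repository repo = new Jcr(new Oak(ns)).createRepository();
            Session session = repo.login(new SimpleCredentials("admin", "admin".toCharArray()));
            System.out.println("Connected, root node: " + session.getRootNode().getPath());
            session.logout();
        } finally {
            ns.dispose();
        }
    }
}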

Is there a PipelineDB package for Laravel or Native PHP?

I was asking whether there is a PipelineDB package for Laravel or native PHP, so I can use it in my current project.
PipelineDB actually does not have its own special client libraries but instead maintains compatibility with all PostgreSQL clients. Any client that works with PostgreSQL will seamlessly work with PipelineDB, so you're free to use the PHP/PostgreSQL client of your choice.
Please see the clients section of the PipelineDB docs for more information.
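The same compatibility holds for clients in any language, not just PHP. As an illustration (this is not a PipelineDB-specific API, just the stock PostgreSQL JDBC driver; the connection details and the page_views continuous view are hypothetical):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PipelineDbClientSketch {
    public static void main(String[] args) throws Exception {
        // PipelineDB speaks the PostgreSQL wire protocol, so the regular
        // PostgreSQL driver connects unchanged (placeholder host/port/db/credentials).
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/pipeline", "pipeline", "secret");
             Statement stmt = conn.createStatement();
             // A hypothetical continuous view, queried like any ordinary table or view.
             ResultSet rs = stmt.executeQuery("SELECT url, count FROM page_views")) {
            while (rs.next()) {
                System.out.println(rs.getString("url") + " -> " + rs.getLong("count"));
            }
        }
    }
}

A PHP client such as PDO with the pgsql driver would issue exactly the same SQL over the same connection parameters.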

Can Gatling load test a SQL database or MongoDB?

I need to run load tests against my persistent storage, Postgres and MongoDB.
I'm wondering whether that is possible with Gatling, because as far as I can see, Gatling supports only HTTP and JMS out of the box.
Unfortunately, no. It was under discussion some time ago; however, the functionality didn't appear to have high enough demand from the community, so it was discarded.
You can use Apache JMeter for database testing: it comes with a MongoDB Script sampler for Mongo and a JDBC Request sampler for any other database that supports the JDBC protocol.
New JMeter tests and existing Gatling tests can be combined into a single test harness via, for example, the Taurus tool, which supports JMeter, Gatling, and a few other underlying load testing tools.