Formio connecting to other databases other than mongodb - formio

Does anyone know if we can configure a database other than the default
MongoDB database that ships with formio? I found "formio-sql", and it seems to act as a connector to our own database, where we have to create and maintain our own database tables and configure them with actions. What I am actually looking for is whether formio can use a different database, such as MySQL, to store the submitted data natively.

As of today, it is not possible to use databases other than MongoDB to store submitted data natively.
The only option is the 'SQL Connector' action via 'formio-sql', which you seem to have found already.

Related

Does Laravel support MongoDB by default, or any NoSQL database?

My company runs a large Lumen 5.1 project on MySQL. They want to add analytics to it, and they want that part to use MongoDB.
Is it possible to use MongoDB without any third-party libraries? I was going to use
https://github.com/jenssegers/laravel-mongodb
but the tech lead thinks Laravel supports MongoDB by default. I'm asking this question to check whether that's true or not.
Edit:
If MongoDB isn't an option, does Laravel support any other NoSQL by default?
Laravel does not support MongoDB by default.
You would need to use one of several available third-party packages. I like moloquent because it maps MongoDB collections to Laravel models just like Eloquent.
You can use Redis if you need a natively supported NoSQL store, or you could consider Elasticsearch (not supported natively) if you are going to store a lot of metadata and then analyse it. Tools like Kibana and Logstash might be very helpful.

Real-time sync between an Oracle db (source) and MongoDB (destination)

Is it possible to have a real-time sync between a heavily used Oracle database and MongoDB? Has anyone tried this?
I saw a site - Keep MongoDB and Oracle in sync.
They mention putting triggers on the Oracle tables; my concern is whether this will slow down the applications already running on the Oracle database. Will this replication cause the applications to slow down or degrade the Oracle database's performance?
The right solution would involve Change Data Capture (CDC) from Oracle. CDC does not require triggers on Oracle tables and thus won't affect performance. There are several tools you can use, such as Striim and Attunity; Striim supports change data capture from Oracle and writing to MongoDB.
https://striim.com
https://attunity.com
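If a commercial CDC tool isn't an option, a simpler (if less efficient) alternative is trigger-free, timestamp-based polling. The sketch below is a minimal illustration of that idea, not anything Striim or Attunity actually does; the table and column names (`app_events`, `updated_at`, `EVENT_ID`) are placeholders you would replace with your own schema, and the cursor/collection objects are assumed to come from the `oracledb` and `pymongo` drivers.

```python
def row_to_doc(columns, row, pk="EVENT_ID"):
    """Map an Oracle row tuple to a MongoDB document, using the
    primary-key column as the document _id so re-polls upsert cleanly."""
    doc = dict(zip(columns, row))
    doc["_id"] = doc.pop(pk)
    return doc

def sync_once(ora_cursor, mongo_coll, since):
    """Copy rows modified after `since` into MongoDB; returns the new
    high-water mark so the next poll only fetches fresh changes."""
    ora_cursor.execute(
        "SELECT * FROM app_events WHERE updated_at > :since "
        "ORDER BY updated_at",
        since=since,
    )
    columns = [d[0] for d in ora_cursor.description]
    latest = since
    for row in ora_cursor:
        doc = row_to_doc(columns, row)
        # replace_one with upsert=True makes the copy idempotent
        mongo_coll.replace_one({"_id": doc["_id"]}, doc, upsert=True)
        latest = max(latest, doc["UPDATED_AT"])
    return latest
```

Note the trade-off: polling adds repeated query load on Oracle proportional to the poll frequency, whereas log-based CDC tools read the redo log and leave the source tables untouched.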

I need to mirror a Postgres DB Schema into a React Native Realm Database

I have created a backend for a mobile app. The database uses Postgres with fairly complex relationships.
Is there a way to recreate the database in Realm? I saw that there was an (enterprise) real-time sync tool that links Realm to Postgres instances, but I'm unsure how to mirror the database in the first place. Do I simply write a schema, step-by-step, on the mobile client to match the Postgres database? The complex relationships involved would make that file very complicated to write.
The Enterprise Edition of Realm Platform contains a PostgreSQL data connector that can perform real-time synchronization between Postgres and Realm Platform, including creating the schema and loading the initial data.

How do I populate a Google Bigtable instance with data using an external URL?

I have a Google Bigtable instance that needs to be populated with data that is in a Postgres database. My product team gave me a URL that allows me to replicate the database. In simple terms, I need to duplicate the Postgres database into the Google instance, and the way my product team gave me to do it is via this URL. How can I do this? Is there any tutorial that can help me?
If you are already running PostgreSQL and would like to have a mirror of it on Google Cloud Platform, the best and simplest approach may be to run your own PostgreSQL instance on a Google Compute Engine virtual machine, which can be done via several approaches, e.g.,
a tutorial for launching PostgreSQL, or
a click-to-deploy solution for PostgreSQL by Bitnami.
Then, you would want to continuously mirror data from your local instance to the PostgreSQL instance running in Google Cloud to be able to query it. Another SO answer suggests that there are two major approaches to this:
Master/Master replication (Bucardo)
Master/Slave replication (Slony)
Based on your use case, where you want to keep your local PostgreSQL instance as the canonical one and replicate to Google Cloud only for querying, you want Master/Slave replication with the Google Cloud instance as the read-only replica, so you probably want to use the Slony approach.
For a more in-depth look at PostgreSQL solutions for high availability, load balancing, and replication, see the comparison in the manual.

Backend stored procedure schedulers in MongoDB database

Currently we are using the Apache Flume framework to collect the logs from all our web applications, and it stores the logs in a MongoDB collection called "Raw_Data".
Now we have been asked to separate the logs and store the logging information in different collections based on the application, so we decided to maintain a separate collection for each application. Is there any way to create and run stored JavaScript on a schedule to load the data application-wise in MongoDB?
I'm very familiar with Oracle DB, but I'm new to MongoDB. Can someone help me find out what is required for this?
Thanks
Kishore
MongoDB can store JavaScript functions on the server, but it has no built-in scheduler.
http://docs.mongodb.org/manual/tutorial/store-javascript-function-on-server/
You could trigger this from a client machine using a cron job and the mongo shell. However, I strongly recommend simply running a script in Python or a similar client-side scripting language. It will be far easier to version-control and debug than server-side JavaScript.
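To make the client-side approach concrete, here is a minimal sketch of a Python script (run periodically from cron, connected via `pymongo`) that reads documents from `Raw_Data` and copies each into a per-application collection. The field name `app_name` is an assumption; substitute whatever key in your log documents identifies the source application.

```python
import re

def target_collection(doc, field="app_name", default="unknown_app"):
    """Derive a per-application collection name from a log document.
    `field` ("app_name") is an assumed key; adjust to your log schema."""
    name = str(doc.get(field, default)).lower()
    # Normalize: collection names shouldn't contain '$', NUL, etc.
    return "logs_" + re.sub(r"[^a-z0-9_]", "_", name)

def route_new_logs(db, last_id=None):
    """Copy documents newer than last_id from Raw_Data into per-app
    collections; returns the highest _id seen so the next cron run
    can resume where this one stopped."""
    query = {"_id": {"$gt": last_id}} if last_id else {}
    for doc in db["Raw_Data"].find(query).sort("_id", 1):
        # replace_one with upsert=True keeps re-runs idempotent
        db[target_collection(doc)].replace_one(
            {"_id": doc["_id"]}, doc, upsert=True
        )
        last_id = doc["_id"]
    return last_id
```

Persist the returned `last_id` (in a small bookkeeping collection, say) between cron runs so each run processes only new log entries. Because ObjectIds embed a timestamp and sort in insertion order, this works as a simple incremental cursor.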