I’m a student working on my final-year project, which is about data warehousing, BI, etc.
I’ve been asked to work with Apache Kylin.
I did some research on it and learned a bit.
I also looked into whether it is possible to use PostgreSQL as the data warehouse and have it communicate with Apache Kylin to build cubes,
but found nothing...
So would you please answer the following question:
Is it possible to make Apache Kylin communicate with a PostgreSQL DWH?
And if there is any documentation about it that I have missed, would you please share it?
Time is running out, and I really appreciate your answers and guidance.
Thanks in advance.
Khalil
It's doable. Kylin provides a data source adapter framework for JDBC data sources, and PostgreSQL can be one of those data sources; MySQL is supported by default. You can check this link to learn more: http://kylin.apache.org/development/datasource_sdk.html
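Before wiring PostgreSQL into Kylin's JDBC data source adapter, it can help to confirm that the PostgreSQL JDBC driver, URL, and credentials you plan to give Kylin actually work. Here is a minimal sketch in plain Java, assuming the PostgreSQL JDBC driver jar is on the classpath; the host, database, credentials, and table name are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CheckPostgresDwh {
    public static void main(String[] args) throws Exception {
        // Not strictly required with JDBC 4 auto-loading, but harmless.
        Class.forName("org.postgresql.Driver");

        // Placeholder URL and credentials -- use the same values you would
        // later give to Kylin's JDBC data source configuration.
        String url = "jdbc:postgresql://dwh-host:5432/dwh";
        try (Connection conn = DriverManager.getConnection(url, "kylin_user", "secret");
             Statement stmt = conn.createStatement();
             // Placeholder fact table just to prove the connection works.
             ResultSet rs = stmt.executeQuery("SELECT count(*) FROM sales_fact")) {
            if (rs.next()) {
                System.out.println("Fact table row count: " + rs.getLong(1));
            }
        }
    }
}
```

Once plain JDBC access works, the same URL and credentials are what you would point Kylin's JDBC data source at; the exact property names depend on your Kylin version, so follow the linked SDK page for that part.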
Related
I need to create a second Apache instance (same version) on Red Hat 7.9. The reason is that I want a second development environment where restarting Apache will not affect the other instance. I am using httpd24-httpd 2.4.34 from RHSCL and I am not able to find any related documentation.
Do you know whether multiple Apache instances are supported for httpd24-httpd RHSCL on RHEL 7.9, and whether there is any documentation I can follow?
Thank you in advance
It's a bit of a guess, to be honest, but what about installing the second instance in a container? See here: https://catalog.redhat.com/software/containers/rhscl/httpd-24-rhel7/57ea8d049c624c035f96f42e
For simplicity and cost, we are starting our project using local MySQL running on our GCE instances. We will want to switch to Cloud SQL some months down the road.
Any advice on avoiding MySQL version conflicts/challenges would be much appreciated!
The majority of the documentation is for MySQL 5.7, so my advice is to use that version and review the "Migrating to Cloud SQL" concept page. It is a guide that walks you through how to migrate safely, which migration methods exist, and how to prepare your MySQL database.
Another piece of advice is to work through the tutorial on migrating MySQL to Cloud SQL using the automated workflow. That guide also notes that any MySQL database running version 5.6 or 5.7 can take advantage of the Cloud SQL automated migration workflow, and it is useful for understanding how the workflow operates and how to deploy a source MySQL database on Compute Engine. The Cloud SQL documentation page offers more tutorials if you want to learn more.
Finally, I suggest you check the Cloud SQL pricing page to be aware of the billing, and also that you create a workspace; that way you can have more transparency and more control over your billing charges by identifying and tuning the services that are producing the most log entries.
I hope all the information I'm giving you is helpful.
Installing Pentaho Mondrian and using it with a custom web application.
I apologize in advance if my questions seem too broad, but there isn't much information about these topics on the internet, and I need at least some general guidance on the overall direction I should take.
Here's the deal, I have:
Some web app (Django/.NET/Nodejs, doesn't matter really)
Postgres database
Now I need an OLAP server that I can run my MDX queries against to get analytics. In this case, the Pentaho Mondrian OLAP server was chosen.
So I guess these steps need to be done:
Install and run Mondrian as a server
Connect Mondrian to Postgres
Create the XML cube schema file via Schema Workbench (which seems to be obsolete) or manually
Call Mondrian via some API from my web app
Am I missing something?
Questions:
How do I install and run Mondrian as a server? Unfortunately, I've only found confusing and obsolete information about it, even in the official documentation.
How do I pass MDX queries from my web app to Mondrian and get results back in some meaningful form? Is there any API on the Mondrian side?
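For what it's worth, one common way to talk to Mondrian programmatically is the olap4j API, with Mondrian embedded in a small JVM service that your web app then calls over HTTP. A minimal sketch, assuming the mondrian and olap4j jars are on the classpath; the JDBC URL, schema path, cube name, and MDX query are placeholders:

```java
import java.sql.DriverManager;
import org.olap4j.CellSet;
import org.olap4j.OlapConnection;
import org.olap4j.OlapStatement;
import org.olap4j.Position;

public class MondrianQuery {
    public static void main(String[] args) throws Exception {
        // Mondrian's olap4j driver wraps the JDBC connection to Postgres plus
        // the XML schema file created in Schema Workbench (paths are placeholders).
        Class.forName("mondrian.olap4j.MondrianOlap4jDriver");
        String url = "jdbc:mondrian:"
                + "Jdbc=jdbc:postgresql://localhost:5432/mydb;"
                + "JdbcUser=analytics;JdbcPassword=secret;"
                + "Catalog=file:/etc/mondrian/SalesSchema.xml";

        try (OlapConnection conn =
                 DriverManager.getConnection(url).unwrap(OlapConnection.class);
             OlapStatement stmt = conn.createStatement()) {
            // Placeholder MDX: a measure by year from a hypothetical Sales cube.
            CellSet cells = stmt.executeOlapQuery(
                "SELECT {[Measures].[Sales Amount]} ON COLUMNS, "
                + "[Time].[Year].Members ON ROWS FROM [Sales]");
            for (Position row : cells.getAxes().get(1).getPositions()) {
                System.out.println(row.getMembers().get(0).getName());
            }
        }
    }
}
```

Wrapping something like this in a small REST endpoint (a servlet, Spring controller, or similar) is one reasonable way to expose MDX results as JSON to a Django/.NET/Node front end, rather than calling Mondrian from the web app directly.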
Thank you.
Is it possible to build a big data application in the cloud with Red Hat's PaaS, OpenShift? I'm looking at how to build a Scala application in the cloud with Hadoop (HDFS), Spark, and Apache Mahout, but I can't find anything about it. I've seen something with Hortonworks, but nothing clear about how to install it in an OpenShift environment or how to add an HDFS node in the cloud either. Is it possible with OpenShift?
It's possible on Amazon, but my question is: is it possible on OpenShift?
It really depends on what you're ultimately trying to achieve. I know you mention building a big data application on OpenShift with Scala, but what will the application ultimately be doing?
I've gotten Hadoop running in a gear before, but if you want a better example, check out this quickstart to get an idea of how it's done: https://github.com/ryanj/flask-hbase-todos. I know it's not Scala, but here's a good article that shows how to put together a Scala app: https://www.openshift.com/blogs/building-distributed-and-event-driven-applications-in-java-or-scala-with-akka-on-openshift.
What will the application ultimately be doing?
Forecasting football match results for several football leagues: a web application (Ruby), plus statistical computation and data mining calculations in Scala using Apache frameworks (Spark & Mahout).
We get the data via CSV files, process it, and save it in a NoSQL database (Cassandra).
And all of this in the cloud (OpenShift); that's the idea.
I've seen the info at https://github.com/ryanj/flask-hbase-todos. I'll try it this way, but with Scala.
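For the CSV-to-Cassandra step specifically, here is a hedged sketch of what that pipeline can look like with a recent Spark version (shown in Java here; the Scala version is nearly identical). The file path, keyspace, and table name are placeholders, and it assumes the spark-cassandra-connector is on the classpath:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class MatchIngest {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("football-forecast-ingest")
                .getOrCreate();

        // Placeholder path: CSV files with match results, one header row each.
        Dataset<Row> matches = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("data/matches/*.csv");

        // Write to Cassandra through the spark-cassandra-connector data source
        // (keyspace and table are placeholders; the table must already exist).
        matches.write()
                .format("org.apache.spark.sql.cassandra")
                .option("keyspace", "football")
                .option("table", "matches")
                .mode("append")
                .save();

        spark.stop();
    }
}
```

The Spark job itself is independent of where it runs; the OpenShift-specific question is mainly how to package and schedule it, which is what the linked quickstart and blog post are meant to illustrate.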
I was doing a project related to the IBM competition and I need to create a web application. I have built a web application before by connecting NetBeans and MySQL, but now I'm facing problems right from the installation.
Is there any workbench for DB2 (like the one for MySQL)? If so, can anyone give a link? Also, I need to make JDBC connections, so is there any other software I need to install?
Kindly explain in detail, as I'm not quite sure about this.
All you need is a DB2 JDBC JAR. Pick the appropriate one for your version and add it to your CLASSPATH.
You should use either a DB2 admin client to create tables and view data, or something like SQuirreL SQL.
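To make the JDBC part concrete, here is a minimal sketch of connecting to DB2 from Java, assuming IBM's JDBC driver jar (for example db2jcc4.jar) is on the classpath; the host, port, database, credentials, and table are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class Db2Example {
    public static void main(String[] args) throws Exception {
        // IBM's type-4 JDBC driver class, shipped in db2jcc4.jar.
        Class.forName("com.ibm.db2.jcc.DB2Driver");

        // Placeholder connection details; 50000 is the usual default DB2 port.
        String url = "jdbc:db2://localhost:50000/SAMPLE";
        try (Connection conn = DriverManager.getConnection(url, "db2inst1", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     // Placeholder schema and table for illustration only.
                     "SELECT id, name FROM app.users WHERE id = ?")) {
            ps.setInt(1, 1);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getInt("id") + " " + rs.getString("name"));
                }
            }
        }
    }
}
```

In NetBeans you would add the same driver jar to the project's libraries, so the code above is all the "extra software" the JDBC side needs.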