Parsing and presenting data from Ganglia - ganglia

I have just started using Ganglia.
I am getting data from the clients on the Ganglia frontend, and graphs too.
But the data I am getting is raw; it appears as key-value pairs on the frontend.
Is there any tool with which I can parse it and present it as a dashboard?
The data I am getting is mostly system configuration and performance metrics.
Regards,
W
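If no existing dashboard tool fits, one low-tech option is to pull the same data straight from gmond, which dumps its whole metric tree as XML when you connect to its default TCP port (8649), and reshape it yourself. A minimal Python sketch; the host, port, and flat row format are assumptions about a stock setup, not part of the question:

```python
import socket
import xml.etree.ElementTree as ET

GMOND_HOST = "localhost"   # assumed gmond address
GMOND_PORT = 8649          # gmond's default xml_port

def fetch_gmond_xml(host=GMOND_HOST, port=GMOND_PORT):
    """gmond writes its full metric tree as XML on connect, then closes."""
    chunks = []
    with socket.create_connection((host, port), timeout=10) as sock:
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    # Keep bytes: the XML declares its own encoding in its header.
    return b"".join(chunks)

def metrics(xml_bytes):
    """Yield (cluster, host, metric, value, units) rows from the dump."""
    for cluster in ET.fromstring(xml_bytes).iter("CLUSTER"):
        for host in cluster.iter("HOST"):
            for metric in host.iter("METRIC"):
                yield (cluster.get("NAME"), host.get("NAME"),
                       metric.get("NAME"), metric.get("VAL"),
                       metric.get("UNITS"))

if __name__ == "__main__":
    for row in metrics(fetch_gmond_xml()):
        print(row)
```

Once the metrics are flat rows like this, feeding them into whatever dashboard or database you prefer is straightforward.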

Related

Cannot create a batch pipeline to get data from ZohoCRM to BigQuery with HTTP plugin 1.2.1. Returns Spark Program 'phase-1' failed

This is my first post here; I'm new to Data Fusion and have little to no coding skills.
I want to get data from ZohoCRM into BigQuery, with each ZohoCRM module (e.g. accounts, contacts...) as a separate table in BigQuery.
To connect to Zoho CRM I obtained a code, token, refresh token and everything else needed, as described here: https://www.zoho.com/crm/developer/docs/api/v2/get-records.html. Then I ran a successful get-records request as described there via Postman, and it returned the records from the Zoho CRM Accounts module as a JSON file.
I thought it would all be fine and set the parameters in Data Fusion
(DataFusion_settings_1 and DataFusion_settings_2); it validated fine. Then I previewed and ran the pipeline without deploying it. It failed with the following info from the logs (logs_screenshot). I tried manually entering a few fields in the schema when the format was JSON, and I tried changing the format to CSV; neither worked. I also tried switching Verify HTTPS Trust Certificates on and off. It did not help.
I'd be really thankful for some help. Thanks.
Update, 2020-12-03
I got in touch with a Google Cloud Account Manager, who took my question to their engineers, and here is the info:
The HTTP plugin can be used to "fetch Atom or RSS feeds regularly, or to fetch the status of an external system"; it does not seem to be designed for APIs.
At the moment, a more suitable tool for data collected via APIs is Dataflow: https://cloud.google.com/dataflow
"Google Cloud Dataflow is used as the primary ETL mechanism, extracting the data from the API Endpoints specified by the customer, which is then transformed into the required format and pushed into BigQuery, Cloud Storage and Pub/Sub."
https://www.onixnet.com/insights/gcp-101-an-introduction-to-google-cloud-platform
So in the coming weeks I'll be looking at Dataflow.
Can you please attach the complete logs of the preview run? Make sure to redact any PII. Also, what version of CDF are you using? Is the CDF instance private or public?
Thanks and Regards,
Sagar
Did you end up using Dataflow?
I am also experiencing the same issue with the HTTP plugin, but my temporary workaround was to use Cloud Scheduler to periodically trigger a Cloud Function that fetches my data from the API and exports it as JSON to GCS, which Data Fusion can then access (sketched below).
My solution is of course non-ideal, so I am still looking for a way to use the Data Fusion HTTP plugin. I was able to make it work for sample data from public API endpoints, but for a reason still unknown to me I can't get it to work with my actual API.
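For reference, a minimal sketch of that Cloud Function workaround in Python; the endpoint URL, bucket name, object path, and auth token are placeholders, not my actual values:

```python
# HTTP-triggered Cloud Function: Cloud Scheduler hits its URL on a
# schedule, the function pulls records from the API and stages them
# in GCS as JSON for Data Fusion to read.
import json
import requests
from google.cloud import storage

API_URL = "https://www.zohoapis.com/crm/v2/Accounts"  # example endpoint
BUCKET = "my-staging-bucket"                          # assumed bucket

def export_to_gcs(request):
    """Entry point for the HTTP trigger."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": "Zoho-oauthtoken <ACCESS_TOKEN>"},
        timeout=60,
    )
    resp.raise_for_status()  # fail loudly so the error shows in logs
    blob = storage.Client().bucket(BUCKET).blob("zoho/accounts.json")
    blob.upload_from_string(
        json.dumps(resp.json()), content_type="application/json"
    )
    return "ok"
```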

How to retrieve streaming data from a data service and use it in a Pentaho CDE dashboard?

I'm trying to display incoming streaming data in a Pentaho dashboard. The incoming data are simple strings, which I would just like to display on the dashboard for now.
I created a kettle transformation, in which I bound a data service to the last step (MQTT-Producer).
Within Spoon, I tested the service and it seems to work fine.
After uploading the kettle file, the service showed up in the service list (http://localhost:9090/pentaho/kettle/listServices).
Working with the dashboard editor, I use 'streaming over dataservices' from the 'DATASERVICES Queries' as my datasource.
At this point I didn't seem to have any success and was just trying out different panel options and data service properties.
I was following those tutorials:
https://help.pentaho.com/Documentation/8.2/Products/Data_Integration/Data_Services
https://help.pentaho.com/Documentation/8.2/Products/CTools/Create_Streaming_Service_Dashboard
What is it that I'm doing wrong?
Any help is appreciated.
cheers
Update:
I changed the incoming streaming data to be two doubles.
After some more playing around, I did connect to the data service using an external tool and saw the expected values. My dashboard, however, still shows this error message:
Error processing component (ccclinechart)
The same kind of error occurs when I try to view the sample real-time dashboard. It can't process the chartComponent. Maybe I need to reconfigure some other things?
Found the mistake.
Something went wrong with the ports. After switching back to the default (8080), it worked just fine.
There might be another way to adjust your port settings to fix the problem, but the easiest way to deal with this sort of thing is to switch back to the default settings.
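For what it's worth, a quick way to see which port the server is actually answering on is to probe the listServices endpoint mentioned above. A small Python sketch; the two port numbers and the URL path come from this thread, everything else is an assumption:

```python
import requests

# Probe both the default Pentaho port and the one used earlier.
for port in (8080, 9090):
    url = f"http://localhost:{port}/pentaho/kettle/listServices"
    try:
        r = requests.get(url, timeout=5)
        print(port, r.status_code)
    except requests.ConnectionError:
        print(port, "no listener")
```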

Standard way to record reporting information?

I get logs for every call to my API. From these logs, I can extract interesting information that I use in a reporting dashboard.
My problem is that the number of these logs keeps growing, and now I am looking for a new solution to store them.
Should I store the raw logs, or only the information I extract from them?
Which database should I choose for storage (MySQL, HBase, MongoDB, Cassandra)?

Getting the progress of the sent data through Zend Rest service

I have the following situation. We are using Zend Framework to create a web application that communicates with its database through REST services.
The problem I'm facing is that when a user tries to upload a big video file, for example, the service takes some time (sometimes a few minutes) to receive the request (which also carries the video file, encoded with the base64_encode PHP function) and to return the response for a successful save or an error.
My idea is to track how much of the data is sent and show the user a JS progress bar, which will be useful in these cases.
Does anyone have an idea how I can track how much of the data has been sent through the service, so that I can show a progress bar based on it?
Zend provides progress-bar functionality that can be paired with a JavaScript/jQuery client.
You will easily find example implementations like this one:
https://github.com/marcinwol/zfupload
However, I don't think REST services are the best solution for uploading videos, as base64 encoding makes files roughly a third bigger and therefore slower to upload (see the quick check after the links below).
Check out Zend_File_Transfer that might be better suited to your needs:
http://framework.zend.com/manual/1.12/en/zend.file.transfer.introduction.html
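To put a number on that base64 overhead: the encoding emits 4 output bytes for every 3 input bytes, so payloads grow by about 33%. A quick check in Python, using random bytes as stand-in video data:

```python
import base64
import os

raw = os.urandom(3 * 1024 * 1024)   # 3 MiB of stand-in "video" bytes
encoded = base64.b64encode(raw)
# Prints roughly 1.33: 4 output bytes per 3 input bytes.
print(len(raw), len(encoded), len(encoded) / len(raw))
```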

Extra data usage vs. having a local DB (cache) in an app

Working on an app where all the content/data for it will be coming via JSON, and occasionally I will display an HTML page.
The client is suggesting that maybe we should have a local database (SQLite) to cache the JSON data returned, so we use less of the user's data allowance (if they search for the same item again), and because it may be slightly faster.
Are these good enough reasons for adding the extra complexity and potential problems of having a local DB on the phone?
From my experience, I didn't think the phone was particularly slow or that JSON or HTML were particularly data-heavy. I'd prefer having a thin client.
Facebook/Twitter/etc. work with very few problems using JSON and HTML.
Would I be wrong to try to steer away from the local DB idea?
Thanks,
-Code
Caching URL request results can improve your application's latency over a slow connection. You could use Core Data to manually manage a cache (key: URL, value: the request's response).
Another, more elegant solution (if you have write access to the web services) would be to implement the "If-Modified-Since" header server-side, so that the data received per request is kept to a minimum.
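A minimal sketch of that conditional-GET idea, in Python for illustration; the in-memory cache shape is an assumption, and a real app would persist it:

```python
import requests

cache = {}  # url -> (last_modified, body)

def fetch(url):
    """GET url, reusing the cached body when the server replies 304."""
    headers = {}
    if url in cache:
        headers["If-Modified-Since"] = cache[url][0]
    resp = requests.get(url, headers=headers, timeout=30)
    if resp.status_code == 304:
        # Server says the resource is unchanged: no payload re-downloaded.
        return cache[url][1]
    last_modified = resp.headers.get("Last-Modified", "")
    cache[url] = (last_modified, resp.text)
    return resp.text
```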