How to get the service plan for the "DataCache" service? - ibm-cloud

I would like to know the plans of the "DataCache" service before creating an instance from the marketplace.
I can see there is a "DataCache" service in marketplace:
C:\Users\IBM_ADMIN>cf marketplace | findstr Data
- DataCache free, starter*, standard*, premium*
Improve the performance and user experience of web applications by retrieving information from fast, managed, in-memory caches, instead of relying entirely on slower disk-based databases.
- DataWorks free
Find, prepare, and deliver data with an intuitive app or with powerful APIs.
- MobileData Shared*
Enhance your mobile apps through simple to use SDKs to save shared data in a scalable, managed database as a service. Powered by Cloudant.
- XPagesData xpages-data-free
Create an IBM Notes .NSF database to store your XPages Domino data.
- namara-catalog free
Open Data. Clean and simple.
- reappt reappt:pushtechnology:dif03
Real Time Data Distribution Service
- sqldb sqldb_small*, sqldb_free, sqldb_premium*
SQL Database adds an on-demand relational database to your application. Powered by DB2, it provides a managed database service to handle web and transactional workloads.
- timeseriesdatabase small*
Time Series Database (powered by Informix) is purpose-built for fast and efficient storage and analysis of time series data.
When I try to retrieve the detailed description of the "DataCache" service, I get a "Could not find service" error.
C:\Users\IBM_ADMIN>cf marketplace -s "DataCache"
Getting service plan information for service DataCache as yujames.tw#gmail.com...
FAILED
Could not find service
However, I can retrieve the detailed description for any other service, such as "DataWorks":
C:\Users\IBM_ADMIN>cf marketplace -s "DataWorks"
Getting service plan information for service DataWorks as yujames.tw#gmail.com...
OK
service plan description free or paid
free There is currently no charge for the use of this service. free
Any idea?

cf m | grep DataCache
DataCache free, starter*, standard*, premium*
The plans are free, starter, standard, and premium. I agree that getting details for some services does not work using cf; that is a defect, and I will raise this issue with the team.
You can also see the plans in the UI.
To create an instance:
cf cs DataCache free myDataCache
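As a workaround while that defect is open, you can usually read the plans straight from the Cloud Controller API with cf curl. A minimal sketch (the GUID is a placeholder you copy from the first command's JSON output):
cf curl "/v2/services?q=label:DataCache"
# take metadata.guid for the DataCache entry from the output above, then list its plans
cf curl /v2/services/<service-guid>/service_plans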

Related

How to monitor 500+ servers using Grafana with SQL Server as the data source

Currently we're monitoring our SQL Servers running on the Windows platform via MS SQL Server Reporting Services using shared data sources. To clarify what I mean: we don't store data on a centralized server to monitor the 500+ target servers. We keep the monitoring data on the local SQL database servers and use shared data sources in SSRS to create dashboards.
Now our firm is encouraging us to use Grafana for dashboards, since they have purchased or are already running some Grafana server licensing. What I know of a Grafana instance is that it can be given to us to monitor SQL Servers as described above.
My question is: how would Grafana dynamically connect to those 500-plus servers? I see that it creates a data source once, but how will I change or create multiple data sources when I have around 1,000 servers to monitor?
Please suggest a guide.
You may have to code a bit and use data source provisioning and/or the Grafana data source API for it to pick up the new data sources.
If you can set up a system (user-data / init script / IaC) where this API is called every time a new server comes up, then you will be able to keep the data sources current without manual maintenance.
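For the API route, here is a minimal sketch of registering one SQL Server data source per monitored host (the host names, API key, database, and credentials are all placeholders; adjust to your setup):
# register one Microsoft SQL Server data source via the Grafana HTTP API
curl -X POST http://grafana.example.com:3000/api/datasources \
  -H "Authorization: Bearer <ADMIN_API_KEY>" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "sqlserver-042",
        "type": "mssql",
        "access": "proxy",
        "url": "sqlserver-042.example.com:1433",
        "database": "monitoring",
        "user": "grafana_reader",
        "secureJsonData": { "password": "<PASSWORD>" }
      }'
Calling this from the step that provisions each new server (or looping it over an inventory file) keeps Grafana in sync without touching the UI.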

How to take a backup of the Tableau Server Repository (PostgreSQL)

We are using the 2018.3 version of Tableau Server. The server stats, such as user logins and other statistics, are logged into a PostgreSQL DB, and the same data is cleared regularly after one week.
Is there any API available in Tableau to connect to the DB and back up the data somewhere like HDFS or any other place on a Linux server?
Kindly let me know if there is any way other than an API as well.
Thanks.
You can enable access to the underlying PostgreSQL repository database with the tsm command. Here is a link to the documentation for your (older) version of Tableau
https://help.tableau.com/v2018.3/server/en-us/cli_data-access.htm#repository-access-enable
It would be good security practice to limit access to only the (whitelisted) machines that need it, to create or use an existing read-only account to access the repository, and ideally to disable access when your admin programs are complete (i.e., enable access, run your query, disable access).
This way you can have any SQL client code you wish query the repository, create a mirror, create reports, run auditing procedures - whatever you like.
Personally, before writing significant custom code, I'd first see if the info you want is already available another way: in one of the built-in admin views, via the REST API, using the public-domain LogShark or TabMon tools, with the Server Management Add-on (for more recent versions of Tableau), or possibly the new Data Catalog.
I know at least one server admin who somehow clones the whole Postgres repository database periodically so he can analyze stats offline. Not sure what approach he uses to clone. So you have several options.
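As a rough sketch of that flow (the host name and password are placeholders, and the table is just an example; the readonly account, the workgroup database, and port 8060 are the repository defaults documented in the link above):
# enable external access to the repository with the read-only account
tsm data-access repository-access enable --repository-username readonly --repository-password "<PASSWORD>"
# export a stats table from the repository (the workgroup DB listens on port 8060)
psql -h tableau.example.com -p 8060 -U readonly -d workgroup \
  -c "\copy http_requests TO 'http_requests.csv' CSV HEADER"
# push the extract to HDFS (assuming an HDFS client on the same Linux box)
hdfs dfs -put http_requests.csv /backups/tableau/
# optionally close the door again when the job is done
tsm data-access repository-access disable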

Transfer a MongoDB dump on an external hard drive to Google Cloud Platform

As part of my thesis project, I have been given a 240 GB MongoDB dump, which is on my external hard drive. I'll have to use this data to run my Python scripts for a short duration. However, since my dataset is huge and I cannot import it into my local MongoDB server (I don't have enough internal storage), my professor gave me a $100 Google Cloud Platform coupon so I can use Google's cloud computing resources.
So far I have researched that I can do it this way:
Create a Compute Engine instance in GCP and install MongoDB on it. Transfer the MongoDB dump to the remote instance and run the scripts there to get the output.
This method would work, but I'm looking for a way to create a remote database server in GCP so that I can run my scripts locally, something like one of the following:
Creating a remote MongoDB server on GCP so that I can establish a remote Mongo connection and run my scripts locally.
Transferring the MongoDB dump to Google's Datastore so that I can use the Datastore API to connect remotely and run my scripts locally.
I have given some thought to using MongoDB Atlas, but because of the size of the data I would be billed heavily, and I cannot use my GCP coupon there.
Any help or suggestions on how either of the two methods can be implemented are appreciated.
There are two parts to your question.
First, you can create a Compute Engine VM with MongoDB installed and load your backup onto it. Then, open the right firewall rules to allow the connection from your local environment to the Google Compute Engine VM. The connection will be performed with a simple login/password.
You can use a static IP on your VM. That way, in case the VM reboots, you will keep the same IP (and it will be easier for your local connection).
Second, BE CAREFUL with Datastore. It's a good product, a serverless, document-oriented NoSQL database, but it's absolutely not a MongoDB equivalent. You can't perform aggregations, you are limited in search capabilities, and so on. It's designed for specific use cases (I don't know yours, but don't assume that it is a MongoDB equivalent!).
Anyway, if you use Datastore, you will have to use a service account or install the Google Cloud SDK on your local environment to be authenticated and able to call the Datastore API. There is no login/password in this case.
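A minimal sketch of the first option with the gcloud CLI (the VM name, machine type, database name, and IPs are placeholders, and you should restrict the firewall rule to your own public IP):
# create a VM with enough disk for the 240 GB dump, then install MongoDB on it
gcloud compute instances create mongo-vm \
  --machine-type n1-standard-2 --boot-disk-size 500GB \
  --image-family ubuntu-1804-lts --image-project ubuntu-os-cloud \
  --tags mongodb
# allow MongoDB connections from your own machine only
gcloud compute firewall-rules create allow-mongo \
  --allow tcp:27017 --target-tags mongodb --source-ranges <YOUR_PUBLIC_IP>/32
# copy the dump from the external drive and restore it on the VM
gcloud compute scp --recurse /path/to/dump mongo-vm:~/dump
gcloud compute ssh mongo-vm --command "mongorestore --db thesis ~/dump/thesis"
# remember to set bindIp and enable authentication in /etc/mongod.conf before opening the port,
# then point your local scripts at mongodb://<user>:<password>@<VM_EXTERNAL_IP>:27017/thesis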

One ASP.NET application on IBM Cloud - one or more MongoDBs?

I have an ASP.NET application running on IBM Cloud, I have a MongoDB instance created, and my application is deployed in my development space and bound to the MongoDB (alias). All working fine.
Now I have also created a production space and want to deploy the application there, after having verified it in dev.
But do I really have to create another MongoDB - and pay for two instances - or can I somehow share one instance (with security separating them so that dev can't access prod)?
Or what is best practice in this case?
Any advice much appreciated.
It is your call. The connection string to the MongoDB will work from both dev and prod. I would recommend having two MongoDB instances. A separate dev MongoDB will allow development to work in isolation in terms of functionality, performance, and security.
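If you go with two instances, a rough sketch with the cf CLI looks like this (the space, service, plan, instance, and app names are only examples; check cf marketplace for the exact MongoDB offering and plan in your region):
# one MongoDB per space, each bound to the app deployed in that space
cf target -s development
cf create-service compose-for-mongodb Standard myMongoDB-dev
cf bind-service MyAspNetApp myMongoDB-dev
cf target -s production
cf create-service compose-for-mongodb Standard myMongoDB-prod
cf bind-service MyAspNetApp myMongoDB-prod
Each binding then injects its own credentials and connection string via VCAP_SERVICES, so the dev deployment can never reach the prod database.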

Can a Java web app in GAE connect to MongoDB in GCE?

I'm looking at Google App Engine for deploying my Java web app, which connects to a MongoDB database through the MongoDB Java driver.
Plan 1:
Deploy the MongoDB database in Google Compute Engine
Deploy the Java web app in Google App Engine and pass the address of the MongoDB (in GCE) to the MongoDB driver
Plan 2:
Deploy both MongoDB and the Java web app on Google Compute Engine instances
Question 1: Will Plan 1 work?
Question 2: I think Plan 2 will probably work. But is it the most efficient method? If not, can you suggest a more efficient method?
Note: I'm not planning on using Google Datastore.
Your Plan 1 is feasible; the only thing you need to ensure is that your App Engine and Compute Engine (MongoDB) instances are in the same virtual network, as stated here.
Plan 2 won't be cost-effective due to the requirement of having the instance(s) running your app 24x7. Even if there is no traffic, you will have to pay for them. Google App Engine gives you the chance of having a free quota.
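For Plan 1, a rough sketch of the network side (this assumes the App Engine flexible environment and the default auto-mode VPC; the rule name, range, and database name are placeholders):
# allow instances inside the default network to reach MongoDB on the GCE VM
gcloud compute firewall-rules create allow-mongo-internal \
  --network default --allow tcp:27017 --source-ranges 10.128.0.0/9
# the Java app then points the MongoDB driver at the VM's internal IP, e.g.
# mongodb://<mongo-vm-internal-ip>:27017/mydb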