How can Spring Cloud Dataflow Server use new tables (with a custom prefix) created for Spring Batch and Spring Cloud Task?

I have created the Spring Cloud Task tables (i.e. TASK_EXECUTION, TASK_TASK_BATCH) with the prefix MYTASK_, and the Spring Batch tables with the prefix MYBATCH_, in an Oracle database.
The default tables are also present in the same schema; they were created automatically or by a teammate.
I have bound my Oracle database service to the SCDF server deployed on PCF.
How can I tell my Spring Cloud Dataflow server to use the tables created with my prefix to render data on the dashboard?
Currently, the SCDF dashboard uses the tables with the default prefix, and that works fine; I want it to use my prefixed tables instead.
I am using Dataflow Server version 1.7.3, deployed on PCF using a manifest.yml.

There's an open story to add this enhancement via spring-cloud/spring-cloud-dataflow#2048.
Feel free to contribute, or share your use-case details in the issue.

Currently, spring-cloud-dataflow and spring-cloud-skipper use Flyway to manage their database schemas, and it is not possible to prefix the table names. Trying to support this would add too much complexity, and I'm not even sure it would be possible.
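For what it's worth, the prefixes themselves are configurable in the task/batch applications, even though the SCDF server has no matching setting. Below is a minimal sketch of the app-side wiring, assuming a Spring Batch 4-era application (the class name is illustrative; JobRepositoryFactoryBean.setTablePrefix and the spring.cloud.task.tablePrefix property are the actual hooks):

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Configuration;

// Points the app's own JobRepository at the MYBATCH_ tables. The Spring Cloud
// Task side is configured separately with spring.cloud.task.tablePrefix=MYTASK_.
@Configuration
public class PrefixedBatchConfigurer extends DefaultBatchConfigurer {

    private final DataSource dataSource;

    public PrefixedBatchConfigurer(DataSource dataSource) {
        super(dataSource);
        this.dataSource = dataSource;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.setTablePrefix("MYBATCH_"); // instead of the default BATCH_
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}

None of this changes what the SCDF dashboard reads, which is the limitation tracked in the issue above.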

Related

PostgreSQL server not shown on Azure application map

I'm trying to use Application Insights to monitor an application composed of different microservices in an AKS (Azure Kubernetes Services) cluster.
As AKS does not support the auto-instrumentation scenario, I had to instrument my js/.net services myself with the dedicated libs.
And this works fine: I can see my different microservices on an application map.
But I can't see my database server among the dependencies like in the documentation's example, even though those dependencies should be automatically collected, as stated in the dependencies documentation.
I'm using Azure Database for PostgreSQL - Flexible Server. Is this normal? Is it because I am using PostgreSQL instead of SQL Server? Is it related to the fact that I'm using Npgsql instead of SqlClient?

Is it feasible to use one Spring Cloud Dataflow UI for multiple batch applications?

In the world of microservices, we have multiple applications, each with its own independent datasource and corresponding batch applications in its bounded context. Since SCDF requires a datasource to be configured to bring it up to monitor batch jobs, is it possible to configure a single, central SCDF server and UI to monitor all the batch jobs of the different microservices (each with its corresponding DB), keeping the Spring Batch metadata tables alongside the applications' business tables? I ask because it seems clumsy, untidy, and unmaintainable to keep so many SCDF servers running in the environment (please correct me if this feeling is unfounded).
Please bring me some clarity on this. Thanks in advance.
Yes.
Set up a single SCDF server with an associated database. For each of your task/batch apps, override the TaskConfigurer and BatchConfigurer to accept a second datasource that refers to the SCDF database, as shown in the multiple-datasources sample: https://github.com/spring-cloud/spring-cloud-task/tree/main/spring-cloud-task-samples/multiple-datasources.
The batch and task apps will then report their state to the SCDF database while still using their own database for their business work; a sketch of the override is below.
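A minimal sketch of the TaskConfigurer override, assuming the SCDF database is exposed as a second bean named scdfDataSource (the bean name and wiring are illustrative; the linked sample shows the complete setup, including the BatchConfigurer side):

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
import org.springframework.cloud.task.configuration.TaskConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ScdfTaskConfiguration {

    // Routes task execution metadata to the shared SCDF database while the
    // app's primary DataSource keeps handling its own business data.
    @Bean
    public TaskConfigurer taskConfigurer(@Qualifier("scdfDataSource") DataSource scdfDataSource) {
        return new DefaultTaskConfigurer(scdfDataSource);
    }
}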

Is it possible to launch a NoSQL cluster with DynamoDB locally (downloadable) and not in Amazon AWS?

I am not very familiar with DynamoDB, and I would like to launch a NoSQL database with local DynamoDB (the downloadable version) rather than hosting it on Amazon AWS. I would appreciate it if someone could let me know whether it is possible to make such a cluster, i.e. does the downloadable version of DynamoDB support running as a cluster locally?
You can very easily run DynamoDB locally, but it only supports running a single instance—not a cluster. It's intended to be used for local testing/debugging.
DynamoDB is provided as a hosted service; there is no DynamoDB server code that you can download and install to use as a host or service provider.
As part of the SDKs for many languages, the AWS team developed wrappers that let you run a local version of DynamoDB to test your code. These wrappers respect the DynamoDB API contract, so you can code against the DynamoDB interface and get responses as if it were hosted in the AWS environment. But you can't host a database or serve data as a service using these solutions.
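To make the local setup concrete, here is a minimal sketch using the AWS SDK for Java v2, pointing a client at a DynamoDB Local instance on its default port 8000 (the credentials are dummies; DynamoDB Local does not validate them):

import java.net.URI;

import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;

public class LocalDynamoDbExample {

    public static void main(String[] args) {
        // Point the standard SDK client at the local endpoint instead of AWS.
        DynamoDbClient client = DynamoDbClient.builder()
                .endpointOverride(URI.create("http://localhost:8000"))
                .region(Region.US_EAST_1) // required by the SDK, ignored locally
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("dummy", "dummy")))
                .build();

        // Any DynamoDB API call now hits the single local instance.
        System.out.println(client.listTables().tableNames());
    }
}

The same code works against the real service once the endpoint override is removed and real credentials are supplied, which is exactly the point of the API-compatible local version.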

How to add functionality to Spring Cloud Bootstrap

I want to perform a lookup before the Spring context is loaded, ideally in the bootstrap phase of Spring Cloud (when it looks up the Configuration Server, cloud connectors, etc.). How can I make my code execute in that phase?
What I want to do is query Vault to get all my database secrets and API keys and set them as properties. I know I can encrypt values with Spring Cloud Config, but I like the strongbox that Vault provides. (I can handle the Vault integration part.)
As I saw in the Spring Cloud Config code, the bootstrap configuration is auto-configured via the org.springframework.cloud.bootstrap.BootstrapConfiguration key in the resources/META-INF/spring.factories file, the same mechanism you can use to register new auto-configuration classes for Spring Boot; for reference, see that file in the project. This will make your configuration start and register before the "normal" application context.
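A minimal sketch of that mechanism, with the Vault call stubbed out (class, bean, and property names are illustrative): expose a PropertySourceLocator from a class registered under the BootstrapConfiguration key, and its properties become available before the main context starts.

import java.util.HashMap;
import java.util.Map;

import org.springframework.cloud.bootstrap.config.PropertySourceLocator;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.MapPropertySource;

// Registered in resources/META-INF/spring.factories via:
// org.springframework.cloud.bootstrap.BootstrapConfiguration=com.example.VaultBootstrapConfiguration
@Configuration
public class VaultBootstrapConfiguration {

    @Bean
    public PropertySourceLocator vaultPropertySourceLocator() {
        return environment -> {
            Map<String, Object> secrets = new HashMap<>();
            // Placeholder: replace with the real Vault lookup.
            secrets.put("spring.datasource.password", fetchFromVault("db-password"));
            return new MapPropertySource("vault", secrets);
        };
    }

    private static String fetchFromVault(String key) {
        // Hypothetical stub standing in for a Vault client call.
        return "secret-for-" + key;
    }
}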

How to replicate MySQL database to Cloud SQL Database

I have read that you can replicate a Cloud SQL database to MySQL. Instead, I want to replicate from a MySQL database (that the business uses to keep inventory) to Cloud SQL so it can have up-to-date inventory levels for use on a web site.
Is it possible to replicate MySQL to Cloud SQL? If so, how do I configure that?
This is something that is not yet possible in Cloud SQL.
I'm using DBSync to do it, and it's working fine:
http://dbconvert.com/mysql.php
The Sync version does what you want.
It works well with App Engine and Cloud SQL. You must authorize external connections first.
This is a rather old question, but it's worth noting that this now seems possible by Configuring External Masters.
The high-level steps are:
Create a dump of the data from the master and upload the file to a storage bucket
Create a master instance in Cloud SQL
Set up a replica of that instance using the external master's IP, username, and password, and provide the dump file location
Set up additional replicas if needed
Voilà!