One repository to two databases, MongoDB and Spring Boot

I have a repository whose entities I want to save into two different MongoDB databases, selected programmatically.
If the user enters a URL with the parameter DB1, the repository should save to database DB1; if it is DB2, to database DB2, and so on.
Is there any way to do this?

Not automatically. You need to have the application connected to both databases and call the right one depending on the parameter that comes in with the request. As far as I'm aware, you'll need two separate repositories.
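To picture the "connect to both and pick per request" part, here is a minimal sketch using MongoTemplate rather than repositories (the class, database and parameter names are made up for illustration, not from the original question):
import java.util.HashMap;
import java.util.Map;

import com.mongodb.MongoClient;
import org.springframework.data.mongodb.core.MongoTemplate;

// Sketch only: one MongoTemplate per target database, chosen by the request parameter.
public class TenantTemplates {

    private final Map<String, MongoTemplate> templates = new HashMap<>();

    public TenantTemplates(MongoClient client) {
        templates.put("DB1", new MongoTemplate(client, "DB1"));
        templates.put("DB2", new MongoTemplate(client, "DB2"));
    }

    // dbParam is the value taken from the incoming URL, e.g. "DB1" or "DB2"
    public void save(String dbParam, Object entity) {
        templates.get(dbParam).save(entity);
    }
}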

It is not easy to do this with repositories: we can't use a single one, and creating a separate repository (with only minor modifications) for each possible parameter in the URL is madness.
So, to avoid a lot of duplicate code, we can drop down to the Java driver instead.
import com.mongodb.MongoClient;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;

// connect once to the MongoDB server
MongoClient mongoClient = new MongoClient("localhost", 27017);
// here we can change the database name per request
MongoOperations mongoOperations = new MongoTemplate(mongoClient, database);
mongoOperations.save(YOUR_POJO);
mongoClient.close();
The POJO has to use the @Document annotation; otherwise you will run into codec problems. You can read about solving them here: http://mongodb.github.io/mongo-java-driver/3.2/bson/codecs/
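As a small illustration (the class, collection and field names below are made up), a POJO carrying the @Document annotation could look like this:
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// illustrative POJO; the collection name is an assumption, not from the original post
@Document(collection = "orders")
public class Order {

    @Id
    private String id;

    private int priority;

    // getters/setters omitted for brevity
}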

Related

JPA: how to map some entities to a different schema of another database instance?

JPA: is there a way to map some entities to a schema of another database instance? e.g.,
@Entity
public class Foo {
}

@Entity
@Table(schema="schema1")
public class Bar {
}
The Bar entity is mapped to schema1 of the same database instance. Is there a way in JPA to map it to a schema in a remote database instance? It would be useful for sharing entities among multiple applications.
Can the "catalog" be used for this purpose?
What do you mean by 'remote database'?
If you use @Table(schema = "myschema", name = "bar"), Hibernate will qualify all queries with the schema name (e.g. SELECT b FROM Bar b will ultimately translate to SELECT * FROM myschema.bar). If the database user you're using to connect to the DB has access to myschema.bar (whatever such a DB object is), then the query will work; if not, the query will fail.
If you mean 'a remote DB that is a separate server', then, of course, you can only connect to the DB using one JDBC connection per persistence context. If that's your scenario, perhaps you should consult the docs of the RDBMS for ways to connect two DB instances (in Oracle, for example, you could use database links and synonyms).
Make sure that you understand the implications, though, as such a solution introduces its own class of problems (including the fact that you suddenly have implicit distributed transactions in your system).
As a side note, I'm not sure how such an approach is 'useful for sharing entities among multiple applications', or why one would even think 'sharing entities among multiple applications' is useful in the first place, but I'd think twice before integrating multiple applications via shared/linked DBs. It usually introduces more problems than it solves.
If I understand correctly what you mean, you should use two (or more) different persistence contexts, along the lines of the sketch below.
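A rough sketch of that suggestion with plain JPA (the persistence unit names are assumptions; each unit would be declared in persistence.xml with its own JDBC URL or schema):
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

// "local-unit" and "remote-unit" are illustrative persistence units, each pointing
// at a different database instance / schema in persistence.xml.
EntityManagerFactory localEmf = Persistence.createEntityManagerFactory("local-unit");
EntityManagerFactory remoteEmf = Persistence.createEntityManagerFactory("remote-unit");

EntityManager localEm = localEmf.createEntityManager();   // Foo lives here
EntityManager remoteEm = remoteEmf.createEntityManager(); // Bar lives here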

TypeORM: Dynamically set database schema for EntityManager (or repositories) at runtime?

Situation:
For our SaaS API we use schema-based multitenancy, which means every customer (~tenant) has its own separate schema within the same (postgres) database, without interfering with other customers. Each schema contains the same underlying entity model.
Every time a new customer is registered with the system, a new isolated schema is automatically created in the db. This means the schema is created at runtime and is not known in advance. The customer's schema is named after the customer's domain.
For every request that arrives at our API, we extract the user's tenancy-affiliation from the JWT and determine which db-schema to use to perform the requested db-operations for this tenant.
Problem
After having established a connection to a (postgres) database via TypeORM (e.g. using createConnection), our only chance to set the schema for a db-operation is to resort to the createQueryBuilder:
const orders = await this.entityManager
.createQueryBuilder()
.select()
.from(`${tenantId}.orders`, 'order') // <--- setting schema-prefix here
.where("order.priority = 4")
.getMany();
This means we are forced to use the QueryBuilder, as it does not seem to be possible to set the schema when working with the EntityManager API (or the Repository API).
However, we want/need to use these APIs, because they are much simpler to write, require less code and are also less error-prone, since they do not rely on writing queries "manually" employing a string-based syntax.
Question
In case of TypeORM, is it possible to somehow set the db-schema when working with the EntityManager or repositories?
Something like this?
// set schema when instantiating manager
const manager = connection.createEntityManager({ schema: tenantDomain });
// should find all matching "order" entities within schema
const orders = manager.find(Order, { priority: 4 })
// should find a matching "item" entity within schema using same manager
const item = manager.findOne(Item, { id: 321 })
Notes:
The db-schema needs to be set in a request-scoped way to avoid setting the schema for other requests, which may belong to other customers. Setting the schema for the whole connection is not an option.
We are aware that one could create a whole new connection and set the schema for this connection, but we want to reuse the existing connection. So simply creating a new connection to set the schema is not an option.
To answer my own question:
At the moment there is no way to instantiate TypeORM repositories with different schemas at runtime without creating new connections.
So the only two options a developer is left with for schema-based multi-tenancy are:
Setting up new connections to connect with different schemas within the same db at runtime. E.g. see NestJS Request Scoped Multitenancy for Multiple Databases. However, one should definitely strive to reuse connections and be aware of connection limits.
Abandoning the idea of working with the Repository API and reverting to using createQueryBuilder (or executing SQL queries via query()).
For further research, here are some TypeORM GitHub issues that track the idea of changing the schema for existing connections or repositories at runtime (similar to what is requested in the OP):
Multi-tenant architecture using schema. #4786 proposes something like this.photoRepository.useSchema('customer1').find()
Handling of database schemas #3067 proposes something like getConnection().changeDefaultSchema('myschema')
Run-time change of schema #4473
Add an ability to set postgresql schema per call #2439
P.S. If TypeORM decides to support the idea discussed in the OP, I will try to update this answer.
Here is a global overview of the issues with schema-based multitenancy, along with a complete walkthrough of a GitHub repo for it.
Most of the time, you may want to use Postgres Row Security Policy instead. It gives most of the benefits of schema-based multitenancy (especially regarding developer experience), without the issues related to the multiplication of connections.
Since commenting does not work for me, here is a hint from the NestJS documentation:
https://docs.nestjs.com/techniques/database#async-configuration
I am not using NestJS but am reading the docs at the moment to decide whether it's a fitting framework for us. We have an app where only some modules need multi-tenancy with a schema per tenant, so using TypeOrmModule.forRootAsync(dynamicCreatedDbConfig) might be an option for me too.
This may help you if you have an interceptor or middleware which prepares the dynamicCreatedDbConfig data before...

How to dynamically create MongoTemplate instances, with credentials?

Prior to Spring Data MongoDB 1.9.0-RELEASE, I was able to create a MongoTemplate object as follows:
new MongoTemplate(client, dbName, credentials). Upon upgrading, this constructor no longer works, giving an error to use MongoCredential instead. However, there is no similar MongoTemplate constructor that uses MongoCredential. It appears that the only way to specify credentials now is when constructing the MongoClient object.
However, since my app is multitenant on the database level, this doesn't work because it does not allow for additional credentials to be added after construction (meaning MongoTemplates cannot be created dynamically). It also is not ideal because if any of the credentials in the list are bad, none of the database connections work, as opposed to just the one with bad credentials.
I also do not want to create a new MongoClient instance for each database. From what I understand, doing so would create a new connection for each database rather than letting MongoClient manage a connection pool, which is ultimately not sustainable since Mongo only allows a finite number of connections.
Do I have any options here besides continuing to use the outdated library?
What you can do is instantiate the MongoClient and cache it using a HashMap, with some unique db identifier as the key and the MongoClient as the value. Use that to create the MongoTemplate.
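A sketch of that caching idea (the class and method names are mine, and how credentials are obtained per database is an assumption, not part of the original answer):
import java.util.Collections;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import org.springframework.data.mongodb.core.MongoTemplate;

// Caches one MongoClient per database identifier and builds MongoTemplates from it.
public class MongoTemplateFactory {

    private final Map<String, MongoClient> clients = new ConcurrentHashMap<>();

    public MongoTemplate templateFor(String dbName, MongoCredential credential) {
        MongoClient client = clients.computeIfAbsent(dbName,
                name -> new MongoClient(new ServerAddress("localhost", 27017),
                        Collections.singletonList(credential)));
        return new MongoTemplate(client, dbName);
    }
}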
What I ended up doing is creating a single user in the admin database that has access to all of the databases that I need (achieved via the roles array). I create one MongoClient, authorizing as that user against the admin database. Then I am able to create MongoTemplate objects dynamically without issue, because the user I'm authorized as has readWrite permissions on those databases.
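A minimal sketch of that setup (the user name, password and database names are placeholders): authenticate once against the admin database, then create MongoTemplate instances per tenant database as needed.
import java.util.Collections;

import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import org.springframework.data.mongodb.core.MongoTemplate;

// One client, authenticated against the admin database with a user whose roles
// grant readWrite on the tenant databases (credentials here are placeholders).
MongoCredential credential =
        MongoCredential.createCredential("appUser", "admin", "secret".toCharArray());
MongoClient client = new MongoClient(
        new ServerAddress("localhost", 27017), Collections.singletonList(credential));

// Templates for individual tenant databases can now be created on demand
MongoTemplate tenantATemplate = new MongoTemplate(client, "tenantA");
MongoTemplate tenantBTemplate = new MongoTemplate(client, "tenantB");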

Is it possible to use a single transaction (in EF) with two different contexts pointing to different schemas?

I'm currently designing an application where I need to use two different database schemas (on the same instance): one as the application base, the other to customize the application and the fields for every customer.
I've read something about the Repository pattern, and as I understand it, it is possible to use two different contexts without losing efficiency. I'm now asking whether I can use a single database transaction across two schemas with Entity Framework, as I currently do directly on the database (SQL Server 2008-2012).
Sorry for my English, and thanks in advance!
If your connection strings are the same (which in your case they will be, as you only have different schemas for different contexts) then you are OK with this approach.
Basically you will have two different contexts that connect to the database via the same connection string and that represent two different schemas.
using (var scope = new TransactionScope()) {
    using (var contextSO = new ContextSchemaOne()) {
        // Add, remove, change entities from context schema one
        contextSO.SaveChanges();
    }
    using (var contextST = new ContextSchemaTwo()) {
        // Add, remove, change entities from context schema two
        contextST.SaveChanges();
    }
    // both sets of changes commit atomically when the scope completes
    scope.Complete();
}
I wasn't very successful in the past with this approach, and we switched to one context per database.
Further reading: Entity Framework: One Database, Multiple DbContexts. Is this a bad idea?
Maybe it's better to read something about unit of work before taking a decision about this.
You will have to do something like this: Preparing for multiple EF contexts on a unit of work - TransactionScope

spring-data-mongodb: How can I dynamically create a database in Mongo using the spring-data-mongodb library?

I am trying to use the Spring Data MongoDB module for CRUD operations against a Mongo database. Going through examples and articles, my assumption is that the database name has to be pre-defined in the Spring context XML when defining the MongoTemplate bean.
In my case I have a multi-tenant application that accepts requests over HTTP; my application should create the Mongo database on the fly, using the name provided in the incoming HTTP request, and then load the data into a collection in the newly created database.
I am trying to figure out if there is a way to dynamically populate the database name in MongoTemplate or MongoRepository without having to provide it in the Spring context.xml.
Please help me.
Thanks
-RK
Have you tried the following instead of going through the pre-defined Spring context configuration?
// build a MongoTemplate on the fly for whatever database name comes in at runtime
MongoTemplate getMongoTemplate(Mongo mongo, String database) {
    return new MongoTemplate(mongo, database);
}
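A hedged usage sketch, assuming the database name arrives with the HTTP request ("mongo", "tenantDb" and "payload" are placeholders, with payload being some @Document-annotated POJO):
// pull the database name out of the incoming request and save into that database
MongoTemplate template = getMongoTemplate(mongo, tenantDb);
template.save(payload);   // MongoDB creates the database lazily on the first write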