I have many Liquibase changesets and execute them before integration tests.
Now I want to use EF InMemory for tests.
Is it possible to apply the Liquibase changesets to an InMemory database?
Sure it is. Just create a copy of your database connection settings that points to your test database, and use those settings while running the tests.
I started using Prisma, especially for handling database migrations, and it handles that well. But there are many open issues for things it does not handle well related to queries (the biggest are that queryRaw does not always work as expected and that there is no straightforward way to use PostgreSQL Row Level Security). Everything I've found to be a problem with queries in Prisma can easily be done in node-postgres.
I realize from the docs that some of these issues are not a problem in knexjs, but Prisma has a more feature-rich migration setup (automatic and customizable).
Can I safely use prisma and node-postgres together? If so, how? For example use prisma for schema design and database migrations and use node-postgres for all my query logic?
Yes, you can safely use prisma and node-postgres together.
Your workflow will look like the following:
Create a Prisma schema to define your models and underlying database tables (a minimal sketch follows this list).
Use the Prisma CLI (the prisma npm package) to run migrations against your database. This will generate the client in the node_modules/.prisma/client directory, which you could optionally delete, as you won't be using it.
Instead of using generated Prisma Client inside your node application, use the node-postgres library to run queries against the database in your application.
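For illustration, here is a minimal schema.prisma sketch for the first step; the model and field names are assumptions, not something from the question:

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

// A hypothetical model; prisma migrate creates the matching "User" table.
model User {
  id    Int    @id @default(autoincrement())
  email String @unique
}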
Since you're only using Prisma to make any kind of migration or changes to the database structure, there's no risk of your prisma schema not being synced with your database. On the off chance this does happen, you can always get it back in sync using prisma db pull.
Furthermore, since your application is not going to use Prisma client to connect to the database for running queries, node-postgres will handle the connection pool.
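As a sketch of what the query side could then look like with node-postgres, assuming the hypothetical User model above and the same DATABASE_URL environment variable:

import { Pool } from "pg";

// node-postgres owns the connection pool; DATABASE_URL is the same
// connection string that Prisma uses for migrations.
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Plain SQL instead of the generated Prisma Client.
export async function findUserByEmail(email: string) {
  const { rows } = await pool.query(
    'SELECT id, email FROM "User" WHERE email = $1',
    [email]
  );
  return rows[0] ?? null;
}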
What is the difference between prisma db push and prisma migrate dev? When should I use one over the other? The docs say that prisma db push is for schema prototyping only, and I don't understand what that means.
They serve two different environments. prisma db push is not to be used in your production environment, as stated in the docs:
db push uses the same engine as Prisma Migrate to synchronize your Prisma schema with your database schema, and is best suited for schema prototyping. The db push command:

Introspects the database to infer and executes the changes required to make your database schema reflect the state of your Prisma schema.
By default, after changes have been applied to the database schema, generators are triggered (for example, Prisma Client). You do not need to manually invoke prisma generate.
If db push anticipates that the changes could result in data loss, it will:
Throw an error
Require the --accept-data-loss option if you still want to make the changes

Note: db push does not interact with or rely on migrations. The migrations table will not be updated, and no migration files will be generated.
prisma migrate dev is used in your local environment, as explained in the docs:
migrate dev is a development command and should never be used in a production environment.

This command:

Replays the existing migration history in the shadow database in order to detect schema drift (edited or deleted migration file, or manual changes to the database schema)
Applies pending migrations to the shadow database (for example, new migrations created by colleagues)
Generates a new migration from any changes you made to the Prisma schema before running migrate dev
Applies all unapplied migrations to the development database and updates the _prisma_migrations table
Triggers the generation of artifacts (for example, the Prisma Client)

The migrate dev command will prompt you to reset the database in the following scenarios:

Migration history conflicts caused by modified or missing migrations
The database schema has drifted away from the end-state of the migration history
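In practice, the two commands could look like this (the migration name is illustrative):

npx prisma db push
npx prisma migrate dev --name my-migration

The first synchronizes the database with the schema directly and writes no migration files; the second generates a migration file, applies it, and records it in the _prisma_migrations table.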
If you have any other questions regarding this, there is a comparison in the docs explaining when to use one or the other.
From the docs:
Use db push to prototype a schema at the start of a project and initialize a migration history when you are happy with the first draft
Use db push to prototype a change to an existing schema, then run prisma migrate dev to generate a migration from your changes (you will be asked to reset)
So the key for me here was that I only had to run db push the first time (into a Heroku app with Heroku Postgres). After that, migrations all the time...
I'm new to Prisma ORM, and I'm trying to make migrations in Prisma.
I see that one way to do this is to update the data model and then just run:
prisma deploy
But what if I want to create migrations for specific versions of the app? How could I do that?
As the Prisma documentation describes, there are two ways of doing database migrations in Prisma:
Using the Prisma CLI
Performing a manual DB migration with plain SQL
If you follow the first approach and edit your data model, the changes will be carried out automagically once you run prisma deploy. You can specify the service and stage this will be rolled out to via the PRISMA_ENDPOINT environment variable:
PRISMA_ENDPOINT="http://localhost:4466/{SERVICE}/{STAGE}"
This way you can roll out and test your data model changes in a different stage or on a different service.
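For example, deploying the same data model to a hypothetical staging stage (service and stage names are illustrative):

PRISMA_ENDPOINT="http://localhost:4466/my-service/staging" prisma deploy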
The second approach is to manually change the database model via plain SQL. Be careful to ensure the database schema and your data model are in sync.
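For instance, a sketch of such a manual change (table and column names are illustrative):

ALTER TABLE "User" ADD COLUMN nickname TEXT;

Afterwards, make sure the same field exists in your data model.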
For more information check out:
https://www.prisma.io/docs/datamodel-and-migrations/migrations-POSTGRES-asd4/
I have created a PostgreSQL database A using Liquibase changesets. Now I'm creating an application that allows creating a new database B and copies the schema from database A in real time, including the Liquibase changesets, as the database can still be updated later. Note that at the time of copying, the schema in database A could already have been updated, making the base changesets outdated.
My main question would be:
How do I copy PostgreSQL schema x from database a (dynamically generated at run-time) to database b using Liquibase? Database b could be on another server.
If it's not possible with Liquibase, what other tools or approaches would make this possible?
--
Let me add more context:
We initialize the schema of a new database a using Liquibase changesets.
We can add a new table or field to database a at run-time, i.e., while the application is running. For example, we add a new table people to the schema of database a that is not originally in the changeset. This is done using Liquibase classes too, so a changeset is added to the databasechangelog table.
Now, we create a new database b.
We want to import the schema of database a into b, including the people table.
I hope that is clear.
Thanks.
All schema changes must be run through your schema migration tool
The point of using a database schema migration tool such as Liquibase or Flyway is to have a “single source of truth” regarding the structure of your database tables. Your set of Liquibase changesets (or Flyway scripts) is supposed to be that single source of truth for your database.
If you are altering the structure of your database at runtime, such as adding a table named people, outside the scope of your migration tool, then you have violated the rules of the game and defeated the purpose of using a schema migration tool. The intention is that you make all schema changes through that tool.
If you need to add a table while running in production, you should be dropping the physical file for the Liquibase changeset (or Flyway script) into the file system of your database server environment, and then invoking Liquibase (or Flyway) to run a migration.
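For example, a minimal sketch of such a changeset in Liquibase's SQL format (file name, author, id, and columns are illustrative):

--liquibase formatted sql
--changeset alice:add-people-table
CREATE TABLE people (
    id   BIGINT PRIMARY KEY,
    name TEXT NOT NULL
);
--rollback DROP TABLE people;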
Perhaps you have been misunderstanding the sequence of events:
If you have built a database on server "A", that means you installed Postgres, created an empty database, then installed the collection of Liquibase changesets you have carefully built, then ran a Liquibase migration operation on that server.
When you go to create a database on server "B", you should follow the same steps: install Postgres, create an empty database, install the very same collection of Liquibase changesets, and then run a Liquibase migration operation.
Alternatively, if making a copy of server "A" to create server "B", that copy should include the exact same Liquibase changesets. So at the end of your copy process, the two databases+changesets are identical.
Here's how I solved this problem of mine using the Liquibase Java library:
1.) Export the changelog from the source database into a temporary file (XML).
// Export the current schema of the source database as an XML changelog.
// liquibaseOutFile, sourceDatabase, catalogAndSchema, and changeLogWriter
// (a DiffToChangeLog) are set up earlier; see the linked post for details.
Liquibase liquibase = new Liquibase(liquibaseOutFile.getAbsolutePath(), new FileSystemResourceAccessor(), sourceDatabase);
liquibase.generateChangeLog(catalogAndSchema, changeLogWriter, new PrintStream(liquibaseOutFile.getAbsolutePath()), null);
2.) Apply the temporary changelog file to the new data source.
// Run the generated changelog against the target database.
Liquibase targetLiquibase = new Liquibase(liquibaseOutFile.getAbsolutePath(), new FileSystemResourceAccessor(), targetDatabase);
// No contexts are given, so every changeset in the changelog is applied.
Contexts context = new Contexts();
targetLiquibase.update(context);
Here's the complete code: https://czetsuya-tech.blogspot.com/2019/12/generate-postgresql-schema-using-java.html
Our clients do not have admin access to our web servers, and we run MVC. How can I clone an existing DB to fork the data without shutting down the current DB? At the moment EF creates the new DB, but all the records need to be created manually, so we wanted to do a fork.
I suppose I could go through all entities in all tables, detach them all, and insert them into the new DB, but is there a nicer way? Writing that code for 100 tables is not quick, even if we use reflection.
The other option of doing a backup and restore is a bit painful, as some of the DBs are hosted on SQL Server and some as attached files.
Ben
EF is not the tool for this. Either use native SQL tools like backup/restore, or, if any additional logic is needed, create an SSIS package or a custom ADO.NET application for the data migration. With EF it will not only take a long time, it will also be a terribly bad and slow solution.
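For instance, a minimal T-SQL sketch of the backup/restore route (database names, paths, and logical file names are illustrative; a backup does not require shutting down the source database):

-- Back up the live database (it stays online during the backup).
BACKUP DATABASE SourceDb TO DISK = 'C:\Backups\SourceDb.bak';

-- Restore the backup under a new name to create the fork.
RESTORE DATABASE ForkedDb FROM DISK = 'C:\Backups\SourceDb.bak'
WITH MOVE 'SourceDb'     TO 'C:\Data\ForkedDb.mdf',
     MOVE 'SourceDb_log' TO 'C:\Data\ForkedDb_log.ldf';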