Play!: Does Slick's DDL replace Evolutions?

This may be a dumb question, but I'm new to Play! and Slick. While using Slick's table.ddl.create I noticed that it doesn't create an evolution, but the application still works.
Does this replace evolutions? Can I use it in production? Should I?
Thanks in advance.

Both Slick and the Slick DDL plugin can only generate code to create or drop your schema, not to evolve it. So you still need Play evolutions or something similar to modify an existing schema along the way. In the Slick team, we are working towards a migration solution (at a lower priority). Many parts are already there but haven't been integrated properly yet. There is nafg's schema-manipulation DSL: https://github.com/nafg/slick-migration-api and my one-year-old prototype of a database version management tool: https://github.com/cvogt/migrations/ . The code-generation part of the latter has already made it into Slick 2.0. Properly integrating all of these will give us a comprehensive solution for type-safe database migration scripts.
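For a flavour of what nafg's slick-migration-api provides, here is a minimal sketch based on that project's README (the table and column names are made up, and the exact dialect and method names may differ between versions):

import slick.jdbc.H2Profile.api._
import slick.migration.api._

// Hypothetical table used only to illustrate the migration DSL
class Users(tag: Tag) extends Table[(Int, String)](tag, "users") {
  def id = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  def * = (id, name)
}

object MigrationExample {
  implicit val dialect: H2Dialect = new H2Dialect

  val users = TableQuery[Users]

  // Migrations are composed from typed operations instead of raw SQL
  val init = TableMigration(users).create.addColumns(_.id, _.name)
  val seed = SqlMigration("insert into users (id, name) values (1, 'alice')")
  val migration = init & seed
  // db.run(migration())  // migration() yields a DBIO you can run
}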

Slick is able to generate DDL for your defined tables, but it does not contain logic that does what Evolutions does.
The Play Slick plugin, on the other hand, contains a SlickDDLPlugin that will generate and run DDL evolutions for you when you run your app in non-prod mode (with play run, for example). It also dumps those evolutions into your conf/evolutions directory.
The source that handles evolutions:
https://github.com/freekh/play-slick/blob/master/src/main/scala/play/api/db/slick/plugin/SlickPlugin.scala
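To make the distinction concrete, here is a minimal Slick 2.x sketch (the table is made up) showing the DDL Slick can emit; note that nothing here records anything in Play's evolutions bookkeeping:

import scala.slick.driver.H2Driver.simple._

// Hypothetical table used only to demonstrate DDL generation
class Users(tag: Tag) extends Table[(Int, String)](tag, "USERS") {
  def id = column[Int]("ID", O.PrimaryKey)
  def name = column[String]("NAME")
  def * = (id, name)
}

object ShowDdl extends App {
  val users = TableQuery[Users]
  // createStatements lists the CREATE statements that users.ddl.create
  // would execute; Slick has no counterpart for evolving an existing schema
  users.ddl.createStatements.foreach(println)
}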

Related

Using Slick with Kudu/Impala

Kudu tables can be accessed via Impala and thus its JDBC driver. Thanks to that, they are accessible via the standard Java/Scala JDBC API. I was wondering if it is possible to use Slick for this. Or, if not, does any other high-level Scala DB framework support Impala/Kudu?
Slick can be used with any JDBC database
http://slick.lightbend.com/doc/3.3.0/database.html
At least for me, Slick is not fully compatible with Impala/Kudu. Using Slick, I cannot modify DB entities: I cannot create, update, or delete any item. It works only for reading data.
There are two ways you could use Slick with an arbitrary JDBC driver (and SQL dialect).
The first is to use low-level JDBC calls. The SimpleDBIO class gives you access to a JDBC connection:
val getAutoCommit = SimpleDBIO[Boolean](_.connection.getAutoCommit)
That example is from the Slick manual.
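You would run it like any other action, e.g. db.run(getAutoCommit) against your configured Database (assuming the usual Slick 3 setup).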
However, I think you're more interested in working at a higher level than that. In that case, for Slick, you'd need to implement a custom Profile. If Impala is similar enough to an existing database profile, you may be able to extend an existing profile and adjust it to account for any differences. For example, this would allow you to customize how SQL is formatted for Impala, how timestamps are represented, and how column names are quoted. The documentation on Porting SQL from Other Database Systems to Impala will give you an idea of what needs to change in a profile.
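As a very rough illustration of the shape this takes (this is not a working Impala profile; starting from MySQLProfile, and the exact override points, are assumptions):

import slick.basic.Capability
import slick.jdbc.{JdbcCapabilities, MySQLProfile}

// Rough sketch only: start from a profile with a similar dialect (MySQL is
// a guess here) and adjust the parts Impala handles differently.
trait ImpalaProfile extends MySQLProfile {
  // One kind of hook you can override: advertise fewer capabilities, e.g.
  // if the driver cannot return generated keys on insert.
  override protected def computeCapabilities: Set[Capability] =
    super.computeCapabilities - JdbcCapabilities.returnInsertKey

  // SQL generation itself is customised by overriding quoteIdentifier,
  // createQueryBuilder, column types, and so on.
}
object ImpalaProfile extends ImpalaProfile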
Or, if not, does any other high-level Scala DB framework support Impala/Kudu?
None of the mainstream libraries seem to support Impala as a feature. Having said that, the Doobie documentation mentions customising connections for Hive. So it may be worth quickly trying Doobie to see whether you can query and insert, for example.
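If you do try Doobie, a quick read-only probe might look like this (a sketch assuming Doobie with cats-effect 3; the Impala driver class name, the URL, and my_table are all placeholders/assumptions):

import cats.effect.IO
import cats.effect.unsafe.implicits.global
import doobie._
import doobie.implicits._

object ImpalaProbe extends App {
  // Driver class and URL are assumptions for the Impala JDBC driver
  val xa = Transactor.fromDriverManager[IO](
    "com.cloudera.impala.jdbc41.Driver",
    "jdbc:impala://host:21050/default",
    "", "")

  // If a plain select works, try an insert next to see how far you get
  val count = sql"select count(*) from my_table".query[Long].unique
  println(count.transact(xa).unsafeRunSync())
}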

Slick-CodeGen for specific tables

I am looking at the slick 2.1 code generation utility (working with legacy code).
http://slick.lightbend.com/doc/2.1.0/code-generation.html
I don't see a syntax which will allow me to run the code-gen on only a few specific tables.
So I added new tables, and I want to generate code only for those, not everything else.
Is this possible?
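As far as I know this isn't exposed as a simple switch on the standalone generator, but the generator can be driven from code and fed a filtered model. A rough sketch for Slick 2.1 (the package paths and the createModel signature are as I recall from the 2.1 code-generation docs, so treat them as assumptions; table names are made up):

import scala.slick.driver.MySQLDriver
import scala.slick.driver.MySQLDriver.simple._
import scala.slick.jdbc.meta.MTable
import scala.slick.model.codegen.SourceCodeGenerator

object PartialCodegen extends App {
  val wanted = Set("new_table_a", "new_table_b") // only generate these

  Database.forURL("jdbc:mysql://localhost/mydb",
                  driver = "com.mysql.jdbc.Driver",
                  user = "user", password = "pass")
    .withSession { implicit session =>
      // Filter the JDBC metadata before building the model
      val tables = MTable.getTables.list.filter(t => wanted(t.name.name))
      val model = MySQLDriver.createModel(Some(tables))
      new SourceCodeGenerator(model)
        .writeToFile("scala.slick.driver.MySQLDriver", "app", "models.generated")
    }
}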

Play Slick version 1.0.1, how to generate the SQL from models?

I am looking at this example:
https://github.com/playframework/play-slick/tree/master/samples/computer-database
https://github.com/playframework/play-slick/blob/master/samples/computer-database/app/dao/CompaniesDAO.scala
How can I generate the SQL (DDL) file for applying to the database for this example?
I tried this in application.conf:
ebean.default="models.daos.*"
This didn't help. Also, I don't see any examples within that GitHub repo (https://github.com/playframework/play-slick/blob/master/samples/) which generate SQL from the models.
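One approach that works with play-slick 1.0.x (which is based on Slick 3.0) is to print the schema's SQL yourself and paste it into an evolutions file. A minimal sketch (the table here is an illustrative stand-in for the sample's models, not the actual CompaniesDAO code):

import slick.driver.H2Driver.api._

// Stand-in table; in the sample this would be the Companies table
class Companies(tag: Tag) extends Table[(Long, String)](tag, "COMPANY") {
  def id = column[Long]("ID", O.PrimaryKey, O.AutoInc)
  def name = column[String]("NAME")
  def * = (id, name)
}

object PrintDdl extends App {
  val companies = TableQuery[Companies]
  // createStatements yields the CREATE TABLE ... strings; paste them into
  // conf/evolutions/default/1.sql (with the drops in the !Downs section)
  companies.schema.createStatements.foreach(println)
  companies.schema.dropStatements.foreach(println)
}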

How to migrate existing data managed with Squeryl?

There is a small project of mine reaching its release, based on Squeryl, a typesafe relational database framework for Scala (a JVM-based language).
I foresee multiple updates after the initial deployment. The data entered into the database should persist across them. This is impossible without some kind of data migration procedure that upgrades the data for the newer DB schema.
Using old data for testing new code also requires compatibility patches.
Right now I use automatic schema generation by the framework. It seems to be able only to create the schema from scratch; no data persists.
Are there methods that allow easy and formalized migration of data to changed schema without completely dropping automatic schema generation?
So far I can only see an easy way to add columns: we dump old data, provide default values for new columns, reset schema and restore old data.
How do I delete or rename columns, or change column types or semantics?
If schema generation is not useful for production database migration, what are standard procedures to follow for conventional manual/scripted redeployment?
There have been several discussions about this on the Squeryl list. The consensus tends to be that there is no real best practice that works for everyone. Having an automated process update your schema based on your model is brittle (it can't handle situations like column renames) and can be dangerous in production. Personally, I like the idea of "migrations", where all of your schema changes are written as SQL. There are a few frameworks that help with this, and you can find some of them here. For my own projects, I just use a light wrapper around the psql command-line utility to do schema migrations and data loading, as it's a lot faster for the latter than feeding in the data over JDBC.
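As an illustration of that last idea, a "light wrapper around psql" can be as small as this (a minimal sketch with illustrative names; it applies every .sql file in a migrations/ directory in name order, and leaves tracking which scripts already ran, e.g. in a schema_version table, as an exercise):

import java.io.File
import scala.sys.process._

object Migrate extends App {
  val dbName = "myapp" // illustrative database name

  val scripts = new File("migrations")
    .listFiles()
    .filter(_.getName.endsWith(".sql"))
    .sortBy(_.getName) // e.g. 001_create.sql, 002_add_column.sql, ...

  scripts.foreach { script =>
    println(s"Applying ${script.getName}")
    // ON_ERROR_STOP makes psql exit non-zero on the first failed statement
    val exit = Seq("psql", "-d", dbName, "-v", "ON_ERROR_STOP=1",
                   "-f", script.getAbsolutePath).!
    require(exit == 0, s"Migration ${script.getName} failed")
  }
}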

Do any of the Scala ORM's implement code generation from SQL -> Scala?

I am using Squeryl as an ORM with MySQL. This is a new project working with existing schemas that contain several hundred tables.
As far as I can tell, neither Squeryl nor any of the available Scala ORMs can generate the O (Scala classes) from the R (MySQL tables). I suppose it wouldn't be too hard to roll my own by crawling the information schema (a minimal sketch of that starting point follows this question), but I'd rather not duplicate that effort if someone else has already done so.
I'm also curious if anyone can tell me why the R->O direction is so often neglected. In my experience, O->R is the exception and not the rule.
I'll probably start down the path of rolling my own solution. If that's anywhere near complete before I hear of another option, I'll post a link to that code.
Thanks.
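For reference, the "crawl the information schema" starting point can be done with plain JDBC metadata. A minimal sketch (connection details are illustrative, and the SQL-to-Scala type mapping is left as a comment):

import java.sql.DriverManager

// Minimal sketch: list tables and columns via JDBC metadata as a starting
// point for emitting Scala case classes. Connection details are illustrative.
object SchemaCrawler extends App {
  val conn = DriverManager.getConnection(
    "jdbc:mysql://localhost/mydb", "user", "pass")
  val meta = conn.getMetaData

  val tables = meta.getTables(null, null, "%", Array("TABLE"))
  while (tables.next()) {
    val table = tables.getString("TABLE_NAME")
    println(s"case class ${table.capitalize}(")
    val cols = meta.getColumns(null, null, table, "%")
    while (cols.next()) {
      val name = cols.getString("COLUMN_NAME")
      val sqlType = cols.getString("TYPE_NAME")
      println(s"  $name: /* map $sqlType to a Scala type */ ,")
    }
    println(")")
    cols.close()
  }
  tables.close()
  conn.close()
}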
QueryDSL provides you with a utility that can generate code from existing tables. You would, however, need to accept that it's primarily a Java lib, and Scala is treated only as an extension there.
I guess support for R->O is just a matter of time and users' feedback.
There is Squealer, which queries database tables and generates Scala code. It uses Squeryl and other libraries.
I managed to use it with minimal tweaking.
Its GitHub is here
I'm curious what type of projects you are working on where you've found R->O to be the rule. My experience (and I'm including not just my own projects but those that other Squeryl users have mentioned on the mailing list) is that most Squeryl projects are predominantly new applications where an SQL database is being used to persist an application-specific model, rather than a model being created to match an existing schema. Like most open-source projects, the developers tend to focus first on features that they themselves need and second on features that are most requested by the community, so I would encourage you to take this up at the Squeryl Google Group as well.