Upgrade from slick-extensions-2.1.0 to Slick-3.2.0 - scala

I was upgrading my project from scala 2.11 to scala 2.12.
For DB interaction, slick-extensions was used, but I found out that slick-extensions has been merged into Slick itself as of Slick 3.2.0.
While going through the docs I found out about JdbcProfiles, the discontinuation of the Driver objects, etc.
Now I have a lot of code where the withSession method from scala.slick.jdbc.JdbcBackend is used, like:
db.withSession { implicit session =>
  rmobVersionControl.foreach(e =>
    elements += new RMOBVersionControlElement(e._1, e._2, e._3))
}
In the docs I see that the withSession() method is deprecated (since version 3.0).
But I was wondering if there's a way to keep this code in Slick 3.2.0, because changing all of it over to the Action-based API would be a lot of pain.
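For reference, here is a rough sketch (not a drop-in answer) of what that block might look like with the Action-based API, assuming rmobVersionControl is a Slick Query and elements is a mutable buffer, as the snippet suggests:

import scala.concurrent.ExecutionContext.Implicits.global

// db.run turns the query into a Future instead of borrowing an implicit session;
// the rows are appended once that Future completes.
val populated: scala.concurrent.Future[Unit] =
  db.run(rmobVersionControl.result).map { rows =>
    rows.foreach(e => elements += new RMOBVersionControlElement(e._1, e._2, e._3))
  }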

Related

Aggregation works unexpectedly after spring-data-mongo-db library update

Today I updated Spring Boot from version 2.2.2.RELEASE to 2.5.2. After that, aggregations started to behave differently. Here is an example query (in Kotlin):
val aggregation = Aggregation.newAggregation(
    Aggregation.match(Criteria.where("_id").isEqualTo(ObjectId("6faa215a23cfcf1524cc4a4b"))),
    Aggregation.project().andExclude("_id").andExpression("\$\$ROOT").`as`("user"),
    Aggregation.lookup("user", "user._id", "_id", "sameUser")
)
return reactiveMongoTemplate.aggregate(aggregation, "user", UserTestAgggr::class.java)
data class UserTestAgggr(
    val user: User,
    val sameUser: User
)
This code worked with version 2.2.2.RELEASE. However, in version 2.5.2 the API requires the sameUser field to be a list (otherwise it throws an exception).
I would like to avoid modifying my queries or objects (because I've got too many of those).
So I guess my question is: is there a way to make the most recent API behave like before, without a downgrade?
My answer was to create my own converter, which was a nightmare, because it had to extend MappingMongoConverter (some Spring classes inject MappingMongoConverter directly instead of relying on the MongoConverter interface). I had to write it in Java as well, so that I could fall back on the original MappingMongoConverter implementation. Not fun at all, but it solved the issue for me.

Equivalent of scala.concurrent.util.Unsafe in Scala 2.12

I create an empty instance of my object and then initialise it using run-time values. The implementation was based on scala.concurrent.util.Unsafe in Scala 2.11 and it worked fine.
I understand Unsafe is bad and hence has been deprecated in Scala 2.12.
If it's deprecated, then what's the equivalent of Unsafe in Scala 2.12?
Assuming you're running on a JVM where sun.misc.Unsafe is still available (this will limit which JVMs you can run on, but so did using scala.concurrent.util.Unsafe so no immediate loss):
val unsafeInstance = // use in place of Scala 2.11 usages of scala.concurrent.util.Unsafe.instance
  classOf[sun.misc.Unsafe]
    .getDeclaredFields
    .filter(_.getType == classOf[sun.misc.Unsafe])
    .headOption
    .map { field =>
      field.setAccessible(true)
      field.get(null).asInstanceOf[sun.misc.Unsafe]
    }
    .getOrElse { throw new IllegalStateException("Can't find instance of sun.misc.Unsafe") }
Code is very slightly adapted from the Scala 2.11 source.
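If the goal is still "create an empty instance, then fill it with run-time values", a hypothetical usage of the instance obtained above could look like this (Payload and its field are made up for illustration):

class Payload(val id: Int) // stand-in for the real class being instantiated

// allocateInstance skips constructors entirely, leaving fields at their defaults
val blank = unsafeInstance.allocateInstance(classOf[Payload]).asInstanceOf[Payload]

// fields can then be written through their memory offsets
val idOffset = unsafeInstance.objectFieldOffset(classOf[Payload].getDeclaredField("id"))
unsafeInstance.putInt(blank, idOffset, 42)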
It's possible that this is an instance of spending so much time thinking about "could" that one didn't think about "should".

type mismatch errors when upgrading from scala 2.9 to 2.13.2

I recently revived an old library that was written in scala 2.9, and I created a new scala project using scala 2.13.2
I am getting errors like the following:
[error] type mismatch;
[error]  found   : scala.collection.mutable.Buffer[Any]
[error]  required: Seq[Any]
Was there a specific change between 2.9 and 2.13.2 that stopped sequences from being implicitly converted, or something else that might explain many of these compile errors?
I had to add .toSeq to many of my functions' return statements that were vals of Buffer[Any] and needed to be passed as an argument to a function expecting a Seq.
Quite a lot of things have happened in the last 7+ years (including a rewrite of the collections library).
If adding .toSeq solves your problem - just go for it (a small illustration follows below).
If you want to know what exactly has changed - try upgrading version by version: first to 2.10.x, then 2.11.x, then 2.12.x, and finally to 2.13.2.
At each upgrade you'll probably see deprecation warnings. Fix them before upgrading to the next version.
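A minimal illustration of the .toSeq fix mentioned above (legacyBuild and consume are made-up names):

import scala.collection.mutable

def legacyBuild(): mutable.Buffer[Any] = mutable.Buffer(1, "two", 3.0)
def consume(xs: Seq[Any]): Int = xs.size

// In 2.13, scala.Seq is immutable.Seq, so passing the Buffer directly no longer compiles;
// .toSeq produces an immutable copy that does conform.
consume(legacyBuild().toSeq)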
Brave, but perhaps bad form, to disturb the dead. Nevertheless, maybe pass the mutable.Buffer as mutable.Seq instead of Seq, which by default is immutable.Seq. Consider:
import scala.collection.mutable

val mb = mutable.Buffer(11, Some(42))
val ms: mutable.Seq[Any] = mb // OK: Buffer is a mutable.Seq
val is: Seq[Any] = mb         // NOK: scala.Seq is immutable.Seq in 2.13

The "right" way to use write Slick 3.0 Scala queries in Play Framework

I'm using Slick 3.0 and (of course) almost all the examples out there cover Slick 2.x. Things have changed and frankly seem more complicated, not less.
Here's an example: I want to get an object (a GPPerson) by id. This is what I have right now, and it seems very verbose... more so than Slick 2.x:
def get(id: GPID): Option[GPPerson] = Await.result(
  db.run(
    people.filter(_.id === id).result
  ), Duration.Inf
).headOption
In Slick 2.x things were easier because of the implicits, among other things. But the above seems to be the most concise expression I've come up with.
It also doesn't really address exception handling, which I would need to add.
I started to use Slick 3.0 in a new project a few months ago and I had the same questions. This is what I understood:
Slick 3.0 was designed for non-blocking asynchronous (reactive) applications. Obviously that means Akka + Play / Spray nowadays. In this world you mostly interact with Futures, which is why Slick's db.run returns a Future. There is no point in using Await.result - if you need blocking calls, it's better to go back to 2.x.
But if you use a reactive stack you'll get the benefits immediately. For example, Spray is a completely non-blocking library that works with Futures nicely using the onComplete directive. You can call a method that returns a Future with a result from Slick in a Spray route and then use that result together with onComplete. In this case the whole request-reply pipeline is non-blocking.
You also mentioned exception handling, and this is exactly how you do it - using Futures.
So based on my experience I would write your method in the following way:
def get(id: GPID): Future[Option[GPPerson]] = db.run(
  people.filter(_.id === id).result.map(_.headOption)
)
and then work with a Future.
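For the exception-handling part you mentioned, a hedged sketch of how a caller might deal with failures on that Future (getOrFallback is a made-up name; the exact exception type depends on your database driver):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

def getOrFallback(id: GPID): Future[Option[GPPerson]] =
  get(id).recover {
    // log the error here and degrade to "not found" instead of failing the whole pipeline
    case _: java.sql.SQLException => None
  }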
You can do this:
def syncResult[R](action: slick.dbio.DBIOAction[R, slick.dbio.NoStream, scala.Nothing]): R = {
  import scala.concurrent.duration.Duration
  val db = Database.forConfig("db")
  try {
    Await.result(db.run(action), Duration.Inf)
  } finally db.close
}

def get(id: GPID): Option[GPPerson] = syncResult { people.filter(_.id === id).result.headOption }

Conditional compilation in Scala

I am working on a library which depends on Scala 2.9, but only for a minor feature. I would like to offer a version compatible with 2.8, but I don't want to maintain two code branches. Since I'm using SBT, I would like to benefit from its cross-compilation features.
However, I don't know whether there is a way to provide an equivalent of conditional compilation, to include a piece of code only if Scala 2.9 is used. Reflection could be an option (but how?).
Edit: The feature I am using from 2.9 is the new sys package object.
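For context, a small sketch of the 2.9-only code in question next to its 2.8-compatible fallback (java.version is just an example property):

// Scala 2.9+: the new sys package object wraps the JVM system properties in a Map view
val javaVersion: Option[String] = sys.props.get("java.version")

// Scala 2.8-compatible equivalent via plain Java APIs
val javaVersionOld: String = System.getProperty("java.version")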
I got it working with reflection. So if I want to get scala.sys.SystemProperties, I can do:
try {
  val k = java.lang.Class.forName("scala.sys.package$")
  val m = k.getMethod("props")
  // etc.
} catch {
  case _ => throw new UnsupportedOperationException("Only available with Scala 2.9")
}
But it is so boring and ugly that I think I will drop those features...
Read this blog post, which describes how to do it with metaprogramming:
http://michid.wordpress.com/2008/10/29/meta-programming-with-scala-conditional-compilation-and-loop-unrolling/
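A different route, not covered by the answers above and shown here in modern sbt syntax (so treat it as a sketch rather than something that would have worked as-is back then): keep the version-specific code in its own source directory and compile that directory only for matching Scala versions.

// build.sbt (hypothetical layout: src/main/scala-2.9 holds the sys-based code,
// src/main/scala-2.8 holds the fallback implementation)
Compile / unmanagedSourceDirectories += {
  val base = (Compile / sourceDirectory).value
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, minor)) if minor >= 9 => base / "scala-2.9"
    case _                              => base / "scala-2.8"
  }
}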