Recursive tree-like table query with Slick - scala

My table data forms a tree structure where one row can reference a parent row in the same table.
What I am trying to achieve, using Slick, is to write a query that will return a row and all of its children. I would also like to do the opposite: write a query that returns a child and all of its ancestors.
In other words:
findDown(1) should return
List(Group(1, 0, "1"), Group(3, 1, "3 (Child of 1)"))
findUp(5) should return
List(Group(5, 2, "5 (Child of 2)"), Group(2, 0, "2"))
Here is a fully functional worksheet (except for the missing solutions ;-).
package com.exp.worksheets
import scala.slick.driver.H2Driver.simple._
object ParentChildTreeLookup {
  implicit val session = Database.forURL("jdbc:h2:mem:test1;", driver = "org.h2.Driver").createSession()
  session.withTransaction {
    Groups.ddl.create
  }
  Groups.insertAll(
    Group(1, 0, "1"),
    Group(2, 0, "2"),
    Group(3, 1, "3 (Child of 1)"),
    Group(4, 3, "4 (Child of 3)"),
    Group(5, 2, "5 (Child of 2)"),
    Group(6, 2, "6 (Child of 2)"))
  case class Group(
    id: Long = -1,
    id_parent: Long = -1,
    label: String = "")
  object Groups extends Table[Group]("GROUPS") {
    def id = column[Long]("ID", O.PrimaryKey, O.AutoInc)
    def id_parent = column[Long]("ID_PARENT")
    def label = column[String]("LABEL")
    def * = id ~ id_parent ~ label <> (Group, Group.unapply _)
    def autoInc = id_parent ~ label returning id into {
      case ((_, _), id) => id
    }
    def findDown(groupId: Long)(implicit session: Session) = { ??? }
    def findUp(groupId: Long)(implicit session: Session) = { ??? }
  }
}
A really bad and static attempt at findDown might be something like:
private def groupsById = for {
  group_id <- Parameters[Long]
  g <- Groups; if g.id === group_id
} yield g
private def childrenByParentId = for {
  parent_id <- Parameters[Long]
  g <- Groups; if g.id_parent === parent_id
} yield g
def findDown(groupId: Long)(implicit session: Session) = { groupsById(groupId).list union childrenByParentId(groupId).list }
But I'm looking for a way for Slick to recursively search the same table using the id and id_parent link. Any other good way to solve the problem is also welcome. Keep in mind, though, that it would be best to minimise the number of database round-trips.

You could try calling SQL from Slick. The SQL call to go up the hierarchy would look something like this (this is for SQL Server):
WITH org_name AS
(
SELECT DISTINCT
parent.id AS parent_id,
parent.label AS parent_label,
child.id AS child_id,
child.label AS child_label
FROM
Group parent RIGHT OUTER JOIN
Group child ON child.parent_id = parent.id
),
jn AS
(
SELECT
parent_id,
parent_label,
child_id,
child_label
FROM
org_name
WHERE
parent_id = 5
UNION ALL
SELECT
C.parent_id,
C.parent_label,
C.child_id,
C.child_label
FROM
jn AS p JOIN
org_name AS C ON C.child_id = p.parent_id
)
SELECT DISTINCT
jn.parent_id,
jn.parent_label,
jn.child_id,
jn.child_label
FROM
jn
ORDER BY
1;
If you want to go down the hierarchy, change the line:
org_name AS C ON C.child_id = p.parent_id
to
org_name AS C ON C.parent_id = p.child_id
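To run that CTE from Slick, here is a rough sketch (untested) using Slick 2.x's plain-SQL string interpolation; findUpSql is a hypothetical helper, the GetResult tells Slick how to read a GROUPS row back into the question's Group case class, and the exact CTE syntax depends on the database (H2 and PostgreSQL spell it WITH RECURSIVE, SQL Server omits the RECURSIVE keyword):
import scala.slick.jdbc.GetResult
import scala.slick.jdbc.StaticQuery.interpolation
// Reads a GROUPS row (ID, ID_PARENT, LABEL) into the Group case class
implicit val getGroupResult: GetResult[Group] = GetResult(r => Group(r.<<, r.<<, r.<<))
// Walks up the tree in a single round-trip via a recursive CTE
def findUpSql(groupId: Long)(implicit session: Session): List[Group] =
  sql"""
    WITH RECURSIVE tree(ID, ID_PARENT, LABEL) AS (
      SELECT ID, ID_PARENT, LABEL FROM GROUPS WHERE ID = $groupId
      UNION ALL
      SELECT g.ID, g.ID_PARENT, g.LABEL
      FROM GROUPS g JOIN tree t ON g.ID = t.ID_PARENT
    )
    SELECT ID, ID_PARENT, LABEL FROM tree
  """.as[Group].list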

In plain SQL this would be tricky. You would have multiple options:
Use a stored procedure to collect the correct records (recursively). Then convert those records into a tree using code
Select all the records and convert those into a tree using code
Use a more advanced technique, as described here (from Optimized SQL for tree structures) and here. Then convert those records into a tree using code
Depending on the way you want to do it in SQL, you need to build a matching Slick query. The concept of Leaky Abstractions is very evident here.
So getting the tree structure requires two steps:
Get the correct (or all) records
Build (using regular code) a tree from those records
Since you are using Slick I don't think it's an option, but another database type might be a better fit for your data model. Check out NoSQL for the differences between the different types.
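For the second step, here is a minimal in-memory sketch, assuming the Group case class from the question and that the relevant rows have already been fetched; byParent and descendants are hypothetical helpers:
// Index the flat list by parent id (roots, with id_parent = 0 in the sample data, end up under key 0)
def byParent(all: Seq[Group]): Map[Long, Seq[Group]] =
  all.groupBy(_.id_parent)
// All descendants of rootId, computed in memory from the index above
def descendants(rootId: Long, index: Map[Long, Seq[Group]]): Seq[Group] = {
  val children = index.getOrElse(rootId, Seq.empty)
  children ++ children.flatMap(child => descendants(child.id, index))
}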

Related

How can I convert Query[MappedProjection[Example, (Option[String], Int, UUID, UUID)], Example, Seq] to Query[Examples, Example, Seq]?
Details
I am trying to drop a column from an existing table (Examples in this case) and move the data to another table (Examples2 in this case). I don't want to change all of the existing codebase, so I plan to join these two tables and map the results to Example.
import slick.lifted.Tag
import slick.driver.PostgresDriver.api._
import java.util.UUID
case class Example(
  field1: Option[String] = None,
  field2: Int,
  someForeignId: UUID,
  id: UUID,
)
object Example
class Examples(tag: Tag) extends Table[Example](tag, "entityNotes") {
  def field1 = column[Option[String]]("field1")
  def field2 = column[Int]("field2")
  def someForeignId = column[UUID]("someForeignId")
  def id = column[UUID]("id", O.PrimaryKey)
  def someForeignKey = foreignKey(
    "someForeignIdToExamples2",
    someForeignId,
    Examples2.query,
  )(
    _.id.?
  )
  def * =
    (
      field1,
      field2,
      someForeignId,
      id,
    ) <> ((Example.apply _).tupled, Example.unapply)
}
object Examples {
  val query = TableQuery[Examples]
}
Basically, all the functions in the codebase call Examples.query. If I update that query by joining the two tables, the problem will be solved (of course with a performance cost, because of one extra join on each call).
To use the query with the existing codebase, we need to keep its type the same. For example, we can use filter as follows:
val query_ = TableQuery[Examples]
val query: Query[Examples, Example, Seq] = query_.filter(_.field2 > 5)
Everything will work without a problem since we keep the type of the query as it is supposed to be.
However, I cannot do that with a join if I want to use data from the second table.
val query_ = TableQuery[Examples]
val query = query_
  .join(Examples2.query)
  .on(_.someForeignId === _.id)
  .map {
    case (e, e2) =>
      (
        e2.value.?,
        e.field2,
        e2.id,
        e.id,
      ) <> ((Example.apply _).tupled, Example.unapply)
  }
This is where I got stuck. Its type is Query[MappedProjection[Example, (Option[String], Int, UUID, UUID)], Example, Seq].
Can anyone help? By the way, we don't have to use map; this is just what I have so far.

Join three tables in Scala Slick or flatten nested tuples

I need to INNER JOIN three tables, because of the foreign keys in the first one, as follows:
CREATE TABLE "product" (
"id" INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
"name" VARCHAR NOT NULL,
"price" FLOAT NOT NULL,
"categoryid" INT NOT NULL,
"supplierid" INT NOT NULL,
FOREIGN KEY(categoryid) references category(id),
FOREIGN KEY(supplierid) references supplier(id)
);
In my models I have a method to list them all:
def list(): Future[Seq[(Product, Category, Supplier)]] = db.run {
productTable.join(categoryTable).on(_.categoryid === _.id).join(supplierTable).on(_._1.supplierid === _.id).result
}
But this returns a nested tuple, ((Product, Category), Supplier), instead of a flat one.
So how should I join those tables to get a flat tuple, or, if that is not possible, how can I flatten this tuple?
EDIT:
Actually, the only solution I have found that works for me is a manual use of map:
def list(): Future[Seq[(Product, Category, Supplier)]] = db.run {
productTable.join(categoryTable).on(_.categoryid === _.id).join(supplierTable).on(_._1.supplierid === _.id).result.map(a => Seq((a(1)._1._1,a(1)._1._2,a(1)._2)))
}
This looks and feels horrible, but it is the only thing that has worked so far ...
Any better ideas?
Inner joins can be expressed as for comprehensions in Slick:
def list(): Future[Seq[(Product, Category, Supplier)]] = db.run {
  (for {
    product <- productTable
    category <- categoryTable if product.categoryid === category.id
    supplier <- supplierTable if product.supplierid === supplier.id
  } yield (product, category, supplier)).result
}
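If you prefer to keep the explicit join/on calls instead, the nested tuple can also be flattened by mapping over the query before running it; a sketch (untested) using the same tables as above:
def list(): Future[Seq[(Product, Category, Supplier)]] = db.run {
  productTable
    .join(categoryTable).on(_.categoryid === _.id)
    .join(supplierTable).on(_._1.supplierid === _.id)
    // Reshape ((product, category), supplier) into (product, category, supplier)
    .map { case ((product, category), supplier) => (product, category, supplier) }
    .result
}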
I would also recommend checking out Slick's support for foreign key queries. That would make the query quite a bit simpler; it would probably look something like this:
def list(): Future[Seq[(Product, Category, Supplier)]] = db.run {
  (for {
    product <- productTable
    category <- product.category
    supplier <- product.supplier
  } yield (product, category, supplier)).result
}
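For that foreign-key version to work, the product table definition needs category and supplier foreign-key helpers. Here is a sketch (untested) of what it might look like; the Products class name, the Product case class fields and the product_*_fk constraint names are assumptions, while the column names come from the CREATE TABLE above:
class Products(tag: Tag) extends Table[Product](tag, "product") {
  def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def price = column[Double]("price")
  def categoryid = column[Int]("categoryid")
  def supplierid = column[Int]("supplierid")
  // product.category and product.supplier in the query above expand to joins through these
  def category = foreignKey("product_category_fk", categoryid, categoryTable)(_.id)
  def supplier = foreignKey("product_supplier_fk", supplierid, supplierTable)(_.id)
  def * = (id, name, price, categoryid, supplierid) <> (Product.tupled, Product.unapply)
}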

Transforming Doobie ConnectionIO[Option[Int]] without explicit match

I have a ConnectionIO[Option[Int]] and want to map over the Option to produce a ConnectionIO[Option[String]], running a second query for the Some[Int] case and otherwise keeping the None. I was able to do this with a for comprehension and a match:
def findWidgetByOwner(name: String): ConnectionIO[Option[String]] = for {
opt <- sql"SELECT owner_id FROM owners WHERE name = $name".query[Int].option
widget <- opt match {
case None => None.pure[ConnectionIO]
case Some(id) => sql"SELECT widget_name FROM widgets WHERE owner_id = $id".query[String].option
}
} yield widget
I know I'm getting tripped up by the ConnectionIO container, but I can't find a cleaner approach than this to transform a ConnectionIO[Option[Int]] into a ConnectionIO[Option[String]].
It would be cleaner to join using SQL instead of Scala:
def findWidgetByOwner(name: String): ConnectionIO[Option[String]] =
  sql"""
    SELECT widgets.widget_name
    FROM widgets
    INNER JOIN owners ON widgets.owner_id = owners.owner_id
    WHERE owners.name = $name
  """.query[String].option
But if you want to clean up the original, some incantation of sequence would probably work (not tested):
import cats._, cats.data._, cats.implicits._
...
widget <- opt.map {id =>
sql"SELECT widget_name FROM widgets WHERE owner_id = $id".query[String].unique
}.sequence
Note: you have to change .query[String].option to .query[String].unique, otherwise widget becomes an Option[Option[String]]. That might be desirable if the widget query can return no row, but it then requires a .flatten at the end.
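Spelled out, here is a minimal sketch (untested) of the cleaned-up version; map followed by sequence is equivalent to traverse, and the import style assumes a recent doobie where the interpolator comes from doobie.implicits._:
import cats.implicits._
import doobie._
import doobie.implicits._
def findWidgetByOwner(name: String): ConnectionIO[Option[String]] =
  for {
    ownerId <- sql"SELECT owner_id FROM owners WHERE name = $name".query[Int].option
    // traverse == map + sequence: the widget query only runs when ownerId is a Some
    widget <- ownerId.traverse { id =>
      sql"SELECT widget_name FROM widgets WHERE owner_id = $id".query[String].unique
    }
  } yield widget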

Scala Slick 3.0.1 Relationship to self

I have an entity called Category which has a relationship to itself. There are two types of categories: parent categories and subcategories. A subcategory stores the id of its parent category in its idParent attribute.
I defined the schema this way:
class CategoriesTable(tag: Tag) extends Table[Category](tag, "CATEGORIES") {
def id = column[String]("id", O.PrimaryKey)
def name = column[String]("name")
def idParent = column[Option[String]]("idParent")
def * = (id, name, idParent) <> (Category.tupled, Category.unapply)
def categoryFK = foreignKey("category_fk", idParent, categories)(_.id.?)
def subcategories = TableQuery[CategoriesTable].filter(_.id === idParent)
}
And I have this data:
id       name     idParent
---------------------------
parent   Parent
child1   Child1   parent
child2   Child2   parent
Now I want to get the result in a map grouped by the parent category, like:
Map(
(parent,Parent,None) -> Seq[(child1,Child1,parent),(child2,Child2,parent)]
)
For that I tried the following query:
def findChildrenWithParents() = {
db.run((for {
c <- categories
s <- c.subcategories
} yield (c,s)).sortBy(_._1.name).result)
}
If at this point I execute the query with:
categoryDao.findChildrenWithParents().map {
case categoryTuples => categoryTuples.map(println _)
}
I get this:
(Category(child1,Child1,Some(parent)),Category(parent,Parent,None))
(Category(child2,Child2,Some(parent)),Category(parent,Parent,None))
There are two things here that disconcerted me:
It is returning Future[Seq[(Category, Category)]] instead of the Future[Seq[(Category, Seq[Category])]] that I would expect.
The order is inverted; I would expect the parent to appear first, like:
(Category(parent,Parent,None),Category(child1,Child1,Some(parent)))
(Category(parent,Parent,None),Category(child2,Child2,Some(parent)))
Now I try to group them. As I am having problems with nested queries in Slick, I perform the groupBy on the result instead, like this:
categoryDao.findChildrenWithParents().map {
case categoryTuples => categoryTuples.groupBy(_._2).map(println _)
}
But the result is really a mess:
(Category(parent,Parent,None),Vector((Category(child1,Child1,Some(parent)),Category(parent,Parent,None)),(Category(child2,Child2,Some(parent)),Category(parent,Parent,None))))
I would have expected:
(Category(parent,Parent,None),Vector(Category(child1,Child1,Some(parent)),Category(child2,Child2,Some(parent))))
Can you please help me with the inverted result and with the group by?
Thanks in advance.
OK, I managed to fix it by myself. Here is the answer, in case someone wants to learn from it:
def findChildrenWithParents() = {
val result = db.run((for {
c <- categories
s <- c.subcategories
} yield (c,s)).sortBy(_._1.name).result)
result map {
case categoryTuples => categoryTuples.groupBy(_._1).map{
case (k,v) => (k,v.map(_._2))
}
}
}
The solution isn't perfect. I would like to do the grouping in Slick itself, but this retrieves what I wanted.
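For completeness, a small usage sketch (untested) of consuming the Future[Map[Category, Seq[Category]]] this returns; it assumes the Category case class exposes id, name and idParent fields matching the table definition:
import scala.concurrent.ExecutionContext.Implicits.global
categoryDao.findChildrenWithParents().foreach { grouped =>
  grouped.foreach { case (parent, children) =>
    // One line per group: the key category followed by the names grouped under it
    println(s"${parent.name}: ${children.map(_.name).mkString(", ")}")
  }
}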

Slick 3.0.0 Select and Create or Update

I'm in a situation wherein I have to do a select first and then use the value to issue a create. It is some versioning that I'm trying to implement. Here is the table definition:
class Table1(tag: Tag) extends Table[(Int, String, Int)](tag, "TABLE1") {
  def id = column[Int]("ID")
  def name = column[String]("NAME")
  def version = column[Int]("VERSION")
  def indexCol = index("_a", (id, version))
  def * = (id, name, version) // default projection
  val tbl1Elems = TableQuery[Table1]
}
So when a request comes to create or update an entry in Table1, I have to do the following:
1. Select for the given id; if it exists, get the version
2. Increment the version
3. Create a new entry
All that should happen in a single transaction. Here is what I have got so far:
// This entry should first be checked: if the id exists, get the complete
// set of columns by applying a filter that returns the max version
val table1 = Table1(2, "some name", 1)
for {
tbl1: Table1 <- tbl1MaxVersionFilter(table1.id)
maxVersion: Column[Int] = tbl1.version
result <- tbl1Elems += table1.copy(version = maxVersion + 1) // can't use this!!!
} yield result
I will later wrap that entire block in one transaction. But I'm wondering how to complete it so that it creates a new version. How can I get the value maxVersion out of the Column so that I can add 1 to it and use it?
I would go with a static query, something like this:
import scala.slick.jdbc.{StaticQuery => Q}
def insertWithVersion(id: Int, name: String) =
  (Q.u + "insert into table1 select " +? id + ", " +? name +
    ", coalesce((select max(version) from table1 where id = " +? id + "), 0) + 1").execute
If you want to write it the Slick way, then take a look at the following:
val tableOne = TableQuery[Table1]
def updateWithVersion(newId: Int, name: String): Unit = {
  val version = tableOne.filter(_.id === newId).map(_.version).max.run.getOrElse(0) + 1
  tableOne += ((newId, name, version))
}
The idea is to compute the next version in the same query: increment the existing max version, or use 1 if there is none, and insert it. Also, as the whole logic of the static query is issued in a single statement, no extra transaction management is needed for it.
P.S. There might be some errors in the SQL and code.
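Since the question targets Slick 3.0.0, here is also a rough sketch (untested) of composing the select and the insert into one transactional DBIO action; it assumes the Table1 definition above (including the * projection), uses the H2 profile only as an example, and createNewVersion is a hypothetical helper:
import slick.driver.H2Driver.api._ // any JdbcProfile's api works; H2 is just an example
import scala.concurrent.ExecutionContext.Implicits.global
val tbl1 = TableQuery[Table1]
def createNewVersion(id: Int, name: String): DBIO[Int] =
  (for {
    // Highest existing version for this id, if any
    current <- tbl1.filter(_.id === id).map(_.version).sortBy(_.desc).result.headOption
    next = current.map(_ + 1).getOrElse(1)
    rows <- tbl1 += ((id, name, next))
  } yield rows).transactionally
// db.run(createNewVersion(2, "some name"))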