How to parameterize table name in Slick - scala

class MyTable(tag: Tag) extends Table[MyEntity](tag, "1970Table") {
  def id = column[Int]("id")
  override def * = id <> (MyEntity.tupled, MyEntity.unapply)
}
val myTable = TableQuery[MyTable]
class MyRepository(val config: DatabaseConfig[JdbcProfile])
  extends MyRepository[MyTable, String] {
  override val table: config.profile.api.TableQuery[MyTable] = myTable
  def insert(me: MyEntity): Future[Int] = {
    db.run(table += me)
  }
}
I use this in my other classes like this:
val myRepository = new MyRepository(dbConfig)
myRepository.insert(myrecord)
Question
I would like not to have a hardcoded table name but rather make the table name dynamic.
I would like to change the insert method so that it accepts a year (Int) parameter and, based on that year, chooses the right table: if the year passed in is 1970 the table name is 1970Table, but if the year passed in is 1980 the table is 1980Table.

Try
class MyRepository(val config: DatabaseConfig[JdbcProfile]) {
  import config._
  import profile.api._

  abstract class MyTable(tag: Tag, name: String) extends Table[MyEntity](tag, name) {
    def id = column[Int]("id")
    override def * = id <> (MyEntity.tupled, MyEntity.unapply)
  }

  class Table1970(tag: Tag) extends MyTable(tag, "1970Table")
  class Table1980(tag: Tag) extends MyTable(tag, "1980Table")

  val table1970 = TableQuery[Table1970]
  val table1980 = TableQuery[Table1980]

  def insert(me: MyEntity, year: Int): Future[Int] = db.run {
    year match {
      case 1970 => table1970 += me
      case 1980 => table1980 += me
    }
  }
}
Now
val myRepository = new MyRepository(dbConfig)
myRepository.insert(myrecord, 1970)

There are two apply methods in TableQuery. val myTable = TableQuery[MyTable] -
this one uses a macro to create MyTable.
The other one is defined like this:
def apply[E <: AbstractTable[_]](cons: Tag => E): TableQuery[E] =
  new TableQuery[E](cons)
So you can do something like this:
class MyTable(tag: Tag, tableName: String) extends Table[MyEntity](tag, tableName)
...
def myTable(name: String) = TableQuery[MyTable](tag => new MyTable(tag, name))
Now you can predefine all the tables you need and use them, or do something like this:
class MyRepository(val config: DatabaseConfig[JdbcProfile])
  extends MyRepository[MyTable, String] {
  override def table(year: Int): config.profile.api.TableQuery[MyTable] = myTable(s"${year}Table")
  def insert(me: MyEntity, year: Int): Future[Int] = {
    db.run(table(year) += me)
  }
}
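Putting the answer together, here is a minimal, self-contained sketch of the idea. The names YearPartitionedRepository and tableFor are illustrative, MyEntity is assumed to be a single-field case class, and the imports assume a Slick 3.2+ package layout:

import scala.concurrent.Future
import slick.basic.DatabaseConfig
import slick.jdbc.JdbcProfile

case class MyEntity(id: Int)

class YearPartitionedRepository(val config: DatabaseConfig[JdbcProfile]) {
  import config._
  import profile.api._

  // One row-mapping class; only the table name varies per year.
  class MyTable(tag: Tag, name: String) extends Table[MyEntity](tag, name) {
    def id = column[Int]("id")
    def * = id <> (MyEntity.apply _, MyEntity.unapply _)
  }

  // Build a TableQuery for an arbitrary year via the (Tag => E) apply overload.
  private def tableFor(year: Int): TableQuery[MyTable] =
    TableQuery[MyTable](tag => new MyTable(tag, s"${year}Table"))

  def insert(me: MyEntity, year: Int): Future[Int] =
    db.run(tableFor(year) += me)
}

Compared to predefining Table1970 and Table1980, this builds the TableQuery on demand, at the cost of not having fixed per-year table classes around for schema generation.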

Related

How to override the variable studentName given in the class within the changeName method, so that the new value is used in useName when I call changeName?

class Person {
  val studentName = "Arpana"
  def changeName(id: String, name: String) = {
    val studentName = name
    useName(id)
  }
  def useName(id: String) = {
    println(s"use name is $id, by $studentName")
  }
}
object Person {
  def main(args: Array[String]): Unit = {
    (new Person).changeName("2", "Shubham")
  }
}
I don't want to use var in the code. Can this be done with keywords? I tried keywords like super, protected, private, and final, but they didn't work.
Actually, I want to apply this in the code below.
abstract class BaseRepository[T <: BaseModel : ClassTag : StriveSerializer] {
  self: BaseConnection =>

  val tableName: String = implicitly[ClassTag[T]].runtimeClass.getSimpleName
  private val serializer = implicitly[StriveSerializer[T]]

  private def executeInserts(query: String): Future[Boolean] = Future {
    val preparedStatement = self.connection.prepareStatement(query)
    preparedStatement.execute()
  }

  def exist(id: String, name: String): Future[Boolean] = {
    val tableName = name
    val promise = Promise[Boolean]
    queryById(id).onComplete {
      case Success(_) => promise.success(true)
      case Failure(ex) => promise.failure(ex)
    }
    promise.future
  }

  def queryById(id: String): Future[T] = {
    val getSql = s"SELECT * FROM $tableName WHERE id == $id;"
    executeReads(getSql).map(serializer.fromResultSet)
  }
}
I want the table name passed to the exist function to override the table name used inside the queryById method.
It seems like a bit of a mix of Java and Scala style. I tried to refactor it a bit, assuming the intention behind the code. Try it and see if this achieves what you want to do:
class Person(_id: String, _studentName: String) {
  private val id: String = _id
  private val studentName: String = _studentName

  def useName() = {
    println(s"use name is $id, by $studentName")
  }
}
object Person extends App {
  new Person("2", "Shubham").useName()
}
I think you should use a case class for the model:
case class Student(id: String, name: String)

def changeId(student: Student, newId: String): Student = {
  student.copy(id = newId)
}

val s1 = Student("1", "A")
val newS1 = changeId(s1, "2")
I think it is okay to use a mutable field in a class, e.g.:
class MySuperService {
  var lastHeartbeat: Option[Timestamp] = None

  def setLastHeartbeat(ts: Timestamp): Unit = {
    lastHeartbeat = Some(ts)
  }
}
val mss1 = new MySuperService()
mss1.setLastHeartbeat(???)
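Applied to the BaseRepository in the question above, the same idea is to pass the table name down as a parameter instead of shadowing the val. A hedged sketch of just the two affected methods, dropped into that class (it reuses the question's executeReads, serializer and Promise plumbing; the default argument keeps the old behaviour):

def queryById(id: String, table: String = tableName): Future[T] = {
  // Note: standard SQL equality is '=', and binding id through a PreparedStatement
  // parameter would be safer than string interpolation.
  val getSql = s"SELECT * FROM $table WHERE id = $id;"
  executeReads(getSql).map(serializer.fromResultSet)
}

def exist(id: String, name: String): Future[Boolean] = {
  val promise = Promise[Boolean]
  queryById(id, table = name).onComplete {
    case Success(_)  => promise.success(true)
    case Failure(ex) => promise.failure(ex)
  }
  promise.future
}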

Scala reflection: issues instantiating a class with an embedded external library in these Spark schema structs

I have seen lots of websites about the Scala reflection library, but none of them gives a straightforward answer on how to instantiate an object of a class at runtime.
For example, I have the following code:
trait HydraTargetTable {
  val inputTables = Seq.empty[Int]
  val tableType: String
  val tableName: String
  val schema: StructType
  def className: String = this.getClass.toString
}

trait HydraIntermediateTable extends HydraTargetTable {
  val tableType = "Intermediate"
  def build(textRDD: RDD[String]): DataFrame = {
    DataframeUtils.safeParseFromSchema(textRDD, schema)
  }
}

class Table1 extends HydraIntermediateTable {
  override val inputTables: Seq[Int] = Seq(1, 2)
  override val tableName: String = ""
  override val schema: StructType = new StructType()
}
At runtime, I want to be able to instantiate an object of Table1 given the class name as a String value. Here is my reflection code.
object ReflectionTestApp {
  def tableBuilder(name: String): HydraIntermediateTable = {
    Class.forName("hearsay.hydra.dataflow.api." + name).newInstance()
      .asInstanceOf[HydraIntermediateTable]
  }

  def hydraTableBuilder(name: String): HydraTargetTable = {
    val action = Class
      .forName("hearsay.hydra.dataflow.api." + name).newInstance()
    action.asInstanceOf[HydraTargetTable]
  }

  def main(args: Array[String]): Unit = {
    hydraTableBuilder("Table1").inputTables.foreach(println)
  }
}
Here is how you can achieve reflection for Object/Class.
package reflection

trait Table {
  val id: Int
}

class ActivityTable extends Table {
  val id = 10
}

object ActivityTable2 extends Table {
  val id = 10
}

object Reflection extends App {
  val obj = activityTableBuilder("ActivityTable")
  println(obj.id) // output 10

  val obj2 = objectBuilder("ActivityTable2$")
  println(obj2.id) // output 10

  /*
   class reflection
   */
  def activityTableBuilder(name: String): Table = {
    val action = Class.forName("reflection." + name).newInstance()
    action.asInstanceOf[Table]
  }

  /*
   object reflection
   */
  def objectBuilder(name: String): Table = {
    val action = Class.forName("reflection." + name)
    action.getField("MODULE$").get(classOf[Table]).asInstanceOf[Table]
  }
}
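As an alternative to reading the MODULE$ field, the scala-reflect runtime mirror API can instantiate classes and look up objects by name as well. A sketch under the same package layout (MirrorReflection, classInstance and objectInstance are illustrative names; note that staticModule takes the object name without the trailing $):

import scala.reflect.runtime.{universe => ru}

object MirrorReflection {
  private val mirror = ru.runtimeMirror(getClass.getClassLoader)

  // Instantiate a class by its fully qualified name via its primary constructor.
  def classInstance[A](fqcn: String): A = {
    val classSymbol = mirror.staticClass(fqcn)
    val ctor = classSymbol.primaryConstructor.asMethod
    mirror.reflectClass(classSymbol).reflectConstructor(ctor).apply().asInstanceOf[A]
  }

  // Look up a top-level object (module) by its fully qualified name.
  def objectInstance[A](fqcn: String): A =
    mirror.reflectModule(mirror.staticModule(fqcn)).instance.asInstanceOf[A]

  // Usage:
  // val t  = classInstance[Table]("reflection.ActivityTable")
  // val t2 = objectInstance[Table]("reflection.ActivityTable2")
}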

How to create slick projection for list of nested case class?

I am using Play 2.6.6, Scala 2.12.3 and Slick 3.0.0.
I initially had the following case class structure, where there was a nested case class:
case class Device(id: Int, deviceUser: Option[DeviceUser] =None)
case class DeviceUser(name: Option[String] = None)
So I had created the following projection for the Device class:
class DevicesTable(tag: Tag) extends Table[Device](tag, "DEVICES") {
  def id = column[Int]("ID", O.PrimaryKey)
  def name = column[Option[String]]("NAME")

  def deviceUser = name.<>[Option[DeviceUser]](
    { (param: Option[String]) =>
      param match {
        case Some(name) => Some(DeviceUser(Some(name)))
        case None => None
      }
    },
    { (t: Option[DeviceUser]) =>
      t match {
        case Some(user) => Some(user.name)
        case None => None
      }
    }
  )

  def * = (id, deviceUser).<>(Device.tupled, Device.unapply)
}
The above setup was working fine; I could easily store and retrieve data using the above projection. But now my requirement has changed and I need to store a list of the nested case class, so the class structure is now as follows:
case class Device(id: Int, deviceUser: Option[List[DeviceUser]] =None)
case class DeviceUser(name: Option[String] = None)
Is there some way I could define a projection for the field deviceUser: Option[List[DeviceUser]]?
Update: I am looking for more of a non-relational approach here.
Since nobody has suggested a solution so far, I am sharing the approach that I am using right now. It works, but it is of course not the best solution. In particular, I want to avoid using Await here and would like to develop a generic implicit parser.
Also, I had to create a separate DeviceUsersTable.
case class DeviceUser(id: Int, name: Option[String] = None)

class DeviceUserRepo @Inject()(protected val dbConfigProvider: DatabaseConfigProvider) {
  val dbConfig = dbConfigProvider.get[JdbcProfile]
  val db = dbConfig.db
  import dbConfig.profile.api._

  val DeviceUsers = TableQuery[DeviceUserTable]

  private def _findById(id: Int): DBIO[Option[DeviceUser]] =
    DeviceUsers.filter(_.id === id).result.headOption

  def findById(id: Int): Future[Option[DeviceUser]] =
    db.run(_findById(id))

  def all: Future[List[DeviceUser]] =
    db.run(DeviceUsers.to[List].result)

  def create(deviceUser: DeviceUser): Future[Int] = {
    db.run(DeviceUsers returning DeviceUsers.map(_.id) += deviceUser)
  }

  class DeviceUserTable(tag: Tag) extends Table[DeviceUser](tag, "DEVICE_USERS") {
    def id = column[Int]("ID", O.PrimaryKey)
    def name = column[Option[String]]("NAME")
    def * = (id, name).<>(DeviceUser.tupled, DeviceUser.unapply)
  }
}
And the original DevicesTable now looks like this:
class DevicesTable(tag: Tag) extends Table[Device](tag, "DEVICES") {

  implicit val deviceUserConverter = MappedColumnType.base[Option[List[DeviceUser]], String](
    deviceUsersOpt => {
      deviceUsersOpt match {
        case Some(users: List[DeviceUser]) =>
          val listOfId = users.map { k =>
            val res = deviceUserRepo.create(k)
            Await.result(res, 10 seconds)
          }
          listOfId.mkString(",")
        case None => ""
      }
    },
    str => {
      val listOfIds = (str split "," map Integer.parseInt).toList.filterNot(k => k.equals(""))
      if (listOfIds.nonEmpty) {
        val users = listOfIds.map { k =>
          val res = deviceUserRepo.findById(k)
          Await.result(res, 10 seconds)
        }
        Some(users.flatten)
      } else {
        None
      }
    }
  )

  def id = column[Int]("ID", O.PrimaryKey)
  def deviceUser = column[Option[List[DeviceUser]]]("DEVICE_USERS")
  def * = (id, deviceUser).<>(Device.tupled, Device.unapply)
}
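For the non-relational direction mentioned in the update, one way to drop the Await calls entirely is to serialize the users into the column itself instead of into a second table. A minimal sketch, assuming the original single-field DeviceUser and a naive comma-delimited encoding (lossy for users without a name or with commas in names; a JSON codec would be more robust). The H2Profile import is only a stand-in for whichever profile is actually configured:

import slick.jdbc.H2Profile.api._ // substitute your actual profile's api

case class DeviceUser(name: Option[String] = None)
case class Device(id: Int, deviceUser: Option[List[DeviceUser]] = None)

class DevicesTable(tag: Tag) extends Table[Device](tag, "DEVICES") {

  // Encode the users inline as a delimited string; no second table, no blocking calls.
  implicit val deviceUsersColumnType: BaseColumnType[Option[List[DeviceUser]]] =
    MappedColumnType.base[Option[List[DeviceUser]], String](
      {
        case Some(users) => users.flatMap(_.name).mkString(",")
        case None        => ""
      },
      str =>
        if (str.isEmpty) None
        else Some(str.split(",").toList.map(n => DeviceUser(Some(n))))
    )

  def id = column[Int]("ID", O.PrimaryKey)
  def deviceUser = column[Option[List[DeviceUser]]]("DEVICE_USERS")
  def * = (id, deviceUser) <> (Device.tupled, Device.unapply)
}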

Slick select all rows for table with no filtering

How can I get a collection of JurisdictionRow objects? I need a SELECT * FROM jurisdiction
object JurisdictionRepo extends {
  val profile = slick.driver.MySQLDriver
} with JurisdictionRepo

trait JurisdictionRepo {
  private val dbConfig: DatabaseConfig[MySQLDriver] = DatabaseConfig.forConfig("pnga-master-data")
  private val db = dbConfig.db
  val profile: slick.driver.JdbcProfile
  val tableName = "jurisdiction"

  def add(jurisdictionRow: JurisdictionRow): Future[Unit] = db.run(query += jurisdictionRow).map { _ => () }
  def delete(id: String): Future[Int] = db.run(query.filter(_.id === id).delete)
  def get(id: String): Future[Option[JurisdictionRow]] = db.run(query.filter(_.id === id).result.headOption)
  def all() = ???

  import profile.api._

  lazy val schema: profile.SchemaDescription = query.schema

  case class JurisdictionRow(id: String,
                             parentId: String,
                             name: String,
                             code: String)

  class Jurisdiction(_tableTag: Tag) extends Table[JurisdictionRow](_tableTag, tableName) {
    val id: Rep[String] = column[String](s"${tableName}_id", O.PrimaryKey, O.Length(36, varying = true))
    val parentId: Rep[String] = column[String]("parent_id", O.Length(36, varying = true))
    val name: Rep[String] = column[String]("name", O.Length(255, varying = true))
    val code: Rep[String] = column[String]("code", O.Length(255, varying = true))
    def * = (id, parentId, name, code) <> (JurisdictionRow.tupled, JurisdictionRow.unapply _)
  }

  lazy val query = new TableQuery(tag => new Jurisdiction(tag))
}
I would like to implement the all method to return all possible JurisdictionRow objects in the table. This seems like a common case, but the Slick documentation has not been helpful. I just need a plain old result set, no fancy filtering, etc.
Just replicate what you already have in the other queries but without the filter part.
def all = db.run(query.result)
Have a look at the first example:
http://slick.lightbend.com/doc/3.2.0/gettingstarted.html#querying
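With an explicit return type, and with to[List] if a List is preferred over the default Seq, the method could look like this inside the trait (allAsList is just an illustrative name):

def all(): Future[Seq[JurisdictionRow]] = db.run(query.result)

// Or, if a List is preferred over the default Seq:
def allAsList(): Future[List[JurisdictionRow]] = db.run(query.to[List].result)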

Scala Slick 2.0: converting from Query[Mixed, T#TableElementType] to Query[T, T#TableElementType]

Assume we have a table
class Entry(tag :Tag) extends Table[(String, Long)](tag, "entries") {
def name = column[String]("name")
def value = column[Long]("value")
def * = (name, value)
}
val Entries = new TableQuery(new Entry(_))
and a query of type Query[(Column[String], Column[Long]), (String, Long)]. Can I somehow convert it to Query[Entry, (String, Long)]? This would be very useful in the case of grouping queries such as Entries.groupBy(_.name).map(g => (g._1, g._2.map(_.value).avg)).
Try this:
case class Entry(name: String, value: Long)

class Entries(tag: Tag) extends Table[Entry](tag, "entries") {
  def name = column[String]("name")
  def value = column[Long]("value")
  def * = (name, value) <> (Entry.tupled, Entry.unapply)
}

val Entries = TableQuery[Entries]
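Note that groupBy still produces tuple-shaped rows even with the mapped table; a common pattern is to run the aggregation and lift the tuples into a small result case class afterwards. A sketch against the mapped Entries above, using the Slick 2.0 blocking session API (NameAvg is just an illustrative name):

import scala.slick.driver.H2Driver.simple._ // substitute your actual Slick 2.0 driver

case class NameAvg(name: String, avgValue: Option[Long])

// avg over an empty group is NULL, hence the Option[Long].
def averagesByName(implicit session: Session): List[NameAvg] =
  Entries
    .groupBy(_.name)
    .map { case (name, group) => (name, group.map(_.value).avg) }
    .list
    .map((NameAvg.apply _).tupled)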