Slick 3.2 error: No matching Shape found - Scala

I'm not sure what is wrong here. The following code block is throwing an error:
(for {
  (e, r) <- tblDetail.joinLeft(tblMaster).on((e, r) => r.col1 === e.col3)
} yield e.id)
Error
No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection,
[error] you use an unsupported type in a Query (e.g. scala List),
[error] or you forgot to import a driver api into scope.
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: (slick.lifted.Rep[Int], slick.lifted.Rep[String],...)
[error] Unpacked type: T
[error] Packed type: G
[error] (e,r) <- tblDetail.joinLeft(tblMaster).on((e,r) => r.col1 === e.col3)
I checked the Slick tables for tblDetail and tblMaster; they seem fine.
tblMaster
class TblMaster(tag: Tag)
  extends Table[(Int, String, ...)](tag, "tbl_master") {
  def id = column[Int]("id")
  def col3 = column[String]("col3")
  def * = (id, col3)
}
tblDetail
class TblDetail(tag: Tag)
  extends Table[Entity](tag, "tbl_detail") {
  def id = column[Int]("id")
  def col1 = column[String]("col1")
  def * : ProvenShape[Entity] = (id, col1) <>
    ((Entity.apply _).tupled, Entity.unapply)
}
Any help would be appreciated.
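For comparison, a minimal sketch (with a hypothetical two-column master table) in which the shapes line up: the T in Table[T] must match the * projection exactly, and joinLeft yields the right-hand side as an Option:

class TblMaster(tag: Tag) extends Table[(Int, String)](tag, "tbl_master") {
  def id   = column[Int]("id")
  def col3 = column[String]("col3")
  def * = (id, col3) // matches Table[(Int, String)] exactly
}

// joinLeft yields (left, Rep[Option[right]]):
val q = for {
  (e, r) <- tblDetail joinLeft tblMaster on (_.col1 === _.col3)
} yield e.id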

Related

scala spark type mismatching

I need to group my RDD by two columns and aggregate the count. I have a function:
def constructDiagnosticFeatureTuple(diagnostic: RDD[Diagnostic])
  : RDD[FeatureTuple] = {
  val grouped_patients = diagnostic
    .groupBy(x => (x.patientID, x.code))
    .map(_._2)
    .map { events =>
      val p_id = events.map(_.patientID).take(1).mkString
      val f_code = events.map(_.code).take(1).mkString
      val count = events.size.toDouble
      ((p_id, f_code), count)
    }
  // should be in form:
  // diagnostic.sparkContext.parallelize(List((("patient", "diagnostics"), 1.0)))
}
At compile time, I am getting an error:
/FeatureConstruction.scala:38:3: type mismatch;
[error] found : Unit
[error] required: org.apache.spark.rdd.RDD[edu.gatech.cse6250.features.FeatureConstruction.FeatureTuple]
[error] (which expands to) org.apache.spark.rdd.RDD[((String, String), Double)]
[error] }
[error] ^
How can I fix it?
I read this post: Scala Spark type missmatch found Unit, required rdd.RDD, but I do not use collect(), so it does not help me.
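A likely fix, as a sketch (assuming type FeatureTuple = ((String, String), Double)): the function body binds the grouped RDD to a val and then ends, so it returns Unit; making the pipeline the last expression returns the RDD instead:

def constructDiagnosticFeatureTuple(diagnostic: RDD[Diagnostic]): RDD[FeatureTuple] =
  diagnostic
    .groupBy(x => (x.patientID, x.code))         // key by (patientID, code)
    .map { case ((patientID, code), events) =>
      ((patientID, code), events.size.toDouble)  // count per group
    }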

Scala Slick, how to create Schema ONLY if it does not exist

In Scala Slick, a database schema can be created with the following:
val schema = coffees.schema ++ suppliers.schema
db.run(DBIO.seq(
  schema.create
))
From the bottom of this documentation page: http://slick.typesafe.com/doc/3.0.0/schemas.html
However, if the database schema already exists then this throws an exception.
Is there a normal way or right way to create the schema IF AND ONLY IF it does not already exist?
In Slick 3.3.0, the createIfNotExists and dropIfExists schema methods were added. So:
db.run(coffees.schema.createIfNotExists)
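These compose the same way create does, so the combined schema from the question can be guarded in one call (a small sketch, same Slick 3.3 assumption):

val schema = coffees.schema ++ suppliers.schema
db.run(schema.createIfNotExists)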
I googled this question and tried several solutions from the answers until I figured it out. This is what I do for multiple tables, with Slick 3.1.1 and Postgres:
import slick.driver.PostgresDriver.api._
import slick.jdbc.meta.MTable
import scala.concurrent.Await
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global
val t1 = TableQuery[Table1]
val t2 = TableQuery[Table2]
val t3 = TableQuery[Table3]
val tables = List(t1, t2, t3)
val existing = db.run(MTable.getTables)
val f = existing.flatMap(v => {
  val names = v.map(mt => mt.name.name)
  val createIfNotExist = tables.filter(table =>
    !names.contains(table.baseTableRow.tableName)).map(_.schema.create)
  db.run(DBIO.sequence(createIfNotExist))
})
Await.result(f, Duration.Inf)
With Slick 3.0, MTable.getTables is a DBAction, so something like this would work:
val coffees = TableQuery[Coffees]
try {
  Await.result(db.run(DBIO.seq(
    MTable.getTables map (tables => {
      if (!tables.exists(_.name.name == coffees.baseTableRow.tableName))
        coffees.schema.create
    })
  )), Duration.Inf)
} finally db.close
As JoshSGoman's comment on Mike-s's answer points out, the table is not created. I managed to make it work by slightly modifying the first answer's code:
val coffees = TableQuery[Coffees]
try {
  def createTableIfNotInTables(tables: Vector[MTable]): Future[Unit] = {
    if (!tables.exists(_.name.name == coffees.baseTableRow.tableName)) {
      db.run(coffees.schema.create)
    } else {
      Future.successful(())
    }
  }
  val createTableIfNotExist: Future[Unit] =
    db.run(MTable.getTables).flatMap(createTableIfNotInTables)
  Await.result(createTableIfNotExist, Duration.Inf)
} finally db.close
With the following imports:
import slick.jdbc.meta.MTable
import slick.driver.SQLiteDriver.api._
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global
Why don't you simply check for existence before creating? In Slick 3 the metadata query is itself an action, so the check has to happen inside the action (here checking the coffees table, as in the answers above):
val schema = coffees.schema ++ suppliers.schema
db.run(MTable.getTables.flatMap { tables =>
  if (!tables.exists(_.name.name == coffees.baseTableRow.tableName))
    schema.create
  else
    DBIO.successful(())
})
I cannot use createIfNotExists on a schema composed of 3 tables with a composite primary key on one of the tables. Here, the 3rd table has a primary key composed from the primary key of each of the 1st and 2nd tables. I get an error on this schema when .createIfNotExists is encountered a 2nd time. I am using Slick 3.3.1 on Scala 2.12.8.
class UserTable(tag: Tag) extends Table[User](tag, "user") {
  def id = column[Long]("id", O.AutoInc, O.PrimaryKey)
  def name = column[String]("name")
  def email = column[Option[String]]("email")
  def * = (id.?, name, email).mapTo[User]
}

val users = TableQuery[UserTable]
lazy val insertUser = users returning users.map(_.id)

case class Room(title: String, id: Long = 0L)

class RoomTable(tag: Tag) extends Table[Room](tag, "room") {
  def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def title = column[String]("title")
  def * = (title, id).mapTo[Room]
}

val rooms = TableQuery[RoomTable]
lazy val insertRoom = rooms returning rooms.map(_.id)

case class Occupant(roomId: Long, userId: Long)

class OccupantTable(tag: Tag) extends Table[Occupant](tag, "occupant") {
  def roomId = column[Long]("room")
  def userId = column[Long]("user")
  def pk = primaryKey("room_user_pk", (roomId, userId))
  def * = (roomId, userId).mapTo[Occupant]
}

val occupants = TableQuery[OccupantTable]
I can successfully create the schema and add a user, room and occupant at first. On the second usage of .createIfNotExists, as below, I get an error about a duplicate primary key:
println("\n2nd run on .createIfNotExists using different values for users, rooms and occupants")
val initdup = for {
  _ <- users.schema.createIfNotExists
  _ <- rooms.schema.createIfNotExists
  _ <- occupants.schema.createIfNotExists
  curlyId <- insertUser += User(None, "Curly", Some("curly#example.org"))
  larryId <- insertUser += User(None, "Larry")
  moeId <- insertUser += User(None, "Moe", Some("moe#example.org"))
  shedId <- insertRoom += Room("Shed")
  _ <- occupants += Occupant(shedId, curlyId)
  _ <- occupants += Occupant(shedId, moeId)
} yield ()
The exception is as below:
2nd run on .createIfNotExists using different values for users, rooms and occupants
[error] (run-main-2) org.h2.jdbc.JdbcSQLException: Constraint "room_user_pk" already exists; SQL statement:
[error] alter table "occupant" add constraint "room_user_pk" primary key("room","user") [90045-197]
[error] org.h2.jdbc.JdbcSQLException: Constraint "room_user_pk" already exists; SQL statement:
[error] alter table "occupant" add constraint "room_user_pk" primary key("room","user") [90045-197]
[error] at org.h2.message.DbException.getJdbcSQLException(DbException.java:357)
[error] at org.h2.message.DbException.get(DbException.java:179)
[error] at org.h2.message.DbException.get(DbException.java:155)
[error] at org.h2.command.ddl.AlterTableAddConstraint.tryUpdate(AlterTableAddConstraint.java:110)
[error] at org.h2.command.ddl.AlterTableAddConstraint.update(AlterTableAddConstraint.java:78)
[error] at org.h2.command.CommandContainer.update(CommandContainer.java:102)
[error] at org.h2.command.Command.executeUpdate(Command.java:261)
[error] at org.h2.jdbc.JdbcPreparedStatement.execute(JdbcPreparedStatement.java:249)
[error] at slick.jdbc.JdbcActionComponent$SchemaActionExtensionMethodsImpl$$anon$6.$anonfun$run$7(JdbcActionComponent.scala:292)
[error] at slick.jdbc.JdbcActionComponent$SchemaActionExtensionMethodsImpl$$anon$6.$anonfun$run$7$adapted(JdbcActionComponent.scala:292)
[error] at slick.jdbc.JdbcBackend$SessionDef.withPreparedStatement(JdbcBackend.scala:425)
[error] at slick.jdbc.JdbcBackend$SessionDef.withPreparedStatement$(JdbcBackend.scala:420)
[error] at slick.jdbc.JdbcBackend$BaseSession.withPreparedStatement(JdbcBackend.scala:489)
[error] at slick.jdbc.JdbcActionComponent$SchemaActionExtensionMethodsImpl$$anon$6.$anonfun$run$6(JdbcActionComponent.scala:292)
[error] at slick.jdbc.JdbcActionComponent$SchemaActionExtensionMethodsImpl$$anon$6.$anonfun$run$6$adapted(JdbcActionComponent.scala:292)
[error] at scala.collection.Iterator.foreach(Iterator.scala:941)
[error] at scala.collection.Iterator.foreach$(Iterator.scala:941)
[error] at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
[error] at scala.collection.IterableLike.foreach(IterableLike.scala:74)
[error] at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
[error] at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
[error] at slick.jdbc.JdbcActionComponent$SchemaActionExtensionMethodsImpl$$anon$6.run(JdbcActionComponent.scala:292)
[error] at slick.jdbc.JdbcActionComponent$SchemaActionExtensionMethodsImpl$$anon$6.run(JdbcActionComponent.scala:290)
[error] at slick.jdbc.JdbcActionComponent$SimpleJdbcProfileAction.run(JdbcActionComponent.scala:28)
[error] at slick.jdbc.JdbcActionComponent$SimpleJdbcProfileAction.run(JdbcActionComponent.scala:25)
[error] at slick.basic.BasicBackend$DatabaseDef$$anon$3.liftedTree1$1(BasicBackend.scala:276)
[error] at slick.basic.BasicBackend$DatabaseDef$$anon$3.run(BasicBackend.scala:276)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] Nonzero exit code: 1
[error] (Compile / run) Nonzero exit code: 1
Additionally, I can use .createIfNotExists more than once on a schema where all tables use the O.PrimaryKey convention.
Can I do something to massage the code? Is there a workaround so that .createIfNotExists is still usable in the composite primary key case?
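One possible workaround, sketched along the lines of the MTable-based answers above (assuming an implicit ExecutionContext is in scope): guard only the composite-key table with an explicit metadata check, so the ALTER TABLE for "room_user_pk" is never issued twice:

import slick.jdbc.meta.MTable

val initdup = for {
  _ <- users.schema.createIfNotExists
  _ <- rooms.schema.createIfNotExists
  existing <- MTable.getTables
  // create the occupant table (and its composite PK) only when absent
  _ <- if (existing.exists(_.name.name == occupants.baseTableRow.tableName))
         DBIO.successful(())
       else
         occupants.schema.create
  // ... inserts as in the original for-comprehension
} yield ()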

Use SQL in DStream.transform() over Spark Streaming?

There are some examples of using SQL over Spark Streaming in foreachRDD(). But if I want to use SQL in transform():
case class AlertMsg(host:String, count:Int, sum:Double)
val lines = ssc.socketTextStream("localhost", 8888)
lines.transform(rdd => {
  if (rdd.count > 0) {
    val t = sqc.jsonRDD(rdd)
    t.registerTempTable("logstash")
    val sqlreport = sqc.sql("SELECT host, COUNT(host) AS host_c, AVG(lineno) AS line_a FROM logstash WHERE path = '/var/log/system.log' AND lineno > 70 GROUP BY host ORDER BY host_c DESC LIMIT 100")
    sqlreport.map(r => AlertMsg(r(0).toString, r(1).toString.toInt, r(2).toString.toDouble))
  } else {
    rdd
  }
}).print()
I got this error:
[error] /Users/raochenlin/Downloads/spark-1.2.0-bin-hadoop2.4/logstash/src/main/scala/LogStash.scala:52: no type parameters for method transform: (transformFunc: org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[U])(implicit evidence$5: scala.reflect.ClassTag[U])org.apache.spark.streaming.dstream.DStream[U] exist so that it can be applied to arguments (org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[_ >: LogStash.AlertMsg with String <: java.io.Serializable])
[error] --- because ---
[error] argument expression's type is not compatible with formal parameter type;
[error] found : org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[_ >: LogStash.AlertMsg with String <: java.io.Serializable]
[error] required: org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[?U]
[error] lines.transform( rdd => {
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
It seems the only correct usage is sqlreport.map(r => r.toString)?
dstream.transform takes a function transformFunc: (RDD[T]) ⇒ RDD[U].
In this case, the if must produce the same type on both branches of the condition, which is not the case:
if (count == 0) => RDD[String]
if (count > 0) => RDD[AlertMsg]
To fix it, remove the if rdd.count ... optimization so that you have a single transformation path, as sketched below.
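A sketch of that single path (sqc, AlertMsg and the query as in the question):

lines.transform { rdd =>
  val t = sqc.jsonRDD(rdd)
  t.registerTempTable("logstash")
  val sqlreport = sqc.sql("SELECT host, COUNT(host) AS host_c, AVG(lineno) AS line_a FROM logstash WHERE path = '/var/log/system.log' AND lineno > 70 GROUP BY host ORDER BY host_c DESC LIMIT 100")
  // every invocation now yields an RDD[AlertMsg]
  sqlreport.map(r => AlertMsg(r(0).toString, r(1).toString.toInt, r(2).toString.toDouble))
}.print()

Alternatively, keep the guard but have the else branch return rdd.sparkContext.emptyRDD[AlertMsg], so both branches share the element type.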

Action.async and using a for-comprehension in a separate function

This might be more Scala than Play related, but I have this issue within some Play code.
I have
def index = Action.async { implicit request =>
  val content1Future = Blogs.getAllBlogs(request)
  val sidebar1Future = Blogs.getAllBlogsOverview(request)
  for {
    sidebar1 <- sidebar1Future
    content1 <- content1Future
    sidebar1Body <- Pagelet.readBody(sidebar1)
    content1Body <- Pagelet.readBody(content1)
  } yield {
    val sidebarcontents = List(sidebar1Body)
    val contents = List(content1Body)
    Ok(views.html.main("Index", contents, sidebarcontents)).withHeaders(("Cache-Control", "no-cache"))
  }
}
and it works fine.
Now I try to extract the for-comprehension into a separate function dealing with the 'sidebar', and I want to pass in the 'content'.
Like
def index = Action.async { implicit request =>
  standardMain(Blogs.getAllBlogs(request))
}

def standardMain(content1Future: => Future[Result])(implicit request: Request[_]): Future[Result] = {
  val sidebar1Future = Blogs.getAllBlogsOverview(request)
  for {
    sidebar1 <- sidebar1Future
    content1 <- content1Future
    sidebar1Body <- Pagelet.readBody(sidebar1)
    content1Body <- Pagelet.readBody(content1)
  } yield {
    val sidebarcontents = List(sidebar1Body)
    val contents = List(content1Body)
    Ok(views.html.main("Index", contents, sidebarcontents)).withHeaders(("Cache-Control", "no-cache"))
  }
}
But then I get
[info] Compiling 1 Scala source to D:\Dropbox\Playground\PlayWorld\play-with-forms\target\scala-2.11\classes...
[error] D:\Dropbox\Playground\PlayWorld\play-with-forms\app\controllers\Application.scala:91: type mismatch;
[error] found : scala.concurrent.Future[play.api.mvc.Result]
[error] required: play.api.libs.iteratee.Iteratee[Array[Byte],?]
[error] content1 <- content1Future
[error] ^
[error] D:\Dropbox\Playground\PlayWorld\play-with-forms\app\controllers\Application.scala:90: type mismatch;
[error] found : play.api.libs.iteratee.Iteratee[Array[Byte],Nothing]
[error] required: scala.concurrent.Future[play.api.mvc.Result]
[error] sidebar1 <- sidebar1Future
[error] ^
[error] two errors found
[error] (compile:compile) Compilation failed
[error] application -
I tried many different calls, but somehow I do not know how to get a play.api.libs.iteratee.Iteratee[Array[Byte],?].
How can I achieve this? Thanks.

Scala specs2: mocking a trait method always returns a NullPointerException

I have a trait that I want to mock and then use in another service during testing. The problem is that I receive a NullPointerException when I try to mock the return value of the indexDocument function.
Test method:
"createDemand must return None if writing to es fails" in new WithApplication {
val demandDraft = DemandDraft(UserId("1"), "socken bekleidung wolle", Location(Longitude(52.468562), Latitude(13.534212)), Distance(30), Price(25.0), Price(77.0))
val es = mock[ElasticsearchClient]
val sphere = mock[SphereClient]
val productTypes = mock[ProductTypes]
sphere.execute(any[ProductCreateCommand]) returns Future.successful(product)
productTypes.demand returns ProductTypeBuilder.of("demand", ProductTypeDrafts.demand).build()
// this line throws the nullpointer exception
es.indexDocument(any[IndexName], any[TypeName], any[JsValue]) returns Future.failed(new RuntimeException("test exception"))
val demandService = new DemandService(es, sphere, productTypes)
demandService.createDemand(demandDraft) must be (Option.empty[Demand]).await
}
Trait:
sealed trait ElasticsearchClient {
  implicit def convertListenableActionFutureToScalaFuture[T](x: ListenableActionFuture[T]): Future[T] = {
    val p = Promise[T]()
    x.addListener(new ActionListener[T] {
      def onFailure(e: Throwable) = p.failure(e)
      def onResponse(response: T) = p.success(response)
    })
    p.future
  }

  lazy val client = createElasticsearchClient()

  def close(): Unit
  def createElasticsearchClient(): Client

  def indexDocument(esIndex: IndexName, esType: TypeName, doc: JsValue): Future[IndexResponse] =
    client.prepareIndex(esIndex.value, esType.value).setSource(doc.toString()).execute()

  def search(esIndex: IndexName, esType: TypeName, query: QueryBuilder): Future[SearchResponse] =
    client.prepareSearch(esIndex.value).setTypes(esType.value).setQuery(query).execute()
}
Exception
[error] NullPointerException: (DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$$anonfun$8.apply(DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$$anonfun$8.apply(DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2.delayedEndpoint$services$DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$1(DemandServiceSpec.scala:89)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2$delayedInit$body.apply(DemandServiceSpec.scala:81)
[error] play.api.test.WithApplication$$anonfun$around$1.apply(Specs.scala:23)
[error] play.api.test.WithApplication$$anonfun$around$1.apply(Specs.scala:23)
[error] play.api.test.PlayRunners$class.running(Helpers.scala:49)
[error] play.api.test.Helpers$.running(Helpers.scala:403)
[error] play.api.test.WithApplication.around(Specs.scala:23)
[error] play.api.test.WithApplication.delayedInit(Specs.scala:20)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8$$anon$2.<init>(DemandServiceSpec.scala:81)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8.apply(DemandServiceSpec.scala:81)
[error] services.DemandServiceSpec$$anonfun$1$$anonfun$apply$8.apply(DemandServiceSpec.scala:81)
Please let me know if you need additional information.
I found out that the any[] matchers in the indexDocument call are the problem. When I replace them with the actual values, it works:
"createDemand must return None if writing to es fails and deleteDemand should be called once with correct parameters" in new WithApplication {
val demandDraft = DemandDraft(UserId("1"), "socken bekleidung wolle", Location(Longitude(52.468562), Latitude(13.534212)), Distance(30), Price(25.0), Price(77.0))
val es = mock[ElasticsearchClient]
val sphere = mock[SphereClient]
val productTypes = mock[ProductTypes]
sphere.execute(any[ProductCreateCommand]) returns Future.successful(product)
sphere.execute(any[ProductDeleteByIdCommand]) returns Future.successful(product)
productTypes.demand returns ProductTypeBuilder.of("demand", ProductTypeDrafts.demand).build()
es.indexDocument(IndexName("demands"), TypeName("demands"), Json.toJson(demand)) returns Future.failed(new RuntimeException("test exception"))
val demandService = new DemandService(es, sphere, productTypes)
demandService.createDemand(demandDraft) must be (Option.empty[Demand]).await
}
I've had this happen a whole bunch of times and work around it by creating a class (rather than a trait) to feed to mock:
trait SomeTraitYouWantToMock {
  …
}

class MockableSomeTraitYouWantToMock extends SomeTraitYouWantToMock

val whatever = mock[MockableSomeTraitYouWantToMock]
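Applied to the trait from the question, that would look something like the sketch below. Note that because ElasticsearchClient is sealed, the helper class has to live in the same source file as the trait; the abstract members get throwaway implementations since the mock replaces them anyway:

// in the same source file as the sealed trait
class MockableElasticsearchClient extends ElasticsearchClient {
  def close(): Unit = ()
  def createElasticsearchClient(): Client = ??? // never called on the mock
}

// in the test
val es = mock[MockableElasticsearchClient]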