ERROR: Phantom-dsl BatchQuery Unspecified with Overloaded method

I am attempting to extend my application to include another Cassandra table for storing the Transactions included in each Block.
I have tried to keep the code snippets succinct and relevant. If further code context is required, just let me know.
phantomVersion = "1.22.0"
cassandraVersion = "2.1.4"
I am getting the following compilation error with the code listed below. Insights greatly appreciated.
[error] /home/dan/projects/open-blockchain/scanner/src/main/scala/org/dyne/danielsan/openblockchain/data/database/Database.scala:30: overloaded method value add with alternatives:
[error] (batch: com.websudos.phantom.batch.BatchQuery[_])com.websudos.phantom.batch.BatchQuery[com.websudos.phantom.builder.Unspecified] <and>
[error] (queries: Iterator[com.websudos.phantom.builder.query.Batchable with com.websudos.phantom.builder.query.ExecutableStatement])(implicit session: com.datastax.driver.core.Session)com.websudos.phantom.batch.BatchQuery[com.websudos.phantom.builder.Unspecified] <and>
[error] (queries: com.websudos.phantom.builder.query.Batchable with com.websudos.phantom.builder.query.ExecutableStatement*)(implicit session: com.datastax.driver.core.Session)com.websudos.phantom.batch.BatchQuery[com.websudos.phantom.builder.Unspecified] <and>
[error] (query: com.websudos.phantom.builder.query.Batchable with com.websudos.phantom.builder.query.ExecutableStatement)(implicit session: com.datastax.driver.core.Session)com.websudos.phantom.batch.BatchQuery[com.websudos.phantom.builder.Unspecified]
[error] cannot be applied to (scala.concurrent.Future[com.datastax.driver.core.ResultSet])
[error] .add(ChainDatabase.bt.insertNewBlockTransaction(bt))
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 6 s, completed Aug 9, 2016 2:42:30 PM
GenericBlockModel.scala:
case class BlockTransaction(hash: String, txid: String)

sealed class BlockTransactionModel extends CassandraTable[BlockTransactionModel, BlockTransaction] {

  override def fromRow(r: Row): BlockTransaction = {
    BlockTransaction(
      hash(r),
      txid(r)
    )
  }

  object hash extends StringColumn(this) with PartitionKey[String]
  object txid extends StringColumn(this) with ClusteringOrder[String] with Descending
}

abstract class ConcreteBlockTransactionModel extends BlockTransactionModel with RootConnector {
  override val tableName = "block_transactions"

  def insertNewBlockTransaction(bt: BlockTransaction): Future[ResultSet] = insertNewRecord(bt).future()

  def insertNewRecord(bt: BlockTransaction) = {
    insert
      .value(_.hash, bt.hash)
      .value(_.txid, bt.txid)
  }
}
Database.scala
class Database(val keyspace: KeySpaceDef) extends DatabaseImpl(keyspace) {

  def insertBlock(block: Block) = {
    Batch.logged
      .add(ChainDatabase.block.insertNewRecord(block))
      .future()
  }

  def insertTransaction(tx: Transaction) = {
    Batch.logged
      .add(ChainDatabase.tx.insertNewTransaction(tx))
      .future()
  }

  def insertBlockTransaction(bt: BlockTransaction) = {
    Batch.logged
      .add(ChainDatabase.btx.insertNewBlockTransaction(bt))
      .future()
  }

  object block extends ConcreteBlocksModel with keyspace.Connector
  object tx extends ConcreteTransactionsModel with keyspace.Connector
  object btx extends ConcreteBlockTransactionModel with keyspace.Connector
}

object ChainDatabase extends Database(Config.keySpaceDefinition)

The error is obviously that you are trying to add a Future to a Batch, when a Batch needs a query. Once you have already triggered a query, it is no longer possible to batch it, so you need to stop one step earlier and return the unexecuted query. Here's how:
def insertNewRecord(
  bt: BlockTransaction
): InsertQuery.Default[BlockTransactionModel, BlockTransaction] = {
  insert
    .value(_.hash, bt.hash)
    .value(_.txid, bt.txid)
}
Now you can add multiple records to a batch with:
Batch.logged
  .add(insertNewRecord(record1))
  .add(insertNewRecord(record2))
  // etc.
On a different note, a batch in Cassandra is not meant for doing parallel inserts; it is used to guarantee atomicity, which in general makes it at least 30% slower than normal parallel inserts. Read this for more details.
If you simply want to insert several records at the same time, you can use a method that returns a future, like this:
def insertMany(
  list: List[BlockTransaction]
): Future[List[ResultSet]] = {
  Future.sequence(list.map(insertNewRecord(_).future()))
}
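For example, a minimal usage sketch (bt1 and bt2 are hypothetical records, not from the original post):

// Fires both inserts concurrently and collects the results into one future.
val results: Future[List[ResultSet]] = insertMany(List(bt1, bt2))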

Related

Cannot resolve overloaded method 'startTimerWithFixedDelay'

I do not understand why I get an overload error for timers.startTimerWithFixedDelay. I passed 3 parameters to make sure the right method is chosen, but the compiler apparently still finds the method with just 2 parameters equally attractive.
package main

import akka.actor.typed.{ActorSystem, Behavior}
import akka.actor.typed.receptionist.Receptionist
import akka.actor.typed.scaladsl.Behaviors

import scala.concurrent.duration.{Duration, FiniteDuration, MINUTES}

object Guardian {
  case object Tick

  val start: Behavior[Nothing] =
    Behaviors.setup[Receptionist.Listing] { context =>
      Behaviors.withTimers { timers =>
        timers.startTimerWithFixedDelay(Tick, Tick, FiniteDuration(Duration("3 seconds").toSeconds, MINUTES))
        Behaviors.same
      }
    }.narrow
}

object Application extends App {
  ActorSystem[Nothing](Guardian.start, "system")
}
Inside TimerScheduler.scala it looks like the compiler cannot decide between:
def startTimerWithFixedDelay(msg: T, delay: FiniteDuration): Unit
def startTimerWithFixedDelay(key: Any, msg: T, delay: FiniteDuration): Unit
Why does it not take the one with 3 parameters?
Compiler Error message:
[error] ... overloaded method startTimerWithFixedDelay with alternatives:
[error] (msg: akka.actor.typed.receptionist.Receptionist.Listing,delay: scala.concurrent.duration.FiniteDuration)Unit <and>
[error] (key: Any,msg: akka.actor.typed.receptionist.Receptionist.Listing,delay: scala.concurrent.duration.FiniteDuration)Unit
Tick is not a Receptionist.Listing, so neither version matches: the behavior is set up with Behaviors.setup[Receptionist.Listing], which fixes T to Receptionist.Listing, and both overloads require msg: T.
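Not part of the original answer, but a minimal sketch of one way out, assuming the guardian can own its protocol: give it a dedicated Command trait so Tick is a valid message (Receptionist.Listing replies can later be adapted into this protocol with a message adapter). Note also that FiniteDuration(Duration("3 seconds").toSeconds, MINUTES) actually builds a 3-minute delay; 3.seconds is probably what was intended.

import akka.actor.typed.Behavior
import akka.actor.typed.scaladsl.Behaviors

import scala.concurrent.duration._

object Guardian {
  sealed trait Command
  case object Tick extends Command

  val start: Behavior[Nothing] =
    Behaviors.setup[Command] { context =>
      Behaviors.withTimers { timers =>
        // Tick is a Command, so T = Command and the 3-parameter
        // overload (key, msg, delay) resolves unambiguously.
        timers.startTimerWithFixedDelay("tick", Tick, 3.seconds)
        Behaviors.same
      }
    }.narrow
}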

Recursively wrapping method invocations with compiler plugins/macros

OUTLINE
I have an API that looks something like this:
package com.example

object ExternalApi {
  def create[T <: SpecialElement](elem: T): TypeConstructor[T] =
    TypeConstructor(elem)

  def create1[T <: SpecialElement](elem: T): TypeConstructor[T] =
    TypeConstructor(elem)

  def create2[T <: SpecialElement](elem: T): TypeConstructor[T] =
    TypeConstructor(elem)

  //...
}

object MyApi {
  def process[T <: TypeConstructor[_ <: SpecialElement]](
      l: T,
      metadata: List[String]): T = {
    println("I've been called!")
    //do some interesting stuff with the List's type parameter here
    l
  }
}

case class TypeConstructor[E](elem: E)

trait SpecialElement
The ExternalApi (which is actually external to my lib, so no modifying that) has a series of calls that I'd like to automatically wrap with MyApi.process calls, with the metadata argument derived from the final type of T.
To illustrate, the calls to be wrapped could take any form, including nested calls and calls within other AST subtree types (such as Blocks), e.g.:
package com.example.test

import com.example.{ExternalApi, SpecialElement}

object ApiPluginTest extends App {
  //one possible form
  val targetList = ExternalApi.create(Blah("name"))

  //and another
  ExternalApi.create2(ExternalApi.create1(Blah("sth")).elem)

  //and yet another
  val t = {
    val sub1 = ExternalApi.create(Blah("anything"))
    val sub2 = ExternalApi.create1(sub1.elem)
    sub2
  }
}

case class Blah(name: String) extends SpecialElement
Since compiler plugins handle matching structures within ASTs recursively "for free", I've decided to go with them.
However, because I need to match a specific type signature, the plugin runs after the typer phase.
Here's the code of the PluginComponent:
package com.example.plugin

import com.example.{SpecialElement, TypeConstructor}

import scala.tools.nsc.Global
import scala.tools.nsc.plugins.PluginComponent
import scala.tools.nsc.transform.Transform

class WrapInApiCallComponent(val global: Global)
    extends PluginComponent
    with Transform {
  protected def newTransformer(unit: global.CompilationUnit) =
    WrapInApiCallTransformer

  val runsAfter: List[String] = List("typer") //since we need the type
  val phaseName: String = WrapInApiCallComponent.Name

  import global._

  object WrapInApiCallTransformer extends Transformer {
    override def transform(tree: global.Tree) = {
      val transformed = super.transform(tree)
      transformed match {
        case call @ Apply(_, _) =>
          if (call.tpe != null && call.tpe.finalResultType <:< typeOf[
                TypeConstructor[_ <: SpecialElement]]) {
            println(s"Found relevant call $call")
            val typeArguments = call.tpe.typeArgs.map(_.toString).toList
            val listSymbOf = symbolOf[List.type]
            val wrappedFuncSecondArgument =
              q"$listSymbOf.apply(..$typeArguments)"
            val apiObjSymbol = symbolOf[com.example.MyApi.type]
            val wrappedCall =
              q"$apiObjSymbol.process[${call.tpe.finalResultType}]($call, $wrappedFuncSecondArgument)"
            //explicit typing, otherwise later phases throw NPEs etc.
            val ret = typer.typed(wrappedCall)
            println(showRaw(ret))
            println("----")
            ret
          } else {
            call
          }
        case _ => transformed
      }
    }
  }
}

object WrapInApiCallComponent {
  val Name = "api_embed_component"
}
This seems to resolve identifiers, as well as types, correctly, outputting e.g.:
Apply(TypeApply(Select(TypeTree().setOriginal(Ident(com.example.MyApi)), TermName("process")), List(TypeTree())), List(Apply(TypeApply(Select(Select(Select(Ident(com), com.example), com.example.MyApi), TermName("create")), List(TypeTree())), List(Apply(Select(Ident(com.example.test.Blah), TermName("apply")), List(Literal(Constant("name")))))), Apply(TypeApply(Select(TypeTree().setOriginal(Ident(scala.collection.immutable.List)), TermName("apply")), List(TypeTree())), List(Literal(Constant("com.example.test.Blah"))))))
Unfortunately, I get an error during compilation starting with:
scala.reflect.internal.FatalError:
[error]
[error] Unexpected tree in genLoad: com.example.MyApi.type/class scala.reflect.internal.Trees$TypeTree at: RangePosition([projectpath]/testPluginAutoWrap/compiler_plugin_test/src/main/scala/com/example/test/ApiPluginTest.scala, 108, 112, 112)
[error] while compiling: [projectpath]/testPluginAutoWrap/compiler_plugin_test/src/main/scala/com/example/test/ApiPluginTest.scala
[error] during phase: jvm
[error] library version: version 2.12.4
[error] compiler version: version 2.12.4
[error] reconstructed args: -Xlog-implicits -classpath [classpath here]
[error]
[error] last tree to typer: TypeTree(class String)
[error] tree position: line 23 of [projectpath]/testPluginAutoWrap/compiler_plugin_test/src/main/scala/com/example/test/ApiPluginTest.scala
QUESTION
It looks like I'm getting something wrong with the type definitions, but what is it?
Specifically:
How do I correctly wrap every ExternalApi.createX call with an MyApi.process call, constrained by the requirements provided above?
NOTES
Given the amount of boilerplate required, I've set up a complete example project. It's available here.
The answer does not have to define a compiler plugin. If you're able to cover all the relevant calls with a macro, that is fine as well.
Originally the wrapped call was to something like: def process[T <: TypeConstructor[_ <: SpecialElement] : TypeTag](l: T): T, the setup here is actually a workaround. So if you are able to generate a wrapped call of this type, i.e. one that includes a runtime TypeTag[T], that's fine as well.
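Not from the source, but one hedged, untested hypothesis: splicing Symbols (apiObjSymbol, listSymbOf) directly into quasiquotes leaves bare TypeTree/Ident nodes in term position, which the jvm phase's genLoad cannot compile; that would match the "Unexpected tree in genLoad: com.example.MyApi.type" message. Building the receivers as fully qualified selects and re-typing them may produce backend-friendly trees:

// Hypothetical replacement for the symbol-splicing lines above. The
// _root_-qualified paths are resolved by the typer into proper Select
// chains instead of raw symbol references; typeArguments still lift
// to string literals as in the original code.
val wrappedCall = q"""
  _root_.com.example.MyApi.process[${call.tpe.finalResultType}](
    $call, _root_.scala.List(..$typeArguments))
"""
val ret = typer.typed(wrappedCall)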

play slick updating enumeration column

I'm having trouble figuring out how to update a column with type enumeration using play-slick.
Here's my enum and case class:
object TestStatus extends Enumeration {
  type TestStatus = Value
  val Status1 = Value("Status1")
}

case class Test(
  id: String,
  status: TestStatus
)
and the table mapping:
class Tests(tag: Tag) extends Table[Test](tag, "tests") {
  implicit val statusColumn = MappedColumnType.base[TestStatus, String](_.toString, TestStatus.withName)

  override def * = (id, status) <> ((Test.apply _).tupled, Test.unapply)

  val id = column[String]("id", O.PrimaryKey)
  val status = column[TestStatus]("status")
}
When I try to update a row of Tests, I get an error:
object TestQueries extends TableQuery[Tests](new Tests(_)) {
  def updateStatus(id: String, newStatus: TestStatus) = {
    TestQueries.filter(_.id === id).map(_.status).update(newStatus)
  }
}
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection,
[error] you use an unsupported type in a Query (e.g. scala List),
[error] or you forgot to import a driver api into scope.
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: slick.lifted.Rep[models.TestStatus.Value]
[error] Unpacked type: T
[error] Packed type: G
[error] TestQueries.filter(_.id === id).map(_.status).update(newStatus)
[error] ^
IntelliJ is showing that TestQueries.filter(_.id === id).map(_.status) has type Query[Nothing, Nothing, Seq], which makes me think the problem is with the specific column rather than with the update function.
Updating the id works fine using the same structure.
You need to define a custom column type for TestStatus.Value. This is how Slick lets you build a custom column type: by mapping it to an already supported one:
implicit def testStatCT: BaseTypedType[TestStatus.Value] =
  MappedColumnType.base[TestStatus.Value, String](
    enum => enum.toString, str => TestStatus.withName(str)
  )
This implicit needs to be imported wherever an implicit resolution such as the one in your example fails (or, better yet, defined in the TestStatus object so that it's always available). That way Slick has evidence that TestStatus.Value is a BaseTypedType, which basically just means that it is a supported column type.
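A minimal sketch of the "define it in the object" option (untested; the profile import is an assumption, use whichever driver api your project already imports):

import slick.driver.PostgresDriver.api._ // assumption: swap in your project's profile
import slick.ast.BaseTypedType

object TestStatus extends Enumeration {
  type TestStatus = Value
  val Status1 = Value("Status1")

  // Living in the object, the implicit sits in the implicit scope of
  // TestStatus.Value, so no extra import is needed at call sites.
  implicit val testStatusColumnType: BaseTypedType[TestStatus] =
    MappedColumnType.base[TestStatus, String](_.toString, TestStatus.withName)
}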
For further reading on column mapping, have a look at the Slick documentation.

Why can Slick not store Option[String]?

I use Slick 2.0 RC1 in my Play Framework 2.2 project.
My table code is:
case class Resource(id: Option[Long] = None, owner: UserId, types: String)

// The static object that does the actual work - note the names of tables and
// fields in H2 are case sensitive and must be all caps
object Resources extends Table[Resource]("RESOURCE") {
  def id = column[Long]("ID", O.PrimaryKey, O.AutoInc)
  def owner = column[UserId]("Owner")
  def types = column[String]("Type")

  def withuser = foreignKey("User_FK", owner, Users)(_.id)

  // Every table needs a * projection with the same type as the table's type parameter
  def * = id.? ~ owner ~ types <> (Resource, Resource.unapply _)
}
The output error is:
[info] Compiling 16 Scala sources and 2 Java sources to C:\assigment\slick-advanced\target\scala-2.10\classes...
[error] C:\assigment\slick-advanced\app\models\Resource.scala:12: overloaded method constructor Table with alternatives:
[error] (_tableTag: scala.slick.lifted.Tag,_tableName: String)play.api.db.slick.Config.driver.Table[models.Resource] <and>
[error] (_tableTag: scala.slick.lifted.Tag,_schemaName: Option[String],_tableName: String)play.api.db.slick.Config.driver.Table[models.Resource]
[error] cannot be applied to (String)
[error] object Resources extends Table[Resource]( "RESOURCE") {
[error] ^
I know the documentation has it like:
object Resources(tag: Tag) extends TableTable[(Long, UserId, String)](tag, "RESOURCE") {
but after I change it, it still has an error:
[error] C:\assigment\slick-advanced\app\models\Resource.scala:12: traits or objects may not have parameters
[error] object Resources(tag: Tag) extends TableTable[(Long, UserId, String)](tag, "RESOURCE") {
[error] ^
[error] one error found
Check out the Scaladoc for Table for the constructors:
new Table(_tableTag: Tag, _tableName: String)
new Table(_tableTag: Tag, _schemaName: Option[String], _tableName: String)
The compiler is telling you that you have those choices but are using neither. You construct your instance of Table called Resources with just a String. As you can see, that isn't an option. No pun intended.
The documentation shows that your declaration should look more like this:
class Resources(tag: Tag) extends Table[(Long, UserId, String)](tag, "RESOURCE") {
  ...
}

val Resources = TableQuery[Resources]
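For this question's Resource mapping specifically, a fuller sketch might look like the following (untested; it assumes Slick 2.0's lifted embedding plus the Users table and UserId type from the question):

class Resources(tag: Tag) extends Table[Resource](tag, "RESOURCE") {
  def id = column[Long]("ID", O.PrimaryKey, O.AutoInc)
  def owner = column[UserId]("Owner")
  def types = column[String]("Type")

  def withuser = foreignKey("User_FK", owner, Users)(_.id)

  // id.? lifts the auto-increment column to Option[Long], matching the case class
  def * = (id.?, owner, types) <> (Resource.tupled, Resource.unapply)
}

It pairs with the same val Resources = TableQuery[Resources] shown above.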

Specifying a return type for Slick's MappedProjection

I'm using the PlayFramework 2.1.1 in combination with Slick 1.0.1 and the Play-Slick-Plugin 0.3.2.
Defining an abstract class that enforces my models to implement a "forInsert"-Mapper fails because I'm unable to specify the proper return type. My current definition results in the compile error below but I'm simply unable to track down this issue and provide the correct type.
import play.api.db.slick.Config.driver.KeysInsertInvoker

abstract class Model[M](val table: String) extends Table[M](table) {
  def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
  def forInsert: KeysInsertInvoker[M, Int]
}

object Course extends Model[Course]("course") {
  ...
  def forInsert = name ~ room <> (apply _, unapply _) returning id
}
[error] Course.scala:27: polymorphic expression cannot be instantiated to expected type;
[error] found : [RU]play.api.db.slick.Config.driver.KeysInsertInvoker[model.Course,RU]
[error] required: play.api.db.slick.Config.driver.KeysInsertInvoker[model.Course,Int]
[error] def forInsert = name ~ room <> ( apply _, unapply _ ) returning id
[error] ^
[error] one error found
[error] (sample/compile:compile) Compilation failed
[error] Total time: 3 s, completed 18.06.2013 03:38:24
abstract class Model[M](val table: String) extends Table[M](table) {
  def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
  def forInsert: scala.slick.driver.BasicInvokerComponent#KeysInsertInvoker[M, Int]
}
Not that difficult, in the end. Printing getClass on the concrete implementation revealed the actual type quite easily; an idea I didn't come up with last night.
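For illustration, a hypothetical concrete model against the widened signature (the Course case class fields and the name/room columns are assumptions, since the original only shows a fragment):

case class Course(name: String, room: String)

object Course extends Model[Course]("course") {
  def name = column[String]("name")
  def room = column[String]("room")

  def * = name ~ room <> (apply _, unapply _)

  // Satisfies the abstract member: the inferred type now matches
  // BasicInvokerComponent#KeysInsertInvoker[Course, Int].
  def forInsert = name ~ room <> (apply _, unapply _) returning id
}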