When I "sbt run" the following code,
package com.example

import java.io.ObjectInputStream
import java.io.ObjectOutputStream
import java.io.FileInputStream
import java.io.FileOutputStream

object SimpleFailure extends App {
  case class MyClass(a: String, b: Int, c: Double)

  def WriteObjectToFile[A](obj: A, filename: String) {
    val output = new ObjectOutputStream(new FileOutputStream(filename, false))
    output.writeObject(obj)
  }

  def ReadObjectFromFile[A](filename: String)(implicit m: Manifest[A]): A = {
    val obj = new ObjectInputStream(new FileInputStream(filename)) readObject

    obj match {
      case a if m.erasure.isInstance(a) => a.asInstanceOf[A]
      case _ => sys.error("Type not what was expected when reading from file")
    }
  }

  val orig = MyClass("asdf", 42, 2.71)
  val filename = "%s/delete_me.spckl".format(System.getProperty("user.home"))
  WriteObjectToFile(List(orig), filename)
  val loaded = try {
    ReadObjectFromFile[List[MyClass]](filename)
  } catch { case e => e.printStackTrace; throw e }
  println(loaded(0))
}
I get the following exception:
java.lang.ClassNotFoundException: com.example.SimpleFailure$MyClass
However, I can run the code fine in Eclipse with the Scala plugin. Is this an SBT bug? Interestingly, the problem only comes up when wrapping MyClass in a List (see how "orig" is wrapped in a List in the WriteObjectToFile call). If I don't wrap in a List, everything works fine.
Put this in your build.sbt or project file:
fork in run := true
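On recent sbt versions (1.x) the same setting is written with the slash syntax; a one-line build.sbt sketch, assuming a modern sbt:

// sbt 1.x equivalent of `fork in run := true`
run / fork := true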
The problem seems to be with the classloader that gets used when sbt loads your code. The ObjectInputStream documentation describes its default classloader resolution, which walks the stack. Normally this ends up finding the loader associated with your program, but in this case it ends up using the wrong one.
I was able to work around this by including the following class in my code, and using it instead of ObjectInputStream directly.
package engine;

import java.io.InputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;

class LocalInputStream extends ObjectInputStream {
    LocalInputStream(InputStream in) throws IOException {
        super(in);
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws ClassNotFoundException {
        return Class.forName(desc.getName(), false,
            this.getClass().getClassLoader());
    }
}
This overrides the resolveClass method so that it always uses the classloader associated with this particular class. As long as this class is part of your app, this should work.
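For reference, a minimal Scala sketch of the same workaround wired into the question's ReadObjectFromFile helper. This is an illustrative port of the Java class above, using the same Scala 2.9-era Manifest API as the question; the SerializationHelpers wrapper object is just a hypothetical container so the snippet compiles on its own.

import java.io.{FileInputStream, InputStream, ObjectInputStream, ObjectStreamClass}

// Same idea as the Java class above: resolve classes against this class's own loader.
class LocalInputStream(in: InputStream) extends ObjectInputStream(in) {
  override protected def resolveClass(desc: ObjectStreamClass): Class[_] =
    Class.forName(desc.getName, false, this.getClass.getClassLoader)
}

object SerializationHelpers {
  // Drop-in replacement for the question's ReadObjectFromFile: only the
  // stream construction changes.
  def ReadObjectFromFile[A](filename: String)(implicit m: Manifest[A]): A = {
    val obj = new LocalInputStream(new FileInputStream(filename)).readObject()
    obj match {
      case a if m.erasure.isInstance(a) => a.asInstanceOf[A]
      case _ => sys.error("Type not what was expected when reading from file")
    }
  }
}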
BTW, this is not only faster than requiring fork in run, it also works with the Play framework, which currently doesn't support forking in dev mode.
I was able to reproduce this too using sbt 0.10.1 and scalaVersion := "2.9.0-1". You should probably just report it on GitHub or bring it up on the mailing list.
I am trying to do a very basic serialization of a very simple case class in Scala:
import org.scalatest.wordspec.AnyWordSpecLike
import java.io.{ByteArrayOutputStream, ObjectOutputStream}

class PersistenceSpec extends AnyWordSpecLike {

  case class TestClass(name: String) extends Serializable

  def serializeSomething(): ByteArrayOutputStream = {
    val testItem = TestClass("My Thing")
    val bos: ByteArrayOutputStream = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(testItem)
    bos
  }

  "serializeSomething" when {
    "executed" must {
      "successfully serialize" in {
        val outputStream = serializeSomething()
        println(outputStream.toString())
      }
    }
  }
}
When I run this test I get a java.io.NotSerializableException on the call to oos.writeObject(testItem), which makes no sense, since case classes automatically implement Serializable, and this is the simplest possible example.
However, if I paste the code for TestClass and serializeSomething() into the REPL, I am able to call the function and it works just fine.
What is different when calling my function via ScalaTest vs. the REPL that would cause this exception?
One final note: if I change the call from oos.writeObject(testItem) to oos.writeObject("Hello"), it works fine, even when run from ScalaTest.
You need to define TestClass outside of PersistenceSpec.
Inner class instances automatically get a reference to the instance of the outer class. So, when you write it out, it tries to serialize the PersistenceSpec instance as well, and that of course fails.
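For example, a minimal sketch of the fix applied to the spec above, moving the case class to the top level of the file:

import org.scalatest.wordspec.AnyWordSpecLike
import java.io.{ByteArrayOutputStream, ObjectOutputStream}

// Top level now: no hidden reference to the enclosing spec instance.
// Case classes are already Serializable, so no explicit extends is needed.
case class TestClass(name: String)

class PersistenceSpec extends AnyWordSpecLike {

  def serializeSomething(): ByteArrayOutputStream = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(TestClass("My Thing"))
    oos.close()
    bos
  }

  "serializeSomething" when {
    "executed" must {
      "successfully serialize" in {
        val outputStream = serializeSomething()
        println(outputStream.toString())
      }
    }
  }
}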
I am trying to write a custom codec to convert Cassandra columns of type timestamp to org.joda.time.DateTime.
I am building my project with sbt version 0.13.13.
I wrote a test that serializes and deserializes a DateTime object. When I run the test via the command line with sbt "test:testOnly *DateTimeCodecTest", the project builds and the test passes.
However, if I try to build the project inside Intellij, I receive the following error:
Error:(17, 22) overloaded method constructor TypeCodec with alternatives:
(x$1: com.datastax.driver.core.DataType,x$2: shade.com.datastax.spark.connector.google.common.reflect.TypeToken[org.joda.time.DateTime])com.datastax.driver.core.TypeCodec[org.joda.time.DateTime] <and>
(x$1: com.datastax.driver.core.DataType,x$2: Class[org.joda.time.DateTime])com.datastax.driver.core.TypeCodec[org.joda.time.DateTime]
cannot be applied to (com.datastax.driver.core.DataType, com.google.common.reflect.TypeToken[org.joda.time.DateTime])
object DateTimeCodec extends TypeCodec[DateTime](DataType.timestamp(), TypeToken.of(classOf[DateTime]).wrap()) {
Here is the codec:
import java.nio.ByteBuffer

import com.datastax.driver.core.exceptions.InvalidTypeException
import com.datastax.driver.core.{ DataType, ProtocolVersion, TypeCodec }
import com.google.common.reflect.TypeToken
import org.joda.time.{ DateTime, DateTimeZone }

/**
 * Provides serialization between Cassandra types and org.joda.time.DateTime
 *
 * Reference for writing custom codecs in Scala:
 * https://www.datastax.com/dev/blog/writing-scala-codecs-for-the-java-driver
 */
object DateTimeCodec extends TypeCodec[DateTime](DataType.timestamp(), TypeToken.of(classOf[DateTime]).wrap()) {

  override def serialize(value: DateTime, protocolVersion: ProtocolVersion): ByteBuffer = {
    if (value == null) return null
    val millis: Long = value.getMillis
    TypeCodec.bigint().serializeNoBoxing(millis, protocolVersion)
  }

  override def deserialize(bytes: ByteBuffer, protocolVersion: ProtocolVersion): DateTime = {
    val millis: Long = TypeCodec.bigint().deserializeNoBoxing(bytes, protocolVersion)
    new DateTime(millis).withZone(DateTimeZone.UTC)
  }

  // Do we need a formatter?
  override def format(value: DateTime): String = value.getMillis.toString

  // Do we need a formatter?
  override def parse(value: String): DateTime = {
    try {
      if (value == null ||
        value.isEmpty ||
        value.equalsIgnoreCase("NULL")) throw new Exception("Cannot produce a DateTime object from empty value")
      // Do we need a formatter?
      else new DateTime(value)
    } catch {
      // TODO: Determine the more specific exception that would be thrown in this case
      case e: Exception =>
        throw new InvalidTypeException(s"""Cannot parse DateTime from "$value"""", e)
    }
  }
}
and here is the test:
import com.datastax.driver.core.ProtocolVersion
import org.joda.time.{ DateTime, DateTimeZone }
import org.scalatest.FunSpec

class DateTimeCodecTest extends FunSpec {

  describe("Serialization") {
    it("should serialize between Cassandra types and org.joda.time.DateTime") {
      val now = new DateTime().withZone(DateTimeZone.UTC)
      val result = DateTimeCodec.deserialize(
        // TODO: Verify correct ProtocolVersion for DSE 5.0
        DateTimeCodec.serialize(now, ProtocolVersion.V4), ProtocolVersion.V4
      )
      assertResult(now)(result)
    }
  }
}
I make extensive use of the debugger within IntelliJ, as well as the ability to quickly run a single test using hotkeys. Losing the ability to compile within the IDE is almost as bad as losing the ability to compile at all. Any help would be appreciated, and I am more than happy to provide any additional information about my project / environment if anyone needs it.
Edit, update:
The project compiles within IntelliJ if I provide an instance of com.google.common.reflect.TypeToken as opposed to shade.com.datastax.spark.connector.google.common.reflect.TypeToken.
However, this breaks the build within sbt.
You must create a default constructor for DateTimeCodec.
I resolved the issue.
It stemmed from conflicting versions of spark-cassandra-connector on the classpath: both shaded and unshaded versions of the dependency were present, and removing the shaded one fixed the problem.
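For illustration, the general shape of excluding a conflicting transitive artifact in sbt looks like the sketch below. The coordinates are placeholders, not the actual modules from this project; substitute whatever a dependency-tree inspection shows is pulling in the shaded copy of the connector.

// build.sbt sketch -- the organization / module names here are hypothetical.
libraryDependencies += ("com.example" %% "some-connector-library" % "1.0.0")
  .exclude("com.example.shaded", "shaded-cassandra-connector")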
I am trying to migrate to the latest phantom version, 2.24.8. I created a dummy project, but I'm running into a few issues that I can't figure out. Here's my code:
import com.outworkers.phantom.connectors.{CassandraConnection, ContactPoints}
import com.outworkers.phantom.database.Database
import scala.concurrent.Future
import com.outworkers.phantom.dsl._

case class Test(id: String, timestamp: String)

abstract class Tests extends Table[Tests, Test] {
  object id extends StringColumn with PartitionKey
  object timestamp extends StringColumn with ClusteringOrder
}

abstract class ConcreteTests extends Tests with RootConnector {
  def addTest(l: Test): Future[ResultSet] = {
    // store(l).consistencyLevel_=(ConsistencyLevel.LOCAL_ONE).future
    insert.value(_.id, l.id)
      .value(_.timestamp, l.timestamp)
      .consistencyLevel_=(ConsistencyLevel.QUORUM).future
  }
}

class MyDB(override val connector: CassandraConnection) extends Database[MyDB](connector) {
  object tests extends ConcreteTests with connector.Connector

  def init(): Unit = {
    tests.create
  }
}

object Test {
  def main(args: Array[String]): Unit = {
    val db = new MyDB(ContactPoints(Seq("127.0.0.1")).keySpace("tests"))
    db.init
    db.tests.addTest(Test("1", "1323234234"))
    println("Done")
  }
}
It compiles and runs in IntelliJ and prints 'Done'. However, no table is ever created, and there are no exceptions or warnings; it simply does nothing. If I stop the local Cassandra database, the code throws a NoHostAvailableException, so it does try to connect to the local database. What is the problem?
Another weird thing is that "com.typesafe.play" %% "play-json" % "2.6.9" is in my build.sbt. If I remove the library, the same code throws the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/runtime/package$
at com.outworkers.phantom.column.AbstractColumn.com$outworkers$phantom$column$AbstractColumn$$_name(AbstractColumn.scala:55)
at com.outworkers.phantom.column.AbstractColumn.com$outworkers$phantom$column$AbstractColumn$$_name$(AbstractColumn.scala:54)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name$lzycompute(Column.scala:22)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name(Column.scala:22)
at com.outworkers.phantom.column.AbstractColumn.name(AbstractColumn.scala:58)
at com.outworkers.phantom.column.AbstractColumn.name$(AbstractColumn.scala:58)
at com.outworkers.phantom.column.Column.name(Column.scala:22)
at com.outworkers.phantom.builder.query.InsertQuery.value(InsertQuery.scala:107)
I really cannot figure out what's going on. Any help?
BTW, I'm using Scala 2.12.6 and JVM 1.8.181.
You're not using the correct DSL method for table creation; have a look at the official guide. All that table.create does is create an empty CreateQuery, and you're coercing the return type to Unit manually.
The automated blocking create method is on Database, not on table, so what you want is:
class MyDB(override val connector: CassandraConnection) extends Database[MyDB](connector) {
  object tests extends ConcreteTests with connector.Connector

  def init(): Unit = {
    this.create
  }
}
If you want to achieve the same thing using table, you need:
class MyDB(override val connector: CassandraConnection) extends Database[MyDB](connector) {
  object tests extends ConcreteTests with connector.Connector

  def init(): Unit = {
    import scala.concurrent.duration._
    import scala.concurrent.Await
    Await.result(tests.create.future(), 10.seconds)
  }
}
It's only the call to the future() method that triggers any action against the database; otherwise you're just building a query without executing it. The method name can be confusing, and we will improve the docs in future releases to make this more obvious.
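For completeness, a sketch of the question's main adjusted so it blocks until both the schema creation and the insert have finished before printing. The 10-second timeout is an arbitrary choice, and init() is assumed to be one of the blocking variants shown above.

import scala.concurrent.Await
import scala.concurrent.duration._

object Test {
  def main(args: Array[String]): Unit = {
    val db = new MyDB(ContactPoints(Seq("127.0.0.1")).keySpace("tests"))
    db.init()  // now performs a blocking create, as in either variant above
    // addTest already calls future(); block here so the insert completes
    // before we report success.
    Await.result(db.tests.addTest(Test("1", "1323234234")), 10.seconds)
    println("Done")
  }
}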
The conflict with play 2.6.9 looks very weird; it's entirely possible there's an incompatible dependency behind the scenes to do with macro compilation. Raise that as a separate issue and we can definitely have a look at it.
I'm pretty new to Scala, Play, and Quill and I'm not sure what I'm doing wrong. I have my project split up into models, repositories, and services (and controllers, but that is not relevant for this question). Right now, I'm getting this error for the lines in my services that are making changes to the database:
exception during macro expansion: scala.reflect.macros.TypecheckException: Can't find implicit `Decoder[models.AgentId]`. Please, do one of the following things:
1. ensure that implicit `Decoder[models.AgentId]` is provided and there are no other conflicting implicits;
2. make `models.AgentId` `Embedded` case class or `AnyVal`.
And I'm getting this error for all the other lines in my services:
exception during macro expansion: [error] scala.reflect.macros.TypecheckException: not found: value quote
I found a similar ticket, but the same fix does not work for me (I am already requiring ctx as an implicit variable, so I can't import it as well). I'm totally at a loss, and if anyone has any suggestions I would be happy to try anything. I'm using the following versions:
Scala 2.12.4
Quill 2.3.2
Play 2.6.6
The code:
db/package.scala
package db

import io.getquill.{PostgresJdbcContext, SnakeCase}

package object db {
  class DBContext(config: String) extends PostgresJdbcContext(SnakeCase, config)

  trait Repository {
    val ctx: DBContext
  }
}
repositories/AgentsRepository.scala
package repositories

import db.db.Repository
import models.{Agent, AgentId}

trait AgentsRepository extends Repository {
  import ctx._

  val agents = quote {
    query[Agent]
  }

  def agentById(id: AgentId) = quote {
    agents.filter(_.id == lift(id))
  }

  def insertAgent(agent: Agent) = quote {
    query[Agent].insert(
      _.identifier -> lift(agent.identifier)
    ).returning(_.id)
  }
}
services/AgentsService.scala
package services

import db.db.DBContext
import models.{Agent, AgentId}
import repositories.AgentsRepository

import scala.concurrent.ExecutionContext

class AgentService(implicit val ex: ExecutionContext, val ctx: DBContext)
  extends AgentsRepository {

  def list: List[Agent] =
    ctx.run(agents)

  def find(id: AgentId): List[Agent] =
    ctx.run(agentById(id))

  def create(agent: Agent): AgentId = {
    ctx.run(insertAgent(agent))
  }
}
models/Agent.scala
package models

import java.time.LocalDateTime

case class AgentId(value: Long) extends AnyVal

case class Agent(
  id: AgentId
  , identifier: String
)
I am already requiring ctx as an implicit variable, so I can't import it as well
You don't need to import the context itself, but everything inside it, in order to make it work:
import ctx._
Make sure to place it before ctx.run is called, as in https://github.com/getquill/quill/issues/998#issuecomment-352189214
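For example, applied to the AgentService from the question (reusing the imports and definitions shown above), the import sits inside the class, before any ctx.run call; a minimal sketch:

class AgentService(implicit val ex: ExecutionContext, val ctx: DBContext)
  extends AgentsRepository {

  // Brings the context's quote/lift machinery and encoders/decoders into scope
  // for the ctx.run calls below.
  import ctx._

  def list: List[Agent] = ctx.run(agents)

  def find(id: AgentId): List[Agent] = ctx.run(agentById(id))

  def create(agent: Agent): AgentId = ctx.run(insertAgent(agent))
}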
I am reading this tutorial on Scala macro code generation:
http://www.michaelpollmeier.com/2016/12/01/scalameta-code-generation-tutorial
Based on this, I wrote my macro class:
import scala.annotation.StaticAnnotation
import scala.meta._

class Foo extends StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    val q"..$mods class $tName (..$params) extends $template" = defn
    q"""
      ..$mods class $tName(..$params) {
        def sayMyName(): String = "Hello"
      }
    """
  }
}
Now I decorated my classes with the new annotation:
package com.abhi

object Models {
  @Foo case class Bar(...)
  @Foo case class Baz(...)
}
If I write the following two lines
val x = Bar("test")
println(x.sayMyName())
and then compile using sbt (sbt macros/compile root/compile) and run with sbt run, the code runs fine and I see that Hello is printed.
However, the same code does not compile inside IntelliJ IDEA (Ultimate 2016.3.3).