I need help implementing a notification model using phantom and Cassandra. What I have done so far:
import java.util.UUID
import com.websudos.phantom.dsl._
import com.websudos.phantom.connectors.Connector
import org.joda.time.DateTime
import scala.concurrent.Future
case class Notification(
  id: UUID,
  userId: UUID,
  timestamp: DateTime,
  read: Boolean,
  actionUser: List[String],
  verb: String,
  itemId: UUID,
  collectionId: String
)
sealed class NotificationTable extends CassandraTable[NotificationTable, Notification] {

  object id extends UUIDColumn(this) with ClusteringOrder[UUID] with Ascending
  object userId extends StringColumn(this) with PartitionKey[String]
  object timestamp extends DateTimeColumn(this) with ClusteringOrder[DateTime] with Descending
  object read extends BooleanColumn(this)
  object actionUser extends ListColumn[NotificationTable, Notification, String](this)
  object verb extends StringColumn(this)
  object itemId extends UUIDColumn(this)
  object collectionId extends StringColumn(this)

  def fromRow(row: Row): Notification =
    Notification(
      id(row),
      userId(row),
      timestamp(row),
      read(row),
      actionUser(row),
      verb(row),
      itemId(row),
      collectionId(row)
    )
}
object NotificationTable extends NotificationTable with Connector {

  override def keySpace: String = "test"

  implicit val keyspace: com.websudos.phantom.connectors.KeySpace =
    com.websudos.phantom.connectors.KeySpace("test")

  def insertItem(item: Notification): Future[ResultSet] =
    insert
      .value(_.id, item.id)
      .value(_.userId, item.userId)
      .value(_.timestamp, item.timestamp)
      .value(_.read, item.read)
      .value(_.actionUser, item.actionUser)
      .value(_.verb, item.verb)
      .value(_.itemId, item.itemId)
      .value(_.collectionId, item.collectionId)
      .future()
}
Somehow, I have to define two keyspaces: one for RootConnector and one for the insert statement. This is close to this example, yet my code does not compile. I know that they are using an abstract class there and hence it compiles.
My question is: how would I go about using that abstract class? I want to just call the insert statement from another Scala source file.
You are missing the fact that you are meant to use RootConnector instead of a random Connector trait there. The reason that class is abstract is that it should only be instantiated inside a Database object.
Have a look at this tutorial for more details, but in short, notice the RootConnector mixin here:
abstract class ConcreteNotificationTable extends NotificationTable with RootConnector
And then:
class MyDatabase(val connector: KeySpaceDef) extends Database(connector) {
  // And here you inject the real session and keyspace into the table.
  object notifications extends ConcreteNotificationTable with connector.Connector
}
Then you do something like this:
object MyDatabase extends MyDatabase(ContactPoint.local.keySpace("my_app"))
And from every other source file:
val notification = Notification(id /* etc. */)
MyDatabase.notifications.insertItem(notification)
And for even better separation of concerns, as shown in the tutorial:
trait DbProvider extends DatabaseProvider {
  def database: MyDatabase
}

trait ProductionDbProvider extends DbProvider {
  // This now points to your database object.
  override val database = MyDatabase
}
Then every single place that needs a database can mix in either DbProvider or directly ProductionDbProvider, as in the sketch below. Read the tutorial for more details; this is not a super trivial topic and all the details are already there.
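For example, a consumer might look like this (a sketch; NotificationService and createNotification are hypothetical names, not part of the tutorial):

import com.websudos.phantom.dsl._
import scala.concurrent.Future

// A hypothetical service that depends only on the DbProvider trait, so a
// test variant can supply a different database than the production object.
trait NotificationService extends DbProvider {
  def createNotification(notification: Notification): Future[ResultSet] =
    database.notifications.insertItem(notification)
}

// Production wiring: mix in the provider that points at the real object.
object ProductionNotificationService extends NotificationService with ProductionDbProvider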
Try:
import java.util.UUID
import com.websudos.phantom.dsl._
import com.websudos.phantom.connectors.Connector
import org.joda.time.DateTime
import scala.concurrent.Future
case class Notification(
  id: UUID,
  userId: UUID,
  timestamp: DateTime,
  read: Boolean,
  actionUser: List[String],
  verb: String,
  itemId: UUID,
  collectionId: String
)
// Use a normal class for the table definition:
class NotificationTable extends CassandraTable[NotificationTable, Notification] {

  object id extends UUIDColumn(this) with ClusteringOrder[UUID] with Ascending
  object userId extends UUIDColumn(this) with PartitionKey[UUID]
  object timestamp extends DateTimeColumn(this) with ClusteringOrder[DateTime] with Descending
  object read extends BooleanColumn(this)
  object actionUser extends ListColumn[NotificationTable, Notification, String](this)
  object verb extends StringColumn(this)
  object itemId extends UUIDColumn(this)
  object collectionId extends StringColumn(this)

  def fromRow(row: Row): Notification =
    Notification(
      id(row),
      userId(row),
      timestamp(row),
      read(row),
      actionUser(row),
      verb(row),
      itemId(row),
      collectionId(row)
    )
}
// Use an abstract class for the query methods, mixed with RootConnector:
abstract class ConcreteNotificationTable extends NotificationTable with RootConnector {

  def insertItem(item: Notification): Future[ResultSet] =
    insert
      .value(_.id, item.id)
      .value(_.userId, item.userId)
      .value(_.timestamp, item.timestamp)
      .value(_.read, item.read)
      .value(_.actionUser, item.actionUser)
      .value(_.verb, item.verb)
      .value(_.itemId, item.itemId)
      .value(_.collectionId, item.collectionId)
      .future()
}
Related
I'm creating a Slick PostgreSQL table that contains an interval field, which I want to represent as a Duration:
case class Object(id: String, aproxDuration: Duration)

class Objects(tag: Tag) extends Table[Object](tag, "OBJECTS") {
  def id = column[String]("id", O.PrimaryKey)
  def expectedDuration = column[Duration]("expected_duration")
  def * = (id, expectedDuration) <> (Object.tupled, Object.unapply)
}
To support this, I've installed the slick-pg extension and created a profile that extends ExPostgresProfile, PgDate2Support and PgRangeSupport, but I'm not sure why it is not finding the implicit TypedType for Duration:
import java.time.Duration

import com.github.tminglei.slickpg.{ExPostgresProfile, PgDate2Support}
import slick.lifted.ProvenShape

trait MyPostgresProfile extends ExPostgresProfile with PgDate2Support {
  override val api = MyAPI

  object MyAPI extends API with DateTimeImplicits
}

object MyPostgresProfile extends MyPostgresProfile

case class Object(id: String, aproxDuration: Duration)

object Tables {
  import MyPostgresProfile.api._

  class Objects(tag: Tag) extends Table[Object](tag, "OBJECTS") {
    def id: Rep[String] = column[String]("id", O.PrimaryKey)
    def expectedDuration: Rep[Duration] = column[Duration]("expected_duration")
    def * : ProvenShape[Object] = (id, expectedDuration) <> (Object.tupled, Object.unapply)
  }
}
Phantom 2.14.5
Scala 2.12.1
I'm experiencing: could not find implicit value for parameter sg: com.outworkers.phantom.macros.SingleGeneric.Aux[User,Repr,HL,Out]
when I try to implement an insert using the generated save. Here are my definitions:
case class User(
  id: UUID,
  email: String,
  firstName: String,
  lastName: String,
  passwordHash: String,
  createdAt: DateTime,
  updatedAt: DateTime
)
abstract class Users extends Table[Users, User] {

  object id extends UUIDColumn with PartitionKey
  object email extends StringColumn
  object first_name extends StringColumn
  object last_name extends StringColumn
  object password_hash extends StringColumn
  object created_at extends DateTimeColumn
  object updated_at extends DateTimeColumn

  def getById(id: UUID): Future[Option[User]] = {
    select.where(_.id eqs id).one()
  }

  def add(user: User): Future[UUID] = {
    store(user)
      .consistencyLevel_=(ConsistencyLevel.ALL)
      .future()
      .map(_ => user.id)
  }
}
class AppDatabase(override val connector: CassandraConnection)
  extends Database[AppDatabase](connector) {

  object users extends Users with Connector
}

object Database extends AppDatabase(connector)
Any insight is appreciated! Thanks!
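Note: the connector passed to AppDatabase on the last line is not shown in the snippet. A minimal sketch of that wiring, assuming a local contact point and an illustrative keyspace name (this may be unrelated to the SingleGeneric error itself):

import com.outworkers.phantom.connectors.{CassandraConnection, ContactPoint}

object Defaults {
  // Assumed local Cassandra node and keyspace name.
  val connector: CassandraConnection = ContactPoint.local.keySpace("app")
}

object Database extends AppDatabase(Defaults.connector)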
I am using Cassandra Phantom driver to build an application with Scala and Cassandra. My code looks like this:
case class User(id: UUID, name: String)

abstract class Users extends CassandraTable[Users, User] with RootConnector {

  object id extends UUIDColumn(this) with PartitionKey
  object name extends StringColumn(this)

  def save(user: User): Future[ResultSet] = {
    insert
      .value(_.id, user.id)
      .value(_.name, user.name)
      .consistencyLevel_=(ConsistencyLevel.ALL)
      .future()
  }

  def getById(id: UUID): Future[Option[User]] = {
    select.where(_.id eqs id).one()
  }
}
But when I try to compile the code, it gives me the following error:
could not find implicit value for parameter helper: com.outworkers.phantom.macros.TableHelper[Users, User]
I am not able to understand why this error occurs when I am following the documentation.
Phantom Version: 2.7.6
Scala: 2.11.2
case class User(id: UUID, name: String)

abstract class Users extends Table[Users, User] with RootConnector {

  object id extends UUIDColumn(this) with PartitionKey
  object name extends StringColumn(this)

  def save(user: User): Future[ResultSet] = {
    store(user)
      .consistencyLevel_=(ConsistencyLevel.ALL)
      .future()
  }

  def getById(id: UUID): Future[Option[User]] = {
    select.where(_.id eqs id).one()
  }
}
I have just compiled this with 2.7.6. You also don't need to manually implement an insert, as it is generated for you.
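Once the table is wired into a database object (here assumed to expose it as users, as in the other answers on this page), the call site reduces to a sketch like:

import java.util.UUID

import com.outworkers.phantom.dsl._

import scala.concurrent.Future

val user = User(UUID.randomUUID(), "John")

// save delegates to the macro-generated store, so no hand-written
// insert query is required.
val result: Future[ResultSet] = database.users.save(user)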
I am new to Scala (long-time Java developer). I tried to understand implicits, and I think I understand the basics; however, I do not understand why the implicit session value is not found. I have described my problem with as much information as possible.
I followed the following blog post: http://blog.websudos.com/2015/04/04/a-series-on-phantom-part-1-getting-started-with-phantom/
Everything compiled fine until I wanted to test it. I am getting the following errors:
Error:(15, 46) could not find implicit value for parameter session: com.datastax.driver.core.Session
Await.result(Database.autocreate().future(), 5.seconds)
^
Error:(20, 48) could not find implicit value for parameter session: com.datastax.driver.core.Session
Await.result(Database.autotruncate().future(), 5.seconds)
^
when I execute the following test class:
import org.joda.time.DateTime
import org.scalatest.{BeforeAndAfterAll, FlatSpec}
import scala.concurrent.Await
import scala.concurrent.duration._
class DatabaseTest extends FlatSpec with BeforeAndAfterAll {

  override def beforeAll(): Unit = {
    super.beforeAll()
    Await.result(Database.autocreate().future(), 5.seconds)
  }

  override def afterAll(): Unit = {
    super.afterAll()
    Await.result(Database.autotruncate().future(), 5.seconds)
  }

  "A TwitterMessage" should "be stored in cassandra" in {
    val twitterMessageBefore = TwitterMessage(1L, DateTime.now, "This is a message", "me", "none")

    Database.twitterMessages.store(twitterMessageBefore)

    val twitterMessageAfter: Option[TwitterMessage] =
      Await.result(Database.twitterMessages.getById(1L), 5.seconds)

    assert(twitterMessageAfter.isDefined, "No entry was found regarding the id.")
    assert(twitterMessageAfter.get equals twitterMessageBefore)
  }
}
I also copied the other classes I wrote below:
TwitterMessages.scala
import com.websudos.phantom.dsl._
import scala.concurrent.Future
case class TwitterMessage(
  id: Long,
  timestamp: DateTime,
  msg: String,
  user: String,
  category: String
)
sealed class TwitterMessages extends CassandraTable[ConcreteTwitterMessages, TwitterMessage] {

  object id extends LongColumn(this) with PartitionKey[Long]
  object timestamp extends DateTimeColumn(this)
  object msg extends StringColumn(this)
  object user extends StringColumn(this)
  object category extends StringColumn(this)

  def fromRow(row: Row): TwitterMessage = {
    TwitterMessage(
      id(row),
      timestamp(row),
      msg(row),
      user(row),
      category(row)
    )
  }
}
abstract class ConcreteTwitterMessages extends TwitterMessages with RootConnector {

  def store(twitterMessage: TwitterMessage): Future[ResultSet] = {
    insert
      .value(_.id, twitterMessage.id)
      .value(_.timestamp, twitterMessage.timestamp)
      .value(_.msg, twitterMessage.msg)
      .value(_.user, twitterMessage.user)
      .value(_.category, twitterMessage.category)
      .consistencyLevel_=(ConsistencyLevel.ALL)
      .future()
  }

  def getById(id: Long): Future[Option[TwitterMessage]] = {
    select.where(_.id eqs id).one()
  }
}
Database.scala
import com.websudos.phantom.connectors.{ContactPoint, KeySpaceDef}

object Defaults {
  val connector = ContactPoint.local.keySpace("twitter")
}

class Database(val keyspace: KeySpaceDef) extends com.websudos.phantom.db.DatabaseImpl(keyspace) {
  object twitterMessages extends ConcreteTwitterMessages with keyspace.Connector
}

object Database extends Database(Defaults.connector)
To specifically address your problem, all you have to do is mix the connector into the test suite. This one is on me, as I forgot to update the blog post with this information.
class DatabaseTest extends FlatSpec with BeforeAndAfterAll
  with Defaults.connector.Connector
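Applied to the test class above, only the header changes; shown here with the unchanged setup methods for context:

import org.scalatest.{BeforeAndAfterAll, FlatSpec}

import scala.concurrent.Await
import scala.concurrent.duration._

// The connector mixin brings the implicit Session and KeySpace into scope,
// which is exactly what autocreate()/autotruncate() were missing.
class DatabaseTest extends FlatSpec with BeforeAndAfterAll
  with Defaults.connector.Connector {

  override def beforeAll(): Unit = {
    super.beforeAll()
    Await.result(Database.autocreate().future(), 5.seconds)
  }

  override def afterAll(): Unit = {
    super.afterAll()
    Await.result(Database.autotruncate().future(), 5.seconds)
  }
}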
Upon digging it seems this is a sensitive issue.
class TestNames {
  private[this] lazy val _name: String = this.getClass.getName.split("\\.").last

  def name: String = _name
}

class Parent extends TestNames
class Parent2 extends Parent

class ClassNameExtraction extends FlatSpec {

  it should "correctly extract the table name" in {
    object TestNames extends TestNames
    assert(TestNames.name === "TestNames")
  }

  it should "correctly extract the parent name" in {
    object Parent extends Parent
    assert(Parent.name === "Parent")
  }

  it should "correctly extract the column names" in {
    object Parent2 extends Parent2
    assert(Parent2.name === "Parent2")
  }
}
I can see there is a simple pattern: $$anonfun$number$ + ACTUAL_CLASS_NAME + $number$.
Is there a simpler way of doing this?
Update:
Got something working without inheritance:
import scala.reflect.ClassTag

class SomeClass {
  private[this] lazy val _name: String =
    implicitly[ClassTag[this.type]].runtimeClass.getSimpleName

  def name: String = _name
}
However, this returns the same name in subclasses:
class SomeOtherClass extends SomeClass {}
object SomeOtherClass extends SomeOtherClass

SomeOtherClass.name // is still "SomeClass"
There is an issue:
https://issues.scala-lang.org/browse/SI-2034
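For top-level classes and objects there is a workaround that does track the runtime class (a sketch): call getClass directly instead of summoning a ClassTag for this.type. The implicit ClassTag is materialized once, at the definition site inside SomeClass, so it always captures SomeClass, whereas getClass is virtual. getName is used rather than getSimpleName to sidestep the SI-2034 InternalError linked above:

class SomeClass {
  // Runtime class of the actual instance; strip the trailing $ that the
  // compiler appends to the class backing a Scala object.
  def name: String = getClass.getName.split("\\.").last.stripSuffix("$")
}

class SomeOtherClass extends SomeClass
object SomeOtherClass extends SomeOtherClass

SomeOtherClass.name // "SomeOtherClass"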
Do you need a class per se or a type name? Because that's what reflection is for.
The policy is not to discuss religion, politics or name mangling.
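If a type name is all that is needed, a minimal sketch using scala-reflect's TypeTag (assumes the scala-reflect module is on the classpath):

import scala.reflect.runtime.{universe => ru}

// Name of the static type T as written in source, without JVM mangling.
def typeName[T: ru.TypeTag]: String =
  ru.typeOf[T].typeSymbol.name.decodedName.toString

typeName[SomeOtherClass] // "SomeOtherClass"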