Only keys stored, not objects with reference to that key - Scala

I have created a method "saveInDB" which takes 3 parameters. These are passed to the "imgData" class, where all the values are set on an object "picDetail" of that class. When I push it to the Riak database, only the keys are stored; the picDetail objects themselves are not. I can't figure out what's happening.
val riakClient = RiakFactory.pbcClient
val riakBucketName = riakClient.createBucket("bucket_name").execute

def saveInDB(title: String, desc: String, imageName: String): Boolean = {
  val picDetail = new imgData()
  picDetail.title = title
  picDetail.desc = desc
  val s = imageName.replace(".png", "")
  picDetail.imageName = imageName
  try {
    riakBucketName.store(s, picDetail).execute
    true
  } catch {
    case e: Exception => false
  }
}
Update: Riak version 1.3.2 and Riak Java client 1.1.4.
Any ideas would be greatly appreciated.
Thanks in advance.

I figured out the solution: when I tried to pass the object "picDetail" to Riak it was unable to store it, so I convert the object into a JSON string instead. Now it is working fine. My code looks like this:
case class ImgData(
  title: String,
  desc: String,
  imageName: String
)

def getJsonString(title: String, desc: String, imageName: String): String = {
  import play.api.libs.json._
  import play.api.libs.functional.syntax._

  implicit val imgDataWrites: Writes[ImgData] = (
    (__ \ "title").write[String] and
    (__ \ "desc").write[String] and
    (__ \ "imageName").write[String]
  )(unlift(ImgData.unapply))

  val jObject = ImgData(title, desc, imageName)
  val jString = Json.toJson(jObject)
  jString.toString
}
def saveInDB(title: String, desc: String, imageName: String): Boolean = {
  val obj: String = getJsonString(title, desc, imageName)
  val s: String = imageName.replace(".png", "")
  try {
    riakBucketName.store(s, obj).execute
    true
  } catch {
    case e: Exception => false
  }
}
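For reference, a minimal usage sketch (the title, description and file name below are made up; the JSON shown is what the Writes above would produce):

// Hypothetical usage: the key is the image name without its extension,
// and the stored value is a plain JSON string.
val saved = saveInDB("Sunset", "Evening shot", "sunset.png")
// key "sunset" now holds: {"title":"Sunset","desc":"Evening shot","imageName":"sunset.png"}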
Thank you!

Just found another easy way to store it.
def saveInDB(title: String, desc: String, imageName: String): Boolean = {
  val obj = ImgData(title, desc, imageName)
  val s: String = imageName.replace(".png", "")
  val jsonString: String = generate(obj) // serializes the case class to a JSON string (the JSON library import is omitted here)
  val riakObject = RiakObjectBuilder
    .newBuilder("bucket_name", s)
    .withContentType("application/json")
    .withValue(jsonString)
    .build
  try {
    riakBucketName.store(s, riakObject).returnBody(true).execute
    true
  } catch {
    case e: Exception => false
  }
}
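Because the value is stored with content type application/json, it can also be read back and parsed again. The following is only a sketch, assuming the same riakBucketName handle, the legacy Riak Java client's fetch API, and a Play JSON Reads for ImgData (none of which appear in the original post):

// Sketch: fetch the stored JSON back and parse it into ImgData
import play.api.libs.json._

implicit val imgDataReads: Reads[ImgData] = Json.reads[ImgData]

def fetchFromDB(imageName: String): Option[ImgData] = {
  val key = imageName.replace(".png", "")
  val fetched = riakBucketName.fetch(key).execute // IRiakObject, null when the key is absent
  Option(fetched).flatMap(o => Json.parse(o.getValueAsString).asOpt[ImgData])
}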

Related

How to insert a user defined type in Cassandra using the Lagom Scala framework

I am using the Lagom (Scala) framework and I could not find any way to save a Scala case class object that has a complex type in Cassandra. So how do I insert a Cassandra UDT in Lagom Scala, and can anyone explain how to use the BoundStatement.setUDTValue() method?
I have tried to do it using com.datastax.driver.mapping.annotations.UDT, but it does not work for me. I have also tried the com.datastax.driver.core Session interface, but again it does not work.
case class LeadProperties(
  name: String,
  label: String,
  description: String,
  groupName: String,
  fieldDataType: String,
  options: Seq[OptionalData]
)

object LeadProperties {
  implicit val format: Format[LeadProperties] = Json.format[LeadProperties]
}

@UDT(keyspace = "leadpropertieskeyspace", name = "optiontabletype")
case class OptionalData(label: String)

object OptionalData {
  implicit val format: Format[OptionalData] = Json.format[OptionalData]
}
My queries:
val optiontabletype =
  """
    |CREATE TYPE IF NOT EXISTS optiontabletype(
    |  value text
    |);
  """.stripMargin

val createLeadPropertiesTable: String =
  """
    |CREATE TABLE IF NOT EXISTS leadpropertiestable(
    |  name text PRIMARY KEY,
    |  label text,
    |  description text,
    |  groupname text,
    |  fielddatatype text,
    |  options list<frozen<optiontabletype>>
    |);
  """.stripMargin
def createLeadProperties(obj: LeadProperties): Future[List[BoundStatement]] = {
  val bindCreateLeadProperties: BoundStatement = createLeadProperties.bind()
  bindCreateLeadProperties.setString("name", obj.name)
  bindCreateLeadProperties.setString("label", obj.label)
  bindCreateLeadProperties.setString("description", obj.description)
  bindCreateLeadProperties.setString("groupname", obj.groupName)
  bindCreateLeadProperties.setString("fielddatatype", obj.fieldDataType)
  // Here is the problem: I am not getting any method for the Cassandra UDT.
  Future.successful(List(bindCreateLeadProperties))
}
override def buildHandler(): ReadSideProcessor.ReadSideHandler[PropertiesEvent] = {
  readSide.builder[PropertiesEvent]("PropertiesOffset")
    .setGlobalPrepare(() => PropertiesRepository.createTable)
    .setPrepare(_ => PropertiesRepository.prepareStatements)
    .setEventHandler[PropertiesCreated](ese =>
      PropertiesRepository.createLeadProperties(ese.event.obj))
    .build()
}
I faced the same issue and solved it the following way.
Define the type and the table:
def createTable(): Future[Done] = {
  session.executeCreateTable("CREATE TYPE IF NOT EXISTS optiontabletype(field1 text, field2 text)")
    .flatMap(_ => session.executeCreateTable(
      "CREATE TABLE IF NOT EXISTS leadpropertiestable ( " +
        "id TEXT, options list<frozen <optiontabletype>>, PRIMARY KEY (id))"
    ))
}
Call this method in buildHandler() like this:
override def buildHandler(): ReadSideProcessor.ReadSideHandler[PropertiesEvent] =
  readSide.builder[PropertiesEvent]("PropertiesOffset")
    .setPrepare(_ => prepare())
    .setGlobalPrepare(() => createTable())
    .setEventHandler[PropertiesCreated](processPropertiesCreated)
    .build()
Then in processPropertiesCreated() I used it like this:
private val writePromise = Promise[PreparedStatement] // completed in prepare()
private def writeF: Future[PreparedStatement] = writePromise.future

private def processPropertiesCreated(eventElement: EventStreamElement[PropertiesCreated]): Future[List[BoundStatement]] = {
  writeF.map { ps =>
    // Look up the UDT definition from the prepared statement's "options" column
    val userType = ps.getVariables.getType("options").getTypeArguments.get(0).asInstanceOf[UserType]
    val newValue = userType.newValue().setString("field1", "1").setString("field2", "2")
    val bindWriteTitle = ps.bind()
    bindWriteTitle.setString("id", eventElement.event.id)
    bindWriteTitle.setList("options", eventElement.event.keys.map(_ => newValue).toList.asJava) // TODO: convert real values; for now only a stub
    List(bindWriteTitle)
  }
}
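The list bound above is only a stub that repeats the same newValue. A minimal sketch of building real UDT values for the field1/field2 type defined earlier (the accessor supplying the string pairs is hypothetical and not part of the original answer):

// Sketch: build one UDTValue per (field1, field2) pair instead of the stubbed single value.
import scala.collection.JavaConverters._
import com.datastax.driver.core.{UDTValue, UserType}

def toUdtValues(userType: UserType, values: Seq[(String, String)]): java.util.List[UDTValue] =
  values.map { case (v1, v2) =>
    userType.newValue().setString("field1", v1).setString("field2", v2)
  }.asJava

// e.g. bindWriteTitle.setList("options", toUdtValues(userType, pairs)), where pairs comes from the event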
And read it like this:
def toFacility(r: Row): LeadPropertiesTable = {
  LeadPropertiesTable(
    id = r.getString(fId),
    options = r.getList("options", classOf[UDTValue]).asScala
      .map(udt => OptiontableType(field1 = udt.getString("field1"), field2 = udt.getString("field2")))
  )
}
My prepare() function:
private def prepare(): Future[Done] = {
  val f = session.prepare("INSERT INTO leadpropertiestable (id, options) VALUES (?, ?)")
  writePromise.completeWith(f)
  f.map(_ => Done)
}
This is not very well written code, but I think it will help you move forward.

Scala Test: File upload with additional attributes - MultipartFormData

I am trying to test the creation of a new product.
One attribute of a product is a picture. This picture should be stored in a directory called "images"; in the database, only the file name should be stored as a string in the picture column.
So I tried to create a MultipartFormData fake request and add the attributes to the dataParts attribute of the MultipartFormData.
But when executing the test I get the following error:
\test\InventoryControllerSpec.scala:50: Cannot write an instance of play.api.mvc.MultipartFormData[play.api.libs.Files.TemporaryFile] to HTTP response. Try to define a Writeable[play.api.mvc.MultipartFormData[play.api.libs.Files.TemporaryFile]]
The product model looks like the following:
case class Product(id: Option[Int],
                   name: String,
                   category: String,
                   picture: Option[String],
                   amount: Int,
                   criticalAmount: Int)

object Product {
  implicit val productFormat = Json.format[Product]

  def tupled(t: (Option[Int], String, String, Option[String], Int, Int)) =
    Product(t._1, t._2, t._3, t._4, t._5, t._6)

  def toTuple(p: Product) = Some((p.id, p.name, p.category, p.picture, p.amount, p.criticalAmount))
}
The database model looks like this:
class Products(tag: Tag) extends Table[Product](tag, "PRODUCTS") {
  def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
  def name = column[String]("NAME")
  def category = column[String]("CATEGORY")
  def picture = column[String]("PICTURE")
  def amount = column[Int]("AMOUNT")
  def criticalAmount = column[Int]("CRITICALAMOUNT")

  def * = (id.?, name, category, picture.?, amount, criticalAmount) <> (Product.tupled, Product.toTuple)
}
I think the create function in the controller should also work:
val productForm = Form(
  tuple(
    "name" -> nonEmptyText,
    "category" -> nonEmptyText,
    "amount" -> number,
    "criticalAmount" -> number
  )
)

def create = SecuredAction(IsInventoryAdmin()).async(parse.multipartFormData) {
  implicit request => {
    val pr: Option[Product] = productForm.bindFromRequest().fold(
      errFrm => None,
      product => Some(Product(None, product._1, product._2, None, product._3, product._4))
    )
    request.body.file("picture").map { picture =>
      pr.map { product =>
        val filename = picture.filename
        val contentType = picture.contentType
        val filePath = s"/images/$filename"
        picture.ref.moveTo(new File(filePath), replace = true)
        val fullProduct = product.copy(picture = Some(filePath))
        inventoryRepo.createProduct(fullProduct).map(p => Ok(Json.toJson(p)))
      }.getOrElse {
        Future.successful(
          BadRequest(Json.obj("message" -> "Form binding error.")))
      }
    }.getOrElse {
      Future.successful(
        BadRequest(Json.obj("message" -> "File not attached.")))
    }
  }
}
Now my problem is writing a ScalaTest spec which checks that this functionality works. At the moment my code looks like this:
"allow inventory admins to create new products" in new RepositoryAwareContext {
  new WithApplication(application) {
    val token = CSRF.SignedTokenProvider.generateToken
    val tempFile = TemporaryFile(new java.io.File("/images/the.file"))
    val part = FilePart[TemporaryFile](key = "the.file", filename = "the.file", contentType = Some("image/jpeg"), ref = tempFile)
    val formData = MultipartFormData(
      dataParts = Map(
        ("name", Seq("Test Product")),
        ("category", Seq("Test Category")),
        ("amount", Seq("50")),
        ("criticalAmount", Seq("5"))),
      files = Seq(part),
      badParts = Seq(),
      missingFileParts = Seq())
    val result = route(FakeRequest(POST, "/inventory", FakeHeaders(), formData)
      .withAuthenticator[JWTAuthenticator](inventoryAdmin.loginInfo)
      .withHeaders("Csrf-Token" -> token)
      .withSession("csrfToken" -> token)
    ).get
    val newInventoryResponse = result
    status(newInventoryResponse) must be(OK)
    //contentType(newInventoryResponse) must be(Some("application/json"))
    val product = contentAsJson(newInventoryResponse).as[Product]
    product.id mustNot be(None)
    product.name mustBe "Test Product"
    product.category mustBe "Test Category"
  }
}
It would be great if anybody could help me, because I cannot find a solution on my own.
Kind regards!

Play framework - Using anorm with Option[LocalDate] \ Option[LocalDateTime]

I am trying to define a nullable date field in Postgres while using Anorm as the connection to the database.
I am trying to update an entry:
def update(id: Long, startDate: Option[LocalDate]): Unit = {
  SQL("""UPDATE my_table
        |SET start_date = {start_date}
        |WHERE id = {id}
      """.stripMargin)
    .on(
      'id -> id,
      'start_date -> startDate
    ).executeUpdate()
}
But I get a compilation error; it looks like Anorm can't handle Option[DateTime], although when I configured a parser it worked for me:
val parser: RowParser[Info] = {
  get[Long]("id") ~
  get[Option[DateTime]]("start_date") map {
    case id ~ startDate => Info(id, startDate)
  }
}
What am I missing here?
Thanks!
I added my own implicit definitions:
implicit def rowToLocalDate: Column[LocalDate] = Column.nonNull { (value, meta) =>
  val MetaDataItem(qualified, nullable, clazz) = meta
  value match {
    case ts: java.sql.Timestamp => Right(new LocalDate(ts.getTime))
    case d: java.sql.Date => Right(new LocalDate(d.getTime))
    case str: java.lang.String => Right(fmt.parseLocalDate(str)) // fmt: a Joda-Time DateTimeFormatter defined elsewhere
    case _ => Left(TypeDoesNotMatch("Cannot convert " + value + ":" + value.asInstanceOf[AnyRef].getClass))
  }
}
implicit val localDateToStatement = new ToStatement[LocalDate] {
  def set(s: java.sql.PreparedStatement, index: Int, aValue: LocalDate): Unit = {
    s.setTimestamp(index, new java.sql.Timestamp(aValue.toDateTimeAtStartOfDay().getMillis()))
  }
}
And the relevant ParameterMetaData:
implicit object LocalDateClassMetaData extends ParameterMetaData[LocalDate] {
  val sqlType = ParameterMetaData.DateParameterMetaData.sqlType
  val jdbcType = ParameterMetaData.DateParameterMetaData.jdbcType
}
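With the Column, ToStatement and ParameterMetaData instances above in scope, Anorm should also be able to bind Option[LocalDate], so the original update compiles. A minimal sketch, assuming a recent Anorm (2.3+) and an implicit JDBC connection (table and column names as in the question):

// Sketch: None is bound as SQL NULL, Some(date) goes through the ToStatement above
def update(id: Long, startDate: Option[LocalDate])(implicit c: java.sql.Connection): Int =
  SQL("UPDATE my_table SET start_date = {start_date} WHERE id = {id}")
    .on('id -> id, 'start_date -> startDate)
    .executeUpdate()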
That did the trick.
Related question: Anorm compare/search by java.time LocalDateTime. What worked for me there was simply updating to a newer (not-yet-released) version.

How to resolve error "play - Cannot invoke the action, eventually got an error: java.lang.IllegalArgumentException: can't serialize class" in scala?

I am using Scala case classes and the Play framework to process some data.
Here is my code:
case class myClass(
  datastores: Option[List[DataStores]])

case class DataStores(
  name: Option[String],
  utilization: Option[DataStoreUtilization],
  luns: Option[List[DataStoreLuns]]
)

case class DataStoreUtilization(
  freeBytes: Option[Long],
  sizeBytes: Option[Long],
  usedBytes: Option[Long])

case class DataStoreLuns(
  model: Option[String],
  canonicalName: Option[String],
  vendor: Option[String])

object myClass extends ModelCompanion[myClass, ObjectId] {
  val collection = MongoClient("mongoserver", 27017)("myDB")("myCollection")
  val dao = new SalatDAO[myClass, ObjectId](collection = collection) {}

  def apply(rawData: JsValue): myClass = {
    val datastoreList = getDataStoreList(rawData)
    myClass(datastoreList)
  }

  private def getDataStoreList(rawData: JsValue): Option[List[DataStores]] = {
    val commandInfo = (rawData \ "Commands").as[JsValue]
    (commandInfo \ "dataStores").asOpt[List[JsObject]].map { dataStores =>
      dataStores map { dataStoreJson =>
        val name = (dataStoreJson \ "name").asOpt[String]
        val utilization = getDataStoreUtilization(dataStoreJson)
        val luns = getDataStoreLuns(dataStoreJson)
        // val virtualMachines = getDataStoreVirtualMachines(dataStoreJson)
        DataStores(name, utilization, luns)
      }
    }
  }

  private def getDataStoreUtilization(dataStoreJson: JsObject): Option[DataStoreUtilization] = {
    (dataStoreJson \ "utilization").asOpt[JsObject].map { utilizationJson =>
      val freeBytes = (utilizationJson \ "freeBytes").asOpt[Long]
      val sizeBytes = (utilizationJson \ "sizeBytes").asOpt[Long]
      val usedBytes = (utilizationJson \ "usedBytes").asOpt[Long]
      DataStoreUtilization(freeBytes, sizeBytes, usedBytes)
    }
  }

  private def getDataStoreLuns(dataStoreJson: JsValue): Option[List[DataStoreLuns]] = {
    (dataStoreJson \ "luns").asOpt[List[JsObject]].map { luns =>
      luns map { lunJson =>
        val model = (lunJson \ "model").asOpt[String]
        val canonicalName = (lunJson \ "CanonicalName").asOpt[String]
        val vendor = (lunJson \ "vendor").asOpt[String]
        DataStoreLuns(model, canonicalName, vendor)
      }
    }
  }
}
I am getting the following error:
play - Cannot invoke the action, eventually got an error: java.lang.IllegalArgumentException: can't serialize class ....models.DataStoreUtilization
How do I resolve this error?

Strange result when using squeryl and scala

I'm trying to select the coupled user by getting the correct linkedAccount.
The query that is created is correct, but when trying to use a property on dbuser, e.g. dbuser.lastName, I get a compile error since dbuser is not of type User but Query1 size=?.
It's probably something really simple, but I can't figure it out since I'm a Scala and Squeryl noob!
Why doesn't it return the correct value, and what have I done wrong in my query?
Also, saving works without any issues.
User:
class User(
  @Column("id") val id: Long,
  @Column("first_name") val firstName: String,
  @Column("last_name") val lastName: String,
  @Column("email") val email: String,
  @Column("email_validated") val emailValidated: Boolean = false,
  @Column("last_login") val lastLogin: Timestamp = null,
  val created: Timestamp,
  val modified: Timestamp,
  val active: Boolean = false
) extends KeyedEntity[Long] {
  lazy val linkedAccounts: OneToMany[LinkedAccount] = AppDB.usersToLinkedAccounts.left(this)
}
LinkedAccount:
class LinkedAccount(
  @Column("id") val id: Long,
  @Column("user_id") val userId: Long,
  @Column("provider_user_id") val providerUserId: String,
  @Column("salt") val salt: String,
  @Column("provider_key") val providerKey: String
) extends KeyedEntity[Long] {
  lazy val user: ManyToOne[User] = AppDB.usersToLinkedAccounts.right(this)
}
AppDB:
object AppDB extends Schema {
  val users = table[User]("users")
  val linkedAccounts = table[LinkedAccount]("linked_account")
  val usersToLinkedAccounts = oneToManyRelation(users, linkedAccounts).via((u, l) => u.id === l.userId)

  def userByLinkedAccount(providerKey: String, providerUserId: String) = {
    from(AppDB.users)(u =>
      where(u.id in
        from(AppDB.linkedAccounts)(la =>
          where(la.userId === u.id and la.providerKey === providerKey and la.providerUserId === providerUserId)
            select (la.userId)
        )
      )
        select (u)
    )
  }
}
The call:
val dbuser = inTransaction {
  val u2 = AppDB.userByLinkedAccount(id.providerId, id.id)
  println(u2.statement)
}
println(dbuser.lastName)
The SQL generated:
Select
users10.last_login as users10_last_login,
users10.email as users10_email,
users10.modified as users10_modified,
users10.last_name as users10_last_name,
users10.first_name as users10_first_name,
users10.id as users10_id,
users10.created as users10_created,
users10.email_validated as users10_email_validated,
users10.active as users10_active
From
users users10
Where
(users10.id in ((Select
linked_account13.user_id as linked_account13_user_id
From
linked_account linked_account13
Where
(((linked_account13.user_id = users10.id) and (linked_account13.provider_key = 'facebook')) and (linked_account13.provider_user_id = 'XXXXXXXXXX'))
) ))
BTW, in the documentation for @Column and @ColumnBase it is said:
The preferred way to define column metadata is to not define them (!)
So you can define columns simply as
val id: Long,
instead of
@Column("id") val id: Long
OK, I figured it out. I need to make the call, in this case:
.headOption
I also fixed the query after some tips from Per:
def userByLinkedAccount(providerKey: String, providerUserId: String) = {
  inTransaction {
    from(AppDB.users, AppDB.linkedAccounts)((u, la) =>
      where(u.id === la.userId and la.providerKey === providerKey and la.providerUserId === providerUserId)
        select (u)
    ).headOption
  }
}
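A short usage sketch of the fixed query (the id.providerId / id.id values come from the original call):

// headOption turns the query result into an Option[User], so properties can be read after checking for a match
val dbuser: Option[User] = AppDB.userByLinkedAccount(id.providerId, id.id)
dbuser.foreach(u => println(u.lastName))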