Action.async and using a for-comprehension in a separate function - scala

This might be more of a Scala question than a Play one, but I hit this issue in some Play code.
I have:
def index = Action.async { implicit request =>
  val content1Future = Blogs.getAllBlogs(request)
  val sidebar1Future = Blogs.getAllBlogsOverview(request)
  for {
    sidebar1 <- sidebar1Future
    content1 <- content1Future
    sidebar1Body <- Pagelet.readBody(sidebar1)
    content1Body <- Pagelet.readBody(content1)
  } yield {
    val sidebarcontents = List(sidebar1Body)
    val contents = List(content1Body)
    Ok(views.html.main("Index", contents, sidebarcontents)).withHeaders(("Cache-Control", "no-cache"))
  }
}
and it works fine.
Now I try to extract the for-comprehension into a separate function that deals with the 'sidebar', passing in the 'content'. Like this:
def index = Action.async { implicit request =>
  standardMain(Blogs.getAllBlogs(request))
}

def standardMain(content1Future: => Future[Result])(implicit request: Request[_]): Future[Result] = {
  val sidebar1Future = Blogs.getAllBlogsOverview(request)
  for {
    sidebar1 <- sidebar1Future
    content1 <- content1Future
    sidebar1Body <- Pagelet.readBody(sidebar1)
    content1Body <- Pagelet.readBody(content1)
  } yield {
    val sidebarcontents = List(sidebar1Body)
    val contents = List(content1Body)
    Ok(views.html.main("Index", contents, sidebarcontents)).withHeaders(("Cache-Control", "no-cache"))
  }
}
But then I get
[info] Compiling 1 Scala source to D:\Dropbox\Playground\PlayWorld\play-with-forms\target\scala-2.11\classes...
[error] D:\Dropbox\Playground\PlayWorld\play-with-forms\app\controllers\Application.scala:91: type mismatch;
[error] found : scala.concurrent.Future[play.api.mvc.Result]
[error] required: play.api.libs.iteratee.Iteratee[Array[Byte],?]
[error] content1 <- content1Future
[error] ^
[error] D:\Dropbox\Playground\PlayWorld\play-with-forms\app\controllers\Application.scala:90: type mismatch;
[error] found : play.api.libs.iteratee.Iteratee[Array[Byte],Nothing]
[error] required: scala.concurrent.Future[play.api.mvc.Result]
[error] sidebar1 <- sidebar1Future
[error] ^
[error] two errors found
[error] (compile:compile) Compilation failed
I tried many different calls, but somehow I do not know how to get a play.api.libs.iteratee.Iteratee[Array[Byte],?].
How can I achieve this? Thanks.
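A hedged guess at the cause, assuming Blogs.getAllBlogsOverview is a Play Action: applied to a Request[AnyContent] (as in the original index) it resolves to Action.apply(Request[A]) and returns a Future[Result], but applied to the loosely typed Request[_] it only matches EssentialAction.apply(RequestHeader), which returns exactly the Iteratee[Array[Byte], ?] seen in the error. If so, tightening the implicit parameter's type may be enough:

def standardMain(content1Future: => Future[Result])
                (implicit request: Request[AnyContent]): Future[Result] = {
  // With Request[AnyContent] the compiler picks the Action.apply overload
  // that returns a Future[Result] instead of an Iteratee[Array[Byte], Result].
  val sidebar1Future = Blogs.getAllBlogsOverview(request)
  for {
    sidebar1 <- sidebar1Future
    content1 <- content1Future
    sidebar1Body <- Pagelet.readBody(sidebar1)
    content1Body <- Pagelet.readBody(content1)
  } yield {
    Ok(views.html.main("Index", List(content1Body), List(sidebar1Body)))
      .withHeaders("Cache-Control" -> "no-cache")
  }
}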

Related

scala spark type mismatching

I need to group my RDD by two columns and aggregate the count. I have this function:
def constructDiagnosticFeatureTuple(diagnostic: RDD[Diagnostic]): RDD[FeatureTuple] = {
  val grouped_patients = diagnostic
    .groupBy(x => (x.patientID, x.code))
    .map(_._2)
    .map { events =>
      val p_id = events.map(_.patientID).take(1).mkString
      val f_code = events.map(_.code).take(1).mkString
      val count = events.size.toDouble
      ((p_id, f_code), count)
    }
  //should be in form:
  //diagnostic.sparkContext.parallelize(List((("patient", "diagnostics"), 1.0)))
}
At compile time, I am getting an error:
/FeatureConstruction.scala:38:3: type mismatch;
[error] found : Unit
[error] required: org.apache.spark.rdd.RDD[edu.gatech.cse6250.features.FeatureConstruction.FeatureTuple]
[error] (which expands to) org.apache.spark.rdd.RDD[((String, String), Double)]
[error] }
[error] ^
How can I fix it?
I read this post: Scala Spark type missmatch found Unit, required rdd.RDD, but I do not use collect(), so it does not help me.
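The error points at the closing brace because the method body ends with a val definition, which evaluates to Unit, while the declared result type is RDD[FeatureTuple]. Making the pipeline the last expression of the method (a sketch with the same logic) should fix it:

def constructDiagnosticFeatureTuple(diagnostic: RDD[Diagnostic]): RDD[FeatureTuple] = {
  // The last expression of a Scala method is its return value, so return
  // the RDD directly instead of binding it to a val and returning Unit.
  diagnostic
    .groupBy(x => (x.patientID, x.code))
    .map(_._2)
    .map { events =>
      val p_id = events.map(_.patientID).take(1).mkString
      val f_code = events.map(_.code).take(1).mkString
      ((p_id, f_code), events.size.toDouble)
    }
}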

Slick3.2 Error: No matching Shape found

I'm not sure what is wrong here.
The following code block throws an error:
(for {
  (e, r) <- tblDetail.joinLeft(tblMaster).on((e, r) => r.col1 === e.col3)
} yield e.id)
Error
No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection,
[error] you use an unsupported type in a Query (e.g. scala List),
[error] or you forgot to import a driver api into scope.
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: (slick.lifted.Rep[Int], slick.lifted.Rep[String],...)
[error] Unpacked type: T
[error] Packed type: G
[error] (e,r) <- tblDetail.joinLeft(tblMaster).on((e,r) => r.col1 === e.col3)
I checked the Slick tables for tblDetail and tblMaster, and they seem to be fine.
tblMaster
class TblMaster(tag: Tag)
  extends Table[(Int,String,...)](tag, "tbl_master") {
  def id = column[Int]("id")
  def col3 = column[String]("col3")
  def * = (id, col3)
}
tblDetail
class TblDetail(tag: Tag)
  extends Table[Entity](tag, "tbl_detail") {
  def id = column[Int]("id")
  def col1 = column[String]("col1")
  def * : ProvenShape[Entity] = (id, col1) <>
    ((Entity.apply _).tupled, Entity.unapply)
}
Any help would be appreciated.
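One guess, since the table definitions are elided: the first cause the error lists, 'T in Table[T] does not match your * projection', fits TblMaster, whose type parameter (Int, String, ...) appears to have more members than the two-column projection (id, col3). A sketch of a consistent mapping, assuming the table really has just these two columns (otherwise every column must appear in *):

class TblMaster(tag: Tag)
  extends Table[(Int, String)](tag, "tbl_master") {
  def id = column[Int]("id")
  def col3 = column[String]("col3")
  // The * projection now packs exactly the row type (Int, String),
  // so Slick can derive a Shape for the query.
  def * = (id, col3)
}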

use SQL in DStream.transform() over Spark Streaming?

There are some examples of using SQL over Spark Streaming in foreachRDD(). But what if I want to use SQL in transform():
case class AlertMsg(host: String, count: Int, sum: Double)

val lines = ssc.socketTextStream("localhost", 8888)
lines.transform( rdd => {
  if (rdd.count > 0) {
    val t = sqc.jsonRDD(rdd)
    t.registerTempTable("logstash")
    val sqlreport = sqc.sql("SELECT host, COUNT(host) AS host_c, AVG(lineno) AS line_a FROM logstash WHERE path = '/var/log/system.log' AND lineno > 70 GROUP BY host ORDER BY host_c DESC LIMIT 100")
    sqlreport.map(r => AlertMsg(r(0).toString, r(1).toString.toInt, r(2).toString.toDouble))
  } else {
    rdd
  }
}).print()
I get this error:
[error] /Users/raochenlin/Downloads/spark-1.2.0-bin-hadoop2.4/logstash/src/main/scala/LogStash.scala:52: no type parameters for method transform: (transformFunc: org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[U])(implicit evidence$5: scala.reflect.ClassTag[U])org.apache.spark.streaming.dstream.DStream[U] exist so that it can be applied to arguments (org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[_ >: LogStash.AlertMsg with String <: java.io.Serializable])
[error] --- because ---
[error] argument expression's type is not compatible with formal parameter type;
[error] found : org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[_ >: LogStash.AlertMsg with String <: java.io.Serializable]
[error] required: org.apache.spark.rdd.RDD[String] => org.apache.spark.rdd.RDD[?U]
[error] lines.transform( rdd => {
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
It seems the only correct usage is sqlreport.map(r => r.toString)?
dstream.transform takes a function transformFunc: (RDD[T]) ⇒ RDD[U].
In this case, both branches of the if must evaluate to the same type, which is not the case:
if (count == 0) => RDD[String]
if (count > 0) => RDD[AlertMsg]
To fix this, remove the if rdd.count ... optimization so that you have a single transformation path.
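A sketch of that single-path version (untested; with an empty batch, jsonRDD may fail to infer a schema, so production code may need a guard that still returns an RDD[AlertMsg]):

lines.transform { rdd =>
  // A single transformation path: the closure always returns RDD[AlertMsg],
  // so transform's type parameter U is inferred unambiguously.
  val t = sqc.jsonRDD(rdd)
  t.registerTempTable("logstash")
  val sqlreport = sqc.sql(
    "SELECT host, COUNT(host) AS host_c, AVG(lineno) AS line_a FROM logstash " +
    "WHERE path = '/var/log/system.log' AND lineno > 70 " +
    "GROUP BY host ORDER BY host_c DESC LIMIT 100")
  sqlreport.map(r => AlertMsg(r(0).toString, r(1).toString.toInt, r(2).toString.toDouble))
}.print()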

Play 2.4.X: How to Upload a File without Saving It to a Temporary File

The following code snippet shows how to save an uploaded file into MongoDB directly:
object MyController extends Controller {
  ...
  def saveImage = Action.async(fsBodyParser) { implicit request =>
    val result = for {
      file <- request.body.files.head.ref
      update <- fsService.update(
        file.id,
        Json.obj("metadata" -> Json.obj("category" -> "image"))
      )
    } yield update

    result.map { _ =>
      Created(success).withHeaders(LOCATION -> s"${localHost.baseUrl}${request.uri}")
    }
  }

  private def fsBodyParser()(
    implicit fsService: FsServiceComponent#FsService
  ): BodyParser[MultipartFormData[Future[MetaFile]]] = {
    import BodyParsers.parse._
    multipartFormData(Multipart.handleFilePart {
      case Multipart.FileInfo(partName, filename, contentType) =>
        fsService.iteratee(filename, contentType)
    })
  }
}
The code above compiles and works correctly up to Play 2.3.x, but if I try to compile it with Play 2.4.x I always get the following error messages:
[error] /home/j3d/Projects/test/app/controllers/MyController.scala:71: not found: value handleFilePart
[error] multipartFormData(handleFilePart {
[error] ^
[error] /home/j3d/Projects/test/app/controllers/MyController:72: not found: value FileInfo
[error] case FileInfo(partName, filename, contentType) =>
[error] ^
[error] (compile:compile) Compilation failed
[error] Total time: 2 s, completed Jan 3, 2015 2:11:47 PM
Looking at the latest version of Multipart.scala, Multipart.handleFilePart is private now, and it looks like there is no option other than handleFilePartAsTemporaryFile. Why? Is there a workaround?

Akka -- type mismatch; [error] found : Unit [error] required: scala.sys.process.ProcessLogger

I'm trying to write example code that combines Akka actors with scala.sys.process, but I get the following error when compiling it.
The code is really simple, as shown below.
So, what have I done wrong?
[error] /home/qos/workspaces/actors/actors.scala:20: type mismatch;
[error] found : Unit
[error] required: scala.sys.process.ProcessLogger
[error] execute(cmd)
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
The code is
import scala.sys.process._
import akka.actor._

object TryActor {
  def main(args: Array[String]) {
    val akkaSystem = ActorSystem("akkaSystem")
    val worker = akkaSystem.actorOf(Props[Worker], name = "work0")
    worker ! Command("ls")
  }

  case class Command(cmd: String)

  class Worker extends Actor {
    def receive = {
      case Command(cmd) => {
        println(cmd)
        "echo recieve message from someone" !
        execute(cmd.toString)
      }
    }

    def execute(cmd: String) {
      val process = Process(cmd.toString)
      process ! ProcessLogger(_ => {})
    }
  }
}
It's interpreting execute(cmd.toString) as the argument to !, because newlines don't necessarily end statements. To fix this, don't use postfix syntax, which is deprecated for a reason:
def receive = {
  case Command(cmd) => {
    println(cmd)
    "echo recieve message from someone".!
    execute(cmd.toString)
  }
}
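As an aside, execute silently discards the process output. A small sketch in the same dot-call style; ProcessBuilder.! takes a ProcessLogger, blocks until the command finishes, and returns its exit code:

def execute(cmd: String): Int = {
  // Run the command, feeding each stdout/stderr line to the logger,
  // and return the process exit code.
  Process(cmd).!(ProcessLogger(out => println(s"stdout: $out"),
                               err => println(s"stderr: $err")))
}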