Can you access an SBT SettingKey inside a Command?

I am writing a Command and want to use the logger in TaskStreams, but this does not seem possible since you cannot access the .value of a SettingKey in a Command. Is there some way around this?
def myCommand = Command.single("myCommand") {
  case (currentState, userInput) =>
    val extracted = Project.extract(currentState)
    val log = streams.value.log // <-- not allowed: .value cannot be used inside a command
    log.info("Some logging")
    currentState
}

streams is intended for tasks, not commands.
So one way is to create a "holder" TaskKey and get the streams of that; for instance, sbt-pgp creates and uses pgpCmdContext - see the definition of pgp-cmd.
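A rough sketch of that holder-task idea (the key name commandStreams is made up here, not part of sbt; sbt-pgp's pgpCmdContext is the real-world counterpart):
// Hypothetical holder key: a task whose only job is to expose a TaskStreams instance (sbt.Keys.TaskStreams).
val commandStreams = taskKey[TaskStreams]("Holds a TaskStreams instance for use from commands")

// wire it up in your settings:
// commandStreams := streams.value

def myCommand = Command.single("myCommand") { (currentState, userInput) =>
  val extracted = Project.extract(currentState)
  // Running the holder task yields a TaskStreams, and with it a logger.
  val (newState, s) = extracted.runTask(commandStreams, currentState)
  s.log.info("Some logging")
  newState
}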
Another way is to use sLog, but I'm not sure whether sLog should be used here:
val myCommand = Command.single("myCommand") { case (s, arg) =>
  val extracted = Project extract s
  val log = extracted get sLog
  log info "Some logging"
  s
}

Related

Logging from custom SBT plugin not working

At work we've created this plugin, and one of its input keys is supposed to log an exception (whenever one occurs). For demonstration purposes I've only kept the relevant parts, so have a look at the following code:
theTask := {
  streams.value.log // bare statement; see below - the error is only logged when this line is present
  val args: Seq[String] = spaceDelimited("<arg>").parsed
  args.foreach { arg =>
    val env = SomeLogicThatPotentiallyThrowsException(arg)
  }
}

object SomeLogicThatPotentiallyThrowsException {
  def apply(arg: String): String = {
    if (arg == "throwexception") throw new Exception("Boom!")
    else arg
  }
}
So when I use the plugin and execute the task as follows:
sbt theTask throwexception
The error is only logged when I've added streams.value.log as a statement in the theTask task. So, am I doing anything wrong here, or is this a bug in sbt?
Thanks for reading, people.

How to bind an enum in routes using Scala?

I'm trying to receive a list of filters as a query param of a GET request, where the filter is an enum in my controller. I'm using Play Framework with Scala. The problem is that I can't put the enum type as a query param, because the IDE doesn't recognize it as a valid type.
So I have something like this in my routes file:
GET /service-orders/ controllers.ServiceOrdersController.listServiceOrders(status: ServiceStatus)
My enum file:
object ServiceStatus extends Enumeration {
  type ServiceStatus = Value
  val Pending = Value("pending")
  val Started = Value("started")
  val Completed = Value("completed")
  val Error = Value("error")
}
In build.sbt, I use this to try to import the package into the routes file:
routesImport ++= Seq(
  "serviceOrders.models.ServiceStatus"
),
I tried a lot of things, but with no success. I read somewhere that I could use a QueryStringBindable, but I couldn't get it to work... Can you please help me solve this?
Edit: By the way, is there a way to check whether the status is contained in a list of filters without doing this?
.filter { serviceOrder =>
  status
    .map(serviceOrder.serviceStatus === _)
    .reduceOption(_ || _)
    .getOrElse(true: Rep[Boolean])
}
This was the only way I could think of to filter the status by a list of filters received as a query param from the API.
You can implement the QueryStringBindable instance like so:
package serviceOrders.models

import play.api.mvc.QueryStringBindable

object ServiceStatus extends Enumeration {
  type ServiceStatus = Value
  val Pending = Value("pending")
  val Started = Value("started")
  val Completed = Value("completed")
  val Error = Value("error")

  implicit val queryStringBindable: QueryStringBindable[ServiceStatus] =
    new QueryStringBindable[ServiceStatus] {
      override def bind(
          key: String,
          params: Map[String, Seq[String]]
      ): Option[Either[String, ServiceStatus]] =
        params.get(key).collect {
          case Seq(s) =>
            ServiceStatus.values.find(_.toString == s).toRight("invalid value")
        }

      override def unbind(key: String, value: ServiceStatus): String =
        implicitly[QueryStringBindable[String]].unbind(key, value.toString)
    }
}
In build.sbt you need this:
routesImport ++= Seq("serviceOrders.models.ServiceStatus._")
And this in your routes file:
GET /some/route controllers.SomeController.index(status: ServiceStatus)
Then you can create an index method that takes a ServiceStatus parameter in SomeController, and Play will take care of binding the query parameter.
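For illustration, such a controller method might look like this (a sketch assuming Play 2.6+ dependency-injected controllers; SomeController and the response body are placeholders):
package controllers

import javax.inject.Inject
import play.api.mvc.{AbstractController, ControllerComponents}
import serviceOrders.models.ServiceStatus.ServiceStatus

class SomeController @Inject()(cc: ControllerComponents) extends AbstractController(cc) {
  // status is bound from the query string via the implicit QueryStringBindable defined above
  def index(status: ServiceStatus) = Action {
    Ok(s"Filtering by status: $status")
  }
}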
Edit: You could actually use the QueryStringBindable.Parsing class to simplify the implementation further.
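For reference, a sketch of what the Parsing variant could look like (hedged: the error message is arbitrary, and this assumes Play's QueryStringBindable.Parsing(parse, serialize, error) constructor):
package serviceOrders.models

import play.api.mvc.QueryStringBindable

object ServiceStatus extends Enumeration {
  type ServiceStatus = Value
  val Pending = Value("pending")
  val Started = Value("started")
  val Completed = Value("completed")
  val Error = Value("error")

  // withName throws for unknown values; Parsing turns that exception into the error message below.
  implicit object bindable extends QueryStringBindable.Parsing[ServiceStatus](
    withName,
    _.toString,
    (key, e) => s"Cannot parse parameter $key as ServiceStatus: ${e.getMessage}"
  )
}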

Akka Stream return object from Sink

I've got a SourceQueue. When I offer an element to it, I want it to pass through the stream, and when it reaches the Sink I want the output returned to the code that offered the element (similar to how Sink.head returns an element to the RunnableGraph.run() call).
How do I achieve this? A simple example of my problem would be:
val source = Source.queue[String](100, OverflowStrategy.fail)
val flow = Flow[String].map(element => s"Modified $element")
val sink = Sink.ReturnTheStringSomehow
val graph = source.via(flow).to(sink).run()
val x = graph.offer("foo")
println(x) // Output should be "Modified foo"
val y = graph.offer("bar")
println(y) // Output should be "Modified bar"
val z = graph.offer("baz")
println(z) // Output should be "Modified baz"
Edit: For the example I have given in this question Vladimir Matveev provided the best answer. However, it should be noted that this solution only works if the elements are going into the sink in the same order they were offered to the source. If this cannot be guaranteed the order of the elements in the sink may differ and the outcome might be different from what is expected.
I believe it is simpler to use the already existing primitive for pulling values from a stream, called Sink.queue. Here is an example:
import akka.stream.{Attributes, OverflowStrategy}
import akka.stream.scaladsl.{Flow, Keep, Sink, Source}
import scala.concurrent.Await
import scala.concurrent.duration._

// an implicit Materializer (or implicit ActorSystem on Akka 2.6+) is assumed to be in scope
val source = Source.queue[String](128, OverflowStrategy.fail)
val flow = Flow[String].map(element => s"Modified $element")
val sink = Sink.queue[String]().withAttributes(Attributes.inputBuffer(1, 1))
val (sourceQueue, sinkQueue) = source.via(flow).toMat(sink)(Keep.both).run()

def getNext: String = Await.result(sinkQueue.pull(), 1.second).get

sourceQueue.offer("foo")
println(getNext)
sourceQueue.offer("bar")
println(getNext)
sourceQueue.offer("baz")
println(getNext)
It does exactly what you want.
Note that setting the inputBuffer attribute for the queue sink may or may not be important for your use case - if you don't set it, the buffer will be zero-sized and the data won't flow through the stream until you invoke the pull() method on the sink.
sinkQueue.pull() yields a Future[Option[T]], which will be completed successfully with Some if the sink receives an element or with a failure if the stream fails. If the stream completes normally, it will be completed with None. In this particular example I'm ignoring this by using Option.get but you would probably want to add custom logic to handle this case.
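For instance, a small sketch that makes those three outcomes explicit (reusing sinkQueue from the snippet above and assuming an ExecutionContext is in scope - here the global one):
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{Failure, Success}

def printNext(): Unit =
  sinkQueue.pull().onComplete {
    case Success(Some(element)) => println(element)            // an element arrived
    case Success(None)          => println("stream completed") // upstream finished normally
    case Failure(e)             => println(s"stream failed: ${e.getMessage}")
  }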
Well, you know what the offer() method returns if you take a look at its definition :) What you can do is create a Source.queue[(Promise[String], String)], write a helper function that pushes a pair to the stream via offer (making sure offer doesn't fail because the queue might be full), then complete the promise inside your stream and use the promise's future to catch the completion event in external code.
I do this to throttle the rate of calls to an external API used from multiple places in my project.
Here is how it looked in my project before Typesafe added Hub sources to Akka:
import scala.concurrent.Promise
import scala.concurrent.Future
import java.util.concurrent.ConcurrentLinkedDeque
import akka.stream.scaladsl.{Keep, Sink, Source}
import akka.stream.{OverflowStrategy, QueueOfferResult}
import scala.util.Success
// an implicit ExecutionContext and Materializer are assumed to be in scope in the enclosing class
private val queue = Source.queue[(Promise[String], String)](100, OverflowStrategy.backpressure)
  .toMat(Sink.foreach { case (p, param) =>
    p.complete(Success(param.reverse))
  })(Keep.left)
  .run()
private val futureDeque = new ConcurrentLinkedDeque[Future[String]]()
private def sendQueuedRequest(request: String): Future[String] = {
  val p = Promise[String]
  val offerFuture = queue.offer(p -> request)

  def addToQueue(future: Future[String]): Future[String] = {
    futureDeque.addLast(future)
    future.onComplete(_ => futureDeque.remove(future))
    future
  }

  offerFuture.flatMap {
    case QueueOfferResult.Enqueued =>
      addToQueue(p.future)
  }.recoverWith {
    case ex =>
      val first = futureDeque.pollFirst()
      if (first != null)
        addToQueue(first.flatMap(_ => sendQueuedRequest(request)))
      else
        sendQueuedRequest(request)
  }
}
I realize that the blocking synchronized queue may be a bottleneck and may grow indefinitely, but because API calls in my project are made only from other Akka streams, which are backpressured, I never have more than a dozen items in futureDeque. Your situation may differ.
If you create a MergeHub.source[(Promise[String], String)]() instead, you'll get a reusable sink. Then every time you need to process an item you create a complete graph and run it. In that case you won't need a hacky Java container to queue requests.
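A rough sketch of that MergeHub variant (assuming Akka 2.6+, where an implicit ActorSystem provides the materializer; the reversing logic just mirrors the example above):
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.scaladsl.{MergeHub, Sink, Source}
import scala.concurrent.{Future, Promise}
import scala.util.Success

implicit val system: ActorSystem = ActorSystem("hub-example")

// Materializing MergeHub.source once yields a reusable Sink that many producers can attach to.
val processingSink: Sink[(Promise[String], String), NotUsed] =
  MergeHub.source[(Promise[String], String)](16) // per-producer buffer size
    .to(Sink.foreach { case (p, param) => p.complete(Success(param.reverse)) })
    .run()

def process(request: String): Future[String] = {
  val p = Promise[String]()
  // Each call builds a tiny graph that feeds a single element into the shared sink.
  Source.single(p -> request).runWith(processingSink)
  p.future
}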

Interact (i/o) with an external process in Scala

I'm looking for a simple way to start an external process and then write strings to its input and read its output.
In Python, this works:
mosesProcess = subprocess.Popen([mosesBinPath, '-f', mosesModelPath], stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE);
# ...
mosesProcess.stdin.write(aRequest);
mosesAnswer = mosesProcess.stdout.readline().rstrip();
# ...
mosesProcess.stdin.write(anotherRequest);
mosesAnswer = mosesProcess.stdout.readline().rstrip();
# ...
mosesProcess.stdin.close();
I think in Scala this should be done with scala.sys.process.ProcessBuilder and scala.sys.process.ProcessIO, but I don't understand how they work (especially the latter).
EDIT:
I have tried things like:
val inputStream = new scala.concurrent.SyncVar[java.io.OutputStream]
val outputStream = new scala.concurrent.SyncVar[java.io.InputStream]
val errStream = new scala.concurrent.SyncVar[java.io.InputStream]
val cmd = "myProc"
val pb = process.Process(cmd)
val pio = new process.ProcessIO(stdin => inputStream.put(stdin),
                                stdout => outputStream.put(stdout),
                                stderr => errStream.put(stderr))
pb.run(pio)
inputStream.get.write(("request1" + "\n").getBytes)
println(outputStream.get.read) // It is blocked here
inputStream.get.write(("request2" + "\n").getBytes)
println(outputStream.get.read)
inputStream.get.close()
But the execution gets stuck.
Granted, attrib below is not a great example on the write side of things; I have an EchoServer that would do the input/output.
import scala.sys.process._
import java.io._

object EchoClient {
  def main(args: Array[String]) {
    var bContinue = true
    var cmd = "C:\\\\windows\\system32\\attrib.exe"
    println(cmd)
    val process = Process(cmd)
    val io = new ProcessIO(
      writer,
      out => { scala.io.Source.fromInputStream(out).getLines.foreach(println) },
      err => { scala.io.Source.fromInputStream(err).getLines.foreach(println) })
    while (bContinue) {
      process run io
      var answer = readLine("Run again? (y/n)? ")
      if (answer == "n" || answer == "N")
        bContinue = false
    }
  }

  def reader(input: java.io.InputStream) = {
    // read here
  }

  def writer(output: java.io.OutputStream) = {
    // write here
  }

  // TODO: implement an error logger
}
Output below:
C:\\windows\system32\attrib.exe
A C:\dev\EchoClient$$anonfun$1.class
A C:\dev\EchoClient$$anonfun$2$$anonfun$apply$1.class
A C:\dev\EchoClient$$anonfun$2.class
A C:\dev\EchoClient$$anonfun$3$$anonfun$apply$2.class
A C:\dev\EchoClient$$anonfun$3.class
A C:\dev\EchoClient$.class
A C:\dev\EchoClient.class
A C:\dev\EchoClient.scala
A C:\dev\echoServer.bat
A C:\dev\EchoServerChg$$anonfun$main$1.class
A C:\dev\EchoServerChg$.class
A C:\dev\EchoServerChg.class
A C:\dev\EchoServerChg.scala
A C:\dev\ScannerTest$$anonfun$main$1.class
A C:\dev\ScannerTest$.class
A C:\dev\ScannerTest.class
A C:\dev\ScannerTest.scala
Run again? (y/n)?
Scala API for ProcessIO:
new ProcessIO(in: (OutputStream) ⇒ Unit, out: (InputStream) ⇒ Unit, err: (InputStream) ⇒ Unit)
I suppose you should provide at least two arguments: one OutputStream function (writing to the process) and one InputStream function (reading from the process).
For instance:
import java.io.{InputStream, OutputStream}

def readJob(in: InputStream) {
  // do something with in
}

def writeJob(out: OutputStream) {
  // do something with out
}

def errJob(err: InputStream) {
  // do something with err
}

val process = new ProcessIO(writeJob, readJob, errJob)
Please keep in mind that the streams are Java streams, so you will have to check the Java API.
Edit: the package page provides examples; maybe you could take a look at them.
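Putting this together for the question's use case, here is a hedged sketch of line-by-line interaction (myProc is a placeholder command that is assumed to answer each request line with one output line; the key points are flushing stdin after each request and reading whole lines from stdout):
import java.io.{BufferedReader, InputStream, InputStreamReader, OutputStream, OutputStreamWriter, PrintWriter}
import scala.concurrent.SyncVar
import scala.sys.process.{Process, ProcessIO}

val inputStream = new SyncVar[OutputStream]
val outputStream = new SyncVar[InputStream]

val pio = new ProcessIO(
  stdin => inputStream.put(stdin),
  stdout => outputStream.put(stdout),
  stderr => stderr.close())
val proc = Process("myProc").run(pio)

val writer = new PrintWriter(new OutputStreamWriter(inputStream.get), true) // autoFlush pushes each line to the process
val reader = new BufferedReader(new InputStreamReader(outputStream.get))

writer.println("request1")
println(reader.readLine()) // blocks only until the process answers with a line
writer.println("request2")
println(reader.readLine())

writer.close()
proc.destroy()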
ProcessIO is the way to go for low-level control and input/output interaction. There is even an often-overlooked helper object, BasicIO, that assists with creating common ProcessIO instances for reading and for connecting in/out streams with utility functions. You can look at the source of BasicIO.scala to see what it does internally when creating ProcessIO instances.
You can sometimes find inspiration from test cases or tools created for the class itself by the project. In the case of Scala, have a look at the source on GitHub. We are fortunate in that there is a detailed example of ProcessIO being used for the Scala GraphViz dot process runner, DotRunner.scala!

NPE when accessing val that is not lazy

EDIT 2:
Another heads-up on this:
I still have no idea why this happens, but I now have a similar problem with jOOQ and the SQLDialect I use with it. My code here looks like this:
object MyDB {
  private lazy val dialect = SQLDialect.POSTGRES

  def withSession[T](f: DSLContext => T) = f(DSL.using(getConnectionPool, dialect))
}
If I remove the lazy, it blows up when I try to execute jOOQ queries, at line 552 of https://github.com/jOOQ/jOOQ/blob/version-3.2.0/jOOQ/src/main/java/org/jooq/impl/DefaultRenderContext.java
That happens to be a line where the dialect is evaluated. After I added the lazy, everything works as expected.
Maybe this is an issue with the threading of LiftWeb, where the executing thread does not see the correct value of the val? I have no idea...
EDIT:
I have found a way to do what I want simply by adding lazy to the values in the first, broken version. So with lazy vals it all works well.
However, I'll leave this question open, as I have no idea how to explain this behavior.
Original Post:
So I am trying to use Parameterized Queries in Slick.
My code is below. My problem is that I get an NPE (see comments) when I try to run this from within the web application (Liftweb, container started with sbt); the application creates an instance of the class PlayerListCollector that is given the string "cola".
When I execute the object as an App from within Eclipse, the println at the bottom works just fine.
class PlayerListCollector(term: String) {
  import PlayerListCollector._
  val searchResult = executeSearch(term)
}

object PlayerListCollector extends Loggable with App {
  private val searchNameCurrent = Parameters[String].flatMap {
    case (term) => {
      for {
        p <- Players if p.uberName.isNotNull
        n <- p.displayName if (n.displayName.toLowerCase.like(term))
      } yield (p.id, n.displayName)
    }
  }

  private def executeSearch(term: String) = {
    val lowerTerm = "%" + term.toLowerCase() + "%"
    logger info "HELLO " + lowerTerm       // prints HELLO %cola%
    val foo = searchNameCurrent(lowerTerm) // NPE right in this line
    logger info foo                        // never executed from here on ...
    val byCurrent = foo.list
    logger info byCurrent
    [...]
  }

  // this works if run directly from within eclipse!
  println(DB withSession {
    searchNameCurrent("%cola%").list
  })
}
The problem vanishes when I change the code to look like this:
[...]
object PlayerListCollector extends Loggable with App {
  private def executeSearch(term: String) = {
    val searchNameCurrent = Parameters[String].flatMap {
      case (term) => {
        for {
          p <- Players if p.uberName.isNotNull
          n <- p.displayName if (n.displayName.toLowerCase.like(term))
        } yield (p.id, n.displayName)
      }
    }

    val lowerTerm = "%" + term.toLowerCase() + "%"
    logger info "HELLO " + lowerTerm       // prints HELLO %cola%
    val foo = searchNameCurrent(lowerTerm) // executes just fine when the query is in a local val
    logger info foo
    val byCurrent = foo.list
    logger info byCurrent                  // prints expected output
    [...]
  }
  [...]
}
I have no idea whatsoever why this happens.
Isn't the whole point of a parameterized query to put it in a val that is filled with a value only once, so it does not need to be compiled multiple times?
So it turns out I used the App trait (http://www.scala-lang.org/api/current/index.html#scala.App) on these objects.
Reading the big fat caveat tells us what is happening, I guess: App is implemented via DelayedInit, so the object's fields (including vals like searchNameCurrent) are only initialized once its main method runs, which is why they are still null when accessed from the web application.
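For anyone hitting the same thing, a minimal, self-contained illustration of that caveat (hypothetical names; Scala 2, where App still relies on DelayedInit):
object Holder extends App {
  // Because App uses DelayedInit, this initializer only runs when Holder's main() is invoked.
  val greeting = "hello"
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Holder.main was never called, so its constructor body has not run yet.
    println(Holder.greeting)        // prints "null"
    println(Holder.greeting.length) // throws NullPointerException
  }
}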