Process Interaction through stdin/stdout - scala

I am trying to build a class that starts a system process which then waits for stdin. The class should have another method which takes a string, feeds it into the system process, and returns the process's output.
The reason is that starting the process involves loading a lot of data and hence takes a while.
I am trying to dummy-test this with bc, so that bc is started and waits for input. I would envision an interface like this:
case class BcWrapper(executable: File) {
var bc: Option[???] = None
def startBc(): Unit = bc = Some(???)
def calc(input: String): String = bc.get.???
def stopBc(): Unit = bc.get.???
}
I would like to be able to use it like this:
val wrapper = BcWrapper(new File("/usr/bin/bc"))
wrapper.startBc()
val result1 = wrapper.calc("1 + 1") // should be "2"
val result2 = wrapper.calc(???)
[...]
wrapper.stopBc()
This topic has been touched on in multiple questions, but never fully answered for a use case like this one. This question or this one seems to come close. However, I am not sure how to implement the ProcessLogger, nor whether to use one in the first place.
Unfortunately, the Scala documentation is not very elaborate either.
Note that I do not want to read from stdin, but want to call a function.
The background is that I want to read a large file, read it line by line, preprocess the lines, pass them to the external process, and post-process the output.

You can get something similar, but simpler, like so.
import sys.process._
import util.Try
class StdInReader(val reader :String) {
def send(input :String) :Try[String] =
Try(s"/bin/echo $input".#|(reader).!!.trim)
}
usage:
val bc = new StdInReader("/usr/bin/bc")
bc.send("2 * 8") //res0: scala.util.Try[String] = Success(16)
bc.send("12 + 8") //res1: scala.util.Try[String] = Success(20)
bc.send("22 - 8") //res2: scala.util.Try[String] = Success(14)
Programs that return a non-zero exit code (bc doesn't) will result in a Failure().
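For example, piping to a command that exits with a non-zero code (here /bin/false, assuming it exists on your system) comes back as a Failure:
val bad = new StdInReader("/bin/false")
bad.send("anything") // e.g. Failure(java.lang.RuntimeException: Nonzero exit value: 1)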
If you need more fine-grained control you might start with something like this and expand on it.
import sys.process._
class ProcHandler(val cmnd :String) {
private val resbuf = collection.mutable.Buffer.empty[String]
def run(data :Seq[String]) :Unit = {
cmnd.run(new ProcessIO(
in => {
val writer = new java.io.PrintWriter(in)
data.foreach(writer.println)
writer.close()
},
out => {
val src = io.Source.fromInputStream(out)
src.getLines().foreach(resbuf += _)
src.close()
},
_.close() //maybe create separate buffer for stderr?
)).exitValue()
}
def results() :Seq[String] = {
val rs = collection.mutable.Buffer.empty[String]
resbuf.copyToBuffer(rs)
resbuf.clear()
rs
}
}
usage:
val bc = new ProcHandler("/usr/bin/bc")
bc.run(List("4+5","6-2","2*5"))
bc.run(List("99/3","11*77"))
bc.results() //res0: Seq[String] = ArrayBuffer(9, 4, 10, 33, 847)
OK, I did some more research and found this. It appears to get at what you want, but there are limitations. In particular, the process stays open for input until you want to get output. At that point the IO streams are closed to ensure all buffers are flushed.
import sys.process._
import util.Try
class ProcHandler(val cmnd :String) {
private val procInput = new java.io.PipedOutputStream()
private val procOutput = new java.io.PipedInputStream()
private val proc = cmnd.run( new ProcessIO(
{ in => // attach to the process's internal input stream
val istream = new java.io.PipedInputStream(procInput)
val buf = Array.fill(100)(0.toByte)
Iterator.iterate(istream.read(buf)){ br =>
in.write(buf, 0, br)
istream.read(buf)
}.takeWhile(_>=0).toList
in.close()
},
{ out => // attach to the process's internal output stream
val ostream = new java.io.PipedOutputStream(procOutput)
val buf = Array.fill(100)(0.toByte)
Iterator.iterate(out.read(buf)){ br =>
ostream.write(buf, 0, br)
out.read(buf)
}.takeWhile(_>=0).toList
out.close()
},
_ => () // ignore stderr
))
private val procO = new java.io.BufferedReader(new java.io.InputStreamReader(procOutput))
private val procI = new java.io.PrintWriter(procInput, true)
def feed(str :String) :Unit = procI.println(str)
def feed(ss :Seq[String]) :Unit = ss.foreach(procI.println)
def read() :List[String] = {
procI.close() //close input before reading output
val lines = Stream.iterate(Try(procO.readLine)){_ =>
Try(procO.readLine)
}.takeWhile(_.isSuccess).map(_.get).toList
procO.close()
lines
}
}
usage:
val bc = new ProcHandler("/usr/bin/bc")
bc.feed(List("9*3","4+11")) //res0: Unit = ()
bc.feed("4*13") //res1: Unit = ()
bc.read() //res2: List[String] = List(27, 15, 52)
bc.read() //res3: List[String] = List()
OK, this is my final word on the subject. I think this ticks every item on your wish list: the process is started only once, it stays alive until actively closed, and it allows alternating writes and reads.
import sys.process._
class ProcHandler(val cmnd :Seq[String]) {
private var os: java.io.OutputStream = null
private var is: java.io.InputStream = null
private val pio = new ProcessIO(os = _, is = _, _.close())
private val proc = cmnd.run(pio)
def feed(ss :String*) :Unit = {
ss.foreach(_.foreach(os.write(_)))
os.flush()
}
def ready :Boolean = is.available() > 0
def read() :String = {
Seq.fill[Char](is.available())(is.read().toChar).mkString
}
def close() :Unit = {
os.close() // close stdin first so the process sees EOF and can exit
proc.exitValue()
is.close()
}
}
There are still issues and much room for improvement. IO is handled at a basic level (streams) and I'm not sure what I'm doing here is completely safe and correct. The input, feed(), is required to supply the necessary newline terminations, and the output, read(), is just a raw String, not separated into a nice collection of string results.
Note that this will bleed system resources if the client code fails to close() all processes.
Note also that reading doesn't wait for content (i.e. no blocking). After writing, the response might not be immediately available.
usage:
val bc = new ProcHandler(Seq("/usr/bin/bc","-q"))
bc.feed("44-21\n", "21*4\n")
bc.feed("67+11\n")
if (bc.ready) bc.read() else "not ready" // "23\n84\n78\n"
bc.feed("67-11\n")
if (bc.ready) bc.read() else "not ready" // "56\n"
bc.feed("67*11\n", "1+2\n")
if (bc.ready) bc.read() else "not ready" // "737\n3\n"
if (bc.ready) bc.read() else "not ready" // "not ready"
bc.close()
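If you do need a read that waits, one option is to poll ready with a deadline. A minimal sketch on top of the ProcHandler above (the name readBlocking and the timeout are my own additions):
def readBlocking(proc: ProcHandler, timeoutMs: Long = 1000): Option[String] = {
  val deadline = System.currentTimeMillis() + timeoutMs
  while (!proc.ready && System.currentTimeMillis() < deadline) Thread.sleep(10) // crude polling
  if (proc.ready) Some(proc.read()) else None // None on timeout
}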

Related

File Upload and processing using akka-http websockets

I'm using some sample Scala code to make a server that receives a file over a websocket, stores the file temporarily, runs a bash script on it, and then returns stdout as a TextMessage.
Sample code was taken from this github project.
I edited the code slightly within echoService so that it runs another function that processes the temporary file.
object WebServer {
def main(args: Array[String]) {
implicit val actorSystem = ActorSystem("akka-system")
implicit val flowMaterializer = ActorMaterializer()
val interface = "localhost"
val port = 3000
import Directives._
val route = get {
pathEndOrSingleSlash {
complete("Welcome to websocket server")
}
} ~
path("upload") {
handleWebSocketMessages(echoService)
}
val binding = Http().bindAndHandle(route, interface, port)
println(s"Server is now online at http://$interface:$port\nPress RETURN to stop...")
StdIn.readLine()
binding.flatMap(_.unbind()).onComplete(_ => actorSystem.shutdown())
println("Server is down...")
}
implicit val actorSystem = ActorSystem("akka-system")
implicit val flowMaterializer = ActorMaterializer()
val echoService: Flow[Message, Message, _] = Flow[Message].mapConcat {
case BinaryMessage.Strict(msg) => {
val decoded: Array[Byte] = msg.toArray
val imgOutFile = new File("/tmp/" + "filename")
val fileOuputStream = new FileOutputStream(imgOutFile)
fileOuputStream.write(decoded)
fileOuputStream.close()
TextMessage(analyze(imgOutFile))
}
case BinaryMessage.Streamed(stream) => {
stream
.limit(Int.MaxValue) // Max frames we are willing to wait for
.completionTimeout(50 seconds) // Max time until last frame
.runFold(ByteString(""))(_ ++ _) // Merges the frames
.flatMap { (msg: ByteString) =>
val decoded: Array[Byte] = msg.toArray
val imgOutFile = new File("/tmp/" + "filename")
val fileOuputStream = new FileOutputStream(imgOutFile)
fileOuputStream.write(decoded)
fileOuputStream.close()
Future(Source.single(""))
}
TextMessage(analyze(imgOutFile))
}
private def analyze(imgfile: File): String = {
val p = Runtime.getRuntime.exec(Array("./run-vision.sh", imgfile.toString))
val br = new BufferedReader(new InputStreamReader(p.getInputStream, StandardCharsets.UTF_8))
try {
val result = Stream
.continually(br.readLine())
.takeWhile(_ ne null)
.mkString
result
} finally {
br.close()
}
}
}
}
During testing using Dark WebSocket Terminal, case BinaryMessage.Strict works fine.
Problem: However, the BinaryMessage.Streamed case doesn't finish writing the file before running the analyze function, resulting in a blank response from the server.
I'm trying to wrap my head around how Futures are being used here with the Flows in Akka-HTTP, but I'm not having much luck beyond trying to get through all the official documentation.
Currently, .mapAsync seems promising, or basically finding a way to chain futures.
I'd really appreciate some insight.
Yes, mapAsync will help you here. It is a combinator that executes Futures (potentially in parallel) in your stream and presents their results on the output side.
In your case, to make things homogeneous and keep the type checker happy, you'll need to wrap the result of the Strict case in a Future.successful.
A quick fix for your code could be:
val echoService: Flow[Message, Message, _] = Flow[Message].mapAsync(parallelism = 5) {
case BinaryMessage.Strict(msg) => {
val decoded: Array[Byte] = msg.toArray
val imgOutFile = new File("/tmp/" + "filename")
val fileOuputStream = new FileOutputStream(imgOutFile)
fileOuputStream.write(decoded)
fileOuputStream.close()
Future.successful(TextMessage(analyze(imgOutFile)))
}
case BinaryMessage.Streamed(stream) =>
stream
.limit(Int.MaxValue) // Max frames we are willing to wait for
.completionTimeout(50 seconds) // Max time until last frame
.runFold(ByteString(""))(_ ++ _) // Merges the frames
.flatMap { (msg: ByteString) =>
val decoded: Array[Byte] = msg.toArray
val imgOutFile = new File("/tmp/" + "filename")
val fileOuputStream = new FileOutputStream(imgOutFile)
fileOuputStream.write(decoded)
fileOuputStream.close()
Future.successful(TextMessage(analyze(imgOutFile)))
}
}
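Since analyze itself blocks on the external process, you could go one step further and run the body in a real Future instead of Future.successful, so that mapAsync can actually overlap the work. A sketch for the Strict case (it assumes an implicit ExecutionContext, e.g. actorSystem.dispatcher, is in scope):
case BinaryMessage.Strict(msg) =>
  Future { // run the blocking write + analyze off the stream's thread
    val imgOutFile = new File("/tmp/" + "filename")
    val fileOutputStream = new FileOutputStream(imgOutFile)
    fileOutputStream.write(msg.toArray)
    fileOutputStream.close()
    TextMessage(analyze(imgOutFile))
  }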

How to write a string to Scala Process?

I start and have running a Scala process.
val dir = "/path/to/working/dir/"
val stockfish = Process(Seq("wine", dir + "stockfish_8_x32.exe"))
val logger = ProcessLogger(printf("Stdout: %s%n", _))
val stockfishProcess = stockfish.run(logger, connectInput = true)
The process reads from and writes to standard IO (the console). How can I send a string command to the process once it has already been started?
The Scala process API has ProcessBuilder, which in turn has a bunch of useful methods. But ProcessBuilder is used before a process starts, to compose complex shell commands. Scala also has ProcessIO to handle input or output, but I don't think I need that either. I just need to send a message to my process.
In Java I would do something like this.
String dir = "/path/to/working/dir/";
ProcessBuilder builder = new ProcessBuilder("wine", dir + "stockfish_8_x32.exe");
Process process = builder.start();
OutputStream stdin = process.getOutputStream();
InputStream stdout = process.getInputStream();
BufferedReader reader = new BufferedReader(new InputStreamReader(stdout));
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(stdin));
new Thread(() -> {
try {
String line;
while ((line = reader.readLine()) != null) {
System.out.println("Stdout: " + line);
}
} catch (IOException e) {
e.printStackTrace();
}
}).start();
Thread.sleep(5000); // it's just for example
writer.write("quit"); // send to the process command to stop working
writer.newLine();
writer.flush();
It works quite well. I start my process, get InputStream and OutputStream from it, and use the streams to interact with the process.
It appears the Scala Process trait provides no way to write to it. ProcessBuilder is useless after the process has been run. And ProcessIO seems to be just for catching and handling IO.
Are there any ways to write to a running Scala process?
UPDATE:
I don't see how I can use ProcessIO to pass a string to a running process.
I did the following.
import scala.io.Source
import scala.sys.process._
object Sample extends App {
def out = (output: java.io.OutputStream) => {
output.flush()
output.close()
}
def in = (input: java.io.InputStream) => {
println("Stdout: " + Source.fromInputStream(input).mkString)
input.close()
}
def go = {
val dir = "/path/to/working/dir/"
val stockfishSeq = Seq("wine", dir + "/stockfish_8_x32.exe")
val pio = new ProcessIO(out, in, err => {})
val stockfish = Process(stockfishSeq)
stockfish.run(pio)
Thread.sleep(5000)
System.out.write("quit\n".getBytes)
pio.writeInput(System.out) // "writeInput" is function "out" which I have passed to conforming ProcessIO instance. I can invoke it from here. It takes OutputStream but where can I obtain it? Here I just pass System.out for example.
}
go
}
Of course it does not work, and I have failed to work out how to implement the functionality of my Java snippet above. It would be great to get advice or a snippet of Scala code clearing up my issue.
I think the documentation around Scala processes (specifically the usage and semantics of ProcessIO) could use some improvement. The first time I tried using this API, I also found it very confusing, and it took some trial and error to get my subprocess i/o working correctly.
I think seeing a simple example is probably all you really need. I'll do something really simple: invoking bc as a subprocess to do some trivial computations, and then printing the answers to my stdout. My goal is to do something like this (but from Scala rather than from my shell):
$ printf "1+2\n3+4\n" | bc
3
7
Here's how I'd do it in Scala:
import scala.io.Source
import scala.sys.process._
object SimpleProcessExample extends App {
def out = (output: java.io.OutputStream) => {
output.flush()
output.close()
}
def in = (input: java.io.InputStream) => {
println("Stdout: " + Source.fromInputStream(input).mkString)
input.close()
}
// limit scope of any temporary variables
locally {
val calcCommand = "bc"
// strings are implicitly converted to ProcessBuilder
// via scala.sys.process.ProcessImplicits.stringToProcess(_)
val calcProc = calcCommand.run(new ProcessIO(
// Handle subprocess's stdin
// (which we write via an OutputStream)
in => {
val writer = new java.io.PrintWriter(in)
writer.println("1 + 2")
writer.println("3 + 4")
writer.close()
},
// Handle subprocess's stdout
// (which we read via an InputStream)
out => {
val src = scala.io.Source.fromInputStream(out)
for (line <- src.getLines()) {
println("Answer: " + line)
}
src.close()
},
// We don't want to use stderr, so just close it.
_.close()
))
// Using ProcessBuilder.run() will automatically launch
// a new thread for the input/output routines passed to ProcessIO.
// We just need to wait for it to finish.
val code = calcProc.exitValue()
println(s"Subprocess exited with code $code.")
}
}
Notice that you don't actually call any of the methods of the ProcessIO object directly because they're automatically called by the ProcessBuilder.
Here's the result:
$ scala SimpleProcessExample
Answer: 3
Answer: 7
Subprocess exited with code 0.
If you wanted interaction between the input and output handlers to the subprocess, you can use standard thread communication tools (e.g., have both close over an instance of BlockingQueue).
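For example, a rough sketch of that BlockingQueue idea (the queue names and the "quit" sentinel below are purely illustrative):
import java.util.concurrent.LinkedBlockingQueue
import scala.sys.process._

val requests = new LinkedBlockingQueue[String]()
val answers = new LinkedBlockingQueue[String]()

val proc = "bc".run(new ProcessIO(
  in => { // stdin handler: feed bc from the request queue until the sentinel arrives
    val writer = new java.io.PrintWriter(in)
    Iterator.continually(requests.take()).takeWhile(_ != "quit").foreach { line =>
      writer.println(line)
      writer.flush()
    }
    writer.close() // closing stdin lets bc exit
  },
  out => { // stdout handler: push every answer line onto the answer queue
    scala.io.Source.fromInputStream(out).getLines().foreach(answers.put)
  },
  _.close() // ignore stderr
))

requests.put("2 + 3")
println(answers.take()) // blocks until bc replies: 5
requests.put("quit")
proc.exitValue()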
Here is an example of obtaining input and output streams from a process, which you can write to and read from after the process starts:
object demo {
import scala.sys.process._
def getIO = {
// create piped streams that can attach to process streams:
val procInput = new java.io.PipedOutputStream()
val procOutput = new java.io.PipedInputStream()
val io = new ProcessIO(
// attach to the process's internal input stream
{ in =>
val istream = new java.io.PipedInputStream(procInput)
val buf = Array.fill(100)(0.toByte)
var br = 0
while (br >= 0) {
br = istream.read(buf)
if (br > 0) { in.write(buf, 0, br) }
}
in.close()
},
// attach to the process's internal output stream
{ out =>
val ostream = new java.io.PipedOutputStream(procOutput)
val buf = Array.fill(100)(0.toByte)
var br = 0
while (br >= 0) {
br = out.read(buf)
if (br > 0) { ostream.write(buf, 0, br) }
}
out.close()
},
// ignore stderr
{ err => () }
)
// run the command with the IO object:
val cmd = List("awk", "{ print $1 + $2 }")
val proc = cmd.run(io)
// wrap the raw streams in formatted IO objects:
val procO = new java.io.BufferedReader(new java.io.InputStreamReader(procOutput))
val procI = new java.io.PrintWriter(procInput, true)
(procI, procO)
}
}
Here's a short example of using the input and output objects. Note that it's hard to guarantee that the process will receive its input until you close the input streams/objects, since everything is piped, buffered, etc.
scala> :load /home/eje/scala/input2proc.scala
Loading /home/eje/scala/input2proc.scala...
defined module demo
scala> val (procI, procO) = demo.getIO
procI: java.io.PrintWriter = java.io.PrintWriter@7e809b79
procO: java.io.BufferedReader = java.io.BufferedReader@5cc126dc
scala> procI.println("1 2")
scala> procI.println("3 4")
scala> procI.println("5 6")
scala> procI.close()
scala> procO.readLine
res4: String = 3
scala> procO.readLine
res5: String = 7
scala> procO.readLine
res6: String = 11
scala>
In general, if you are managing both input and output simultaneously in the same thread, there is the potential for deadlock, since either read or write can block waiting for the other. It is safest to run the input logic and the output logic in their own threads. With these threading concerns in mind, it is also possible to just put the input and output logic directly into the definitions { in => ... } and { out => ... }, as these are both run in separate threads automatically.
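For instance, here is a sketch of that last variant, with the awk example's IO done directly inside the two handlers (each handler runs on its own thread):
import scala.sys.process._

val io = new ProcessIO(
  in => { // runs on its own thread: write the input lines, then close stdin
    val writer = new java.io.PrintWriter(in)
    List("1 2", "3 4", "5 6").foreach(writer.println)
    writer.close()
  },
  out => { // runs on its own thread: print each sum as it arrives
    scala.io.Source.fromInputStream(out).getLines().foreach(line => println(s"sum: $line"))
  },
  _ => () // ignore stderr
)
val code = List("awk", "{ print $1 + $2 }").run(io).exitValue()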
I haven't actually tried this, but the documentation says that you can use an instance of ProcessIO to handle the Process's input and output in a manner similar to what you would do in Java.
var outPutStream: Option[OutputStream] = None
val io = new ProcessIO(
{ outputStream =>
outPutStream = Some(outputStream)
},
Source.fromInputStream(_).getLines().foreach(println),
Source.fromInputStream(_).getLines().foreach(println)
)
command run io
val out = outPutStream.get
out.write("test".getBytes())
You can get an InputStream in the same way.

Writing data generated in scala to a text file

I was hoping somebody could help; I'm new to Scala and I'm having some issues writing my output to a text file.
I have a data table and I've written some code to read it in one line at a time, do what I want it to do, and now I need it to write that line to a text file.
So, for example, I have the following table of data:
Name, Date, goX, goY, stopX, stopY
1, 12/01/01, 1166, 2299, 3300, 4477
My code takes the first characters of goX and goY and creates a new number, in this instance 1.2, and does the same for stopX and stopY, so in this case you get 3.4.
What I want to get in the text file is essentially the following:
go, stop
1.2, 3.4
and I want it to go through hundreds of lines doing this until I have a long list of go and stop values in the text file.
My current code is as follows; it is almost certainly not the most elegant solution, but it is my first ever Scala/Java code:
import scala.io.Source
object FT2 extends App {
for(line<-Source.fromFile("C://Users//Data.csv").getLines){
var array = line.split(",")
val gox = (array(2));
val xStringGo = gox.toString
val goX =xStringGo.dropRight(1|2)
val goy = (array(3));
val yStringGo = goy.toString
val goY = yStringGo.dropRight(1|2)
val goXY = goX+"."+goY
val stopx = (array(4));
val xStringStop = stopx.toString
val stopX =xStringStop.dropRight(1|2)
val stopy = (array(5));
val yStringStop = stopy.toString
val stopY = yStringStop.dropRight(1|2)
val stopXY = stopX+"."+stopY
val GoStop = List(goXY,stopXY)
//This is where I want to print GoStop to a text file
}
}
Any help is much appreciated!
This should do it:
import java.io._
val data = List("everything", "you", "want", "to", "write", "to", "the", "file")
val file = "whatever.txt"
val writer = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(file)))
for (x <- data) {
writer.write(x + "\n") // however you want to format it
}
writer.close()
But you can make it a little nicer by creating a method that will automatically close stuff for you:
def using[T <: Closeable, R](resource: T)(block: T => R): R = {
try { block(resource) }
finally { resource.close() }
}
using(new BufferedWriter(new OutputStreamWriter(new FileOutputStream(file)))) {
writer =>
for (x <- data) {
writer.write(x + "\n") // however you want to format it
}
}
So:
using(new BufferedWriter(new OutputStreamWriter(new FileOutputStream("output.txt")))) {
writer =>
for(line <- io.Source.fromFile("input.txt").getLines) {
writer.write(line + "\n") // however you want to format it
}
}
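Putting it together for the original question, a sketch might look like this (the output file name is made up, and the take(1) parsing assumes the coordinates are four-digit values as in the sample row; it reuses the using helper and the imports from above):
using(new BufferedWriter(new OutputStreamWriter(new FileOutputStream("C://Users//GoStop.txt")))) {
  writer =>
    writer.write("go, stop\n")
    for (line <- scala.io.Source.fromFile("C://Users//Data.csv").getLines.drop(1)) { // skip the header row
      val array = line.split(",").map(_.trim)
      val goXY = array(2).take(1) + "." + array(3).take(1) // e.g. 1166, 2299 -> "1.2"
      val stopXY = array(4).take(1) + "." + array(5).take(1) // e.g. 3300, 4477 -> "3.4"
      writer.write(s"$goXY, $stopXY\n")
    }
}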

Creating serializable objects from Scala source code at runtime

To embed Scala as a "scripting language", I need to be able to compile text fragments to simple objects, such as Function0[Unit] that can be serialised to and deserialised from disk and which can be loaded into the current runtime and executed.
How would I go about this?
Say for example, my text fragment is (purely hypothetical):
Document.current.elements.headOption.foreach(_.open())
This might be wrapped into the following complete text:
package myapp.userscripts
import myapp.DSL._
object UserFunction1234 extends Function0[Unit] {
def apply(): Unit = {
Document.current.elements.headOption.foreach(_.open())
}
}
What comes next? Should I use IMain to compile this code? I don't want to use the normal interpreter mode, because the compilation should be "context-free" and not accumulate requests.
What I need to get hold of from the compilation is, I guess, the binary class file? In that case, serialisation is straightforward (a byte array). How would I then load that class into the runtime and invoke the apply method?
What happens if the code compiles to multiple auxiliary classes? The example above contains a closure _.open(). How do I make sure I "package" all those auxiliary things into one object to serialize and class-load?
Note: Given that Scala 2.11 is imminent and the compiler API probably changed, I am happy to receive hints as how to approach this problem on Scala 2.11
Here is one idea: use a regular Scala compiler instance. Unfortunately it seems to require the use of hard disk files both for input and output. So we use temporary files for that. The output will be zipped up in a JAR which will be stored as a byte array (that would go into the hypothetical serialization process). We need a special class loader to retrieve the class again from the extracted JAR.
The following assumes Scala 2.10.3 with the scala-compiler library on the class path:
import scala.tools.nsc
import java.io._
import scala.annotation.tailrec
Wrapping user provided code in a function class with a synthetic name that will be incremented for each new fragment:
val packageName = "myapp"
var userCount = 0
def mkFunName(): String = {
val c = userCount
userCount += 1
s"Fun$c"
}
def wrapSource(source: String): (String, String) = {
val fun = mkFunName()
val code = s"""package $packageName
|
|class $fun extends Function0[Unit] {
| def apply(): Unit = {
| $source
| }
|}
|""".stripMargin
(fun, code)
}
A function to compile a source fragment and return the byte array of the resulting jar:
/** Compiles source code consisting of a body which is wrapped in a `Function0`
* apply method, and returns the function's class name (without package) and the
* raw jar file produced in the compilation.
*/
def compile(source: String): (String, Array[Byte]) = {
val set = new nsc.Settings
val d = File.createTempFile("temp", ".out")
d.delete(); d.mkdir()
set.d.value = d.getPath
set.usejavacp.value = true
val compiler = new nsc.Global(set)
val f = File.createTempFile("temp", ".scala")
val out = new BufferedOutputStream(new FileOutputStream(f))
val (fun, code) = wrapSource(source)
out.write(code.getBytes("UTF-8"))
out.flush(); out.close()
val run = new compiler.Run()
run.compile(List(f.getPath))
f.delete()
val bytes = packJar(d)
deleteDir(d)
(fun, bytes)
}
def deleteDir(base: File): Unit = {
base.listFiles().foreach { f =>
if (f.isFile) f.delete()
else deleteDir(f)
}
base.delete()
}
Note: Doesn't handle compiler errors yet!
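One way to surface failures is to check the compiler's reporter after the run; with the default settings Global uses a ConsoleReporter, so the error messages themselves go to the console. A sketch of how that could slot into the compile method above:
val run = new compiler.Run()
run.compile(List(f.getPath))
if (compiler.reporter.hasErrors) { // set by the reporter during the run
  f.delete(); deleteDir(d)
  sys.error("Compilation of the wrapped fragment failed")
}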
The packJar method uses the compiler output directory and produces an in-memory jar file from it:
// cf. http://stackoverflow.com/questions/1281229
def packJar(base: File): Array[Byte] = {
import java.util.jar._
val mf = new Manifest
mf.getMainAttributes.put(Attributes.Name.MANIFEST_VERSION, "1.0")
val bs = new java.io.ByteArrayOutputStream
val out = new JarOutputStream(bs, mf)
def add(prefix: String, f: File): Unit = {
val name0 = prefix + f.getName
val name = if (f.isDirectory) name0 + "/" else name0
val entry = new JarEntry(name)
entry.setTime(f.lastModified())
out.putNextEntry(entry)
if (f.isFile) {
val in = new BufferedInputStream(new FileInputStream(f))
try {
val buf = new Array[Byte](1024)
@tailrec def loop(): Unit = {
val count = in.read(buf)
if (count >= 0) {
out.write(buf, 0, count)
loop()
}
}
loop()
} finally {
in.close()
}
}
out.closeEntry()
if (f.isDirectory) f.listFiles.foreach(add(name, _))
}
base.listFiles().foreach(add("", _))
out.close()
bs.toByteArray
}
A utility function that takes the byte array found in deserialization and creates a map from class names to class byte code:
def unpackJar(bytes: Array[Byte]): Map[String, Array[Byte]] = {
import java.util.jar._
import scala.annotation.tailrec
val in = new JarInputStream(new ByteArrayInputStream(bytes))
val b = Map.newBuilder[String, Array[Byte]]
@tailrec def loop(): Unit = {
val entry = in.getNextJarEntry
if (entry != null) {
if (!entry.isDirectory) {
val name = entry.getName
// cf. http://stackoverflow.com/questions/8909743
val bs = new ByteArrayOutputStream
var i = 0
while (i >= 0) {
i = in.read()
if (i >= 0) bs.write(i)
}
val bytes = bs.toByteArray
b += mkClassName(name) -> bytes
}
loop()
}
}
loop()
in.close()
b.result()
}
def mkClassName(path: String): String = {
require(path.endsWith(".class"))
path.substring(0, path.length - 6).replace("/", ".")
}
A suitable class loader:
class MemoryClassLoader(map: Map[String, Array[Byte]]) extends ClassLoader {
override protected def findClass(name: String): Class[_] =
map.get(name).map { bytes =>
println(s"defineClass($name, ...)")
defineClass(name, bytes, 0, bytes.length)
} .getOrElse(super.findClass(name)) // throws exception
}
And a test case which contains additional classes (closures):
val exampleSource =
"""val xs = List("hello", "world")
|println(xs.map(_.capitalize).mkString(" "))
|""".stripMargin
def test(fun: String, cl: ClassLoader): Unit = {
val clName = s"$packageName.$fun"
println(s"Resolving class '$clName'...")
val clazz = Class.forName(clName, true, cl)
println("Instantiating...")
val x = clazz.newInstance().asInstanceOf[() => Unit]
println("Invoking 'apply':")
x()
}
locally {
println("Compiling...")
val (fun, bytes) = compile(exampleSource)
val map = unpackJar(bytes)
println("Classes found:")
map.keys.foreach(k => println(s" '$k'"))
val cl = new MemoryClassLoader(map)
test(fun, cl) // should call `defineClass`
test(fun, cl) // should find cached class
}

Scala Process - Capture Standard Out and Exit Code

I'm working with the Scala scala.sys.process library.
I know that I can capture the exit code with ! and the output with !! but what if I want to capture both?
I've seen this answer https://stackoverflow.com/a/6013932/416338 which looks promising, but I'm wondering if there is a one liner and I'm missing something.
I have the following utility method for running commands:
import sys.process._
import java.io.{ByteArrayOutputStream, PrintWriter}
def runCommand(cmd: Seq[String]): (Int, String, String) = {
val stdoutStream = new ByteArrayOutputStream
val stderrStream = new ByteArrayOutputStream
val stdoutWriter = new PrintWriter(stdoutStream)
val stderrWriter = new PrintWriter(stderrStream)
val exitValue = cmd.!(ProcessLogger(stdoutWriter.println, stderrWriter.println))
stdoutWriter.close()
stderrWriter.close()
(exitValue, stdoutStream.toString, stderrStream.toString)
}
As you can see, it captures stdout, stderr and result code.
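For example (assuming ls is available on your system):
val (exitCode, stdout, stderr) = runCommand(Seq("ls", "-l", "/tmp"))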
You can use ProcessIO. I needed something like that in a Specs2 Test, where I had to check the exit value as well as the output of a process depending on the input on stdin (in and out are of type String):
"the operation" should {
f"return '$out' on input '$in'" in {
var res = ""
val io = new ProcessIO(
stdin => { stdin.write(in.getBytes)
stdin.close() },
stdout => { res = convertStreamToString(stdout)
stdout.close() },
stderr => { stderr.close() })
val proc = f"$operation $file".run(io)
proc.exitValue() must be_==(0)
res must be_==(out)
}
}
I figured that might help you. In the example I am ignoring whatever comes from stderr.
You can specify an output stream that catches the text:
import sys.process._
val os = new java.io.ByteArrayOutputStream
val code = ("volname" #> os).!
os.close()
val opt = if (code == 0) Some(os.toString("UTF-8")) else None
The one-line-ish use of BasicIO or ProcessLogger is appealing.
scala> val sb = new StringBuffer
sb: StringBuffer =
scala> ("/bin/ls /tmp" run BasicIO(false, sb, None)).exitValue
res0: Int = 0
scala> sb
res1: StringBuffer = ...
or
scala> import collection.mutable.ListBuffer
import collection.mutable.ListBuffer
scala> val b = ListBuffer[String]()
b: scala.collection.mutable.ListBuffer[String] = ListBuffer()
scala> ("/bin/ls /tmp" run ProcessLogger(b append _)).exitValue
res4: Int = 0
scala> b mkString "\n"
res5: String = ...
Depending on what you mean by capture, perhaps you're interested in output unless the exit code is nonzero. In that case, handle the exception.
scala> val re = "Nonzero exit value: (\\d+)".r.unanchored
re: scala.util.matching.UnanchoredRegex = Nonzero exit value: (\d+)
scala> Try ("./bomb.sh" !!) match {
| case Failure(f) => f.getMessage match {
| case re(x) => println(s"Bad exit $x")
| }
| case Success(s) => println(s)
| }
warning: there were 1 feature warning(s); re-run with -feature for details
Bad exit 3
The response provided by 'Alex Cruise' in your link is fairly concise, barring poorer performance.
You could extend sys.process.ProcessLogger to manage the
var out = List[String]()
var err = List[String]()
internally, with getters for the out.reverse and err.reverse results.
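A minimal sketch of that idea (the class name BufferedLogger and its accessors are made up):
import scala.sys.process._

class BufferedLogger extends ProcessLogger {
  private var outLines = List[String]()
  private var errLines = List[String]()
  def out(s: => String): Unit = outLines ::= s // prepend, reverse on read
  def err(s: => String): Unit = errLines ::= s
  def buffer[T](f: => T): T = f
  def stdout: List[String] = outLines.reverse
  def stderr: List[String] = errLines.reverse
}

val logger = new BufferedLogger
val code = "ls /tmp".!(logger)
(code, logger.stdout, logger.stderr)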
Here's a really simple Scala wrapper that allows you to retrieve stdout, stderr and exit code.
import scala.sys.process._
case class ProcessInfo(stdout: String, stderr: String, exitCode: Int)
object CommandRunner {
def runCommandAndGetOutput(command: String): ProcessInfo = {
val stdout = new StringBuilder
val stderr = new StringBuilder
val status = command ! ProcessLogger(stdout append _, stderr append _)
ProcessInfo(stdout.toString(), stderr.toString(), status)
}
}
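For example (any command will do; here ls):
val info = CommandRunner.runCommandAndGetOutput("ls /tmp")
println(s"rc=${info.exitCode}, out=${info.stdout}, err=${info.stderr}")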
I combined these and came up with this. The expected RC is there because I have a program I need to run in one project that returns 1 when it works. This does depend on the text of the exception, but it will still do something reasonable if that doesn't match.
import java.io.IOException
import scala.util.matching.Regex
import scala.sys.process._

private val ProcessErrorP: Regex = "(.*): error=(\\d+),(.*)".r.unanchored
case class ProcessInfo(stdout: String, stderr: String, exitCode: Int, private val expectedRc: Int) {
def succeeded: Boolean = exitCode == expectedRc
def failed: Boolean = !succeeded
def asOpt: Option[String] = if (succeeded) None else Some(stderr)
}
/**
* Run a simple command
* @param command -- what to run
* @return -- what happened
*/
def run(command: String, expectedRc: Int = 0): ProcessInfo = {
try {
val stdout = new StringBuilder
val stderr = new StringBuilder
val status = command ! ProcessLogger(stdout append _, stderr append _)
ProcessInfo(stdout.toString(), stderr.toString(), status, expectedRc)
} catch {
case io: IOException =>
val dm = io.getMessage
dm match {
case ProcessErrorP(message, code, reason) =>
ProcessInfo("", s"$message, $reason", code.toInt, expectedRc)
case m: String =>
ProcessInfo("", m, 999, expectedRc)
}
}
}
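usage, e.g. (a command that cannot be found ends up in the IOException branch):
val ok = run("ls /tmp") // exitCode 0, succeeded == true with the default expectedRc
val missing = run("no-such-command") // IOException caught; the exit code is parsed out of the message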