SBT & Scala AudioSystem

I have a strange problem that I've been trying to figure out for a while now.
I have an application that I am building with Scala & Spray and which uses the AudioSystem API.
I build and test the application using SBT.
I have a boot.scala which extends "App".
If I place the following code in boot.scala and run it through Eclipse without SBT (Run As... Scala Application), it runs fine...
val stream:AudioInputStream = AudioSystem.getAudioInputStream(new File("test.wav"))
val audioFormat:AudioFormat = stream.getFormat();
val samplingRate = audioFormat.getSampleRate()
println("Sampling Rate: "+samplingRate)
The sampling rate of the file is output as expected.
I have the same code in a Specs2 Route test similar to...
"API" should {
"Respond to POST requests" in {
val stream:AudioInputStream = AudioSystem.getAudioInputStream(new File("test.wav"))
val audioFormat:AudioFormat = stream.getFormat();
val samplingRate = audioFormat.getSampleRate()
println("Sampling Rate: "+samplingRate)
...
However, when I execute this from a terminal using "sbt test", I get the following error...
UnsupportedAudioFileException: : could not get audio input stream from input file
I know the file (test.wav) is fine, as I can play it and the code works when run through Eclipse. The terminal (and its encodings) seem fine too: I put together a standalone test file that runs the same few lines of code and ran it from a terminal successfully.
The problem seems to only occur with SBT!
Anyone got any ideas?
Thanks

Finally found the answer after literally days of searching...
Why does AudioSystem.getMixerInfo() return different results under sbt vs Scala?
"This is a classloader issue. javax.sound does NOT like having the context classloader be anything other than the system classloader." and the fix for me was...
val cl = classOf[javax.sound.sampled.AudioSystem].getClassLoader
val old = Thread.currentThread.getContextClassLoader
var audioFormat: AudioFormat = null
try {
  Thread.currentThread.setContextClassLoader(cl)
  val stream: AudioInputStream =
    AudioSystem.getAudioInputStream(new ByteArrayInputStream(data))
  audioFormat = stream.getFormat()
} finally Thread.currentThread.setContextClassLoader(old)
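For comparison, here is the same workaround wrapped around the file-based snippet from the question (a sketch; withAudioSystemClassLoader is just an illustrative helper name):
import java.io.File
import javax.sound.sampled.AudioSystem

// Illustrative helper: run a block with javax.sound's own classloader installed
// as the context classloader, restoring the previous one afterwards.
def withAudioSystemClassLoader[T](body: => T): T = {
  val cl = classOf[AudioSystem].getClassLoader
  val old = Thread.currentThread.getContextClassLoader
  try {
    Thread.currentThread.setContextClassLoader(cl)
    body
  } finally Thread.currentThread.setContextClassLoader(old)
}

val samplingRate = withAudioSystemClassLoader {
  AudioSystem.getAudioInputStream(new File("test.wav")).getFormat().getSampleRate()
}
println("Sampling Rate: " + samplingRate)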
Thanks

Related

Content not saved in file while spark job running on cluster (yarn)

I am facing a weird issue and I've been stuck on it for a while.
Basically, in my Spark application I have a method where I create a file in the MapR FS and save some content in it.
The method looks like this:
def collector(value: String): Unit = {
  val conf = new Configuration()
  val fs = FileSystem.get(conf)
  val path = getConfig("path")
  Try {
    fs.create(new Path(path + s"${scala.util.Random.alphanumeric take 10 mkString}" +
        Calendar.getInstance.getTimeInMillis))
      .write(value.getBytes)
    fs.close()
  }.getOrElse(logger.warn("File not saved"))
}
This method is called from a different object. When I run it with --master local[*], the file is created in the expected location in the FS, containing the string that is passed in value. However, when I run it on the cluster (--master yarn), only an empty file is saved. When I print value, it prints the string, but for some reason the string is not saved in the file.
I wonder if anybody has any idea why?
Thanks
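One thing worth double-checking (a guess, not something confirmed above): fs.create returns an FSDataOutputStream that is never closed in the snippet, so buffered bytes may never reach the file, and fs.close() closes the shared, cached FileSystem handle rather than the stream. A sketch of the same collector that closes the stream instead (getConfig and logger are the same helpers used in the original snippet):
import java.util.Calendar
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import scala.util.Try

def collector(value: String): Unit = {
  val conf = new Configuration()
  val fs = FileSystem.get(conf)
  val path = getConfig("path")
  Try {
    val out = fs.create(new Path(path + s"${scala.util.Random.alphanumeric take 10 mkString}" +
        Calendar.getInstance.getTimeInMillis))
    out.write(value.getBytes)
    out.close() // flush and close the stream; leave the cached FileSystem alone
  }.getOrElse(logger.warn("File not saved"))
}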

Store execution plan of Spark's dataframe

I am currently trying to store the execution plan of a Spark dataframe into HDFS (through the dataframe.explain(true) command).
The issue I am finding is that when I use the explain(true) command, I am able to see the output on the command line and in the logs; however, if I create a file (let's say a .txt) with the content of the dataframe's explain, the file appears empty.
I believe the issue relates to the configuration of Spark, but I am unable to find any information about this on the internet.
(For those who want to see more about the execution plan of dataframes using the explain function, please refer to https://jaceklaskowski.gitbooks.io/mastering-apache-spark/spark-sql-dataset-operators.html#explain)
if I create a file (let's say a .txt) with the content of the dataframe's explain
How exactly did you try to achieve this?
explain writes its result to the console using println and returns Unit, as can be seen in Dataset.scala:
def explain(extended: Boolean): Unit = {
  val explain = ExplainCommand(queryExecution.logical, extended = extended)
  sparkSession.sessionState.executePlan(explain).executedPlan.executeCollect().foreach {
    // scalastyle:off println
    r => println(r.getString(0))
    // scalastyle:on println
  }
}
So, unless you redirect the console output to write to your file (along with anything else printed to the console...), you won't be able to write explain's output to a file.
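A small sketch of that redirection (my own example, not from the answer): Predef.println goes through scala.Console, so Console.withOut can capture what explain(true) prints on the driver and turn it into a String that you can then write wherever you like. On recent Spark versions, df.queryExecution.toString is another way to get the plans as a String.
import java.io.{ByteArrayOutputStream, PrintStream}

// Capture explain(true)'s console output into a String.
// Assumes `df` is the DataFrame/Dataset whose plan you want to keep.
val buffer = new ByteArrayOutputStream()
Console.withOut(new PrintStream(buffer)) {
  df.explain(true) // the println output now lands in `buffer`
}
val planText = buffer.toString("UTF-8")
// planText can now be written to HDFS, a local file, a log, etc.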
The best way I have found is to redirect the output to a file when you run the job. I have used the following command:
spark-shell --master yarn -i test.scala > getlogs.log
My Scala file has the following simple commands:
val df = sqlContext.sql("SELECT COUNT(*) FROM testtable")
df.explain(true)
exit()

How to get all projects in an SBT task?

I'm writing an SBT task which will get all projects of the whole build, so that I can run some tasks against them.
The pseudo code is like:
val projects = someTaskToGetProjects.value
val updateReports = projects.map(p => (update in p).value)
But I can't find any task or setting to get the project list. How can I do it?
I think buildDependencies might suit your needs, otherwise loadedBuild has everything.
val projects = buildDependencies.value.classpath.keys
val updateReports = projects.map(p => (update in p).value)
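Note that .value cannot actually be evaluated inside a runtime map like that (it is a macro that has to appear literally in the task body), so a ScopeFilter is the usual way to run a task across every project. A sketch for build.sbt (sbt 0.13+/1.x syntax; updateAll is just an illustrative key name):
// Matches every project in the build.
lazy val everyProject = ScopeFilter(inAnyProject)

lazy val updateAll = taskKey[Seq[UpdateReport]]("Run update in every project of the build")

updateAll := {
  // .all evaluates the task in each scope matched by the filter and collects the results.
  update.all(everyProject).value
}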

Read property file under classpath using scala

I am trying to read a property file from the classpath using Scala, but it doesn't work; it seems to behave differently from Java. Of the following two code snippets, the Java one works and the Scala one does not. I don't understand what the difference is.
// working
BufferedReader reader = new BufferedReader(new InputStreamReader(
    Test.class.getResourceAsStream("conf/fp.properties")));
// not working
val reader = new BufferedReader(new InputStreamReader(
    getClass.getResourceAsStream("conf/fp.properties")));
Exception in thread "main" java.lang.NullPointerException
at java.io.Reader.<init>(Reader.java:78)
at java.io.InputStreamReader.<init>(InputStreamReader.java:72)
at com.ebay.searchscience.searchmetrics.fp.conf.FPConf$.main(FPConf.scala:31)
at com.ebay.searchscience.searchmetrics.fp.conf.FPConf.main(FPConf.scala)
This code finally worked for me:
import java.util.Properties
import scala.io.Source

// ... somewhere inside the module
var properties: Properties = null
val url = getClass.getResource("/my.properties")
if (url != null) {
  val source = Source.fromURL(url)
  properties = new Properties()
  properties.load(source.bufferedReader())
}
And now you have a plain old java.util.Properties to work with, which is what my legacy code actually needed to receive.
I am guessing that your BufferedReader is a java.io.BufferedReader.
In that case you could simply do the following:
import scala.io.Source.fromURL
val reader = fromURL(getClass.getResource("conf/fp.properties")).bufferedReader()
However, this leaves open the question of what you are planning to do with the reader afterwards. scala.io.Source already has some useful methods that might make lots of your code superfluous; see the ScalaDoc.
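If the end goal is still a java.util.Properties instance, the reader can be fed straight into Properties.load (a sketch; the leading slash makes the lookup relative to the classpath root rather than to this class's package, and "some.key" is a made-up key):
import java.util.Properties
import scala.io.Source

val props = new Properties()
props.load(Source.fromURL(getClass.getResource("/conf/fp.properties")).bufferedReader())
val someValue = props.getProperty("some.key")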
My preferred solution uses Typesafe Config (the com.typesafe.config library). I put an application.conf file in the main/resources folder, with content like:
services {
  mongo-db {
    retrieve = """http://xxxxxxxxxxxx""",
    base = """http://xxxxxx"""
  }
}
and then, to use it in a class, first load the config via Typesafe's ConfigFactory and then just use it:
val conf = com.typesafe.config.ConfigFactory.load()
conf.getString("services.mongo-db.base")
Hope it helps!
P.S. Note that ConfigFactory.load() picks up application.conf (plus any reference.conf files) from the classpath by default, rather than every .conf file under resources.
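For completeness, ConfigFactory comes from the Typesafe Config library; a build.sbt line along these lines pulls it in (the version number is only illustrative):
// build.sbt
libraryDependencies += "com.typesafe" % "config" % "1.4.2"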
For reading a properties file I'd recommend using java.util.ResourceBundle.getBundle("conf/fp"); it makes life a little easier.
The NullPointerException you are seeing comes from the resource lookup returning null; it could be caused by a mistyped file name.
You can also get this error if you're trying to load the resource with the wrong classloader.
Check the resource URL carefully against your classpath.
Try Source.fromInputStream(getClass.getResourceAsStream(...))
Try Source.fromInputStream(getClass.getClassLoader.getResourceAsStream(...))
Maybe there are other classloaders you can try?
The same story goes for Source.fromURL(...)
If you're trying to load configuration files and you control their format, you should have a look at Typesafe's Config utility.
The NullPointerException you are getting comes from getResourceAsStream returning null. The following JUnit (Scala) snippet shows the difference between Class and ClassLoader resource lookup; see What is the difference between Class.getResource() and ClassLoader.getResource()?. Here I assume fileName is the name of a file residing on the classpath, but not a file next to the class running the test.
assertTrue(getClass.getClassLoader().getResourceAsStream(fileName) != null)
assertTrue(getClass.getClassLoader().getResourceAsStream("/" + fileName) == null)
assertTrue(getClass.getResourceAsStream(fileName) == null)
assertTrue(getClass.getResourceAsStream("/" + fileName) != null)
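Applied to the resource from the original question, the difference looks like this (a sketch):
// Class#getResourceAsStream resolves relative to the class's package unless the
// path starts with "/"; ClassLoader#getResourceAsStream always resolves from the
// classpath root and must not start with "/".
val viaClass = getClass.getResourceAsStream("/conf/fp.properties")
val viaClassLoader = getClass.getClassLoader.getResourceAsStream("conf/fp.properties")
// Both should yield the same non-null stream if conf/fp.properties sits at the
// root of the classpath (e.g. under src/main/resources).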

Start external console application from Scala in interactive mode

I am using scala.sys.process to start an external console application from within my Scala code. However, I hit a roadblock when the console app requires user input.
Basically, when I start the console app with
Seq("powershell" , "myConsoleApp.exe").run
myConsoleApp.exe will not be started in its own "window". I can see the console app is running when I check the Task Manager. Without an actual window, I can't really key in anything.
I know Scala can return the program output as a String or a Stream[String]; I guess Scala can probably pipe input to the external process as well.
But I really don't want to re-write such logic in Scala when all of them are already available in the external program.
I am wondering whether there is a way to start an external console program in its own window, or is this a shortcoming of scala.sys.process?
Thanks,
Adapted from this Java answer: Show the CMD window with Java
import scala.sys.process._
Seq("cmd", "/c", "start", "PowerShell.exe", "myConsoleApp.exe") run
After some more googling, I found that my problem is more about the way I call PowerShell.
Here is a solution that works for me:
Seq("powershell", "Start-Process", "myConsoleApp.exe")
This will run interactively from the Scala console; copy it and use :paste
val con = System.console
new java.lang.Thread() {
  val in = new java.lang.Thread() {
    override def run() {
      while (true) {
        Thread.sleep(1)
        if (con.reader.ready)
          con.reader.read()
      }
    }
  }
  override def run() {
    in.start()
    while (true) {
      Thread.sleep(1000)
      con.printf("\nHai")
    }
  }
}.start()
