I'm using IntelliJ IDEA Ultimate 16.1.2 and Scala 2.12. When I set breakpoints inside foreach loops, they don't get hit.
Breakpoints above the foreach work and get the tick mark (valid breakpoint), but the ones inside the foreach don't get the tick and the program doesn't break there either.
I tried invalidating the IntelliJ caches, restarting my PC and IntelliJ, and rebuilding the Maven project - nothing worked.
These are the VM parameters I'm using, but I've also tried it without any, which didn't help:
-XX:+UnlockCommercialFeatures
-XX:+FlightRecorder
-Dcom.sun.management.jmxremote
-XX:StartFlightRecording=filename=recording.jfr
-server
-Xms1G
-Xmx4G
-XX:+UseG1GC
-XX:+UseStringDeduplication
Thanks for your help!
Here is my code:
class RunTestCasesAction extends AbstractAction {
  def actionPerformed(e: ActionEvent) = {
    val parent = methodToGetParentComponent() // breakpoint works
    getFileName(parent).foreach { testFileName =>
      val dialog = new SomeDialog() // breakpoint doesn't work
    }
  }

  private def getFileName(parent: Component): Option[String] = {
    val baseDir = getExportDir
    val fc = new JFileChooser(baseDir)
    val rc = fc.showDialog(null, "Select test file")
    if (rc == JFileChooser.APPROVE_OPTION) Some(fc.getSelectedFile.toString) else None
  }
}
What you are doing is creating new objects and assigning them to a local val inside a foreach loop.
This has no effect on any other part of the program.
The compiler optimizes your code, so this code block is never actually executed: the val dialog = new SomeDialog() objects are created inside the foreach and immediately become garbage, because their scope is only the body of the foreach loop, so there is no point in keeping that code.
If you want to step into the loop, put something there that matters and is reusable - e.g. you can collect these objects, as in the sketch below.
Make sure the compiler will not optimize your code away like this.
You could probably also try some compiler optimization flags, but I am not sure about that.
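For example, a rough sketch reusing the names from your snippet (SomeDialog, methodToGetParentComponent, and getFileName are taken from the question; setVisible is only an assumption about what SomeDialog offers):
def actionPerformed(e: ActionEvent) = {
  val parent = methodToGetParentComponent()
  // map instead of foreach: the created dialog is collected and used afterwards,
  // so the body has an observable effect and a breakpoint inside it is meaningful
  val dialog = getFileName(parent).map { testFileName =>
    new SomeDialog()
  }
  dialog.foreach(_.setVisible(true))
}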
Here is some information on how debuggers work.
Related
I'm trying to figure out how to pass args to this Scala object:
I have this object at the following path in my sbt project: allaboutscala/src/main/scala/gzip_practice/gzipwriter
package gzip_practice

import java.io._
import java.util.zip._

/** Gzcat */
object gzcat extends App {
  private val buf = new Array[Byte](1024)
  try {
    for (path <- args) {
      try {
        val in = new GZIPInputStream(new FileInputStream(path))
        var n = in.read(buf)
        while (n >= 0) {
          System.out.write(buf, 0, n)
          n = in.read(buf)
        }
      } catch {
        case _: FileNotFoundException =>
          System.err.printf("File Not Found: %s", path)
        case _: SecurityException =>
          System.err.printf("Permission Denied: %s", path)
      }
    }
  } finally {
    System.out.flush()
  }
}
This is an sbt project called allaboutscala. I am trying to run it with:
scala src/main/scala/gzip_practice/gzipwriter.scala "hi"
but the command just hangs and I don't know why.
How am I supposed to run this object constructor with args?
You can use the scala command as a script runner.
Normally, it will wrap your "script" code in a main method.
But if you have an object with a main method, like your App, it will use that for the entry point.
However, it doesn't like package statements in the script.
If you comment out your package statement, you can compile and run with:
scala -nc somefile.scala myarg.gz
-nc means "no compile daemon"; otherwise, it will start a second process to compile scripts, so that subsequent compiles go faster; but it is a brittle workflow and I don't recommend it.
I confirmed that your code works.
Usually, folks use sbt or an IDE to compile and package in a jar to run with scala myapp.jar.
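For example (the jar name below is only illustrative; it depends on your project name, version, and Scala version, and this assumes sbt can determine the main class for the jar manifest):
sbt package
scala target/scala-2.12/allaboutscala_2.12-0.1.jar myarg.gz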
An object is a static instance of a class; it cannot have its own constructor parameters. With
object gzcat extends App {
...
}
the command-line arguments are bound as args within the object gzcat.
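A minimal sketch of how the args provided by the App trait can be used (printing instead of the real gzip logic, just for illustration):
object gzcat extends App {
  // args: Array[String] is supplied by the App trait at runtime
  args.foreach(path => println(s"would process: $path"))
}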
Are you trying to run it with the REPL? I would suggest running it with sbt instead; you can run sbt projects from the project root directory with command-line parameters as follows:
sbt "run file1.txt file2.txt"
The quotes are required. If you leave the sbt shell open, running it will be much faster. Open a shell in the project root with
sbt
In the sbt shell:
run file1.txt file2.txt
Within the sbt shell, no quotes.
I came across an issue earlier where I couldn't run an individual Scala test; it would always try to run all of them, even if I set the run configuration to run just one test. Does anyone know of any settings/configuration I can change to get it to run?
class MyTest extends PlaySpec {
  val setTo = new AfterWord("set to")

  "Setting" when setTo {
    "value a" in {
      //test stuff
    }
    "value b" in {
      //test stuff
    }
  }
}
Turns out it was the use of the AfterWord that was messing up my test; once I removed it, the tests ran fine. I'm not sure why they're incompatible, but if you want to run individual tests, don't use an AfterWord.
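For reference, a rough sketch of the same spec without the AfterWord (test bodies elided as in the question); folding the "set to" wording into the test names keeps the grouping readable and lets individual tests run:
class MyTest extends PlaySpec {
  // no AfterWord; the "set to" wording is folded into the test names instead
  "Setting" when {
    "set to value a" in {
      //test stuff
    }
    "set to value b" in {
      //test stuff
    }
  }
}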
My code looks like:
case class SRecord(trialId: String, private var _max: Int) {
  def max = _max
  def max_=(value: Int): Unit = _max = value
}
Then later on I apply a function to it:
def groupSummaryRecords(it: Iterator[Option[SRecord]], optionSummary: Option[SRecord]): Option[SRecord] = {
  var max = 0
  var sRecord1: Option[SRecord] = None
  var i = 0
  while (it.hasNext) {
    var sRecord: Option[SRecord] = it.next()
    if (i == 0) {
      sRecord1 = sRecord
    }
    ..
  }
  sRecord1.max = max // getting 'reassignment to val' compilation error
  ..
}
Why am I getting this compilation error, and how do I fix it?
If I instead change the sRecord and sRecord1 instances (and the method signature) to be of type SRecord instead of Option[SRecord], it all works fine, however.
But in some cases I may have a null SRecord, hence the use of None/Some. I am new to Scala; using Option/Some all over feels like a real pain if you ask me. I am just thinking of removing all this Option nonsense and testing for 'null' like in good ol' Java - at least my code would work!
With the line sRecord1.max=max you are trying to call the max method on an Option[SRecord], not an SRecord. You want to access the contained SRecord (if any) and call the method on that, which can be done using foreach:
sRecord1.foreach(_.max=max)
which is desugared to:
sRecord1.foreach( srec => srec.max=max )
(The actual name "srec" is made up; the compiler will assign some internal name, but you get the idea.) If sRecord1 is None, this won't do anything, but if it is Some(srec), the method call will operate on the contained instance.
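A tiny self-contained sketch of this behavior, reusing the SRecord definition from the question:
case class SRecord(trialId: String, private var _max: Int) {
  def max = _max
  def max_=(value: Int): Unit = _max = value
}

object OptionForeachDemo extends App {
  val some: Option[SRecord] = Some(SRecord("t1", 0))
  some.foreach(_.max = 42)   // runs on the contained SRecord
  println(some.map(_.max))   // prints Some(42)

  val none: Option[SRecord] = None
  none.foreach(_.max = 42)   // does nothing, and no NullPointerException
  println(none.map(_.max))   // prints None
}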
Here is my code.
var link = scala.collection.mutable.LinkedHashMap[String, String]()

var fieldTypeMapRDD = fixedRDD.mapPartitionsWithIndex((idx, itr) => itr.map(s => (s(8), s(9))))

fieldTypeMapRDD.foreach { i =>
  println(i)
  link.put(i._1, i._2)
}

println(link.size) // here size is zero
I want to access link outside the loop. Please help.
Why your code cannot work:
Before your foreach task is started, the whole closure of the function inside the foreach block is serialized and sent first to the master, then to each of the workers. This means each of them gets its own instance of the mutable.LinkedHashMap as a copy of link.
During the foreach block, each worker puts its items into its own copy of link.
After your task is done, your local link is still empty, while the non-empty copies sit on each of the worker nodes and are discarded.
The moral is clear: don't use local mutable collections with RDDs. It's just not going to work.
One way to get the whole collection onto the local machine is the collect method.
You can use it as:
val link = fieldTypeMapRDD.collect.toMap
or in case of need to preserve the order:
import scala.collection.immutable.ListMap
val link = ListMap(fieldTypeMapRDD.collect:_*)
But if you are really into mutable collections, you can modify your code a bit. Just change
fieldTypeMapRDD.foreach {
to
fieldTypeMapRDD.toLocalIterator.foreach {
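For example, a sketch of the modified loop with the same names as in the question, assuming the RDD is small enough to iterate over on the driver:
val link = scala.collection.mutable.LinkedHashMap[String, String]()
fieldTypeMapRDD.toLocalIterator.foreach { i =>
  link.put(i._1, i._2) // now runs on the driver, so link really is filled
}
println(link.size) // non-zero, assuming the RDD is non-empty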
See also this question.
Solved. IntelliJ didn't highlight the fact that my imports were incomplete.
Hi,
I have a simple Scala program that I'm trying to develop using jMock. Setting basic expectations works nicely, but for some reason Scala does not understand my attempt to return a value from a mock object. My Maven build spews out the following error:
TestLocalCollector.scala:45: error: not found: value returnValue
one (nodeCtx).getParameter("FilenameRegex"); will( returnValue(regex))
^
And the respective code snippets are
@Before def setUp(): Unit = { nodeCtx = context.mock(classOf[NodeContext]) }
...
// the value to be returned
val regex = ".*\\.data"
...
// setting the expectations
one (nodeCtx).getParameter("FilenameRegex"); will( returnValue(regex))
To me it sounds like Scala is expecting the static jMock method returnValue to be a val? What am I missing here?
Are you sure about the ';'?
one (nodeCtx).getParameter("FilenameRegex") will( returnValue(regex))
might work better.
In this example you see a line like:
expect {
one(blogger).todayPosts will returnValue(List(Post("...")))
}
with the following comment:
Specify what the return value should be in the same expression by defining "will" as Scala infix operator.
In the Java equivalent we would have to make a separate method call (which our favorite IDE may insist on putting on the next line!)
one(blogger).todayPosts; will(returnValue(List(Post("..."))))
^
|
-- semicolon only in the *Java* version
The OP explains it himself:
the returnValue static method was not visible, thus the errors.
And the will method just records an action on the latest mock operation, which is why it can be on the next line or after the semicolon :)
import org.jmock.Expectations
import org.jmock.Expectations._
...
context.checking(
  new Expectations {
    { oneOf(nodeCtx).getParameter("FilenameRegex") will(returnValue(".*\\.data")) }
  }
)