Unexpected lazy value lastNoSuccessVar in trait Parsers - scala

Could you please advise what this lazy value error is all about here?
[ERROR] ## Exception when compiling 284 sources to C:\LocalFolder\dev\project-buckets\OFP\project\target\classes
java.lang.AssertionError: assertion failed:
unexpected lazy value lastNoSuccessVar in trait Parsers <mutable> lazy <expandedname> private[this]
while compiling: C:\LocalFolder\dev\project-buckets\OFP\project\target\generated-sources\scalaxb\com\project\tec\ofp\job\indox\agreement\protocol\xmlprotocol.scala
during phase: mixin
library version: version 2.13.6
compiler version: version 2.13.6
reconstructed args: ...
last tree to typer: Ident(isNotValid)
tree position: line 300 of C:\LocalFolder\dev\project-buckets\OFP\project\src\com\project\tec\ofp\job\indox\agreement\validation\rules\ContentRulesWrapper.scala
tree tpe: runtime.BooleanRef
symbol: variable isNotValid
symbol definition: var isNotValid: runtime.BooleanRef (a TermSymbol)
symbol package: com.project.tec.ofp.job.indox.agreement.validation.rules
symbol owners: variable isNotValid -> method $anonfun$EX_CSA_15$9 -> object ContentRulesWrapper
call site: <$anon: com.project.tec.ofp.job.indox.agreement.protocol.XMLProtocol$DefaultCommusitecofpjobindoxagreementcsa_LinkedIdentifiersFormat> in package protocol in package protocol
== Source file context for tree position ==
297 deliverMap.map(f => {
298 var isNotValid: Boolean = false
299 f._2.foreach(f => if (f == "Refer to Agreement" | f == "") isNotValid = true)
300 if (isNotValid) f._1
301 }).mkString("|")
302
303 val isRaised: Boolean = RulesDef.EX_CSA_15(deliverMap)
scala.reflect.internal.SymbolTable.throwAssertionError(SymbolTable.scala:171)
scala.tools.nsc.transform.Mixin.$anonfun$publicizeTraitMethods$2(Mixin.scala:237)
scala.tools.nsc.transform.Mixin.$anonfun$publicizeTraitMethods$2$adapted(Mixin.scala:233)
scala.reflect.internal.Scopes$Scope.foreach(Scopes.scala:455)
scala.tools.nsc.transform.Mixin.publicizeTraitMethods(Mixin.scala:233)
scala.tools.nsc.transform.Mixin.$anonfun$addMixedinMembers$12(Mixin.scala:404)
scala.tools.nsc.transform.Mixin.$anonfun$addMixedinMembers$12$adapted(Mixin.scala:401)
scala.collection.IterableOnceOps.foreach(IterableOnce.scala:563)
scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:561)
scala.collection.AbstractIterable.foreach(Iterable.scala:919)
scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:889)
scala.tools.nsc.transform.Mixin.addMixedinMembers(Mixin.scala:401)
scala.tools.nsc.transform.Mixin$MixinTransformer.preTransform(Mixin.scala:440)
scala.tools.nsc.transform.Mixin$MixinTransformer.transform(Mixin.scala:650)
scala.tools.nsc.transform.Mixin$MixinTransformer.transform(Mixin.scala:415)
scala.reflect.api.Trees$Transformer.transformTemplate(Trees.scala:2595)
scala.reflect.internal.Trees$ClassDef.$anonfun$transform$2(Trees.scala:361)
scala.reflect.api.Trees$Transformer.atOwner(Trees.scala:2633)
scala.reflect.internal.Trees$ClassDef.transform(Trees.scala:360)
scala.tools.nsc.transform.Mixin$MixinTransformer.transform(Mixin.scala:650)
scala.tools.nsc.transform.Mixin$MixinTransformer.transform(Mixin.scala:415)
scala.reflect.api.Trees$Transformer.$anonfun$transformStats$1(Trees.scala:2622)
scala.reflect.api.Trees$Transformer.transformStats(Trees.scala:2620)
scala.reflect.internal.Trees$PackageDef.$anonfun$transform$1(Trees.scala:342)
scala.reflect.api.Trees$Transformer.atOwner(Trees.scala:2633)
scala.reflect.internal.Trees$PackageDef.transform(Trees.scala:342)
scala.tools.nsc.transform.Mixin$MixinTransformer.transform(Mixin.scala:650)
scala.tools.nsc.ast.Trees$Transformer.transformUnit(Trees.scala:183)
scala.tools.nsc.transform.Transform$Phase.apply(Transform.scala:32)
scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:454)
scala.tools.nsc.Global$GlobalPhase.run(Global.scala:401)
scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1519)
scala.tools.nsc.Global$Run.compileUnits(Global.scala:1503)
scala.tools.nsc.Global$Run.compileSources(Global.scala:1495)
scala.tools.nsc.Global$Run.compileFiles(Global.scala:1609)
xsbt.CachedCompiler0.run(CompilerBridge.scala:163)
xsbt.CachedCompiler0.run(CompilerBridge.scala:134)
xsbt.CompilerBridge.run(CompilerBridge.scala:39)
sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:91)
sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$7(MixedAnalyzingCompiler.scala:192)
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:247)
sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:182)
sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4$adapted(MixedAnalyzingCompiler.scala:163)
sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:239)
sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:163)
sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:210)
sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:528)
sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:528)
sbt.internal.inc.Incremental$.$anonfun$apply$5(Incremental.scala:175)
sbt.internal.inc.Incremental$.$anonfun$apply$5$adapted(Incremental.scala:173)
sbt.internal.inc.Incremental$$anon$2.run(Incremental.scala:459)
sbt.internal.inc.IncrementalCommon$CycleState.next(IncrementalCommon.scala:116)
sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:56)
sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:52)
sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:263)
sbt.internal.inc.Incremental$.$anonfun$incrementalCompile$8(Incremental.scala:414)
sbt.internal.inc.Incremental$.withClassfileManager(Incremental.scala:501)
sbt.internal.inc.Incremental$.incrementalCompile(Incremental.scala:401)
sbt.internal.inc.Incremental$.apply(Incremental.scala:167)
sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:528)
sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:482)
sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:332)
sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:420)
sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:137)
sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:179)
scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:365)
scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:122)
scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:89)
scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:305)
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
org.apache.maven.cli.MavenCli.execute(MavenCli.java:957)
org.apache.maven.cli.MavenCli.doMain(MavenCli.java:289)
org.apache.maven.cli.MavenCli.main(MavenCli.java:193)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 49.546 s
[INFO] Finished at: 2021-11-01T18:57:23+02:00
[INFO] ------------------------------------------------------------------------
---------------------------------------------------

As the comments suggest, more information would help people understand the problem and offer help. I've taken a best guess at the types involved; you could try the following:
val deliverMap: Seq[(String, Seq[String])] = ???

deliverMap.map { f =>
  val isNotValid = f._2.exists(v => v == "Refer to Agreement" || v.isEmpty)
  if (isNotValid) f._1 else ""
}.mkString("|")
Some issues in the code from the error message that immediately jump out:
Use || instead of | for an or expression (|| short-circuits, | does not).
The line if (isNotValid) f._1 doesn't return anything in the case that isNotValid is false, so it needs an else branch. In Scala just about everything is an expression which returns a value. A variant that avoids the empty-string placeholder is sketched below.
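If the intent is simply to keep the keys whose values contain an invalid entry, a collect-based version avoids mapping to "" altogether. This is only a sketch against the Seq[(String, Seq[String])] shape assumed above:

// a minimal sketch, assuming deliverMap: Seq[(String, Seq[String])] as above
val invalidKeys: String = deliverMap
  .collect { case (key, values) if values.exists(v => v == "Refer to Agreement" || v.isEmpty) => key }
  .mkString("|")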

Related

sbt throws AssertionError at compilation with indecipherable error message

I've been getting these weird error messages when trying to recompile my sbt project after making some small changes to the code. Sometimes, depending on the edited code, the error disappears after cleaning the sbt project, but other times the error persists.
One example of how the error appears (but doesn't persist):
Adding this method to the Terrain class and compiling gives an error message that disappears after cleaning the project and recompiling:
def genChunkInputData(p: Vector3i): FloatBuffer = {
  val inputData = MemoryUtil.memAllocFloat(4 * (Chunk.SIZE + 1) * (Chunk.SIZE + 1) * (Chunk.SIZE + 1))
  val r = (0 to Chunk.SIZE + 1)
  for (i <- r; j <- r; k <- r) {
    inputData.put(0.1F + 0.5F * min(i, Chunk.SIZE - i).toFloat / Chunk.SIZE). // RED
      put(0.1F + 0.5F * min(j, Chunk.SIZE - j).toFloat / Chunk.SIZE). // GREEN
      put(0.1F + 0.5F * min(k, Chunk.SIZE - k).toFloat / Chunk.SIZE). // BLUE
      put(isovalue((i + p(2) * Chunk.SIZE).toDouble, (j + p(1) * Chunk.SIZE).toDouble, (k + p(0) * Chunk.SIZE).toDouble)) // ISOVALUE
  }
  inputData.flip()
  inputData
}
Here are some snippets from the error message:
[error] ## Exception when compiling 21 sources to D:\Computer science\Scala\Meandering Depths\target\scala-2.13\classes
[error] java.lang.AssertionError: assertion failed:
[error] List(method apply$mcI$sp, method apply$mcI$sp)
[error] while compiling: D:\Computer science\Scala\Meandering Depths\src\main\scala\game\Terrain.scala
[error] during phase: globalPhase=specialize, enteringPhase=explicitouter
[error] library version: version 2.13.3
[error] compiler version: version 2.13.3
[error] reconstructed args:
and
[error]
[error] last tree to typer: Select(Ident(r), foreach$mVc$sp)
[error] tree position: line 77 of D:\Computer science\Scala\Meandering Depths\src\main\scala\game\Terrain.scala
[error] tree tpe: (f: Int => Unit): Unit
[error] symbol: (final override) method foreach$mVc$sp in class Range
[error] symbol definition: final override def foreach$mVc$sp(f: Int => Unit): Unit (a MethodSymbol)
[error] symbol package: scala.collection.immutable
[error] symbol owners: method foreach$mVc$sp -> class Range
[error] call site: method $anonfun$genChunkInputData in package game
[error]
[error] == Source file context for tree position ==
[error]
[error] 74 def genChunkInputData(p: Vector3i): FloatBuffer = {
[error] 75 val inputData = MemoryUtil.memAllocFloat(4 * (Chunk.SIZE+ 1) * (Chunk.SIZE+ 1) * (Chunk.SIZE+ 1))
[error] 76 val r = (0 to Chunk.SIZE)
[error] 77 for (i <- r; j <- r; k <- r) {
[error] 78 inputData.put(0.1F + 0.5F*min(i, Chunk.SIZE - i).toFloat/ Chunk.SIZE). //RED
[error] 79 put(0.1F + 0.5F*min(j, Chunk.SIZE - j).toFloat/ Chunk.SIZE). //GREEN
[error] 80 put(0.1F + 0.5F*min(k, Chunk.SIZE - k).toFloat/ Chunk.SIZE).//BLUE
[error] scala.reflect.internal.SymbolTable.throwAssertionError(SymbolTable.scala:170)
[error] scala.reflect.internal.Symbols$Symbol.suchThat(Symbols.scala:2024)
[error] scala.tools.nsc.transform.SpecializeTypes$SpecializationTransformer.matchingSymbolInPrefix$1(SpecializeTypes.scala:1573)
However, after following these steps, if you then remove that method from the Terrain class, it will again throw a similar error, but this time the source seems to have changed (to a different method, which had no issue before and worked just fine). Again, cleaning the build and recompiling makes the error disappear.
Here is the change to the error message:
[error] last tree to typer: Function(value $anonfun)
[error] tree position: line 150 of D:\Computer science\Scala\Meandering Depths\src\main\scala\game\Terrain.scala
[error] tree tpe: Int => Unit
[error] symbol: value $anonfun
[error] == Source file context for tree position ==
[error]
[error] 147
[error] 148 private def isovalue(x: Double, y: Double, z: Double): Float = {
[error] 149 var res = -0.1F
[error] 150 for (i <- 0 until 4)
[error] 151 res += amp(i) * noise(i).noise3_XYBeforeZ(freq(i) * x, freqY(i) * y,freq(i) * z).toFloat
[error] 152 res
[error] 153 }
What is weird is that sometimes, after following these steps (adding some code, compiling and getting an error, cleaning the build and recompiling without error, then removing said code and compiling again to get a new error message), the apparent source of the new error comes from a completely different class, which was behaving just fine prior to that.
What is more, and this is actually my real issue, sometimes the change in code causes the error to persist despite cleaning the build.
It also doesn't help that the error message doesn't tell me much, and confusingly enough it seems to be pointing to the wrong source at times. I must confess, however, that I don't really understand how sbt works; I'm just using it to import some libraries, but I've had no issues with it before.
EDIT: Apparently it's caused by the Scala compiler, not sbt itself. The error message seems to be similar to this one: https://github.com/scala/bug/issues/9578
I've tested the code from that link and it gives the same type of error I was getting with my code (and the source presented in the error message is again irrelevant to the error). I am using Breeze in my project, so that seems to be the source of the problem. I'll attempt to remove it from the project and see if the error still occurs.
I have managed to find the source of the problem, which as expected has nothing to do with sbt but with Breeze. The problem is exactly the one in this open issue: https://github.com/scala/bug/issues/9578
My workaround was to stop using DenseVector[Int] (right now I am using DenseVector[Float], but I will probably switch to a different linear algebra library soon).
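To illustrate the swap (a sketch only; the project's actual call sites aren't shown in the question):

import breeze.linalg.DenseVector

// Per the workaround above, DenseVector[Int] was what triggered the
// specialization assertion from scala/bug#9578 in this project, so this
// hypothetical value uses Float element types instead:
val offsets: DenseVector[Float] = DenseVector(1.0f, 2.0f, 3.0f)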

Why does maven give me this Scala error doing clean install

I can't seem to figure out why Scala packages are being "thought of" by Maven as members of package org.apache.kafka.streams.scala. I usually use sbt, but I have to use Maven for this one. Any help would be appreciated.
/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin/java -Dmaven.multiModuleProjectDirectory=/Users/jcena1/IdeaProjects/datapipe-scala-merge-transform "-Dmaven.home=/Applications/IntelliJ IDEA.app/Contents/plugins/maven/lib/maven3" "-Dclassworlds.conf=/Applications/IntelliJ IDEA.app/Contents/plugins/maven/lib/maven3/bin/m2.conf" "-javaagent:/Applications/IntelliJ IDEA.app/Contents/lib/idea_rt.jar=65172:/Applications/IntelliJ IDEA.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath "/Applications/IntelliJ IDEA.app/Contents/plugins/maven/lib/maven3/boot/plexus-classworlds-2.5.2.jar" org.codehaus.classworlds.Launcher -Didea.version=2018.2.1 install
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for io.confluent:Jointransform:jar:4.1.1
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.scala-lang:scala-library:jar -> duplicate declaration of version ${scala.version} # io.confluent:Jointransform:[unknown-version], /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/pom.xml, line 259, column 21
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO] ------------------------------------------------------------------------
[INFO] Detecting the operating system and CPU architecture
[INFO] ------------------------------------------------------------------------
[INFO] os.detected.name: osx
[INFO] os.detected.arch: x86_64
[INFO] os.detected.version: 10.13
[INFO] os.detected.version.major: 10
[INFO] os.detected.version.minor: 13
[INFO] os.detected.classifier: osx-x86_64
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Jointransform 4.1.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (validate) # Jointransform ---
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-versions) # Jointransform ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.10:add-source (add-source) # Jointransform ---
[INFO] Source directory: /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala added.
[INFO] Source directory: /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/generated-sources added.
[INFO]
[INFO] --- avro-maven-plugin:1.8.2:schema (default) # Jointransform ---
[INFO]
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) # Jointransform ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 28 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) # Jointransform ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 11 source files to /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/classes
[INFO]
[INFO] --- scala-maven-plugin:3.2.1:compile (default) # Jointransform ---
[INFO] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala:-1: info: compiling
[INFO] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/generated-sources:-1: info: compiling
[INFO] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/generated-sources/annotations:-1: info: compiling
[INFO] Compiling 16 source files to /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/target/classes at 1536749088682
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Filtertransform.scala:10: error: object language is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.language.implicitConversions
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Filtertransform.scala:14: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Filtertransform.scala:75: error: Symbol 'type org.apache.kafka.streams.Consumed' is missing from the classpath.
[ERROR] This symbol is required by 'value com.lightbend.kafka.scala.streams.StreamsBuilderS.consumed'.
[ERROR] Make sure that type Consumed is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[ERROR] A full rebuild may help if 'StreamsBuilderS.class' was compiled against an incompatible version of org.apache.kafka.streams.
[ERROR] val stream: KStreamS[String, Array[Byte]] = builder.stream(inputTopic)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:12: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.JavaConverters._
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:11: error: object language is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.language.implicitConversions
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:20: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:21: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable.ListBuffer
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:39: error: Symbol 'type org.apache.kafka.streams.Consumed' is missing from the classpath.
[ERROR] This symbol is required by 'method com.lightbend.kafka.scala.streams.ImplicitConversions.consumedFromSerde'.
[ERROR] Make sure that type Consumed is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[ERROR] A full rebuild may help if 'ImplicitConversions.class' was compiled against an incompatible version of org.apache.kafka.streams.
[ERROR] (out:+ s)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:57: error: value asScala is not a member of java.util.List[String]
[ERROR] possible cause: maybe a semicolon is missing before `value asScala'?
[ERROR] .asScala
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:147: error: not found: value Consumed
[ERROR] implicit val c = Consumed.`with`(Serdes.String(), Serdes.ByteArray())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:149: error: polymorphic expression cannot be instantiated to expected type;
[ERROR] found : [K, V]org.apache.kafka.streams.kstream.KTable[K,V]
[ERROR] required: org.apache.kafka.streams.scala.kstream.KTable[String,Array[Byte]]
[ERROR] val table: KTable[String, Array[Byte]] = builder.table(inputTopic)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:156: error: not found: value Serialized
[ERROR] implicit val sb = Serialized.`with`(Serdes.String(), Serdes.ByteArray())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:157: error: not found: value Serialized
[ERROR] implicit val ss = Serialized.`with`(Serdes.String(), Serdes.String())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:158: error: not found: value Produced
[ERROR] implicit val p = Produced.`with`(Serdes.String(),Serdes.String())
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:241: error: value asScala is not a member of java.util.List[String]
[ERROR] possible cause: maybe a semicolon is missing before `value asScala'?
[ERROR] .asScala
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:245: error: value asScala is not a member of java.util.List[String]
[ERROR] val newValue = getValFromJSONMessage(lv, aggregateColumnList.asScala.toList.head)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:232: error: diverging implicit expansion for type org.apache.kafka.streams.kstream.Serialized[KR,VR]
[ERROR] starting with method consumedFromSerde in object ImplicitConversions
[ERROR] val groupedK = table.groupBy{ (key: String, value: Array[Byte]) =>
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:252: error: not found: type Aggregator
[ERROR] val newAggregator: Aggregator[String, Array[Byte], Array[Byte]] = new Aggregator[String, Array[Byte], Array[Byte]] {
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/GroupBytransform.scala:252: error: not found: type Aggregator
[ERROR] val newAggregator: Aggregator[String, Array[Byte], Array[Byte]] = new Aggregator[String, Array[Byte], Array[Byte]] {
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:10: error: object language is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.language.implicitConversions
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:11: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.JavaConverters._
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:18: error: object collection is not a member of package org.apache.kafka.streams.scala
[ERROR] import scala.collection.mutable.ListBuffer
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:37: error: not found: type ListBuffer
[ERROR] val lst = new ListBuffer[String]
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:84: error: value asScala is not a member of java.util.List[String]
[ERROR] Note: implicit value j is not applicable here because it comes after the application point and it lacks an explicit result type
[ERROR] val leftRekeyColumnList = if(rekeyLeftNeeded) joinStep.getStringList("joinOn.leftFields").asScala.toList else List.empty
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:85: error: value asScala is not a member of java.util.List[String]
[ERROR] Note: implicit value j is not applicable here because it comes after the application point and it lacks an explicit result type
[ERROR] val rightRekeyColumnList = if(rekeyRightNeeded) joinStep.getStringList("joinOn.rightFields").asScala.toList else List.empty
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:96: error: value asScala is not a member of java.util.List[String]
[ERROR] Note: implicit value j is not applicable here because it comes after the application point and it lacks an explicit result type
[ERROR] if(!JoinUtils.validateFilterExpression(filterExpression.asScala.toList)) {
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:147: error: value asScala is not a member of java.util.List[String]
[ERROR] JoinUtils.joinValues(lv, rv, node, outputFields.asScala.toList)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:157: error: value asScala is not a member of java.util.List[String]
[ERROR] val filterPredicate = JoinUtils.buildPostJoinPredicate(filterExpression.asScala.toList, lv)
[ERROR] ^
[ERROR] /Users/jcena1/IdeaProjects/datapipe-scala-merge-transform/src/main/scala/acme/Jointransform.scala:165: error: value asScala is not a member of java.util.List[String]
[ERROR] val ba: Array[Byte] = JoinUtils.removeEntryFromJsonNode(extraFields.asScala.toList, value)
[ERROR] ^
[ERROR] 29 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.937 s
[INFO] Finished at: 2018-09-12T06:44:53-04:00
[INFO] Final Memory: 35M/376M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile (default) on project Jointransform: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Process finished with exit code 1
I needed to prefix _root_ to all the Scala core package imports, for example:
import _root_.scala.collection.JavaConverters._
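The likely cause (an assumption, since the failing files' full import lists aren't shown) is that a wildcard import such as import org.apache.kafka.streams._ brings the member package scala into scope, shadowing the top-level scala package, so a later import scala.collection.mutable resolves against org.apache.kafka.streams.scala. The _root_ prefix forces resolution from the root package. A minimal sketch, assuming the kafka-streams dependencies from the question are on the classpath:

package acme

import org.apache.kafka.streams._       // also brings the member package `scala` into scope
// import scala.collection.mutable      // in this project, would resolve to org.apache.kafka.streams.scala.collection
import _root_.scala.collection.mutable  // _root_ forces resolution from the root package

// hypothetical placeholder object, only to make the sketch self-contained
object ImportSketch {
  def main(args: Array[String]): Unit = {
    val buf = mutable.ListBuffer("ok")
    println(buf.mkString(","))
  }
}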

Macro untypecheck required

I'm running into problems in my open-source project using macros to generate some code. Everything works fine if I use c.untypecheck, but ideally I'd prefer not to have to do that.
This is the relevant code: https://github.com/outr/reactify/blob/master/shared/src/main/scala/com/outr/reactify/Macros.scala#L46
If I remove the c.untypecheck I get the following compile-time error:
[error] (reactifyJVM/test:compileIncremental) java.lang.AssertionError: assertion failed:
[error] transformCaseApply: name = previousVal tree = previousVal / class scala.reflect.internal.Trees$Ident
[error] while compiling: /home/mhicks/projects/open-source/reactify/shared/src/test/scala/specs/BasicSpec.scala
[error] during phase: refchecks
[error] library version: version 2.12.1
[error] compiler version: version 2.12.1
[error] reconstructed args: -classpath /home/mhicks/projects/open-source/reactify/jvm/target/scala-2.12/test-classes:/home/mhicks/projects/open-source/reactify/jvm/target/scala-2.12/classes:/home/mhicks/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.12.1.jar:/home/mhicks/.ivy2/cache/org.scalatest/scalatest_2.12/bundles/scalatest_2.12-3.0.1.jar:/home/mhicks/.ivy2/cache/org.scalactic/scalactic_2.12/bundles/scalactic_2.12-3.0.1.jar:/home/mhicks/.ivy2/cache/org.scala-lang.modules/scala-xml_2.12/bundles/scala-xml_2.12-1.0.5.jar:/home/mhicks/.ivy2/cache/org.scala-lang.modules/scala-parser-combinators_2.12/bundles/scala-parser-combinators_2.12-1.0.4.jar -bootclasspath /usr/java/jdk1.8.0_92/jre/lib/resources.jar:/usr/java/jdk1.8.0_92/jre/lib/rt.jar:/usr/java/jdk1.8.0_92/jre/lib/sunrsasign.jar:/usr/java/jdk1.8.0_92/jre/lib/jsse.jar:/usr/java/jdk1.8.0_92/jre/lib/jce.jar:/usr/java/jdk1.8.0_92/jre/lib/charsets.jar:/usr/java/jdk1.8.0_92/jre/lib/jfr.jar:/usr/java/jdk1.8.0_92/jre/classes:/home/mhicks/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.1.jar
[error]
[error] last tree to typer: TypeTree(class Position)
[error] tree position: line 148 of /home/mhicks/projects/open-source/reactify/shared/src/test/scala/specs/BasicSpec.scala
[error] tree tpe: org.scalactic.source.Position
[error] symbol: case class Position in package source
[error] symbol definition: case class Position extends Product with Serializable (a ClassSymbol)
[error] symbol package: org.scalactic.source
[error] symbol owners: class Position
[error] call site: <$anon: com.outr.reactify.ChangeListener[Int]> in package specs
[error]
[error] == Source file context for tree position ==
[error]
[error] 145 current should be(15)
[error] 146 }
[error] 147 "observe a complex change" in {
[error] 148 val v1 = Var(5)
[error] 149 val v2 = Var(10)
[error] 150 val v3 = Var(v1 + v2)
[error] 151 var changed = 0
[error] Total time: 1 s, completed Jan 31, 2017 4:43:03 PM
If I add it back, everything compiles and works just fine. In more complex use-cases I've been encountering some compile-time issues like Could not find proxy for ..., and I think this might be the reason.
Any suggestions would be greatly appreciated.
You're introducing an untyped tree into a typed tree.
The incoming tree is typechecked, and then the outgoing tree (that your macro emits) is typechecked again, but the typer does not descend into a tree that is already typechecked (i.e., that has a type already assigned to it).
Because you're introducing new symbols, you can't just use the incoming context to typecheck your reference.
So, the simplest solution is what you arrived at, to untypecheck the outgoing tree. It's also sufficient to untypecheck the transformed tree, to allow typer to descend to your new, untyped tree.
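As a rough illustration of that last point, here is a sketch of untypechecking only the transformed subtree before splicing it into the result. The helper name splice is hypothetical; channel, observables and transformed correspond to the values built in the macro shown further below:

// hypothetical helper: untypecheck just the transformed subtree, then splice it
def splice(c: scala.reflect.macros.blackbox.Context)(
    channel: c.Tree, observables: List[c.Tree], transformed: c.Tree): c.Tree = {
  import c.universe._
  val spliced = c.untypecheck(transformed) // strip assigned types/symbols so typer descends into it again
  q"$channel.update(List(..$observables), $spliced)"
}
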
I had to reduce the exploding test by commenting out code. It's unfortunate that it's not immediately obvious what source line causes the error. Maybe it's more obvious if you're familiar with the macro involved.
class Sample {
  def sample(): Unit = {
    val v = Var(5)
    v := v + 5
  }
}
The tree in question, from -Xprint:typer -Yshow-trees:
Apply( // def +(x: Int): Int in class Int, tree.tpe=Int
  com.outr.reactify.`package`.state2Value[Int](previousVal)."$plus" // def +(x: Int): Int in class Int, tree.tpe=(x: Int)Int
  5
)
Also worth mentioning: it was easier to write a quick compile script with the "reconstructed args" from the error message, to eliminate sbt incremental compilation, ScalaTest macros, and other mysteries.
Edit: the API for setting the symbol and type by hand:
def setStateChannel(value: c.Tree): c.Tree = {
  val observables = retrieveObservables(c)(value)
  val channel = c.prefix.tree
  val selfReference = observables.exists(_.equalsStructure(channel))
  val untyped =
    q"""
      val previousValue = com.outr.reactify.State.internalFunction($channel)
      val previousVal = com.outr.reactify.Val(previousValue())
    """
  val retyped = c.typecheck(untyped)
  val transformed = if (selfReference) {
    val transformer = new Transformer {
      override def transform(tree: c.universe.Tree): c.universe.Tree = if (tree.equalsStructure(channel)) {
        val t = q"previousVal"
        val Block(_ :: v :: Nil, _) = retyped
        c.internal.setSymbol(t, v.symbol)
        c.internal.setType(t, v.tpe)
      } else {
        super.transform(tree)
      }
    }
    transformer.transform(value)
  } else {
    value
  }
  val res = q"$channel.update(List(..$observables), $transformed)"
  q"$retyped ; $res"
}

How can I use Scala macros to create an object?

I am trying to create a Scala macro that will generate an object - something like
object SomeEnum {
  sealed abstract class Enum(name: String)
  case object Option1 extends Enum("option1")
  case object Option2 extends Enum("option2")
  private val elements: Seq[Enum] = Seq(Option1, Option2)
  def apply(code: String): Enum = {
    ...
  }
}
I thought I might be able to create a macro createEnum, so I could just put createEnum("SomeEnum", "Option1", "Option2") into my code and have it generate the object. Seems like it's calling out for a macro.
But I must not be understanding macros. I am using Scala 2.11.6, and just to try to get something working, I created the following:
object createEnumObj {
  def createEnumImpl(c: scala.reflect.macros.whitebox.Context)(ename: c.Expr[String]): c.universe.ModuleDef = {
    import c.universe._
    val Literal(Constant(s_ename: String)) = ename.tree
    val oname = TermName(s_ename)
    val barLine = q"val bar: Int = 5"
    q"object $oname { $barLine }"
  }

  def createEnum(ename: String): Unit = macro createEnumImpl
}
This is in a separate project, and everything seems to compile OK there.
If I stick a call to createEnumObj.createEnum into some source and try to compile that, I get a billion lines (give or take a few) of exception output, which seems to repeat something like this:
[error] (main/compile:compile) java.lang.AssertionError: assertion failed:
[error] object foo extends scala.AnyRef {
[error] def <init>() = {
[error] super.<init>();
[error] ()
[error] };
[error] val bar: Int = 5
[error] }
[error] while compiling: /Users/bob/ICL/ironcore-id/src/main/scala/package.scala
[error] during phase: typer
[error] library version: version 2.11.6
[error] compiler version: version 2.11.6
[error] reconstructed args: -Xfuture ...
[error]
[error] last tree to typer: term foo
[error] tree position: line 8 of /Users/bob/ICL/ironcore-id/src/main/scala/package.scala
[error] symbol: <none>
[error] symbol definition: <none> (a NoSymbol)
[error] symbol package: <none>
[error] symbol owners:
[error] call site: <none> in <none>
[error]
[error] == Source file context for tree position ==
[error]
[error] 5 type DateTime = Int
[error] 6
[error] 7 createEnumObj.createEnum("foo")
[error] 8
[error] 9 }
[error] Total time: 2 s, completed Jun 18, 2015 2:46:05 PM
What I am trying to do doesn't seem too dissimilar to this question, but I'm obviously missing something. Any ideas about how to accomplish this would be gratefully accepted.
Thanks,
Bob

I've got error for generating pickler/unpickler for a type with type-parameter

I'm trying to use scala-pickling in my project, but I've run into a problem with it. Let's assume I've got this code:
import scala.pickling._
import scala.pickling.Defaults._
import scala.pickling.json._

sealed trait State
case class Married(name: String) extends State
case object Single extends State

trait Person[T <: State] {
  def name: String
  def surname: String
  def age: Int
  def state: T
}

case class Male[T <: State](
  val name: String,
  val surname: String,
  val age: Int,
  val state: T) extends Person[T]

case class Female[T <: State](
  val name: String,
  val surname: String,
  val age: Int,
  val state: T) extends Person[T]

def hideType[T <: State]: Person[T] = Male("Hussein", "?", 145, Single).asInstanceOf[Person[T]]
When I try hideType.pickle, I get a compile-time error:
error: Cannot generate a pickler for Person[T]. Recompile with -Xlog-implicits for details
What's the problem with generating a pickler/unpickler in this case?
More info:
scala 2.11.6
scala-pickling 0.10.0
EDIT 1:
Result of compiling with "-Xlog-implicits":
[info] Loading global plugins from /home/someone/i/etc/sbt/0.13/plugins
[info] Loading project definition from /home/someone/tmp/pickling/project
[info] Set current project to pickling (in build file:/home/someone/tmp/pickling/)
[info] Compiling 1 Scala source to /home/someone/tmp/pickling/target/scala-2.11/classes...
[info] /home/someone/tmp/pickling/src/main/scala/com/example/Hello.scala:35: genPickler is not a valid implicit value for scala.pickling.Pickler[com.example.Person[T]] because:
[info] hasMatchingSymbol reported error: stepping aside: repeating itself
[info] hideType.pickle
[info] ^
[info] /home/someone/tmp/pickling/src/main/scala/com/example/Hello.scala:35: genPickler is not a valid implicit value for scala.pickling.Pickler[com.example.Person[T]] because:
[info] hasMatchingSymbol reported error: polymorphic expression cannot be instantiated to expected type;
[info] found : [T]scala.pickling.Pickler[T]
[info] required: scala.pickling.Pickler[com.example.Person[?]]
[info] hideType.pickle
[info] ^
[error] /home/someone/tmp/pickling/src/main/scala/com/example/Hello.scala:35: Cannot generate a pickler for com.example.Person[T]. Recompile with -Xlog-implicits for details
[error] hideType.pickle
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed May 26, 2015 4:31:45 AM
I am not sure what the intention is, but you are pickling at a generic type parameter rather than a concrete type.
Try
hideType[State].pickle
or
hideType[Married].pickle
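A minimal sketch of the second suggestion (assuming the definitions from the question are in scope and scala-pickling 0.10.x is on the classpath):

import scala.pickling._
import scala.pickling.Defaults._
import scala.pickling.json._

// With a concrete type argument, an implicit pickler for Person[Married]
// can be derived instead of the unresolved Person[T] from the error.
val pickled = hideType[Married].pickle
println(pickled.value)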