I have an old Play 2.6 Scala project that works in production. I'm trying to run it locally again, but now every TableQuery triggers the following Slick error: macro implementation not found: apply
For instance, this line:
val usersCars: TableQuery[UsersCars] = TableQuery[UsersCars]
throws the error.
I tried moving the definition to a different file, but without success.
The Scala version is 2.12.5
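For context, here is a minimal sketch of the kind of table mapping involved; the profile import, the table and column names, and the UserCar case class are illustrative assumptions, not from the original project:

import slick.jdbc.PostgresProfile.api._

case class UserCar(userId: Long, carId: Long) // hypothetical row type

class UsersCars(tag: Tag) extends Table[UserCar](tag, "users_cars") {
  def userId = column[Long]("user_id")
  def carId = column[Long]("car_id")
  def * = (userId, carId) <> (UserCar.tupled, UserCar.unapply)
}

// The failing line: the TableQuery.apply macro is what the error points at.
val usersCars: TableQuery[UsersCars] = TableQuery[UsersCars]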
Related
I am using Spark 3.0.1 in my Kotlin project. Compilation fails with the following error:
e: org.jetbrains.kotlin.util.KotlinFrontEndException: Exception while analyzing expression at (51,45) in /home/user/project/src/main/kotlin/ModelBuilder.kt
...
Caused by: java.lang.IllegalStateException: No parameter with index 0-0 (name=reverser$module$1 access=16) in method scala.collection.TraversableOnce.reverser$2
at org.jetbrains.kotlin.load.java.structure.impl.classFiles.AnnotationsAndParameterCollectorMethodVisitor.visitParameter(Annotations.kt:48)
at org.jetbrains.org.objectweb.asm.ClassReader.readMethod(ClassReader.java:1149)
at org.jetbrains.org.objectweb.asm.ClassReader.accept(ClassReader.java:680)
at org.jetbrains.org.objectweb.asm.ClassReader.accept(ClassReader.java:392)
at org.jetbrains.kotlin.load.java.structure.impl.classFiles.BinaryJavaClass.<init>(BinaryJavaClass.kt:77)
at org.jetbrains.kotlin.load.java.structure.impl.classFiles.BinaryJavaClass.<init>(BinaryJavaClass.kt:40)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinCliJavaFileManagerImpl.findClass(KotlinCliJavaFileManagerImpl.kt:115)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinCliJavaFileManagerImpl.findClass(KotlinCliJavaFileManagerImpl.kt:85)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinCliJavaFileManagerImpl$findClass$$inlined$getOrPut$lambda$1.invoke(KotlinCliJavaFileManagerImpl.kt:113)
at org.jetbrains.kotlin.cli.jvm.compiler.KotlinCliJavaFileManagerImpl$findClass$$inlined$getOrPut$lambda$1.invoke(KotlinCliJavaFileManagerImpl.kt:48)
at org.jetbrains.kotlin.load.java.structure.impl.classFiles.ClassifierResolutionContext.resolveClass(ClassifierResolutionContext.kt:60)
at org.jetbrains.kotlin.load.java.structure.impl.classFiles.ClassifierResolutionContext.resolveByInternalName$frontend_java(ClassifierResolutionContext.kt:101)
at org.jetbrains.kotlin.load.java.structure.impl.classFiles.BinaryClassSignatureParser$parseParameterizedClassRefSignature$1.invoke(BinaryClassSignatureParser.kt:141)
I've cleaned and rebuilt the project several times, removed the build directory, and tried building from the command line with Gradle.
The code where this happens:
import org.apache.spark.sql.types.DataTypes
import org.apache.spark.sql.types.Metadata
import org.apache.spark.sql.types.StructField
import org.apache.spark.sql.types.StructType

val data = listOf(...)
val schema = StructType(arrayOf(
    StructField("label", DataTypes.DoubleType, false, Metadata.empty()),
    StructField("sentence", DataTypes.StringType, false, Metadata.empty())
))
val dataframe = spark.createDataFrame(data, schema) // <- offending line.
I was using Kotlin 1.4.0 and upgraded to 1.4.10, but the error is unchanged.
It looks like this bug (and this one) has already been reported to JetBrains, but is it really not possible to use Spark 3 (local mode) with Kotlin 1.4?
I managed to get it working with Spring Boot (2.3.5) by adding the following to the dependencyManagement block:
dependencies {
    dependencySet("org.scala-lang:2.12.10") {
        entry("scala-library")
    }
}
This downgrades the scala-library jar from 2.12.12 to 2.12.10, which matches the version of the scala-reflect jar in my project. I'm also using Kotlin 1.4.10.
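To confirm the pin actually took effect, you can inspect the resolved version with Gradle's dependencyInsight task, or use a one-line runtime check like the following sketch (in Scala; it assumes the 2.12.10 pin above):

// Prints the scala-library version actually on the classpath; expect "2.12.10".
println(scala.util.Properties.versionNumberString)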
Are you trying to use this API?
https://spark.apache.org/docs/2.3.0/api/java/org/apache/spark/sql/SparkSession.html#createDataFrame-java.util.List-java.lang.Class-
There is no method that takes a java.util.List together with a schema object, AFAIK.
I am using Flink 1.8.0 and trying to compile this line of code
val mockState = mock[KeyedStateStore]
in IntelliJ, which yields the following error (although it seems to compile fine in sbt):
Error:(38, 23) class FoldingStateDescriptor in package state is
deprecated: see corresponding Javadoc for more information.
val mockState = mock[KeyedStateStore]
I am not even calling getFoldingState, which is the deprecated method. Any ideas how to get around this?
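One workaround, sketched below on the assumption that a macro-based mock[...] (e.g. ScalaMock or a similar compile-time mock) is generating overrides for the deprecated member: create the mock with plain Mockito instead, which builds the proxy at runtime and compiles no Scala code against the deprecated methods.

import org.apache.flink.api.common.state.KeyedStateStore
import org.mockito.Mockito

// A runtime proxy; nothing referencing FoldingStateDescriptor is compiled here.
val mockState: KeyedStateStore = Mockito.mock(classOf[KeyedStateStore])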
I just started working with MLlib for Spark and tried to run the provided examples, more specifically https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/ml/DCTExample.scala
However, compilation in the IntelliJ IDE fails with the message
Error:(41, 35) No TypeTag available for (org.apache.spark.ml.linalg.Vector,)
val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")
The project setup uses jdk1.8.0_121, spark2.11-2.1.0 and scala 2.10.6.
Any ideas on why the example fails to run? I followed this tutorial during installation: https://www.supergloo.com/fieldnotes/intellij-scala-spark/
You can't use Spark built for Scala 2.11 (that's what the _2.11 in the name means) with Scala 2.10, though this specific error looks quite strange. Switch to Scala 2.11.8.
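For reference, a minimal sbt sketch with the versions aligned (standard Spark coordinates; the %% operator appends the matching _2.11 suffix automatically):

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"   % "2.1.0",
  "org.apache.spark" %% "spark-mllib" % "2.1.0"
)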
I am new to Scala and am trying to read a file using the following code:
scala> val textFile = sc.textFile("README.md")
scala> textFile.count()
But I keep getting the following error
error: not found: value sc
I have tried everything, but nothing seems to work. I am using Scala 2.10.4 and Spark 1.1.0 (I have even tried Spark 1.2.0, but it doesn't work either). I have sbt installed and the project compiled, yet I am not able to run sbt/sbt assembly. Is the error because of this?
You should run this code using ./spark-shell. It is a Scala REPL with a provided SparkContext, bound to the value sc. You can find it in your Apache Spark distribution, in the spark-1.4.1/bin folder.
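If you want to run the same code outside the shell, nothing defines sc for you, so you create the context yourself. A minimal standalone sketch, assuming Spark 1.x and a local master (the object and app names are illustrative):

import org.apache.spark.{SparkConf, SparkContext}

object ReadmeCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ReadmeCount").setMaster("local[*]")
    val sc = new SparkContext(conf) // the value spark-shell provides as sc
    val textFile = sc.textFile("README.md")
    println(textFile.count())
    sc.stop()
  }
}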
I've got some code that references scala.collection.jcl written against Scala 2.7.7. I'm now trying to compile it against Scala 2.8 for the first time, and I'm getting this error:
"value jcl is not a member of package collection".
Is there a substitute/replacement for jcl in 2.8?
It looks like JavaConversions does the job somewhat:
import scala.collection.JavaConversions._
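A short usage sketch, assuming jcl was being used to wrap Java collections (the implicit conversions let a Java collection be used with Scala's collection methods; in later Scala versions the explicit scala.collection.JavaConverters / .asScala style is preferred):

import java.util.{ArrayList => JArrayList}
import scala.collection.JavaConversions._

val javaList = new JArrayList[String]()
javaList.add("a")
javaList.add("b")

// Implicitly wrapped as a Scala Buffer, so Scala collection methods work:
val doubled = javaList.map(_ * 2) // Buffer("aa", "bb")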