Why does json4s need a Scala compiler as a runtime dependency

I've discovered that using json4s native
<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-native_2.10</artifactId>
<version>3.2.9</version>
</dependency>
brings in scalap and scala-compiler as transitive dependencies.
Why does it need them?
Does it generate code on the fly at runtime?
Why doesn't it use macros that do this processing at compile time?

The json4s maintainers answered me in this issue as follows:
Because we need to read the byte code to find out information about scala primitives. This is more necessary on 2.9 than it is on 2.10
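For context, that runtime reflection is what powers json4s's case-class extraction. Here is a minimal sketch (json4s 3.2.9; the Person class is just an illustration) of the kind of call that needs it:
import org.json4s._
import org.json4s.native.JsonMethods._

case class Person(name: String, age: Int)

object ExtractExample extends App {
  implicit val formats: Formats = DefaultFormats
  // extract inspects Person's constructor at runtime; json4s reads the
  // compiled Scala signatures (via scalap) to learn parameter names and types
  val person = parse("""{"name":"Ann","age":30}""").extract[Person]
  println(person) // Person(Ann,30)
}
Macro-based JSON libraries avoid these runtime dependencies by generating the equivalent code at compile time, which is the trade-off the question is asking about.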

Flink Scala Missing Import

In my Flink project I cannot find certain libraries for connectors (specifically I need to ingest a CSV once and read several TBs of parquet data in either batch or streaming mode). I think I have all the required packages, but I am still getting:
[ERROR] import org.apache.flink.connector.file.src.FileSource
[ERROR] ^
[ERROR] C:\Users\alias\project\...\MyFlinkJob.scala:46: error: not found: type FileSource
My POM.xml is rather large, but I think I have the relevant dependencies:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-parquet</artifactId>
<version>1.15.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-filesystem_${scala.binary.version}</artifactId>
<version>1.11.6</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-hadoop-bulk_2.12</artifactId>
<version>1.14.6</version>
</dependency>
I am using the following versions:
<scala.version>2.12.16</scala.version>
<scala.binary.version>2.12</scala.binary.version>
<log4j.version>2.17.1</log4j.version>
<flink.version>1.15.1</flink.version>
Do I need a different import path for Scala than Java?
I wish the Flink documentation included the imports in its example code snippets, as I spent a long time trying to figure them out. What are the recommended ._ (wildcard) imports?
I've looked through the symbols in the package but didn't find FileSystem. I looked for different tutorials and example projects showing how to read/listen to Parquet and CSV files with Flink. I made some progress this way, but in the few examples I found in Scala (not Java) that use Parquet files as a source, the imports still didn't work even after adding their dependencies and running mvn clean install.
I tried using GitHub's advanced search to find a public Scala project that uses FileSource, and eventually found one with the following dependency:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-files</artifactId>
<version>${project.version}</version>
</dependency>
This package was missing from index.scala-lang.org, where I thought I should be looking for dependencies (this is my first Scala project, so I assumed that was the place to find packages, like PyPI in Python). It seems that MVN Repository may be a better place to look.
Flink 1.15 has a Scala-free classpath, which has resulted in a number of Flink artifacts no longer having a Scala suffix. You can read all about it in the dedicated Flink blog on this topic: https://flink.apache.org/2022/02/22/scala-free.html
You can also see in that blog how you can use any Scala version with Flink instead of being limited to Scala 2.12.6.
TL;DR: you should use the Java APIs in your application. The Scala APIs will also be deprecated as of Flink 1.17.
Last but not least: don't mix & match Flink versions. That won't work.
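In practice that means depending on the unsuffixed connector artifact with your own Flink version pinned (the ${project.version} placeholder above only resolves inside Flink's own build):
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-files</artifactId>
<version>1.15.2</version>
</dependency>
and then calling the Java API directly from Scala. A minimal sketch that reads the CSV as text lines, assuming Flink 1.15.2 and a placeholder path:
import org.apache.flink.api.common.eventtime.WatermarkStrategy
import org.apache.flink.connector.file.src.FileSource
import org.apache.flink.connector.file.src.reader.TextLineInputFormat
import org.apache.flink.core.fs.Path
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

object ReadCsvJob extends App {
  val env = StreamExecutionEnvironment.getExecutionEnvironment
  // FileSource lives in the Scala-free flink-connector-files artifact
  val source = FileSource
    .forRecordStreamFormat(new TextLineInputFormat(), new Path("file:///data/input.csv"))
    .build()
  env
    .fromSource(source, WatermarkStrategy.noWatermarks[String](), "csv-source")
    .print()
  env.execute("read-csv")
}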

Scala signature Http has wrong version expected: 5.0 found: 5.2 in Http.class

I am using akka-http in my Scala code in IntelliJ. But when I try to build my project I get the following error:
scalac: error while loading Http, class file 'C:\Users\XXXXXX\.m2\repository\com\typesafe\akka\akka-http-core_2.13\10.2.9\akka-http-core_2.13-10.2.9.jar(akka/http/scaladsl/Http.class)' is broken
(class java.lang.RuntimeException/error reading Scala signature of Http.class: Scala signature Http has wrong version
expected: 5.0
found: 5.2 in Http.class)
My POM.xml looks like following:
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-actor_3</artifactId>
<version>2.6.19</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.typesafe.akka/akka-http -->
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-http_2.13</artifactId>
<version>10.2.9</version>
</dependency>
What can I do to resolve this error? Thanks in advance.
I was able to resolve this issue with the latest version of akka-http.
Scala dependencies are built against a specific Scala version, and the Scala versions of your different dependencies must match. The version is encoded in the artifact ID after the underscore ("_"). You were trying to use akka-actor_3, built for Scala 3, together with akka-http_2.13, built for Scala 2.13, so they were incompatible.
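Concretely, aligning both artifacts on one Scala version fixes the mismatch; a sketch using Scala 2.13 with the versions from the question:
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-actor_2.13</artifactId>
<version>2.6.19</version>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-http_2.13</artifactId>
<version>10.2.9</version>
</dependency>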

using AWS java SDK in Scala

I'm modifying some parts of Spark core, which is written in Scala. For that, I want to call the AWS Java API. As far as I know, it is possible to import Java libraries in Scala code; there are already Java library imports and calls in the Scala code, like this:
import java.util.concurrent.{ScheduledFuture, TimeUnit}
Here they are importing built-in Java libraries. But I want to import the AWS Java SDK. The official documentation says that to use the SDK we should add the dependency to the project's pom.xml file so that the project can be built with mvn:
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.11.106</version>
</dependency>
</dependencies>
I'm wondering whether this is enough. Can I now import AWS Java classes in the Spark Scala source code?
Can I now import AWS Java classes in spark Scala source code?
Yes
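Once the dependency is on the classpath, the Java classes import like any other. A minimal sketch that lists S3 buckets, assuming credentials and region are configured in your environment:
import com.amazonaws.services.s3.AmazonS3ClientBuilder
import scala.collection.JavaConverters._

object S3Example extends App {
  // the Java builder API is callable from Scala as-is
  val s3 = AmazonS3ClientBuilder.defaultClient()
  // listBuckets() returns a java.util.List; asScala bridges it into Scala collections
  s3.listBuckets().asScala.foreach(bucket => println(bucket.getName))
}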

Scala and persistence framework version incompatible

I'm trying to use the Slick and Squeryl frameworks for data persistence with Scala. I don't want to use the Play framework, just the persistence framework, but when I import the Slick (or Squeryl) jar file I encounter the issue below:
slick_2.10.1-2.0.0-M1.jar of <project_name> build path is cross-compiled with an incompatible version of Scala (2.10.1). In case this report is mistaken, this check can be disabled in the compiler preference page.
I'm using the Scala jar (2.11.6) under the Scala plugin in Eclipse, and I can run a simple Scala application. I can also access a MySQL DBMS with JDBC. This problem appears when I import the Slick (or Squeryl) jar files. Is it because the framework does not support Scala 2.11? Is downgrading the Scala version the solution? If so, can anyone point me in the right direction on how to downgrade the Scala version under the Eclipse Scala plugin? Thank you very much.
If you are using scala 2.11 you need to use this dependency for slick:
<dependency>
<groupId>com.typesafe.slick</groupId>
<artifactId>slick_2.11</artifactId>
<version>3.0.0</version>
</dependency>
The previous answer should resolve your issue with slick. If you'd like to use Squeryl, the dependency should be
<dependency>
<groupId>org.squeryl</groupId>
<artifactId>squeryl_2.11</artifactId>
<version>0.9.6-RC3</version>
</dependency>
Or, if you want to use 0.9.5
<dependency>
<groupId>org.squeryl</groupId>
<artifactId>squeryl_2.11</artifactId>
<version>0.9.5-7</version>
</dependency>
Libraries in Scala are only binary compatible with the minor version of Scala they were compiled against. You'll see that in these examples the correct Scala version is appended to the artifact ID after an underscore.
If you have the ability to use SBT instead of Maven, I would recommend it. SBT can append the proper Scala suffix for you when you declare a dependency with the %% operator:
libraryDependencies += "org.squeryl" %% "squeryl" % "0.9.6-RC3"
This is equivalent to writing the suffix by hand, as in "org.squeryl" % "squeryl_2.11" % "0.9.6-RC3".

Scala error: class file is broken, bad constant pool index

I'm trying to call the Selenium Java libraries from Scala. I'm using Scala IDE (Eclipse), and Scala 2.10.2. What is causing this compiler error?
error while loading Function, class file '/Dev/selenium-2.35.0/libs/guava-14.0.jar(com/google/common/base/Function.class)' is broken
(class java.lang.RuntimeException/bad constant pool index: 0 at pos: 479)
Sometimes I can fix broken class file errors by including more jars: jars that javac would not need to see, but apparently scalac does. But in this case I don't know what other jars I could add.
Found the answer. It's caused by this: https://code.google.com/p/guava-libraries/issues/detail?id=1095. The error disappeared when I added the jsr305 jar.
RobN's answer is correct, but I thought I'd write a somewhat longer answer with my own experience. This is related to this question and to the discussions on Guava issues 776 and 1095 mentioned by RobN.
I had this same problem trying to access
com.google.common.io.BaseEncoding.base64()
Eclipse claims the base64 member does not exist, and the Gradle build produces the error in the question:
[ant:scalac] error: error while loading BaseEncoding, class file
'.../guava-16.0.jar(com/google/common/io/BaseEncoding.class)' is broken
The error is caused by an optional dependency on some annotations in Guava's pom.xml. As explained in this answer, the Java compiler ignores annotations whose corresponding class files are not found, but the Scala compiler requires those definitions in order to compile.
Explicitly adding the dependency that Guava marks as optional solves the problem. In this particular case, adding one of the declarations below (Gradle or Maven) to your project will fix the error:
Gradle:
compile 'com.google.code.findbugs:jsr305:2.0.2'
Maven:
<dependency>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
<version>2.0.2</version>
</dependency>
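With jsr305 on the classpath, scalac can resolve the annotation classes and the original call compiles. A minimal sketch:
import com.google.common.io.BaseEncoding

object Base64Example extends App {
  // compiles once the optional jsr305 annotations are visible to scalac
  val encoded = BaseEncoding.base64().encode("hello".getBytes("UTF-8"))
  println(encoded) // prints aGVsbG8=
}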