I have a .proto file which imports google/protobuf/wrappers.proto.
When I run scalapbc to generate the relevant Scala code out of it, it fails with an "Import google/protobuf/wrappers.proto not found" error.
As a workaround I have, for now, kept the wrappers.proto file on the file system inside --proto_path.
But I need to come up with a proper fix in which I add the relevant dependencies to build.sbt / pom.xml so that the jar containing the default proto files (such as wrappers.proto) is unpacked before scalapbc is called.
All the required dependencies are provided by scalapb-runtime:
import sbtprotoc.ProtocPlugin.ProtobufConfig
import scalapb.compiler.Version.scalapbVersion
libraryDependencies ++= Seq(
"com.thesamet.scalapb" %% "scalapb-runtime" % scalapbVersion,
"com.thesamet.scalapb" %% "scalapb-runtime" % scalapbVersion % ProtobufConfig
)
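For completeness, here is a minimal sketch of the rest of the usual sbt-protoc / ScalaPB wiring that goes with those dependencies (the plugin and compiler-plugin versions below are assumptions; check the ScalaPB docs for current ones). With scalapb-runtime added in the protobuf configuration, sbt-protoc extracts the bundled well-known .proto files (wrappers.proto included) and puts them on protoc's include path, which is exactly the "unpack the jar" step you are after.
// project/plugins.sbt (versions are placeholders)
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.6")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.11.13"

// build.sbt: generate ScalaPB case classes for every .proto under src/main/protobuf
Compile / PB.targets := Seq(
  scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
)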
I use the AkkaGrpcPlugin for sbt, which seems to handle all the dependencies.
In plugins.sbt I have
addSbtPlugin("com.lightbend.akka.grpc" % "sbt-akka-grpc" % "1.1.1")
In build.sbt I have
enablePlugins(AkkaGrpcPlugin)
It automatically picks up the files in src/main/protobuf for the project and generates the appropriate stub files. I can import standard files, e.g.
import "google/protobuf/timestamp.proto";
For multi-project builds I use something like this:
lazy val allProjects = (project in file("."))
  .aggregate(util, grpc)

lazy val grpc =
  project
    .in(file("grpc"))
    .settings(
      ???
    )
    .enablePlugins(AkkaGrpcPlugin)

lazy val util =
  project
    .in(file("util"))
    .settings(
      ???
    )
    .dependsOn(grpc)
Thanks everyone for your answers. I really appreciate it.
I was able to solve the dependency issue by unpacking the bundled protos with the maven-dependency-plugin:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <executions>
        <execution>
          <id>unpack</id>
          <phase>prepare-package</phase>
          <goals>
            <goal>unpack</goal>
          </goals>
          <configuration>
            <artifactItems>
              <artifactItem>
                <groupId>com.google.protobuf</groupId>
                <artifactId>protobuf-java</artifactId>
                <version>3.10.0</version>
                <type>jar</type>
                <includes>path/to/Files.whatsoever</includes>
                <outputDirectory>${project.build.directory}/foldername</outputDirectory>
              </artifactItem>
            </artifactItems>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
This unpacks the required proto files into the target folder.
I'm getting this error while using both pureconfig and circe:
shapeless.DefaultSymbolicLabelling
shapeless.DefaultSymbolicLabelling$.instance(shapeless.HList)
I'm using Spark 3.1.2 with the Spark Kubernetes operator.
This error is caused by conflicting shapeless library versions: Spark 3.1.2 ships with shapeless 2.3.3, whereas both of these packages need shapeless 2.3.7. To solve it I followed the steps mentioned here, which involve shading, i.e. renaming the dependency.
For SBT
If you are using the sbt-assembly plugin to create your JARs, you can shade shapeless by adding the following setting to your assembly.sbt file:
assembly / assemblyShadeRules := Seq(ShadeRule.rename("shapeless.**" -> "new_shapeless.@1").inAll)
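For context, here is a minimal sketch of how that rule sits in a full sbt-assembly setup (the plugin version below is an assumption; adjust it to your build):
// project/assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")

// build.sbt: relocate every shapeless class pulled in by pureconfig/circe so it
// cannot clash with the shapeless 2.3.3 that ships on Spark's classpath
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("shapeless.**" -> "new_shapeless.@1").inAll
)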
Maven
The maven-shade-plugin can shade shapeless if you add the following block to your pom.xml file:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <createDependencyReducedPom>false</createDependencyReducedPom>
    <relocations>
      <relocation>
        <pattern>shapeless</pattern>
        <shadedPattern>shapelesspureconfig</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
I'm trying to build a Scala project using Docker's multi-stage build feature.
For starters, this is my Dockerfile:
FROM maven:3.6.0-jdk-11-slim AS maven
RUN apt-get update
WORKDIR /build
COPY pom.xml .
RUN mvn -B de.qaware.maven:go-offline-maven-plugin:resolve-dependencies
COPY src src
RUN mvn -B -o install spring-boot:repackage
FROM openjdk:11.0.6
WORKDIR /opt/app
COPY --from=maven /build/target/app.jar app.jar
CMD ["java", "-jar", "/opt/app/app.jar"]
EXPOSE 8080
I noticed that after the resolve-dependencies step finishes, Maven still tries to download dependencies during the install stage. The errors I get come from the scala-maven-plugin looking for dependencies that were not fetched in the resolve stage. They look like this:
Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.4.0:compile (default) on project app: wrap: org.apache.maven.artifact.resolver.ArtifactNotFoundException: Cannot access ... in offline mode and the artifact org.scala-lang:scala-compiler:jar:2.11.12 has not been downloaded from it before.
Even adding that dependency explicitly isn't enough, because it then fails on other dependencies.
The plugin in the POM looks like this:
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.4.0</version>
  <executions>
    <execution>
      <goals>
        <!-- Need to specify this explicitly, otherwise plugin won't be called when doing e.g. mvn compile -->
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <args>
      <!-- work-around for https://issues.scala-lang.org/browse/SI-8358 -->
      <arg>-nobootcp</arg>
      <arg>-Yresolve-term-conflict:package</arg>
    </args>
    <scalaVersion>${scala.version}</scalaVersion>
  </configuration>
</plugin>
It seems like the plugin doesn't stop there and downloads everything again.
Thanks, guys.
As a comment on your question already notes, it is better to use sbt, the first-class build tool for Scala. In particular, I suggest using sbt-native-packager together with the JavaAppPackaging and DockerPlugin plugins to create the Docker image without a Dockerfile. There are several tutorials for this on the web. Basically, you will need something like these lines in your build.sbt file (example from my project):
enablePlugins(JavaAppPackaging, JavaServerAppPackaging, DockerPlugin, AshScriptPlugin)
// ####### Dockerfile settings #######
import NativePackagerHelper._
packageName in Docker := packageName.value
version in Docker := version.value
dockerExposedPorts := List(8001, 2551)
dockerLabels := Map("user" -> "you.email#gmail.com")
dockerBaseImage := "openjdk:jre-alpine"
dockerRepository := Some("docker.user.name")
defaultLinuxInstallLocation in Docker := "/usr/local"
daemonUser in Docker := "daemon"
mappings in Universal ++= directory( baseDirectory.value / "src" / "main" / "resources" )
// ####### Dockerfile settings #######
and at the project/plugins.sbt file:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.6")
Then you execute the following commands in your console in order to create the Dockerfile at target/docker/stage/Dockerfile:
sbt docker:stage
sbt docker:publishLocal
Finally, to solve this issue I used this plugin for 'go-offline' instead of Maven's own:
<plugin>
  <groupId>de.qaware.maven</groupId>
  <artifactId>go-offline-maven-plugin</artifactId>
  <version>1.2.8</version>
</plugin>
Using it with this command:
mvn -B de.qaware.maven:go-offline-maven-plugin:resolve-dependencies
I also added this configuration to the scala-maven-plugin, which disables incremental compilation:
<configuration>
  <recompileMode>all</recompileMode>
</configuration>
So the full plugin looks like this:
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.4.0</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <recompileMode>all</recompileMode>
  </configuration>
</plugin>
I'm trying to use scalatest and spark-testing-base with Maven for integration-testing Spark. The Spark job reads a CSV file, validates the results, and inserts the data into a database. I'm trying to test the validation by putting in files of known format and seeing if and how they fail. This particular test just makes sure the validation passes. Unfortunately, scalatest can't find my tests.
Relevant pom plugins:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <skipTests>true</skipTests>
  </configuration>
</plugin>
<!-- enable scalatest -->
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <version>1.0</version>
  <configuration>
    <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
    <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
  </configuration>
  <executions>
    <execution>
      <id>test</id>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
And here's the test class:
// Imports added for completeness (SchemaStrategy and SchemaStrategyChooser are project classes)
import scala.util.{Failure, Success, Try}

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.sql.{DataFrame, DataFrameReader, SQLContext}
import org.scalatest.{BeforeAndAfter, FlatSpec, Matchers}

class ProficiencySchemaITest extends FlatSpec with Matchers with SharedSparkContext with BeforeAndAfter {
  private var schemaStrategy: SchemaStrategy = _
  private var dataReader: DataFrameReader = _

  before {
    val sqlContext = new SQLContext(sc)
    import sqlContext._
    import sqlContext.implicits._

    val dataInReader = sqlContext.read.format("com.databricks.spark.csv")
      .option("header", "true")
      .option("nullValue", "")
    schemaStrategy = SchemaStrategyChooser("dim_state_test_proficiency")
    dataReader = schemaStrategy.applySchema(dataInReader)
  }

  "Proficiency Validation" should "pass with the CSV file proficiency-valid.csv" in {
    val dataIn = dataReader.load("src/test/resources/proficiency-valid.csv")
    val valid: Try[DataFrame] = Try(schemaStrategy.validateCsv(dataIn))
    valid match {
      case Success(v) => ()
      case Failure(e) => fail("Validation failed on what should have been a clean file: ", e)
    }
  }
}
When I run mvn test, it can't find any tests and outputs this message:
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ load-csv-into-db ---
Discovery starting.
Discovery completed in 54 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 133 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
UPDATE
By using:
<suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites>
Instead of:
<wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
I can get that one test to run. Obviously, this is not ideal. It's possible wildcardSuites is broken; I'm going to open a ticket on GitHub and see what happens.
This is probably because there are space characters in the project path.
Remove the spaces from the project path and the tests can be discovered successfully.
Hope this helps.
Try excluding junit as a transitive dependency. It works for me. Example below, but note the Scala and Spark versions are specific to my environment.
<dependency>
  <groupId>com.holdenkarau</groupId>
  <artifactId>spark-testing-base_2.10</artifactId>
  <version>1.5.0_0.6.0</version>
  <scope>test</scope>
  <exclusions>
    <!-- junit is not compatible with scalatest -->
    <exclusion>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
    </exclusion>
  </exclusions>
</dependency>
In my case, it was because I wasn't using the following plugin:
<plugin>
  <groupId>org.scala-tools</groupId>
  <artifactId>maven-scala-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <scalaVersion>${scala.version}</scalaVersion>
    <args>
      <arg>-target:jvm-1.8</arg>
    </args>
  </configuration>
</plugin>
The issue I had with tests not being discovered came down to the fact that tests are discovered from the class files, so to make them discoverable I needed to add <goal>testCompile</goal> to the scala-maven-plugin goals.
In my case it was because of the nesting of tests inside the test directory and the use of the <memberOnlySuites> configuration. <memberOnlySuites> only looks for test files in the given package / directory. Instead, use <wildcardSuites>, which looks into a package / directory and all of its subdirectories.
This happens quite often when you are adding more tests to your test suite and organising them in a more structured manner.
Cause: Maven plugins do not compile your test code whenever you run mvn commands.
Workaround:
Run the Scala tests from your IDE, which will compile the test code and save it in the target directory. The next time you run mvn test, or any Maven command that internally triggers Maven's test cycle, it should run the Scala tests.
I have Eclipse Kepler, and I have installed the Maven and Scala plugins. I create a new Maven project and add the dependency
groupId: org.apache.spark
artifactId: spark-core_2.10
version: 1.1.0
as per the current doc at http://spark.apache.org/downloads.html. All is fine, and the jars for Scala 2.10 are added to the project. I then add the "Scala Nature" to the project; this adds Scala 2.11 and I end up with the following error:
More than one scala library found in the build path (C:/Eclipse/eclipse-jee-kepler-SR2-win32-x86_64/plugins/org.scala-lang.scala-library_2.11.2.v20140721-095018-73fb460c1c.jar, C:/Users/fff/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar).
At least one has an incompatible version.
Please update the project build path so it contains only compatible scala libraries.
Is it possible to use Spark (from Maven) and Scala IDE Plugin together? Any ideas on how to fix this problem?
Thanks for your help. Regards
In short, yes, it is possible.
Spark is currently using Scala 2.10, and the latest Scala IDE is "cross published" for 2.10 and 2.11. You need to choose the 2.10-based version, which is 3.0.3.
However, the next major version, 4.0, which is in release candidate mode, has multi-version support. You can create a Scala project and select the Scala version you'd like to use (2.10 or 2.11). You could give that a try if you feel like it.
If someone stumbles here while searching for the same thing:
I recently created Maven archetype for bootstrapping a new Spark 1.3.0 with Scala 2.10.4 project.
Follow instructions here:
https://github.com/spark-in-action/scala-archetype-sparkinaction
For IntelliJ IDEA, first generate project from command line and then import into IDE.
You have installed the Scala IDE plugin, but the Scala nature of a project is only useful if you include Scala classes in your project.
Spark and Scala are, however, made to work together. Make sure you use compatible versions. You can install Scala on your computer and then use the compatible Spark Maven dependency.
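To illustrate what "compatible" means (the version numbers here are just the ones from the question, not a recommendation): the Scala suffix of the Spark artifact must match the project's Scala version. In sbt the %% operator picks that suffix for you, for example:
scalaVersion := "2.10.4"

// %% appends the Scala binary suffix, so this resolves spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"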
Yes, you can. Use the POM I am providing below:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.spark-scala</groupId>
  <artifactId>spark-scala</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>${project.artifactId}</name>
  <description>Spark in Scala</description>
  <inceptionYear>2010</inceptionYear>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.tools.version>2.10</scala.tools.version>
    <!-- Put the Scala version of the cluster -->
    <scala.version>2.10.4</scala.version>
  </properties>

  <!-- repository to add org.apache.spark -->
  <repositories>
    <repository>
      <id>cloudera-repo-releases</id>
      <url>https://repository.cloudera.com/artifactory/repo/</url>
    </repository>
  </repositories>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <!-- see http://davidb.github.com/scala-maven-plugin -->
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.1</version>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.13</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
      <!-- "package" command plugin -->
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>1.2.1</version>
    </dependency>
  </dependencies>
</project>
There are two types of Spark JAR files (just by looking at the name):
Name includes the word "assembly" and not "core" (has Scala inside)
Name includes the word "core" and not "assembly" (no Scala inside)
You should include the "core" type in your Build Path via “Add External Jars” (the version you need), since the Scala IDE already provides a Scala library for you.
Alternatively, you can just take advantage of sbt and add the following dependency (again, pay attention to the versions you need):
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
Then you should NOT forcefully include any Spark JAR in the Build Path.
Happy sparking,
Zar
How can I set target JVM version in SBT?
In Maven (with maven-scala-plugin) it can be done as follows:
<plugin>
  ...
  <configuration>
    <scalaVersion>${scala.version}</scalaVersion>
    <args>
      <arg>-target:jvm-1.5</arg>
    </args>
  </configuration>
</plugin>
You can specify compiler options in the project definition:
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
You have to add the following (in your build.sbt file):
scalacOptions += "-target:jvm-1.8"
otherwise it won't work.
As suggested by others in the comments, the current sbt versions (1.0, 0.13.15) use the following notation for setting the source and target JVMs:
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
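Putting the two together, a minimal build.sbt sketch for a project with both Scala and Java sources might look like this (targeting Java 8 here is just an example):
// Scala compiler: emit bytecode for Java 8
scalacOptions += "-target:jvm-1.8"

// Java compiler (for any Java sources in the project)
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")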