GeoTools JAI fat jar causing problems in native dependencies - Scala

I have a Scala project using GeoTools / JAI dependencies in order to process ESRI ASCII Grid files to WKT in Java. When deploying a fat jar, I get a SIGSEGV in the native part of the JAI code.
It uses the following dependencies:
lazy val geotools = "17.0"
libraryDependencies ++= Seq(
  "org.geotools" % "gt-main" % geotools,
  "org.geotools" % "gt-arcgrid" % geotools,
  "org.geotools" % "gt-process-raster" % geotools)
When creating a fat jar like this:
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) =>
    xs match {
      // Concatenate everything in the services directory to keep GeoTools happy.
      case ("services" :: _ :: Nil) =>
        MergeStrategy.concat
      // Concatenate these to keep JAI happy.
      case ("javax.media.jai.registryFile.jai" :: Nil) | ("registryFile.jai" :: Nil) | ("registryFile.jaiext" :: Nil) =>
        MergeStrategy.concat
      case (name :: Nil) => {
        // Must exclude META-INF/*.([RD]SA|SF) to avoid "Invalid signature file digest for Manifest main attributes" exceptions.
        if (name.endsWith(".RSA") || name.endsWith(".DSA") || name.endsWith(".SF")) {
          MergeStrategy.discard
        }
        else {
          MergeStrategy.deduplicate
        }
      }
      case _ => MergeStrategy.deduplicate
    }
  case _ => MergeStrategy.deduplicate
}
One of JAI's JNI native C dependencies blows up with a SIGSEGV. When I instead simply use the (much less safe) MergeStrategy.first, like:
assemblyMergeStrategy in assembly := {
  // TODO bad strategy, find out why below does not work.
  case PathList("org", "datasyslab", xs @ _*) => MergeStrategy.singleOrError
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("META-INF", "MANIFEST.MF") => MergeStrategy.discard
  case _ => MergeStrategy.first
}
The code works without a SIGSEGV. Looking at the logs from MergeStrategy.singleOrError, I can't figure out which dependency is causing the problems in the native code:
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-coverage/jars/gt-coverage-17.0.jar:META-INF/registryFile.jai
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-zonalstats/jars/jt-zonalstats-1.4.0.jar:META-INF/registryFile.jai
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-process-raster/jars/gt-process-raster-17.0.jar:META-INF/registryFile.jai
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-rangelookup/jars/jt-rangelookup-1.4.0.jar:META-INF/registryFile.jai
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-contour/jars/jt-contour-1.4.0.jar:META-INF/registryFile.jai
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-vectorize/jars/jt-vectorize-1.4.0.jar:META-INF/registryFile.jai
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.affine/jt-affine/jars/jt-affine-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.scale/jt-scale/jars/jt-scale-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.vectorbin/jt-vectorbin/jars/jt-vectorbin-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.translate/jt-translate/jars/jt-translate-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.algebra/jt-algebra/jars/jt-algebra-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.bandmerge/jt-bandmerge/jars/jt-bandmerge-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.bandselect/jt-bandselect/jars/jt-bandselect-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.bandcombine/jt-bandcombine/jars/jt-bandcombine-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.border/jt-border/jars/jt-border-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.buffer/jt-buffer/jars/jt-buffer-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.crop/jt-crop/jars/jt-crop-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.mosaic/jt-mosaic/jars/jt-mosaic-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.lookup/jt-lookup/jars/jt-lookup-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.nullop/jt-nullop/jars/jt-nullop-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.rescale/jt-rescale/jars/jt-rescale-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.stats/jt-stats/jars/jt-stats-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.warp/jt-warp/jars/jt-warp-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.zonal/jt-zonal/jars/jt-zonal-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.binarize/jt-binarize/jars/jt-binarize-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.format/jt-format/jars/jt-format-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.colorconvert/jt-colorconvert/jars/jt-colorconvert-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.errordiffusion/jt-errordiffusion/jars/jt-errordiffusion-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.orderdither/jt-orderdither/jars/jt-orderdither-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.colorindexer/jt-colorindexer/jars/jt-colorindexer-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.imagefunction/jt-imagefunction/jars/jt-imagefunction-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.piecewise/jt-piecewise/jars/jt-piecewise-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.classifier/jt-classifier/jars/jt-classifier-1.0.13.jar:META-INF/registryFile.jaiext
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.rlookup/jt-rlookup/jars/jt-rlookup-1.0.13.jar:META-INF/registryFile.jaiext
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/javax.media/jai_imageio/jars/jai_imageio-1.1.jar:META-INF/services/javax.imageio.spi.ImageInputStreamSpi
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.imageio-ext/imageio-ext-streams/jars/imageio-ext-streams-1.1.17.jar:META-INF/services/javax.imageio.spi.ImageInputStreamSpi
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/javax.media/jai_imageio/jars/jai_imageio-1.1.jar:META-INF/services/javax.imageio.spi.ImageOutputStreamSpi
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.imageio-ext/imageio-ext-streams/jars/imageio-ext-streams-1.1.17.jar:META-INF/services/javax.imageio.spi.ImageOutputStreamSpi
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/javax.media/jai_imageio/jars/jai_imageio-1.1.jar:META-INF/services/javax.imageio.spi.ImageReaderSpi
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.imageio-ext/imageio-ext-tiff/jars/imageio-ext-tiff-1.1.17.jar:META-INF/services/javax.imageio.spi.ImageReaderSpi
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.imageio-ext/imageio-ext-arcgrid/jars/imageio-ext-arcgrid-1.1.17.jar:META-INF/services/javax.imageio.spi.ImageReaderSpi
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/javax.media/jai_imageio/jars/jai_imageio-1.1.jar:META-INF/services/javax.imageio.spi.ImageWriterSpi
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.imageio-ext/imageio-ext-tiff/jars/imageio-ext-tiff-1.1.17.jar:META-INF/services/javax.imageio.spi.ImageWriterSpi
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.imageio-ext/imageio-ext-arcgrid/jars/imageio-ext-arcgrid-1.1.17.jar:META-INF/services/javax.imageio.spi.ImageWriterSpi
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/javax.media/jai_imageio/jars/jai_imageio-1.1.jar:META-INF/services/javax.media.jai.OperationRegistrySpi
[error] /Users/geoheil/.ivy2/cache/it.geosolutions.jaiext.crop/jt-crop/jars/jt-crop-1.0.13.jar:META-INF/services/javax.media.jai.OperationRegistrySpi
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-zonalstats/jars/jt-zonalstats-1.4.0.jar:META-INF/services/javax.media.jai.OperationsRegistrySpi
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-process-raster/jars/gt-process-raster-17.0.jar:META-INF/services/javax.media.jai.OperationsRegistrySpi
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-rangelookup/jars/jt-rangelookup-1.4.0.jar:META-INF/services/javax.media.jai.OperationsRegistrySpi
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-contour/jars/jt-contour-1.4.0.jar:META-INF/services/javax.media.jai.OperationsRegistrySpi
[error] /Users/geoheil/.ivy2/cache/org.jaitools/jt-vectorize/jars/jt-vectorize-1.4.0.jar:META-INF/services/javax.media.jai.OperationsRegistrySpi
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-main/jars/gt-main-17.0.jar:META-INF/services/org.geotools.filter.FunctionFactory
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-process/jars/gt-process-17.0.jar:META-INF/services/org.geotools.filter.FunctionFactory
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-main/jars/gt-main-17.0.jar:META-INF/services/org.geotools.util.ConverterFactory
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-process-raster/jars/gt-process-raster-17.0.jar:META-INF/services/org.geotools.util.ConverterFactory
[error] deduplicate: different file contents found in the following:
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-main/jars/gt-main-17.0.jar:META-INF/services/org.opengis.filter.expression.Function
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-coverage/jars/gt-coverage-17.0.jar:META-INF/services/org.opengis.filter.expression.Function
[error] /Users/geoheil/.ivy2/cache/org.geotools/gt-cql/jars/gt-cql-17.0.jar:META-INF/services/org.opengis.filter.expression.Function
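To see how the conflicting copies actually differ, I could dump each jar's registry file with a small standalone helper like this (a sketch using only the JDK; DumpRegistryFiles is just an illustrative name, and the jar paths would be the ones from the log above):
import java.util.zip.ZipFile

// Sketch: pass the .ivy2 jar paths from the log above as arguments and
// diff the printed META-INF/registryFile.jai contents by hand.
object DumpRegistryFiles {
  def main(args: Array[String]): Unit = {
    val entryName = "META-INF/registryFile.jai"
    args.foreach { jarPath =>
      val zip = new ZipFile(jarPath)
      try {
        Option(zip.getEntry(entryName)).foreach { entry =>
          println(s"===== $jarPath =====")
          val source = scala.io.Source.fromInputStream(zip.getInputStream(entry))
          try println(source.mkString) finally source.close()
        }
      } finally zip.close()
    }
  }
}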
And here is the SIGSEGV information:
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007fd780b9bfe1, pid=21657, tid=0x00007f0674c4a700
#
# JRE version: OpenJDK Runtime Environment (8.0_121-b13) (build 1.8.0_121-8u121-b13-0ubuntu1.16.04.2-b13)
# Java VM: OpenJDK 64-Bit Server VM (25.121-b13 mixed mode linux-amd64 )
# Problematic frame:
# V [libjvm.so+0x8a4fe1]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/vagrant/development/projects/HybridAccess/spark/hs_err_pid21657.log
Compiled method (c1) 6599346 22508 ! 3 java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue::take (203 bytes)
total in heap [0x00007fd76bda5b90,0x00007fd76bda83a8] = 10264
relocation [0x00007fd76bda5cb8,0x00007fd76bda5eb8] = 512
main code [0x00007fd76bda5ec0,0x00007fd76bda7360] = 5280
stub code [0x00007fd76bda7360,0x00007fd76bda7508] = 424
metadata [0x00007fd76bda7508,0x00007fd76bda7560] = 88
scopes data [0x00007fd76bda7560,0x00007fd76bda7d80] = 2080
scopes pcs [0x00007fd76bda7d80,0x00007fd76bda8170] = 1008
dependencies [0x00007fd76bda8170,0x00007fd76bda8188] = 24
handler table [0x00007fd76bda8188,0x00007fd76bda8308] = 384
nul chk table [0x00007fd76bda8308,0x00007fd76bda83a8] = 160
And here are the links I have found (for Maven):
JAI dependencies in GeoTools are hard to deploy as a single fat jar (http://docs.geotools.org/latest/userguide/faq.html#how-do-i-create-an-executable-jar-for-my-geotools-app, http://osgeo-org.1560.x6.nabble.com/Building-runnable-jar-file-td4320365.html).
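For reference, a narrower variant I am considering (untested; loosely based on the GeoTools FAQ, and the choice of MergeStrategy.first for the JAI registry files is an assumption on my part, not a confirmed fix for the SIGSEGV):
assemblyMergeStrategy in assembly := {
  // Concatenate ServiceLoader registries so GeoTools finds all factories.
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
  // Assumption: take a single JAI registry file instead of concatenating,
  // since concatenated registry entries may register operations twice.
  case PathList("META-INF", "registryFile.jai") => MergeStrategy.first
  case PathList("META-INF", "registryFile.jaiext") => MergeStrategy.first
  case PathList("META-INF", "javax.media.jai.registryFile.jai") => MergeStrategy.first
  // Drop signature files to avoid "Invalid signature file digest" errors.
  case PathList("META-INF", name) if name.endsWith(".RSA") || name.endsWith(".DSA") || name.endsWith(".SF") =>
    MergeStrategy.discard
  case _ => MergeStrategy.deduplicate
}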

Related

Cannot get past "paranamer" error with JlinkPlugin in sbt-native-packager

We're trying to include a minimal JDK as part of a Windows installer for a Play 2.8.8 application, using the version of sbt-native-packager included with Play.
Alas, when packaging the application, we get the following error.
[error] Exception in thread "main" java.lang.module.FindException: Module paranamer not found, required by com.fasterxml.jackson.module.paranamer
[error] at java.base/java.lang.module.Resolver.findFail(Resolver.java:877)
[error] at java.base/java.lang.module.Resolver.resolve(Resolver.java:191)
[error] at java.base/java.lang.module.Resolver.resolve(Resolver.java:140)
[error] at java.base/java.lang.module.Configuration.resolve(Configuration.java:422)
[error] at java.base/java.lang.module.Configuration.resolve(Configuration.java:256)
[error] at jdk.jdeps/com.sun.tools.jdeps.JdepsConfiguration$Builder.build(JdepsConfiguration.java:564)
[error] at jdk.jdeps/com.sun.tools.jdeps.JdepsTask.buildConfig(JdepsTask.java:603)
[error] at jdk.jdeps/com.sun.tools.jdeps.JdepsTask.run(JdepsTask.java:557)
[error] at jdk.jdeps/com.sun.tools.jdeps.JdepsTask.run(JdepsTask.java:533)
[error] at jdk.jdeps/com.sun.tools.jdeps.Main.run(Main.java:64)
[error] at jdk.jdeps/com.sun.tools.jdeps.Main$JDepsToolProvider.run(Main.java:73)
[error] at java.base/java.util.spi.ToolProvider.run(ToolProvider.java:137)
[error] at ru.eldis.toollauncher.ToolLauncher.runTool(ToolLauncher.java:68)
[error] at ru.eldis.toollauncher.ToolLauncher.lambda$main$1(ToolLauncher.java:33)
[error] at ru.eldis.toollauncher.ToolLauncher.main(ToolLauncher.java:48)
[error] java.lang.RuntimeException: Nonzero exit value: 1
[error] at scala.sys.package$.error(package.scala:30)
[error] at com.typesafe.sbt.packager.archetypes.jlink.JlinkPlugin$.runForOutput(JlinkPlugin.scala:212)
[error] at com.typesafe.sbt.packager.archetypes.jlink.JlinkPlugin$.$anonfun$runJavaTool$2(JlinkPlugin.scala:197)
[error] at sbt.io.IO$.withTemporaryFile(IO.scala:534)
[error] at sbt.io.IO$.withTemporaryFile(IO.scala:544)
[error] at com.typesafe.sbt.packager.archetypes.jlink.JlinkPlugin$.runJavaTool(JlinkPlugin.scala:191)
[error] at com.typesafe.sbt.packager.archetypes.jlink.JlinkPlugin$.$anonfun$projectSettings$11(JlinkPlugin.scala:53)
[error] at com.typesafe.sbt.packager.archetypes.jlink.JlinkPlugin$.$anonfun$projectSettings$9(JlinkPlugin.scala:74)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] at sbt.Execute.work(Execute.scala:291)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[error] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[error] at java.base/java.lang.Thread.run(Thread.java:829)
[error] (jlinkModules) Nonzero exit value: 1
[error] Total time: 1 s, completed Sep 23, 2021, 11:57:58 AM
This error is a known problem with the JlinkPlugin, but we have already applied the recommended "fix" (the definition of the jlinkModulePath setting, shown below) and it doesn't appear to resolve the problem. Running show jlinkModulePath within sbt results in the following:
[info] * C:\Users\Me\AppData\Local\Coursier\Cache\v1\https\repo1.maven.org\maven2\com\thoughtworks\paranamer\paranamer\2.8\paranamer-2.8.jar
so that looks OK.
Any suggestions?
Here's the relevant portions of our build.sbt file:
// Enable the Windows installer plugin.
//
// Note: This plugin requires that WiX Toolkit be installed to create the Windows installation
// script.
enablePlugins(WindowsPlugin)
// Enable the Java JLink plugin.
//
// This adds a Java installation to the web-application's installer file.
enablePlugins(JlinkPlugin)
// Package installation settings for Windows.
maintainer := "Me <me@emycompany.com>"
Windows / name := "MyProduct-" + version.value
Windows / packageSummary := "My web app"
Windows / packageDescription := """Windows installer for the My Web-Application."""
// WiX Toolkit product and upgrade GUIDs.
//
// DO NOT CHANGE THESE VALUES!
//
// NOTE: These are not the actual values. Doh!
wixProductId := "someproductguid"
wixProductUpgradeId := "someproductupgradeguid"
// Configure the Windows BAT file's JVM configuration file location.
batScriptConfigLocation := Some("%APP_HOME%\\conf\\jvmopts-bat")
// Configure JLink so that it knows where to find the "paranamer" module (that is not a typo: it really is "paranamer",
// not "parameter").
jlinkModulePath := {
  // Get the full classpath with all the resolved dependencies.
  fullClasspath.in(jlinkBuildImage).value
    // Find the ones that have `paranamer` as their names.
    .filter { item =>
      item.get(moduleID.key).exists { modId =>
        modId.name == "paranamer"
      }
    }
    // Get raw `File` objects.
    .map(_.data)
}
// Make the jlink output a little smaller by removing elements we do not require.
jlinkOptions ++= Seq(
  "--no-header-files",
  "--no-man-pages",
  "--compress=2"
)

scala sbt libraryDependencies provided - Avoid downloading 3rd party library

I have the following Spark Scala code that references 3rd-party libraries:
package com.protegrity.spark

import org.apache.spark.sql.api.java.UDF2
import com.protegrity.spark.udf.ptyProtectStr
import com.protegrity.spark.udf.ptyProtectInt

class ptyProtectStr extends UDF2[String, String, String] {
  def call(input: String, dataElement: String): String = {
    return ptyProtectStr(input, dataElement);
  }
}

class ptyUnprotectStr extends UDF2[String, String, String] {
  def call(input: String, dataElement: String): String = {
    return ptyUnprotectStr(input, dataElement);
  }
}

class ptyProtectInt extends UDF2[Integer, String, Integer] {
  def call(input: Integer, dataElement: String): Integer = {
    return ptyProtectInt(input, dataElement);
  }
}

class ptyUnprotectInt extends UDF2[Integer, String, Integer] {
  def call(input: Integer, dataElement: String): Integer = {
    return ptyUnprotectInt(input, dataElement);
  }
}
I want to create a JAR file using sbt. My build.sbt looks like the following:
name := "Protegrity UDF"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "com.protegrity.spark" % "udf" % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-core" % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.3.2" % "provided"
)
As you can see, I am trying to create a thin JAR using the "provided" option, as my Spark environment already contains those libraries.
In spite of using "provided", sbt is trying to download the artifact from Maven and throws the error below:
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading com.protegrity.spark:udf:2.3.2
[error] Not found
[error] Not found
[error] not found: C:\Users\user1\.ivy2\local\com.protegrity.spark\udf\2.3.2\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/com/protegrity/spark/udf/2.3.2/udf-2.3.2.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:249)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$35(CoursierDependencyResolution.scala:218)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:218)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:52)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:102)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:69)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:115)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:115)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:96)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:150)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error] at sbt.Classpaths$.$anonfun$updateTask0$5(Defaults.scala:2950)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
[error] at java.lang.Thread.run(Unknown Source)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading com.protegrity.spark:udf:2.3.2
[error] Not found
[error] Not found
[error] not found: C:\Users\user1\.ivy2\local\com.protegrity.spark\udf\2.3.2\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/com/protegrity/spark/udf/2.3.2/udf-2.3.2.pom
What change should I make in build.sbt to skip the Maven download for "com.protegrity.spark"? Interestingly, I don't face this issue for "org.apache.spark" in the same build.
Assuming that you have the JAR file available (but not through Maven or another artifact repository) wherever you're compiling the code, just place the JAR in (by default) the lib directory within your project (the path can be changed with the unmanagedBase setting in build.sbt if you need to do that for some reason).
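For instance, a minimal sketch of that setting (the "custom-lib" directory name is just an illustration):
// In build.sbt: pick up unmanaged jars from custom-lib/ instead of the default lib/.
unmanagedBase := baseDirectory.value / "custom-lib"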
Note that this will result in the unmanaged JAR being included in an assembly JAR. If you want to build a "slightly less fat" JAR that excludes the unmanaged JAR, you'll have to filter it out. One way to accomplish this is with
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp.filter(_.data.getName == "name-of-unmanaged.jar")
}
If you don't have the JAR (or perhaps something very close to the JAR) handy, how exactly do you expect the compiler to typecheck your calls into the JAR?

Object kkapi is not a member of package

I am trying to import a module from another project and did the following:
As you can see in the image, the imported library is /home/developer/...kafka-api.
I am using the imported library in my tests.
When I compile my spec files with test:compile, I get the following error:
[IJ]sbt:auth_stream> test:compile
[info] Compiling 3 Scala sources to /home/developer/Desktop/microservices/bary/auth-stream/target/scala-2.12/test-classes ...
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:14:20: object kkapi is not a member of package io.khinkali
[error] import io.khinkali.kkapi.consumer.{KkConsumer, KkConsumerConfig, KkConsumerCreator}
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:15:20: object kkapi is not a member of package io.khinkali
[error] import io.khinkali.kkapi.producer.{KkProducer, KkProducerCreator, MaxBlockMsConfig}
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:24:56: not found: value KkConsumer
[error] private val consumer: IO[Consumer[String, String]] = KkConsumer.create(createConsumer())
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:52:5: not found: type KkConsumerCreator
[error] : KkConsumerCreator
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:25:56: not found: value KkProducer
[error] private val producer: IO[Producer[String, String]] = KkProducer.create(createProducer())
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:46:5: not found: type KkProducerCreator
[error] : KkProducerCreator
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:47:5: not found: value KkProducerCreator
[error] = KkProducerCreator(sys.env.get("KAFKA_SERVER").get,
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:49:10: not found: value MaxBlockMsConfig
[error] List(MaxBlockMsConfig(2000)))
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:53:5: not found: value KkConsumerCreator
[error] = KkConsumerCreator(sys.env.get("KAFKA_SERVER").get,
[error] ^
[error] /home/developer/Desktop/microservices/bary/auth-stream/src/test/scala/io/khinkali/auth/AppSpec.scala:57:16: not found: type KkConsumerConfig
[error] List.empty[KkConsumerConfig])
[error] ^
[error] 10 errors found
[error] (test:compileIncremental) Compilation failed
What am I doing wrong?
Note that the package names of both projects start the same way:
The current project:
The imported project:
As you can see, the names differ only at the end. Could that be the problem?
What I am trying to achieve is to use a function from the kafka-api project.
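For context, the wiring between the two projects in sbt usually looks something like the following (a sketch with placeholder paths and ids, since my actual build definition is only visible in the screenshots):
// build.sbt sketch -- the path and project id are placeholders, not my real values.
lazy val kafkaApi = ProjectRef(file("../kafka-api"), "kafka-api")

lazy val authStream = (project in file("."))
  .dependsOn(kafkaApi % "compile->compile;test->compile")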

Error after running "sbt test" in Chisel 3

I'm trying to use Chisel 3.
I tried to test the GCD.scala file in the Chisel project template repo using the sbt test and sbt "test-only example.GCD" commands, following the answer to a previous question. But this gives errors that I cannot find the reason for. I didn't make any changes to the build.sbt file or the repo layout. I'm posting only the last part of the error message, since it is very long and repetitive.
[info] Loading project definition from /home/isuru/fyp/ChiselProjects/TrialProject/project
[info] Set current project to chisel-module-template (in build file:/home/isuru/fyp/ChiselProjects/TrialProject/)
[info] Compiling 1 Scala source to /home/isuru/fyp/ChiselProjects/TrialProject/target/scala-2.11/classes...
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:5: not found: object Chisel3
[error] import Chisel3._
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:7: not found: type Module
[error] class GCD extends Module {
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:8: not found: type Bundle
[error] val io = new Bundle {
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:9: not found: value UInt
[error] val a = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:9: not found: value INPUT
[error] val a = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:10: not found: value UInt
[error] val b = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:10: not found: value INPUT
[error] val b = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:11: not found: value Bool
[error] val e = Bool(INPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:11: not found: value INPUT
[error] val e = Bool(INPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:12: not found: value UInt
[error] val z = UInt(OUTPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:12: not found: value OUTPUT
[error] val z = UInt(OUTPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:13: not found: value Bool
[error] val v = Bool(OUTPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:13: not found: value OUTPUT
[error] val v = Bool(OUTPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:15: not found: value Reg
[error] val x = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:15: not found: value UInt
[error] val x = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:16: not found: value Reg
[error] val y = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:16: not found: value UInt
[error] val y = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:17: not found: value when
[error] when (x > y) { x := x - y }
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:18: not found: value unless
[error] unless (x > y) { y := y - x }
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:19: not found: value when
[error] when (io.e) { x := io.a; y := io.b }
[error] ^
[error] 20 errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed Dec 1, 2016 8:26:25 PM
The errors you have shown suggest that sbt is somehow not finding Chisel. Could you by chance show the full list of errors (especially the early ones)? With the following sequence of commands I am unable to reproduce the errors you are seeing:
git clone git@github.com:ucb-bar/chisel-template.git
cd chisel-template
sbt test
It is not the cause of this issue, but to run the test in chisel-template you should actually run sbt "test-only examples.test.GCDTester". example.GCD is the top of the design, but to run the test you have to refer to the Tester class in src/test/scala/examples/test/GCDUnitTest.scala.
I just encountered this same problem when making my own Chisel project. However, it was not that the import chisel3._ was wrong. The problem I had was that I did not have a build.sbt file in my directory.
I found my solution here:
https://chisel.eecs.berkeley.edu/2.0.6/getting-started.html
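For reference, a minimal build.sbt sketch for a Chisel 3 project looks roughly like this (the name and version numbers are illustrative assumptions, not taken from that guide):
// Minimal build.sbt sketch for a Chisel 3 project (versions are assumptions).
name := "my-chisel-project"

scalaVersion := "2.11.12"

resolvers += Resolver.sonatypeRepo("releases")

libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.1.0"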

Apache Spark Build error

I'm building the Apache Spark source code on Ubuntu 14.04.4 (Spark version 1.6.0 with Scala code runner version 2.10.4) with the command
sudo sbt/sbt assembly
and I get the following error:
[warn] def deleteRecursively(dir: TachyonFile, client: TachyonFS) {
[warn]     ^
[error]
[error]      while compiling: /home/ashish/spark-apps/spark-1.6.1/core/src/main/scala/org/apache/spark/util/random/package.scala
[error]         during phase: jvm
[error]      library version: version 2.10.5
[error]     compiler version: version 2.10.5
[error]   reconstructed args: -deprecation -Xplugin:/home/ashish/.ivy2/cache/org.spark-project/genjavadoc-plugin_2.10.5/jars/genjavadoc-plugin_2.10.5-0.9-spark0.jar -feature -P:genjavadoc:out=/home/ashish/spark-apps/spark-1.6.1/core/target/java -classpath /home/ashish/spark-apps/spark-1.6.1/core/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/launcher/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/common/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/network/shuffle/target/scala-2.10/classes:/home/ashish/spark-apps/spark-1.6.1/unsafe/target/scala-2.10/classes:/home/ashish/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:/home/ashish/.ivy2/cache/com.google.guava/guava/bundles/guava-14.0.1.jar:/home/ashish/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:/home/ashish/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.8.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-annotations/bundles/jackson-annotations-2.4.4.jar:/home/ashish/.ivy2/cache/com.fasterxml.jackson.core/jackson-......and many other jars...
[error]
[error]   last tree to typer: Literal(Constant(collection.mutable.Map))
[error]               symbol: null
[error]    symbol definition: null
[error]                  tpe: Class(classOf[scala.collection.mutable.Map])
[error]        symbol owners:
[error]       context owners: package package -> package random
[error]
[error] == Enclosing template or block ==
[error]
[error] Template( // val <local package>: <notype> in package random, tree.tpe=org.apache.spark.util.random.package.type
[error]   "java.lang.Object" // parents
[error]   ValDef(
[error]     private
[error]     "_"
[error]     <tpt>
[error]     <empty>
[error]   )
[error]   DefDef( // def <init>(): org.apache.spark.util.random.package.type in package random
[error]     <method>
[error]     "<init>"
[error]     []
[error]     List(Nil)
[error]     <tpt> // tree.tpe=org.apache.spark.util.random.package.type
[error]     Block( // tree.tpe=Unit
[error]       Apply( // def <init>(): Object in class Object, tree.tpe=Object
[error]         package.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object
[error]         Nil
[error]       )
[error]       ()
[error]     )
[error]   )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] ConstantType(value = Constant(collection.mutable.Map))
[error]
[error] uncaught exception during compilation: java.io.IOException
[error] File name too long
[warn] 45 warnings found
[error] two errors found
[error] (core/compile:compile) Compilation failed
[error] Total time: 5598 s, completed 5 Apr, 2016 9:06:50 AM
Where am I going wrong?
You should build Spark with Maven...
download the source and run ./bin/mvn clean package
Probably similar to http://apache-spark-user-list.1001560.n3.nabble.com/spark-github-source-build-error-td10532.html
Try sudo sbt/sbt clean assembly