Unable to import locally published Scala plugin

I have a project which I publish locally to my .m2 directory as a plugin, and which I later need to import into a different Scala project and use.
The publishing step seems to execute correctly.
The build.sbt file of the plugin project looks like this:
lazy val root = (project in file("."))
  .enablePlugins(SbtPlugin)
  .settings(
    name := "pluginName",
    organization := "com.myOrg",
    pluginCrossBuild / sbtVersion := {
      scalaBinaryVersion.value match {
        case "2.12" => "1.4.6" // set minimum sbt version
      }
    }
  )

resolvers += "confluent" at "https://packages.confluent.io/maven"

libraryDependencies ++= Seq(
  "io.confluent" % "kafka-schema-registry-client" % "7.0.1"
  // some other dependencies
)
After running the compile and publishLocal commands in the sbt shell, I get the following output:
[info] delivering ivy file to /Users/me/Work/repos/external/pluginName/target/scala-2.12/sbt-1.0/ivy-1.0.0.xml
[info] published pluginName to /Users/me/.ivy2/local/com.myOrg/pluginName/scala_2.12/sbt_1.0/1.0.0/poms/pluginName.pom
[info] published pluginName to /Users/me/.ivy2/local/com.myOrg/pluginName/scala_2.12/sbt_1.0/1.0.0/jars/pluginName.jar
[info] published pluginName to /Users/me/.ivy2/local/com.myOrg/pluginName/scala_2.12/sbt_1.0/1.0.0/srcs/pluginName-sources.jar
[info] published pluginName to /Users/me/.ivy2/local/com.myOrg/pluginName/scala_2.12/sbt_1.0/1.0.0/docs/pluginName-javadoc.jar
[info] published ivy to /Users/me/.ivy2/local/com.myOrg/pluginName/scala_2.12/sbt_1.0/1.0.0/ivys/ivy.xml
[success] Total time: 0 s, completed 3 Jan 2022, 10:07:43
In order to import/install this plugin in the other Scala project, I added the following line to the plugins.sbt file:
addSbtPlugin("com.myOrg" % "pluginName" % "1.0.0")
I also added libs-release-local and libs-snapshot-local to the externalResolvers section in the build.sbt file.
After reloading and compiling the project, I received this error:
[error] (update) sbt.librarymanagement.ResolveException: Error downloading io.confluent:kafka-schema-registry-client:7.0.1
[error] Not found
[error] Not found
[error] not found: https://repo1.maven.org/maven2/io/confluent/kafka-schema-registry-client/7.0.1/kafka-schema-registry-client-7.0.1.pom
[error] not found: /Users/me/.ivy2/local/io.confluent/kafka-schema-registry-client/7.0.1/ivys/ivy.xml
[error] not found: https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/io.confluent/kafka-schema-registry-client/7.0.1/ivys/ivy.xml
[error] not found: https://repo.typesafe.com/typesafe/ivy-releases/io.confluent/kafka-schema-registry-client/7.0.1/ivys/ivy.xml
I am kind of new to Scala and I don't understand what I am doing wrong.
Can anyone shed some light on this problem?

You're publishing to your local Maven cache, but sbt uses Ivy.
Try removing the publishTo setting; it shouldn't be needed. Just use the publishLocal task to publish to your local Ivy cache.
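One more thing worth checking, since the failure is on the plugin's transitive kafka-schema-registry-client dependency: resolvers declared in the plugin's own build are not carried along to consuming builds, so the Confluent repository has to be declared again on the consumer side. A minimal sketch of the consuming project's project/plugins.sbt, reusing the coordinates from the question (the resolver line is the assumption here):
// project/plugins.sbt of the consuming build.
// Resolvers are not inherited from the published plugin, so the Confluent
// repo must be repeated here for the plugin's transitive dependency.
resolvers += "confluent" at "https://packages.confluent.io/maven"
addSbtPlugin("com.myOrg" % "pluginName" % "1.0.0")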

Related

chisel compilation error: object plugin is not a member of package chisel3.internal

I am studying chisel3 with a small trial project.
I finished the code and fixed several syntax issues during compilation; then it reported an error without indicating an error file or line number.
$ sbt test
[info] welcome to sbt 1.4.9 (Red Hat, Inc. Java 1.8.0_292)
[info] loading settings for project fparser-build from plugins.sbt ...
[info] loading project definition from /mnt/disk1/yupeng/repos/fparser/project
[info] loading settings for project root from build.sbt ...
[info] set current project to fparser (in build file:/mnt/disk1/yupeng/repos/fparser/)
[info] compiling 3 Scala sources to /mnt/disk1/yupeng/repos/fparser/target/scala-2.12/classes ...
[error] ## Exception when compiling 3 sources to /mnt/disk1/yupeng/repos/fparser/target/scala-2.12/classes
[error] scala.reflect.internal.Types$TypeError: object plugin is not a member of package chisel3.internal
[error]
[error]
[error] scala.reflect.internal.Types$TypeError: object plugin is not a member of package chisel3.internal
[error] (Compile / compileIncremental) scala.reflect.internal.Types$TypeError: object plugin is not a member of package chisel3.internal
[error] Total time: 3 s, completed Jul 16, 2021 4:38:42 PM
What does it mean? Please help.
I just found that the error goes away after I change the chisel3 version in build.sbt:
libraryDependencies ++= Seq(
  "edu.berkeley.cs" %% "chisel3" % "3.4.3",
  // "edu.berkeley.cs" %% "chisel3" % "3.2.6", // this one generates the plugin error above
  "edu.berkeley.cs" %% "chiseltest" % "0.3.3" % "test",
  "edu.berkeley.cs" %% "rocketchip" % "1.2.6"
)
Previously I had changed from 3.4.3 to 3.2.6 because of this sbt warning:
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
I ran sbt evicted and it said rocketchip 1.2.6 should use chisel3 3.2.6.
Maybe someone can clarify.
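For what it's worth: the chisel3 3.4 series added compiler-plugin support under the chisel3.internal.plugin package, which does not exist in 3.2.6, and chiseltest 0.3.3 is built against the 3.4 series, so pairing it with chisel3 3.2.6 would plausibly produce exactly this missing-object error. If the eviction warning is the only concern, a sketch (versions taken from the question; it assumes chisel3 3.4.3 is acceptable for rocketchip at runtime) that keeps the whole graph on the newer chisel3 instead of downgrading:
// build.sbt — pin chisel3 explicitly so eviction resolves to the
// version that chiseltest 0.3.3 was compiled against
dependencyOverrides += "edu.berkeley.cs" %% "chisel3" % "3.4.3"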

How to correctly start a Scala Project with sbt in IntelliJ?

I want to build a Scala Client that connects to the Twitter Streaming API. I created the app on Twitter to get the API and Token Keys. Now I created a new Scala Project in IntelliJ and I would like to use the OAuth2 Client from https://index.scala-lang.org/dakatsuka/akka-http-oauth2-client/akka-http-oauth2-client/0.2.0?target=_2.12
What is the correct way of proceeding? I inserted the dependency statement in the build.sbt file:
libraryDependencies += "com.github.dakatsuka" %% "akka-http-oauth2-client" % "0.1.0"
and added the import statement at the top of the class:
import java.net.URI
import com.github.dakatsuka.akka.http.oauth2.client.{ Client, Config }

object Client extends App {
  val config = Config(
    clientId = "...",
    clientSecret = "...",
    site = URI.create("api.twitter.com")
  )

  val client = Client(config)
  // for the moment nothing else
}
Is this correct, and what do I need to do so that it obtains the dakatsuka library?
EDIT to follow up on comment
According to the documentation, I enter reload and then update at the sbt shell prompt, but when I do that it returns:
[IJ]sbt:stream_scala> update
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading com.github.dakatsuka:akka-http-oauth2-client_2.13:0.2.0
[error] Not found
[error] Not found
[error] not found: /home/mzh/.ivy2/local/com.github.dakatsuka/akka-http-oauth2-client_2.13/0.2.0/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/com/github/dakatsuka/akka-http-oauth2-client_2.13/0.2.0/akka-http-oauth2-client_2.13-0.2.0.pom
[error] not found: https://oss.sonatype.org/content/repositories/snapshots/com/github/dakatsuka/akka-http-oauth2-client_2.13/0.2.0/akka-http-oauth2-client_2.13-0.2.0.pom
[error] Total time: 2 s, completed Jul 6, 2020, 7:36:08 PM
[IJ]sbt:stream_scala>
The Sonatype Resolver is there:
[IJ]sbt:stream_scala> inspect resolvers
[info] Setting: scala.collection.Seq[sbt.librarymanagement.Resolver] = Vector(Sonatype OSS Snapshots: https://oss.sonatype.org/content/repositories/snapshots)
[info] Description:
[info] The user-defined additional resolvers for automatically managed dependencies.
My build.sbt looks like this:
name := "stream_scala"
version := "0.1"
scalaVersion := "2.13.3"
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
libraryDependencies += "com.github.dakatsuka" %% "akka-http-oauth2-client" % "0.2.0"

Modifying and Building Spark core

I am trying to make a modification to the Apache Spark source code. I created a new method and added it to the RDD.scala file within the Spark source code I downloaded. After making the modification to RDD.scala, I built Spark using
mvn -Dhadoop.version=2.2.0 -DskipTests clean package
I then created a sample Scala Spark Application as mentioned here
I tried using the new function I created, and I got a compilation error when using sbt to create a jar for my application. How exactly do I compile Spark with my modification and attach the modified jar to my project? The file I modified is RDD.scala within the core project. I run sbt package from the root dir of my Spark application project.
Here is the sbt file:
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.3.0"
Here is the error:
sbt package
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
[info] Set current project to Noah Spark (in build file:/Users/r/Downloads/spark-proj/n-spark/)
[info] Updating {file:/Users/r/Downloads/spark-proj/n-spark/}n-spark...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/r/Downloads/spark-proj/n-spark/target/scala-2.11/classes...
[error] /Users/r/Downloads/spark-proj/n-spark/src/main/scala/SimpleApp.scala:11: value reducePrime is not a member of org.apache.spark.rdd.RDD[Int]
[error] logData.reducePrime(_+_);
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 24 s, completed Apr 11, 2015 2:24:03 AM
UPDATE
Here is the updated sbt file
name := "N Spark"
version := "1.0"
scalaVersion := "2.10"
libraryDependencies += "org.apache.spark" % "1.3.0"
I get the following error for this file:
[info] Loading global plugins from /Users/Raggy/.sbt/0.13/plugins
/Users/Raggy/Downloads/spark-proj/noah-spark/simple.sbt:7: error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.impl.GroupArtifactID] found,
so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += "org.apache.spark" % "1.3.0"
Delete libraryDependencies from build.sbt and just copy the custom-built Spark jar to the lib directory in your application project.
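In other words, treat the rebuilt Spark as an unmanaged dependency: sbt automatically puts every jar found in the project's lib/ directory on the classpath. A sketch, with illustrative paths (the exact jar location depends on how Spark was built):
// copy the jar produced by the Spark build into the application project, e.g.:
//   cp $SPARK_HOME/core/target/spark-core_2.11-1.3.0.jar myapp/lib/
// build.sbt then needs no spark-core entry at all:
name := "N Spark"
version := "1.0"
scalaVersion := "2.11.6"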

How can I use an sbt plugin as a dependency in a multi-project build?

I have two sbt plugin projects that both use multi-project builds. I would like to use one of the plugins as a dependency for the other. I was able to get this working with single project builds, but once I move to the multi-project build, I can't seem to get the dependencies to link up correctly.
my-test-plugin
build.sbt
lazy val commonSettings = Seq(
  organization := "com.example",
  name := "my-test-plugin",
  version := "0.1.0-SNAPSHOT",
  scalaVersion := "2.10.5"
)

// The contents of root are largely unimportant here
lazy val root = (project in file(".")).settings(commonSettings: _*)

lazy val plugin = (project in file("plugin"))
  .settings(commonSettings: _*)
  .settings(
    sbtPlugin := true
  )
my-test-plugin/plugin/src/main/scala/PluginTest.scala
package com.example.test

// Sample code I would like to access from another plugin
object PluginTest {
  def foo: Unit = println("test")
}
my-sub-plugin
build.sbt
lazy val commonSettings = Seq(
  organization := "com.sample",
  name := "my-sub-plugin",
  version := "0.1.0-SNAPSHOT",
  scalaVersion := "2.10.5"
)

lazy val root = (project in file(".")).settings(commonSettings: _*)

lazy val plugin = (project in file("plugin"))
  .settings(commonSettings: _*)
  .settings(
    sbtPlugin := true,
    libraryDependencies += Defaults.sbtPluginExtra("com.example" % "my-test-plugin" % "0.1.0-SNAPSHOT", "0.13", "2.10")
  )
  .dependsOn(root)
my-sub-plugin/plugin/src/main/scala/SubPluginTest.scala
package com.sample.test

object SubPluginTest {
  def bar = com.example.test.PluginTest.foo
}
But the last file doesn't compile:
[error] /home/mike/code/sbt-tests/my-sub-plugin/plugin/src/main/scala/SubPluginTest.scala:4: object example is not a member of package com
[error] def bar = com.example.test.PluginTest.foo
[error] ^
When I run publish-local and plugin/publish-local on both projects (rather, just compile on the second), the artifacts resolve correctly, but SubPluginTest.scala fails to compile with the above error, as if the dependency isn't there. However, if I remove the root projects and put the plugin files in root (without lazy vals or anything, just a flat build.sbt structure), it works.
What am I missing here?
I don't think it's relevant, but I tried 0.13.5 and 0.13.8. I've also fruitlessly tried adding the dependency without sbtPluginExtra, and putting them in plugins.sbt as well (I didn't think that would work, but hey).
Edit:
The dependency jars appear to exist locally and resolve correctly:
$ jar tf ~/.ivy2/local/com.example/my-test-plugin/scala_2.10/sbt_0.13/0.1.0-SNAPSHOT/jars/my-test-plugin.jar
META-INF/MANIFEST.MF
com/
com/example/
com/example/test/
com/example/test/PluginTest$.class
com/example/test/PluginTest.class
$ jar tf ~/.ivy2/local/com.example/my-test-plugin_2.10/0.1.0-SNAPSHOT/jars/my-test-plugin_2.10.jar
META-INF/MANIFEST.MF
com/
com/example/
com/example/test/
com/example/test/DummyCode.class
com/example/test/DummyCode$.class
You're not setting a distinct name for each module:
my-test-plugin
> ;show root/name ;show plugin/name
[info] my-test-plugin
[info] my-test-plugin
my-sub-plugin
> ;show root/name ;show plugin/name
[info] my-sub-plugin
[info] my-sub-plugin
As you can see, publishing in my-test-plugin works:
> ;root/publishLocal ;plugin/publishLocal
[info] Wrote /Users/dnw/Desktop/sbt-tests/my-test-plugin/target/scala-2.10/my-test-plugin_2.10-0.1.0-SNAPSHOT.pom
[info] :: delivering :: com.example#my-test-plugin_2.10;0.1.0-SNAPSHOT :: 0.1.0-SNAPSHOT :: integration :: Sat Apr 11 09:16:15 BST 2015
[info] delivering ivy file to /Users/dnw/Desktop/sbt-tests/my-test-plugin/target/scala-2.10/ivy-0.1.0-SNAPSHOT.xml
[info] published my-test-plugin_2.10 to /Users/dnw/.ivy2/local/com.example/my-test-plugin_2.10/0.1.0-SNAPSHOT/poms/my-test-plugin_2.10.pom
[info] published my-test-plugin_2.10 to /Users/dnw/.ivy2/local/com.example/my-test-plugin_2.10/0.1.0-SNAPSHOT/jars/my-test-plugin_2.10.jar
[info] published my-test-plugin_2.10 to /Users/dnw/.ivy2/local/com.example/my-test-plugin_2.10/0.1.0-SNAPSHOT/srcs/my-test-plugin_2.10-sources.jar
[info] published my-test-plugin_2.10 to /Users/dnw/.ivy2/local/com.example/my-test-plugin_2.10/0.1.0-SNAPSHOT/docs/my-test-plugin_2.10-javadoc.jar
[info] published ivy to /Users/dnw/.ivy2/local/com.example/my-test-plugin_2.10/0.1.0-SNAPSHOT/ivys/ivy.xml
[success] Total time: 0 s, completed 11-Apr-2015 09:16:15
[info] Wrote /Users/dnw/Desktop/sbt-tests/my-test-plugin/plugin/target/scala-2.10/sbt-0.13/my-test-plugin-0.1.0-SNAPSHOT.pom
[info] :: delivering :: com.example#my-test-plugin;0.1.0-SNAPSHOT :: 0.1.0-SNAPSHOT :: integration :: Sat Apr 11 09:16:15 BST 2015
[info] delivering ivy file to /Users/dnw/Desktop/sbt-tests/my-test-plugin/plugin/target/scala-2.10/sbt-0.13/ivy-0.1.0-SNAPSHOT.xml
[info] published my-test-plugin to /Users/dnw/.ivy2/local/com.example/my-test-plugin/scala_2.10/sbt_0.13/0.1.0-SNAPSHOT/poms/my-test-plugin.pom
[info] published my-test-plugin to /Users/dnw/.ivy2/local/com.example/my-test-plugin/scala_2.10/sbt_0.13/0.1.0-SNAPSHOT/jars/my-test-plugin.jar
[info] published my-test-plugin to /Users/dnw/.ivy2/local/com.example/my-test-plugin/scala_2.10/sbt_0.13/0.1.0-SNAPSHOT/srcs/my-test-plugin-sources.jar
[info] published my-test-plugin to /Users/dnw/.ivy2/local/com.example/my-test-plugin/scala_2.10/sbt_0.13/0.1.0-SNAPSHOT/docs/my-test-plugin-javadoc.jar
[info] published ivy to /Users/dnw/.ivy2/local/com.example/my-test-plugin/scala_2.10/sbt_0.13/0.1.0-SNAPSHOT/ivys/ivy.xml
[success] Total time: 0 s, completed 11-Apr-2015 09:16:15
But note the publish paths:
com.example/my-test-plugin_2.10/0.1.0-SNAPSHOT/jars/my-test-plugin_2.10.jar
com.example/my-test-plugin/scala_2.10/sbt_0.13/0.1.0-SNAPSHOT/jars/my-test-plugin.jar
The first is a non-sbt-plugin called my-test-plugin, cross-built for Scala 2.10; the second is an sbt plugin called my-test-plugin, cross-built for Scala 2.10 and sbt 0.13.
This has an impact when trying to resolve my-test-plugin in my-sub-plugin:
[error] Modules were resolved with conflicting cross-version suffixes in {file:/Users/dnw/Desktop/sbt-tests/my-sub-plugin/}root:
[error] com.example:my-test-plugin <none>, _2.10
java.lang.RuntimeException: Conflicting cross-version suffixes in: com.example:my-test-plugin
at scala.sys.package$.error(package.scala:27)
at sbt.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:46)
at sbt.ConflictWarning$.apply(ConflictWarning.scala:32)
Try specifying different names for each module and it should work.
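A sketch of what that could look like in my-test-plugin's build.sbt (the concrete names are arbitrary; the point is that root and plugin must not share one):
lazy val commonSettings = Seq(
  organization := "com.example",
  version := "0.1.0-SNAPSHOT",
  scalaVersion := "2.10.5"
)

// name removed from commonSettings and set per project instead
lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(name := "my-test-plugin-root")

lazy val plugin = (project in file("plugin"))
  .settings(commonSettings: _*)
  .settings(
    name := "my-test-plugin",
    sbtPlugin := true
  )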

Resolving Dependencies for eclipselink by creating a fat jar with sbt and sbt-assembly?

I am trying to create a fat jar with sbt and the sbt-assembly plugin for my project with Scala and EclipseLink JPA, but the assembly command fails because the eclipse.inf file is found twice.
> assembly
[info] Including from cache: commonj.sdo-2.1.1.jar
[info] Including from cache: javax.persistence-2.1.0.jar
[info] Including from cache: scala-library.jar
[info] Including from cache: eclipselink-2.5.1.jar
[info] Run completed in 38 milliseconds.
[info] Checking every *.class/*.jar file's SHA-1.
[info] Merging files...
[warn] Merging 'org\eclipse\persistence\descriptors\copying' with strategy 'rename'
[warn] Merging 'META-INF\MANIFEST.MF' with strategy 'discard'
[trace] Stack trace suppressed: run last *:assembly for the full output.
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] C:\Users\u987\WebApps\gr\lib_managed\jars\org.eclipse.persistence\javax.persistence\javax.persistence-2.1.0.jar:META-INF/eclipse.inf
[error] C:\Users\u987\WebApps\gr\lib_managed\jars\org.eclipse.persistence\commonj.sdo\commonj.sdo-2.1.1.jar:META-INF/eclipse.inf
My build.sbt looks like:
import AssemblyKeys._
name := "TelegramReceiver"
version := "0.1"
scalaVersion := "2.10.3"
retrieveManaged in ThisBuild := true
libraryDependencies ++= Seq(
  "org.scalatest" % "scalatest_2.10" % "2.0" % "test",
  "org.eclipse.persistence" % "eclipselink" % "2.5.1"
)
I tried to solve the problem with the mergeStrategy setting from the sbt-assembly plugin, but it didn't work. I use sbt 0.13.1.
Thanks in advance for help!
You can add a custom merge strategy to your settings like this (note the duplicated entry is META-INF/eclipse.inf, so the pattern must match both path segments):
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", "eclipse.inf") => MergeStrategy.rename // use any of the available strategies, like `first`
    case x => old(x)
  }
}
See this doc for more details.
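On newer sbt-assembly versions the same idea is written with the assemblyMergeStrategy key instead of the deprecated <<= operator; a sketch following the pattern from the plugin's README:
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "eclipse.inf") => MergeStrategy.rename
  case x =>
    // fall back to the previously configured strategy for everything else
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
Since eclipse.inf is just an Eclipse build marker file, MergeStrategy.first or MergeStrategy.discard are arguably safe alternatives here.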