I'm trying to import angular 2 packages as external dependencies to a ScalaJS project.
For example, the core dependency is referenced as @angular/core, and is available on WebJars as "org.webjars.npm" % "angular__core" % "2.0.0-rc.6".
In Scala.js, however, when I use this declaration:
jsDependencies += "org.webjars.npm" % "angular__core" % ngVersion / "angular__core.js"
I receive the following error:
[error] (compile:resolvedJSDependencies) org.scalajs.core.tools.jsdep.JSLibResolveException: Some references to JS libraries could not be resolved:
[error] - Missing JS library: angular__core.js
[error] originating from: root:compile
I have determined that the problem is related to the "angular__core.js" part of the declaration; changing the "jquery.js" part of the known-good "org.webjars" % "jquery" % "1.10.2" / "jquery.js" declaration replicates the error.
How, then, should I name the Angular dependency? The documentation is not very specific, saying only that "[the name declaration] include[s] a file ... in the said WebJar when your project is run or tested."
I've tried the following names:
Angular.js
@angular/core.js
angular-core.js
angular.core.js
angular_core.js
angular__core.js
It is the name of the .js file found inside the JAR. Look at the contents of the WebJar to determine the proper file name to use.
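A minimal sketch, assuming the Angular core bundle sits at bundles/core.umd.js inside the WebJar; verify the real path first, e.g. by running jar -tf on the JAR in your Ivy cache:
// hypothetical file path: confirm that bundles/core.umd.js actually exists in the WebJar
jsDependencies += "org.webjars.npm" % "angular__core" % ngVersion / "bundles/core.umd.js"
The string after / must name a .js file found inside the JAR, not the artifact itself.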
I am fairly new to Scala and sbt, and I encountered an issue when attempting to compile my Scala project. I have one hypothesis as to why I am seeing the error below:
The path to the dependency on my local machine is configured wrongly. It is currently "/Users/jutay/.ivy2/localorg.apache.spark/....." and there should be a "/" after "local", which would make the path look like ".../.ivy2/local/org.apache.spark/...".
But I am unsure whether this is truly the case.
My research turned up online examples showing that others who encountered the same sbt.librarymanagement.ResolveException have similarly shaped paths, e.g. ".../.ivy2/localnet.cakesolutions/...", so it seems the missing "/" is not what is causing the problem.
But in the event that it is, where can I make a config change to add a "/" after the string "local"?
https://users.scala-lang.org/t/help-with-this-package-downloading-error/8257
sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core:3.0.1
If the path above is in fact correct and expected, what am I doing wrong or missing that causes sbt compile to fail with the error above? It seems unable to download this dependency:
Error downloading org.apache.spark:spark-corcle_2.12:3.3.0
Note: I am currently using IntelliJ to work on this Scala sbt project.
I am kind of stuck on this issue.
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-corcle_2.12:3.3.0
[error]   Not found
[error]   Not found
[error]   not found: /Users/jutay/.ivy2/localorg.apache.spark/spark-corcle_2.12/3.3.0/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/spark/spark-corcle_2.12/3.3.0/spark-corcle_2.12-3.3.0.pom
[error] Total time: 2 s, completed 28 Nov, 2022 5:27:51 PM
This is my build.sbt file:
name := "proj123"
version := "3.0"
scalaVersion := "2.12.15"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-corcle" % "3.3.0",
"org.apache.spark" %% "spark-sql" % "3.3.0",
"com.google.code.gson" % "gson" % "2.2.4",
"com.google.cloud.spark" %% "spark-bigquery-with-dependencies" % "0.26.0" % "provided",
"com.typesafe" % "config" % "1.3.2",
"com.jcraft" % "jsch" % "0.1.55",
"org.springframework" % "spring-web" % "5.3.18",
"org.apache.httpcomponents" % "httpcore" % "4.4.15",
"org.apache.httpcomponents" % "httpclient" % "4.5.13",
"com.google.code.gson" % "gson" % "2.8.9"
)
I have tried looking up possible solutions online related to sbt.librarymanagement.ResolveException, but I did not find any that helped.
I tried checking my .bash_profile and my sbtconfig.txt for any config options that would let me edit the path (by adding a "/") as a possible way to resolve the issue, but there seem to be no such options.
This is my .bash_profile :
# Configuration for node to trust the xxxx Proxy Certificates
export NODE_EXTRA_CA_CERTS='/usr/local/etc/openssl/certs/xxxx_proxy_cacerts.pem'
# Configuration to load nvm - node version manager
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
# >>> coursier install directory >>>
export SPARK_HOME=/Users/jutay/Documents/spark-3.3.0-bin-hadoop3
export JAVA_HOME=/Library/Java/JavaVirtualMachines/zulu-8.jdk/Contents/Home
export SBT_HOME=/Users/jutay/Downloads/sbt
export PATH=$SPARK_HOME/bin:/$JAVA_HOME/bin:/$SBT_HOME/bin:$PATH
# <<< coursier install directory <<<
and this is the sbtconfig.txt inside the sbt folder that my .bash_profile references:
# sbt configuration file for Windows
# Set the java args
#-mem 1024 was added in sbt.bat as default
#-Xms1024m
#-Xmx1024m
#-Xss4M
#-XX:ReservedCodeCacheSize=128m
# Set the extra sbt options
# -Dsbt.log.format=true
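For what it's worth, neither .bash_profile nor sbtconfig.txt controls that path; the resolver chain sbt actually uses, including the local Ivy resolver's pattern, can be inspected from the sbt shell started in the project directory:
show fullResolvers
On disk, the local Ivy layout does contain the slash (~/.ivy2/local/<organisation>/<module>/...), which suggests the "localorg.apache.spark" form is just how the error message concatenates the repository name and the path, not a misconfiguration.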
I'm trying to deploy a simple web application built with the Play framework and Scala. The application works fine when I run it with the sbt run command; however, when I tried to deploy it on my local server using the sbt dist command, I got the following message:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.google.inject.internal.cglib.core.$ReflectUtils$1 (file:/D:/Scala/SomeDomain/SomeProject/target/universal/SomeProject-1.0-SNAPSHOT/lib/com.google.inject.guice-4.2.2.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Oops, cannot start the server.
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error injecting constructor, java.io.IOException: Dictionary directory does not exist: D:\Scala\SomeDomain\SomeProject\target\universal\SomeProject-1.0-SNAPSHOT\bin\dict
at initializer.ServiceInitializer.<init>(ServiceInitializer.scala:11)
at initializer.ApplicationInitializer.configure(ApplicationInitializer.scala:12) (via modules: com.google.inject.util.Modules$OverrideModule -> initializer.ApplicationInitializer)
while locating initializer.ServiceInitializer
I'm using Windows; I extracted the generated .zip file and executed the .bat file from the /bin directory.
build.sbt
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala, LauncherJarPlugin)
scalaVersion := "2.12.10"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "5.0.0" % Test
libraryDependencies += "postgresql" % "postgresql" % "9.1-901-1.jdbc4"
plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.8.0")
addSbtPlugin("org.foundweekends.giter8" % "sbt-giter8-scaffold" % "0.11.0")
Seems like some component of your application expects a directory to exist:
java.io.IOException: Dictionary directory does not exist: D:\Scala\SomeDomain\SomeProject\target\universal\SomeProject-1.0-SNAPSHOT\bin\dict
Make sure that directory gets created, or add the missing files to your dist.
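One way to get the directory into the dist is through sbt-native-packager's Universal mappings (Play's dist task builds on that plugin). A sketch, assuming the dictionary files live in a dict/ folder at the project root; the bin/ prefix matches the path the exception points at:
import NativePackagerHelper._
// map the project-root dict/ folder into the package under bin/dict,
// where the ServiceInitializer apparently looks for it
mappings in Universal ++= directory(baseDirectory.value / "dict").map {
  case (file, path) => file -> s"bin/$path"
}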
I am using IntelliJ IDEA Ultimate 2017.3. The steps that I undertook were:
Create a new project from Lightbend templates
Check the "import sbt sources" option (I tried without it as well)
Try the suggested solutions from this thread
As a result, nothing seems to be resolved in my build.sbt, regardless of whether I try it before or after the suggested fixes (not even the Dependencies object in the /project folder).
Here are the contents of the Dependencies object:
import sbt._
object Dependencies {
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.5"
}
I attach a screenshot below with the errors and the project structure. Note that in External Libraries the scalatest version is different from scalaVersion, but the former is correctly imported in the Dependencies object.
The errors that appear are:
for Dependencies: Cannot resolve symbol
for settings: Cannot resolve reference settings with such signature; Cannot resolve symbol settings
for List: Type mismatch: expected: Def.SettingsDefinition, actual: Seq[Def.Setting[_]]
for name and libraryDependencies: too many arguments for method settings
sbt.version is 1.1.1
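For reference, an object defined under /project (like Dependencies) is visible from build.sbt without any import, and the settings/List errors above are often IntelliJ presentation-compiler highlights rather than real sbt errors. A minimal sketch that typically imports cleanly; all names besides Dependencies.scalaTest are placeholders:
lazy val commonSettings = Seq(
  organization := "com.example",
  scalaVersion := "2.12.4"
)

lazy val root = (project in file("."))
  // `: _*` expands the Seq into varargs, a form both sbt and IntelliJ's inspection accept
  .settings(commonSettings: _*)
  .settings(
    name := "example",
    // Dependencies is picked up from project/Dependencies.scala
    libraryDependencies += Dependencies.scalaTest % Test
  )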
I am reading this article:
https://playframework.github.io/play-soap/SbtWsdl.html
Based on it, I wrote the following build.sbt file:
name := "PlaySOAPClient"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"com.typesafe.play" % "play-soap-client_2.11" % "1.1.3"
)
WsdlKeys.packageName := Some("com.foo")
WsdlKeys.wsdlTasks in Compile := Seq(
WsdlKeys.WsdlTask((sourceDirectory.value / "main" / "wsdl" / "foo.wsdl").toURI.toURL)
)
and plugins.sbt
resolvers += Resolver.url("play-sbt-plugins", url("https://dl.bintray.com/playframework/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.typesafe.sbt" % "sbt-play-soap" % "1.1.3")
When I run sbt compile, the plugin does generate some code, but that code does not compile:
Error:scalac: missing or invalid dependency detected while loading class file 'PlaySoapClient.class'.
Could not access type Configuration in value play.api,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'PlaySoapClient.class' was compiled against an incompatible version of play.api.
Warning:scalac: Class javax.inject.Singleton not found - continuing with a stub.
/Users/User/IdeaProjects/PlaySOAPClient/target/scala-2.11/wsdl/main/sources/com/foo/webservices/FooWebService.scala
Error:(13, 8) object inject is not a member of package javax
@javax.inject.Singleton
Error:(14, 107) object api is not a member of package play
class FooWebService @javax.inject.Inject() (apacheCxfBus: play.soap.ApacheCxfBus, configuration: play.api.Configuration) extends play.soap.PlaySoapClient(apacheCxfBus, configuration) {
Error:(14, 32) object inject is not a member of package javax
class FooWebService @javax.inject.Inject() (apacheCxfBus: play.soap.ApacheCxfBus, configuration: play.api.Configuration) extends play.soap.PlaySoapClient(apacheCxfBus, configuration) {
Does anyone have an idea of what dependencies are missing? Note that this is a client-side application that only uses this library to make a SOAP call. I don't want any server-side dependencies of the Play framework.
My hope is that I will be able to use play-soap as a standalone library in my console application to make SOAP calls.
Add the dependency to build.sbt:
libraryDependencies += "com.typesafe.play" %% "play" % "2.6.7" intransitive()
Then sbt compile should work (after sbt update). The intransitive() qualifier brings in only the play artifact itself, without its transitive dependencies, so the rest of the framework stays out of your client application.
Honestly, including the entire Play framework in the application purely for the sake of a WSDL client seems like too much. All you need is to generate annotated Java beans and depend on just those, and you can actually do that with common tools, i.e. by wrapping Java's wsimport in sbt tasks.
Consider the following bootstrap template for this: https://github.com/sainnr/sbt-scala-wsdl-template. It generates all the boilerplate on the fly, compiles it before the main sbt project, and eliminates the need to commit that boilerplate Java code to your pristine Scala repo. If you look closely, it doesn't even require a complete application server; just throw in some Java-EE-ish libs like rt.jar or its alternatives. Hope that helps someone.
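As a rough illustration of that wrapping (untested; the package name com.foo.generated, the WSDL location, and the wsimport flags are assumptions to adapt):
// build.sbt: generate client stubs from the WSDL before compilation.
// Assumes the JDK's wsimport binary is on the PATH.
sourceGenerators in Compile += Def.task {
  val wsdl = (sourceDirectory in Compile).value / "wsdl" / "foo.wsdl"
  val out  = (sourceManaged in Compile).value / "wsimport"
  out.mkdirs()
  // -Xnocompile keeps the generated .java sources so sbt compiles them itself
  val exit = scala.sys.process.Process(
    Seq("wsimport", "-quiet", "-Xnocompile",
        "-p", "com.foo.generated",
        "-s", out.getAbsolutePath,
        wsdl.getAbsolutePath)
  ).!
  require(exit == 0, s"wsimport failed with exit code $exit")
  (out ** "*.java").get
}.taskValue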
I'm currently facing a problem deploying an uber JAR for a Spark Streaming application: the same library is present in two different versions, which causes Spark to throw run-time exceptions. The library in question is Typesafe Config.
After attempting many things, my solution was to shade the provided dependency so it won't clash with the JAR provided by Spark at run time.
Hence, I went to the sbt-assembly documentation and, under Shading, saw the following example:
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("org.apache.commons.io.**" -> "shadeio.#1")
.inLibrary("commons-io" % "commons-io" % "2.4", ...).inProject
)
Attempting to shade com.typesafe.config, I tried applying the following solution in my build.sbt:
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("com.typesafe.config.**" -> "shadeio.#1").inProject
)
I assumed it was supposed to rename any reference to Typesafe Config in my project. But this doesn't work: it matches multiple classes in my project and causes them to be removed from the uber JAR. I see this when trying to run sbt assembly:
Fully-qualified classname does not match jar entry:
jar entry: ***/Identifier.class
class name: **/Identifier.class
Omitting ***/OtherIdentifier.class.
Fully-qualified classname does not match jar entry:
jar entry: ***\SparkBaseJobRunner$$anonfun$1.class
class name: ***/SparkBaseJobRunner$$anonfun$1.class
I also attempted using:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.typesafe.config.**" -> "shadeio.#1")
    .inLibrary("com.typesafe" % "config" % "1.3.0")
)
This did finish the assembly of the uber JAR, but it didn't have the desired run-time effect.
I'm not sure I fully comprehend the effect shading has on my build process with sbt.
How can I shade over references to com.typesafe.config in my project so when I invoke the library at run-time Spark will load my shaded library and avoid the clash caused by versioning?
I'm running sbt-assembly v0.14.1
Turns out this was a bug in sbt-assembly where shading was completely broken on Windows. It caused source files to be removed from the uber JAR and tests to fail, since the affected classes were unavailable.
I created a pull request to fix this. Starting with version 0.14.3 of sbt-assembly, the shading feature works properly. All you need to do is update to the relevant version in plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
In order to shade a specific JAR in your project, you do the following:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.typesafe.config.**" -> "my_conf.#1")
    // rewrite the classes inside the config JAR itself
    .inLibrary("com.typesafe" % "config" % "1.3.0")
    // and rewrite references to those classes in this project's own class files
    .inProject
)
This renames the com.typesafe.config package so that it is relocated under my_conf. You can then inspect the result using jar -tf on your assembly (irrelevant parts omitted for brevity):
***> jar -tf myassembly.jar
my_conf/
my_conf/impl/
my_conf/parser/
Edit
I wrote a blog post describing the issue and the process that led to the fix, for anyone interested in a more in-depth explanation.