Scala.JS generated module throws "Uncaught TypeError: Cannot read property 'Object' of undefined" - scala.js

I've compiled a simple Scala class to JavaScript using Scala.js, and I want to import it into JavaScript code, but importing it throws the following exception:
Uncaught TypeError: Cannot read property 'Object' of undefined
at scalajsenv.js:39
I've replaced the generated script with a simple script that defines the same class and exports it. That worked, so I am sure the problem is on the Scala.js side.
Scala class:
import scala.scalajs.js
import scala.scalajs.js.annotation.{JSExportTopLevel, ScalaJSDefined}

@JSExportTopLevel("SomeClass")
@ScalaJSDefined
class SomeClass(i: Int) extends js.Object
build.sbt:
enablePlugins(ScalaJSPlugin)
name := "scalajs-example"
version := "0.1"
scalaVersion := "2.12.8"
scalacOptions += "-P:scalajs:sjsDefinedByDefault"
scalaJSLinkerConfig ~= { _.withModuleKind(ModuleKind.ESModule) }
project/plugins.sbt:
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.27")
The JavaScript file that imports the generated code:
import { SomeClass } from "./target/scala-2.12/scalajs-example-fastopt.js"

Related

Databricks Scala UDF cannot load class when registering function

I have followed this guide and this question trying to implement a decryption function to use in a SQL view.
I have compiled the Scala code in the example to a jar file and uploaded it to the Databricks File System (DBFS):
import com.macasaet.fernet.{Key, StringValidator, Token}
import org.apache.hadoop.hive.ql.exec.UDF
import java.time.{Duration, Instant}

class Validator extends StringValidator {
  override def getTimeToLive(): java.time.temporal.TemporalAmount = {
    Duration.ofSeconds(Instant.MAX.getEpochSecond())
  }
}

class udfDecrypt extends UDF {
  def evaluate(inputVal: String, sparkKey: String): String = {
    if (inputVal != null && inputVal != "") {
      val keys: Key = new Key(sparkKey)
      val token = Token.fromString(inputVal)
      val validator = new Validator() {}
      val payload = token.validateAndDecrypt(keys, validator)
      payload
    } else inputVal
  }
}
I can declare the function as demonstrated:
%sql
CREATE OR REPLACE FUNCTION default.udfDecrypt AS 'com.nm.udf.udfDecrypt'
USING jar 'dbfs:/FileStore/jars/decryptUDF.jar';
But if I try to call it, an error is thrown:
%sql
SELECT default.udfDecrypt(field, '{key}') FROM default.encrypted_test;
Error in SQL statement: AnalysisException: Can not load class 'com.nm.udf.udfDecrypt' when registering the function 'default.udfDecrypt', please make sure it is on the classpath; line 1 pos 7
I have noticed that the function can be declared using any jar file path (even one that doesn't exist) and it will still return 'OK'.
I am using Databricks for Azure.
It seems like your UDF code is missing:
package com.nm.udf;
at the top.
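A minimal sketch of how the top of the file would then look, so that the class resolves as com.nm.udf.udfDecrypt (the package name here mirrors the CREATE FUNCTION statement; the rest of the file stays as in the question):
// The package clause must match the fully qualified name
// used in CREATE FUNCTION ('com.nm.udf.udfDecrypt').
package com.nm.udf

import com.macasaet.fernet.{Key, StringValidator, Token}
import org.apache.hadoop.hive.ql.exec.UDF
import java.time.{Duration, Instant}

class udfDecrypt extends UDF {
  // ... evaluate(...) exactly as in the question ...
}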
Update as of October 2022, because the accepted solution did not work for me.
First off, the given Scala code is incorrect; you need to add
import java.time.Duration
import java.time.Instant
to the top of the code.
Secondly, after packaging the .scala file into a jar (using sbt package, for example), when you create the function
CREATE OR REPLACE FUNCTION udfDecryptor AS 'udfDecrypt'
USING jar 'dbfs:/FileStore/jars/decryptUDF.jar';
the name in the AS clause must match the name of the class.
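For illustration, a minimal class that the statement above would register (a sketch with a stub body; note there is no package clause, so the bare class name is what the AS clause refers to):
import org.apache.hadoop.hive.ql.exec.UDF

// With no package clause, the class is referenced by its bare name,
// so AS 'udfDecrypt' above must match this class name exactly.
class udfDecrypt extends UDF {
  def evaluate(inputVal: String, sparkKey: String): String = inputVal // stub
}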
Pay attention to having a build.sbt file with the correct version and dependencies, for example:
ThisBuild / version := "0.2.0-properscala"
ThisBuild / scalaVersion := "2.12.14"

lazy val root = (project in file("."))
  .settings(
    name := "the_name_of_project_here"
  )

// https://mvnrepository.com/artifact/com.macasaet.fernet/fernet-java8
libraryDependencies += "com.macasaet.fernet" % "fernet-java8" % "1.5.0"
// https://mvnrepository.com/artifact/org.apache.hive/hive-exec
libraryDependencies += "org.apache.hive" % "hive-exec" % "3.1.2"
This is the only way I managed to solve it and get the code running successfully. Hope it helps.
There could be some hidden parts in the guidelines.
Here is a complete repo that works through the details:
https://github.com/dungruoc/databricks-fernet-udf

How to execute a basic JSON feeder using the Scala Jackson library in the IntelliJ IDEA editor

All I need is to execute basic Jackson library code using Scala in the IntelliJ IDEA editor.
I already have Scala installed:
C:\Users\tt>scala -version
Scala code runner version 2.12.7 -- Copyright 2002-2018, LAMP/EPFL and Lightbend, Inc.
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper
import scala.collection.mutable

// Read the whole file as one string: readValue expects the raw JSON,
// not an Iterator of lines.
val input = scala.io.Source.fromFile("data.json").mkString
val mapper = new ObjectMapper() with ScalaObjectMapper
mapper.registerModule(DefaultScalaModule)
val obj = mapper.readValue[Map[String, Any]](input)
val data_collection = mutable.HashMap.empty[Int, String]
data_collection.put(
  obj.get("id").fold(0)(_.toString.toInt),
  obj.get("text").fold("")(_.toString)
)
println(data_collection) // Map(1 -> Hello How are you)
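For these imports to resolve, the jackson-module-scala dependency must be on the project classpath. A minimal sbt sketch (the 2.9.8 version mirrors the jacksonVersion that appears in the build output below; adjust to your setup):
// build.sbt (sketch): jackson-module-scala provides both
// DefaultScalaModule and the ScalaObjectMapper mixin
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.8"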
I expect the IntelliJ editor to automatically suggest why ScalaObjectMapper and DefaultScalaModule show up as "Cannot resolve symbol" despite the correct imports.
I am getting errors like the ones below:
Error:(4, 1) expected class or object definition
name := "jackson-module-scala"
Error:(6, 1) expected class or object definition
organization := "com.fasterxml.jackson.module"
Error:(8, 1) expected class or object definition
scalaVersion := "2.12.8"
Error:(10, 1) expected class or object definition
crossScalaVersions := Seq("2.10.7", "2.11.12", "2.12.8", "2.13.0-M5")
Error:(12, 1) expected class or object definition
val scalaMajorVersion = SettingKey[Int]("scalaMajorVersion")
Error:(13, 1) expected class or object definition
scalaMajorVersion := {
Error:(20, 1) expected class or object definition
scalacOptions ++= Seq("-deprecation", "-unchecked", "-feature")
Error:(27, 6) classes cannot be lazy
lazy val java7Home =
Error:(33, 1) expected class or object definition
javacOptions ++= {
Error:(41, 1) expected class or object definition
scalacOptions ++= {
Error:(45, 1) expected class or object definition
unmanagedSourceDirectories in Compile += {
Error:(49, 1) expected class or object definition
val jacksonVersion = "2.9.8"
Error:(51, 1) expected class or object definition
libraryDependencies ++= Seq(
Error:(65, 1) expected class or object definition
resourceGenerators in Compile += Def.task {
Error:(73, 1) expected class or object definition
site.settings
Error:(75, 1) expected class or object definition
site.includeScaladoc()
Error:(77, 1) expected class or object definition
ghpages.settings
Error:(79, 1) expected class or object definition
git.remoteRepo := "git@github.com:FasterXML/jackson-module-scala.git"

Why am I getting object scalatest is not a member of package org

I am trying to follow the Scala tutorial and am getting the error object scalatest is not a member of package org. All of the other instances of this error I can find here and on the web have the problem that the test file isn't under src/test, but that's not the case for me. I repeat, for the sake of those who keep marking this as a duplicate, the test file is under src/test, so that is not the problem. My build.sbt file says:
name := "TestExercise"
version := "0.1"
scalaVersion := "2.12.6"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
src/main/scala/CubeCalculator says:
object CubeCalculator extends App {
  def cube(x: Int) = {
    x * x * x
  }
}
and src/test/scala/CubeCalculatorTest says:
import org.scalatest.FunSuite

class CubeCalculatorTest extends FunSuite {
  test("CubeCalculator.cube") {
    assert(CubeCalculator.cube(3) === 27)
  }
}
(Cut-and-pasted straight from the tutorial.)
So what am I doing wrong? Why can't my project access scalatest?
@terminally-chill gave the answer. I am using IntelliJ, and File | Invalidate Caches / Restart, followed by a rebuild, resolved the issue.

Scala ambiguous imports

Scala beginner here and I was trying out the example here:
https://raw.githubusercontent.com/sryza/aas/master/ch02-intro/src/main/scala/com/cloudera/datascience/intro/RunIntro.scala
val nasRDD = parsed.map(md => {
  md.scores.map(d => NAStatCounter(d))
})
The above gives me error:
<console>:51: error: reference to NAStatCounter is ambiguous;
it is imported twice in the same scope by
import $VAL180.NAStatCounter
and import INSTANCE.NAStatCounter
       md.scores.map(d => NAStatCounter(d))
                          ^
Can anyone please explain why this double import is happening? How can I avoid it?
I could not reproduce your problem. I put RunIntro.scala into a small sbt project and compiled it successfully with the following build.sbt (blank lines removed):
% cat build.sbt
name := "RunIntro"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= List("org.apache.spark" % "spark-core_2.11" % "1.6.1")
The imports are only part of the source of your problem. How are you compiling this source?

Why is my Scala class not visible to its matching test class?

I am starting out in Scala with SBT, making a Hello World program.
Here's my project layout:
I've made sure to download the very latest JDK and Scala, and configure my Project Settings. Here's my build.sbt:
name := "Coursera_Scala"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
Hello.scala itself compiles okay:
package demo

class Hello {
  def sayHelloTo(name: String) = s"Hello, $name!"
}
However, my accompanying HelloTest.scala does not. Here's the test:
package demo

import org.scalatest.FunSuite

class HelloTest extends FunSuite {
  test("testSayHello") {
    val result = new Hello.sayHelloTo("Scala")
    assert(result == "Hello, Scala!")
  }
}
Here's the error:
Error:(8, 22) not found: value Hello
val result = new Hello.sayHello("Scala")
^
In addition to the compile error, Intellij shows errors "Cannot resolve symbol" for the symbols Hello, assert, and ==. This leads me to believe that the build is set up incorrectly, but then wouldn't there be an error on the import?
The problem is this expression:
new Hello.sayHelloTo("Scala")
This creates a new instance of the class sayHelloTo defined on the value Hello. However, there is no value Hello, just the class Hello.
You want to write this:
(new Hello).sayHelloTo("Scala")
This creates a new instance of the class Hello and calls sayHelloTo on the instance.
Alternatively, you can write new Hello().sayHelloTo(...): the () creates a new instance, and the method is then called on it.
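Putting it together, a sketch of the corrected test file (the only change from the question is parenthesizing the constructor call):
package demo

import org.scalatest.FunSuite

class HelloTest extends FunSuite {
  test("testSayHello") {
    // Instantiate first, then call the method on the instance.
    val result = (new Hello).sayHelloTo("Scala")
    assert(result == "Hello, Scala!")
  }
}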