Databricks Scala cannot find Uber objects - scala

First, allow me to say that I am trying to learn Databricks but have years of Data Factory and ETL experience. I was given some code that uses AIS data to map logistics movements.
The code uses Uber libraries for the H3Core functionality. I saw a demo of this code on a coworker's laptop, so I know it CAN work.
I am having trouble finding the Uber objects. I assume this is a newbie thing; I imagine my problem is environmental.
I cannot post all the code, but here are the import lines that are throwing the error:
import com.uber.h3core.H3Core
import com.uber.h3core.util.GeoCoord
import com.uber.h3core.LengthUnit
Those lines produce the following errors on execution:
command-2206228078162026:1: error: object uber is not a member of package com
import com.uber.h3core.H3Core
^
command-2206228078162026:2: error: object uber is not a member of package com
import com.uber.h3core.util.GeoCoord
^
command-2206228078162026:3: error: object uber is not a member of package com
import com.uber.h3core.LengthUnit
^
Further down, I think it is failing to create values that reference the missing H3Core class:
command-2206228078162026:15: error: not found: value H3Core
val h3 = H3Core.newInstance()
^
command-2206228078162026:26: error: not found: value H3Core
val h3 = H3Core.newInstance()
^

I have also had similar issues, usually when the jar is a little bigger, wherein the notebook starts running before the jar is fully loaded. I have not been able to reproduce it consistently, though, so don't quote me on that.
Here are a couple of options we have tried. Please see whether they work for you.
Have the jar available on a DBFS path and then install it from DBFS as a dependent library at the job level. Please refer to the link - Dependent-Libraries. A quick smoke test for the attached library is sketched below.
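To sanity-check the installation once the library is attached, you can run a tiny H3 snippet in a Scala cell (the sample coordinates and resolution below are arbitrary illustrations, and using the com.uber:h3 Maven coordinate instead of a jar is an assumption):
import com.uber.h3core.H3Core
import com.uber.h3core.LengthUnit

// If the library is attached, these imports resolve and the calls succeed
val h3 = H3Core.newInstance()
val cell = h3.geoToH3Address(37.7749, -122.4194, 9) // lat, lng, resolution
println(cell)
println(h3.edgeLength(9, LengthUnit.km)) // average hex edge length at resolution 9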
We have seen that sometimes even this fails. If it is possible, you can check whether the library has been loaded, sleep, and then check again; one hedged way to do that in Scala is sketched below.
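A minimal sketch of that wait-and-retry idea, assuming the failure is simply that the class is not on the classpath yet (the retry count and delay are arbitrary choices):
// Poll for the class until the jar is attached, sleeping between attempts
def waitForClass(className: String, retries: Int = 12, delayMs: Long = 5000L): Boolean =
  (1 to retries).exists { _ =>
    try { Class.forName(className); true }
    catch { case _: ClassNotFoundException => Thread.sleep(delayMs); false }
  }

if (!waitForClass("com.uber.h3core.H3Core"))
  sys.error("H3 library still not loaded after waiting")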
Please validate what works....
Cheers...

Related

Why can't I import the DeBERTa model in Zeppelin in scala/sparknlp?

I'm trying to use the DeBERTa model, and I'm first trying to implement some of the code I found here (an example of how to use the model), just to make sure I have the right dependencies and know what imports I need. I'm working with Spark 2.4.0 (Scala 2.11) and Zeppelin 0.8.1. I've added "com.johnsnowlabs.nlp:spark-nlp-spark24_2.11:3.4.4" as a dependency, which I believe should give me spark-nlp 3.4.4. So far most of the imports I got from the sample code are working:
import com.johnsnowlabs.nlp.embeddings
import com.johnsnowlabs.nlp.annotator._
import com.johnsnowlabs.nlp.base._
import com.johnsnowlabs.nlp.training.CoNLL
import com.johnsnowlabs.nlp.util.io.ResourceHelper
import com.johnsnowlabs.util.Benchmark
import org.apache.spark.ml.Pipeline
import org.apache.spark.sql.functions.{col, explode, size}
Two of the imports produce errors, but I'm not sure I need those to load the model. However, when I try to load
val embeddings = DeBertaEmbeddings.pretrained("deberta_v3_base", "en")
.setInputCols("sentence", "token")
.setOutputCol("embeddings")
.setMaxSentenceLength(512)
I get an error saying
java.lang.IllegalArgumentException: requirement failed: Can not find deberta_v3_base inside public/models to download. Please make sure the name and location are correct!
at scala.Predef$.require(Predef.scala:224)
at com.johnsnowlabs.nlp.pretrained.ResourceDownloader$.downloadResource(ResourceDownloader.scala:441)
at com.johnsnowlabs.nlp.pretrained.ResourceDownloader$.downloadModel(ResourceDownloader.scala:499)
at com.johnsnowlabs.nlp.pretrained.ResourceDownloader$.downloadModel(ResourceDownloader.scala:492)
at com.johnsnowlabs.nlp.HasPretrained$class.pretrained(HasPretrained.scala:44)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.com$johnsnowlabs$nlp$embeddings$ReadablePretrainedDeBertaModel$$super$pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.embeddings.ReadablePretrainedDeBertaModel$class.pretrained(DeBertaEmbeddings.scala:329)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.HasPretrained$class.pretrained(HasPretrained.scala:47)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.com$johnsnowlabs$nlp$embeddings$ReadablePretrainedDeBertaModel$$super$pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.embeddings.ReadablePretrainedDeBertaModel$class.pretrained(DeBertaEmbeddings.scala:326)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.pretrained(annotator.scala:532)
... 51 elided
I've tried to go through the documentation on the DeBERTa model, and from what I've seen I have adequate versions of Scala and spark-nlp according to the John Snow Labs Models Hub. I saw on this Stack Overflow thread that I might need to put "_noncontrib" after my model name. I tried that, even though it says I wouldn't need it for spark-nlp 2.4.0+. So now I'm pretty sure "deberta_v3_base" is the right name, but I'm not sure what the error means by the location being incorrect. Do I need to specify a location for the pretrained model per the config helper? Does anyone know what's going on here? I apologize if I've left anything out; I'm new to Zeppelin and spark-nlp. Thanks!
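(A hedged workaround sketch, not a confirmed fix: if the public resolver keeps failing, models can also be downloaded manually from the John Snow Labs Models Hub and loaded from disk; the path below is a placeholder.)
import com.johnsnowlabs.nlp.embeddings.DeBertaEmbeddings

// Load a manually downloaded, unzipped model directory instead of
// resolving "deberta_v3_base" from public/models
val embeddings = DeBertaEmbeddings
  .load("/tmp/deberta_v3_base_en")
  .setInputCols("sentence", "token")
  .setOutputCol("embeddings")
  .setMaxSentenceLength(512)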

Nrwl build Library Error: File is not under rootDir

The error occurs when I use the library in another library.
Importing the library works fine in the app, but not within other libraries.
I am also not able to generate the build of a library.
All the libraries are publishable.
Error message and Nx v13 file structure within the lib folder: (screenshots omitted)
This is very difficult to debug remotely. It may be related to a circular dependency issue. Are you sure you don't import code from a library that imports code back from the same library?
A imports B
B imports A
If this is the case, you should handle it by creating a C library that is imported by both A and B, or by restructuring A and B so they do not depend on each other.
A code example would make it easier to help you.
From https://github.com/nrwl/nx/issues/10785#issuecomment-1158916416:
There seems to have been an issue with a migration that was scheduled for a version but the migration itself was released in another version, so that might have caused the migration to not run in some scenarios. That migration should have added the following in nx.json for anyone having their nx.json extending from nx/presets/core.json or nx/presets/npm.json:
{
  ...
  "pluginsConfig": {
    "@nrwl/js": {
      "analyzeSourceFiles": true
    }
  }
}
Could you please add the above snippet to your nx.json and try again? If after applying the change it doesn't pick it up immediately, run nx reset and then try again.
This didn't work for me though, so I opened nx issue #11583: library importing other library using wildcard path mapping fails with "is not under 'rootDir'"
I had this issue in one of our monorepos, and it was caused by the fact that one of our libraries' names wasn't valid. We had something like @organisation/test-utils/e2e, which we ended up renaming to @organisation/test-utils-e2e (take note of the / usage).

Indirectly exported class not visible

I'm having trouble using the Backendless plugin for Flutter.
I include
import 'package:backendless_sdk/backendless_sdk.dart';
(as per the instructions) and can then use e.g. Backendless.UserService. But if I try to generate a user to register, e.g.:
var user = new BackendlessUser();
user.setEmail("info@example.org");
user.setPassword("password");
Backendless.UserService.register(user);
I get an error Undefined class 'BackendlessUser' on the first line. This class is defined in src/modules/user_service.dart, which is exported by src/modules/modules.dart like this:
library modules;
export 'cache.dart';
...
export 'user_service.dart';
which in turn is imported by backendless_sdk.dart like this:
import 'package:backendless_sdk/src/modules/modules.dart';
I would have thought that it would get imported indirectly by the import of backendless_sdk.dart, but apparently not. When I import it explicitly (with the same import statement, but now in my own code and not just indirectly in backendless_sdk.dart), I get a warning Don't import implementation files from another package. But it's not an implementation file; it's exported as part of the public API (at least that's what I understand the export statement to mean).
The Dart tutorial for creating packages suggests to place the export statements directly under lib, not in lib/src, so I'm wondering whether this is an error in the structure of the plugin, or whether I'm doing something wrong.
I'd be grateful both for a solution to this particular problem and also for pointers to how I can better understand packages, libraries, imports and exports in dart; unfortunately I don't find the language specification particularly helpful in this regard.
(The error and the warning are the same whether I use flutter analyze or IntelliJ IDEA.)
The problem has been fixed in the 0.0.3 version of the plugin. Please update the backendless_sdk version in your pubspec.yaml.
You only need to include the one import now:
import 'package:backendless_sdk/backendless_sdk.dart';
Please also note that there are some changes in the syntax. So for your example you should use:
var user = new BackendlessUser()
..email = "info@example.org"
..password = "password";
Backendless.userService.register(user);
Thanks for using the Flutter SDK and pointing out this issue.
It is indeed a problem in the structure of the plugin. The Backendless team is aware of it, and it will be fixed in the next release of the plugin.
For now you can import the file explicitly and suppress the warning.

Referring to a non-existent class java.io.File

This question appears fundamentally the same as Linker error in ScalaJS: "Referring to non-existent class", but seems to differ in specifics. I don't specify a JVM version in build.sbt.
I'm trying to access a file in my Scala.js app using scala.io.Source or java.io.File. As soon as I attempt to use either, I get a link failure:
...
[info] Fast optimizing C:\Users\Tim\Documents\GitHub\binpack\target\scala-2.11\binpack-fastopt.js
[error] Referring to non-existent class java.io.File
[error] called from client.jsClient$.onFileChosen(japgolly.scalajs.react.SyntheticEvent)scala.Function0
...
Is there a method by which I can diagnose what's going on here? I'm assuming the problem is not specific to java.io.File, but don't know what the next step would be.
This is Scala 2.11.8, with sbt 0.13.7.
java.io.File is not portable to Scala.js because it's specifically tied to the JVM. See http://www.lihaoyi.com/hands-on-scala-js/#PortingJavaAPIs (just scroll to the very bottom) for a little more detail.
So, if you want to do filesystem IO in Scala.js, you'll need to use a solution specifically implemented for JavaScript, like io.js. Here's an example: http://bchazalet.github.io/2015/07/20/scalajs-electron-skeleton-part-2/
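For instance, when the generated JavaScript runs under Node.js, you can call Node's fs module through js.Dynamic instead of java.io.File (a minimal sketch under that assumption; the file name is just an example):
import scala.scalajs.js

// Dynamically require Node's "fs" module at runtime
val fs = js.Dynamic.global.require("fs")
val contents = fs.readFileSync("data.txt", "utf8").asInstanceOf[String]
println(contents)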

How to import GHC.Event?

I'm playing around with the most recent Haskell Platform (2013.2.0.0) on Windows, trying to create a simple but fast concurrent socket server. I'm trying to base my work on the examples here: http://www.haskell.org/haskellwiki/Simple_Servers
That sample, however, does not compile, complaining that it
Could not find module `System.Event'
After some googling I found out that all declarations from that module were moved to GHC.Event, which has a remark in its documentation about being 'GHC internal'. Indeed, when I try to import GHC.Event I get the same error
Could not find module `GHC.Event'
How am I supposed to import useful functions such as registerFd and friends? Is there anything I'm missing in my cabal file? I've got the following:
build-depends: base ==4.6.*, hslogger == 1.2.3, time == 1.4.0.*, old-locale, network == 2.4.1.2