I am learning Gin with GinTutorial and I've gotten stuck on a silly problem.
In bind(MyWidgetMainPanel.class).in(Singleton.class); the Singleton class cannot be resolved.
I've already tried the following:
import com.google.gwt.inject.client.AbstractGinModule;
import com.google.gwt.inject.client.Singleton;
import com.google.gwt.inject.Singleton;
The IDE reports that Singleton does not exist, even before compiling.
If I try to use import com.google.inject.Singleton; the compiler reports that Singleton does not exist.
gin-2.1.2
gwt-2.6
I'd be glad to hear any ideas or explanations.
I'm not sure this is the solution, but when I use injection like this with GIN in GWT, I don't import com.google.gwt.inject.Singleton in my GinModule; I import com.google.inject.Singleton, and it works.
I hope it helps.
Finally, I found that it is not enough to add gin-2.1.2.jar to the classpath; it also requires guice-3.0.jar.
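For reference, here is a minimal sketch of how the module ends up looking with the working import; MyWidgetGinModule is just a placeholder name, and MyWidgetMainPanel is the class from the question:
import com.google.gwt.inject.client.AbstractGinModule;
import com.google.inject.Singleton; // the Guice Singleton, provided by guice-3.0.jar

public class MyWidgetGinModule extends AbstractGinModule {
    @Override
    protected void configure() {
        // Scope annotations such as Singleton come from Guice itself,
        // which is why gin-2.1.2.jar alone is not enough on the classpath.
        bind(MyWidgetMainPanel.class).in(Singleton.class);
    }
}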
I have a problem that I can't resolve; I've done a lot of research but found nothing.
I need to use the flutter_countdown_timer library, but it seems that CurrentRemainingTime isn't recognized. Has anyone already used this library?
The error is: 'CurrentRemainingTime' isn't a type.
It seems you haven't added the import statement for CurrentRemainingTime. There are two import statements: one for CountDownTimer and the other for CurrentRemainingTime.
Add this import statement:
import 'package:flutter_countdown_timer/current_remaining_time.dart';
I'm trying to use the DeBERTa model, and I'm first trying to run some of the code I found here (an example of how to use the model), just to make sure I have the right dependencies and know what imports I need. I'm working with Spark 2.4.0 (Scala 2.11) and Zeppelin 0.8.1. I've added "com.johnsnowlabs.nlp:spark-nlp-spark24_2.11:3.4.4" as a dependency, which I believe should give me spark-nlp 3.4.4. So far most of the imports I got from the sample code are working:
import com.johnsnowlabs.nlp.embeddings
import com.johnsnowlabs.nlp.annotator._
import com.johnsnowlabs.nlp.base._
import com.johnsnowlabs.nlp.training.CoNLL
import com.johnsnowlabs.nlp.util.io.ResourceHelper
import com.johnsnowlabs.util.Benchmark
import org.apache.spark.ml.Pipeline
import org.apache.spark.sql.functions.{col, explode, size}
Two of the imports do give errors, but I'm not sure I need them to load the model.
val embeddings = DeBertaEmbeddings.pretrained("deberta_v3_base", "en")
.setInputCols("sentence", "token")
.setOutputCol("embeddings")
.setMaxSentenceLength(512)
I get an error saying
java.lang.IllegalArgumentException: requirement failed: Can not find deberta_v3_base inside public/models to download. Please make sure the name and location are correct!
at scala.Predef$.require(Predef.scala:224)
at com.johnsnowlabs.nlp.pretrained.ResourceDownloader$.downloadResource(ResourceDownloader.scala:441)
at com.johnsnowlabs.nlp.pretrained.ResourceDownloader$.downloadModel(ResourceDownloader.scala:499)
at com.johnsnowlabs.nlp.pretrained.ResourceDownloader$.downloadModel(ResourceDownloader.scala:492)
at com.johnsnowlabs.nlp.HasPretrained$class.pretrained(HasPretrained.scala:44)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.com$johnsnowlabs$nlp$embeddings$ReadablePretrainedDeBertaModel$$super$pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.embeddings.ReadablePretrainedDeBertaModel$class.pretrained(DeBertaEmbeddings.scala:329)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.HasPretrained$class.pretrained(HasPretrained.scala:47)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.com$johnsnowlabs$nlp$embeddings$ReadablePretrainedDeBertaModel$$super$pretrained(annotator.scala:532)
at com.johnsnowlabs.nlp.embeddings.ReadablePretrainedDeBertaModel$class.pretrained(DeBertaEmbeddings.scala:326)
at com.johnsnowlabs.nlp.annotator.package$DeBertaEmbeddings$.pretrained(annotator.scala:532)
... 51 elided
I've tried to go through the documentation on the DeBERTa model, and from what I've seen I have adequate versions of Scala and spark-nlp according to the John Snow Labs Models Hub. I saw in this Stack Overflow thread that I might need to append "_noncontrib" to the model name, and I tried that even though it supposedly isn't needed for spark-nlp 2.4.0+. So now I'm fairly sure "deberta_v3_base" is the right name, but I'm not sure what the error means by the location being incorrect. Do I need to specify a location for the pretrained model via the config helper? Does anyone know what's going on here? I apologize if I've left anything out; I'm new to Zeppelin and spark-nlp. Thanks!
I'm mocking my service object as follows:
However, IntelliJ is not resolving the keyword when. I'm not sure what the cause is. I've added all the correct dependencies in my build definition. What am I doing wrong?
Missing import:
import org.mockito.Mockito._
Please import org.mockito.Mockito._ as suggested by @Mika'il. If you still can't resolve the issue, check whether you have an org.scalatest.GivenWhenThen import somewhere, or whether your test class (or base test class) extends it, and remove that. The org.scalatest.GivenWhenThen trait has a different implementation of the when method.
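To illustrate what that import provides, here is a minimal Java-flavoured sketch of Mockito's when stubbing (MyService and getGreeting are hypothetical placeholders; in a Scala test the equivalent wildcard import is import org.mockito.Mockito._):
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class MyServiceTest {

    // Hypothetical service interface standing in for the mocked service object.
    public interface MyService {
        String getGreeting();
    }

    @Test
    public void returnsStubbedValue() {
        MyService service = mock(MyService.class);
        // 'when' resolves only because of the static Mockito import above.
        when(service.getGreeting()).thenReturn("hello");
        assertEquals("hello", service.getGreeting());
    }
}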
I'm new to Flash development, so I'm watching a tutorial on how to use FlashDevelop. The video recommended I use Box2D and explained how to add it as a global classpath, which I have done.
I was messing around with the code using what the person in the video was showing, just trying to get an output. As I typed, FlashDevelop was adding in the import statements for me.
import Box2D.Collision.Shapes.b2CircleShape;
import Box2D.Common.Math.b2Vec2;
import Box2D.Dynamics.b2BodyDef;
import Box2D.Dynamics.b2FixtureDef;
import Box2D.Dynamics.b2World;
import Box2D.Dynamics.b2Body;
When I run the program though, it's returning this:
col: 31 Error: Definition Box2D.Collision.Shapes:b2CircleShape could not be found.
It's returning a variation of that for each import.
I've checked and the files are indeed there. I'm really not certain what this could be; it's possible I just missed a step.
Any ideas?
(Sorry if I formatted this question incorrectly, I'm new to this site.)
It may be because you are using an old version of Box2D.
I think these are your choices:
1) update to the newer version, or
2) use "b2CircleDef" instead.
See the source code at this link; the changes are commented:
http://www.emanueleferonato.com/2010/01/27/box2dflash-2-1a-released-what-changed/
Hope that was helpful!
Good luck
I have a class that uses the following imports; it works fine in a Google App Engine project:
import javax.jdo.annotations.IdGeneratorStrategy;
import javax.jdo.annotations.IdentityType;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;
import javax.jdo.annotations.PrimaryKey;
But when I include this class in another project, it causes an error:
package javax.jdo.annotations does not exist
What should I do to make javax.jdo.* available?
Add the JDO jar file to the class path.
The star notation for imports doesn't work the way you think it does.
It's not recursive: it only applies to the classes directly in javax.jdo, not to the child packages.
If you want all the classes in javax.jdo.annotations, you'll need to import javax.jdo.annotations.* too.
I'd recommend not using the star notation at all. It's better to write out the import for each class individually; use an IDE to help you. It makes it clearer to you, and to the programmers who come after you, where those classes came from.
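To illustrate the nested-package point, a minimal sketch (Widget and its fields are just placeholder names):
// import javax.jdo.*;  // would NOT pull in the annotations below; they live in a child package
import javax.jdo.annotations.IdGeneratorStrategy;
import javax.jdo.annotations.IdentityType;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;
import javax.jdo.annotations.PrimaryKey;

@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class Widget {

    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Long id;

    @Persistent
    private String name;
}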