NoSuchMethodError: de.tudarmstadt.ukp.dkpro.core.stanfordnlp.StanfordSegmenter.isWriteToken()Z - uima

Script
//------------------------------------------------------------------------
// DKPRO - Imports
//------------------------------------------------------------------------
IMPORT PACKAGE de.tudarmstadt.ukp.dkpro.core.api.lexmorph.type.pos FROM GeneratedDKProCoreTypes AS pos;
IMPORT de.tudarmstadt.ukp.dkpro.core.api.segmentation.type.Lemma FROM GeneratedDKProCoreTypes;
UIMAFIT de.tudarmstadt.ukp.dkpro.core.stanfordnlp.StanfordSegmenter;
UIMAFIT de.tudarmstadt.ukp.dkpro.core.treetagger.TreeTaggerPosLemmaTT4J;//TreeTaggerPosTagger
//------------------------------------------------------------------------
//------------------------------------------------------------------------
// DKPRO - Execution
//------------------------------------------------------------------------
Document{-CONTAINS(pos.POS)} -> {
Document{-> SETFEATURE("language", "en")}; //"de"
Document{-> EXEC(StanfordSegmenter)};
Document{-> EXEC(TreeTaggerPosLemmaTT4J, {pos.POS})};//(TreeTaggerPosTagger, {pos.POS})};
};
//------------------------------------------------------------------------
//------------------------------------------------------------------------
// DKPRO - Test
//------------------------------------------------------------------------
DECLARE DZC_DkProTest;
pos.NP{-> MARK(DZC_DkProTest)};
//------------------------------------------------------------------------
Error in short
org.apache.uima.analysis_engine.AnalysisEngineProcessException: Annotator processing failed.
Caused by: java.lang.NoSuchMethodError: de.tudarmstadt.ukp.dkpro.core.stanfordnlp.StanfordSegmenter.isWriteToken()Z

I changed the DKPro Core version in Maven from 1.5.0 to 1.7.0, and the issue was resolved. I was able to locate the isWriteToken() method in the de.tudarmstadt.ukp.dkpro.core.api.segmentation.SegmenterBase class.
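For reference, the fix amounts to changing the <version> element of the DKPro Core dependencies in the POM. A minimal sketch, assuming the segmenter comes in via the Stanford NLP module (the artifactId here is illustrative; check your own POM for the exact coordinates):

<dependency>
    <groupId>de.tudarmstadt.ukp.dkpro.core</groupId>
    <artifactId>de.tudarmstadt.ukp.dkpro.core.stanfordnlp-gpl</artifactId>
    <version>1.7.0</version>
</dependency>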

The method isWriteToken() that the pipeline expects on StanfordSegmenter is not found at runtime, which causes the error. A NoSuchMethodError at runtime usually means the class on the classpath comes from a different version than the one the code was compiled against, so check for mismatched jar versions.
Don't really know what else to tell you without more info.
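If it helps, you can confirm which jar the class is actually loaded from at runtime. A small diagnostic sketch (standard java.security API, nothing DKPro-specific):

// Diagnostic sketch: print the jar that StanfordSegmenter is loaded from,
// to check whether the runtime version matches the compile-time one.
System.out.println(
        de.tudarmstadt.ukp.dkpro.core.stanfordnlp.StanfordSegmenter.class
                .getProtectionDomain().getCodeSource().getLocation());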

Related

How to resolve "DesiredCapabilities cannot be resolved to a type"?

I am trying to run my first project with Appium in Eclipse and I am getting this error:
Exception in thread "main" java.lang.Error: Unresolved compilation problems:
    DesiredCapabilities cannot be resolved to a type
    DesiredCapabilities cannot be resolved to a type
    at base.main(base.java:17)
Error: Unable to initialize main class base
Caused by: java.lang.NoClassDefFoundError: io/appium/java_client/android/AndroidDriver
package Auto;

import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;

import org.openqa.selenium.remote.DesiredCapabilities;

import io.appium.java_client.android.AndroidDriver;
import io.appium.java_client.android.AndroidElement;
import io.appium.java_client.remote.MobileCapabilityType;

public class base2 {
    public static AndroidDriver<AndroidElement> dc() throws MalformedURLException {
        AndroidDriver<AndroidElement> driver;
        // The APK under test lives in the project's src folder.
        File f = new File("src");
        File fs = new File(f, "ApiDemos-debug.apk");
        // Capabilities telling Appium which device, automation engine, and app to use.
        DesiredCapabilities dc = new DesiredCapabilities();
        dc.setCapability(MobileCapabilityType.DEVICE_NAME, "DSEmulator");
        dc.setCapability(MobileCapabilityType.AUTOMATION_NAME, "uiautomator2");
        dc.setCapability(MobileCapabilityType.APP, fs.getAbsolutePath());
        // Connect to the local Appium server.
        driver = new AndroidDriver<>(new URL("http://127.0.0.1:4723/wd/hub"), dc);
        return driver;
    }
}
Can someone point out what I must add to resolve this error?
I tried running the same code on my machine and it worked perfectly fine, which means the problem is not with your code. Unresolved compilation problems generally occur when there is some problem with the jars referenced by your project. I would suggest removing all existing jars from the project build path, then adding them back carefully, ensuring that you have all the required jars, and then re-running your code.
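If the project is Maven-based, one way to keep the referenced jars consistent is to declare them as managed dependencies rather than loose jars on the build path. A sketch with an illustrative version (java-client 7.x still has the generic AndroidDriver<AndroidElement> used in the code above):

<dependency>
    <groupId>io.appium</groupId>
    <artifactId>java-client</artifactId>
    <version>7.3.0</version>
</dependency>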

Eclipse JUnit 5 SecurityException when running tests

I think I may be the only one experiencing this issue.
Today I updated my Eclipse install to version 2020-03 (4.15.0). I am also attempting to write a very simple JUnit 5 test for a new method I'm working on.
When I run my test, right now just a basic stub, I get the following error:
java.lang.SecurityException: class "org.junit.platform.commons.PreconditionViolationException"'s signer information does not match signer information of other classes in the same package
at java.base/java.lang.ClassLoader.checkCerts(ClassLoader.java:1150)
at java.base/java.lang.ClassLoader.preDefineClass(ClassLoader.java:905)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1014)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:151)
at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:821)
at java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:719)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:642)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:600)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at org.eclipse.jdt.internal.junit5.runner.JUnit5TestLoader.createUnfilteredTest(JUnit5TestLoader.java:75)
at org.eclipse.jdt.internal.junit5.runner.JUnit5TestLoader.createTest(JUnit5TestLoader.java:66)
at org.eclipse.jdt.internal.junit5.runner.JUnit5TestLoader.loadTests(JUnit5TestLoader.java:53)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:526)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:770)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:464)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:210)
I also see the following dialog.
My run configuration is:
I've tried all major junit-jupiter (aggregator) releases back to 5.5.0, all resulting in the same issue.
I've tried this solution; however, that question deals with a class-not-found issue. I also tried that same solution using junit-platform-commons version 1.6.1. No change.
However, when I run a Maven configuration with -Dtest=DeaFileListTest test, the tests run.
My test case is simple: I instantiate an object that has the method I want to test, and then run my test.
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.empty;
import static org.hamcrest.Matchers.not;

import java.io.IOException;
import java.util.List;

import javax.ws.rs.core.Response;

import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

import com.mfgweb.FileRepo;

class DeaFileListTest {

    private static FileRepo filerepo;
    private static Response response;

    @BeforeAll
    static void setUpBeforeClass() throws Exception {
        filerepo = new FileRepo();
        response = filerepo.getDeaFiles();
    }

    @AfterAll
    static void tearDownAfterClass() throws Exception {
        response = null;
        filerepo = null;
    }

    @Test
    public void deaFileListIsNotEmptyTest() throws IOException {
        @SuppressWarnings("unchecked")
        List<String> files = (List<String>) response.getEntity();
        assertThat(files, not(empty()));
    }
}
So I am curious why I'm receiving the SecurityException when I run the test in Eclipse, yet Maven seems to execute the tests fine.
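For what it's worth, this particular SecurityException typically appears when JUnit platform classes are loaded from two differently signed sources at once, for example the signed jars bundled with Eclipse's JUnit 5 runtime plus unsigned jars resolved by Maven. A common workaround is to keep a single source of JUnit classes, e.g. declaring only the aggregator in the POM (version illustrative) and not also adding Eclipse's built-in JUnit 5 library to the build path:

<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>5.6.1</version>
    <scope>test</scope>
</dependency>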

Error while loading a tagger model (probably missing model file)

I am trying to implement the code below:
import java.util.Properties;

import edu.stanford.nlp.coref.CorefCoreAnnotations;
import edu.stanford.nlp.coref.data.CorefChain;
import edu.stanford.nlp.coref.data.Mention;
import edu.stanford.nlp.ling.CoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import edu.stanford.nlp.util.CoreMap;

public class CorefResolver {
    public static void main(String[] args) throws Exception {
        Annotation document = new Annotation(
                "Barack Obama was born in Hawaii. He is the president. Obama was elected in 2008.");
        Properties props = new Properties();
        // The pos annotator is the one that fails below when the tagger model jar is missing.
        props.setProperty("annotators", "tokenize,ssplit,pos,lemma,ner,parse,mention,coref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        pipeline.annotate(document);
        System.out.println("---");
        System.out.println("coref chains");
        for (CorefChain cc : document.get(CorefCoreAnnotations.CorefChainAnnotation.class).values()) {
            System.out.println("\t" + cc);
        }
        for (CoreMap sentence : document.get(CoreAnnotations.SentencesAnnotation.class)) {
            System.out.println("---");
            System.out.println("mentions");
            for (Mention m : sentence.get(CorefCoreAnnotations.CorefMentionsAnnotation.class)) {
                System.out.println("\t" + m);
            }
        }
    }
}
This is code from Stanford CoreNLP. I am using Eclipse as the IDE; the image below shows the project view in Eclipse.
While running the code, I get the error below. I have tried including the models, taggers, etc., but it still shows the same error.
Adding annotator tokenize
No tokenizer type provided. Defaulting to PTBTokenizer.
Adding annotator ssplit
Adding annotator pos
Exception in thread "main" java.lang.RuntimeException: edu.stanford.nlp.io.RuntimeIOException: Error while loading a tagger model (probably missing model file)
at edu.stanford.nlp.pipeline.AnnotatorFactories$4.create(AnnotatorFactories.java:245)
at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:152)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:451)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:154)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:150)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:137)
at CorefResolver.main(CorefResolver.java:16)
Caused by: edu.stanford.nlp.io.RuntimeIOException: Error while loading a tagger model (probably missing model file)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:791)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:312)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:265)
at edu.stanford.nlp.pipeline.POSTaggerAnnotator.loadModel(POSTaggerAnnotator.java:85)
at edu.stanford.nlp.pipeline.POSTaggerAnnotator.<init>(POSTaggerAnnotator.java:73)
at edu.stanford.nlp.pipeline.AnnotatorImplementations.posTagger(AnnotatorImplementations.java:63)
at edu.stanford.nlp.pipeline.AnnotatorFactories$4.create(AnnotatorFactories.java:243)
... 6 more
Caused by: java.io.IOException: Unable to open "edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger" as class path, filename or URL
at edu.stanford.nlp.io.IOUtils.getInputStreamFromURLOrClasspathOrFileSystem(IOUtils.java:470)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:789)
... 12 more
Could anyone help me solve this issue?
Using the Kotlin Gradle DSL (build.gradle.kts), this worked for me:
val corenlp_version = "4.2.2"
implementation("edu.stanford.nlp:stanford-corenlp:$corenlp_version")
implementation("edu.stanford.nlp:stanford-corenlp:$corenlp_version:models")
implementation("edu.stanford.nlp:stanford-corenlp:$corenlp_version:models-english")
implementation("edu.stanford.nlp:stanford-corenlp:$corenlp_version:models-english-kbp")
If you don't want to use the version variable, you can substitute its value directly.
I included the Stanford CoreNLP 3.8 jar and the Stanford CoreNLP 3.8 models jar, and now the coreference resolution is working.
I found that adding the models classifier dependency worked:
<dependency>
    <groupId>edu.stanford.nlp</groupId>
    <artifactId>stanford-corenlp</artifactId>
    <version>4.2.2</version>
</dependency>
<dependency>
    <groupId>edu.stanford.nlp</groupId>
    <artifactId>stanford-corenlp</artifactId>
    <version>4.2.2</version>
    <classifier>models</classifier>
</dependency>
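Once the models dependency is in place, a quick way to verify that the exact resource from the stack trace is visible on the classpath is a diagnostic sketch like the one below, using the model path copied from the error above (the path moved in some CoreNLP releases, so use whatever path your error reports):

public class ModelCheck {
    public static void main(String[] args) {
        // Path copied from the "Unable to open ..." error above.
        String model = "edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger";
        System.out.println(ModelCheck.class.getClassLoader().getResource(model) != null
                ? "tagger model found on classpath"
                : "tagger model NOT on classpath");
    }
}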

de.hybris.eventtracking.model.events.AbstractTrackingEvent cannot be resolved to a type

I just finished configuring hybris and tried to set up the Eclipse project. As per the guidelines in wiki.hybris, I imported all the extensions into the Eclipse project. When I try to clean and build, I get more than 3000 compiler errors. One of the errors is that the class AbstractTrackingEvent cannot be resolved to a type. I looked for that particular class in the project folder, and I could not find the events folder under de.hybris.eventtracking.model, which is the cause of the issue.
Am I missing anything while importing the project? There are many issues of this type in my Eclipse project. Please let me know how to fix it. I have attached a screenshot for reference.
Note: I am using hybris-commerce-suite 5.7.0.8
As requested, I am adding the source code.
package de.hybris.eventtracking.services.populators;

import de.hybris.eventtracking.model.events.AbstractTrackingEvent;
import de.hybris.eventtracking.services.constants.TrackingEventJsonFields;
import de.hybris.platform.servicelayer.dto.converter.ConversionException;

import java.io.IOException;
import java.util.Map;

import org.apache.commons.lang.StringUtils;

import com.fasterxml.jackson.databind.ObjectMapper;

/**
 * @author stevo.slavic
 */
public abstract class AbstractTrackingEventGenericPopulator implements
        GenericPopulator<Map<String, Object>, AbstractTrackingEvent>
{
    private final ObjectMapper mapper;

    public AbstractTrackingEventGenericPopulator(final ObjectMapper mapper)
    {
        this.mapper = mapper;
    }

    public ObjectMapper getMapper()
    {
        return mapper;
    }

    protected Map<String, Object> getPageScopedCvar(final Map<String, Object> trackingEventData)
    {
        final String cvar = (String) trackingEventData.get(TrackingEventJsonFields.COMMON_CVAR_PAGE.getKey());
        Map<String, Object> customVariablesPageScoped = null;
        if (StringUtils.isNotBlank(cvar))
        {
            try
            {
                customVariablesPageScoped = getMapper().readValue(cvar, Map.class);
            }
            catch (final IOException e)
            {
                throw new ConversionException("Error extracting custom page scoped variables from: " + cvar, e);
            }
        }
        return customVariablesPageScoped;
    }
}
"As per guidelines in the wiki.hybris, I imported all the extensions into the eclipse project."
I don't think the guidelines tell you this. Basically, you want the projects loaded to be the same as those defined in your localextensions.xml and their dependencies. The reason you can't see those is they are not built.
Ensure you have run 'ant build' successfully, refresh the platform project, remove any extensions from your workspace that are not needed for your project, and clean and build in eclipse.
Make sure you have provided the project dependencies in each project by checking their individual extensioninfo.xml files, as shown in the image below.
Also, sometimes dependent libraries are not imported properly; check for those too.

How can I take advantage of the Mule lifecycle?

I am practicing with Mule. I read here and wanted to try this sample, so I created a project and a Java class in Mule Studio. After that, I copied this code:
package org.mule.module.twilio;

import org.mule.api.annotations.Configurable;
import org.mule.api.annotations.Module;
import org.mule.api.annotations.Processor;
import org.mule.api.annotations.lifecycle.Start;
import org.mule.api.annotations.param.Optional;
import org.mule.api.callback.HttpCallback;

@Module(name = "twilio")
public class TwilioConnector {

    /**
     * The account sid to be used to connect to Twilio.
     */
    @Configurable
    private String accountSid;

    /**
     * The authentication token to be used to connect to Twilio.
     */
    @Configurable
    private String authToken;

    private TwilioClient twilioClient;

    @Start
    public void createTwilioClient() {
        twilioClient = new TwilioClient(accountSid, authToken);
    }
}
but I get a lot of errors:
The import org.mule.api.annotations.Configurable cannot be resolved
The import org.mule.api.annotations.Module cannot be resolved
The import org.mule.api.annotations.Processor cannot be resolved
The import org.mule.api.annotations.lifecycle cannot be resolved
The import org.mule.api.annotations.param.Optional cannot be resolved
The import org.mule.api.callback cannot be resolved
None of the imported classes are recognized, and next to nearly every annotation I see: Configurable cannot be resolved to a type
Did you add the Mule DevKit annotations jar to your classpath?
Once you have built your cloud connector, you can add it to Studio following the instructions available here.