Could not instantiate the executor. Make sure a planner module is on the classpath - scala

For a Scala project I use the StreamTableEnvironment, and when running my code in IntelliJ everything works fine. However, when I export my project to a jar (I create a fat jar using sbt assembly), no suitable table factory can be found, even though I've looked inside the jar and the classes it needs are included. Here is the complete stack trace:
Exception in thread "main" org.apache.flink.table.api.TableException: Could not instantiate the executor. Make sure a planner module is on the classpath
at org.apache.flink.table.api.scala.internal.StreamTableEnvironmentImpl$.lookupExecutor(StreamTableEnvironmentImpl.scala:328)
at org.apache.flink.table.api.scala.internal.StreamTableEnvironmentImpl$.create(StreamTableEnvironmentImpl.scala:284)
at org.apache.flink.table.api.scala.StreamTableEnvironment$.create(StreamTableEnvironment.scala:366)
at org.tudelft.plugins.SQLService$.setupEnv(SQLService.scala:40)
at org.tudelft.plugins.SQLStage.main(SQLStage.scala:19)
at org.codefeedr.stages.OutputStage.transform(OutputStage.scala:45)
at org.codefeedr.pipeline.Pipeline.$anonfun$startMock$1(Pipeline.scala:240)
at org.codefeedr.pipeline.Pipeline.$anonfun$startMock$1$adapted(Pipeline.scala:238)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
at scala.collection.IterableLike.foreach(IterableLike.scala:74)
at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
at org.codefeedr.pipeline.Pipeline.startMock(Pipeline.scala:238)
at org.tudelft.Main$.main(Main.scala:34)
at org.tudelft.Main.main(Main.scala)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.delegation.ExecutorFactory' in
the classpath.
Reason: No factory implements 'org.apache.flink.table.delegation.ExecutorFactory'.
The following properties are requested:
class-name=org.apache.flink.table.executor.StreamExecutorFactory
streaming-mode=true
The following factories have been considered:
at org.apache.flink.table.factories.TableFactoryService.filterByFactoryClass(TableFactoryService.java:243)
at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:186)
at org.apache.flink.table.factories.TableFactoryService.findAllInternal(TableFactoryService.java:172)
at org.apache.flink.table.factories.TableFactoryService.findAll(TableFactoryService.java:126)
at org.apache.flink.table.factories.ComponentFactoryService.find(ComponentFactoryService.java:48)
at org.apache.flink.table.api.scala.internal.StreamTableEnvironmentImpl$.lookupExecutor(StreamTableEnvironmentImpl.scala:312)
... 16 more
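In my experience, a frequent cause of this exact symptom with sbt assembly is not a missing class but a missing service registration: Flink looks up the planner's ExecutorFactory through Java's ServiceLoader, i.e. through files under META-INF/services, and fat-jar builds can end up keeping only one of several same-named service files, silently dropping the planner's entry. A minimal build.sbt sketch, assuming sbt-assembly is used and treating the strategy choices as something to adapt (the Maven analogue would be the shade plugin's ServicesResourceTransformer):
// build.sbt sketch: concatenate ServiceLoader registration files instead of
// keeping just one, so the planner's factory entry survives assembly
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", "services", _*) => MergeStrategy.concat
  case PathList("META-INF", _*)             => MergeStrategy.discard
  case _                                    => MergeStrategy.first
}
After rebuilding, you can check that META-INF/services/org.apache.flink.table.factories.TableFactory inside the jar still lists the planner's factories.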

Similar to, but slightly different from, what the OP needed: I had unit tests using the Table API that were failing with the same error message, even though the same pipeline worked fine when submitted to a real Flink cluster.
org.apache.flink.table.api.TableException: Could not instantiate the executor. Make sure a planner module is on the classpath
The solution was to add:
<dependency>
    <!-- this is needed to use the Table API from unit tests -->
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <scope>test</scope>
</dependency>
I believe flink-table-planner-blink is no longer available in recent versions (I'm using 1.15); the Blink planner became the default and is now published as flink-table-planner.
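For an sbt build like the OP's, a rough equivalent of that test-scoped dependency (a sketch; substitute your own Flink version for the placeholder shown):
// build.sbt sketch, assuming Flink 1.15+ where the Blink planner
// is published as flink-table-planner; % Test limits it to unit tests
libraryDependencies += "org.apache.flink" %% "flink-table-planner" % "1.15.0" % Test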

Probably a bit late to answer, but I had the same issue, and the solution was to add the Blink planner dependency (original answer here: https://issues.apache.org/jira/browse/FLINK-14031).

This worked for me:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-blink_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
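If you build with sbt instead of Maven, a hedged equivalent for the older releases where the Blink planner is a separate artifact (the version shown is only an example):
// build.sbt sketch: %% appends the Scala binary suffix (_2.11 / _2.12)
// automatically, matching the _2.11 in the Maven snippet above
libraryDependencies += "org.apache.flink" %% "flink-table-planner-blink" % "1.11.2"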

Related

Flink: Adding flink-sql-connector-kafka to fat-jar

I use Flink SQL (version 1.11) and would like to process data from Kafka. For this I created a job from the Scala template and added the dependencies to pom.xml:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
When I try to run the job on the cluster, I get the following error:
Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'kafka' that implements 'org.apache.flink.table.factories.DynamicTableSinkFactory' in the classpath.
Available factory identifiers are:
blackhole
print
If I add the flink-sql-connector-kafka jar to the /lib folder it works, but then I can't use the SQL client: the client loads the connector once from its own lib folder, while the same connector is already loaded in the cluster. That leads to the following error:
java.lang.ClassCastException: cannot assign instance of org.apache.commons.collections.map.LinkedMap to field org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.pendingOffsetsToCommit of type org.apache.commons.collections.map.LinkedMap in instance of org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
How can I add flink-sql-connector-kafka to the fat jar? Or should these SQL connectors rather be added to the /lib folder?

Junit 5 tests don't launch in Eclipse due to NoClassDefFoundError: TestEngine

(There was no question for this problem on Stack Overflow, so I decided to share the question & solution here.)
I wanted to migrate my Spring Boot project from JUnit 4 to JUnit 5, so I added the new API as a dependency and removed the old ones:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-api</artifactId>
</dependency>
I wanted to do a full migration of my project, so I explicitly excluded everything from JUnit 4, and also did not include the junit-vintage-engine (as recommended in the official migration guide).
Then, I made the various changes to get the tests to compile, like replacing @Before with @BeforeEach and organizing imports. All in all, this was pretty straightforward.
But running the tests in Eclipse caused some trouble:
First I got the message "Cannot find 'junit.framework.TestCase' on project build path. JUnit 3 tests can only be run if JUnit is on the build path." In the launch configuration dialog that popped up automatically, I was able to spot my mistake: I needed to select the test runner 'JUnit 5'.
Still no success. Now I got the message "No tests found with test runner 'JUnit 5'." This was confusing, because I had used the correct JUnit 5 annotations on the test methods. Eclipse even confirmed this, because the launch configuration's 'Test method' search listed the test methods just fine.
After a while, I figured out that the test execution had printed a stack trace to the console:
java.lang.NoClassDefFoundError: org/junit/platform/engine/TestEngine
at org.junit.platform.launcher.core.ServiceLoaderTestEngineRegistry.loadTestEngines(ServiceLoaderTestEngineRegistry.java:35)
at org.junit.platform.launcher.core.LauncherFactory.create(LauncherFactory.java:87)
at org.junit.platform.launcher.core.LauncherFactory.create(LauncherFactory.java:67)
at org.eclipse.jdt.internal.junit5.runner.JUnit5TestLoader.<init>(JUnit5TestLoader.java:34)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:456)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.createRawTestLoader(RemoteTestRunner.java:370)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.createLoader(RemoteTestRunner.java:365)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.defaultInit(RemoteTestRunner.java:309)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.init(RemoteTestRunner.java:224)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:208)
Caused by: java.lang.ClassNotFoundException: org.junit.platform.engine.TestEngine
at java.net.URLClassLoader.findClass(URLClassLoader.java:444)
at java.lang.ClassLoader.loadClass(ClassLoader.java:486)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:378)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
... 14 more
This pointed me to the root cause of the problem (see answer below)...
The problem was that I was missing the dependency on the engine library:
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-engine</artifactId>
</dependency>
Then the tests ran like a charm in Eclipse.
Migration blogs like https://dev.to/martinbelev/how-to-enable-junit-5-in-new-spring-boot-project-29a8 do mention this dependency, but I primarily worked with the JUnit 5 documentation, and somehow I must have overlooked this important piece of information there.
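As background on why the engine jar (and not just the API jar) is needed: the ServiceLoaderTestEngineRegistry frame in the stack trace shows that the JUnit Platform launcher discovers test engines via Java's ServiceLoader, and junit-jupiter-engine is the jar that ships the corresponding META-INF/services registration. A small illustrative sketch of that lookup, assuming Scala 2.13 and the JUnit Platform jars on the classpath:
import java.util.ServiceLoader
import scala.jdk.CollectionConverters._
import org.junit.platform.engine.TestEngine

object EngineCheck extends App {
  // With only junit-jupiter-api on the classpath this list is empty;
  // adding junit-jupiter-engine makes "junit-jupiter" appear
  val engines = ServiceLoader.load(classOf[TestEngine]).asScala.toList
  engines.foreach(e => println(e.getId))
}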

Migration to Jakarta: ClassNotFoundException: com.sun.xml.internal.ws.spi.ProviderImpl

While migrating from Java 8 to Java 11 and switching from EE to the newest Jakarta libraries according to https://wiki.eclipse.org/New_Maven_Coordinates and Maven Central, we get the following runtime exception in our (still SOAP-based) client application:
Exception in thread "main" javax.xml.ws.WebServiceException: Provider com.sun.xml.internal.ws.spi.ProviderImpl not found
at javax.xml.ws.spi.FactoryFinder$1.createException(FactoryFinder.java:31)
at javax.xml.ws.spi.FactoryFinder$1.createException(FactoryFinder.java:28)
at javax.xml.ws.spi.ServiceLoaderUtil.newInstance(ServiceLoaderUtil.java:73)
at javax.xml.ws.spi.FactoryFinder.find(FactoryFinder.java:82)
at javax.xml.ws.spi.Provider.provider(Provider.java:66)
at javax.xml.ws.Service.<init>(Service.java:82)
at [...]
Caused by: java.lang.ClassNotFoundException: com.sun.xml.internal.ws.spi.ProviderImpl
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at javax.xml.ws.spi.ServiceLoaderUtil.nullSafeLoadClass(ServiceLoaderUtil.java:60)
at javax.xml.ws.spi.ServiceLoaderUtil.safeLoadClass(ServiceLoaderUtil.java:93)
at javax.xml.ws.spi.ServiceLoaderUtil.newInstance(ServiceLoaderUtil.java:71)
... 5 more
The solution described in "Getting java.lang.ClassNotFoundException: com.sun.xml.internal.ws.spi.ProviderImpl despite the dependencies are defined" doesn't work and doesn't use Jakarta.
If I'm not wrong, the Jakarta libraries shouldn't contain or reference "com.sun.xml.*" packages, but javax.xml.ws.spi.Provider obviously STILL DOES reference such a class:
private static final String DEFAULT_JAXWSPROVIDER =
        "com.sun" + ".xml.internal.ws.spi.ProviderImpl";
So, does anyone know if there is a Jakarta equivalent to the missing library containing ProviderImpl, or how I could work around the problem with Jakarta?
Thanks in advance!
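Worth noting: per the JAX-WS spec, javax.xml.ws.spi.Provider.provider() tries a ServiceLoader lookup and the javax.xml.ws.spi.Provider system property before falling back to that hard-coded internal default. So with a standalone runtime on the classpath you can, as a sketch, pin the external RI's provider explicitly (the class name below is the jaxws-rt provider, an assumption to verify against your version):
// sketch: bypass the com.sun.xml.internal.* fallback by naming the
// external reference implementation's provider via the spec's property
System.setProperty("javax.xml.ws.spi.Provider", "com.sun.xml.ws.spi.ProviderImpl")
val provider = javax.xml.ws.spi.Provider.provider()
println(provider.getClass.getName) // expect com.sun.xml.ws.spi.ProviderImpl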
I finally found a workaround for my problem. According to the answer given in "How to use WebServices on Java 11? package javax.jws does not exist", the reference implementation of JAX-WS needs to be included explicitly on Java 11:
<dependency>
    <groupId>com.sun.xml.ws</groupId>
    <artifactId>jaxws-ri</artifactId>
    <version>2.3.2</version>
    <type>pom</type>
</dependency>
Unfortunately, compiling our project with this dependency using the latest maven-compiler-plugin 3.8.0 causes the exception described in https://jira.apache.org/jira/browse/MCOMPILER-355. It should be fixed in 3.8.1, but that version is not available yet.
As a workaround, I got our project working with the hint given in "Getting java.lang.ClassNotFoundException: com.sun.xml.internal.ws.spi.ProviderImpl despite the dependencies are defined", combined with an additional dependency (namely resolver, which is also linked in the pom.xml of jaxws-ri) to avoid a subsequent java.lang.ClassNotFoundException: com.sun.org.apache.xml.internal.resolver.CatalogManager:
<dependency>
    <groupId>com.sun.xml.ws</groupId>
    <artifactId>rt</artifactId>
    <version>2.3.2</version>
</dependency>
<dependency>
    <groupId>com.sun.org.apache.xml.internal</groupId>
    <artifactId>resolver</artifactId>
    <version>20050927</version>
</dependency>
Maybe this helps someone running into the same problem.
Throwing my five cents into the ring, as I had "the same problem". What resolved it for me was to stick with version 2 of the jaxws-rt package, since version 3.0.0 failed with the same results as mentioned above.
So what I used to get my JDK 8 source running without any modification was:
// https://mvnrepository.com/artifact/com.sun.xml.ws/jaxws-rt
implementation group: 'com.sun.xml.ws', name: 'jaxws-rt', version: '2.3.3' //3.0.0 did not work!
// https://mvnrepository.com/artifact/javax.xml.ws/jaxws-api
implementation group: 'javax.xml.ws', name: 'jaxws-api', version: '2.3.1'
// https://mvnrepository.com/artifact/javax.jws/javax.jws-api
implementation group: 'javax.jws', name: 'javax.jws-api', version: '1.1'

GWT java.util.Date serialization with CustomFieldSerializer fails in debugger

I had a problem with the way Date was serialized by GWT 2.4.0, and the easiest solution seemed to be to write a Date_CustomFieldSerializer, overriding the original implementation.
But depending on how I start the application, I get different results.
Thankfully, the deployed version seems to work without any trouble. Starting a debugging session from Eclipse, on the other hand, leads to this message:
com.google.gwt.user.client.rpc.IncompatibleRemoteServiceException: The response could not be deserialized
at com.google.gwt.user.client.rpc.impl.RequestCallbackAdapter.onResponseReceived(RequestCallbackAdapter.java:221)
at com.google.gwt.http.client.Request.fireOnResponseReceived(Request.java:287)
at com.google.gwt.http.client.RequestBuilder$1.onReadyStateChange(RequestBuilder.java:395)
...
Caused by: com.google.gwt.user.client.rpc.SerializationException: java.util.Date/1659716317
at com.google.gwt.user.client.rpc.impl.SerializerBase.getTypeHandler(SerializerBase.java:153)
I debugged both the server and the client side; the server is using my serializer, and the client side fails when it looks up the serializer by its "type signature": java.util.Date/1659716317
Oddly, the client has a map containing a serializer for java.util.Date/965047388.
How does GWT create these type signatures, and how can they be different when I am using the GWT debugger?
-- edit --
I now know how the numbers are generated. GWT calculates a CRC32 hash of the class names in the hierarchy (and sometimes the methods as well).
java.util.Date
com.google.gwt.user.client.rpc.core.java.util.Date_CustomFieldSerializer
java.lang.Object
--> 1659716317 (server side)
java.util.Date
java.lang.Object
--> 965047388 (client side)
I just can't find the spot where GWT calculates the hashes for the client side to see why it doesn't know the serializer, because it happens somewhere between a CompilingClassLoader and runtime-generated classes.
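To make the signature mechanism concrete, here is an illustrative sketch (not GWT's exact algorithm, which at times mixes in method signatures too) of how a CRC32 over the class names in the serialization hierarchy produces two different signatures when the custom serializer is visible on only one side:
import java.util.zip.CRC32

// illustrative only: hash the class names in the hierarchy in order
def typeSignature(hierarchy: Seq[String]): Long = {
  val crc = new CRC32
  hierarchy.foreach(name => crc.update(name.getBytes("UTF-8")))
  crc.getValue
}

// server side: the custom serializer is part of the hierarchy...
val serverSig = typeSignature(Seq(
  "java.util.Date",
  "com.google.gwt.user.client.rpc.core.java.util.Date_CustomFieldSerializer",
  "java.lang.Object"))
// ...the client side's is not, so the two signatures can never match
val clientSig = typeSignature(Seq("java.util.Date", "java.lang.Object"))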
For anyone who is having this same problem: I had had the error message for a few days; yesterday I found the cause!
I had two different versions of the class Date_CustomFieldSerializer on my classpath. The wrong one was added to my classpath because it was in gwt-servlet-2.2.0.jar, which was a dependency of the Google Gin 1.5 library I use in my project.
I upgraded Google Gin in my project to version 2.1.2, which doesn't have a gwt-servlet dependency. This way no different versions of the Date_CustomFieldSerializer class should be on the classpath. If you have the same cause and don't want to upgrade Google Gin, you can simply exclude the gwt-servlet-2.2.0 dependency from the Google Gin 1.5 dependency in your pom, like this:
<dependencies>
    <dependency>
        <groupId>sample.ProjectA</groupId>
        <artifactId>Project-A</artifactId>
        <version>1.0</version>
        <scope>compile</scope>
        <exclusions>
            <exclusion> <!-- declare the exclusion here -->
                <groupId>sample.ProjectB</groupId>
                <artifactId>Project-B</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
</dependencies>
The reason the client side didn't have the serializer was simply that GWT couldn't compile it to JavaScript (because of some server-side logging references that were added by accident).
Unless you use "strict" compilation rules, these JavaScript compilation failures happen silently (or add a single line to the compiler output that gets drowned in other messages), and you won't know what you're missing until you need it.

java.lang.NoClassDefFoundError: scala/reflect/ClassManifest

I am getting an error when trying to run an example on Spark. Can anybody please let me know what changes I need to make to my pom.xml to run programs with Spark?
Currently Spark only works with Scala 2.9.3. It does not work with later versions of Scala. I saw the error you describe when I tried to run the SparkPi example with SCALA_HOME pointing to a 2.10.2 installation. When I pointed SCALA_HOME at a 2.9.3 installation instead, things worked for me. Details here.
You should add a dependency on scala-reflect to your Maven build:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>2.10.2</version>
</dependency>
Ran into the same issue using the Scala-Redis 2.9 client (incompatible with Scala 2.10); including a dependency on scala-reflect does not help. Indeed, scala-reflect is packaged as its own jar, but it does not include the missing class, which has been deprecated since Scala 2.10.0 (see this thread).
The correct answer is to point to an installation of Scala which includes this class. (In my case, using the Scala-Redis client, McNeill's answer helped: I pointed to Scala 2.9.3 using SBT and everything worked as expected.)
In my case, the error was raised by Kafka's API. Changing the dependency from
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.9.2</artifactId>
    <version>0.8.1.1</version>
</dependency>
to
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
fixed the problem.
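The general rule behind this fix: the _2.9.2 / _2.10 suffix in these artifact names is the Scala binary version, and every Scala dependency on the classpath must use the same one; mixing them is what produces NoClassDefFoundError for classes like scala.reflect.ClassManifest. In sbt the %% operator keeps the suffixes consistent automatically (a sketch; the versions are examples only):
// build.sbt sketch: %% resolves to spark-streaming-kafka_2.10 here,
// matching scalaVersion, so all Scala binary versions agree
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"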