Java Transform not getting registered with Expansion service in Apache Beam 2.36 - apache-beam

Java Version:
openjdk version "11" 2018-09-25
OpenJDK Runtime Environment 18.9 (build 11+28)
OpenJDK 64-Bit Server VM 18.9 (build 11+28, mixed mode)
Apache Beam version: 2.36.0
Stack Trace:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/sghadiyaram/Desktop/HuX/cdm/microservices/hxp-beam-pipelines/cass-java-beam/build/libs/cass-java-beam-0.0.1-SNAPSHOT.jar!/BOOT-INF/lib/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/sghadiyaram/Desktop/HuX/cdm/microservices/hxp-beam-pipelines/cass-java-beam/build/libs/cass-java-beam-0.0.1-SNAPSHOT.jar!/BOOT-INF/lib/logback-classic-1.2.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Mar 17, 2022 5:19:15 PM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms
INFO: Registering external transforms: [beam:transform:org.apache.beam:pubsub_read:v1, beam:transform:org.apache.beam:pubsub_write:v1, beam:transform:org.apache.beam:pubsublite_write:v1, beam:transform:org.apache.beam:pubsublite_read:v1, beam:transform:org.apache.beam:spanner_insert:v1, beam:transform:org.apache.beam:spanner_update:v1, beam:transform:org.apache.beam:spanner_replace:v1, beam:transform:org.apache.beam:spanner_insert_or_update:v1, beam:transform:org.apache.beam:spanner_delete:v1, beam:transform:org.apache.beam:spanner_read:v1, beam:external:java:generate_sequence:v1]
beam:transform:org.apache.beam:pubsub_read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@72f926e6
beam:transform:org.apache.beam:pubsub_write:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@3daa422a
beam:transform:org.apache.beam:pubsublite_write:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@31c88ec8
beam:transform:org.apache.beam:pubsublite_read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@1cbbffcd
beam:transform:org.apache.beam:spanner_insert:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@27ce24aa
beam:transform:org.apache.beam:spanner_update:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@481a996b
beam:transform:org.apache.beam:spanner_replace:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@3d51f06e
beam:transform:org.apache.beam:spanner_insert_or_update:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@7ed7259e
beam:transform:org.apache.beam:spanner_delete:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@28eaa59a
beam:transform:org.apache.beam:spanner_read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@3427b02d
beam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@647e447
I have written a builder class, a configuration class, and a registrar class, but my transform is not getting registered.
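For reference, the expansion service discovers registrars with Java's ServiceLoader (the loadRegisteredTransforms call in the log above), so the registrar must carry @AutoService(ExternalTransformRegistrar.class) (or an equivalent META-INF/services entry) and must be on the expansion service's classpath. Below is a minimal sketch of the three classes; the URN, class names, and transform body are placeholders, not the asker's actual code:

import java.util.Map;
import com.google.auto.service.AutoService;
import org.apache.beam.sdk.expansion.ExternalTransformRegistrar;
import org.apache.beam.sdk.transforms.ExternalTransformBuilder;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

// Hypothetical example: URN, names, and the transform itself are placeholders.
@AutoService(ExternalTransformRegistrar.class)
public class MyTransformRegistrar implements ExternalTransformRegistrar {

  static final String URN = "beam:transform:com.example:my_transform:v1";

  @Override
  public Map<String, ExternalTransformBuilder<?, ?, ?>> knownBuilderInstances() {
    return Map.of(URN, new Builder());
  }

  // Configuration class: fields are populated from the expansion request
  // payload through matching setters.
  public static class Configuration {
    private String prefix = "";

    public void setPrefix(String prefix) {
      this.prefix = prefix;
    }
  }

  // Builder class: turns a populated Configuration into the concrete PTransform.
  public static class Builder
      implements ExternalTransformBuilder<Configuration, PCollection<String>, PCollection<String>> {
    @Override
    public PTransform<PCollection<String>, PCollection<String>> buildExternal(
        Configuration config) {
      return new MyTransform(config.prefix);
    }
  }

  // The cross-language transform being exposed; here it just prepends a prefix.
  public static class MyTransform extends PTransform<PCollection<String>, PCollection<String>> {
    private final String prefix;

    MyTransform(String prefix) {
      this.prefix = prefix;
    }

    @Override
    public PCollection<String> expand(PCollection<String> input) {
      final String p = prefix;
      return input.apply(MapElements.into(TypeDescriptors.strings()).via((String s) -> p + s));
    }
  }
}

If the annotation is already present, also verify that the auto-service annotation processor actually ran during the build and that the generated META-INF/services entry survives repackaging; the BOOT-INF paths in the log suggest a Spring Boot fat jar, which is worth checking for that reason.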

Related

Disable log messages about warnings for org.glassfish.jersey.internal.inject.Providers in Kafka-Connect

My goal is to make the log format in Kafka Connect JSON, but the following log messages are never JSON and cannot be disabled:
Feb 10, 2020 4:36:03 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Feb 10, 2020 4:36:03 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Feb 10, 2020 4:36:03 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Feb 10, 2020 4:36:05 PM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
I know those are warnings, so my goal is to either stop them from showing up or have them emitted as JSON. Neither works, though.
I tried the following setup:
<logger name="org.glassfish.jersey.internal.inject.Providers" additivity="true" level="ERROR" />
Those log messages still appear. Any suggestions?
I can use either logback or log4j. I currently use logback but using log4j is not a problem.
UPDATE
I switched to log4j:
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=net.logstash.log4j.JSONEventLayoutV1
log4j.logger.org.reflections=ERROR
log4j.logger.org.apache.kafka=INFO
log4j.logger.org.apache.kafka.connect.runtime.rest=ERROR
log4j.logger.org.apache.kafka.clients.consumer.ConsumerConfig=ERROR
log4j.logger.org.apache.kafka.clients.producer.ProducerConfig=ERROR
log4j.logger.org.apache.kafka.clients.admin.AdminClientConfig=ERROR
Logs are now in JSON format, but the WARNINGs are still there even if I set
log4j.logger.org.glassfish.jersey.internal.inject.Provider=ERROR
You wrote log4j.logger.org.glassfish.jersey.internal.inject.Provider=ERROR, but the class is Providers, with an s.
Personally, I'd suggest
log4j.logger.org.glassfish.jersey.internal=OFF
That log message is being logged using the JDK built-in logger (java.util.logging), not SLF4J or log4j:
https://github.com/jersey/jersey/blob/faa809da43538ce31076b50f969b4bd64caa5ac9/core-common/src/main/java/org/glassfish/jersey/internal/inject/Providers.java#L514
You need to edit the default JVM logging.properties file or create a custom one and reference it with JVM option -Djava.util.logging.config.file in order to set the log level of org.glassfish.jersey.internal to SEVERE.
Logging in Java is complicated and messy. You end up having to know all of the different logging products and often they are mixed together in the same project.
Example logging.properties file:
handlers= java.util.logging.ConsoleHandler
.level= INFO
java.util.logging.ConsoleHandler.level = INFO
java.util.logging.ConsoleHandler.formatter = java.util.logging.SimpleFormatter
org.glassfish.jersey.internal.level = SEVERE
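Assuming Connect is started through the standard distribution scripts (which pass KAFKA_OPTS through to the JVM), the file can be wired in like this; the path is a placeholder:
export KAFKA_OPTS="-Djava.util.logging.config.file=/path/to/logging.properties"
bin/connect-distributed.sh config/connect-distributed.properties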

Apache-Beam exception while running WordCount example in eclipse

Downloaded Maven dependencies in Eclipse using
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-runners-direct-java</artifactId>
  <version>0.2.0-incubating</version>
</dependency>
When I download and run the WordCount example after changing the gs:// path to C://examples//misc.txt, I get the exception below. I did not pass any runner. How do I pass the runner option and the output parameters when running from Eclipse?
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: Failed to validate C://examples//misc.txt
at org.apache.beam.sdk.io.TextIO$Read$Bound.apply(TextIO.java:288)
at org.apache.beam.sdk.io.TextIO$Read$Bound.apply(TextIO.java:195)
at org.apache.beam.sdk.runners.PipelineRunner.apply(PipelineRunner.java:76)
at org.apache.beam.runners.direct.DirectRunner.apply(DirectRunner.java:205)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:401)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:324)
at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:59)
at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:174)
at org.apache.beam.examples.WordCount.main(WordCount.java:206)
Caused by: java.io.IOException: Unable to find handler for C://examples//misc.txt
at org.apache.beam.sdk.util.IOChannelUtils.getFactory(IOChannelUtils.java:188)
at org.apache.beam.sdk.io.TextIO$Read$Bound.apply(TextIO.java:283)
... 8 more
I believe Apache Beam will parse syntax C://directory/file similar to http://domain/file -- it will think that C is a protocol name and that directory is a domain. The inner exception is saying that C is an unknown protocol.
Please try to avoid using :// symbol when referring to local files. I'd suggest using regular Windows standard: C:\directory\file.
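As for the runner question: in Eclipse you can put the flags under Run > Run Configurations... > Arguments > Program arguments, or hard-code them in a small launcher. A minimal sketch of the latter, where the option names (--inputFile, --output) and the runner's registered name are assumptions based on the stock WordCount example:

public class WordCountLauncher {
  public static void main(String[] ignored) {
    // Hypothetical launcher: option names, runner name, and paths are assumptions.
    org.apache.beam.examples.WordCount.main(new String[] {
        "--runner=DirectRunner",
        "--inputFile=C:\\examples\\misc.txt", // plain Windows path, no "://"
        "--output=C:\\examples\\counts"
    });
  }
}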

Apache Phoenix not getting started

I have installed HBase 1.1.3 in a multi-node cluster configuration and want to run Apache Phoenix on top of it. I downloaded Phoenix 4.7 and installed it per the guidelines mentioned here: https://phoenix.apache.org/installation.html
But when I run the following command: sqlline.py
it hangs at the point shown below.
hadoop#hostname:~$ sqlline.py hostname
Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:localhost none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:localhost
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/phoenix-4.7.0-HBase-1.1-bin/phoenix-4.7.0-HBase-1.1-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
16/05/10 13:06:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Well, it seems that the Phoenix client is unable to connect to the HBase znode in the ZooKeeper cluster. Please do the following:
Check if ZooKeeper is up.
Check under what name you have registered HBase in ZooKeeper. If the name is not hbase, you need to specify it to the client; in that case the command would look like sqlline.py hostname:2181:/znode-for-hbase-name.
Check that you have added phoenix-[version]-server.jar to the lib folder on all HBase nodes and try again.
You need to add the following jars to the HBase lib directory.
phoenix-spark-4.7.0-HBase-1.1.jar
phoenix-4.7.0-HBase-1.1-server.jar
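For example (paths are assumptions based on the install location in the question, and /usr/local/hbase is a placeholder for your HBase home; copy on every HBase node, then restart HBase):
cp /usr/local/phoenix-4.7.0-HBase-1.1-bin/phoenix-4.7.0-HBase-1.1-server.jar /usr/local/hbase/lib/
cp /usr/local/phoenix-4.7.0-HBase-1.1-bin/phoenix-spark-4.7.0-HBase-1.1.jar /usr/local/hbase/lib/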

Hadoop's WordCount runs from the command line but not from Eclipse

In the last few days, I have tested multiple versions of Hadoop (1.0.1, 1.0.2, 1.1.4). In each case, I can easily run the WordCount program using the following command line:
hadoop jar hadoop-examples-1.1.1.jar wordcount /input output
Since the above command executes successfully, I assume that my Hadoop configuration is correct. But with every version I get the following error message when I try to run the program with the exact same input from Eclipse. Can anyone tell me why it won't run from Eclipse?
Dec 12, 2012 2:19:41 PM org.apache.hadoop.util.NativeCodeLoader <clinit>
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Dec 12, 2012 2:19:41 PM org.apache.hadoop.mapred.JobClient copyAndConfigureFiles
WARNING: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
****file:/tmp/wordcount/in
Dec 12, 2012 2:19:42 PM org.apache.hadoop.mapred.JobClient$2 run
INFO: Cleaning up the staging area file:/tmp/hadoop-root/mapred/staging/root-41981592/.staging/job_local_0001
Dec 12, 2012 2:19:42 PM org.apache.hadoop.security.UserGroupInformation doAs
SEVERE: PriviledgedActionException as:root cause:org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/input
Exception in thread "main" org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/input
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:962)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:979)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at com.igalia.wordcount.WordCount.run(WordCount.java:94)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.igalia.wordcount.App.main(App.java:28)
Add the following to your job through your Configuration object:
Configuration conf = new Configuration();
conf.addResource(new Path("path-to-your-core-site.xml"));
conf.addResource(new Path("path-to-your-hdfs-site.xml"));
For hadoop-2.2.0 on Windows 7, I added the lines below and it solved the issue (NOTE: my Hadoop home is C:\MyWork\MyProjects\Hadoop\hadoop-2.2.0; note the escaped backslashes in the Java string literals):
Configuration conf = new Configuration();
conf.addResource(new Path("C:\\MyWork\\MyProjects\\Hadoop\\hadoop-2.2.0\\etc\\hadoop\\core-site.xml"));
conf.addResource(new Path("C:\\MyWork\\MyProjects\\Hadoop\\hadoop-2.2.0\\etc\\hadoop\\hdfs-site.xml"));

JBoss server log file

I am trying to run JBoss.
But I get the following error:
[javac] C:\Program Files\jbpm-5.0-try3\jbpm-installer\build.xml:518: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
[java] SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
[java] SLF4J: Defaulting to no-operation (NOP) logger implementation
[java] SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[java] Task service started correctly !
[java] Task service running ...
Also, it does not run on port 8080.
What could be the problem?
How can I see the log file?
About the log files:
JBoss should have the log files in the folder %JBOSS%\server\default\log.
About port 8080:
I would check whether some other server is already configured to listen on 8080. Especially if you are doing a lot of experimental installations, you can end up with several JBosses, Tomcats, and Glassfishes listening on the same port, so most of the servers won't receive their requests. At least to me it has happened once (but don't tell anybody).
It could be that logging failed to initialize and further errors prevented your application from deploying correctly.
Check this first:
Failed to load class org.slf4j.impl.StaticLoggerBinder
This error is reported when the org.slf4j.impl.StaticLoggerBinder class could not be loaded into memory. This happens when no appropriate SLF4J binding could be found on the class path. Placing one (and only one) of slf4j-nop.jar, slf4j-simple.jar, slf4j-log4j12.jar, slf4j-jdk14.jar or logback-classic.jar on the class path should solve the problem.
As of SLF4J version 1.6, in the absence of a binding, SLF4J will default to a no-operation (NOP) logger implementation.
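If the project happened to be Maven-based, pulling in exactly one binding might look like this; the artifact choice and version are examples, not something specific to the jBPM installer:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.30</version>
</dependency>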