Please see the screenshots below; I am getting the same issue while running a Spark program. Can you please help? java.io.IOException: Could not locate executable C:\hadoop\bin\bin\winutils.exe in the Hadoop binaries. I have added the necessary paths.
Add the following to your program:
I downloaded winutils.exe and placed it at C:/Bin/Winutils.exe,
then added the following line to my project at the start of the function:
System.setProperty("hadoop.home.dir", "C:\\winutil\\")
We can't figure out the following issue: we are trying to use Apache Hudi to save data to storage. The problem is that when we upload a fat jar which includes the org.json package in its dependencies, the df.save() call fails with:
java.lang.NoClassDefFoundError: org/json/JSONException
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:10847)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10047)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10128)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:209)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:424)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
at org.apache.hudi.hive.HoodieHiveClient.updateHiveSQLs(HoodieHiveClient.java:384)
at org.apache.hudi.hive.HoodieHiveClient.updateHiveSQLUsingHiveDriver(HoodieHiveClient.java:367)
at org.apache.hudi.hive.HoodieHiveClient.updateHiveSQL(HoodieHiveClient.java:357)
at org.apache.hudi.hive.HoodieHiveClient.createTable(HoodieHiveClient.java:262)
at org.apache.hudi.hive.HiveSyncTool.syncSchema(HiveSyncTool.java:176)
at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:130)
at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:94)
at org.apache.hudi.HoodieSparkSqlWriter$.org$apache$hudi$HoodieSparkSqlWriter$$syncHive(HoodieSparkSqlWriter.scala:321)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$metaSync$2.apply(HoodieSparkSqlWriter.scala:363)
at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$metaSync$2.apply(HoodieSparkSqlWriter.scala:359)
Even if I go to the cluster libraries and explicitly add this dependency, it still fails on save. On the other hand, when I just create a new JSONException("hello") in my notebook, everything seems to work fine. What could cause this behaviour? Thanks
This is probably because the jar is not making its way to the executor nodes; try addJar (https://spark.apache.org/docs/latest/api/java/org/apache/spark/SparkContext.html#addJar-java.lang.String-).
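A minimal sketch of that suggestion (the jar path is a placeholder and must be readable from the driver):

import org.apache.spark.sql.SparkSession;

public class ShipJsonJar {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("hudi-write").getOrCreate();
        // distribute the org.json jar to every executor before the Hudi write runs
        spark.sparkContext().addJar("/path/to/org.json.jar");
        // ... the df.write().format("hudi") call goes after this point
    }
}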
What version of Hudi are you using? There is a problem with JSON in version 0.6.0 and there is an open issue for it. I suggest you use version 0.5.2 for now.
It turns out that the problem was a different classpath between the metastore service and the Spark process, because they run in separate JVMs. The problem was fixed with an init script that downloads the jar into the classpath folder.
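For reference, a rough sketch of what such an init script could look like; the download URL and the destination folder are placeholders and depend on your platform:

#!/bin/bash
# fetch the missing org.json jar and drop it into a directory that is on the
# classpath of both the metastore service and the Spark driver (placeholder paths)
wget -q -O /databricks/jars/org.json.jar "https://your-artifact-repo/org.json.jar"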
I am new to Giraph and Hadoop. I am trying to compile Giraph using Maven: I tried the command mvn -Phadoop_2 -fae -DskipTests clean install on the command prompt, but I am getting an error. The image I attached is from Eclipse (I am getting a similar error in cmd as well).
Giraph version: 1.2.0 RC1
Hadoop version: 2.2.0
Maven version: 3.5.3
Java version: 1.8.0_121
Any help in solving this problem would be much appreciated. Thank you.
I found a workaround for this as I was facing a similar problem.
mvn clean install -PallModules -Drat.numUnapprovedLicenses=200 -DskipTests
The -Drat.numUnapprovedLicenses=200 flag raises the number of files without an approved licence that Apache Rat will tolerate, which suppresses the error.
Thanks #leftjoin for the comment.
There must be a file created by Apache Rat at giraph-gora/target/rat.txt (as giraph-gora is the failing module). This file lists the files in your project that do not have a proper licence header (as well as the ones that do, but those are not of interest here).
Most likely the files generated by Eclipse (.classpath, .project, org.eclipse.core.resources.prefs, org.eclipse.m2e.core.prefs, org.eclipse.jdt.core.prefs) are causing the problem. If you remove them, you will be able to compile Giraph properly.
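A hedged example of cleaning those Eclipse-generated files out of every module before rebuilding (run from the Giraph source root in a Unix-like shell such as Git Bash; check target/rat.txt first so you only delete the actual offenders):

find . \( -name .classpath -o -name .project -o -name '*.prefs' \) -exec rm -f {} +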
I need to add an external dependent library (spark-streaming-mqtt_2.10-1.5.2.jar in my case) to my PySpark word count code. I know we can add external jars with the --jars property in spark-submit or when running the pyspark shell, but I want to add this jar in my code or in the Spark config file. I found that there is a SparkContext.addJar() method, which can be included in code.
sc.addJar("spark-streaming-mqtt_2.10-1.5.2.jar")
But the above command gives me the error: AttributeError: 'SparkContext' object has no attribute 'addJar'.
I have tried adding this jar in the spark-defaults.conf file as:
spark.driver.extraClassPath spark-streaming-mqtt_2.10-1.5.2.jar
spark.executor.extraClassPath spark-streaming-mqtt_2.10-1.5.2.jar
But this is not working for me either. I have looked on the internet but have not found any useful link.
I am using Spark 1.5.2 with 1 namenode and 3 datanodes in an HDP cluster.
Can you please help me solve this issue? I would be really thankful.
spark.driver.extraClassPath and spark.executor.extraClassPath will work, but these should be paths on your Hadoop nodes, because the files are not uploaded; they are just added to the Spark containers' classpath.
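For example (the path is only illustrative; the jar must already exist at that location on the driver node and on every worker node):

spark.driver.extraClassPath /opt/libs/spark-streaming-mqtt_2.10-1.5.2.jar
spark.executor.extraClassPath /opt/libs/spark-streaming-mqtt_2.10-1.5.2.jar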
It worked for me after adding the external dependency in spark-defaults.conf as:
spark.jars spark-streaming-mqtt_2.10-1.5.2.jar
Now my job picks up the external dependency.
I downloaded LWJGL 3.0 from lwjgl.org, which only had the lwjgl.jar file in the jar subdirectory. The native directory only has files like libglfw.so but no subdirectories at all (and certainly not native/windows).
Created a library LWJGL30 with the lwjgl.jar file.
Added it to my project's libraries, and to Project Properties->Libraries->Compile and Run.
Set the JVM launch argument in Project Properties->Run to -Djava.library.path=C:\Users\Owner\Documents\lwjgl\native for the VM Options
Copied the HelloWorld example from the link
Then I ran it and got this error:
Exception in thread "main" java.lang.NoClassDefFoundError: Could not
initialize class org.lwjgl.system.Library at
org.lwjgl.system.MemoryAccess.(MemoryAccess.java:22) at
org.lwjgl.system.Pointer.(Pointer.java:22) at
org.lwjgl.glfw.GLFW.(GLFW.java:594) at
mylwjgl.MyLWJGL.run(MyLWJGL.java:43) at
mylwjgl.MyLWJGL.main(MyLWJGL.java:140)
C:\Users\Owner\AppData\Local\NetBeans\Cache\8.1\executor-snippets\run.xml:53:
Java returned: 1 BUILD FAILED (total time: 7 seconds)
I have checked, double-checked, and triple-checked, as well as searched online for an answer; it should work, but it does not. Any help would be appreciated.
This error occurs when LWJGL is unable to find the native files. Make sure java.library.path actually points to the directory with LWJGL's natives (which have extensions like .dll, .so and .dylib). Be careful with spaces in the path: You have to wrap the path in quotation marks or it won't work.
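As an alternative to the VM option, the path can also be set from code before any LWJGL class is used, via the org.lwjgl.librarypath system property. A small sketch (the path is only an example and must contain the .dll/.so/.dylib files):

public class NativePathCheck {
    public static void main(String[] args) {
        // tells LWJGL 3 directly where its natives live; set before any LWJGL class is touched
        System.setProperty("org.lwjgl.librarypath", "C:\\Users\\Owner\\Documents\\lwjgl\\native");
        System.out.println("Running LWJGL " + org.lwjgl.Version.getVersion());
    }
}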
The issue was I was running the HelloWorld example using the Stable version of LWJGL 3.0. When I switched to the Latest version, everything worked as expected.
My requirement is to run an Eclipse Java project from the command prompt. I am using the following project structure:
Project name: parser
Package name: xml
Class name: Sample (with the main method)
I exported the jar file by right-clicking the project name-->Export-->JAR file-->gave the destination path for the jar file-->checked Generate Manifest file and selected the main class (Sample.java).
Now in the command prompt I tried to run the jar file using:
java -jar myjar.jar
And the error message I get is:
Exception in thread "main" java.lang.UnsupportedClassVersionError: XML/Sample:
Unsupported major.minor version 51.0
Could not find the main class: XML.Sample. Program will exit.
I have been searching for a solution for a long time but have failed to find one. I would be very thankful if someone could help me with this problem. Kindly excuse any mistakes in my question, as I am new to Java coding. Thanks in advance.
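For context, major.minor version 51.0 corresponds to class files compiled for Java 7, so the jar was most likely built with a newer JDK than the JRE that runs it. A quick way to compare the two versions from the same command prompt:

java -version
javac -version

If they differ, either run the jar with a Java 7 (or newer) runtime, or lower the compiler compliance level in Eclipse (Project Properties->Java Compiler) to match the installed JRE and re-export the jar.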