I have installed Scala, Spark, SBT and winutils on the Windows box, and configured JAVA_HOME, SPARK_HOME, SCALA_HOME and SBT_HOME appropriately. Still, when I try to run 'spark-shell', I run into a very common problem which I see many people have faced.
I have tried giving access to the c:/tmp/hive directory through winutils as well as through Windows Explorer, yet the issue remains. I would appreciate it if anyone could tell me what else I might be missing.
Versions:
Spark: 2.1.1
Scala: 2.11
Below are the links to screenshots:
https://filedropper.com/filemanager/public.php?service=files&t=b5cc12549b9ae19ff969c65cb952c135
https://filedropper.com/filemanager/public.php?service=files&t=cb5b9b8f14980963be6f0a6511f71455
https://filedropper.com/filemanager/public.php?service=files&t=a2d471abfe57ca020a700c226b906faf
Appreciate your help!
Thanks,
Manish
Thank you cricket_007 and Jacek, I appreciate your comments.
It turned out to be a version-compatibility issue with winutils. I had to try several versions until the right one changed the permissions on /tmp/hive.
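For reference, the permission change that finally worked can be done like this (the %HADOOP_HOME% location and the paths are assumptions, so adjust them to your setup; the winutils.exe build must match the Hadoop version your Spark distribution was built against):

    REM Run this from the drive on which spark-shell is started.
    REM %HADOOP_HOME%\bin is assumed to contain the matching winutils.exe build.
    %HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive

    REM Verify that the permissions actually changed
    %HADOOP_HOME%\bin\winutils.exe ls \tmp\hive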
Related
I recently installed PostgreSQL 13 on Windows 10 and the PostGIS extension via the Stack Builder. However, when trying to launch the shp2pgsql-gui application to import a shapefile, I get a "libsqlite3-0.dll was not found" error. I can see the file under the bin folder, so I'm not sure where the problem is, and googling did not help very much. I wonder what the problem may be, because the installation seemed too straightforward to have gotten something wrong.
Copying the "libsqlite3-0.dll" file from \PostgreSQL\13\bin to \PostgreSQL\13\bin\postgisgui and then running the shp2pgsql-gui application fixed the issue for me.
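Assuming the default installation path (adjust it if PostgreSQL is installed elsewhere), the copy can be done from a command prompt, for example:

    copy "C:\Program Files\PostgreSQL\13\bin\libsqlite3-0.dll" "C:\Program Files\PostgreSQL\13\bin\postgisgui\"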
This problem can occur for different reasons, but I recommend downloading that .dll from the following link (https://es.dll-files.com/libsqlite3-0.dll.html) and then replacing the file in the installation folder.
I am working with Python 3.7, Aldec Riviera Pro 2017, cocotb 1.3 and MSYS2.
When I run this test on Jenkins and on a remote PC, I get this issue. My log file looks like this:
VHPI: Loading library 'C:/JenkinsSlave/workspace/Diceros_-_Regression_Tests_CoCoTB_mao/Vivado/diceros/ip_repo/registers_1.0/sim/build/libs/x86_64/libcocotbvhpi.dll'
VHPI: Cannot load the "C:/JenkinsSlave/workspace/Diceros_-_Regression_Tests_CoCoTB_mao/Vivado/diceros/ip_repo/registers_1.0/sim/build/libs/x86_64/libcocotbvhpi.dll" library. The library does not exist or is corrupted.
Solutions tried so far:
Checked that the file is there and checked the dependencies of the DLL as well (all good).
Discovered that Riviera has its own gcc version, which is different from the MSYS2 (mingw64) one -- not sure if that is a problem?
Played with environment variables (which clearly didn't work).
Any suggestions will be helpful. I am really stuck at the moment. Thanks in advance!
Please see my answer in https://github.com/cocotb/cocotb/issues/1459, thanks!
On Ubuntu, I export the library path while running my Scala application so that it loads the .so files, but the files are not loaded, so an exception occurs when I hit my methods. The exception I get is shown in the attached screenshot.
I don't know how to fix this issue. The lib folder contains all the library files needed, but the error still appears; in particular, it says libboost_filesystem cannot be loaded. Please help me fix this issue. Thanks for your help.
You need to make sure your libboost libraries are installed and available to the JVM. There are multiple ways to help the JVM find those libraries, such as LD_LIBRARY_PATH or -Djava.library.path=. Check this article for examples: https://www.chilkatsoft.com/java-loadLibrary-Linux.asp.
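As a minimal sketch (the library name "mynative" and the path /opt/myapp/lib are placeholders, not something from your setup): System.loadLibrary resolves your own library through java.library.path, while its transitive dependencies such as libboost_filesystem.so are resolved by the OS loader through LD_LIBRARY_PATH, so the folder containing the .so files usually needs to be visible to both.

    // Sketch only: prints where the JVM will look, then tries to load a native library.
    // "mynative" stands for your own libmynative.so and is purely a placeholder.
    object NativeLoadCheck {
      def main(args: Array[String]): Unit = {
        println("java.library.path = " + System.getProperty("java.library.path"))
        println("LD_LIBRARY_PATH   = " + sys.env.getOrElse("LD_LIBRARY_PATH", "<not set>"))
        System.loadLibrary("mynative") // throws UnsatisfiedLinkError if it cannot be found
      }
    }

Running it with both mechanisms set, e.g. LD_LIBRARY_PATH=/opt/myapp/lib scala -Djava.library.path=/opt/myapp/lib NativeLoadCheck, is a quick way to rule out a plain path problem.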
I am unable to open Eclipse IDE. The error displayed is as below
Any idea what could be the reason? And how do I solve this?
Note: I ran CCleaner recently (any registry issue?) and got an error executing my open Eclipse project. On restarting, I get this issue. Is re-installation going to help? I am going to do that and will update the result here. I would also really like to know what caused this, as a precaution for the future.
Update: Same error with a new installation. But this time I got an error about launching the JVM and missing DLLs, so I am going to reinstall Java.
Unable to understand what the exact problem might be, but still giving it a shot.
Did you go through this URL: http://michaelzanussi.com/?p=468 ?
It appears to be the same issue; resetting JAVA_HOME and PATH solved it for him.
The authors of Eclipse strongly recommend manually updating the Eclipse.ini file to point directly to the JRE that you want to use rather than relying on Windows environment variables.
Also, C:\Windows\System32 is a really strange place to find the Java Runtime Environment files; typically they wind up in C:\Program Files\Java or C:\Program Files (x86)\Java.
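For reference, the relevant eclipse.ini entry looks like the one below (the JDK path is only an example, so point it at the JRE/JDK you actually want to use); the -vm flag and the path must sit on two separate lines and must come before -vmargs:

    -vm
    C:\Program Files\Java\jdk1.8.0_202\bin\javaw.exe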
I am trying to run MapReduce jobs using hadoop-eclipse plugin with Eclipse Indigo, but I am getting the following error:
Error: failure to login
While looking for help, I found that there is a known problem with Hadoop-0.20.203.0, so I tried Hadoop-0.20.205.0, as the issues are fixed in that version.
I am still facing the same problem. Am I missing something or making a mistake?
Sorry for my poor English. As your question has no more detail, I guess you have met the same problem as me; if so, the following link resolved my problem. Please pay attention to step 4.
http://hi.baidu.com/wangyucao1989/blog/item/279cef87c4b37c34c75cc315.html
Sorry, that page is in Chinese. It says the problem is that the file hadoop-eclipse-plugin-0.20.203.0.jar is missing 5 files: commons-configuration-1.6.jar, commons-httpclient-3.0.1.jar, commons-lang-2.4.jar, jackson-core-asl-1.0.1.jar and jackson-mapper-asl-1.0.1.jar. You should:
Extract hadoop-eclipse-plugin-0.20.203.0.jar,
Add the 5 files into hadoop-eclipse-plugin-0.20.203.0\lib,
Modify hadoop-eclipse-plugin-0.20.203.0\META-INF\MANIFEST.MF (modify the Bundle-ClassPath; see the example after this list),
Re-jar the package and replace the old hadoop-eclipse-plugin-0.20.203.0.jar.
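For the Bundle-ClassPath step, the modified entry in META-INF\MANIFEST.MF ends up roughly like the example below (this assumes the original entry listed classes/ and lib/hadoop-core.jar, and that the 5 jar names match exactly what you copied into lib; note that manifest lines are limited to 72 bytes and continuation lines must begin with a single space):

    Bundle-ClassPath: classes/,lib/hadoop-core.jar,
     lib/commons-configuration-1.6.jar,lib/commons-httpclient-3.0.1.jar,
     lib/commons-lang-2.4.jar,lib/jackson-core-asl-1.0.1.jar,
     lib/jackson-mapper-asl-1.0.1.jar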
The OS the page refers to is Linux; my OS is Win7.
Good luck!
Instead of adding the plugin, you can just add the required libraries in Eclipse and do your programming.
Here is the list of libraries you will need. These files exist in the lib folder of the Apache Hadoop distribution.
hadoop-core-1.1.2.jar
log4j-1.2.15.jar
jackson-mapper-asl-1.8.8.jar
jackson-core-asl-1.8.8.jar
commons-logging-api-1.0.4.jar
commons-logging-1.1.1.jar
commons-lang-2.4.jar
commons-httpclient-3.0.1.jar
commons-configuration-1.6.jar