I am unable to build mongodb-binding with YCSB

I am trying to build the mongodb-async driver binding with YCSB version 0.15.0. I am running it on Clear Linux OS, with MongoDB version 4.0.5 and Maven version 3.6.0.
I run the command "mvn -pl com.yahoo.ycsb:mongodb-binding -am clean package" from the home directory of YCSB and get the following error:
Could not resolve dependencies for project com.yahoo.ycsb:mongodb-binding:jar:0.16.0-SNAPSHOT: Failed to collect dependencies at com.allanbank:mongodb-async-driver:jar:2.10.1: Failed to read artifact descriptor for com.allanbank:mongodb-async-driver:jar:2.10.1: Could not transfer artifact com.allanbank:mongodb-async-driver:pom:2.10.1 from/to allanbank (http://www.allanbank.com/repo/): Connect to www.allanbank.com:80 [www.allanbank.com/206.210.70.161] failed: Connection timed out (Connection timed out) -> [Help 1]
Can someone please point out what might be going wrong? Thanks in advance.

I think you have to configure JAVA_HOME in .mavenrc by adding:
export JAVA_HOME=your path (e.g. export JAVA_HOME=/usr/lib/jvm/java-8-oracle/) to that file
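For reference, a minimal .mavenrc could look like the sketch below (the JDK path is just an example; use your own installation's path). Note that the error in the question is a connection timeout to the allanbank repository, so setting JAVA_HOME may not be enough on its own if that repository is unreachable from your network:

```shell
# ~/.mavenrc -- sourced by the mvn launcher script before each build.
# Point JAVA_HOME at the JDK you want Maven to use (example path).
export JAVA_HOME=/usr/lib/jvm/java-8-oracle/
```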

Related

Building Apache Spark 2.1.0 from source fails

I am trying to build the Apache Spark 2.1.0 source, but I get the errors below, which baffle me...
Hadoop 2.8.0 was installed and is working.
Scala 2.12.1 was installed before running the Spark build (which seems to auto-install Scala 2.11.8 anyway?!)
My build line is:
build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0 -DskipTests clean package
Does anybody know why I get:
user#server:/usr/local/share/spark/spark-2.1.0$ sudo /usr/local/share/spark/spark-2.1.0/build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0 -DskipTests clean package
[sudo] password for user:
exec: curl --progress-bar -L https://downloads.typesafe.com/zinc/0.3.9/zinc-0.3.9.tgz
######################################################################## 100.0%
exec: curl --progress-bar -L https://downloads.typesafe.com/scala/2.11.8/scala-2.11.8.tgz
######################################################################## 100.0%
exec: curl --progress-bar -L https://www.apache.org/dyn/closer.lua?action=download&filename=/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
######################################################################## 100.0%
Using `mvn` from path: /usr/local/share/spark/spark-2.1.0/build/apache-maven-3.3.9/bin/mvn
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
[INFO] Scanning for projects...
Downloading: https://repo1.maven.org/maven2/org/apache/apache/14/apache-14.pom
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[FATAL] Non-resolvable parent POM for org.apache.spark:spark-parent_2.11:2.1.0: Could not transfer artifact org.apache:apache:pom:14 from/to central (https://repo1.maven.org/maven2): repo1.maven.org: Name or service not known and 'parent.relativePath' points at wrong local POM # line 22, column 11
#
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project org.apache.spark:spark-parent_2.11:2.1.0 (/usr/local/share/spark/spark-2.1.0/pom.xml) has 1 error
[ERROR] Non-resolvable parent POM for org.apache.spark:spark-parent_2.11:2.1.0: Could not transfer artifact org.apache:apache:pom:14 from/to central (https://repo1.maven.org/maven2): repo1.maven.org: Name or service not known and 'parent.relativePath' points at wrong local POM # line 22, column 11: Unknown host repo1.maven.org: Name or service not known -> [Help 2]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/UnresolvableModelException
I tested downloading the file manually (to see if that was causing the errors), and it downloads without issues:
https://repo1.maven.org/maven2/org/apache/apache/14/apache-14.pom
I also tested accessing the URL below, which also shows content:
https://repo1.maven.org/maven2
Hope somebody smart knows how to solve this...
I found out what the issue was:
I had to configure our proxy settings in the settings.xml file in the directory:
/usr/local/share/spark/spark-2.1.0/build/apache-maven-3.3.9/conf
After editing the file, the build went through without any issues :)
Hope this helps someone else running into the same issue...
EDIT: Just to be extra clear, having a working proxy configuration in bash alone is NOT SUFFICIENT for the Maven build to succeed. I was able to download all files from bash manually, but Maven needed the proxy configuration in the settings.xml file as well...
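A sketch of what such a proxy block can look like inside the `<settings>` element of settings.xml (the host, port, and nonProxyHosts values below are placeholders; substitute your own proxy details):

```xml
<!-- Goes inside <settings> in settings.xml
     (conf/ of the Maven installation, or ~/.m2/settings.xml). -->
<proxies>
  <proxy>
    <id>corporate-proxy</id>
    <active>true</active>
    <protocol>http</protocol>
    <host>proxy.example.com</host>  <!-- placeholder: your proxy host -->
    <port>8080</port>               <!-- placeholder: your proxy port -->
    <nonProxyHosts>localhost|127.0.0.1</nonProxyHosts>
  </proxy>
</proxies>
```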

Could not transfer artifact from/to central because of InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty

When I try to run mvn install on my Maven project, I get the following error. Please help.
Failed to execute goal org.apache.maven.plugins:maven-resources-plugin:2.6:resources (default-resources) on project pm: Execution default-resources of goal org.apache.maven.plugins:maven-resources-plugin:2.6:resources failed: Plugin org.apache.maven.plugins:maven-resources-plugin:2.6 or one of its dependencies could not be resolved: Could not transfer artifact classworlds:classworlds:jar:1.1 from/to central (https://repo.maven.apache.org/maven2): java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
I had the same issue when compiling on a remote CI server. In the end, forcing the location of the trustStore when compiling with Maven solved the problem for me:
-Djavax.net.ssl.trustStore=/usr/java/jdk1.8.0_91/jre/lib/security/cacerts
The actual path will be different based on your JDK installation.
I speculate (though I am not sure) that the problem arises when you have more than one JDK installed: an old one does not have the proper certificates, and somehow Maven picks up that old one even though you are using the right javac.
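One way to pass that flag (the JDK path below is the example from above; adjust it to your installation) is via the MAVEN_OPTS environment variable, which the mvn launcher passes to the JVM, so it applies to every build in that shell:

```shell
# Point the JVM that runs Maven at a known-good trustStore (example path),
# then build as usual.
export MAVEN_OPTS="-Djavax.net.ssl.trustStore=/usr/java/jdk1.8.0_91/jre/lib/security/cacerts"
mvn clean install
```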
Had the same issue, found
https://groups.google.com/d/msg/osv-dev/wFzT12-p2bw/vT42CiuOBQAJ
which recommended
$ sudo apt install openjdk-8-jre
$ sudo rm /etc/ssl/certs/java/cacerts
$ sudo update-ca-certificates --fresh
Worked on the first try for me, so I am leaving it here in case anyone else stumbles upon this.

OpenHab build failed with Maven

I am trying to set OpenHab IDE using the link:
https://github.com/openhab/openhab/wiki/IDE-Setup
I tried the pure Eclipse instructions, but my build failed with this error:
[ERROR] Failed to execute goal com.savage7.maven.plugins:maven-external-dependency-plugin:0.4:resolve-external (resolve-install-external-dependencies) on project org.openhab.io.multimedia.tts.marytts: Read timed out -> [Help 1]
I use Maven 3.3.1
Thanks in advance for your help.
Actually the cause of the problem is that the hardware target was not successfully set.

Hadoop and Eclipse environment

I am trying to configure Eclipse and Hadoop, following this site.
I successfully ran:
git clone git://git.apache.org/hadoop-common.git
Now I get a failure on:
mvn install -DskipTests
[INFO] Apache Hadoop Main ................................ SUCCESS [6.688s]
[INFO] Apache Hadoop Project POM ......................... FAILURE [1.548s]
...
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /usr/local/hadoop/hadoop-common/hadoop-project/target/antrun/build-main.xml (No such file or directory) -> [Help 1]
The reason is that I don't have a "target" folder in .../hadoop-project. My /hadoop-project folder contains only a "src" folder and that's it.
So it appears that folder was never created. Any ideas? Here is the environment:
Ubuntu 12.1
Hadoop 1.0.4
Maven 3.0.4
The site mentioned in the question is for development of Hadoop itself, not for developing applications that use Hadoop. There is a difference between the two.
Here are the instructions for developing/debugging MapReduce programs in Eclipse. Note that the procedure applies to Linux only.

Eclipse Maven release plugin

I'm trying to generate a release with the goal release:prepare, but when I run it in Eclipse I get the error:
Failed to execute goal
org.apache.maven.plugins:maven-release-plugin:2.0-beta-7:prepare
(default-cli) on project mwframework: Can't run goal clean verify:
Error while executing process. Cannot run program "mvn" (in directory
"/home/gnng/Development/work-7-maven/mwframework"):
java.io.IOException: error=2, No such file or directory -> [Help 1]
I'm using the Eclipse embedded Maven; what am I doing wrong?
Thanks in advance.
The Maven release plugin always forks a child Maven build, so you need to use an external Maven installation (e.g. configure one in Eclipse under Preferences > Maven > Installations).
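Alternatively, with an external Maven installation on your PATH, the release can be run from a terminal (the project path below is the one from the question; release:prepare and release:perform are the standard release-plugin goals), so the forked "clean verify" build can find the mvn executable:

```shell
# Run the release from the project directory with an external Maven,
# so the plugin's forked child build can invoke `mvn` from the PATH.
cd /home/gnng/Development/work-7-maven/mwframework
mvn release:prepare
mvn release:perform
```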