HDP-3.1.5.0 / CentOS 7
I configured the connector on the metastore host (I have also done this on the Ambari server host):
# ambari-server setup --jdbc-db=mysql --jdbc-driver=/lib/mysql-connector-java-8.0.23-1.el7.noarch.rpm
Using python /usr/bin/python
Setup ambari-server
Copying /lib/mysql-connector-java-8.0.23-1.el7.noarch.rpm to /var/lib/ambari-server/resources/mysql-connector-java-8.0.23-1.el7.noarch.rpm
Creating symlink /var/lib/ambari-server/resources/mysql-connector-java-8.0.23-1.el7.noarch.rpm to /var/lib/ambari-server/resources/mysql-connector-java.jar
If you are updating existing jdbc driver jar for mysql with mysql-connector-java-8.0.23-1.el7.noarch.rpm. Please remove the old driver jar, from all hosts. Restarting services that need the driver, will automatically copy the new jar to the hosts.
JDBC driver was successfully initialized.
Ambari Server 'setup' completed successfully.
But I still see this when trying to start the metastore:
Underlying cause: java.lang.ClassNotFoundException : com.mysql.jdbc.Driver
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to load driver
You're using an RPM, not a JAR.
From the HDP documentation:
--jdbc-driver
Should be the path to the JDBC driver JAR file
You need to use yum to install the RPM, which will extract the JAR somewhere on disk. You can then symlink the JAR to a defined location such as /usr/lib/mysql-connector.jar and point --jdbc-driver at that path.
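For example, something along these lines (the extracted JAR location below is a guess; check the rpm output for the actual path):

sudo yum install -y /lib/mysql-connector-java-8.0.23-1.el7.noarch.rpm
rpm -ql mysql-connector-java | grep '\.jar$'    # find the extracted JAR, often /usr/share/java/mysql-connector-java.jar
sudo ln -s /usr/share/java/mysql-connector-java.jar /usr/lib/mysql-connector.jar
sudo ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/lib/mysql-connector.jar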
Related
I followed this Keycloak guide to start Keycloak on my server. I get the following exception when running the bin/kc.sh start-dev command:
ERROR: Failed to run 'build' command.
ERROR: java.lang.IllegalArgumentException: /srv/keycloak/lib/lib/main/org.eclipse.microprofile.context-propagation.microprofile-context-propagation-api-1.2.jar does not exist
ERROR: /srv/keycloak/lib/lib/main/org.eclipse.microprofile.context-propagation.microprofile-context-propagation-api-1.2.jar does not exist
ERROR: /srv/keycloak/lib/lib/main/org.eclipse.microprofile.context-propagation.microprofile-context-propagation-api-1.2.jar
For more details run the same command passing the '--verbose' option. Also you can use '--help' to see the details about the usage of the particular command.
The Keycloak version is 18.0.1, the installed JDK version is 11.0.15, and the OS is Debian 11.
Can anyone tell me how to solve it? Thanks
It turned out that some of the JARs were missing after extracting the downloaded Keycloak 18.0.2 tar.gz file on the server. After replacing the JARs under keycloak/lib/lib/main and keycloak/lib/lib/deployment, I was able to start Keycloak. To do that, I extracted the Keycloak 18.0.2 tar.gz on my local machine and uploaded the corresponding JARs to the server machine.
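For example (the user and host are illustrative; /srv/keycloak matches the error messages above):

tar -xzf keycloak-18.0.2.tar.gz
scp keycloak-18.0.2/lib/lib/main/*.jar user@myserver:/srv/keycloak/lib/lib/main/
scp keycloak-18.0.2/lib/lib/deployment/*.jar user@myserver:/srv/keycloak/lib/lib/deployment/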
Step 1: Clone the repository.
git clone https://github.com/apache/atlas
Step 2: Generate the tar file by executing the command below.
mvn clean -DskipTests package -Pdist,embedded-cassandra-solr
Step 3: Once the build succeeds, extract the ‘apache-atlas-3.0.0-SNAPSHOT-server.tar’ file and execute the command below.
.\bin\atlas_start.py
I saw the messages below in the console:
Starting Atlas server on host: localhost
Starting Atlas server on port: 21000
......................
Apache Atlas Server started!!!
But when I hit the URL 'http://localhost:21000/', I get a Service Unavailable message:
HTTP ERROR 503 Service Unavailable
URI: /
STATUS: 503
MESSAGE: Service Unavailable
SERVLET: -
The log files are empty, so I'm not sure how to identify the issue.
A couple of questions:
a. Do I need to explicitly set up Cassandra and Apache Solr for embedded mode too? If so, please point me to the relevant documentation.
b. Even though I generated the build with the embedded Cassandra profile, the application was still looking for the HADOOP_HOME property at launch. What is the reason for this?
I had the same problem and, after a while, found that Zookeeper wasn't starting at all, so I stopped the Zookeeper service and restarted the Atlas installation. (Here is the installation guide I followed: https://manjitsingh664.medium.com/apache-atlas-installation-guide-9098df98d5c3.)
For your case, replace:
mvn clean -DskipTests package -Pdist,embedded-hbase-solr
with:
mvn clean -DskipTests package -Pdist,embedded-cassandra-solr
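To check the Zookeeper situation first, something like this should work (the service name may differ on your system):

sudo systemctl status zookeeper    # see whether an external Zookeeper is running
sudo systemctl stop zookeeper      # stop it so the embedded services can bind their ports
./bin/atlas_stop.py
./bin/atlas_start.py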
I'm facing an issue when running JMeter through Docker. The test works fine in the JMeter GUI, but when I run it from the terminal using Docker, I get this error.
I'm not using Maven; I'm just running the Docker command below.
sudo docker run --mount type=bind,source="/home/user/Downloads/apache-jmeter-5.4.1/bin/",target="/opt/apache-jmeter-5.3/bin" jmeter -n -t bin/Assignment2.jmx -l bin/example-run29.jtl
This is the JTL file result that I'm getting after the run:
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,failureMessage,bytes,sentBytes,grpThreads,allThreads,URL,Latency,IdleTime,Connect
1621688749004,13,JDBC Request,null 0,java.sql.SQLException: Cannot load JDBC driver class 'org.postgresql.Driver',Thread Group 1-1,text,false,,53,0,1,1,null,0,0,13
It looks like you don't have the PostgreSQL JDBC driver in the JMeter classpath. You need to either copy the JAR into JMeter's lib directory (it should be named something like postgresql-xx.x.xx.jar; the latest one is https://repo1.maven.org/maven2/org/postgresql/postgresql/42.2.20/postgresql-42.2.20.jar), or amend your Dockerfile to automatically download the driver and place it in the JMeter classpath, something like:
RUN wget https://jdbc.postgresql.org/download/postgresql-42.2.20.jar
RUN mv postgresql-42.2.20.jar /path/to/your/jmeter/lib
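Alternatively, if you don't want to rebuild the image, you can bind-mount the driver JAR into JMeter's lib directory at run time; a sketch reusing the paths from your command (the in-container path /opt/apache-jmeter-5.3 is taken from your existing --mount target):

sudo docker run \
  --mount type=bind,source="/home/user/Downloads/apache-jmeter-5.4.1/bin/",target="/opt/apache-jmeter-5.3/bin" \
  --mount type=bind,source="/home/user/Downloads/postgresql-42.2.20.jar",target="/opt/apache-jmeter-5.3/lib/postgresql-42.2.20.jar" \
  jmeter -n -t bin/Assignment2.jmx -l bin/example-run29.jtl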
More information: How to use Different JDBC Drivers
I have a problem setting up some critical paths when I run Wildfly 20 as a service.
When I install WildFly (on "VM1") in /home/myuser/ instead of /opt, NOT as a service, and run it with the following, I am able to use the Admin console's "Test Connection" to connect to a Sybase SQL Anywhere database using the sajdbc4 driver:
cd ~/wildfly-20.0.1.Final/bin
export LD_LIBRARY_PATH=/home/myuser/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main
export CLASSPATH=.:/home/myuser/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main/sajdbc4.jar
./standalone.sh
LD_LIBRARY_PATH sets the path to the driver support files.
On the other hand, when I install WildFly (on "VM2") exactly the same way as before, except installing into /opt and adding the extra steps below to run WildFly as a service, the Admin console's "Test Connection" fails with:
cd ~/wildfly-20.0.1.Final/bin
export LD_LIBRARY_PATH=/opt/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main
export CLASSPATH=.:/opt/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main/sajdbc4.jar
sudo systemctl start wildfly
2020-08-28 13:13:41,341 INFO [org.jboss.as.controller] (Controller Boot Thread) WFLYCTL0183:
Service status report WFLYCTL0184: New missing/unsatisfied dependencies: service jboss.jdbc-driver.sajdbc4_jar (missing) dependents: [service jboss.driver-demander.java:jboss/datasources/TestDB, service org.wildfly.data-source.TestDB]
I can run a simple Java test app on the "VM2" system that connects and dumps a database table with:
cd $HOME/Desktop
export LD_LIBRARY_PATH=/opt/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main
export CLASSPATH=.:/opt/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main/sajdbc4.jar
java sajdbc4DriverTest.java
This suggests to me that all of the driver files are present at the LD_LIBRARY_PATH location. Note that the launch of WildFly as a service uses the same paths.
Can anyone explain why Wildfly is ignoring the two paths I set prior to starting the service?
Thank you in advance.
Service environment variables are not set this way, and even if they were, sudo switches to a new user with a new environment.
Instead, if you installed WildFly as documented in wildfly-20.0.1.Final/docs/contrib/scripts/systemd, add your environment variables to /etc/wildfly/wildfly.conf, something like:
# The configuration you want to run
WILDFLY_CONFIG=standalone.xml
# The mode you want to run
WILDFLY_MODE=standalone
# The address to bind to
WILDFLY_BIND=0.0.0.0
# Add Sybase native library dir
LD_LIBRARY_PATH=/opt/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main
I don't feel that you need to set CLASSPATH but I don't think it'll hurt either.
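If you would rather not edit /etc/wildfly/wildfly.conf, a plain systemd drop-in achieves the same thing; a sketch:

sudo systemctl edit wildfly
# then add the following in the override file that opens:
[Service]
Environment=LD_LIBRARY_PATH=/opt/wildfly-20.0.1.Final/modules/system/layers/base/com/sybase/main

sudo systemctl restart wildfly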
I'm trying to configure Spark in my local IDE and my local Conda Jupyter environment to use our corporate Spark/Hive connection, whose specs look similar to this:
host: mycompany.com
port: 10003
I tried to configure spark-defaults.conf:
spark.master spark://mycompany.com:10003
And when I try to call the Spark context (sc), I get the following error in Jupyter:
Exception: Java gateway process exited before sending the driver its port number
Does anyone know of good documentation that I can use to configure my local instance of Jupyter and/or NetBeans to use Spark with Scala or Python?
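For what it's worth, the "Java gateway process exited" error usually means PySpark cannot find a local Java and Spark installation before it even contacts the remote master, so environment settings along these lines are typically needed first (the paths are examples, not from my actual setup):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SPARK_HOME=/opt/spark
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS=notebook
$SPARK_HOME/bin/pyspark --master spark://mycompany.com:10003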