I am trying to set up Kerberos on the Hortonworks Hadoop platform on RHEL 6.x, and while setting up permissions I come across the following error:
chown ambari:ambari /etc/security/keytabs/ambari.keytab
chown: invalid user: `ambari:ambari'
How should I eliminate this error?
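The "invalid user" message typically means the host cannot resolve an ambari user or group at all, so they likely do not exist on this machine yet. A minimal check-and-create sketch (the account names come from the chown command above; adjust to however your Ambari setup provisions its service account):
getent passwd ambari    # does the user exist?
getent group ambari     # does the group exist?
sudo groupadd ambari    # create them if missing
sudo useradd -r -g ambari ambari
sudo chown ambari:ambari /etc/security/keytabs/ambari.keytab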
I am trying to run Schema Registry on EC2.
My Kafka is running on AWS (MSK).
This is my properties file:
listeners=http://0.0.0.0:8081
kafkastore.connection.url=z-3.***:2181,z-***:2181,z-**:2181
kafkastore.bootstrap.servers=PLAINTEXT://b-3.**:9092,PLAINTEXT://b-6.**:9092,PLAINTEXT://b-1.**:9092
kafkastore.topic=_schemas
debug=false
schema-registry-start /etc/schema-registry/schema-registry.properties &
When I run this, I get the error below:
kafka.common.KafkaException: Failed to parse the broker info from zookeeper: {"listener_security_protocol_map":{"CLIENT":"PLAINTEXT","CLIENT_SECURE":"SSL","REPLICATION":"PLAINTEXT","REPLICATION_SECURE":"SSL"},"endpoints"
Caused by: java.lang.IllegalArgumentException: No enum constant org.apache.kafka.common.protocol.SecurityProtocol.CLIENT
I have changed it to TLS and PLAINTEXT, and tried without both, but all of them throw an error.
I have connectivity from EC2 to MSK as well.
Apache Kafka version: 2.2.1
Confluent was installed via:
sudo rpm --import http://packages.confluent.io/deb/3.1/archive.key
Even if I don't mention the broker URL, I still get the same error.
Update based on the answer:
When no connection URL is mentioned:
[ec2-user@ip-10-97-54-99 ~]$ [2020-01-11 03:46:29,418] ERROR Server died unexpectedly: (io.confluent.kafka.schemaregistry.rest.SchemaRegistryMain:51)
io.confluent.common.config.ConfigException: Missing required configuration "kafkastore.connection.url" which has no default value.
at io.confluent.common.config.ConfigDef.parse(ConfigDef.java:241)
at io.confluent.common.config.AbstractConfig.<init>(AbstractConfig.java:76)
at io.confluent.rest.RestConfig.<init>(RestConfig.java:299)
at io.confluent.kafka.schemaregistry.rest.SchemaRegistryConfig.<init>(SchemaRegistryConfig.java:358)
at io.confluent.kafka.schemaregistry.rest.SchemaRegistryConfig.<init>(SchemaRegistryConfig.java:354)
at io.confluent.kafka.schemaregistry.rest.SchemaRegistryMain.main(SchemaRegistryMain.java:41)
Mentioning CLIENT:// or CLIENT_SECURE:// also throws the same error.
And MSK does provide plaintext as well; I can see this in the client information.
I think the issue is with your version.
I also faced the same issue; after that I installed it manually and it worked for me.
These are my exact installation steps and the Schema Registry start step:
sudo yum install java-1.8.0
curl -O http://packages.confluent.io/archive/5.3/confluent-5.3.2-2.12.tar.gz
tar xzf confluent-5.3.2-2.12.tar.gz
cd confluent-5.3.2/etc/schema-registry/
/home/ec2-user/confluent-5.3.2/bin/schema-registry-start /home/ec2-user/confluent-5.3.2/etc/schema-registry/schema-registry.properties
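Once the registry process is up, a quick sanity check is to hit its REST API; /subjects is a standard Schema Registry endpoint, and 8081 is the default listener port (adjust if your properties use a different one):
curl http://localhost:8081/subjects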
If you configure kafkastore.bootstrap.servers, then you need to remove the Zookeeper connection string from the Schema Registry.
Kafka-based primary election is chosen when kafkastore.connection.url is not configured and the Kafka bootstrap brokers kafkastore.bootstrap.servers are specified.
Try just removing that first.
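As a minimal sketch, the properties file from the question with the Zookeeper line removed would look like this (the masked broker hostnames are placeholders carried over from above):
listeners=http://0.0.0.0:8081
kafkastore.bootstrap.servers=PLAINTEXT://b-3.**:9092,PLAINTEXT://b-6.**:9092,PLAINTEXT://b-1.**:9092
kafkastore.topic=_schemas
debug=false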
Also relevant - https://github.com/confluentinc/schema-registry/issues?utf8=%E2%9C%93&q=is%3Aissue+MSK+
It's not clear which version of the Registry you've tried to install, but MSK does not have any PLAINTEXT client connection string, as shown by the listener_security_protocol_map.
You would need to specify a different connection such as CLIENT:// or CLIENT_SECURE:// for a valid listener protocol, assuming PLAINTEXT still doesn't work.
I configured /etc/mongod.conf to enforce keyfile access control: in the security section it is enabled, and keyFile is /root/dbtest.key (the absolute path of the keyfile). I already gave ownership to the mongodb user with chown and granted 400 permissions on the dbtest.key file.
But mongod keeps failing to start. After checking the log, the error is: Error reading file /root/dbtest.key: Permission denied. I checked the ownership and permissions on dbtest.key,
which means I already granted them correctly. So I don't know at which step I went wrong.
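For reference, the setup described above corresponds roughly to this configuration and these commands (the path and the mongodb user name are taken from the question; your distribution's service account may be named mongod instead):
security:
  keyFile: /root/dbtest.key
sudo chown mongodb /root/dbtest.key
sudo chmod 400 /root/dbtest.key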
I am trying to run Kafka on 3 machines as a cluster. I have configured Zookeeper on all the machines. Now I am trying to start the Kafka server on the 1st machine using
bin/kafka-server-start.sh config/server.properties
It is giving an error:
bin/kafka-server-start.sh: line 44: /tmp/kafka/kafka_2.11-1.1.0/bin/kafka-run-class.sh: Permission denied
bin/kafka-server-start.sh: line 44: exec: /tmp/kafka/kafka_2.11-1.1.0/bin/kafka-run-class.sh: cannot execute: Permission denied
The Kafka installation is in the path /tmp/kafka/kafka_2.11-1.1.0/, and the Kafka logs are in the path /var/lib/kafka.
I have logged in as the root user and am still getting these errors. I checked the permissions of the .sh files in the bin directory of the Kafka installation; all of them have execute permission for everyone. Please help me solve this error.
Below are the links I used to configure Zookeeper and Kafka:
http://armourbear.blogspot.com/2015/03/setting-up-multinode-kafka-cluster.html
http://www.techburps.com/misc/multi-broker-apache-kafka-cluster-setup/64
Thanks in advance
It looks like a filesystem permissions problem. Make sure that /tmp is not mounted with the noexec option, or just try to set up Kafka in another directory.
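A quick way to check (findmnt and mount are standard util-linux tools; the remount is only a temporary workaround, and a change in /etc/fstab or moving the installation is the durable fix):
findmnt -no OPTIONS /tmp            # look for noexec in the output
sudo mount -o remount,exec /tmp     # temporarily allow execution from /tmp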
I have set up Active Directory with Kerberos authentication on Windows Server 2012 R2 and set up the MongoDB server on a second machine. I started MongoDB with GSSAPI authentication. Now, if I try to connect to MongoDB using the following command
mongo.exe --host Mongo32Test.ihubtest.com.com --authenticationMechanism=GSSAPI --authenticationDatabase=$external -u mongoService@ihubtest.com --verbose
I am getting the following message.
Error: SASL(-1): generic failure: SSPI: InitializeSecurityContext: The specified target is unknown or unreachable
I have installed wireshark and the packet contains this message
"KRB5 167 KRB Error: KRB5KDC_ERR_S_PRINCIPAL_UNKNOWN"
Searching around, I figured that it is related to the service principal name.
mongoService@ihubtest.com is a domain user and is part of the $external database in MongoDB.
I verified the service principal name; it looks fine.
C:\>setspn -l mongoService
Registered ServicePrincipalNames for CN=mongo Service,CN=Users,DC=ihubtest,DC=com:
mongodb/Mongo32test.ihubtest.com@IHUBTEST.COM
I tried the troubleshooting steps mentioned on this page: https://docs.mongodb.com/manual/tutorial/troubleshoot-kerberos/. Am I missing something in the Active Directory configuration?
If you have not yet looked into it, the MongoDB team has a closed ticket with some steps:
https://jira.mongodb.org/browse/SERVER-13885
I believe you misquoted your hostname as "Mongo32Test.ihubtest.com.com" instead of "Mongo32Test.ihubtest.com".
Please verify whether the provided hostname is correct.
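For comparison, the connection command with the single .com hostname, everything else unchanged from the question:
mongo.exe --host Mongo32Test.ihubtest.com --authenticationMechanism=GSSAPI --authenticationDatabase=$external -u mongoService@ihubtest.com --verbose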
When I try to query the running brokers, it gives the following error:
Connecting to Progress AdminServer using rmi://localhost:20931/Chimera (8280)
Login denied, check username and password.
Unable to find broker (8281)
This issue could be because the timing of the operating system and the Java process of the AdminServer are not in sync.
So I added -XX:+UseGetTimeOfDay to the AdminServer properties and restarted the server; however, the issue was not resolved.
After that, I added:
JVMARGS="-Xmx256m -Xms128m -XX:+UseGetTimeOfDay ${JVMARGS}"
in $DLC/bin/java_env and restarted the server again, but the issue still persists. In the AdminServer logs, it is auto-generating the password.
Please provide a solution.