How to fix Kerberos authentication prompt while running a Java application

I wrote a Java application that connects to the Hive metastore. It works fine, but when I run the jar file on Linux it prompts for a Kerberos username and password, even though I already specify the Kerberos principal and keytab file in my code. I don't want to use an additional jaas.conf file. Is there any way to resolve this problem?
Here is my source code:
HiveConf conf = new HiveConf();
MetastoreConf.setVar(conf, ConfVars.THRIFT_URIS, "thrift://HOSTNAME:9083");
MetastoreConf.setBoolVar(conf, ConfVars.USE_THRIFT_SASL, true);
MetastoreConf.setVar(conf, ConfVars.KERBEROS_PRINCIPAL, "hive/HOSTNAME@EXAMPLE.COM");
MetastoreConf.setVar(conf, ConfVars.KERBEROS_KEYTAB_FILE, "hive.keytab");
System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");
System.setProperty("java.security.krb5.kdc", "HOSTNAME");
System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
client.close();
Expected result:
It should verify the connection successfully.
Actual result:
java -jar Application
Kerberos username [root]:
Kerberos password [root]:
I want to bypass this Kerberos terminal prompt. Is there any way to do that?

final String user = "hive/HOSTNAME@EXAMPLE.COM";
final String keyPath = "hive.keytab";
Configuration conf = new Configuration();
System.out.println("In case of kerberos authentication");
conf.set("hadoop.security.authentication", "kerberos");
System.setProperty("java.security.krb5.kdc", "HOSTNAME");
System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");
UserGroupInformation.setConfiguration(conf);
UserGroupInformation.loginUserFromKeytab(user, keyPath);
HiveConf conf1 = new HiveConf();
MetastoreConf.setVar(conf1, ConfVars.THRIFT_URIS, "thrift://HOSTNAME:9083");
HiveMetaStoreClient client = new HiveMetaStoreClient(conf1);
client.close();
It works fine now: logging in through UserGroupInformation.loginUserFromKeytab establishes the Kerberos credentials up front, so the console prompt no longer appears.

Related

loginUserFromKeytab for UserGroupInformation accepts a path for the keytab file; works locally but not when bundled as a JAR

I have all 4 files needed to read from/write to HDFS in my resources folder, and my method to create the HDFS object is as below.
public static FileSystem getHdfsOnPrem(String coreSiteXml, String hdfsSiteXml, String krb5confLoc, String keyTabLoc) {
    // Set up the configuration object.
    try {
        Configuration config = new Configuration();
        config.addResource(new org.apache.hadoop.fs.Path(coreSiteXml));
        config.addResource(new org.apache.hadoop.fs.Path(hdfsSiteXml));
        config.set("hadoop.security.authentication", "Kerberos");
        config.addResource(krb5confLoc);
        config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        config.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
        System.setProperty("java.security.krb5.conf", krb5confLoc);
        org.apache.hadoop.security.HadoopKerberosName.setConfiguration(config);
        UserGroupInformation.setConfiguration(config);
        UserGroupInformation.loginUserFromKeytab("my_username", keyTabLoc);
        return org.apache.hadoop.fs.FileSystem.get(config);
    } catch (Exception ex) {
        ex.printStackTrace();
        return null;
    }
}
It works when I run it locally and pass the below paths:
C:\Users\my_username\IdeaProjects\my_project_name\target\scala-2.12\classes\core-site.xml
C:\Users\my_username\IdeaProjects\my_project_name\target\scala-2.12\classes\hdfs-site.xml
C:\Users\my_username\IdeaProjects\my_project_name\target\scala-2.12\classes\krb5.conf
C:\Users\my_username\IdeaProjects\my_project_name\target\scala-2.12\classes\my_username.user.keytab
It runs fine locally, but when I bundle it as a JAR and run it in an environment like Kubernetes it throws the error below. (Since bundling it as a JAR, I can read the contents of the resource files as a stream, but I need to pass a path to the loginUserFromKeytab method.)
org.apache.hadoop.security.KerberosAuthException: failure to login: for principal: my_username from keytab file:/opt/spark-3.0.0/jars/foo-project-name!/my_username.user.keytab javax.security.auth.login.LoginException: Unable to obtain password from user
Any suggestions/pointers are appreciated.
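One common workaround (a sketch under assumptions, not from the original thread): copy the bundled keytab out of the JAR into a temporary file and pass that path to loginUserFromKeytab. The class and resource names below are hypothetical.
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class KeytabExtractor {
    // Copies a classpath resource to a temp file so that APIs requiring a
    // real filesystem path (such as loginUserFromKeytab) can consume it.
    public static String extractToTempFile(String resourceName) throws Exception {
        try (InputStream in = KeytabExtractor.class.getClassLoader()
                .getResourceAsStream(resourceName)) {
            if (in == null) {
                throw new IllegalArgumentException("Resource not found: " + resourceName);
            }
            Path tmp = Files.createTempFile("keytab-", ".tmp");
            tmp.toFile().deleteOnExit();
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            return tmp.toString();
        }
    }
}
You would then call UserGroupInformation.loginUserFromKeytab("my_username", KeytabExtractor.extractToTempFile("my_username.user.keytab")); the same trick works for the krb5.conf path before setting java.security.krb5.conf.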
I suggest you use a JAAS config file instead of writing this code. This removes the security plumbing from your code and externalizes it. "Unable to obtain password" would occur if the user running your app doesn't have permission to access the keytab file.
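For illustration, a minimal jaas.conf sketch assuming a keytab-based login (the paths and principal are placeholders; the entry name your app looks up depends on how it invokes JAAS — com.sun.security.jgss.krb5.initiate is what the JDK GSS layer uses when javax.security.auth.useSubjectCredsOnly=false):
com.sun.security.jgss.krb5.initiate {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/path/to/my_username.user.keytab"
    principal="my_username@EXAMPLE.COM"
    storeKey=true
    doNotPrompt=true;
};
Point the JVM at it with -Djava.security.auth.login.config=/path/to/jaas.conf.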

How to access a confluent schema registry server secured with a password using Spring cloud stream?

I'm using Spring Cloud Stream alongside Aiven's schema registry, which uses Confluent's schema registry. Aiven's schema registry is secured with a password. Based on these instructions, these two config parameters need to be set to successfully access the schema registry server:
props.put("basic.auth.credentials.source", "USER_INFO");
props.put("basic.auth.user.info", "avnadmin:schema-reg-password");
Everything is fine when I only use the vanilla Java Kafka clients, but if I use Spring Cloud Stream, I don't know how to inject these two parameters. At the moment, I'm putting "basic.auth.user.info" and "basic.auth.credentials.source" under "spring.cloud.stream.kafka.binder.configuration" in the application.yml file.
Doing this, I get "401 Unauthorized" on the line where the schema is registered.
Update 1:
Based on Ali n's suggestion, I updated the way the SchemaRegistryClient bean was configured so that it becomes aware of the SSL context.
@Bean
public SchemaRegistryClient schemaRegistryClient(
        @Value("${spring.cloud.stream.schemaRegistryClient.endpoint}") String endpoint) {
    try {
        final KeyStore keyStore = KeyStore.getInstance("PKCS12");
        keyStore.load(new FileInputStream(new File("path/to/client.keystore.p12")),
                "secret".toCharArray());
        final KeyStore trustStore = KeyStore.getInstance("JKS");
        trustStore.load(new FileInputStream(new File("path/to/client.truststore.jks")),
                "secret".toCharArray());
        TrustStrategy acceptingTrustStrategy = (X509Certificate[] chain, String authType) -> true;
        SSLContext sslContext = SSLContextBuilder
                .create()
                .loadKeyMaterial(keyStore, "secret".toCharArray())
                .loadTrustMaterial(trustStore, acceptingTrustStrategy)
                .build();
        HttpClient httpClient = HttpClients.custom().setSSLContext(sslContext).build();
        ClientHttpRequestFactory requestFactory = new HttpComponentsClientHttpRequestFactory(httpClient);
        ConfluentSchemaRegistryClient schemaRegistryClient = new ConfluentSchemaRegistryClient(
                new RestTemplate(requestFactory));
        schemaRegistryClient.setEndpoint(endpoint);
        return schemaRegistryClient;
    } catch (Exception ex) {
        ex.printStackTrace();
        return null;
    }
}
This helped get rid of the error on the app's startup and registered the schema. However, whenever the app wanted to push a message to Kafka, a new error was thrown. This was finally fixed by mmelsen's answer.
I ran into the same problem; my situation was connecting to a secured schema registry hosted by Aiven and secured by basic auth. To make it work I had to configure the following properties:
spring.kafka.properties.schema.registry.url=https://***.aiven***.com:port
spring.kafka.properties.basic.auth.credentials.source=USER_INFO
spring.kafka.properties.basic.auth.user.info=username:password
The other properties for my binder are:
spring.cloud.stream.binders.input.type=kafka
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.brokers=https://***.aiven***.com:port <-- a different port from the one mentioned above
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.security.protocol=SSL
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.truststore.location=truststore.jks
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.truststore.password=secret
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.keystore.type=PKCS12
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.keystore.location=clientkeystore.p12
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.keystore.password=secret
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.key.password=secret
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.streams.binder.autoCreateTopics=false
What actually happens is that Spring Cloud Stream adds the spring.kafka.properties.basic.* entries to the DefaultKafkaConsumerFactory, which passes the config on to the KafkaConsumer. At some point during the initialization of Spring Kafka, a CachedSchemaRegistryClient is created and provisioned with these properties. This client contains a method called configureRestService that checks whether the map of properties contains "basic.auth.credentials.source". Since we provide it through spring.kafka.properties, it finds the property and takes care of creating the appropriate headers when accessing the schema registry's endpoint.
Hope this works out for you as well.
I'm using Spring Cloud version Greenwich.SR1, spring-boot-starter 2.1.4.RELEASE, Avro 1.8.2, and confluent.version 5.2.1.
The binder configuration only handles well-known consumer and producer properties.
You can set arbitrary properties at the binding level:
spring.cloud.stream.kafka.bindings.<binding>.consumer.configuration.basic.auth...
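For example, with a hypothetical binding named input, the two Aiven properties from above would look like this:
spring.cloud.stream.kafka.bindings.input.consumer.configuration.basic.auth.credentials.source=USER_INFO
spring.cloud.stream.kafka.bindings.input.consumer.configuration.basic.auth.user.info=avnadmin:schema-reg-password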
Since Aiven uses SSL for the Kafka security protocol, certificates are required for authentication.
You can follow this page to understand how it works. In a nutshell, you need to run the following commands to generate the certificates and import them:
openssl pkcs12 -export -inkey service.key -in service.cert -out client.keystore.p12 -name service_key
keytool -import -file ca.pem -alias CA -keystore client.truststore.jks
Then you can use the following properties to make use of the certificates:
spring.cloud.stream.kafka.streams.binder:
  configuration:
    security.protocol: SSL
    ssl.truststore.location: client.truststore.jks
    ssl.truststore.password: secret
    ssl.keystore.type: PKCS12
    ssl.keystore.location: client.keystore.p12
    ssl.keystore.password: secret
    ssl.key.password: secret
    key.serializer: org.apache.kafka.common.serialization.StringSerializer
    value.serializer: org.apache.kafka.common.serialization.StringSerializer

Kerberos Exception launching Spark locally

I am trying to set up a Spark TestNG unit test:
@Test
def testStuff(): Unit = {
  val sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local"))
  ...
}
The code fails with: IllegalArgumentException: Can't get Kerberos realm
What am I missing?
The error suggests that your JVM is unable to locate the Kerberos config (krb5.conf file).
Depending on your company's environment/infrastructure you have a few options:
Check whether your company has a standard library for setting up Kerberos authentication.
Alternatively, try one of the following (see the sketch after this list):
Set the JVM property -Djava.security.krb5.conf=/file-path/for/krb5.conf
Put the krb5.conf file into the <jdk-home>/jre/lib/security folder
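If you cannot pass the flag on the command line, the property can also be set programmatically before the first Kerberos call; a minimal Java sketch, assuming /etc/krb5.conf as the file location:
public class KerberosConfSetup {
    public static void main(String[] args) {
        // Must be set before Spark/Hadoop first touches Kerberos,
        // otherwise the realm lookup fails with the same error.
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        // ... construct the SparkContext after this point.
    }
}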

Spray HTTPS Inbound closed before receiving peer's close_notify

Currently I am implementing a web service in Scala using Spray. I am looking to use SSL to secure my requests, but I am having difficulty configuring the SSL. When an HTTPS call is initiated, an error related to the handshake is thrown:
fatal error: 80: Inbound closed before receiving peer's close_notify: possible truncation attack?
javax.net.ssl.SSLException: Inbound closed before receiving peer's close_notify: possible truncation attack?
I created a cert using this:
keytool -genkey -keyalg RSA -alias mykey -dname "CN=dev.site.com,OU=app" -keystore keystore.jks -storepass pas -validity 365
I created an SSL trait like this:
trait SSLConfiguration {
  implicit def sslContext: SSLContext = {
    val keystore = "keystore.jks"
    val password = "pas"
    val keyStore = KeyStore.getInstance("JKS")
    val in = getClass.getClassLoader.getResourceAsStream(keystore)
    require(in != null, "Bad java key storage file: " + keystore)
    keyStore.load(in, password.toCharArray)
    val keyManagerFactory = KeyManagerFactory.getInstance("SunX509")
    keyManagerFactory.init(keyStore, password.toCharArray)
    val trustManagerFactory = TrustManagerFactory.getInstance("SunX509")
    trustManagerFactory.init(keyStore)
    val context = SSLContext.getInstance("TLS")
    context.init(keyManagerFactory.getKeyManagers, trustManagerFactory.getTrustManagers, new SecureRandom)
    context
  }

  implicit def sslEngineProvider: ServerSSLEngineProvider = {
    ServerSSLEngineProvider { engine =>
      engine.setEnabledCipherSuites(Array("TLS_RSA_WITH_AES_256_CBC_SHA"))
      engine.setEnabledProtocols(Array("SSLv3", "TLSv1"))
      engine
    }
  }
}
I set up my Boot object to use the trait:
object Boot extends App with SSLConfiguration {
  // Bind to the IO interface and set the SSL engine provider.
  IO(Http) ! Http.Bind(service, interface = interface, port = port)(sslEngineProvider)
}
Do you enable SSL in the configuration file?
spray.can {
  server {
    ssl-encryption = on
  }
}
I tried your code and slightly changed it, and it worked on my laptop.
I deleted the whole implicit def sslEngineProvider and used the default one:
IO(Http) ! Http.Bind(service, interface = "0.0.0.0", port = 8080)
Also, did you put your keystore file in the project's resources folder (project/src/main/resources)?
The issue could be related to the Java version installed.
I faced this issue when I installed the OpenJDK version of Java on a Linux machine; when I changed to the Oracle JDK, the issue disappeared.
The exact application that threw this exception was Information Workbench (a fluid ops product), and the Java version was 8.
Which Java version to use wasn't mentioned in the system prerequisites by the fluid ops people.

How do I get a token needed for DFS Kerberos authentication?

I'm trying to write a client for consuming DFS (Documentum Foundation Services) and trying to use Kerberos for single sign-on. Both the Java and C# sample code (productivity layer) in the documentation give the following line, which gets the Kerberos binary token:
byte[] ticket = ...
I'm not sure how to actually get the binary token, and the "..." doesn't help me. Does anyone know how to get an actual ticket (Kerberos token) using either Java or C#?
Here are the examples given for both Java and C#:
Java: Invoking a service with Kerberos authentication
KerberosTokenHandler handler = new KerberosTokenHandler();
IObjectService service = ServiceFactory
.getInstance().getRemoteService(..., contextRoot, Arrays.asList((Handler) handler));
byte[] ticket = ...;
handler.setBinarySecurityToken(
new KerberosBinarySecurityToken(ticket, KerberosValueType.KERBEROSV5_AP_REQ));
service.create(...)
C#: Invoking a service with Kerberos authentication
KerberosTokenHandler handler = new KerberosTokenHandler();
List<IEndpointBehavior> handlers = new List<IEndpointBehavior>();
handlers.Add(handler);
IObjectService service = ServiceFactory
.Instance.GetRemoteService<IObjectService>(..., contextRoot, handlers);
byte[] ticket = ...;
handler.SetBinarySecurityToken(
new KerberosBinarySecurityToken(ticket, KerberosValueType.GSS_KERBEROSV5_AP_REQ));
service.create(...);
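For the Java side (not covered by the answer below), a GSS-API sketch can produce the AP_REQ token, assuming the JVM already holds Kerberos credentials (a ticket cache or a JAAS keytab login) and using a hypothetical SPN of DFS/example66:
import org.ietf.jgss.GSSContext;
import org.ietf.jgss.GSSManager;
import org.ietf.jgss.GSSName;
import org.ietf.jgss.Oid;

public class KerberosTicketFetcher {
    public static byte[] getTicket(String servicePrincipalName) throws Exception {
        GSSManager manager = GSSManager.getInstance();
        // Kerberos V5 mechanism OID.
        Oid krb5Mechanism = new Oid("1.2.840.113554.1.2.2");
        GSSName serverName = manager.createName(servicePrincipalName, GSSName.NT_USER_NAME);
        GSSContext context = manager.createContext(serverName, krb5Mechanism, null,
                GSSContext.DEFAULT_LIFETIME);
        context.requestCredDeleg(true); // allow delegation, mirroring the .NET answer below
        // The first token produced by initSecContext is the AP_REQ to pass to the handler.
        byte[] ticket = context.initSecContext(new byte[0], 0, 0);
        context.dispose();
        return ticket;
    }
}
The result would be handed to the handler as in the sample above: handler.setBinarySecurityToken(new KerberosBinarySecurityToken(ticket, KerberosValueType.KERBEROSV5_AP_REQ)).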
I just figured this out for .NET and would like to share for those who may be interested. What's needed is the WSE3 library. Make sure to configure your DFS service account for Kerberos delegation.
What you need to do is set your KerberosTokenHandler with the Kerberos token. The KerberosBinarySecurityToken comes from WSE3. The code looks something like this:
KerberosTokenHandler kerberosTokenHandler = new KerberosTokenHandler();
String servicePrincipalName = "DFS/example66"; // the service principal name for your DFS service account in Active Directory
using (KerberosClientContext kerberosClientContext = new KerberosClientContext(servicePrincipalName, true, ImpersonationLevel.Delegation))
{
    KerberosBinarySecurityToken token = new KerberosBinarySecurityToken(kerberosClientContext.InitializeContext(), KerberosValueType.KERBEROSV5_AP_REQ);
    kerberosTokenHandler.SetBinarySecurityToken(token);
}