Spark-Hive application: SASL negotiation with Kerberos fails on the cluster


I am running into a problem with a Spark-Hive application on a Kerberos-enabled cluster. It fails with a javax.security.sasl.SaslException: GSS initiate failed error, which appears to be caused by "Failed to find any Kerberos tgt".

Here is the error log:

23/08/04 22:56:55 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
23/08/04 22:56:55 INFO HiveClientImpl: Attempting to login to Kerberos using principal: [email protected] and keytab: hdfs.keytab-2ca1f730-bef7-4166-90ce-67317c75c793
23/08/04 22:56:55 INFO UserGroupInformation: Login successful for user [email protected] using keytab file hdfs.keytab-2ca1f730-bef7-4166-90ce-67317c75c793
23/08/04 22:56:55 INFO metastore: Trying to connect to metastore with URI thrift://master3.abc.xyz.com:9083
23/08/04 22:56:55 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apac...
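
The "Failed to find any Kerberos tgt" message usually means that the JVM opening the SASL connection to the metastore has no ticket-granting ticket in its login subject. One way to rule out the keytab itself is to check it on the submitting host with the standard MIT Kerberos tools (the keytab path is the one from my spark-submit command below; <REALM> stands for the actual Kerberos realm):

# List the principals stored in the keytab
klist -kt /etc/security/keytabs/hdfs.keytab

# Obtain a TGT non-interactively, using the exact principal shown above
kinit -kt /etc/security/keytabs/hdfs.keytab hdfs@<REALM>

# The ticket cache should now list krbtgt/<REALM>@<REALM>
klist

If kinit fails here, the problem lies with the keytab or the KDC rather than with Spark.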

I am submitting my Spark job as follows:

spark-submit \
--name TestKerberous \
--num-executors 2 \
--driver-java-options "-Djava.security.auth.login.config=./key_fin.conf" \
--driver-java-options "-Dsun.security.krb5.debug=true" \
--conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./key_fin.conf"\
--files=/etc/spark/conf/hive-site.xml,/etc/hadoop/conf/yarn-site.xml,/etc/hadoop/conf/hdfs-site.xml,/etc/hadoop/conf/core-site.xml \
--conf "spark.hadoop.hive.metastore.kerberos.principal=HTTP/[email protected]" \
--conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./key.conf" \
--conf -Djavax.security.auth.useSubjectCredsOnly=false \
--conf spark.executorEnv.KRB5_CONFIG=/etc/krb5.conf \
--conf spark.driverEnv.KRB5_CONFIG=/etc/krb5.conf \
--conf "spark.hadoop.hive.metastore.sasl.enabled=true" \
--conf "spark.hadoop.hive.security.authorization.enabled=true" \
--conf "spark.hadoop.hive.metastore.execute.setugi=true" \
--conf spark.sql.hive.convertMetastoreParquet=false \
--conf spark.home=/usr/hdp/current/spark2-client \
--conf spark.sql.warehouse.dir=/apps/hive/warehouse \
--conf spark.sql.catalogImplementation=hive \
--conf spark.yarn.keytab=/etc/security/keytabs/hdfs.keytab \
--conf spark.yarn.principal=[email protected] \
--conf spark.serializer=org.apache.spark.serializer.KryoSerializer  \
--master yarn --deploy-mode cluster --driver-cores 2 --driver-memory 2G --executor-cores 2 --executor-memory 2G --supervise \
--class <CLASS_NAME> \
<JAR_FILE>\
"<Hive Jdbc Url>" "thrift://master3.abc.xyz.com:9083" "/apps/hive/warehouse"

I would really appreciate it if someone could help me diagnose what might be going wrong here and how to fix it.

Thanks in advance for any insights.

apache-spark hadoop hive kerberos apache-spark-2.0
1 Answer

I am running into this problem too. Any updates? How did you deal with it?
