Unable to connect to the Hive Metastore from a Spark application


I am trying to connect to the Hive metastore from a Spark application, but every connection attempt times out and the application crashes:

INFO  metastore:376 - Trying to connect to metastore with URI thrift://hive-metastore:9083
WARN  metastore:444 - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out

The application crashes on the line where I create an external Hive table.
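
For reference, a minimal sketch of the kind of statement that fails; the database, table, and location path here are hypothetical, but any statement that touches the metastore would trigger the Thrift connection above:

// Hypothetical illustration: creating an external table forces Spark's
// Hive client to open the Thrift connection, which then times out.
sparkSession.sql(
  """CREATE EXTERNAL TABLE IF NOT EXISTS my_db.my_table (id BIGINT, name STRING)
    |STORED AS PARQUET
    |LOCATION 'hdfs://hdfs-namenode:8020/data/my_table'""".stripMargin)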

I run the Hive metastore as well as the Spark application (via the Spark on K8s operator) in a Kubernetes cluster. I checked the reachability of the Hive metastore service from outside the cluster with telnet (node IP : service node port) and curled the service from inside the cluster, and it appears to be reachable. What could be the cause of this error?
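
(A rough Scala equivalent of that reachability check, for context; the 5-second timeout is an arbitrary choice. Note that this only proves the port accepts TCP connections; it says nothing about Thrift/Hive protocol compatibility, which is what a read timeout can point at:)

import java.net.{InetSocketAddress, Socket}

// Rough TCP probe: succeeds if the metastore port accepts connections.
val socket = new Socket()
try {
  socket.connect(new InetSocketAddress("hive-metastore", 9083), 5000)
  println("TCP connect to hive-metastore:9083 OK")
} finally {
  socket.close()
}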

Here is the Hive metastore URI configuration in the Spark application:

val sparkSession = SparkSession
  .builder()
  .config(sparkConf)
  // Points Spark's Hive client at the metastore's Thrift endpoint
  .config("hive.metastore.uris", "thrift://hive-metastore:9083")
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .enableHiveSupport()
  .getOrCreate()

The Hive metastore YAML configuration looks like this:

apiVersion: v1
kind: Service
metadata:
  name: hive-metastore-np
spec:
  selector:
    app: hive-metastore
  ports:
    - protocol: TCP
      targetPort: 9083
      port: 9083
      nodePort: 32083
  type: NodePort
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hive-metastore
spec:
  replicas: 1
  selector:
    matchLabels:
      app: hive-metastore
  template:
    metadata:
      labels:
        app: hive-metastore
    spec:
      containers:
        - name: hive-metastore
          image: mozdata/docker-hive-metastore:1.2.1
          imagePullPolicy: Always
          env:
            - name: DB_URI
              value: postgresql
            - name: DB_USER
              value: hive
            - name: DB_PASSWORD
              value: hive-password
            - name: CORE_CONF_fs_defaultFS
              value: hdfs://hdfs-namenode:8020
          ports:
            - containerPort: 9083

Update: when I curl hive-metastore:9083, the service is reachable but returns an empty response, which made me suspect there may be a problem with the hive-metastore K8s definition:

> GET / HTTP/1.1
> User-Agent: curl/7.35.0
> Host: hive-metastore:9083
> Accept: */*
Tags: scala, apache-spark, hive-metastore

1 Answer:

This error occurs when there is a mismatch between the Hive jar versions on the cluster and the Hive jars used by Spark (which are usually tied to the Spark version you are using). You need to determine the version of the Hive jars used in the cluster, add those jars to the Spark image, and then make the SparkSession use these compatible Hive jars by adding the following configuration:

  .conf("spark.sql.hive.metastore.version", "<your hive metastore version>")
  .conf("spark.sql.hive.metastore.version", "<your hive version>")
  .conf("spark.sql.hive.metastore.jars", "<uri of all the correct hive jars>")