Exception when running Spark with Hive support: Unable to instantiate SparkSession with Hive support because Hive classes are not found

Problem description

Hi, I am trying to use Hive with Spark, but when I try to run the program it shows this error:

Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.

Here is my source code:

package com.spark.hiveconnect

import java.io.File

import org.apache.spark.sql.{Row, SaveMode, SparkSession}

object sourceToHIve {
  case class Record(key: Int, value: String)
  def main(args: Array[String]){
    val warehouseLocation = new File("spark-warehouse").getAbsolutePath

    val spark = SparkSession
      .builder()
      .appName("Spark Hive Example")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._
    import spark.sql

    sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING) USING hive")
    sql("LOAD DATA LOCAL INPATH '/usr/local/spark3/examples/src/main/resources/kv1.txt' INTO TABLE src")
    sql("SELECT * FROM src").show()

    spark.close()
  }
}

And here is my build.sbt file:

name := "SparkHive"

version := "0.1"

scalaVersion := "2.12.10"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"

I am also running Hive in a console. Can anyone help me with this? Thanks.

scala apache-spark hive
1 Answer

Try adding:

libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.5"