h2o Scala code compile error: not found: object ai

Question · Votes: 2 · Answers: 1

I am trying to compile and run a simple h2o Scala program, but when I run `sbt package` I get errors. What am I missing in my sbt file?

Here is my h2o Scala code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._

import ai.h2o.automl.AutoML
import ai.h2o.automl.AutoMLBuildSpec

import org.apache.spark.h2o._

object H2oScalaEg1 {

  def main(args: Array[String]): Unit = {

  val sparkConf1 = new SparkConf().setMaster("local[2]").setAppName("H2oScalaEg1App")

  val sparkSession1 = SparkSession.builder.config(conf = sparkConf1).getOrCreate()

  val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)

  import h2oContext._

  import java.io.File

  import h2oContext.implicits._

  import water.Key

  }

}

Here is my sbt file:

name := "H2oScalaEg1Name"

version := "1.0"

scalaVersion := "2.11.12"

scalaSource in Compile := baseDirectory.value / ""

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"

libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" % "runtime" pomOnly()

When I run `sbt package`, I get these errors:

[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:7:8: not found: object ai
[error] import ai.h2o.automl.AutoML
[error]        ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:8:8: not found: object ai
[error] import ai.h2o.automl.AutoMLBuildSpec
[error]        ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:10:25: object h2o is not a member of package org.apache.spark
[error] import org.apache.spark.h2o._
[error]                         ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:20:20: not found: value H2OContext
[error]   val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)
[error]                    ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:28:10: not found: value water
[error]   import water.Key
[error]          ^
[error] 5 errors found

How can I fix this?

My Spark version is spark-2.2.3-bin-hadoop2.7.

Thanks,

Marrel

scala apache-spark h2o
1 Answer

Score: 0

`pomOnly()` in `build.sbt` tells the dependency-management handlers that the jar files/artifacts for this dependency should not be loaded, and that only the metadata should be looked up.

Try using `libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3"` instead.

Edit 1: Also, I think you are missing (at least) one library dependency: `libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"`

See: https://search.maven.org/artifact/ai.h2o/h2o-automl/3.22.1.5/pom

Edit 2: The last dependency you are missing is sparkling-water-core: `libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6"` should do the trick.

Here is the GitHub source for sparkling-water/core/src/main/scala/org/apache/spark/h2o.
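Putting the three fixes together, a consolidated `build.sbt` might look like the sketch below. This is an assumption based on the coordinates quoted in the answer, not a verified build; in particular, Sparkling Water releases track Spark minor versions, so a `2.4.x` Sparkling Water artifact targets Spark 2.4, and with spark-2.2.3 you would likely want a `2.2.x` Sparkling Water release instead.

```scala
// build.sbt — consolidated sketch combining the fixes above (versions are assumptions)
name := "H2oScalaEg1Name"

version := "1.0"

scalaVersion := "2.11.12"

scalaSource in Compile := baseDirectory.value / ""

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"

// Fix 1: drop pomOnly() so the actual h2o-core jar is fetched, not just its metadata
libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3"

// Fix 2: h2o-automl provides ai.h2o.automl.AutoML and AutoMLBuildSpec
libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"

// Fix 3: sparkling-water-core provides org.apache.spark.h2o._, H2OContext, and water.Key
// (pick the Sparkling Water line that matches your Spark version)
libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6"
```

Note that the Sparkling Water artifact is added with `%` and an explicit `_2.11` suffix rather than `%%`, mirroring the answer's coordinates; the Scala version in the suffix must match `scalaVersion`.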
