Integrating Apache Ignite with Scala/Spark using sbt


I am trying to integrate Ignite into my Scala code and run the application with sbt. I cannot use any IDE for this.

Scala version - 2.11.0
Spark version - 2.3.0
Ignite version - 2.8.0
Sbt version - 1.3.3

I have tried adding the basic library dependency in build.sbt:

libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"

My complete build.sbt is:

scalaVersion := "2.11.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"

But it still does not work, and I get the following error:

$ sbt package
[info] Loading project definition from /home/testing/project
[info] Loading settings for project testing from build.sbt ...
[info] Set current project to testing (in build file:/home/testing/)
[info] Updating 
[info] Resolved  dependencies
[warn] 
[warn]  Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error]   Not found
[error]   Not found
[error]   not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error]         at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:245)
[error]         at lmcoursier.CoursierDependencyResolution.$anonfun$update$34(CoursierDependencyResolution.scala:214)
[error]         at scala.util.Either$LeftProjection.map(Either.scala:573)
[error]         at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:214)
[error]         at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error]         at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:52)
[error]         at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:102)
[error]         at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:69)
[error]         at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:115)
[error]         at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error]         at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:115)
[error]         at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:96)
[error]         at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:150)
[error]         at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error]         at sbt.Classpaths$.$anonfun$updateTask0$5(Defaults.scala:2946)
[error]         at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error]         at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error]         at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error]         at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error]         at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error]         at sbt.Execute.work(Execute.scala:290)
[error]         at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error]         at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error]         at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error]         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error]         at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.ignite:ignite-spark_2.11:2.8.0
[error]   Not found
[error]   Not found
[error]   not found: /root/.ivy2/local/org.apache.ignite/ignite-spark_2.11/2.8.0/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom
[error] Total time: 5 s, completed May 28, 2020 6:24:04 AM

I am using the latest Ubuntu Docker image. I am sure Spark and Ignite are working, because I am also running PySpark and it works fine. Please help me; as a newbie I think I am making some small mistake that is turning into a major problem here.

scala apache-spark sbt ignite
1 Answer

sbt appears to be trying to resolve "ignite-spark_2.11" rather than just "ignite-spark":

not found: https://repo1.maven.org/maven2/org/apache/ignite/ignite-spark_2.11/2.8.0/ignite-spark_2.11-2.8.0.pom

Use

libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.8.0"

instead of

libraryDependencies += "org.apache.ignite" %% "ignite-spark" % "2.8.0"

%% -> fetches the dependency with the Scala version suffix appended to the artifact name; % -> fetches the dependency without the Scala version suffix.
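For reference, a minimal build.sbt sketch with this fix applied (using the Scala, Spark, and Ignite versions from the question) might look like the following; the single % for ignite-spark is the only change, so sbt resolves org.apache.ignite:ignite-spark:2.8.0 instead of the non-existent _2.11 artifact:

scalaVersion := "2.11.0"

// Spark artifacts are published per Scala version, so %% appends the _2.11 suffix
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"

// ignite-spark 2.8.0 is published without a Scala-version suffix, so use a single %
libraryDependencies += "org.apache.ignite" % "ignite-spark" % "2.8.0"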
