sbt run for Spark shows "java.nio.file.NoSuchFileException: xxx/hadoop-client-api-3.3.4.jar"


When I run sbt run for a Spark application in Scala 2.12, the application succeeds.

However, at the end I still get an error, although it does not affect the application itself:

sbt run
[info] welcome to sbt 1.8.2 (Homebrew Java 17.0.6)
[info] loading project definition from hongbomiao.com/hm-spark/applications/find-retired-people-scala/project

  | => find-retired-people-scala-build / Compile / compileIncremental 0s
[info] loading settings for project find-retired-people-scala from build.sbt ...
[info] set current project to FindRetiredPeople (in build file:hongbomiao.com/hm-spark/applications/find-retired-people-scala/)

  | => find-retired-people-scala / update 0s

  | => find-retired-people-scala / Compile / compileIncremental 0s

  | => find-retired-people-scala / Compile / compileIncremental 0s
[info] running com.hongbomiao.FindRetiredPeople 
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
23/04/19 16:28:22 WARN Utils: Your hostname, Hongbos-MacBook-Pro-2021.local resolves to a loopback address: 127.0.0.1; using 10.10.8.223 instead (on interface en0)
23/04/19 16:28:22 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
23/04/19 16:28:22 INFO SparkContext: Running Spark version 3.4.0
23/04/19 16:28:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/04/19 16:28:22 INFO ResourceUtils: ==============================================================
23/04/19 16:28:22 INFO ResourceUtils: No custom resources configured for spark.driver.
23/04/19 16:28:22 INFO ResourceUtils: ==============================================================
23/04/19 16:28:22 INFO SparkContext: Submitted application: find-retired-people-scala
23/04/19 16:28:22 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/04/19 16:28:22 INFO ResourceProfile: Limiting resource is cpu
23/04/19 16:28:22 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/04/19 16:28:22 INFO SecurityManager: Changing view acls to: hongbo-miao
23/04/19 16:28:22 INFO SecurityManager: Changing modify acls to: hongbo-miao
23/04/19 16:28:22 INFO SecurityManager: Changing view acls groups to: 
23/04/19 16:28:22 INFO SecurityManager: Changing modify acls groups to: 
23/04/19 16:28:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: hongbo-miao; groups with view permissions: EMPTY; users with modify permissions: hongbo-miao; groups with modify permissions: EMPTY
23/04/19 16:28:22 INFO Utils: Successfully started service 'sparkDriver' on port 64550.
23/04/19 16:28:22 INFO SparkEnv: Registering MapOutputTracker
23/04/19 16:28:22 INFO SparkEnv: Registering BlockManagerMaster
23/04/19 16:28:22 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/04/19 16:28:22 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/04/19 16:28:22 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/04/19 16:28:22 INFO DiskBlockManager: Created local directory at /private/var/folders/22/ntjwd5dx691gvkktkspl0f_00000gq/T/blockmgr-484c32c3-42df-4bb0-bfb6-93bca8799dab
23/04/19 16:28:22 INFO MemoryStore: MemoryStore started with capacity 434.4 MiB
23/04/19 16:28:22 INFO SparkEnv: Registering OutputCommitCoordinator
23/04/19 16:28:22 INFO JettyUtils: Start Jetty 0.0.0.0:4040 for SparkUI
23/04/19 16:28:22 INFO Utils: Successfully started service 'SparkUI' on port 4040.
23/04/19 16:28:22 INFO Executor: Starting executor ID driver on host 10.10.8.223
23/04/19 16:28:22 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
23/04/19 16:28:22 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 64552.
23/04/19 16:28:22 INFO NettyBlockTransferService: Server created on 10.10.8.223:64552
23/04/19 16:28:22 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
23/04/19 16:28:22 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.10.8.223, 64552, None)
23/04/19 16:28:22 INFO BlockManagerMasterEndpoint: Registering block manager 10.10.8.223:64552 with 434.4 MiB RAM, BlockManagerId(driver, 10.10.8.223, 64552, None)
23/04/19 16:28:22 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.10.8.223, 64552, None)
23/04/19 16:28:22 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.10.8.223, 64552, None)
23/04/19 16:28:23 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
23/04/19 16:28:23 INFO SharedState: Warehouse path is 'file:hongbomiao.com/hm-spark/applications/find-retired-people-scala/spark-warehouse'.
23/04/19 16:28:24 INFO CodeGenerator: Code generated in 120.900875 ms
23/04/19 16:28:24 INFO CodeGenerator: Code generated in 5.882833 ms
23/04/19 16:28:24 INFO CodeGenerator: Code generated in 4.50325 ms
23/04/19 16:28:24 INFO CodeGenerator: Code generated in 6.235417 ms
23/04/19 16:28:24 INFO SparkContext: SparkContext is stopping with exitCode 0.
23/04/19 16:28:24 INFO SparkUI: Stopped Spark web UI at http://10.10.8.223:4040
+-------+---+
|   name|age|
+-------+---+
|Charlie| 80|
+-------+---+

23/04/19 16:28:24 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
23/04/19 16:28:24 INFO MemoryStore: MemoryStore cleared
23/04/19 16:28:24 INFO BlockManager: BlockManager stopped
23/04/19 16:28:24 INFO BlockManagerMaster: BlockManagerMaster stopped
23/04/19 16:28:24 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
23/04/19 16:28:24 INFO SparkContext: Successfully stopped SparkContext
[success] Total time: 7 s, completed Apr 19, 2023, 4:28:24 PM
23/04/19 16:28:25 INFO ShutdownHookManager: Shutdown hook called
23/04/19 16:28:25 INFO ShutdownHookManager: Deleting directory /private/var/folders/22/ntjwd5dx691gvkktkspl0f_00000gq/T/spark-5e9a60e6-2699-4353-95cb-a93a87b70f63
23/04/19 16:28:25 ERROR Configuration: error parsing conf core-default.xml
java.nio.file.NoSuchFileException: hongbomiao.com/hm-spark/applications/find-retired-people-scala/target/bg-jobs/sbt_17e38c6e/target/f5c922ec/359669fc/hadoop-client-api-3.3.4.jar
    at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
    at java.base/sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55)
    at java.base/sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:148)
    at java.base/java.nio.file.Files.readAttributes(Files.java:1851)
    at java.base/java.util.zip.ZipFile$Source.get(ZipFile.java:1264)
    at java.base/java.util.zip.ZipFile$CleanableResource.<init>(ZipFile.java:709)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:243)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:172)
    at java.base/java.util.jar.JarFile.<init>(JarFile.java:347)
    at java.base/sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:103)
    at java.base/sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:72)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:168)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.getOrCreate(JarFileFactory.java:91)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:132)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:175)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:3009)
    at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3105)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3063)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3036)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2914)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2896)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1246)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1863)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Exception in thread "Thread-1" java.lang.RuntimeException: java.nio.file.NoSuchFileException: hongbomiao.com/hm-spark/applications/find-retired-people-scala/target/bg-jobs/sbt_17e38c6e/target/f5c922ec/359669fc/hadoop-client-api-3.3.4.jar
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3089)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3036)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2914)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2896)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1246)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1863)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Caused by: java.nio.file.NoSuchFileException: hongbomiao.com/hm-spark/applications/find-retired-people-scala/target/bg-jobs/sbt_17e38c6e/target/f5c922ec/359669fc/hadoop-client-api-3.3.4.jar
    at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
    at java.base/sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55)
    at java.base/sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:148)
    at java.base/java.nio.file.Files.readAttributes(Files.java:1851)
    at java.base/java.util.zip.ZipFile$Source.get(ZipFile.java:1264)
    at java.base/java.util.zip.ZipFile$CleanableResource.<init>(ZipFile.java:709)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:243)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:172)
    at java.base/java.util.jar.JarFile.<init>(JarFile.java:347)
    at java.base/sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:103)
    at java.base/sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:72)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:168)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.getOrCreate(JarFileFactory.java:91)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:132)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:175)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:3009)
    at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3105)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3063)
    ... 10 more

Process finished with exit code 0

Based on this, I tried to add hadoop-client by updating build.sbt:

name := "FindRetiredPeople"
version := "1.0"
scalaVersion := "2.12.17"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.4.0",
  "org.apache.spark" %% "spark-sql" % "3.4.0",
  "org.apache.hadoop" %% "hadoop-client" % "3.3.4",
)

However, my error then became:

sbt run
[info] welcome to sbt 1.8.2 (Homebrew Java 17.0.6)
[info] loading project definition from hongbomiao.com/hm-spark/applications/find-retired-people-scala/project

  | => find-retired-people-scala-build / Compile / compileIncremental 0s
[info] loading settings for project find-retired-people-scala from build.sbt ...
[info] set current project to FindRetiredPeople (in build file:hongbomiao.com/hm-spark/applications/find-retired-people-scala/)

  | => find-retired-people-scala / update 0s
[info] Updating 

  | => find-retired-people-scala / update 0s
https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client_2.12/3.3.4/hadoo…
    0.0% [          ] 0B (0B / s)

  | => find-retired-people-scala / update 0s
https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client_2.12/3.3.4/hadoo…
    0.0% [          ] 0B (0B / s)
[info] Resolved  dependencies

  | => find-retired-people-scala / update 0s
[warn] 

  | => find-retired-people-scala / update 0s
[warn]  Note: Unresolved dependencies path:

  | => find-retired-people-scala / update 0s
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.hadoop:hadoop-client_2.12:3.3.4
[error]   Not found
[error]   Not found
[error]   not found: /Users/hongbo-miao/.ivy2/localorg.apache.hadoop/hadoop-client_2.12/3.3.4/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client_2.12/3.3.4/hadoop-client_2.12-3.3.4.pom
[error]     at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:344)
[error]     at lmcoursier.CoursierDependencyResolution.$anonfun$update$38(CoursierDependencyResolution.scala:313)
[error]     at scala.util.Either$LeftProjection.map(Either.scala:573)
[error]     at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:313)
[error]     at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error]     at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:59)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:133)
[error]     at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:73)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:146)
[error]     at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:146)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:127)
[error]     at sbt.util.Tracked$.$anonfun$inputChangedW$1(Tracked.scala:219)
[error]     at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:160)
[error]     at sbt.Classpaths$.$anonfun$updateTask0$1(Defaults.scala:3687)
[error]     at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error]     at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error]     at sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error]     at sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error]     at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error]     at sbt.Execute.work(Execute.scala:291)
[error]     at sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error]     at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error]     at sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error]     at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error]     at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
[error]     at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error]     at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[error]     at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[error]     at java.base/java.lang.Thread.run(Thread.java:833)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.hadoop:hadoop-client_2.12:3.3.4
[error]   Not found
[error]   Not found
[error]   not found: /Users/hongbo-miao/.ivy2/localorg.apache.hadoop/hadoop-client_2.12/3.3.4/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client_2.12/3.3.4/hadoop-client_2.12-3.3.4.pom
[error] Total time: 2 s, completed Apr 19, 2023, 4:52:07 PM
make: *** [sbt-run] Error 1

Note that in the log it is looking for hadoop-client_2.12-3.3.4.pom. I believe the 2.12 means Scala 2.12. However, hadoop-client has no Scala 2.12 version.

For comparison, this is how it looks up a library that does have Scala versions, such as spark-sql:
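(As an illustration, assuming Scala 2.12 and Spark 3.4.0, sbt requests a Scala-suffixed POM such as https://repo1.maven.org/maven2/org/apache/spark/spark-sql_2.12/3.4.0/spark-sql_2.12-3.4.0.pom.)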

What can I do to resolve this issue? Thanks!


Update:

I updated my build.sbt to this (changing %% to % for hadoop-client):

name := "FindRetiredPeople"
version := "1.0"
scalaVersion := "2.12.17"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.4.0",
  "org.apache.spark" %% "spark-sql" % "3.4.0",
  "org.apache.hadoop" % "hadoop-client" % "3.3.4"
)

My error then became:

# ...
[success] Total time: 6 s, completed Apr 19, 2023, 5:54:10 PM
23/04/19 17:54:11 INFO ShutdownHookManager: Shutdown hook called
23/04/19 17:54:11 INFO ShutdownHookManager: Deleting directory /private/var/folders/22/ntjwd5dx691gvkktkspl0f_00000gq/T/spark-37969feb-60a0-467f-8008-6862b70e84ee
23/04/19 17:54:11 ERROR Configuration: error parsing conf core-default.xml
java.nio.file.NoSuchFileException: hongbomiao.com/hm-spark/applications/ingest-from-s3-to-kafka/target/bg-jobs/sbt_905b731d/target/f5c922ec/359669fc/hadoop-client-api-3.3.4.jar
    at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
    at java.base/sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55)
    at java.base/sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:148)
    at java.base/java.nio.file.Files.readAttributes(Files.java:1851)
    at java.base/java.util.zip.ZipFile$Source.get(ZipFile.java:1264)
    at java.base/java.util.zip.ZipFile$CleanableResource.<init>(ZipFile.java:709)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:243)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:172)
    at java.base/java.util.jar.JarFile.<init>(JarFile.java:347)
    at java.base/sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:103)
    at java.base/sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:72)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:168)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.getOrCreate(JarFileFactory.java:91)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:132)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:175)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:3009)
    at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3105)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3063)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3036)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2914)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2896)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1246)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1863)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Exception in thread "Thread-1" java.lang.RuntimeException: java.nio.file.NoSuchFileException: hongbomiao.com/hm-spark/applications/ingest-from-s3-to-kafka/target/bg-jobs/sbt_905b731d/target/f5c922ec/359669fc/hadoop-client-api-3.3.4.jar
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3089)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3036)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2914)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2896)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1246)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1863)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Caused by: java.nio.file.NoSuchFileException: hongbomiao.com/hm-spark/applications/ingest-from-s3-to-kafka/target/bg-jobs/sbt_905b731d/target/f5c922ec/359669fc/hadoop-client-api-3.3.4.jar
    at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
    at java.base/sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55)
    at java.base/sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:148)
    at java.base/java.nio.file.Files.readAttributes(Files.java:1851)
    at java.base/java.util.zip.ZipFile$Source.get(ZipFile.java:1264)
    at java.base/java.util.zip.ZipFile$CleanableResource.<init>(ZipFile.java:709)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:243)
    at java.base/java.util.zip.ZipFile.<init>(ZipFile.java:172)
    at java.base/java.util.jar.JarFile.<init>(JarFile.java:347)
    at java.base/sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:103)
    at java.base/sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:72)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:168)
    at java.base/sun.net.www.protocol.jar.JarFileFactory.getOrCreate(JarFileFactory.java:91)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:132)
    at java.base/sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:175)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:3009)
    at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3105)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3063)
    ... 10 more

Process finished with exit code 0

My target/bg-jobs folder is actually empty.

java scala apache-spark hadoop sbt
1 Answer

Use % instead of %% to add hadoop (as written in the link you mentioned):

"org.apache.hadoop" % "hadoop-client" % "3.3.4"

https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client/

https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client

It is a Java library, not a Scala library (like spark-sql, etc.):

https://github.com/apache/hadoop

%% appends a Scala suffix (_2.13, _2.12, _2.11, etc.) to the artifact name; this has no meaning for Java libraries.
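A minimal sketch of the difference (assuming scalaVersion := "2.12.17"; the resolved coordinates follow from sbt's %% convention):

// %% appends the Scala binary version to the artifact name:
"org.apache.spark" %% "spark-sql" % "3.4.0"
// resolves to org.apache.spark:spark-sql_2.12:3.4.0

// % leaves the artifact name as-is, which is what a plain Java library needs:
"org.apache.hadoop" % "hadoop-client" % "3.3.4"
// resolves to org.apache.hadoop:hadoop-client:3.3.4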


Judging by the java.base... lines in your stack trace, you are using Java 9+. Try switching to Java 8. Does this change anything?

https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Java+Versions
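
If you want the build to fail fast on a newer JVM, a minimal sketch for build.sbt (the check and message are illustrative, not part of the original answer):

initialize := {
  val _ = initialize.value
  // Illustrative guard: require that the JVM running sbt is Java 8.
  val v = sys.props("java.specification.version")
  require(v == "1.8", s"Java 8 is required to run this build; found Java $v")
}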
