Error initializing SparkContext due to java.nio.channels.UnresolvedAddressException: null


I'm hitting an error, apparently network-related, when running Scala Spark applications on my local machine. Since earlier this week, every Spark application I try to run fails with the same error, even though they previously worked fine and (as far as I can tell) nothing has changed. The code runs fine in GitHub CI and on other people's machines. These failures even prevent me from compiling the code, because compiling runs the unit tests.

I'm on a MacBook Pro with an Intel Core i7 (x64), using IntelliJ IDEA and sbt to write, test, and compile the code. At first the error only affected sbt in the terminal, but it has since started affecting IntelliJ IDEA's built-in run/test tooling as well.

Traceback (some identifying names changed as needed):

[info] Loading project definition from /Users/carlosplanelles/path/to/repo/project
[info] Set current project to ProjectName (in build file:/Users/carlosplanelles/path/to/repo/)
[info] Compiling 7 Scala sources to /Users/carlosplanelles/path/to/repo/target/scala-2.12/classes...
[info] Compiling 9 Scala sources to /Users/carlosplanelles/path/to/repo/target/scala-2.12/test-classes...
[warn] 7 deprecations (since 1.3.2); re-run with -deprecation for details
[warn] one warning found
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/carlosplanelles/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/carlosplanelles/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
[2024-02-09 16:43:27,956] [pool-4-thread-1] WARN  o.a.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
[2024-02-09 16:43:29,158] [pool-4-thread-1] ERROR org.apache.spark.SparkContext - Error initializing SparkContext. 
java.nio.channels.UnresolvedAddressException: null
        at sun.nio.ch.Net.checkAddress(Net.java:100)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.lang.Thread.run(Thread.java:750)
[info] TestClassName:
[info] com.unit.testing.package *** ABORTED ***
[info]   java.nio.channels.UnresolvedAddressException:
[info]   at sun.nio.ch.Net.checkAddress(Net.java:100)
[info]   at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
[info]   at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
[info]   at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
[info]   at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
[info]   at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
[info]   at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
[info]   at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
[info]   at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
[info]   at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
[info]   ...
[info] Run completed in 3 seconds, 451 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 1
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] *** 1 SUITE ABORTED ***
[error] Error during tests:
[error]         com.unit.testing.package
[error] (test:testOnly) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 36 s, completed 9-Feb-2024 4:43:30 PM

The SparkContext is initialized with this shared trait (I have also tried "local" and "local[1]" as the master URL):

import org.apache.spark.sql.SparkSession

trait SharedSparkContext {
  implicit val _spark: SparkSession = SparkSession
    .builder()
    .master("local[*]")
    .getOrCreate()
}

I have tried cleaning and reinstalling Java (JDK and JRE), reinstalling sbt, running sbt clean, clearing IntelliJ IDEA's caches, and restarting both the software and the laptop. None of these changed the traceback.

java scala apache-spark sbt nio
1 Answer

Hard to say without any further context, but on the face of it I wouldn't be surprised if this is a trait initialization order problem. In general, having a val in a trait is discouraged, because if there are dependencies between traits, the order in which they are inherited starts to matter. Having it be implicit complicates things further.

As an example, here is some code that produces a null:

trait Foo {
  val foo = "string"
}

trait Bar { this: Foo =>
  println(foo)
}

// `object`s are initialized lazily
object FooBar extends Foo with Bar
object BarFoo extends Bar with Foo

FooBar // prints "string"
BarFoo // prints "null"
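For contrast, here is a minimal sketch of the same toy example using a lazy val (the names LazyFoo, LazyBar, and seen are mine, for illustration): evaluation is deferred until first access, so the mix-in order no longer matters:

```scala
trait LazyFoo {
  // a lazy val is evaluated on first access, not at construction time
  lazy val foo = "string"
}

trait LazyBar { this: LazyFoo =>
  // accessing foo during construction is now safe: the lazy
  // initializer runs on demand and returns the value directly
  val seen: String = foo
}

// same "wrong" mix-in order as BarFoo above
object LazyBarFoo extends LazyBar with LazyFoo

object LazyDemo extends App {
  println(LazyBarFoo.seen) // prints "string", not null
}
```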

You can play with this code on Scastie.

If this is your problem, there are a few ways you can fix it:

  • Clean: refactor the hierarchy so that you no longer have vals in your traits
  • Easy: make _spark a lazy val (or possibly a def, though you probably don't want to recreate the context every time)
  • Hard: reorder the inherited traits so that they linearize in a way that guarantees the values are in place at construction time
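Applied to the trait from the question, the easy fix is a one-word change (a sketch, assuming the same builder settings as in the original trait):

```scala
import org.apache.spark.sql.SparkSession

trait SharedSparkContext {
  // lazy defers building the session until first use, after the
  // class mixing in this trait has been fully constructed
  implicit lazy val _spark: SparkSession = SparkSession
    .builder()
    .master("local[*]")
    .getOrCreate()
}
```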