Spark fails to start in local mode when disconnected [possible bug in Spark's handling of IPv6??]

Problem description (8 votes, 4 answers)

The problem is the same as the one described in Error when starting spark-shell local on Mac

... but I cannot find a solution there. I also used to get a malformed URI error, but now I get "Expected hostname".

So when I am not connected to the internet, spark-shell fails to load in local mode [see the error below]. I am running Apache Spark 2.1.0, downloaded from the internet, on my Mac. I run ./bin/spark-shell and it gives me the error below.

I have read the Spark code, and it uses Java's InetAddress.getLocalHost() to find the IP address of localhost. When I am connected to the internet, I get an IPv4 address back along with my local hostname:

scala> InetAddress.getLocalHost
res9: java.net.InetAddress = AliKheyrollahis-MacBook-Pro.local/192.168.1.26

But the crucial point is that, when disconnected, I get a scoped IPv6 address carrying a percent suffix:

scala> InetAddress.getLocalHost
res10: java.net.InetAddress = AliKheyrollahis-MacBook-Pro.local/fe80:0:0:0:2b9a:4521:a301:e9a5%10

This IP is the same one you see in the error message. I believe my problem is that the %10 in the result is what throws Spark off, since it cannot handle the scope suffix.
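
To see what Spark is choking on, here is a minimal REPL sketch (plain JDK calls, nothing Spark-specific) that checks whether the local address carries a scope ID:

    import java.net.InetAddress

    // The textual form of a scoped (link-local) IPv6 address contains a '%'
    // followed by the interface index, e.g. fe80:...:e9a5%10 as shown above.
    val host = InetAddress.getLocalHost.getHostAddress
    val isScoped = host.contains("%")
    println(s"$host scoped=$isScoped")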

My guess is that this is a bug that is rarely witnessed, because people are almost always connected to the internet, or their Macs do not return a scoped IPv6 address. I would be happy even if I could just configure my Mac to work around it. I have tried everything, including setting IPv6 to manual or link-local, to no avail.


I also tried removing the ::1 localhost line from /etc/hosts, to no avail.
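
For reference, the stock macOS /etc/hosts looks roughly like this (quoted from memory, so treat the exact contents as an approximation); the ::1 entry is the one I removed:

    127.0.0.1       localhost
    255.255.255.255 broadcasthost
    ::1             localhost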

So here is the full error with DEBUG output (note the same IPv6 address being used for listening):

7/01/28 22:02:59 DEBUG ShutdownHookManager: Adding shutdown hook
17/01/28 22:03:06 DEBUG Shell: setsid is not available on this machine. So not using it.
17/01/28 22:03:06 DEBUG Shell: setsid exited with exit code 0
17/01/28 22:03:06 INFO SparkContext: Running Spark version 2.1.0
17/01/28 22:03:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
17/01/28 22:03:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
17/01/28 22:03:06 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[GetGroups], valueName=Time)
17/01/28 22:03:06 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/01/28 22:03:26 DEBUG KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
17/01/28 22:03:26 DEBUG Groups:  Creating new Groups object
17/01/28 22:03:26 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/01/28 22:03:26 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/01/28 22:03:26 DEBUG NativeCodeLoader: java.library.path=/Users/aliostad/torch/install/lib:/Users/aliostad/torch/install/lib:/Users/aliostad/torch/install/lib:/Users/aliostad/torch/install/lib:/Users/aliostad/torch/install/lib:/Users/aliostad/torch/install/lib:/Users/aliostad/torch/install/lib::/Users/aliostad/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
17/01/28 22:03:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/28 22:03:26 DEBUG PerformanceAdvisory: Falling back to shell based
17/01/28 22:03:26 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
17/01/28 22:03:27 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/01/28 22:03:27 DEBUG UserGroupInformation: hadoop login
17/01/28 22:03:27 DEBUG UserGroupInformation: hadoop login commit
17/01/28 22:03:27 DEBUG UserGroupInformation: using local user:UnixPrincipal: aliostad
17/01/28 22:03:27 DEBUG UserGroupInformation: Using user: "UnixPrincipal: aliostad" with name aliostad
17/01/28 22:03:27 DEBUG UserGroupInformation: User entry: "aliostad"
17/01/28 22:03:27 DEBUG UserGroupInformation: UGI loginUser:aliostad (auth:SIMPLE)
17/01/28 22:03:27 INFO SecurityManager: Changing view acls to: aliostad
17/01/28 22:03:27 INFO SecurityManager: Changing modify acls to: aliostad
17/01/28 22:03:27 INFO SecurityManager: Changing view acls groups to:
17/01/28 22:03:27 INFO SecurityManager: Changing modify acls groups to:
17/01/28 22:03:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(aliostad); groups with view permissions: Set(); users  with modify permissions: Set(aliostad); groups with modify permissions: Set()
17/01/28 22:03:27 DEBUG SecurityManager: Created SSL options for fs: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
17/01/28 22:03:27 DEBUG InternalLoggerFactory: Using SLF4J as the default logging framework
17/01/28 22:03:27 DEBUG PlatformDependent0: java.nio.Buffer.address: available
17/01/28 22:03:27 DEBUG PlatformDependent0: sun.misc.Unsafe.theUnsafe: available
17/01/28 22:03:27 DEBUG PlatformDependent0: sun.misc.Unsafe.copyMemory: available
17/01/28 22:03:27 DEBUG PlatformDependent0: direct buffer constructor: available
17/01/28 22:03:27 DEBUG PlatformDependent0: java.nio.Bits.unaligned: available, true
17/01/28 22:03:27 DEBUG PlatformDependent0: java.nio.DirectByteBuffer.<init>(long, int): available
17/01/28 22:03:27 DEBUG Cleaner0: java.nio.ByteBuffer.cleaner(): available
17/01/28 22:03:27 DEBUG PlatformDependent: Java version: 8
17/01/28 22:03:27 DEBUG PlatformDependent: -Dio.netty.noUnsafe: false
17/01/28 22:03:27 DEBUG PlatformDependent: sun.misc.Unsafe: available
17/01/28 22:03:27 DEBUG PlatformDependent: -Dio.netty.noJavassist: false
17/01/28 22:03:27 DEBUG PlatformDependent: Javassist: available
17/01/28 22:03:27 DEBUG PlatformDependent: -Dio.netty.tmpdir: /var/folders/pz/vgqg2gns18j_kxsnkzrp6x_m0000gn/T (java.io.tmpdir)
17/01/28 22:03:27 DEBUG PlatformDependent: -Dio.netty.bitMode: 64 (sun.arch.data.model)
17/01/28 22:03:27 DEBUG PlatformDependent: -Dio.netty.noPreferDirect: false
17/01/28 22:03:27 DEBUG PlatformDependent: io.netty.maxDirectMemory: 0 bytes
17/01/28 22:03:27 DEBUG JavassistTypeParameterMatcherGenerator: Generated: io.netty.util.internal.__matchers__.org.apache.spark.network.protocol.MessageMatcher
17/01/28 22:03:27 DEBUG JavassistTypeParameterMatcherGenerator: Generated: io.netty.util.internal.__matchers__.io.netty.buffer.ByteBufMatcher
17/01/28 22:03:27 DEBUG MultithreadEventLoopGroup: -Dio.netty.eventLoopThreads: 8
17/01/28 22:03:27 DEBUG NioEventLoop: -Dio.netty.noKeySetOptimization: false
17/01/28 22:03:27 DEBUG NioEventLoop: -Dio.netty.selectorAutoRebuildThreshold: 512
17/01/28 22:03:27 DEBUG PlatformDependent: org.jctools-core.MpscChunkedArrayQueue: available
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numHeapArenas: 8
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numDirectArenas: 8
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.pageSize: 8192
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxOrder: 11
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.chunkSize: 16777216
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.tinyCacheSize: 512
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.smallCacheSize: 256
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.normalCacheSize: 64
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxCachedBufferCapacity: 32768
17/01/28 22:03:27 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.cacheTrimInterval: 8192
17/01/28 22:03:27 DEBUG ThreadLocalRandom: -Dio.netty.initialSeedUniquifier: 0x3185a000d3a47bd4 (took 1 ms)
17/01/28 22:03:27 DEBUG ByteBufUtil: -Dio.netty.allocator.type: unpooled
17/01/28 22:03:27 DEBUG ByteBufUtil: -Dio.netty.threadLocalDirectBufferSize: 65536
17/01/28 22:03:27 DEBUG ByteBufUtil: -Dio.netty.maxThreadLocalCharBufferSize: 16384
17/01/28 22:03:27 DEBUG NetUtil: Loopback interface: lo0 (lo0, 0:0:0:0:0:0:0:1)
17/01/28 22:03:27 DEBUG NetUtil: /proc/sys/net/core/somaxconn: 128 (non-existent)
17/01/28 22:03:27 DEBUG TransportServer: Shuffle server started on port: 56107
17/01/28 22:03:27 INFO Utils: Successfully started service 'sparkDriver' on port 56107.
17/01/28 22:03:27 DEBUG SparkEnv: Using serializer: class org.apache.spark.serializer.JavaSerializer
17/01/28 22:03:27 INFO SparkEnv: Registering MapOutputTracker
17/01/28 22:03:27 DEBUG MapOutputTrackerMasterEndpoint: init
17/01/28 22:03:27 INFO SparkEnv: Registering BlockManagerMaster
17/01/28 22:03:27 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/01/28 22:03:27 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/01/28 22:03:28 INFO DiskBlockManager: Created local directory at /private/var/folders/pz/vgqg2gns18j_kxsnkzrp6x_m0000gn/T/blockmgr-4079e45b-e4e0-4386-bffe-42af18634710
17/01/28 22:03:28 DEBUG DiskBlockManager: Adding shutdown hook
17/01/28 22:03:28 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/01/28 22:03:28 INFO SparkEnv: Registering OutputCommitCoordinator
17/01/28 22:03:28 DEBUG OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: init
17/01/28 22:03:28 DEBUG SecurityManager: Created SSL options for ui: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
17/01/28 22:03:28 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/01/28 22:03:28 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://fe80:0:0:0:2b9a:4521:a301:e9a5%10:4040
17/01/28 22:03:28 INFO Executor: Starting executor ID driver on host localhost
17/01/28 22:03:28 INFO Executor: Using REPL class URI: spark://fe80:0:0:0:2b9a:4521:a301:e9a5%10:56107/classes
17/01/28 22:03:28 ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Expected hostname
    at scala.Predef$.assert(Predef.scala:170)
    at org.apache.spark.util.Utils$.checkHost(Utils.scala:931)
    at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:31)
    at org.apache.spark.executor.Executor.<init>(Executor.scala:121)
    at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:59)
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:126)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:42)
    at $line3.$read.<init>(<console>:44)
    at $line3.$read$.<init>(<console>:48)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:68)
    at org.apache.spark.repl.Main$.main(Main.scala:51)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/01/28 22:03:28 INFO SparkUI: Stopped Spark web UI at http://fe80:0:0:0:2b9a:4521:a301:e9a5%10:4040
17/01/28 22:03:28 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:158)
    at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:137)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:467)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1588)
    at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1826)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1283)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1825)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:587)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:42)
    at $line3.$read.<init>(<console>:44)
    at $line3.$read$.<init>(<console>:48)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:68)
    at org.apache.spark.repl.Main$.main(Main.scala:51)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/01/28 22:03:28 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/01/28 22:03:28 INFO MemoryStore: MemoryStore cleared
17/01/28 22:03:28 INFO BlockManager: BlockManager stopped
17/01/28 22:03:28 INFO BlockManagerMaster: BlockManagerMaster stopped
17/01/28 22:03:28 WARN MetricsSystem: Stopping a MetricsSystem that is not running
17/01/28 22:03:28 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/01/28 22:03:28 INFO SparkContext: Successfully stopped SparkContext
java.lang.AssertionError: assertion failed: Expected hostname
  at scala.Predef$.assert(Predef.scala:170)
  at org.apache.spark.util.Utils$.checkHost(Utils.scala:931)
  at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:31)
  at org.apache.spark.executor.Executor.<init>(Executor.scala:121)
  at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:59)
  at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:126)
  at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
  ... 47 elided
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40)
Type in expressions to have them evaluated.
Type :help for more information.
Tags: macos, shell, apache-spark, apache-spark-2.0

4 Answers

11 votes

OK, I seem to be able to get around it simply by passing --conf spark.driver.host=localhost.

So I run:

./bin/spark-shell --conf spark.driver.host=localhost

Please let me know if there is a better solution.
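
As a side note, if you do not want to type the flag on every launch, the same property can be set once in conf/spark-defaults.conf (a standard Spark mechanism; the path assumes a stock Spark 2.1.0 layout):

    # conf/spark-defaults.conf
    # Equivalent to passing --conf spark.driver.host=localhost each time
    spark.driver.host   localhost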


[UPDATE]

Jacek Laskowski confirmed this is probably the only solution currently available.


5 votes

For those working through sbt and hitting the same problem: just add .set("spark.driver.host", "localhost") to your SparkConf(), so that initializing the Spark context looks like this:

val conf =
  new SparkConf()
    .setAppName("temp1")
    .setMaster("local")
    .set("spark.driver.host", "localhost")

val sc = SparkContext.getOrCreate(conf)

This initial configuration must be done before any other getOrCreate of a SparkContext.
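
If you are on the Spark 2.x SparkSession API rather than a bare SparkContext, the equivalent should look like the sketch below (standard builder methods; the app name is arbitrary and copied from the snippet above):

    import org.apache.spark.sql.SparkSession

    // Same workaround expressed through the SparkSession builder:
    // spark.driver.host must be set before the session is created.
    val spark = SparkSession.builder()
      .appName("temp1")
      .master("local")
      .config("spark.driver.host", "localhost")
      .getOrCreate()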


0 votes

I ran into the same problem when using SharedSparkContext in my tests. Adding the following lines (in my beforeAll method), as @dennis suggested, solved the problem for me:

  override def beforeAll(): Unit = {
    super.beforeAll()
    sc.getConf.setMaster("local").set("spark.driver.host", "localhost")
  }

I hope this will be fixed in a future version of Spark.
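
If you manage the SparkContext yourself in tests instead of relying on SharedSparkContext, a minimal sketch with ScalaTest's BeforeAndAfterAll might look like this (the suite name and app name are made up; FunSuite is the pre-3.1 ScalaTest style that matches this era):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    class LocalModeSpec extends FunSuite with BeforeAndAfterAll {
      private var sc: SparkContext = _

      override def beforeAll(): Unit = {
        super.beforeAll()
        val conf = new SparkConf()
          .setAppName("test")
          .setMaster("local")
          .set("spark.driver.host", "localhost") // offline/scoped-IPv6 workaround
        sc = new SparkContext(conf)
      }

      override def afterAll(): Unit = {
        if (sc != null) sc.stop()
        super.afterAll()
      }

      test("context starts in local mode") {
        assert(sc.parallelize(1 to 3).count() == 3)
      }
    }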


0 votes

If you are using pyspark, set the driver host to localhost via the config method:

spark = (
    SparkSession.builder
    .appName("temp1")
    .config("spark.driver.host", "localhost")
    .getOrCreate()
)
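
As with the Scala variants above, this has to run before any SparkContext has been started in the process: spark.driver.host is read when the driver starts, so calling getOrCreate against an already-running session will not apply it.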

0 votes

I am not sure whether this will help you, but it solved the problem for me on a Mac.

1) Get your hostname. (In the terminal prompt this is typically the first part of the line: before the @ on Linux, before the : on a Mac. On a Mac you can also just type hostname in the terminal to get it.)

2) Add to /etc/hosts:

127.0.0.1 whatever-your-hostname-is

For me, I originally had

127.0.0.1 localhost

but I changed it to

127.0.0.1 my-hostname

Save this change and try pyspark again.
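
To make that concrete with the hostname from the question (a hypothetical example; substitute your own machine name), the edited line would read:

    127.0.0.1   AliKheyrollahis-MacBook-Pro.local

Keeping localhost as an additional alias on the same line is also valid hosts-file syntax and avoids breaking tools that expect it.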

I got this solution from this StackOverflow question: Mac spark-shell Error initializing SparkContext

I hope this helps.
