How do I add a jar with spark-shell?


I tried the following:

>>./spark-shell --jars /home/my_path/my_jar.jar

And inside the shell, I tried to import the package:

scala> import com.vertica.spark._
<console>:23: error: object vertica is not a member of package com
       import com.vertica.spark._

It didn't work. I also tried removing the leading slash (/) from the jar path:

>>./spark-shell --jars home/my_path/my_jar.jar

Still the same. There was a warning, though:

20/04/21 22:34:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://ubuntu:4040
Spark context available as 'sc' (master = local[*], app id = local-1587488711233).
Spark session available as 'spark'.
Welcome to

On the other hand, if I start the shell first and then add the jar with :require, using the same path, the import succeeds:

scala> :require /home/my_path/my_jar.jar
Added '/home/my_path/my_jar.jar' to classpath.

scala> import com.vertica.spark._
import com.vertica.spark._
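
As a quick check, sc.listJars() returns the jars that were actually registered with the driver, so it can show whether the --jars flag took effect at launch. This is a minimal sketch, assuming a Spark 2.x shell like the one in the log above, where sc is already defined:

scala> sc.listJars()  // should list my_jar.jar if --jars was parsed correctly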

What am I missing when adding the jar with spark-shell itself?

scala apache-spark read-eval-print-loop
1 Answer

This problem may be due to the Hadoop native libraries. Try adding the lines below to your .bashrc and sourcing it, and you should be fine:

export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
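
Applied end to end, the suggested fix would look like this (a minimal sketch, assuming HADOOP_HOME is already set in your environment; the jar path is the one from the question):

source ~/.bashrc
./spark-shell --jars /home/my_path/my_jar.jar

After the shell starts, retry import com.vertica.spark._ to verify the jar is on the classpath.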
