Error: Failed to load main class from JAR file when running spark-submit

Problem description

I created a JAR for my Scala project and then ran the following command from the terminal ("com.sukrit.hbase_" is the package name, "Filters_Usage" is the Scala class I want to run):

Macintosh:bin sukritmehta$ ./spark-submit --class "com.sukrit.hbase_.Filters_Usage" --master local[*] "/Users/sukritmehta/Desktop/Sukrit/Spark_Hbase/target/Spark_Hbase-0.0.1-SNAPSHOT.jar"

But after running this command, I get the following error:

20/04/24 20:53:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Error: Failed to load class com.sukrit.hbase_.Filters_Usage.
20/04/24 20:53:02 INFO util.ShutdownHookManager: Shutdown hook called
20/04/24 20:53:02 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/d4/psn4wv8s7tjbfgt6gkt35z9c0000gq/T/spark-ae120675-a1c6-4300-997c-bd53f9f35187

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>Spark_Hbase</groupId>
  <artifactId>Spark_Hbase</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.6.1</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
    </plugins>
  </build>

  <dependencies>
        <!-- Scala and Spark dependencies -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.11</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-avro_2.11</artifactId>
            <version>2.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.4.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- <dependency> <groupId>org.codehaus.janino</groupId> <artifactId>commons-compiler</artifactId> 
            <version>3.0.7</version> </dependency> <dependency> <groupId>org.apache.spark</groupId> 
            <artifactId>spark-network-common_2.11</artifactId> <version>2.1.1</version> 
            </dependency> -->
        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>2.9.9</version>
        </dependency>

        <!-- <dependency> <groupId>org.mongodb.spark</groupId> <artifactId>mongo-spark-connector_2.10</artifactId> 
            <version>2.1.1</version> </dependency> -->
        <!-- <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-mllib_2.10</artifactId> 
            <version>2.1.1</version> </dependency> -->

        <!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->


        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>2.2.4</version>
        </dependency>

          <!--  <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-core</artifactId>
                <version>1.2.1</version>
            </dependency>      --> 



        <dependency>
              <groupId>org.apache.hbase</groupId>
              <artifactId>hbase-spark</artifactId>
              <version>2.0.0-alpha4</version> <!-- Hortonworks Latest -->
        </dependency>


        <!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase-mapreduce -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-mapreduce</artifactId>
            <version>2.2.4</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>2.6.5</version>
            <scope>provided</scope>
        </dependency>

    </dependencies>
</project>
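One thing worth noting about the build section above: the pom registers only maven-compiler-plugin, which compiles Java sources, so the Scala class may never be compiled into the jar at all — which would also produce `Failed to load class`. A commonly used fix is adding scala-maven-plugin to the `<plugins>` section; below is a hedged sketch (the plugin version shown is an assumption, not taken from this project):

```xml
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>4.3.1</version>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```

After adding it, rebuild with `mvn clean package` and the compiled Scala classes should appear in the jar.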

It would be great if someone could help me resolve this issue.

scala apache-spark hbase spark-submit
1 Answer

Your spark-submit command is incorrect. Check the correct form below.


spark-submit \
    --master local[*] \
    --class <package.MainClass> <application-jar-path>

spark-submit --master local[*] \
--class "com.sukrit.hbase_.Filters_Usage" "/Users/sukritmehta/Desktop/Sukrit/Spark_Hbase/target/Spark_Hbase-0.0.1-SNAPSHOT.jar"
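If the reordered command still fails with `Failed to load class`, the class may simply not be inside the jar. A quick diagnostic sketch, assuming a JDK's `jar` tool is on the PATH (adjust the jar path to your build output):

```shell
# List the jar's entries and look for the compiled class file.
# No output means the class was never packaged, i.e. the problem
# is in the Maven build rather than in spark-submit itself.
jar tf target/Spark_Hbase-0.0.1-SNAPSHOT.jar | grep 'com/sukrit/hbase_/Filters_Usage'
```

If the class is missing from the listing, recheck how the Scala sources are compiled and packaged before touching the spark-submit arguments again.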


© www.soinside.com 2019 - 2024. All rights reserved.