Issues with Spark 3.1.2, Hadoop 3.2.1, and the AWS Hadoop dependencies


I'm running into compatibility issues between Spark, Hadoop, and the AWS Hadoop dependencies in a Java Spark application.

The Spark version running on my local machine is 3.1.2.

Problem: I'm developing a Spark application (version 3.1.2) that interacts with data stored in Amazon S3. The application uses Hadoop (version 3.2.1) and includes the AWS Hadoop dependency (hadoop-aws).
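
A simplified sketch of the kind of S3 access involved (the bucket path, credential wiring, and local master below are placeholders, not my real setup):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class S3ReadSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("delta-lake-poc")
                    .master("local[*]")
                    // Route s3a:// paths through the S3A connector from hadoop-aws.
                    .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
                    .config("spark.hadoop.fs.s3a.access.key", "<ACCESS_KEY>")   // placeholder
                    .config("spark.hadoop.fs.s3a.secret.key", "<SECRET_KEY>")   // placeholder
                    .getOrCreate();

            // Read a Delta table stored in S3 (placeholder path).
            Dataset<Row> df = spark.read()
                    .format("delta")
                    .load("s3a://my-bucket/path/to/delta-table");

            df.show();
            spark.stop();
        }
    }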

pom.xml

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
      <modelVersion>4.0.0</modelVersion>

      <groupId>org.poc</groupId>
      <artifactId>delta-lake</artifactId>
      <version>1.0-SNAPSHOT</version>
      <packaging>jar</packaging>

      <name>delta-lake</name>
      <url>http://maven.apache.org</url>

      <properties>
        <java.version>1.8</java.version>
        <scala.version>2.12</scala.version>
        <spark.version>3.1.2</spark.version>
        <delta.version>1.0.0</delta.version>
        <aws.sdk.version>1.12.604</aws.sdk.version> <!-- Use the latest version -->
      </properties>

      <dependencies>
        <!-- Spark dependencies -->
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_${scala.version}</artifactId>
          <version>${spark.version}</version>
        </dependency>
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-sql_${scala.version}</artifactId>
          <version>${spark.version}</version>
        </dependency>

        <!-- Delta Lake dependencies -->
        <dependency>
          <groupId>io.delta</groupId>
          <artifactId>delta-core_2.12</artifactId>
          <version>${delta.version}</version>
        </dependency>

        <!-- Hadoop AWS for S3 connectivity -->
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-aws</artifactId>
          <version>3.3.1</version>
        </dependency>

        <!-- Hadoop dependencies -->
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-common</artifactId>
          <version>3.2.1</version> <!-- Update to match your Spark version -->
        </dependency>
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-hdfs</artifactId>
          <version>3.2.1</version> <!-- Update to match your Spark version -->
        </dependency>

        <!-- AWS SDK for Java dependencies -->
        <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-s3 -->
        <dependency>
          <groupId>com.amazonaws</groupId>
          <artifactId>aws-java-sdk-s3</artifactId>
          <version>1.12.604</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-core -->
        <dependency>
          <groupId>com.amazonaws</groupId>
          <artifactId>aws-java-sdk-core</artifactId>
          <version>1.12.604</version>
        </dependency>

      </dependencies>

      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.8.1</version>
            <configuration>
              <source>${java.version}</source>
              <target>${java.version}</target>
            </configuration>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <executions>
              <execution>
                <phase>package</phase>
                <goals>
                  <goal>shade</goal>
                </goals>
                <configuration>
                  <createDependencyReducedPom>false</createDependencyReducedPom>
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </project>

The error:

When I run the Spark application, I hit the following runtime error related to IOStatisticsSource:

java.lang.NoClassDefFoundError: org/apache/hadoop/fs/statistics/IOStatisticsSource
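
From what I can tell, org.apache.hadoop.fs.statistics.IOStatisticsSource belongs to the IOStatistics API that first appeared in Hadoop 3.3.0, so hadoop-aws 3.3.1 may be referencing a class that the Hadoop 3.2.x jars don't provide. A quick way to confirm which Hadoop version actually ends up on the runtime classpath is a small check along these lines (a diagnostic sketch using Hadoop's VersionInfo utility):

    import org.apache.hadoop.util.VersionInfo;

    public class HadoopClasspathCheck {
        public static void main(String[] args) {
            // Version of hadoop-common that was actually resolved at runtime.
            System.out.println("Hadoop on classpath: " + VersionInfo.getVersion());

            // IOStatisticsSource only exists from Hadoop 3.3.0 onwards; if it is
            // missing while hadoop-aws 3.3.1 is present, the versions are mixed.
            try {
                Class.forName("org.apache.hadoop.fs.statistics.IOStatisticsSource");
                System.out.println("IOStatisticsSource found (Hadoop 3.3.x classes present).");
            } catch (ClassNotFoundException e) {
                System.out.println("IOStatisticsSource missing (pre-3.3 Hadoop classes).");
            }
        }
    }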

Questions:

Which versions of Hadoop and the AWS Hadoop dependencies are compatible with Spark 3.1.2? Is there a conflict between the Hadoop versions declared in my dependencies? Should I remove the hadoop-aws dependency if I'm not using S3?

I've already tried updating the dependencies, removing hadoop-aws, and configuring serialization, but the problem persists.

Any guidance on resolving this, or insight into which versions of these dependencies are compatible with each other, would be much appreciated.

java apache-spark hadoop delta-lake