Class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x13d73fa) cannot access class sun.nio.ch.DirectBuffer

Problem description

I am trying to run a Spark program inside a Spring Boot (Maven) project in IDEA, but I ran into the following problem:

The log output:

org.apache.spark.storage.BlockManagerMasterEndpoint -- Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
10:52:23.594 [main] INFO org.apache.spark.storage.BlockManagerMasterEndpoint -- BlockManagerMasterEndpoint up
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x13d73fa) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x13d73fa
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:353)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:290)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:339)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:279)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:464)
    at SparkML.sparkML(SparkML.scala:33)
    at Demo$.main(Demo.scala:4)

Here is my complete list of dependencies:

    <properties>
        <java.version>17</java.version>
        <spark.version>3.5.0</spark.version>
        <scala.version>2.12.13</scala.version>
        <hadoop.version>3.3.2</hadoop.version>
    </properties>
    <dependencies>
        <!-- Spark dependencies -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>3.3.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>3.3.2</version>
        </dependency>

        <!-- Scala dependency -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.12.15</version>
        </dependency>
        <!-- Hadoop dependencies -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>

        <!-- Apache Spark MLlib -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.12</artifactId>
            <version>3.3.2</version>
        </dependency>



        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-to-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter</artifactId>
            <version>3.0.3</version>
        </dependency>

        <dependency>
            <groupId>com.mysql</groupId>
            <artifactId>mysql-connector-j</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter-test</artifactId>
            <version>3.0.3</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-spring-boot3-starter</artifactId>
            <version>3.5.5</version>
        </dependency>
        <!-- Druid data source -->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>druid-spring-boot-starter</artifactId>
            <version>1.2.6</version>
        </dependency>

    </dependencies>

I then searched for this problem and found that it might be caused by a conflict between Spark and the Java 17 that my Spring Boot project uses.

First, I bumped the Spark version from 3.2.3 to 3.5.0 (according to the Apache documentation, Spark 3.5.0 runs on Java 8/11/17 and Scala 2.12/2.13). The problem persisted nonetheless.
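Note that in the POM above, the Spark artifacts pin version 3.3.2 directly instead of referencing the ${spark.version} property, so changing the property alone does not actually upgrade anything. For a 3.5.0 upgrade to take effect, each Spark dependency would have to reference the property, roughly like this:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <!-- resolves to 3.5.0 via the spark.version property -->
            <version>${spark.version}</version>
        </dependency>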

After that, I tried the approach others suggested in similar Stack Overflow questions and added VM options to my run configuration in IDEA (the options are listed below).

These are the VM options:

--add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
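These flags only apply to the JVM they are passed to, so they have to be repeated wherever a new JVM is forked. For example, if the Spark code is also exercised by Maven-run tests, the same list can be forwarded through the surefire argLine; a minimal sketch, assuming the standard maven-surefire-plugin:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <configuration>
                <!-- forwarded to the forked test JVM; extend with the full flag list above as needed -->
                <argLine>--add-opens=java.base/sun.nio.ch=ALL-UNNAMED</argLine>
            </configuration>
        </plugin>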
Tags: apache-spark · java-8 · java-17 · java-module · java-platform-module-system
1 Answer

For the package highlighted in the exception, you already have an --add-opens entry among your VM arguments. What you need instead is --add-exports, since the exception says:

module java.base does not export sun.nio.ch to unnamed module

Adding an argument like the one below should help you (at least temporarily):

--add-exports java.base/sun.nio.ch=ALL-UNNAMED
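Also note that flags added in the IDEA run configuration only apply to runs started from IDEA. If the application is launched through Maven instead (e.g. mvn spring-boot:run), the flag has to reach that forked JVM too; a minimal sketch, assuming the standard spring-boot-maven-plugin:

        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <configuration>
                <!-- passed to the JVM forked by spring-boot:run -->
                <jvmArguments>--add-exports=java.base/sun.nio.ch=ALL-UNNAMED</jvmArguments>
            </configuration>
        </plugin>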

More about the difference between the two can be found here. (In short: --add-exports makes a package's public types accessible to other modules, while --add-opens additionally opens the package to deep reflection on its non-public members.)
