Spark submit: these logs are generated without a level


I run Spark on Kubernetes and the Spark application's logs are shipped to Datadog. The logs are generated and shipped correctly, except for the first 4 lines printed right after spark-submit:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.2.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

Since these lines have no level attribute attached, Datadog interprets them as errors, which is rather annoying.

The rest of the logs are sent by Log4j with the message's level attached, like this:

{
  endOfBatch: false
  instant: {
    epochMillisecond: 1686055630113.133
    epochSecond: 1686055630
    nanoOfSecond: 113133000
  }
  level: INFO
  loggerFqcn: org.apache.logging.slf4j.Log4jLogger
  loggerName: com.package.Main
  thread: main
  threadId: 1
  threadPriority: 5
}

So, how can I declare an INFO level for those first 4 lines in the logs?

apache-spark logging log4j
1 Answer

These warnings appear because you are running Spark 3.x with Java 11. They are emitted on stderr by the JVM itself rather than through Log4j, which is why they carry no level. There are two possible "easy" solutions:

  1. Downgrade the Java version from 11 to 8.

  2. Upgrade the Spark version to 3.3+ (see the sketch below).
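
For the second option, here is a minimal sketch of what the submit command might look like on Kubernetes, assuming a Spark 3.3+ container image; the API server address, namespace, registry, image tag and application jar path are placeholders, not taken from the question:

spark-submit \
  --master k8s://https://<k8s-apiserver-host>:443 \
  --deploy-mode cluster \
  --name spark-app \
  --class com.package.Main \
  --conf spark.kubernetes.namespace=default \
  --conf spark.kubernetes.container.image=my-registry/spark:3.3.2 \
  local:///opt/spark/jars/app.jar

Spark 3.3+ passes the JVM module options (--add-opens) needed for newer JDKs when it launches the driver and executors, which is why these reflective-access warnings should no longer show up.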
