Typesafe Config not resolved in sbt-assembly jar

Question (votes: 1, answers: 1)

I use Typesafe Config to load application.conf in my Scala class. The configuration file lives in src/main/resources and is resolved correctly with:

 val config = ConfigFactory.load()
 config.resolve()
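As an aside (not the cause of the problem): Config is immutable, so resolve() returns a new, resolved Config rather than modifying the one it is called on, and ConfigFactory.load() already resolves substitutions by default. A minimal sketch of the usual chained form:

```scala
import com.typesafe.config.{Config, ConfigFactory}

// Config is immutable: resolve() returns a new, resolved Config,
// so keep the return value instead of discarding it.
val config: Config = ConfigFactory.load().resolve()
```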

However, when I try to run the jar file with spark-submit, the configuration file cannot be found. I package the jar with sbt assembly. How can I fix this behavior? Here is my build.sbt, including the merge strategy and dependencies:

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.9.5"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.5"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.5"
dependencyOverrides += "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % "it, test"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
resolvers += "confluent" at "http://packages.confluent.io/maven/"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-hive" % sparkVersion,
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % "it, test",
  "org.scalacheck" %% "scalacheck" % "1.14.0" % "it, test",
  "io.kubernetes" % "client-java" % "3.0.0" % "it",
  "org.json" % "json" % "20180813",
  "io.circe" %% "circe-core" % circeVersion,
  "io.circe" %% "circe-generic" % circeVersion,
  "io.circe" %% "circe-parser" % circeVersion,
  "org.apache.avro" % "avro" % "1.8.2",
  "org.apache.spark" %% "spark-avro" % "2.4.0",
  "org.apache.logging.log4j" % "log4j-core" % "2.7",
  "org.apache.logging.log4j" % "log4j-api" % "2.7",
  "io.prestosql" % "presto-jdbc" % "304" % "it, test",
  "com.facebook.presto" % "presto-jdbc" % "0.217" % "it, test",
  "com.microsoft.azure" % "azure-sqldb-spark" % "1.0.2",
  "com.microsoft.sqlserver" % "mssql-jdbc" % "7.2.0.jre8",
  "com.typesafe" % "config" % "1.3.3"
)

assemblyJarName in assembly := "spark_mssql_job_avro.jar"
mainClass in assembly := Some("main.SparkJob")

assemblyMergeStrategy in assembly := {
  case x if x.endsWith(".conf") => MergeStrategy.discard
  case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.first
  case PathList("org", "apache", "commons", "logging", _*) => MergeStrategy.first
  case PathList("org", "apache", "commons", "beanutils", _*) => MergeStrategy.first
  case PathList("org", "apache", "commons", "collections", _*) => MergeStrategy.first
  case PathList("org", "apache", "hadoop", "yarn", _*) => MergeStrategy.first
  case PathList("org", "aopalliance", _*) => MergeStrategy.first
  case PathList("org", "objenesis", _*) => MergeStrategy.first
  case PathList("com", "sun", "jersey", _*) => MergeStrategy.first
  case PathList("org", "slf4j", "impl", _*) => MergeStrategy.first
  case PathList("com", "codahale", "metrics", _*) => MergeStrategy.first
  case PathList("javax", "transaction", _*) => MergeStrategy.first
  case PathList("javax", "inject", _*) => MergeStrategy.first
  case PathList("javax", "xml", _*) => MergeStrategy.first
  case PathList("META-INF", "jersey-module-version") => MergeStrategy.first
  case PathList("plugin.xml") => MergeStrategy.first
  case PathList("parquet.thrift") => MergeStrategy.first
  case PathList("git.properties") => MergeStrategy.first
  case PathList("codegen", "config.fmpp") => MergeStrategy.first
  case PathList("overview.html") => MergeStrategy.discard
  case "application.conf" => MergeStrategy.concat
  case x => (assemblyMergeStrategy in assembly).value(x)
}
Tags: scala, apache-spark, sbt, sbt-assembly
1 Answer (score: 2)

You are discarding every .conf file (the merge strategy kicks in whenever more than one jar contains a file with the same path):

case x if x.endsWith(".conf") => MergeStrategy.discard
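This case wins even though the more specific `case "application.conf" => MergeStrategy.concat` appears further down, because Scala pattern matches are tried top to bottom and the first matching case is used. A minimal sketch (strategyFor is a hypothetical stand-in for the merge strategy function):

```scala
// Cases are tried top-down; the first match wins, so the generic
// ".conf" guard shadows the more specific case below it.
def strategyFor(path: String): String = path match {
  case x if x.endsWith(".conf") => "discard" // matches application.conf too
  case "application.conf"       => "concat"  // unreachable
  case _                        => "deduplicate"
}

println(strategyFor("application.conf")) // prints "discard", never "concat"
```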

Removing that line should fix the problem. Just in case, look inside the jar afterwards and check that the file contains what you expect.
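That check can be automated with a small sketch that uses only the JDK's zip support (the jar path below is the assemblyJarName from the build; adjust it to your actual output directory):

```scala
import java.util.zip.ZipFile
import scala.collection.JavaConverters._

// List every .conf entry packed into a jar, to verify that
// application.conf actually made it into the assembly.
def listConfEntries(jarPath: String): List[String] = {
  val jar = new ZipFile(jarPath)
  try jar.entries().asScala.map(_.getName).filter(_.endsWith(".conf")).toList
  finally jar.close()
}

// e.g. listConfEntries("target/scala-2.11/spark_mssql_job_avro.jar")
```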
