What is the path of an external file on the cluster after submitting a Spark job via Livy?


I am submitting a Spark job using the Livy batch API, as shown below. Here I pass a .p12 keystore via the files parameter; the application later uses it for SSL communication.

{
   "className":"com.StreamingMain",
   "name":"StreamingMain.single",
   "conf":{
      "spark.yarn.submit.waitAppCompletion":"false",
      "spark.hadoop.fs.azure.enable.flush":"false",
      "spark.executorEnv.CLUSTER":"dev",
      "spark.executorEnv.NAME_SPACE":"dev",
      "spark.executorEnv.AZURE_ACCOUNT_NAME":"istoragedev",
      "spark.executorEnv.KAFKA_HOST":"",
      "spark.executorEnv.KAFKA_PORT":"",
      "spark.executorEnv.KAFKA_USER":"",
      "spark.executorEnv.KAFKA_PASSWD":"+++",
      "spark.executorEnv.HANA_DATA_LAKE_FILE_SYSTEM_URI":"",
      "spark.executorEnv.HANA_DATA_LAKE_PK12_LOCATION":"",
      "spark.executorEnv.HANA_DATA_LAKE_PASSWORD":"/vv8Mg==",
      "spark.sql.legacy.parquet.int96RebaseModeInRead":"LEGACY",
      "spark.sql.legacy.parquet.int96RebaseModeInWrite":"LEGACY",
      "spark.sql.legacy.parquet.datetimeRebaseModeInRead":"LEGACY",
      "spark.sql.legacy.parquet.datetimeRebaseModeInWrite":"LEGACY",
      "spark.sql.legacy.timeParserPolicy":"LEGACY"
   },
   "args":[
      "abfs://streaming/cs-dev.cs-dev.json"
   ],
   "driverMemory":"2g",
   "executorMemory":"12g",
   "driverCores":1,
   "executorCores":8,
   "numExecutors":1,
   "jars":[
      "abfs://dp/dp.jar"
   ],
   "file":"abfs://dp/dp.jar",
   "files":[
      "/app/pk12/client-keystore.p12"
   ]
}
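
For reference, a payload like this is POSTed to Livy's /batches endpoint. A minimal sketch in Python, assuming the requests library is available; the Livy host/port and the payload file name (batch.json) are placeholders:

import json
import requests  # assumes the requests library is installed

LIVY_URL = "http://livy-host:8998"  # placeholder Livy host/port

# Load the batch definition shown above and submit it to Livy's batch API.
with open("batch.json") as f:
    payload = json.load(f)

resp = requests.post(
    f"{LIVY_URL}/batches",
    json=payload,
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
print("batch id:", resp.json()["id"])  # Livy returns the created batch session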

My question is: will client-keystore.p12 be copied to the Spark cluster? If so, what is its file path, i.e., which location is it copied to, and how can I find it from inside the application?
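For context, files passed via the files field follow the same semantics as spark-submit --files: Spark stages them into each container's working directory. A minimal PySpark sketch (assuming those standard semantics, not something specific to Livy) that resolves the localized path:

from pyspark import SparkFiles
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("keystore-path-check").getOrCreate()

# Files shipped via "files" are distributed like spark-submit --files.
# SparkFiles.get() maps the bare file name to its localized absolute path.
local_path = SparkFiles.get("client-keystore.p12")
print("client-keystore.p12 localized at:", local_path)

spark.stop()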

Any help would be appreciated.

apache-spark spark-streaming spark-submit livy