Writing a Spark DataFrame from Databricks to Azure Data Lake Store using R

Question · Votes: 1 · Answers: 1

I want to save/write/upload a Spark DataFrame from Databricks to an Azure Data Lake Store folder using R. I found the following Python code:

spark_df.coalesce(1).write.format("com.databricks.spark.csv").option("header", "true").mode("overwrite").save('...path to azure data lake store folder')

Can you show me the SparkR equivalent of this code?

r apache-spark sparkr
1 Answer
1
vote

This should be:

library(SparkR)   # Provides coalesce() and write.df()
library(magrittr) # Provides the %>% pipe, which SparkR does not export

spark_df %>% 
  coalesce(1L) %>%          # Same as coalesce(1) in Python
  write.df(                 # Generic writer; there is no CSV-specific one in SparkR
    "...path to azure...",  # Path as before 
    source = "csv",         # Since Spark 2.0 you don't need com.databricks 
    mode = "overwrite", 
    header = "true"         # Everything passed via ... is used as an option
  )