Spark - map a flat dataframe to a configurable nested JSON schema

Question · 0 votes · 3 answers

I have a flat dataframe with 5-6 columns. I want to nest them, converting it into a nested dataframe, so that I can write it out in Parquet format.

However, I don't want to use a case class, because I'm trying to keep the code as configurable as possible. I'm stuck on this part and need some help.

My input:

ID  ID-2  Count(apple)  Count(banana)  Count(potato)  Count(Onion)
1   23    1             0              2              0
2   23    0             1              0              1
2   29    1             0              1              0

My output:

Row 1:

{
  "id": 1,
  "ID-2": 23,
  "fruits": {
    "count of apple": 1,
    "count of banana": 0
  },
  "vegetables": {
    "count of potato": 2,
    "count of onion": 0
  }
} 

I tried using the `map` function on the Spark dataframe, mapping each row to a case class. However, I have to work with whatever field names come in, and they may also change over time.

I don't want to maintain a case class and map rows to SQL column names, because that would require a code change every time.

I'm considering maintaining a HashMap from the dataframe's column names to the output names I want to keep. For example, in the sample above I would map "Count(apple)" to "count of apple". However, I can't think of a simple way to pass my schema in as configuration and then apply that mapping in my code.

json scala apache-spark case-class
3 Answers

2 votes

Here is one approach that uses Scala's `Map` type to build the column mapping, starting from the following dataset:

// assumes: import spark.implicits._ (needed for .toDF)
val data = Seq(
  (1, 23, 1, 0, 2, 0),
  (2, 23, 0, 1, 0, 1),
  (2, 29, 1, 0, 1, 0)
).toDF("ID", "ID-2", "count(apple)", "count(banana)", "count(potato)", "count(onion)")

First, we declare the mapping as a `scala.collection.immutable.Map` and define the function responsible for the renaming:

import org.apache.spark.sql.{Column, DataFrame}

val colMapping = Map(
  "count(banana)" -> "no of banana",
  "count(apple)"  -> "no of apples",
  "count(potato)" -> "no of potatos",
  "count(onion)"  -> "no of onions")

def mapColumns(colsMapping: Map[String, String], df: DataFrame): DataFrame = {
  val mapping = df
    .columns
    .map { c => if (colsMapping.contains(c)) df(c).alias(colsMapping(c)) else df(c) }
    .toList

  df.select(mapping: _*)
}
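To keep the renaming fully configurable, the same `Map` could be parsed from external configuration instead of being hard-coded. A minimal sketch, assuming a simple `old=new` properties-style text (the format, and loading it from a file, are assumptions, not part of the original answer):

```scala
// Hypothetical config text; in practice this might come from a file
// via scala.io.Source or a real configuration library.
val configText =
  """count(apple)=no of apples
    |count(banana)=no of banana
    |count(potato)=no of potatos
    |count(onion)=no of onions""".stripMargin

// Parse each "old=new" line into the same Map[String, String] shape
val parsedMapping: Map[String, String] = configText
  .split("\n")
  .map(_.split("=", 2))
  .collect { case Array(k, v) => k.trim -> v.trim }
  .toMap
```

Swapping in different column names then only requires editing the config text, not the code.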

The function iterates over the columns of the given dataframe and uses `colsMapping` to find the columns whose names appear as keys. It then returns the columns with their names changed (via `alias`) wherever the mapping applies.

Output of `mapColumns(colMapping, df).show(false)`:

+---+----+------------+------------+-------------+------------+
|ID |ID-2|no of apples|no of banana|no of potatos|no of onions|
+---+----+------------+------------+-------------+------------+
|1  |23  |1           |0           |2            |0           |
|2  |23  |0           |1           |0            |1           |
|2  |29  |1           |0           |1            |0           |
+---+----+------------+------------+-------------+------------+
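Stripped of the Spark `Column` wrapper, the per-column decision inside `mapColumns` is plain Scala. A self-contained sketch of the same rename-or-keep logic over the example's column names:

```scala
val colMapping = Map(
  "count(banana)" -> "no of banana",
  "count(apple)"  -> "no of apples",
  "count(potato)" -> "no of potatos",
  "count(onion)"  -> "no of onions")

val columns = Seq("ID", "ID-2", "count(apple)", "count(banana)",
                  "count(potato)", "count(onion)")

// Keep unmapped names as-is, rename the ones present in the mapping
val renamed = columns.map(c => colMapping.getOrElse(c, c))
// renamed: List(ID, ID-2, no of apples, no of banana, no of potatos, no of onions)
```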

Finally, we build the `fruits` and `vegetables` groups as `struct` columns (`df1` below is the dataframe returned by `mapColumns`):

import org.apache.spark.sql.functions.{col, struct}

val df1 = mapColumns(colMapping, df)

df1.withColumn("fruits", struct(col(colMapping("count(banana)")), col(colMapping("count(apple)"))))
  .withColumn("vegetables", struct(col(colMapping("count(potato)")), col(colMapping("count(onion)"))))
  .drop(colMapping.values.toList: _*)
  .toJSON
  .show(false)

Note that once the transformation is done, we drop all of the flat columns named in `colMapping`'s values.

Output:

+-----------------------------------------------------------------------------------------------------------------+
|value                                                                                                            |
+-----------------------------------------------------------------------------------------------------------------+
|{"ID":1,"ID-2":23,"fruits":{"no of banana":0,"no of apples":1},"vegetables":{"no of potatos":2,"no of onions":0}}|
|{"ID":2,"ID-2":23,"fruits":{"no of banana":1,"no of apples":0},"vegetables":{"no of potatos":0,"no of onions":1}}|
|{"ID":2,"ID-2":29,"fruits":{"no of banana":0,"no of apples":1},"vegetables":{"no of potatos":1,"no of onions":0}}|
+-----------------------------------------------------------------------------------------------------------------+

0 votes

In Scala, `::` (double colon) is the list "cons" operator. It is how you create a Scala `List`, or prepend an element to an existing list; since Scala lists are immutable, prepending returns a new list rather than modifying the old one.

scala> val aList = 24 :: 34 :: 56 :: Nil
aList: List[Int] = List(24, 34, 56)

scala> 99 :: aList
res3: List[Int] = List(99, 24, 34, 56)

In the first example, `Nil` is the empty list and serves as the tail of the right-most cons operation.
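Because `::` ends in a colon, it is right-associative: it binds as a method call on its right-hand operand, so the chained form above desugars like this:

```scala
val sugar     = 24 :: 34 :: 56 :: Nil
// Equivalent desugared form: :: is a method on the list to its right
val desugared = Nil.::(56).::(34).::(24)
// Both are List(24, 34, 56)
```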

However:

scala> val anotherList = 23 :: 34
<console>:12: error: value :: is not a member of Int
       val anotherList = 23 :: 34

This raises an error because there is no existing list to prepend onto.
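A minimal fix is to terminate the chain with an existing list (`Nil` for a fresh one):

```scala
// Ending the chain with Nil gives :: a list to prepend onto
val anotherList = 23 :: 34 :: Nil
// anotherList: List(23, 34)
```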


0 votes
import org.apache.spark.sql.functions.{collect_list, struct}

val df = spark.read.option("header", "true").csv("/sampleinput.txt")

val df1 = df
  .withColumn("fruits", struct("Count(apple)", "Count(banana)"))
  .withColumn("vegetables", struct("Count(potato)", "Count(Onion)"))
  .groupBy("ID", "ID-2")
  .agg(collect_list("fruits") as "fruits", collect_list("vegetables") as "vegetables")
  .toJSON

df1.take(1)

Output:

{"ID":"2","ID-2":"23","fruits":[{"Count(apple)":"0","Count(banana)":"1"}],"vegetables":[{"Count(potato)":"0","Count(Onion)":"1"}]}