Get a summary of the whole dataset, or of a single column, in Apache Spark (Java/Scala)


For the dataset below, to get a per-Col1 total summary I did:

import org.apache.spark.sql.functions._
val totaldf = df.groupBy("Col1").agg(lit("Total").as("Col2"), sum("price").as("price"), sum("displayPrice").as("displayPrice"))

and then unioned it back in:

df.union(totaldf).orderBy(col("Col1"), col("Col2").desc).show(false)

df:

+-----------+-------+--------+--------------+
|   Col1    | Col2  | price  | displayPrice |
+-----------+-------+--------+--------------+
| Category1 | item1 |     15 |           14 |
| Category1 | item2 |     11 |           10 |
| Category1 | item3 |     18 |           16 |
| Category2 | item1 |     16 |           15 |
| Category2 | item2 |     11 |           10 |
| Category2 | item3 |     19 |           17 |
+-----------+-------+--------+--------------+

After the union:

+-----------+-------+-------+--------------+
|   Col1    | Col2  | price | displayPrice |
+-----------+-------+-------+--------------+
| Category1 | Total |    44 |           40 |
| Category1 | item1 |    15 |           14 |
| Category1 | item2 |    11 |           10 |
| Category1 | item3 |    18 |           16 |
| Category2 | Total |    46 |           42 |
| Category2 | item1 |    16 |           15 |
| Category2 | item2 |    11 |           10 |
| Category2 | item3 |    19 |           17 |
+-----------+-------+-------+--------------+

Now I also want a summary row for the whole dataset, with Total in Col1, aggregated over every Col1 and Col2 value, like this:

+-----------+-------+-------+--------------+
|   Col1    | Col2  | price | displayPrice |
+-----------+-------+-------+--------------+
| Total     | Total |    90 |           82 |
| Category1 | Total |    44 |           40 |
| Category1 | item1 |    15 |           14 |
| Category1 | item2 |    11 |           10 |
| Category1 | item3 |    18 |           16 |
| Category2 | Total |    46 |           42 |
| Category2 | item1 |    16 |           15 |
| Category2 | item2 |    11 |           10 |
| Category2 | item3 |    19 |           17 |
+-----------+-------+-------+--------------+

How can I achieve the above?

apache-spark apache-spark-sql apache-spark-dataset
1 Answer

Create a third dataframe from totaldf:

val finalTotalDF= totaldf.select(lit("Total").as("Col1"), lit("Total").as("Col2"), sum("price").as("price"), sum("displayPrice").as("displayPrice"))

and then include it in the union:

df.union(totaldf).union(finalTotalDF).orderBy(col("Col1"), col("Col2").desc).show(false)

and you should have the final dataframe you need.
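As an aside, Spark can produce the same per-category subtotals and the grand total in a single pass with `rollup`, instead of building and unioning three dataframes. A minimal self-contained sketch, assuming sample data matching the question's tables (the `local[*]` session and the sample rows are assumptions for the demo; `rollup` emits null grouping keys for the subtotal and grand-total rows, which are relabelled here as "Total"):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[*]").appName("rollup-demo").getOrCreate()
import spark.implicits._

// Hypothetical sample data matching the question's df
val df = Seq(
  ("Category1", "item1", 15, 14), ("Category1", "item2", 11, 10), ("Category1", "item3", 18, 16),
  ("Category2", "item1", 16, 15), ("Category2", "item2", 11, 10), ("Category2", "item3", 19, 17)
).toDF("Col1", "Col2", "price", "displayPrice")

// rollup(Col1, Col2) emits: one row per (Col1, Col2) pair, one subtotal row
// per Col1 (Col2 is null), and one grand-total row (both keys null)
val summary = df.rollup("Col1", "Col2")
  .agg(sum("price").as("price"), sum("displayPrice").as("displayPrice"))
  // replace the null grouping keys with the "Total" labels the question uses
  .na.fill("Total", Seq("Col1", "Col2"))

summary.show(false)
```

The ordering of the resulting rows is still up to you (for example with the sort-column trick shown later in this answer); `rollup` only saves the extra aggregation passes and unions.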

Update

If the ordering matters to you, change the Total label in Col2 to lowercase total. Lowercase letters sort after uppercase ones in the default string ordering, so a descending sort on Col2 then puts the total rows above the item rows:

import org.apache.spark.sql.functions._
val totaldf = df.groupBy("Col1").agg(lit("total").as("Col2"), sum("price").as("price"), sum("displayPrice").as("displayPrice"))
val finalTotalDF= totaldf.select(lit("Total").as("Col1"), lit("total").as("Col2"), sum("price").as("price"), sum("displayPrice").as("displayPrice"))
df.union(totaldf).union(finalTotalDF).orderBy(col("Col1").desc, col("Col2").desc).show(false)

You should get:

+---------+-----+-----+------------+
|Col1     |Col2 |price|displayPrice|
+---------+-----+-----+------------+
|Total    |total|90   |82          |
|Category2|total|46   |42          |
|Category2|item3|19   |17          |
|Category2|item2|11   |10          |
|Category2|item1|16   |15          |
|Category1|total|44   |40          |
|Category1|item3|18   |16          |
|Category1|item2|11   |10          |
|Category1|item1|15   |14          |
+---------+-----+-----+------------+

If instead the ordering mentioned in the comments is what matters to you ("I want the total data to take priority, so I want it at the top; that is actually my requirement"), then you can create an additional column just for sorting:

import org.apache.spark.sql.functions._
val totaldf = df.groupBy("Col1").agg(lit("Total").as("Col2"), sum("price").as("price"), sum("displayPrice").as("displayPrice"), lit(1).as("sort"))
val finalTotalDF= totaldf.select(lit("Total").as("Col1"), lit("Total").as("Col2"), sum("price").as("price"), sum("displayPrice").as("displayPrice"), lit(0).as("sort"))
finalTotalDF.union(totaldf).union(df.withColumn("sort", lit(2))).orderBy(col("sort"), col("Col1"), col("Col2")).drop("sort").show(false)

You should get:

+---------+-----+-----+------------+
|Col1     |Col2 |price|displayPrice|
+---------+-----+-----+------------+
|Total    |Total|90   |82          |
|Category1|Total|44   |40          |
|Category2|Total|46   |42          |
|Category1|item1|15   |14          |
|Category1|item2|11   |10          |
|Category1|item3|18   |16          |
|Category2|item1|16   |15          |
|Category2|item2|11   |10          |
|Category2|item3|19   |17          |
+---------+-----+-----+------------+
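One caveat with this pattern: `union` resolves columns by position, not by name, so the extra sort column has to occupy the same position in all three dataframes. On Spark 2.3+, `unionByName` matches columns by name and avoids that pitfall. A self-contained sketch of the same sort-column approach, assuming sample data matching the question's tables (the `local[*]` session and the sample rows are assumptions for the demo):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[*]").appName("union-by-name").getOrCreate()
import spark.implicits._

// Hypothetical sample data matching the question's df
val df = Seq(
  ("Category1", "item1", 15, 14), ("Category1", "item2", 11, 10), ("Category1", "item3", 18, 16),
  ("Category2", "item1", 16, 15), ("Category2", "item2", 11, 10), ("Category2", "item3", 19, 17)
).toDF("Col1", "Col2", "price", "displayPrice")

// per-category subtotals, tagged sort = 1
val totaldf = df.groupBy("Col1")
  .agg(lit("Total").as("Col2"), sum("price").as("price"),
       sum("displayPrice").as("displayPrice"), lit(1).as("sort"))

// grand total over the subtotals, tagged sort = 0
val finalTotalDF = totaldf.select(
  lit("Total").as("Col1"), lit("Total").as("Col2"),
  sum("price").as("price"), sum("displayPrice").as("displayPrice"), lit(0).as("sort"))

// unionByName (Spark 2.3+) matches columns by name, not position, so a
// reordered schema in any input cannot silently scramble the values
val combined = finalTotalDF
  .unionByName(totaldf)
  .unionByName(df.withColumn("sort", lit(2)))
  .orderBy(col("sort"), col("Col1"), col("Col2"))
  .drop("sort")

combined.show(false)
```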