Create a new column from an existing column with concatenated values in a Spark DataFrame


I want to create a new column in my DataFrame based on the conditions below.

My DataFrame looks like this:

my_string
2020 test
2020 prod
2020 dev

My conditions:

value1 = the string after the space in my_string

value2 = the first four digits of my_string

If value1 contains the string 'test', then new_col = value2 + "01"

If value1 contains the string 'prod', then new_col = value2 + "kk"

If value1 contains the string 'dev', then new_col = value2 + "ff"

I need a result like this:

my_string  | new_col
-----------|--------
2020 test  | 202001
2020 prod  | 2020kk
2020 dev   | 2020ff

Can anyone help me?

apache-spark pyspark apache-spark-sql spark-streaming
1 Answer
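
For a self-contained test, here is a minimal setup sketch (assuming an active SparkSession; the names spark and df below are illustrative) that builds the sample DataFrame used in the snippets that follow:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# sample data from the question
df = spark.createDataFrame(
    [("2020 test",), ("2020 prod",), ("2020 dev",)],
    ["my_string"],
)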

Use the row_number window function with monotonically_increasing_id(). Note that this appends a zero-padded sequential row number to the year, so it reproduces the desired output only for the first row; see the update below for the condition-based version.

from pyspark.sql import Window
from pyspark.sql.functions import col, concat, lpad, monotonically_increasing_id, row_number, split

# order rows by a monotonically increasing id to preserve input order
w = Window.orderBy(monotonically_increasing_id())

# year (the part before the space) + zero-padded sequential row number
df.withColumn("new_col", concat(split(col("my_string")," ")[0], lpad(row_number().over(w), 2, "0"))).show()

#+---------+-------+
#|my_string|new_col|
#+---------+-------+
#|2020 test| 202001|
#|2020 prod| 202002|
#| 2020 dev| 202003|
#+---------+-------+

UPDATE:

Using a when + otherwise expression:

df.withColumn("dyn_col",when(lower(split(col("my_string")," ")[1]) =="prod","kk").\
when(lower(split(col("my_string")," ")[1]) =="dev","ff").\
when(lower(split(col("my_string")," ")[1]) =="test","01").\
otherwise("null")).\
withColumn("new_col",concat(split(col("my_string")," ")[0], col("dyn_col"))).\
drop("dyn_col").\
show()
#+---------+-------+
#|my_string|new_col|
#+---------+-------+
#|2020 test| 202001|
#|2020 prod| 2020kk|
#| 2020 dev| 2020ff|
#+---------+-------+
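
If you have many label-to-suffix pairs, a literal map column keeps the logic data-driven instead of growing the when chain. A sketch under the same assumptions (the suffixes dict and variable names are illustrative):

from itertools import chain
from pyspark.sql.functions import col, concat, create_map, element_at, lit, split

# hypothetical mapping of labels to suffixes
suffixes = {"test": "01", "prod": "kk", "dev": "ff"}

# build a map literal: map('test','01','prod','kk','dev','ff')
mapping = create_map(*[lit(x) for x in chain.from_iterable(suffixes.items())])

parts = split(col("my_string"), " ")
# element_at returns NULL for labels missing from the map
df.withColumn("new_col", concat(parts[0], element_at(mapping, parts[1]))).show()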

In Scala:

df.withColumn("dyn_col",when(lower(split(col("my_string")," ")(1)) ==="prod","kk").
when(lower(split(col("my_string")," ")(1)) ==="dev","ff").
when(lower(split(col("my_string")," ")(1)) ==="test","01").
otherwise("null")).
withColumn("new_col",concat(split(col("my_string")," ")(0), col("dyn_col"))).
drop("dyn_col").
show()

//+---------+-------+
//|my_string|new_col|
//+---------+-------+
//|2020 test| 202001|
//|2020 prod| 2020kk|
//| 2020 dev| 2020ff|
//+---------+-------+
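
If the input is less regular than a single space-separated pair (extra whitespace, for example), regexp_extract can pull out the two pieces directly, matching the question's wording of "first four digits" and "string after the space". A PySpark sketch, not part of the original answer:

from pyspark.sql.functions import col, concat, lower, regexp_extract, when

year = regexp_extract(col("my_string"), r"^(\d{4})", 1)           # first four digits
label = lower(regexp_extract(col("my_string"), r"(\S+)\s*$", 1))  # last token

# rows with an unrecognised label get NULL, since the when chain has no otherwise
df.withColumn(
    "new_col",
    concat(year, when(label == "test", "01")
                 .when(label == "prod", "kk")
                 .when(label == "dev", "ff")),
).show()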