Writing the result of a Spark SQL query into a DataFrame with Scala fails in Databricks


Simply running this Spark SQL query in Databricks works fine:

%sql
select CONCAT(`tsArr[1]`,"-", `tsArr[0]`,"-", `tsArr[2]`," ", `tsArr[3]`) as time,
  cast (context._function as string) as funct, 
  cast (context._param as string) as param, 
  cast(context._value as string) as value from clickstreamDF
  lateral view explode(Context) as context

This is the output:

time                funct   param           value
11-27-2017 08:20:33 Open    location        3424
11-27-2017 08:20:33 Open    Company Id      testinc
11-27-2017 08:20:33 Open    Channel Info    1
11-27-2017 08:20:33 Open    UserAgent       jack
11-27-2017 08:20:33 Open    Language        english

But when I try to put the query result into a DataFrame like this:

%scala    
val df_header = spark.sql(s"select CONCAT(`tsArr[1]`,"-", `tsArr[0]`,"-", `tsArr[2]`," ", `tsArr[3]`) as time,
  cast (context._function as string) as funct,
  cast (context._param as string) as param,
  cast(context._value as string) as value
  from clickstreamDF lateral view explode(Context) as context")

df_header.createOrReplaceTempView("clickstreamDF")

then it fails. It says:

error: ')' expected but string literal found.

I'm guessing it has to do with the "-" and " ". I have already tried replacing them with '' or ``, extending them, or leaving out the " entirely, but with no result. What am I doing wrong?

Regards,

D.

scala apache-spark apache-spark-sql databricks
1 Answer

To avoid ambiguity between the quote character (i.e. ") that encloses the entire Spark SQL string and the quotes used inside the SQL statement itself, use triple quotes (""") as the enclosing quotes. You also need to remove the backticks around tsArr[...], as in the following example:

import org.apache.spark.sql.functions._
import spark.implicits._

case class CT(_function: String, _param: String, _value: String)

// Sample data: an array of timestamp parts plus an array of Context structs
val clickstreamDF = Seq(
  (Seq("27", "11", "2017", "08:20:33"), Seq(CT("f1", "p1", "v1"), CT("f2", "p2", "v2"))),
  (Seq("28", "12", "2017", "09:30:44"), Seq(CT("f3", "p3", "v3")))
).toDF("tsArr", "contexts")

clickstreamDF.createOrReplaceTempView("clickstreamTable")

// Triple quotes let the SQL statement contain " literals without escaping
val df_header = spark.sql("""
  select
    concat(tsArr[1], "-", tsArr[0], "-", tsArr[2], " ", tsArr[3]) as time,
    cast(context._function as string) as funct,
    cast(context._param as string) as param,
    cast(context._value as string) as value
  from
    clickstreamTable lateral view explode(contexts) as context
""")

df_header.show
// +-------------------+-----+-----+-----+
// |               time|funct|param|value|
// +-------------------+-----+-----+-----+
// |11-27-2017 08:20:33|   f1|   p1|   v1|
// |11-27-2017 08:20:33|   f2|   p2|   v2|
// |12-28-2017 09:30:44|   f3|   p3|   v3|
// +-------------------+-----+-----+-----+
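
Alternatively, since Spark SQL also accepts single-quoted string literals, you could keep an ordinary double-quoted Scala string; here is a minimal sketch against the same clickstreamTable view (df_header2 is just an illustrative name):

// Alternative sketch: single-quoted SQL string literals avoid clashing
// with the double quotes that delimit the Scala string.
val df_header2 = spark.sql(
  "select concat(tsArr[1], '-', tsArr[0], '-', tsArr[2], ' ', tsArr[3]) as time, " +
  "cast(context._function as string) as funct, " +
  "cast(context._param as string) as param, " +
  "cast(context._value as string) as value " +
  "from clickstreamTable lateral view explode(contexts) as context")

df_header2.show  // should print the same rows as df_header above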

By the way, you might want to consider using the DataFrame API instead, since you already have the data in a DataFrame:

// Same result with the DataFrame API: build the time column from the
// timestamp parts and explode the contexts array into one row per element.
val df_header = clickstreamDF.
  withColumn("time",
    concat($"tsArr"(1), lit("-"), $"tsArr"(0), lit("-"), $"tsArr"(2), lit(" "), $"tsArr"(3))
  ).
  withColumn("context", explode($"contexts")).
  select($"time",
    $"context._function".cast("String").as("funct"),
    $"context._param".cast("String").as("param"),
    $"context._value".cast("String").as("value")
  )
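
If the rest of the notebook expects a temp view, this DataFrame-API result can be registered the same way; a small usage sketch (the view name is simply the one used in the question):

// Register the result and inspect it; the rows should match the SQL output above.
df_header.createOrReplaceTempView("clickstreamDF")
df_header.show(false)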