Array intersection in Spark SQL

Problem description · votes: 2 · answers: 2

I have a table with an array-typed column named writer, whose values look like array[value1, value2], array[value2, value3], and so on.

I am doing a self join to get rows whose arrays share common values. I tried:

sqlContext.sql("SELECT R2.writer FROM table R1 JOIN table R2 ON R1.id != R2.id WHERE ARRAY_INTERSECTION(R1.writer, R2.writer)[0] is not null ")

sqlContext.sql("SELECT R2.writer FROM table R1 JOIN table R2 ON R1.id != R2.id WHERE ARRAY_INTERSECT(R1.writer, R2.writer)[0] is not null ")

but both throw the same exception:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Undefined function: 'ARRAY_INTERSECT'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 80

It seems Spark SQL does not support ARRAY_INTERSECTION or ARRAY_INTERSECT. How can I achieve my goal in Spark SQL?

apache-spark apache-spark-sql spark-dataframe hiveql apache-spark-dataset
2 Answers
6 votes

You need a udf:

import org.apache.spark.sql.functions.udf

spark.udf.register("array_intersect", 
  (xs: Seq[String], ys: Seq[String]) => xs.intersect(ys))

and then check whether the intersection is empty:

scala> spark.sql("SELECT size(array_intersect(array('1', '2'), array('3', '4'))) = 0").show
+-----------------------------------------+
|(size(UDF(array(1, 2), array(3, 4))) = 0)|
+-----------------------------------------+
|                                     true|
+-----------------------------------------+


scala> spark.sql("SELECT size(array_intersect(array('1', '2'), array('1', '4'))) = 0").show
+-----------------------------------------+
|(size(UDF(array(1, 2), array(1, 4))) = 0)|
+-----------------------------------------+
|                                    false|
+-----------------------------------------+
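Applied to the original question, the registered UDF can be used inside the self join; a minimal sketch, assuming the table is named table with the id and writer columns from the question:

spark.sql("""
  SELECT R2.writer
  FROM table R1 JOIN table R2 ON R1.id != R2.id
  WHERE size(array_intersect(R1.writer, R2.writer)) > 0
""")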

4 votes

Since Spark 2.4 the array_intersect function can be used directly in SQL:

spark.sql(
  "SELECT array_intersect(array(1, 42), array(42, 3)) AS intersection"
).show
+------------+
|intersection|
+------------+
|        [42]|
+------------+

Dataset API:

import org.apache.spark.sql.functions.array_intersect
import spark.implicits._  // needed for toDF and the $ column syntax outside spark-shell

Seq((Seq(1, 42), Seq(42, 3)))
  .toDF("a", "b")
  .select(array_intersect($"a", $"b") as "intersection")
  .show
+------------+
|intersection|
+------------+
|        [42]|
+------------+
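The same can be done with a DataFrame self join instead of SQL; a minimal sketch, assuming df is a DataFrame with the id and writer columns from the question:

import org.apache.spark.sql.functions.{array_intersect, col, size}

// self join on id inequality, then keep rows whose writer arrays overlap
df.as("R1")
  .join(df.as("R2"), col("R1.id") =!= col("R2.id"))
  .where(size(array_intersect(col("R1.writer"), col("R2.writer"))) > 0)
  .select(col("R2.writer"))
  .show()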

Equivalent functions are also available in the other language APIs, for example pyspark.sql.functions.array_intersect in PySpark and array_intersect in SparkR.
