Caused by: java.lang.IllegalArgumentException: Can't get JDBC type for null

Problem description

I get the following error when loading a NULL value from Spark into the database. The datatype of the target table column is smallint:

Caused by: java.lang.IllegalArgumentException: Can't get JDBC type for null

Code:

val hivedata = spark.sql(s"""select 1 as column1 , B as column2 , NULL as column3 from table""")

hivedata.write.mode(SaveMode.Append).jdbc(url = con, table = targettable, connectionProperties = connectionProperties)

Can anyone help me?

scala apache-spark-sql
1 Answer

You have to use cast(NULL as smallint) ... this will cast the NULL to the smallint (Spark short) type, as shown below.

val df1 = spark.sql(
  "select 1 as column1, 2 as column2, cast(NULL as smallint) as column3 from table")
df1.show()
df1.printSchema()

Result:

+-------+-------+-------+
|column1|column2|column3|
+-------+-------+-------+
|      1|      2|   null|
+-------+-------+-------+

root
 |-- column1: integer (nullable = false)
 |-- column2: integer (nullable = false)
 |-- column3: short (nullable = true)

Otherwise, the column will be of NullType instead of smallint:

val df1 = spark.sql("select 1 as column1, 2 as column2, NULL as column3 from table")
df1.show()
df1.printSchema()

+-------+-------+-------+
|column1|column2|column3|
+-------+-------+-------+
|      1|      2|   null|
+-------+-------+-------+

root
 |-- column1: integer (nullable = false)
 |-- column2: integer (nullable = false)
 |-- column3: null (nullable = true)

That is why you are getting the exception: the JDBC writer cannot map a NullType column to a SQL type.
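
For reference, here is a minimal sketch of the same fix using the DataFrame API instead of SQL. The connection details (jdbcUrl, targettable, connectionProps) are placeholder names for illustration, not values from the original post; the point is that lit(null).cast(ShortType) gives the column a concrete short type that the JDBC writer can map to SMALLINT.

import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions.lit
import org.apache.spark.sql.types.ShortType

val spark = SparkSession.builder().appName("null-smallint").getOrCreate()

// Typed null: cast the NULL literal to ShortType so the column is short, not NullType.
val df = spark.range(1).select(
  lit(1).as("column1"),
  lit(2).as("column2"),
  lit(null).cast(ShortType).as("column3")   // maps to SMALLINT over JDBC
)

df.printSchema()   // column3: short (nullable = true)

// Placeholder connection details -- replace with real values.
val jdbcUrl = "jdbc:postgresql://host:5432/db"
val connectionProps = new java.util.Properties()

df.write.mode(SaveMode.Append).jdbc(jdbcUrl, "targettable", connectionProps)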
