I am using Spark SQL 3.0 with Scala 2.12. I insert data into an Iceberg table and read it back successfully. But when I try to delete a single bad record from the table through Spark SQL, the log shows an exception. Apache Iceberg issue 1444 on GitHub says that Iceberg supports row-level deletes in the previous release, so why does my delete fail? The main Iceberg version I use is 0.10.0; the org.apache.iceberg:iceberg-hive package is version 0.9.1. Please help! My Spark SQL code snippet is:
public static void deleteSingleDataWithoutCatalog3() {
    // Spark SQL configuration
    SparkConf sparkSQLConf = new SparkConf();
    // 'hadoop_prod' is the name of the catalog used to access the table
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog");
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod.type", "hadoop");
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://hadoop01:9000/warehouse_path/");
    sparkSQLConf.set("spark.sql.sources.partitionOverwriteMode", "dynamic");
    SparkSession spark = SparkSession.builder().config(sparkSQLConf).master("local[2]").getOrCreate();
    // String selectDataSQLALL = "select * from hadoop_prod.xgfying.booksSpark3 ";
    String deleteSingleDataSQL = "DELETE FROM hadoop_prod.xgfying.booksSpark3 WHERE price = 33";
    // spark.sql(deleteSingleDataSQL);
    spark.table("hadoop_prod.xgfying.booksSpark3").show();
    spark.sql(deleteSingleDataSQL);
    spark.table("hadoop_prod.xgfying.booksSpark3").show();
}
When the code runs, the exception message is:
......
Exception in thread "main" java.lang.IllegalArgumentException: Failed to cleanly delete data files matching: ref(name="price") == 33
at org.apache.iceberg.spark.source.SparkTable.deleteWhere(SparkTable.java:168)
......
Caused by: org.apache.iceberg.exceptions.ValidationException: Cannot delete file where some, but not all, rows match filter ref(name="price") == 33: hdfs://hadoop01:9000/warehouse_path/xgfying/booksSpark3/data/title=Gone/00000-1-9070110f-35f8-4ee5-8047-cca2a1caba1f-00001.parquet
......
I know this is a fairly old question, but I ran into a similar problem recently and was able to fix it by adding spark.sql.extensions to the Spark config:
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
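Without the extension, Spark hands the DELETE to Iceberg's metadata-only delete path, which can only drop whole data files, hence the "Cannot delete file where some, but not all, rows match filter" error. The same setting can also be applied programmatically by adding it to the SparkConf before the SparkSession is created (Spark only picks up spark.sql.extensions at session build time). Below is a minimal sketch reusing the catalog name, warehouse path, and table from the question; the class name is just for illustration:

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class IcebergDeleteExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf();
        // Register the Iceberg SQL extensions so DELETE FROM is handled by Iceberg's rewrite rules
        conf.set("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions");
        // Catalog settings copied from the question
        conf.set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog");
        conf.set("spark.sql.catalog.hadoop_prod.type", "hadoop");
        conf.set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://hadoop01:9000/warehouse_path/");
        // The extension must be configured before the session is built
        SparkSession spark = SparkSession.builder().config(conf).master("local[2]").getOrCreate();
        spark.sql("DELETE FROM hadoop_prod.xgfying.booksSpark3 WHERE price = 33");
        spark.table("hadoop_prod.xgfying.booksSpark3").show();
    }
}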