I am testing Apache Hudi with the Flink SQL Client on a Yarn cluster. When I try to create a Hudi catalog (as described in the docs), I get an error telling me that the hive.conf.dir and mode options are not supported:
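For reference, the CREATE CATALOG statement I ran (reconstructed from the catalog options listed in the error below; the path and catalog name are placeholders) was:

```sql
CREATE CATALOG hoodie_catalog WITH (
  'type' = 'hudi',
  'mode' = 'hms',                      -- Hive Metastore mode
  'catalog.path' = '/my/path',
  'hive.conf.dir' = '/etc/hive/conf'   -- directory containing hive-site.xml
);
```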
org.apache.flink.table.client.gateway.SqlExecutionException: Could not execute SQL statement.
at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:211) ~[flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.cli.CliClient.executeOperation(CliClient.java:625) ~[flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.cli.CliClient.callOperation(CliClient.java:447) ~[flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.cli.CliClient.lambda$executeStatement$1(CliClient.java:332) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
at java.util.Optional.ifPresent(Optional.java:159) ~[?:1.8.0_131]
at org.apache.flink.table.client.cli.CliClient.executeStatement(CliClient.java:325) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.cli.CliClient.executeFile(CliClient.java:314) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.cli.CliClient.executeInitialization(CliClient.java:240) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:135) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161) [flink-sql-client_2.11-1.14.6.jar:1.14.6]
Caused by: org.apache.flink.table.api.ValidationException: Unable to create catalog 'hoodie_catalog'.
Catalog options are:
'catalog.path'='/my/path'
'hive.conf.dir'='/etc/hive/conf'
'mode'='hms'
'type'='hudi'
at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:292) ~[flink-table_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1292) ~[flink-table_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1122) ~[flink-table_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:209) ~[flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:88) ~[flink-sql-client_2.11-1.14.6.jar:1.14.6]
at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:209) ~[flink-sql-client_2.11-1.14.6.jar:1.14.6]
... 11 more
Caused by: org.apache.flink.table.api.ValidationException: Unsupported options found for 'hudi'.
Unsupported options:
hive.conf.dir
mode
Supported options:
catalog.path
default-database
property-version
Is there some version incompatibility, or should I be looking at a different Hudi version's documentation page?
I am using:

and starting the Flink SQL CLI this way:
flink/bin/sql-client.sh embedded -j /home/otarie/flink-1.14.6/opt/hudi-flink1.14-bundle_2.11-0.11.1.jar
I solved my problem. I could not find the version I needed at the link on the official site, so I downloaded a newer one. With the bundle hudi-flink1.14-bundle-0.14.1.jar, the error disappeared.
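In other words, the launch command becomes (assuming the same local jar directory as in the question; adjust the path to wherever you placed the newer bundle):

```shell
flink/bin/sql-client.sh embedded -j /home/otarie/flink-1.14.6/opt/hudi-flink1.14-bundle-0.14.1.jar
```

The older hudi-flink1.14-bundle_2.11-0.11.1.jar evidently did not recognize the 'mode' and 'hive.conf.dir' catalog options, while the 0.14.1 bundle does.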