Error starting Hive due to a Derby misconfiguration


I am trying to add Hive to my Hadoop 3.2.0 ecosystem. I followed the installation and configuration steps described here: https://www.tutorialspoint.com/hive/hive_installation.htm

Unfortunately, I got stuck when trying to test Hive with

hive -hiveconf hive.root.logger=DEBUG,console

which throws the error below. I suspect there is a problem with the Derby JDBC driver. Any suggestions or hints on how to fix it would be greatly appreciated. My classpath is:

env | grep CLASSPATH
CLASSPATH=:/opt/derby/lib/derby.jar:/opt/derby/lib/derbytools.jar:/home/hadoop/hadoop/lib/*:.:/opt/hive/lib/*:.:/home/hadoop/hadoop/lib/*:.:/opt/hive/lib/*:.:/home/hadoop/hadoop/lib/*:.:/opt/hive/lib/*:.:/opt/derby/lib/derby.jar:/opt/derby/lib/derbytools.jar

Then:

hive -hiveconf hive.root.logger=DEBUG,console
/usr/bin/which: no hbase in (/export/viya/python/bin:/export/viya/R/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/derby/bin:/home/hadoop/hadoop/sbin:/home/hadoop/hadoop/bin:/opt/hive/bin:/opt/spark/bin:/opt/spark/sbin:/home/nfsuser/.local/bin:/home/nfsuser/bin:/opt/kustomize:/home/nfsuser/sas-viya:/opt/spark/bin:/opt/spark/sbin:/opt/spark/bin:/opt/spark/sbin:/export/viya/python/bin:/opt/hive/bin:/home/hadoop/hadoop/sbin:/home/hadoop/hadoop/bin:/opt/hive/bin:/opt/derby/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 9a08bcc0-5ff9-458d-88bd-2488702ac1cb
2023-02-25T03:32:26,599  INFO [main] SessionState: Hive Session ID = 9a08bcc0-5ff9-458d-88bd-2488702ac1cb
...
2023-02-25T03:32:27,889 DEBUG [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionDriverName value null from jpox.properties with org.apache.derby.jdbc.EmbeddedDriver
...
2023-02-25T03:32:27,889 DEBUG [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] metastore.ObjectStore: Overriding datanucleus.connectionPool.maxPoolSize value null from jpox.properties with 10
...
2023-02-25T03:32:27,917 DEBUG [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] datasource.HikariCPDataSourceProvider: Configuration requested hikaricp pooling, HikariCpDSProvider exiting
2023-02-25T03:32:28,206  INFO [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] hikari.HikariDataSource: HikariPool-1 - Starting...
2023-02-25T03:32:28,210  WARN [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] util.DriverDataSource: Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
2023-02-25T03:32:28,210 ERROR [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] DataNucleus.Datastore: Exception thrown creating StoreManager. See the nested exception
org.datanucleus.exceptions.NucleusException: Error creating transactional connection factory
at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:214) ~[datanucleus-core-4.1.17.jar:?]
at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:162) ~[datanucleus-core-4.1.17.jar:?]
...
... 75 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "HikariCP" plugin to create a ConnectionPool gave an error : Driver org.apache.derby.jdbc.EmbeddedDriver claims to not accept jdbcUrl, jdbc:derby://zbwv4demo1-nfs-vm:1527/metastore_db?create=true
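
Reading that last line, my understanding is that org.apache.derby.jdbc.EmbeddedDriver only accepts in-process URLs of the form jdbc:derby:<database>, while the URL being used (jdbc:derby://zbwv4demo1-nfs-vm:1527/metastore_db?create=true) is a Derby network-server URL, which would need org.apache.derby.jdbc.ClientDriver and derbyclient.jar (that jar does not appear explicitly in the CLASSPATH above). For reference, here is a minimal sketch of the two javax.jdo.option pairings in hive-site.xml that I believe would be self-consistent; only the hostname, port and database name are taken from the log, everything else is assumed rather than copied from my actual file:

<!-- Option A: embedded Derby metastore (single user, no Derby server needed) -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
</property>

<!-- Option B: Derby network server, matching the URL in the log above;
     requires derbyclient.jar on the classpath and a Derby server listening on port 1527 -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://zbwv4demo1-nfs-vm:1527/metastore_db;create=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.ClientDriver</value>
</property>

(As a side note, Derby separates URL attributes with ';' rather than '?', e.g. ;create=true, though I am not sure whether that is related to this particular error.)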
Tags: hive, derby