Sqoop import to Hive fails with error ("javax.management.MBeanTrustPermission" "register")


I hit this error when running a sqoop import that loads a MySQL table into HDFS and Hive. The HDFS part of the job runs without problems, but because of the Java errors below the same import into Hive fails.

Log:

[sga-dl@localhost ~]$ sqoop import --connect jdbc:mysql://localhost:3306/v3_abrasivos --username hiveusr --password DATPASS --table History_Demand --m 1 --target-dir /raw_imports/History_Demand --hive-import --create-hive-table --hive-table sqoop_v3_abrasivos.History_Demand
...
20/02/04 09:50:01 INFO mapreduce.ImportJobBase: Transferred 794.1428 MB in 58.9304 seconds (13.476 
20/02/04 09:50:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `History_Demand` AS t LIMIT 1
20/02/04 09:50:01 WARN hive.TableDefWriter: Column Dia had to be cast to a less precise type in Hive
20/02/04 09:50:01 INFO hive.HiveImport: Loading uploaded data into Hive
20/02/04 09:50:01 INFO conf.HiveConf: Found configuration file file:/opt/hive/conf/hive-site.xml
2020-02-04 09:50:02,957 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
    at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
    at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
    at org.apache.logging.log4j.core.jmx.Server.register(Server.java:380)
    at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:165)
    at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:138)
    at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:507)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:249)
    at org.apache.logging.log4j.core.async.AsyncLoggerContext.start(AsyncLoggerContext.java:86)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:239)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:157)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:130)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:100)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:187)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:154)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:90)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:82)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:65)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:702)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

Logging initialized using configuration in jar:file:/usr/local/apache-hive-2.3.6-bin/lib/hive-common-2.3.6.jar!/hive-log4j2.properties Async: true
20/02/04 09:50:03 INFO SessionState: 
Logging initialized using configuration in jar:file:/usr/local/apache-hive-2.3.6-bin/lib/hive-common-2.3.6.jar!/hive-log4j2.properties Async: true
20/02/04 09:50:03 INFO session.SessionState: Created HDFS directory: /tmp/hive/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO session.SessionState: Created local directory: /tmp/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO session.SessionState: Created HDFS directory: /tmp/hive/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6/_tmp_space.db
20/02/04 09:50:03 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO session.SessionState: Updating thread name to 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6 main
20/02/04 09:50:03 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:03 INFO ql.Driver: Compiling command(queryId=sga-dl_20200204095003_9b2a8fbf-d798-4146-b57c-ffcb5920891b): CREATE TABLE `sqoop_v3_abrasivos.History_Demand` ( `idHistory_Demand` BIGINT, `Entidade_SalesOrg` STRING, `Cod_Material` BIGINT, `Material` STRING, `Plan_Group` STRING, `Linha_Produto` STRING, `MTS_MTO` STRING, `Cod_Tipo_OV` STRING, `Tipo_OV` STRING, `OV_Numero` BIGINT, `OV_Linha` INT, `Cod_Cliente` STRING, `Cliente` STRING, `Cod_Cliente_Corp` STRING, `Cliente_Corp` STRING, `Pais_Cliente` STRING, `UF_Cliente` STRING, `Estado_Cliente` STRING, `Cidade_Cliente` STRING, `Bairro_Cliente` STRING, `Sigla_Moeda` STRING, `OV_Moeda` STRING, `Cod_Sell_Plant` INT, `Sell_Plant` STRING, `Pais_SalesOrg` STRING, `Dia` STRING, `Periodo` STRING, `UN_Estoque` STRING, `Qtd_UN_Estoque` STRING, `Vendas_LOC` STRING) COMMENT 'Imported by sqoop on 2020/02/04 09:50:01' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
20/02/04 09:50:05 INFO metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
20/02/04 09:50:05 INFO metastore.ObjectStore: ObjectStore, initialize called
20/02/04 09:50:06 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
20/02/04 09:50:06 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
20/02/04 09:50:06 ERROR bonecp.BoneCP: Unable to start/stop JMX
java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
    at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
    at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
    at com.jolbox.bonecp.BoneCP.registerUnregisterJMX(BoneCP.java:528)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:500)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
    at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:521)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:550)
    at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:405)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:303)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:628)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:594)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:588)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:431)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6902)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:236)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:215)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:362)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.<init>(CalcitePlanner.java:267)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:318)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:484)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
20/02/04 09:50:06 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
20/02/04 09:50:08 ERROR bonecp.BoneCP: Unable to start/stop JMX
java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
    at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
    at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
    at com.jolbox.bonecp.BoneCP.registerUnregisterJMX(BoneCP.java:528)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:500)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionProviderPriorityList.getConnection(ConnectionProviderPriorityList.java:57)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:402)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:361)
    at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:316)
    at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:84)
    at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:347)
    at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:310)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:591)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
    at org.datanucleus.store.query.Query.execute(Query.java:1726)
    at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:374)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:181)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:144)
    at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:410)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:303)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:628)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:594)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:588)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:431)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6902)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:236)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:215)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:362)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.<init>(CalcitePlanner.java:267)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:318)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:484)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
20/02/04 09:50:09 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
20/02/04 09:50:09 INFO metastore.ObjectStore: Initialized ObjectStore
20/02/04 09:50:09 INFO metastore.HiveMetaStore: Added admin role in metastore
20/02/04 09:50:09 INFO metastore.HiveMetaStore: Added public role in metastore
20/02/04 09:50:09 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: get_all_functions
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl  ip=unknown-ip-addr  cmd=get_all_functions   
20/02/04 09:50:09 INFO parse.CalcitePlanner: Starting Semantic Analysis
20/02/04 09:50:09 INFO parse.CalcitePlanner: Creating table sqoop_v3_abrasivos.History_Demand position=13
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: get_database: sqoop_v3_abrasivos
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl  ip=unknown-ip-addr  cmd=get_database: sqoop_v3_abrasivos    
20/02/04 09:50:09 WARN metastore.ObjectStore: Failed to get database sqoop_v3_abrasivos, returning NoSuchObjectException
FAILED: SemanticException [Error 10072]: Database does not exist: sqoop_v3_abrasivos
20/02/04 09:50:09 ERROR ql.Driver: FAILED: SemanticException [Error 10072]: Database does not exist: sqoop_v3_abrasivos
org.apache.hadoop.hive.ql.parse.SemanticException: Database does not exist: sqoop_v3_abrasivos
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1687)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1676)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.addDbAndTabToOutputs(SemanticAnalyzer.java:12171)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:12024)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:11020)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11133)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:474)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:490)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

20/02/04 09:50:09 INFO ql.Driver: Completed compiling command(queryId=sga-dl_20200204095003_9b2a8fbf-d798-4146-b57c-ffcb5920891b); Time taken: 6.541 seconds
20/02/04 09:50:09 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:09 INFO session.SessionState: Resetting thread name to  main
20/02/04 09:50:09 INFO conf.HiveConf: Using the default value passed in for log id: 9133a12b-b9d9-4d17-ad47-3dc6ca872ab6
20/02/04 09:50:09 INFO session.SessionState: Deleted directory: /tmp/hive/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6 on fs with scheme hdfs
20/02/04 09:50:09 INFO session.SessionState: Deleted directory: /tmp/sga-dl/9133a12b-b9d9-4d17-ad47-3dc6ca872ab6 on fs with scheme file
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl  ip=unknown-ip-addr  cmd=Cleaning up thread local RawStore...    
20/02/04 09:50:09 INFO metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
20/02/04 09:50:09 INFO HiveMetaStore.audit: ugi=sga-dl  ip=unknown-ip-addr  cmd=Done cleaning up thread local RawStore  
20/02/04 09:50:09 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive CliDriver exited with status=10072
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:355)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

The data from MySQL does land in HDFS, but the Hive part never runs. To my mind the main offender is the error flagged here:

2020-02-04 09:50:02,957 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")

How can I fix this?

hadoop hive sqoop
1 Answer

I ran into this problem when trying to obtain a KafkaConsumer through KafkaClient. I had installed a SecurityManager in the JVM to trap the System.exit call made in org.apache.spark.deploy.SparkSubmit.main, which would otherwise take the whole service down:

    import java.security.Permission

    // Assumed to be defined elsewhere in the application; a minimal
    // definition is included here so the snippet is self-contained.
    class ExitTrappedException extends SecurityException("System.exit trapped")

    def forbidSystemExistCall(): Unit = {
      val securityManager = new SecurityManager {
        override def checkPermission(perm: Permission): Unit = {
          // Block only exit requests ("exitVM.<status>"); every other
          // permission check returns silently, i.e. is allowed.
          if (perm.getName.startsWith("exitVM")) {
            throw new ExitTrappedException
          }
        }
      }
      System.setSecurityManager(securityManager)
    }
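Note that merely installing a SecurityManager, however permissive, turns the JVM's security checks on: when JMX registers an MBean it now validates "javax.management.MBeanTrustPermission" "register" against the security Policy, and the default policy does not grant it. That appears to be exactly the access-denied error in the log above.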

However, this SecurityManager does not handle "javax.management.MBeanTrustPermission", and that is where the problem appeared. My workaround (which works in my application):

    import java.security.{Permission, Policy, ProtectionDomain}
    import javax.management.MBeanTrustPermission

    Policy.setPolicy(new MBeanRegistryPolicy)

    class MBeanRegistryPolicy extends Policy {
      // Capture the policy that was in effect before this one, and fall
      // back to it for everything except MBean registration.
      private val defaultPolicy = Policy.getPolicy()

      override def implies(domain: ProtectionDomain, permission: Permission): Boolean = {
        if (permission.isInstanceOf[MBeanTrustPermission]) true
        else defaultPolicy.implies(domain, permission)
      }
    }

By default the JVM runs without a SecurityManager, so the problem never shows up unless something calls "forbidSystemExistCall" (i.e. attaches a user-defined SecurityManager to the JVM). Hopefully this helps you track down your issue.
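For completeness, here is a minimal sketch of how the two pieces could be wired together at startup (Bootstrap is a hypothetical entry point; forbidSystemExistCall and MBeanRegistryPolicy come from the snippets above). The policy must be installed before the SecurityManager, so that MBean registration is already permitted by the time checks are enforced:

    import java.security.Policy

    object Bootstrap {
      def main(args: Array[String]): Unit = {
        // Grant MBeanTrustPermission via the policy first, so that
        // Log4j2 and BoneCP can still register their MBeans once
        // security checks start being enforced.
        Policy.setPolicy(new MBeanRegistryPolicy)
        // Only now install the exit-trapping SecurityManager.
        forbidSystemExistCall()
        // ... start the application ...
      }
    }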
