hive> insert into test.emp (sr_no,usr_name,city) values (10,"Prince","Kathmandu");
Query ID = princemehta_20230311033859_3d7f53f2-3523-4a8b-840d-fcb6aa8729cd
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Starting Job = job_1678480813847_0014, Tracking URL = http://Princes-MacBook-Air.local:8088/proxy/application_1678480813847_0014/
Kill Command = /usr/local/opt/hadoop/libexec/bin/mapred job -kill job_1678480813847_0014
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2023-03-11 03:39:11,840 Stage-1 map = 0%, reduce = 0%
Ended Job = job_1678480813847_0014 with errors
Error during job, obtaining debugging information...
Job Tracking URL: http://Princes-MacBook-Air.local:8088/cluster/app/application_1678480813847_0014
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
I am trying to run an insert query. The following error appears in the logs:

[2023-03-11 03:39:10.914] Container exited with a non-zero exit code 127. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err:
Last 4096 bytes of stderr:
/bin/bash: /Library/Internet: No such file or directory

HDFS does not contain a /Library folder, and neither does the macOS machine. How do I change the Hive settings or the table definition so that they do not reference Mac-specific file paths?
I ran into a similar issue. The $JAVA_HOME variable, which was set to /Library/Internet Plug-Ins/...., should be replaced with /Library/Java/JavaVirtualMachines/{jdk-version}.jdk/Contents/Home. (The unquoted space in "Internet Plug-Ins" is why the container log shows the truncated path /Library/Internet.) Update the $JAVA_HOME entry in hadoop-env.sh to reflect the new path, then stop and restart Hadoop, and it should work fine.
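A minimal sketch of the fix described above, assuming a Homebrew-installed Hadoop at /usr/local/opt/hadoop (consistent with the Kill Command path in the log); the exact JDK folder name varies per machine, so it is resolved with macOS's /usr/libexec/java_home rather than hard-coded:

```shell
# 1. Find the actual JDK home on this Mac (prints something like
#    /Library/Java/JavaVirtualMachines/{jdk-version}.jdk/Contents/Home):
/usr/libexec/java_home

# 2. In hadoop-env.sh (for this Homebrew layout:
#    /usr/local/opt/hadoop/libexec/etc/hadoop/hadoop-env.sh),
#    replace the old JAVA_HOME entry. Quote the value: the original
#    failure came from an unquoted path containing a space.
export JAVA_HOME="$(/usr/libexec/java_home)"

# 3. Stop and restart Hadoop so YARN containers pick up the new JAVA_HOME:
/usr/local/opt/hadoop/libexec/sbin/stop-all.sh
/usr/local/opt/hadoop/libexec/sbin/start-all.sh
```

stop-all.sh/start-all.sh are deprecated but still shipped; stop-dfs.sh + stop-yarn.sh (and their start counterparts) do the same thing in two steps.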