Oozie error when trying to execute "bin/mkdistro.sh -DskipTests"

Question · Votes: 0 · Answers: 4

I am trying to install Oozie 4.0.1 following http://www.thecloudavenue.com/2013/10/installation-and-configuration-of.html
Hadoop version - 2.4.0
Maven - 3.0.4
Sqoop - 1.4.4

The build fails when I try to execute "bin/mkdistro.sh -DskipTests":

.........
[INFO] Apache Oozie HCatalog Libs ........................ SUCCESS [0.399s]
[INFO] Apache Oozie Core ................................. FAILURE [7.819s]
[INFO] Apache Oozie Docs ................................. SKIPPED
.........
[ERROR] Failed to execute goal on project oozie-core: Could not resolve dependencies for project org.apache.oozie:oozie-core:jar:4.0.0: The following artifacts could not be resolved: org.apache.oozie:oozie-hadoop-test:jar:2.4.0.oozie-4.0.0, org.apache.oozie:oozie-hadoop:jar:2.4.0.oozie-4.0.0, org.apache.oozie:oozie-sharelib-oozie:jar:4.0.0-cdh5.0.2, org.apache.oozie:oozie-sharelib-hcatalog:jar:4.0.0-cdh5.0.2: Could not find org.apache.oozie:oozie-hadoop-test:jar:2.4.0.oozie-4.0.0 in http://repo1.maven.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :oozie-core


Has anyone tried Oozie 4.0.1 with Hadoop 2.4.0? How can I fix this?

maven hadoop oozie
4 Answers
2 votes

I faced the same problem.

Try these installation steps; they worked for me. Change the versions in the steps below to match the versions you need.

STEP 1 : Extract the tar file using tar -xvf oozie-4.0.1.tar.gz
STEP 2 : Rename oozie-4.0.1 to oozie using the command below.
mv oozie-4.0.1 oozie
STEP 3 : Move to the oozie/bin directory using cd oozie/bin and build Oozie for
             Hadoop 2.x using the command below.
mkdistro.sh -DskipTests -Dhadoopversion=2
             Before building Oozie, change the versions for Java, Hive, Pig,
             and Sqoop in the pom.xml file:
                    Java  - 1.7
                    Hive  - 0.13.0
                    Pig   - 0.12.1
                    Sqoop - 1.4.3


        Eg : <javaVersion>1.7</javaVersion>
             <targetJavaVersion>1.7</targetJavaVersion>
             <hive.version>0.13.0</hive.version>
             <pig.version>0.12.1</pig.version>
             <pig.classifier></pig.classifier>
             <sqoop.version>1.4.3</sqoop.version>
             If the build succeeds you will get a message like:
             Oozie distro created, DATE[2014.01.05-18:55:14GMT] VC-REV[unavailable],
               available at [/home/labuser/oozie/distro/target]
             Now use the expanded Oozie located in
             /home/labuser/oozie/distro/target/oozie-4.0.1-distro/oozie-4.0.1
STEP 4 : Create a libext directory in the expanded Oozie and copy the
             Hadoop-2.2.0 jar files and the ExtJS zip file into the libext directory.
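STEP 4 can be sketched as a small shell helper. The function name and all three paths are assumptions, not part of the original instructions; adjust them to your own layout.

```shell
# Hypothetical helper for STEP 4: create libext inside the expanded Oozie
# distro and copy the Hadoop jars plus the ExtJS zip into it.
# All three arguments are assumed paths -- pass the ones that match
# your installation.
populate_libext() {
  oozie_dir="$1"   # expanded Oozie, e.g. .../oozie-4.0.1-distro/oozie-4.0.1
  jar_dir="$2"     # directory holding the Hadoop jar files
  extjs_zip="$3"   # downloaded ExtJS zip (used by the Oozie web console)
  mkdir -p "$oozie_dir/libext" &&
    cp "$jar_dir"/*.jar "$oozie_dir/libext/" &&
    cp "$extjs_zip" "$oozie_dir/libext/"
}
```

A possible invocation, with assumed locations: populate_libext /home/labuser/oozie/distro/target/oozie-4.0.1-distro/oozie-4.0.1 "$HADOOP_HOME/share/hadoop/common" ext-2.2.zip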
STEP 5 : Set these properties in the Hadoop core-site.xml file.
        Eg : <property>
             <name>hadoop.proxyuser.labuser.hosts</name>
             <value>*</value>
             </property>
             <property>
             <name>hadoop.proxyuser.labuser.groups</name>
             <value>*</value>
             </property>
             Set this property in the oozie-site.xml file located in the conf
             directory (by default it is false; change it to true):
              <property>
              <name>oozie.service.JPAService.create.db.schema</name>
              <value>true</value>
              </property>

Step 6 : Now prepare the Oozie war file. Move to the expanded oozie/bin
             directory and run the command below.
./oozie-setup.sh prepare-war
             If you get an error like "zip: command not found", install
             zip using: sudo apt-get install zip
             Then run the prepare-war command again. If the war file is
             created successfully you will get a message like:
                  INFO: Oozie is ready to be started
Step 7 : Upload the sharelib folder from the expanded Oozie to HDFS using
            the command below.
./oozie-setup.sh sharelib create -fs hdfs://localhost:8020
Step 8 : Create a database for Oozie using the command
./oozie-setup.sh db create -run
              If the database is created you will get a message like:
setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
Validate DB Connection 
DONE 
Check DB schema does not exist 
DONE 
Check OOZIE_SYS table does not exist 
DONE 
Create SQL schema 
DONE 
Create OOZIE_SYS table 
DONE
Oozie DB has been created for Oozie version '4.0.0'
Step 9 : Start Oozie using ./oozied.sh start


Step 10 : Check the status of Oozie using the command below.
              ./oozie admin -oozie http://localhost:11000/oozie -status
              You will get a message like: System mode: NORMAL


Issues faced with this installation
1.  While building the hive-0.13.0 share library of Oozie, there is an unresolvable dependency, 'hive-builtins'.
Cause: The hive-builtins jar is required in hive-0.10.0, but hive-0.13.0 has no hive-builtins.jar.
Solution: Removed the hive-builtins dependency.
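For reference, the removed dependency block in the Hive sharelib pom.xml would look roughly like the sketch below; the exact coordinates and version expression in the Oozie pom are assumptions.

```xml
<!-- Sketch of the dependency removed from the Hive sharelib pom.xml.
     hive-builtins only exists up to Hive 0.10.x, so a Hive 0.13.0 build
     cannot resolve it. Exact coordinates in the Oozie pom may differ. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-builtins</artifactId>
  <version>${hive.version}</version>
</dependency>
```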
2.  While building Oozie, we ran into java.lang.OutOfMemoryError.
Cause: This error signals that the JVM running Maven has run out of memory; it was triggered by the maven-compiler-plugin.
Solution: Edited a maven-compiler-plugin property:
<fork>true</fork>
Fork runs the compiler in a separate process. If false, Maven uses the built-in compiler; if true, it uses an executable.
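The surrounding plugin configuration might look like the following sketch; the <maxmem> value is an assumption, not taken from the original build, so tune it for your machine.

```xml
<!-- Sketch of a maven-compiler-plugin configuration with forking enabled.
     <maxmem> only takes effect when <fork> is true; 1024m is an assumed
     value, not from the original build. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <fork>true</fork>
    <maxmem>1024m</maxmem>
  </configuration>
</plugin>
```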

Finally, we got a successful Oozie build with the above versions of the Hadoop ecosystem components.

0 votes

I stumbled upon the same error. A quick Google search turned up a JIRA ticket, still open, to move the hadoop-2 profile to Hadoop 2.4.0:

https://issues.apache.org/jira/browse/OOZIE-1730

I don't know how hard that is, but the build failed for me in exactly the same way.


0 votes

Please take a look.

I built 4.0.1 + Hadoop 1.2.1 + Hive 0.10.0.

Quick question: when you say Hadoop = 2.4.1, did you change something in Oozie's pom.xml, or something else?

When I built with custom versions of Hadoop and Hive, I had to change:

a. <OOZIE_BUILD_HOME>/pom.xml
b. <OOZIE_BUILD_HOME>/hadooplibs/hadoop-1/pom.xml
c. <OOZIE_BUILD_HOME>/hadooplibs/hadoop-distcp-1/pom.xml
d. <OOZIE_BUILD_HOME>/hadooplibs/hadoop-test-1/pom.xml 

The default Hadoop version is 1.1.1, so find and replace it with the version you want in each of these files, as I did for 1.2.1.
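The find-and-replace over the four pom.xml files can be sketched with sed. The helper function below is hypothetical; the file list comes from the answer above, and files that are missing are simply skipped.

```shell
# Hypothetical helper: swap the Hadoop version in the four pom.xml files
# listed above. $1 = Oozie build home, $2 = old version (as a sed
# pattern, dots escaped), $3 = new version.
bump_hadoop_version() {
  home="$1"; old="$2"; new="$3"
  for pom in pom.xml \
             hadooplibs/hadoop-1/pom.xml \
             hadooplibs/hadoop-distcp-1/pom.xml \
             hadooplibs/hadoop-test-1/pom.xml; do
    if [ -f "$home/$pom" ]; then
      sed -i "s/$old/$new/g" "$home/$pom"   # in-place replace (GNU sed)
    fi
  done
}
```

A possible invocation: bump_hadoop_version <OOZIE_BUILD_HOME> '1\.1\.1' '1.2.1'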

The analogous approach for Hadoop 2 is to change the pom.xml files related to Hadoop 2.

Let me know if you have any questions.

Regards, Manish


0 votes

For anyone else running into this, the fix was to override the conjars repository in pom.xml:

<repositories>
  <repository>
    <id>conjars</id>
    <url>https://conjars.wensel.net/repo/</url>
  </repository>
</repositories>