I'm trying to set up Apache Nutch 1.15 with Apache Solr 7.6.0 in cloud mode. The crawl script (nutch/bin/crawl) works fine until the CleaningJob (CleaningJob.java) starts, at which point it fails without giving a reason (reason: NA).
I have successfully set up the same versions of Nutch and Solr before, but with Solr in standalone mode.
I start Solr in cloud mode with the following commands:
solr/bin/solr start -cloud -p 8983 -s "solr/cloud/node1/solr"
solr/bin/solr start -cloud -p 7574 -s "solr/cloud/node2/solr" -z localhost:9983
and I start the crawl process with:
nutch/bin/crawl -i -s nutch/urls/ --num-threads 400 --hostdbupdate --hostdbgenerate --num-tasks 16 --sitemaps-from-hostdb once niche-crawl 8
It fails on the CleaningJob step:
nutch/bin/nutch clean niche-crawl/crawldb
Exception:
No exchange was configured. The documents will be routed to all index writers.
SolrIndexer: deleting 1000/1000 documents
SolrIndexer: deleting 1000/1000 documents
ERROR CleaningJob: java.lang.RuntimeException: CleaningJob did not succeed, job status:FAILED, reason: NA
at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:169)
at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:197)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:208)
Here is my index-writers.xml for Solr in cloud mode:
<writer id="indexer_solr_1" class="org.apache.nutch.indexwriter.solr.SolrIndexWriter">
  <parameters>
    <param name="type" value="cloud"/>
    <param name="url" value="http://localhost:8983/solr"/>
    <param name="collection" value="nutch"/>
    <param name="weight.field" value=""/>
    <param name="commitSize" value="1000"/>
    <param name="auth" value="true"/>
    <param name="username" value="solr"/>
    <param name="password" value="password"/>
  </parameters>
  <mapping>
    <copy>
      <!-- <field source="content" dest="search"/> -->
      <!-- <field source="title" dest="title,search"/> -->
    </copy>
    <rename>
      <field source="metatag.description" dest="description"/>
      <field source="metatag.keywords" dest="keywords"/>
    </rename>
    <remove>
      <field source="segment"/>
    </remove>
  </mapping>
</writer>
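One thing worth double-checking (this is an assumption on my part, not something I have verified against the 1.15 source): when type is set to cloud, the Solr index writer builds a cloud-aware client, and the url parameter may be expected to hold the ZooKeeper connection string rather than an HTTP Solr endpoint. Based on the solr start commands above, where the embedded ZooKeeper runs on localhost:9983, that would look roughly like:

```xml
<!-- Sketch only: assumes the "cloud" writer type reads a ZooKeeper
     connection string from "url" (localhost:9983 here, taken from the
     -z option used when starting the second Solr node). -->
<param name="type" value="cloud"/>
<param name="url" value="localhost:9983"/>
<param name="collection" value="nutch"/>
```

If the cloud client is instead being pointed at the HTTP endpoint, that could explain why indexing partially works while the delete requests in CleaningJob fail.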
Try upgrading to Nutch 1.16. This sounds like a known bug, https://issues.apache.org/jira/browse/NUTCH-2731, which was fixed in 1.16; see https://apache.org/dist/nutch/1.16/CHANGES.txt