slawter

Reputation: 535

Solr indexing failed

I did everything as in this tutorial, but there is some trouble. When I run the `./nutch solrindex http://127.0.0.1:8080/solr/ crawl/crawldb -linkdb crawl/linkdb crawl/segments/*` command after Nutch finishes crawling, I get the following exception:

> WARNING: job_local_0001 org.apache.solr.common.SolrException: Bad
> Request
> 
> Bad Request
> 
> request: http://127.0.0.1:8080/solr/update?wt=javabin&version=2
>         at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:430)
>         at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:244)
>         at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
>         at org.apache.nutch.indexer.solr.SolrWriter.close(SolrWriter.java:142)
>         at org.apache.nutch.indexer.IndexerOutputFormat$1.close(IndexerOutputFormat.java:48)
>         at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.close(ReduceTask.java:466)
>         at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:530)
>         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:420)
>         at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:260)

What should I do to resolve this problem?

P.S. Solr itself is working. I am using Solr 4.1 and Nutch 1.6.

Upvotes: 2

Views: 2924

Answers (1)

slawter

Reputation: 535

When you get an exception like this, open the logs and check the exceptions recorded there. In my case, I had modified schema.xml and added some new fields with type="text", but my schema.xml only defined a field type called text_general. After reading the logs, it was an easy fix.
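For illustration, a minimal sketch of the kind of schema.xml change involved, assuming the default Solr 4.1 example schema (which defines text_general but no "text" field type); the field name here is hypothetical:

```xml
<!-- Broken: references a fieldType named "text" that is not defined
     in this schema.xml, which makes Solr reject the update request. -->
<!-- <field name="content" type="text" stored="true" indexed="true"/> -->

<!-- Fixed: use the fieldType that the schema actually defines. -->
<field name="content" type="text_general" stored="true" indexed="true"/>
```

Restart Solr (or reload the core) after editing schema.xml so the change takes effect.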

Upvotes: 1
