Reputation: 11628
I was using bulk insertion on Elasticsearch 1.0 with NEST 1.0. Now I have moved to Elasticsearch 2.0 and NEST 2.0, and the bulk insertion throws a StackOverflowException (!). My gut feeling is that infinite recursion is taking place and consuming the entire stack.
This is my POCO:
[ElasticsearchType(Name = "MyDoc")]
public class MyDoc : DynamicResponse
{
    [String(Store = false, Index = FieldIndexOption.NotAnalyzed)]
    public string HistoryId { get; set; }

    [String(Store = false, Index = FieldIndexOption.NotAnalyzed)]
    public string FileId { get; set; }

    [Date(Store = false)]
    public DateTime DateTime { get; set; }
}
That's how I create the mapping:
elastic.Map<MyDoc>(m => m.Index(indexName).AutoMap());
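(For context, the mapping can also be applied while creating the index in NEST 2.x; a minimal sketch, where the node URI and the index name "my-index" are placeholders, not values from my actual setup:)

```csharp
// Sketch: create the index and apply the MyDoc mapping in one call (NEST 2.x).
// The node URI and "my-index" below are assumed placeholders.
var node = new Uri("http://localhost:9200");
var settings = new ConnectionSettings(node).DefaultIndex("my-index");
var elastic = new ElasticClient(settings);

var createResponse = elastic.CreateIndex("my-index", c => c
    .Mappings(ms => ms
        .Map<MyDoc>(m => m.AutoMap())));
```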
Bulk insertion is done in two steps. First, I create the descriptor:
List<dynamic> dataRecordList;
var descriptor = new BulkDescriptor();

if (dataRecordList == null || dataRecordList.Count == 0)
{
    return;
}

foreach (var dataRecord in dataRecordList)
{
    var nonClosedDataRecord = dataRecord;
    descriptor.Index<object>(record => record
        .Index(indexName)
        .Type("MyDoc")
        .Document(nonClosedDataRecord));
}
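(For comparison, here is the same loop written against the POCO type instead of `object`, relying on the `[ElasticsearchType]` attribute for the type name; just a sketch of the variant I have considered, not code from my project:)

```csharp
// Sketch: strongly-typed variant of the same loop (NEST 2.x).
// The type name comes from the [ElasticsearchType(Name = "MyDoc")] attribute.
foreach (MyDoc dataRecord in dataRecordList)
{
    var doc = dataRecord;
    descriptor.Index<MyDoc>(op => op
        .Index(indexName)
        .Document(doc));
}
```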
Then I call the NEST Bulk() method:
var bulkResponse = elastic.Bulk(d => descriptor);
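(Once the bulk call goes through, I would inspect the response for per-item failures; a sketch of that check in NEST 2.x:)

```csharp
// Sketch: inspect the bulk response for per-item failures (NEST 2.x).
if (!bulkResponse.IsValid || bulkResponse.Errors)
{
    foreach (var item in bulkResponse.ItemsWithErrors)
    {
        Console.WriteLine("Failed to index document " + item.Id + ": " + item.Error);
    }
}
```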
I get the StackOverflowException inside the foreach loop, when calling descriptor.Index().
EDIT
An unhandled exception of type 'System.StackOverflowException' occurred in mscorlib.dll
Unfortunately, no further details are available, because the exception occurred in external code (system or framework?).
Upvotes: 1
Views: 652