Reputation: 1445
I have the configuration below for my Azure Function in host.json, and I am sending 30,000 messages in multiple batches to Event Hub, but my Azure Function never picks up more than 64 events, which is the default value. Is there a particular reason it doesn't honor the maxBatchSize value?
The Event Hub is configured with a partition count of 25 for a single Event Hub and 15 throughput units (TUs).
{
  "version": "2.0",
  "extensions": {
    "eventHubs": {
      "batchCheckpointFrequency": 5,
      "eventProcessorOptions": {
        "maxBatchSize": 256,
        "prefetchCount": 512
      }
    }
  }
}
Upvotes: 0
Views: 3231
Reputation: 2199
For .NET Core, I had the same problem: the application did not pick up maxBatchSize or the other parameters from host.json. I could not get it to work through host.json, but it can be done with dependency injection (IOptions):
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Azure.WebJobs.EventHubs;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(Startup))]
public class Startup : FunctionsStartup
{
    public override void Configure(IFunctionsHostBuilder builder)
    {
        builder.Services.PostConfigure<EventHubOptions>(o =>
        {
            o.EventProcessorOptions.MaxBatchSize = 256;
            o.EventProcessorOptions.PrefetchCount = 512;
        });
    }
}
Now it works for me: it pulls up to 256 events when they are available.
Upvotes: 2
Reputation: 23
Setting "enableReceiverRuntimeMetric": true gives you batches up to the maxBatchSize count when events are available:
"eventHubs": {
  "batchCheckpointFrequency": 5,
  "eventProcessorOptions": {
    "enableReceiverRuntimeMetric": true,
    "invokeProcessorAfterReceiveTimeout": false,
    "maxBatchSize": 256,
    "prefetchCount": 512,
    "receiveTimeout": "00:05:00"
  }
}
Upvotes: 0
Reputation: 2042
You should not expect events to be delivered in batches of exactly maxBatchSize. maxBatchSize defines the maximum batch size, so each batch can contain anywhere from 1 to maxBatchSize events.
If a backlog builds up because a single host cannot keep pace, Azure Functions should scale out by adding more hosts. As the new hosts become active, they take over some of the partitions, increasing the overall read and processing rate. See the scaling section here for more details: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs
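To verify this yourself, you can log the size of each incoming batch from the trigger function and watch how it varies per invocation. A minimal sketch, assuming a hub named "my-hub" and an app setting "EventHubConnection" (both placeholder names):

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BatchSizeLogger
{
    [FunctionName("BatchSizeLogger")]
    public static void Run(
        [EventHubTrigger("my-hub", Connection = "EventHubConnection")] string[] events,
        ILogger log)
    {
        // Each invocation receives between 1 and maxBatchSize events,
        // depending on how many are buffered for the partition at that moment.
        log.LogInformation("Received {Count} events in this batch", events.Length);
    }
}
```

Under load you should see the logged count climb toward maxBatchSize; with a trickle of messages it will stay small, which is expected behavior rather than a misconfiguration.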
Upvotes: 0