Reputation: 197
We currently use log4net with DB and file appenders to capture our logging. In production we mostly run with WARN and higher levels only, but INFO-level logging is occasionally turned on to troubleshoot end-user issues.
These appenders have been reliable; we have never lost logs in a way that left our support team unable to troubleshoot an issue.
We are setting up Azure hosting for our application, and Application Insights is adding a lot of value with its standard telemetry collection. We are now evaluating redirecting the log4net logs into Application Insights and gradually making AI the log aggregator.
Microsoft has released a log4net appender for Application Insights: https://www.nuget.org/packages/Microsoft.ApplicationInsights.Log4NetAppender
I could not find enough information on how sampling affects these logs. For example, if some INFO logs are lost to sampling, it is not ideal, but we could live with it. If WARN, ERROR or FATAL logs are lost, however, AI becomes unreliable for our support team and they would simply fall back to the traditional logs in the DB or files.
Sampling of automatic telemetry (requests, dependencies, etc.) is fine and understood, but is there a mechanism to ensure the log4net logs are reliably made available in Application Insights?
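For context, this is roughly the appender wiring we are evaluating (a sketch only; the appender name, threshold and layout are placeholder values we would adjust, and we assume the instrumentation key is picked up from the active TelemetryConfiguration rather than set here):

    <log4net>
      <appender name="aiAppender"
                type="Microsoft.ApplicationInsights.Log4NetAppender.ApplicationInsightsAppender, Microsoft.ApplicationInsights.Log4NetAppender">
        <!-- WARN in normal operation; temporarily lowered to INFO when troubleshooting -->
        <threshold value="WARN" />
        <layout type="log4net.Layout.PatternLayout">
          <conversionPattern value="%message%newline" />
        </layout>
      </appender>
      <root>
        <level value="ALL" />
        <appender-ref ref="aiAppender" />
      </root>
    </log4net>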
Upvotes: 1
Views: 534
Reputation: 29940
According to this issue, the log4net appender
is sampled in the same way as other telemetry.
So you can simply turn sampling off (the default is adaptive sampling) in code.
For example, if it is a .NET Core application, you can use the code below, as per this doc:
public void ConfigureServices(IServiceCollection services)
{
    // ...

    // Disable adaptive sampling so that no telemetry, including the trace
    // telemetry sent by the log4net appender, is dropped by the SDK.
    var aiOptions = new Microsoft.ApplicationInsights.AspNetCore.Extensions.ApplicationInsightsServiceOptions();
    aiOptions.EnableAdaptiveSampling = false;
    services.AddApplicationInsightsTelemetry(aiOptions);

    // ...
}
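If you would rather keep adaptive sampling for the high-volume request and dependency telemetry and only protect the log entries, you can disable the default sampling as above and then re-add it with the Trace type excluded. This is a sketch based on the TelemetryProcessorChainBuilder from Microsoft.ApplicationInsights.Extensibility; the usual Startup.Configure signature and the 5 items/second figure are just example values:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, TelemetryConfiguration configuration)
{
    var builder = configuration.DefaultTelemetrySink.TelemetryProcessorChainBuilder;

    // Re-enable adaptive sampling for everything except Trace telemetry,
    // which is the type the log4net appender sends its log events as.
    builder.UseAdaptiveSampling(maxTelemetryItemsPerSecond: 5, excludedTypes: "Trace");
    builder.Build();

    // ...
}

With that in place the WARN/ERROR/FATAL (and INFO) traces should never be sampled out, while requests and dependencies still benefit from sampling.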
Upvotes: 1