I am new to Serilog, and I am trying to determine how to send serialized JSON to the console with a log level and a date/time field. There doesn't seem to be any info about this in the docs under structured data.
Here is my code, which is called from Startup.cs:
private void LoggerLoop(ILogger&lt;Startup&gt; logger)
{
    RabbitModel rb = new RabbitModel
    {
        Id = 1,
        DeviceNum = 1,
        DeviceName = "Device 1",
        InputNum = 1,
        InputName = "Input 1",
        InputState = 1,
        OnPhrase = "On",
        OffPhrase = "Off",
        When = "2020-01-01T22:45:00.1124303+00:00"
    };

    while (true)
    {
        logger.LogInformation("{@rb}", rb);
        Thread.Sleep(1000);
    }
}
And here is my output:
[14:28:22 INF] {"Id": 1, "DeviceNum": 1, "DeviceName": "Device 1", "InputNum": 1, "InputName": "Input 1", "InputState": 1, "OnPhrase": "On", "OffPhrase": "Off", "When": "2020-01-01T22:45:00.1124303+00:00", "$type": "RabbitModel"}
I did notice that it added a $type field, and wondered whether it is possible for the [14:28:22 INF] part to be added to the JSON as well?
According to the twelve-factor app methodology, an application should write all of its logs to stdout/stderr.
You then collect the logs together and route them to one or more final destinations for viewing (e.g. Elasticsearch). Open-source log routers (such as Fluent Bit, Fluentd, and Logplex) are available for this purpose.
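As a minimal illustration of such a router (the paths, host, and index name here are placeholders, not part of the original answer), a Fluent Bit pipeline that tails container stdout and forwards it to Elasticsearch could look like:

```ini
[INPUT]
    Name    tail
    Path    /var/log/containers/*.log
    Parser  docker

[OUTPUT]
    Name    es
    Host    elasticsearch
    Port    9200
    Index   app-logs
```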
So the app never concerns itself with the routing or storage of its logs. In a .NET app you can easily achieve this using Serilog.
Let's say we have the following logger settings in appsettings.json:
"Logging": {
    "OutputFormat": "console",
    "MinimumLevel": "Information"
}
We can create an extension method
private static IWebHostBuilder CreateWebHostBuilder() =>
    WebHost.CreateDefaultBuilder()
        .UseStartup&lt;Startup&gt;()
        .UseLogging();
that can write logs to the console in either plain-text or Elasticsearch format. Plain-text logs are useful during development because they are more human-readable. In production we enable the Elasticsearch format and view all logs only in Kibana.
The code of the extension, with comments:
public static IWebHostBuilder UseLogging(this IWebHostBuilder webHostBuilder, string applicationName = null) =>
    webHostBuilder
        .UseSetting("suppressStatusMessages", "True") // disable startup logs
        .UseSerilog((context, loggerConfiguration) =>
        {
            // read the level from appsettings.json
            var logLevel = context.Configuration.GetValue&lt;string&gt;("Logging:MinimumLevel");
            if (!Enum.TryParse&lt;LogEventLevel&gt;(logLevel, true, out var level))
            {
                level = LogEventLevel.Information; // fall back to a default value
            }

            // get the application name from appsettings.json
            applicationName = string.IsNullOrWhiteSpace(applicationName)
                ? context.Configuration.GetValue&lt;string&gt;("App:Name")
                : applicationName;

            loggerConfiguration.Enrich
                .FromLogContext()
                .MinimumLevel.Is(level)
                .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
                .MinimumLevel.Override("System", LogEventLevel.Warning)
                .Enrich.WithProperty("Environment", context.HostingEnvironment.EnvironmentName)
                .Enrich.WithProperty("ApplicationName", applicationName);

            // read any other Serilog configuration
            loggerConfiguration.ReadFrom.Configuration(context.Configuration);

            // get the output format from appsettings.json
            var outputFormat = context.Configuration.GetValue&lt;string&gt;("Logging:OutputFormat");
            switch (outputFormat)
            {
                case "elasticsearch":
                    loggerConfiguration.WriteTo.Console(new ElasticsearchJsonFormatter());
                    break;
                default:
                    loggerConfiguration.WriteTo.Console(
                        theme: AnsiConsoleTheme.Code,
                        outputTemplate: "[{Timestamp:yy-MM-dd HH:mm:ss.fffZ} {Level:u3}] {Message:lj} <s:{Environment}{ApplicationName}/{SourceContext}>{NewLine}{Exception}");
                    break;
            }
        });
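Since the extension also calls ReadFrom.Configuration, further Serilog settings (and the App:Name value read above) can live in appsettings.json too. An illustrative fragment (the WithMachineName enricher requires the Serilog.Enrichers.Environment package and is an assumption, not part of the original answer):

```json
"App": {
  "Name": "API"
},
"Serilog": {
  "Enrich": [ "WithMachineName" ]
}
```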
When OutputFormat is elasticsearch, the log output will look like this:
{"@timestamp":"2020-02-07T16:02:03.4329033+02:00","level":"Information","messageTemplate":"Get customer by id: {CustomerId}","message":"Get customer by id: 20","fields":{"CustomerId":20,"SourceContext":"Customers.Api.Controllers.CustomerController","ActionId":"c9d77549-bb25-4f87-8ea8-576dc6aa1c57","ActionName":"Customers.Api.Controllers.CustomerController.Get (Customers.Api)","RequestId":"0HLTBQP5CQHLM:00000004","RequestPath":"/v1/customers","CorrelationId":"daef8849b662117e","ConnectionId":"0HLTBQP5CQHLM","Environment":"Development","ApplicationName":"API","Timestamp":"2020-02-07T14:02:03.4329033Z"}}
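To answer the original question more directly: Serilog's compact JSON formatters (from the Serilog.Formatting.Compact package) also emit the timestamp and level inside the JSON itself, so the information from the [14:28:22 INF] prefix is not lost when you switch to JSON output. A minimal sketch (the message and properties here are made up for illustration):

```csharp
using Serilog;
using Serilog.Formatting.Compact;

class Program
{
    static void Main()
    {
        // RenderedCompactJsonFormatter writes one JSON object per event,
        // including "@t" (timestamp), "@l" (level; omitted for Information)
        // and "@m" (the rendered message), so the "[14:28:22 INF]" data
        // ends up inside the JSON payload.
        Log.Logger = new LoggerConfiguration()
            .WriteTo.Console(new RenderedCompactJsonFormatter())
            .CreateLogger();

        Log.Information("Device {DeviceName} is {State}", "Device 1", "On");
        Log.CloseAndFlush();
    }
}
```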
Otherwise (use this only for local debugging):
[20-02-07 13:59:16.16Z INF] Get customer by id: 20
Then you should configure a log router to collect the logs from the container and send them to Elasticsearch.
If all logs are structured, it makes searching and creating indexes in Kibana much easier.
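For instance, once the JSON events above are indexed in Elasticsearch, a Kibana KQL query can filter directly on the structured fields (field names taken from the sample event; the exact index mapping is an assumption):

```
fields.CustomerId: 20 and fields.Environment: "Development"
```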