theateist

Reputation: 14399

Writing each json message to unique file with NLog

I want to log each json message received from the network into a unique file.

At first I thought of doing File.WriteAllText($"{Guid.NewGuid()}.json", jsonMsg);. But I also need to be able to archive and delete old log files, so I would have to watch the folder when the application starts up and reimplement all the things NLog already knows how to do.

With NLog, I can create and add a new Target for each message from code, but I'm not sure how to remove those targets after the message has been written. Otherwise it will create a memory leak.

So, should I just stick with my first idea and implement the archive/delete logic myself, or is there a way to do this with NLog?

Upvotes: 0

Views: 197

Answers (1)

Rolf Kristensen

Reputation: 19867

I recommend that you use a single FileTarget for doing the writing, but avoid using MaxArchiveFiles for archive cleanup:

var logger = NLog.LogManager.GetCurrentClassLogger();

var myguid = Guid.NewGuid();
// Attach the Guid as an event property so the FileName layout can resolve it
var logEvent = NLog.LogEventInfo.Create(NLog.LogLevel.Info, null, jsonMsg);
logEvent.Properties["msguid"] = myguid;
logger.Log(logEvent);

And then use ${event-properties} in the FileName-option:

<target type="file" name="fileNetworkMessages"
        fileName="messages/Archive/Output-${event-properties:myguid}.json"
        layout="${message}" keepFileOpen="false" />
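Note that the target only receives events if a logging rule routes to it. A minimal rule could look like this (the logger-name pattern and minimum level here are assumptions; adjust to your setup):

```xml
<rules>
  <!-- Route all loggers at Info and above to the message file target -->
  <logger name="*" minlevel="Info" writeTo="fileNetworkMessages" />
</rules>
```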

When including a Guid in the filename, you should avoid using MaxArchiveFiles: it introduces a performance hit for every new file created, and the cleanup will not work anyway (the hex letters in the Guid disturb the file-wildcard matching).

NLog FileTarget MaxArchiveFiles has an overhead when rolling to a new file: it scans all existing files to decide whether cleanup is necessary. That works fine when rolling only once every hour or day. But with a Guid in the filename, the FileTarget will trigger this cleanup check for every new file created, which introduces a performance overhead.
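Since MaxArchiveFiles is off the table, old message files can instead be pruned by a small sweep of your own, run on startup or on a timer. A minimal sketch (the messages/Archive path, the Output-*.json pattern, and the 30-day retention are assumptions matching the config above):

```csharp
using System;
using System.IO;

// Sketch: delete message files older than the retention window.
var archiveDir = new DirectoryInfo("messages/Archive");
var cutoff = DateTime.UtcNow.AddDays(-30);

foreach (var file in archiveDir.EnumerateFiles("Output-*.json"))
{
    if (file.LastWriteTimeUtc < cutoff)
    {
        try { file.Delete(); }
        catch (IOException) { /* file may still be in use; skip and retry on the next sweep */ }
    }
}
```

Because each message lives in its own file, this sweep is independent of NLog and avoids the per-file scan that MaxArchiveFiles would perform.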

Upvotes: 1
