Reputation: 10297
I want to parse data from a log file, pump it into a database, and then purge the log file.
I could use the FileSystemWatcher component and monitor its Changed event, but the event would fire non-stop, since the log file is pretty much "constantly" being written to, and I don't want to be opening/closing db connections willy-nilly.
My current instinct is to use a Timer, and then parse/pump/purge the log file every so often (based on elapsed time, or on time combined with file size).
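Roughly what I have in mind (the interval is arbitrary, and ParseAndPump/Purge are placeholder names for the real work):

    using System;
    using System.Timers;

    var timer = new Timer(60000); // e.g. once a minute
    timer.Elapsed += (sender, e) =>
    {
        // open one db connection, parse + bulk-insert, then truncate
        // ParseAndPump(@"C:\logs\app.log");
        // Purge(@"C:\logs\app.log");
    };
    timer.Start();
    Console.ReadLine(); // keep the process alive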
Is there a common/proven way of handling the scenario (design pattern)?
Update: I see FileSystemWatcher has a NotifyFilter property, and one of the filterable values is NotifyFilters.Size; I'm guessing (haven't found any verification yet) that it fires any time the size of the file changes by 1KB; that would be a reasonable "throttle," if true...
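For reference, wiring that up would look something like this, I think (directory and file name are made up):

    using System;
    using System.IO;

    var watcher = new FileSystemWatcher(@"C:\logs")
    {
        Filter = "app.log",
        NotifyFilter = NotifyFilters.Size // only raise Changed when the size changes
    };
    watcher.Changed += (sender, e) => Console.WriteLine("size changed: " + e.FullPath);
    watcher.EnableRaisingEvents = true;
    Console.ReadLine(); // keep watching until Enter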
Upvotes: 0
Views: 589
Reputation: 74290
Not sure if this is a design pattern, but if you control how much the writer buffers before it actually touches the log file, you can minimize how often change events fire.
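A sketch of what I mean on the writing side (the 64 KB threshold is arbitrary):

    using System.IO;
    using System.Text;

    // Accumulate log lines in memory and only touch the file once the
    // buffer reaches a threshold, so a watcher sees far fewer changes.
    class BufferedLogger
    {
        private readonly StringBuilder _buffer = new StringBuilder();
        private readonly string _path;
        private const int FlushThreshold = 64 * 1024; // 64 KB, arbitrary

        public BufferedLogger(string path)
        {
            _path = path;
        }

        public void Write(string line)
        {
            _buffer.AppendLine(line);
            if (_buffer.Length >= FlushThreshold)
                Flush();
        }

        public void Flush()
        {
            File.AppendAllText(_path, _buffer.ToString());
            _buffer.Length = 0; // reset the buffer
        }
    }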
Upvotes: 1
Reputation: 21881
Do you have any control over the log file generation? If so, what you could do is roll over to a new log file every time the current one reaches a certain size, renaming the old file to a specific "archive" naming pattern. Then have the FileSystemWatcher filter for the archive log files and process each one when it is created.
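Something along these lines for the watcher side (the *.archive.log naming is just an example):

    using System;
    using System.IO;

    // Watch only for the rolled-over archive files; the live log is ignored.
    var watcher = new FileSystemWatcher(@"C:\logs", "*.archive.log");
    watcher.Created += (sender, e) =>
    {
        // parse e.FullPath, pump it into the database,
        // then delete the archive once it's safely stored
    };
    watcher.EnableRaisingEvents = true;
    Console.ReadLine(); // keep watching until Enter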
Upvotes: 1
Reputation: 42656
The Changed event is way too chatty here. I would check the file on a scheduled basis with a timer, looking at the modification timestamp (and possibly the creation timestamp as well, especially in case someone deletes and recreates the file).
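A minimal sketch, assuming a 30-second interval and a hard-coded path:

    using System;
    using System.IO;
    using System.Timers;

    class LogPoller
    {
        private static DateTime _lastSeen = DateTime.MinValue;

        static void Main()
        {
            var timer = new Timer(30000); // poll every 30 seconds
            timer.Elapsed += (sender, e) => CheckLog(@"C:\logs\app.log");
            timer.Start();
            Console.ReadLine(); // keep the process alive
        }

        static void CheckLog(string path)
        {
            var info = new FileInfo(path);
            if (!info.Exists)
                return;

            // LastWriteTimeUtc catches normal appends;
            // CreationTimeUtc catches the delete/recreate case.
            var newest = info.LastWriteTimeUtc > info.CreationTimeUtc
                ? info.LastWriteTimeUtc
                : info.CreationTimeUtc;

            if (newest > _lastSeen)
            {
                _lastSeen = newest;
                // parse/pump/purge here
            }
        }
    }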
Upvotes: 1