Reputation: 3018
General background
I need to watch hundreds of files (each weighing around ~30 KB). Because of the file sizes and the rapid rate of changes, I have concluded that, to preserve reliability, I cannot trust FileSystemWatcher, even with InternalBufferSize increased to its 64 KB maximum.
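For reference, this is roughly the FileSystemWatcher setup I ruled out (the path and filter here are placeholders, not the real values):

using System.IO;

var watcher = new FileSystemWatcher(@"C:\incoming")
{
    InternalBufferSize = 64 * 1024, // 64 KB is the documented maximum
    NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName
};
watcher.Changed += (s, e) => { /* handle the change */ };
watcher.Error += (s, e) => { /* raised on buffer overflow; changes were lost */ };
watcher.EnableRaisingEvents = true;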
Extra information
The files are sent by a third party, so I cannot change how the input arrives (for example, to receive the data via a database instead). What I can do, obviously, is aggregate the files and do whatever I want with them.
Also, I need the handling to occur as close to the change time as possible.
Code
With that being said, I have created a small library which polls for changes at a fixed interval.
The class looks like this:
using System;
using System.Collections.Generic;
using System.IO;
using System.Timers;

public delegate void OnSingleFileModified(string filePath);
public delegate void OnFileError(Exception e);

public sealed class FolderMonitor : IDisposable
{
private const int TimerInterval = 10000; // milliseconds, i.e. poll every 10 seconds
private readonly Timer _timer;
private readonly Dictionary<string, DateTime> _files;
private readonly string _path;
private readonly string _pattern;
private bool _start;
public event OnSingleFileModified OnSingleFileModified;
public event OnFileError OnFileError;
public FolderMonitor(string folderPath, string pattern)
{
_start = true;
_path = folderPath;
_pattern = pattern;
_files = new Dictionary<string, DateTime>();
MapFolder();
_timer = new Timer(TimerInterval)
{
AutoReset = true,
Enabled = _start
};
_timer.Elapsed += CheckChanges;
}
private void MapFolder()
{
if (!Directory.Exists(_path))
return;
IEnumerable<string> files = Directory.EnumerateFiles(_path, _pattern);
foreach (string file in files)
{
DateTime lastWrite = File.GetLastWriteTimeUtc(file);
_files[file] = lastWrite;
}
}
private void CheckChanges(object sender, ElapsedEventArgs el)
{
if (!Directory.Exists(_path))
return;
try
{
_timer.Enabled = false; // stop the timer so ticks cannot overlap (prevents re-entrancy).
IEnumerable<string> files = Directory.EnumerateFiles(_path, _pattern);
foreach (string file in files)
{
DateTime lastWrite = File.GetLastWriteTimeUtc(file);
// if file is new, add it and report
if (!_files.TryGetValue(file, out DateTime prevLastWrite))
{
_files[file] = lastWrite;
OnSingleFileModified?.Invoke(file); // OnFileCreate
continue;
}
if (lastWrite == prevLastWrite)
continue;
// Change detected
_files[file] = lastWrite;
OnSingleFileModified?.Invoke(file); // OnFileModified
}
}
catch (Exception e)
{
OnFileError?.Invoke(e); // OnError
}
finally
{
_timer.Enabled = _start;
}
}
public void Start()
{
_start = true;
_timer.Enabled = true;
}
public void Stop()
{
_start = false;
_timer.Enabled = false;
}
public void Dispose()
{
_timer?.Dispose();
}
}
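For reference, this is roughly how the class is consumed (the path and pattern here are placeholders):

using (var monitor = new FolderMonitor(@"C:\incoming", "*.*"))
{
    monitor.OnSingleFileModified += filePath => Console.WriteLine($"Changed: {filePath}");
    monitor.OnFileError += e => Console.WriteLine($"Error: {e.Message}");

    monitor.Start();
    Console.ReadLine(); // in the real service this runs until OnStop
    monitor.Stop();
}

Note that the constructor already enables the timer, so the Start() call is only strictly needed after a Stop().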
Problem
When this is wrapped inside a Windows service, its CPU consumption is 53%.
Suspicion
My original suspicion: constructing a FileInfo per path consumes a lot of CPU.
New suspicion: the Directory.EnumerateFiles(_path, _pattern) call is still resource-consuming.
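A minimal way to test the new suspicion in isolation (the path and pattern are placeholders):

using System;
using System.Diagnostics;
using System.IO;

var sw = Stopwatch.StartNew();
int count = 0;
foreach (string file in Directory.EnumerateFiles(@"C:\incoming", "*.*"))
{
    File.GetLastWriteTimeUtc(file); // the same per-file call the monitor makes
    count++;
}
sw.Stop();
Console.WriteLine($"Scanned {count} files in {sw.ElapsedMilliseconds} ms");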
Any suggestions on how to resolve this? Thanks a lot.
EDIT: updated with the new code above.
Upvotes: 2
Views: 281
Reputation: 1679
Firstly, try adding the pattern to the
string[] files = Directory.GetFiles(_path);
statement, e.g.
string[] files = Directory.GetFiles(_path, _pattern);
You could also consider using the
Directory.EnumerateFiles()
method in the same way (note it returns a lazy IEnumerable&lt;string&gt;, not an array):
IEnumerable&lt;string&gt; files = Directory.EnumerateFiles(_path, _pattern);
as you can start working through the results before the entire collection has been returned.
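For example, something along these lines (a sketch only; the comparison step is the one from your own code):

foreach (string file in Directory.EnumerateFiles(_path, _pattern))
{
    // Files are yielded while the directory is still being read,
    // so each one can be handled before the full listing exists.
    DateTime lastWrite = File.GetLastWriteTimeUtc(file);
    // ...compare lastWrite against the cached timestamp here...
}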
Upvotes: 2