Reputation: 163
I'm building a program that keeps track of a directory and adds an entry to a MySQL database whenever a new file is created in that directory.
Here's the code for the FileSystemWatcher; the storage instance is a MySQL wrapper class:
FileSystemWatcher watcher = new FileSystemWatcher
{
    Path = directoryToWatch,
    IncludeSubdirectories = true,
    NotifyFilter = NotifyFilters.Attributes |
                   NotifyFilters.DirectoryName |
                   NotifyFilters.FileName,
    EnableRaisingEvents = true,
    Filter = "*.*"
};
watcher.Created += OnDirectoryChange;

public void OnDirectoryChange(object sender, FileSystemEventArgs e)
{
    storage.Insert(Settings.Default.added_files, e.Name);
}
So that's clear. Here's the code for the MySQL class. CloseConnection is almost the same as OpenConnection, so I didn't copy that over.
public bool OpenConnection()
{
    try
    {
        connection.Open();
        return true;
    }
    catch (Exception)
    {
        // rethrow without resetting the stack trace
        throw;
    }
}
public void Insert(string tablename, string filename, string attractionCode = null, int number = 0)
{
    // The table name can't be a parameter, so it must come from a trusted source;
    // the file name is passed as a parameter to avoid quoting problems.
    string query = "INSERT INTO " + tablename + " (FILE_NAME) VALUES(@fileName)";
    MySqlCommand cmd = new MySqlCommand(query, connection);
    cmd.Parameters.AddWithValue("@fileName", filename);
    OpenConnection();
    cmd.ExecuteNonQuery();
    CloseConnection();
}
Now the thing is, when I paste 20 files at once into the directory the FileSystemWatcher checks, the SQL code only processes 18 of them. It throws errors like 'Connection was open already'. I also use a SELECT statement somewhere in the SQL code, which contains this part, for example:
MySqlCommand cmd = new MySqlCommand(query, connection);
MySqlDataReader dataReader = cmd.ExecuteReader();
while (dataReader.Read())
{
    list[0].Add(dataReader["id"] + "");
    list[1].Add(dataReader["code"] + "");
    list[2].Add(dataReader["name"] + "");
}
dataReader.Close();
This also sometimes throws errors like 'can only use one DataReader'.
I think the solution for me would be to create some kind of queue for all files the FileSystemWatcher handles and then iterate through that queue one by one. But how would I handle this? The FileSystemWatcher has to keep watching the directory, and I'm afraid it might miss some files while the queue is being processed. What could work?
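Something like the rough sketch below is what I'm imagining (a BlockingCollection with a single consumer task; the field and method names are just placeholders, and it would need using System.Collections.Concurrent and System.Threading.Tasks), but I'm not sure whether it's the right approach:
// Rough idea: the watcher only enqueues file names, and one background task
// drains the queue and does the database inserts one by one.
private readonly BlockingCollection<string> _fileQueue = new BlockingCollection<string>();

public void StartConsumer()
{
    Task.Run(() =>
    {
        // GetConsumingEnumerable blocks until a new item is added,
        // so the watcher keeps raising events while inserts are in progress.
        foreach (string fileName in _fileQueue.GetConsumingEnumerable())
        {
            storage.Insert(Settings.Default.added_files, fileName);
        }
    });
}

public void OnDirectoryChange(object sender, FileSystemEventArgs e)
{
    _fileQueue.Add(e.Name); // returns immediately, no database work here
}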
Upvotes: 3
Views: 159
Reputation: 2229
Borrowing from this excellent solution, add this class somewhere in your project:
public class BackgroundQueue
{
    private Task previousTask = Task.FromResult(true);
    private object key = new object();

    public Task QueueTask(Action action)
    {
        lock (key)
        {
            previousTask = previousTask.ContinueWith(t => action(),
                CancellationToken.None,
                TaskContinuationOptions.None,
                TaskScheduler.Default);
            return previousTask;
        }
    }

    public Task<T> QueueTask<T>(Func<T> work)
    {
        lock (key)
        {
            var task = previousTask.ContinueWith(t => work(),
                CancellationToken.None,
                TaskContinuationOptions.None,
                TaskScheduler.Default);
            previousTask = task;
            return task;
        }
    }
}
I propose the following change to your main module:
// Place this as a module-level variable so it doesn't go out of scope
// for as long as the FileSystemWatcher is running.
BackgroundQueue _bq = new BackgroundQueue();
Then the following change to invoke the queue:
FileSystemWatcher watcher = new FileSystemWatcher
{
    Path = directoryToWatch,
    IncludeSubdirectories = true,
    NotifyFilter = NotifyFilters.Attributes |
                   NotifyFilters.DirectoryName |
                   NotifyFilters.FileName,
    EnableRaisingEvents = true,
    Filter = "*.*"
};

watcher.Created += OnDirectoryChange;
public void OnDirectoryChange(object sender, FileSystemEventArgs e)
{
    // Using the shorthand lambda syntax to enqueue the insert
    _bq.QueueTask(() => storage.Insert(Settings.Default.added_files, e.Name));
}
This should enqueue each change the FileSystemWatcher throws your way. Keep in mind the comments in this SO Question about the FileSystemWatcher not catching everything.
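If you're still worried about dropped events, one extra knob (separate from the queue itself, just a suggestion) is the watcher's internal buffer and its Error event:
// Optional: enlarge the watcher's internal buffer (64 KB is the documented maximum)
// so bursts of file changes are less likely to overflow it.
watcher.InternalBufferSize = 64 * 1024;

// If the buffer does overflow, the watcher raises Error instead of Created events.
watcher.Error += (s, args) => Console.WriteLine("Watcher error: " + args.GetException().Message);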
Upvotes: 2