Reputation: 1
I'm using .NET 8 and I have the following function running in a foreach loop:
private bool LookupHistoryFile(string salesID, MapValueContainer folderInfo, string created)
{
    var monthYearPath = GetMonthFolder(created);
    var filename = salesID.RemoveLeadingZeros().ToXmlFileName("CD");
    var path = $"{folderInfo.ShopType}\\{folderInfo.CustomerFolderName}\\Remote\\History\\{monthYearPath}";
    var lookupPath = string.Format("{0}\\{1}\\", _remoteDrivePath, path);

    var doesFolderHaveHistory = Directory.Exists(lookupPath);
    if (!doesFolderHaveHistory) return false;

    var dirInfo = new DirectoryInfo(lookupPath);
    return dirInfo.EnumerateFiles(searchPattern: "*.xml", SearchOption.TopDirectoryOnly)
                  .Where(f => f.Name.EndsWith(filename))
                  .Any();
}
My list has about 500-600 items. The problem arises when I am enumerating a folder on a remote network drive that has 1000+ files: execution almost stops entirely, yet if I apply a filter in File Explorer on the same folder it is very fast. Is there a better way to search in directories than:
dirInfo.EnumerateFiles(searchPattern: "*.xml", SearchOption.TopDirectoryOnly).Where(f => f.Name.EndsWith(filename)).Any();
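For example, would it help to push the lookup name into the search pattern itself, so the file system does the filtering instead of my EndsWith over every FileInfo? A rough sketch of what I mean (assuming the wildcard match behaves the same way as my EndsWith check):

    // Sketch: let the enumeration filter on the name instead of materialising
    // a FileInfo for every *.xml in the remote folder.
    // "filename" already ends in ".xml" (my assumption), so "*" + filename
    // should match the same files as the EndsWith above.
    return Directory.EnumerateFiles(lookupPath, "*" + filename, SearchOption.TopDirectoryOnly).Any();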
I can see in the Visual Studio diagnostics monitor that the GC heap shoots up while process memory stays the same. Would async/await be the solution?
Here is the for loop:
public List<OrderCompleteResult> LookupHistoryCDFiles(IEnumerable<OrderCompleteResult> orderCompleteResults)
{
    var results = orderCompleteResults.ToList();

    for (int i = 0; i < orderCompleteResults.Count(); i++)
    {
        if (Constants.Maps.CustomerFolderMap.TryGetValue(results[i].CustomerNo, out var folderMap))
        {
            results[i].CustomerName = folderMap.CustomerFolderName;
            results[i].CDHistoryFileExists = LookupHistoryFile(results[i].SalesId, folderMap, results[i].Created);
        }
        else
        {
            _logger.LogWarning($"Cannot locate order: {results[i].OrderId} the customer id: {results[i].CustomerNo} in customer folder maps");
        }
    }

    return results;
}
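In case async/parallel is the right direction, this is roughly what I had in mind (only a sketch: as far as I know there is no truly async directory API, so Task.Run just moves my existing blocking lookup onto the thread pool; it also assumes OrderCompleteResult is a reference type, as in my loop above):

    // Sketch of a parallel version of the loop above, bounded so I don't
    // hammer the network share with hundreds of concurrent requests.
    public async Task<List<OrderCompleteResult>> LookupHistoryCDFilesAsync(IEnumerable<OrderCompleteResult> orderCompleteResults)
    {
        var results = orderCompleteResults.ToList();
        var options = new ParallelOptions { MaxDegreeOfParallelism = 8 };

        await Parallel.ForEachAsync(results, options, async (result, _) =>
        {
            if (Constants.Maps.CustomerFolderMap.TryGetValue(result.CustomerNo, out var folderMap))
            {
                result.CustomerName = folderMap.CustomerFolderName;
                // LookupHistoryFile is the synchronous method from the top of the question.
                result.CDHistoryFileExists = await Task.Run(() => LookupHistoryFile(result.SalesId, folderMap, result.Created));
            }
            else
            {
                _logger.LogWarning($"Cannot locate order: {result.OrderId} the customer id: {result.CustomerNo} in customer folder maps");
            }
        });

        return results;
    }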
Upvotes: -1
Views: 70