Reputation: 154995
I have a single method that takes a filename of an image and processes the image (CPU intensive), then uploads it to blob storage (async IO). Here is a method summary:
public async Task<ImageJob> ProcessImage(String fileName) {
    Byte[] imageBytes = await ReadFileFromDisk( fileName ).ConfigureAwait(false); // IO-bound
    Byte[] processedImage = RunFancyAlgorithm( imageBytes ); // CPU-bound
    Uri blobUri = await this.azureBlobClient.UploadBlob( processedImage ).ConfigureAwait(false); // IO-bound
    return new ImageJob( blobUri );
}
The other part of my program receives a list of thousands of filenames to be processed.
What is the most appropriate way of calling my ProcessImage method so that it makes maximum use of the available IO and CPU capacity?
I've identified seven different ways (so far) of calling my method, but I'm unsure which is best:
String[] fileNames = GetFileNames(); // typically contains thousands of filenames
// Approach 1:
{
    List<Task<ImageJob>> tasks = fileNames
        .Select( fileName => ProcessImage( fileName ) )
        .ToList();
    await Task.WhenAll( tasks );
}
// Approach 2:
{
    List<Task<ImageJob>> tasks = fileNames
        .Select( async fileName => await ProcessImage( fileName ) )
        .ToList();
    await Task.WhenAll( tasks );
}
// Approach 3:
{
    List<Task> tasks = new List<Task>();
    foreach( String fileName in fileNames )
    {
        Task imageTask = ProcessImage( fileName );
        tasks.Add( imageTask );
    }
    await Task.WhenAll( tasks );
}
// Approach 4 (Weirdly, this gives me warning CS4014: "Because this call is not awaited, execution of the current method continues before the call is completed. Consider applying the 'await' operator to the result of the call.")
// ...even though I don't use an async lambda here, and the previous 3 approaches don't trigger this warning. Why is Parallel.ForEach so special?
{
    ParallelLoopResult parallelResult = Parallel.ForEach( fileNames, fileName => ProcessImage( fileName ) );
}
// Approach 5:
{
    ParallelLoopResult parallelResult = Parallel.ForEach( fileNames, async fileName => await ProcessImage( fileName ) );
}
// Approach 6:
{
    List<Task<ImageJob>> tasks = fileNames
        .AsParallel()
        .Select( fileName => ProcessImage( fileName ) )
        .ToList();
    await Task.WhenAll( tasks );
}
// Approach 7:
{
    List<Task<ImageJob>> tasks = fileNames
        .AsParallel()
        .Select( async fileName => await ProcessImage( fileName ) )
        .ToList();
    await Task.WhenAll( tasks );
}
Upvotes: 4
Views: 162
Reputation: 7091
It sounds like you have many items that all need to be processed in exactly the same way. As @StephenCleary mentioned, TPL Dataflow is a great fit for this type of problem. A great intro can be found here. The simplest way to start is with just a couple of blocks, with your main TransformBlock executing ProcessImage.
Here's a simple example to get you started:
public class ImageProcessor {
    private TransformBlock<string, ImageJob> imageProcessor;
    private ActionBlock<ImageJob> handleResults;

    public ImageProcessor() {
        var options = new ExecutionDataflowBlockOptions() {
            BoundedCapacity = 1000,
            MaxDegreeOfParallelism = Environment.ProcessorCount
        };

        imageProcessor = new TransformBlock<string, ImageJob>(fileName => ProcessImage(fileName), options);
        handleResults = new ActionBlock<ImageJob>(job => HandleResults(job), options);

        imageProcessor.LinkTo(handleResults, new DataflowLinkOptions() { PropagateCompletion = true });
    }

    public async Task RunData() {
        var fileNames = GetFileNames();
        foreach (var fileName in fileNames) {
            await imageProcessor.SendAsync(fileName);
        }

        //all data passed into pipeline
        imageProcessor.Complete();
        //completion propagates through the link, so wait for the final block to finish
        await handleResults.Completion;
    }

    private async Task<ImageJob> ProcessImage(string fileName) {
        //Each of these steps could also be separated into discrete blocks
        var imageBytes = await ReadFileFromDisk(fileName).ConfigureAwait(false); // IO-bound
        var processedImage = RunFancyAlgorithm(imageBytes); // CPU-bound
        var blobUri = await this.azureBlobClient.UploadBlob(processedImage).ConfigureAwait(false); // IO-bound
        return new ImageJob(blobUri);
    }

    private void HandleResults(ImageJob job) {
        //do something with results
    }
}
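If you want the IO-bound and CPU-bound work to scale independently, the comment inside ProcessImage points the way: give each step its own linked block. Below is a rough sketch of that idea, assuming it runs inside an async method of the same class (so it can reuse ReadFileFromDisk, RunFancyAlgorithm, azureBlobClient and HandleResults from above); the capacity and parallelism numbers are illustrative placeholders, not recommendations.
// One block per stage, so CPU-bound parallelism can be capped at ProcessorCount
// while the IO-bound stages use a higher (illustrative) degree of parallelism.
var ioOptions  = new ExecutionDataflowBlockOptions { BoundedCapacity = 1000, MaxDegreeOfParallelism = 16 };
var cpuOptions = new ExecutionDataflowBlockOptions { BoundedCapacity = 1000, MaxDegreeOfParallelism = Environment.ProcessorCount };

var readBlock    = new TransformBlock<string, byte[]>(fileName => ReadFileFromDisk(fileName), ioOptions);
var processBlock = new TransformBlock<byte[], byte[]>(bytes => RunFancyAlgorithm(bytes), cpuOptions);
var uploadBlock  = new TransformBlock<byte[], ImageJob>(
    async bytes => new ImageJob(await this.azureBlobClient.UploadBlob(bytes).ConfigureAwait(false)), ioOptions);
var resultsBlock = new ActionBlock<ImageJob>(job => HandleResults(job), cpuOptions);

var linkOptions = new DataflowLinkOptions { PropagateCompletion = true };
readBlock.LinkTo(processBlock, linkOptions);
processBlock.LinkTo(uploadBlock, linkOptions);
uploadBlock.LinkTo(resultsBlock, linkOptions);

// Feed the first block, mark it complete, then wait for the last block to drain.
foreach (var fileName in GetFileNames()) {
    await readBlock.SendAsync(fileName);
}
readBlock.Complete();
await resultsBlock.Completion;
With the single-block version you would simply call await new ImageProcessor().RunData(); either way, BoundedCapacity plus SendAsync gives you backpressure, so feeding thousands of filenames won't flood memory.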
Upvotes: 3