Quant

Reputation: 61

C# Using Process.Start() efficiently

Below is a piece of code that joins two files to generate another file.

foreach (string jpegfile in files)
{
    string fileName = jpegfile + ".dcm";
    string configFile = System.IO.Path.GetFileNameWithoutExtension(jpegfile) + ".cfg";
    string destFile = System.IO.Path.Combine(DirectoryPath, fileName);
    string command = "/C " + batFileName + " -C " + configFile + " " + jpegfile + " " + destFile;
    try
    {
        System.Diagnostics.Process process = new System.Diagnostics.Process();
        process.StartInfo.UseShellExecute = false;
        // Run the batch file through cmd.exe; /C executes the command and then exits.
        process.StartInfo.FileName = "cmd.exe";
        process.StartInfo.Arguments = command;
        process.StartInfo.CreateNoWindow = true;

        process.Start();

        process.WaitForExit();

        this.counter++;

        // Report progress as a percentage of the total task.
        int percentComplete =
            (int)((float)this.counter / (float)form.sumofImages * 100);
        if (percentComplete > highestPercentageReached)
        {
            highestPercentageReached = percentComplete;
            worker.ReportProgress(percentComplete);
        }
    }
    catch (Exception exp)
    {
        MessageBoxButtons buttons = MessageBoxButtons.OK;
        DialogResult result;
        result = MessageBox.Show("Batch file execution error " + exp, "Warning", buttons);
    }
}

My question is: if I use WaitForExit() after Start(), the code takes a good amount of time. How can I speed up the above loop?

Upvotes: 3

Views: 11236

Answers (2)

You could spawn a bunch of tasks and do a Task.WaitAll, as shown in Matt Burland's answer.

A few other options are as follows (I haven't tested these too thoroughly, so you'll probably want to do so). First, you could use the process's Exited event instead of WaitForExit:

private int counter = 0;
private int sumOfImages = 10; // Set this to the number of files

private void ProcessStart(List<string> files)
{
    foreach (string file in files)
    {
        Process process = new Process();
        process.StartInfo.UseShellExecute = false;
        // Start each conversion without waiting for it to finish.
        process.StartInfo.FileName = "cmd.exe";
        process.StartInfo.Arguments = "someCommand";
        process.StartInfo.CreateNoWindow = true;
        process.EnableRaisingEvents = true;
        process.Exited += Process_Exited;

        process.Start();
    }
}

private void Process_Exited(object sender, EventArgs e)
{
    int result = Interlocked.Increment(ref counter);

    // Use floating-point division so the percentage isn't truncated to zero.
    int percentComplete = (int)((float)result / sumOfImages * 100);

    worker.ReportProgress(percentComplete);
}

You could, if you wanted, put the whole thing on the Thread Pool. I actually like this as an answer: the CPU-bound part here is creating and starting the processes, and putting that on a background thread keeps it from hanging your UI, while also eliminating the overhead of blocking to wait for results, which isn't CPU-bound work.
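
A minimal sketch of that (not part of the original answer; it assumes ProcessStart is the method above and files is already available):

// Push the whole spawning loop onto a thread pool thread so it doesn't block the UI thread.
Task.Run(() => ProcessStart(files));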

Here's an illustration. Suppose you go to a restaurant with 10 people. When the waiter comes, 9 out of the 10 people are ready to order. The waiter happens to ask the one undecided guy for his order first. At this point, everyone at the table could either wait around for him to decide or the waiter can take orders from the 9 other people at the table and come back to the first guy. Odds are it would be rather pointless to bring in an additional waiter to wait for the first guy's order while the original waiter takes the other 9 orders. If absolutely necessary, the waiter could take the 9 orders to the kitchen and come back to take the first guy's order.

The point is that if it's just a matter of waiting for results from one of the people, bringing in additional waiters isn't necessarily going to give you much of a performance boost.

Obviously in this analogy the waiter is a thread and the people are tasks that need to be accomplished. In the above solution, you have a single waiter (the Thread Pool thread) serve all of the people (create all of the processes) and then the people (processes) tell him when they're ready to order (i.e. the process raises the Exited event). He then tells the kitchen their order (calls ReportProgress on the worker).

Another option would be a Parallel.ForEach loop:

private void ProcessStart(List<string> files)
{
    int sumOfImages = files.Count;
    int count = 0;

    Parallel.ForEach(files, file =>
    {
        // Build the cmd.exe arguments for this particular file here.
        string command = "";

        Process process = new Process();
        process.StartInfo.UseShellExecute = false;
        process.StartInfo.FileName = "cmd.exe";
        process.StartInfo.Arguments = command;
        process.StartInfo.CreateNoWindow = true;

        process.Start();

        process.WaitForExit();

        int result = Interlocked.Increment(ref count);

        // Use floating-point division so the percentage isn't truncated to zero.
        int percentComplete = (int)((float)result / sumOfImages * 100);

        worker.ReportProgress(percentComplete);
    });
}
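
If you want to cap how many cmd.exe processes run at the same time, Parallel.ForEach also accepts a ParallelOptions with MaxDegreeOfParallelism. A sketch (not in the original answer; the limit of 4 is arbitrary):

// Limit the number of simultaneous cmd.exe processes.
var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

Parallel.ForEach(files, options, file =>
{
    // ...same body as above: start the process, wait for it, report progress...
});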

Upvotes: 1

Matt Burland

Reputation: 45155

If you don't have to wait for the previous file to finish before you start the next one, you could try something like this:

// for each file, spawn a task to do your processing
var tasks = (from jpegfile in files
            select Task.Run(() =>
            {
                 // set up your process here...
                 process.Start();
                 // task won't be done until process exits
                 process.WaitForExit();
            })).ToArray();
// Wait for all the tasks to be done
Task.WaitAll(tasks);
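
Filled in with the cmd.exe setup from the question, that might look roughly like the sketch below (variable names such as DirectoryPath and batFileName are assumed to exist as in the question):

// One Task per file; each runs the batch file via cmd.exe and waits for that process.
var tasks = (from jpegfile in files
             let configFile = System.IO.Path.GetFileNameWithoutExtension(jpegfile) + ".cfg"
             let destFile = System.IO.Path.Combine(DirectoryPath, jpegfile + ".dcm")
             let command = "/C " + batFileName + " -C " + configFile + " " + jpegfile + " " + destFile
             select Task.Run(() =>
             {
                 var process = new Process();
                 process.StartInfo.UseShellExecute = false;
                 process.StartInfo.FileName = "cmd.exe";
                 process.StartInfo.Arguments = command;
                 process.StartInfo.CreateNoWindow = true;
                 process.Start();
                 process.WaitForExit();   // the task completes only when the process exits
             })).ToArray();

// Blocks the calling thread until every conversion has finished.
Task.WaitAll(tasks);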

Upvotes: 5
