Radka

Reputation: 13

Speed up loops on large data in C#

I have three nested loops running from zero to n, where n is a large number, around 12,000. The loops operate on a 2D list; it is actually the Floyd–Warshall algorithm. With data this large it takes a long time. Could you advise me how to improve it? Thank you. (Sorry for my English :) )

    List<List<int>> distance = new List<List<int>>();

    ...

    // Floyd–Warshall: try to improve every pair (v, w) via intermediate vertex i.
    for (int i = 0; i < n; i++)
        for (int v = 0; v < n; v++)
            for (int w = 0; w < n; w++)
            {
                // int.MaxValue means "no path known yet"; skip those entries.
                if (distance[v][i] != int.MaxValue &&
                    distance[i][w] != int.MaxValue)
                {
                    int d = distance[v][i] + distance[i][w];
                    if (distance[v][w] > d)
                        distance[v][w] = d;
                }
            }

Upvotes: 0

Views: 1679

Answers (5)

Patrick D'Souza

Reputation: 3571

After a quick look at your code, it seems you might be heading for an overflow, because the condition check cannot prevent it.

In your code, the condition below adds no value on its own, since we can have distance[v][i] < int.MaxValue and distance[i][w] < int.MaxValue while distance[v][i] + distance[i][w] > int.MaxValue, which overflows.

    if (distance[v][i] != int.MaxValue && distance[i][w] != int.MaxValue)
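
For illustration, one way to guard against that overflow (assuming edge weights are non-negative and int.MaxValue stands in for "no edge") is to widen the sum to long before comparing; a minimal sketch:

    // Sketch only: widening to long means the sum itself cannot overflow,
    // and the result is only narrowed back when it is known to fit in an int.
    long d = (long)distance[v][i] + distance[i][w];
    if (d < distance[v][w])
        distance[v][w] = (int)d;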

Upvotes: 1

rincewound

Reputation: 328

As the others have mentioned, the complexity is fixed, so you don't have many options there. However, you can:

  • Use arrays instead of lists, if possible (see the sketch after this list).
  • Use an unsafe block with pointer semantics; this should decrease the time needed to access your array data.
  • Check whether you can parallelize your algorithm. In your case you could use multiple copies of your data (multiple copies to get rid of the need for synchronisation) and have several threads work on it, e.g. by splitting the range of the outer loop into subranges (1–1000, 1001–2000, and so on).
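
A minimal sketch of the arrays idea, assuming the square distance matrix from the question is currently a List<List<int>> named distance (dist and n are just illustrative local names):

    // Sketch only: copy the List<List<int>> into a jagged array once,
    // then run the same triple loop against the array.
    int n = distance.Count;                 // assumes every row also has n entries
    int[][] dist = new int[n][];
    for (int i = 0; i < n; i++)
    {
        dist[i] = new int[n];
        distance[i].CopyTo(dist[i]);
    }

    for (int i = 0; i < n; i++)
        for (int v = 0; v < n; v++)
            for (int w = 0; w < n; w++)
                if (dist[v][i] != int.MaxValue && dist[i][w] != int.MaxValue)
                {
                    int d = dist[v][i] + dist[i][w];
                    if (d < dist[v][w])
                        dist[v][w] = d;
                }

Copying the results back into the lists afterwards (if the rest of the code needs them) is a single O(n²) pass, which is negligible next to the O(n³) loop.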

Upvotes: 0

Terje

Reputation: 1763

The first part of your if statement, distance[v][i] != int.MaxValue, can be moved outside of the iteration over w to reduce overhead in some cases; something like the sketch below. However, I have no idea how often your values are int.MaxValue.
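
A sketch of that hoisting applied to the loop from the question (dvi is just an illustrative local variable):

    // Sketch: read distance[v][i] once per (i, v) pair and skip the whole
    // inner loop when it is "infinite".
    for (int i = 0; i < n; i++)
        for (int v = 0; v < n; v++)
        {
            int dvi = distance[v][i];
            if (dvi == int.MaxValue)
                continue;                     // nothing can be relaxed via i from v
            for (int w = 0; w < n; w++)
            {
                if (distance[i][w] != int.MaxValue)
                {
                    int d = dvi + distance[i][w];
                    if (distance[v][w] > d)
                        distance[v][w] = d;
                }
            }
        }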

Upvotes: 3

Konrad Rudolph

Reputation: 546015

You cannot change Floyd’s algorithm; its complexity is fixed (and it’s provably the most efficient solution to the general problem of finding all pairwise shortest-path distances in a graph with negative edge weights).

You can only improve the runtime by making the problem more specific or the data set smaller. For a general solution you’re stuck with what you have.

Upvotes: 2

Ian

Reputation: 34539

Normally I would suggest Parallel LINQ (for example, the Ray Tracer sample), but that assumes the items you're operating on are independent. In your example you are using results from a previous iteration in the current one, making it impossible to parallelize.

As your code is quite simple and there isn't really any overhead, there isn't much you can do to speed it up. As mentioned, you could switch the Lists to arrays. You might also want to compare double arithmetic to integer arithmetic on your target machine; a rough sketch of such a timing comparison follows.
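
A rough sketch of that comparison, not a rigorous benchmark; the array size, random seed and the Math.Min-based "relax" step are arbitrary illustrative choices:

    // Sketch only: time the same min/add pattern with int and with double.
    using System;
    using System.Diagnostics;

    class ArithmeticTimer
    {
        static void Main()
        {
            const int n = 2000;
            var rand = new Random(42);

            int[] a = new int[n * n];
            double[] b = new double[n * n];
            for (int k = 0; k < a.Length; k++)
            {
                a[k] = rand.Next(1, 1000);
                b[k] = a[k];
            }

            var sw = Stopwatch.StartNew();
            long intSum = 0;
            for (int k = 0; k + 1 < a.Length; k++)
                intSum += Math.Min(a[k], a[k] + a[k + 1]);
            sw.Stop();
            Console.WriteLine($"int:    {sw.ElapsedMilliseconds} ms (checksum {intSum})");

            sw.Restart();
            double dblSum = 0;
            for (int k = 0; k + 1 < b.Length; k++)
                dblSum += Math.Min(b[k], b[k] + b[k + 1]);
            sw.Stop();
            Console.WriteLine($"double: {sw.ElapsedMilliseconds} ms (checksum {dblSum})");
        }
    }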

Upvotes: 1
