chase

Reputation: 3782

How to Cythonize / allow numba.jit on a simple function: (Finding Triangles in network)

Backstory:

I have been searching for a highly performant way to find all cliques in a network up to a given size (e.g. all k-cliques with k<=3, i.e. all nodes, edges, and triangles). Since the low-dimensional case (k<=3 or k<=4) is what I usually need, I have resorted to simply looking for highly performant triangle-finding methods.

Networkx is incredibly slow; however, networkit has a much more performant solution with a Cython backend.

Unfortunately, networkit does not have an algorithm for listing all cliques up to a given size. It has a MaximalCliques algorithm, which solves a different problem and, from what I can tell, simply returns cliques of every possible size in no particular order. networkit can also count triangles, but it does not list the nodes that make up each triangle. Thus, I am writing my own function below that implements a reasonably efficient method.
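(To illustrate the mismatch, here is a minimal sketch assuming networkit's nk.clique.MaximalCliques API and a graph G loaded as in the snippet further down: the cliques come back in every size, so the triangles would still have to be extracted from them afterwards.)

mc = nk.clique.MaximalCliques(G)           # sketch only; maximal cliques, not all k<=3 cliques
mc.run()
cliques = mc.getCliques()                  # list of node lists, arbitrary sizes, no particular order
print(sorted({len(c) for c in cliques}))   # sizes are not limited to k <= 3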

Problem:

I have the function nk_triangles below; however, it is resisting an easy jamming into numba or Cython. Therefore, I wanted to see if anyone with more expertise in these areas may be able to shove this towards faster speeds.

I have made a simple, yet fully workable snippet of code with the function of interest here:

import networkit as nk
import numba
from itertools import combinations
from urllib.request import urlopen
import tempfile

graph_url="https://raw.githubusercontent.com/networkit/networkit/master/input/tiny_02.graph"
big_graph_url="https://raw.githubusercontent.com/networkit/networkit/master/input/caidaRouterLevel.graph"
with tempfile.NamedTemporaryFile() as f:
    with urlopen(graph_url) as r:
        f.write(r.read())
    f.flush()  # make sure the downloaded bytes are on disk before networkit reads the file
    G = nk.readGraph(f.name, nk.Format.METIS)

#@numba.jit
def nk_triangles(g):
    # Source:
    # https://cs.stanford.edu/~rishig/courses/ref/l1.pdf
    triangles = set()
    for node in g.iterNodes():
        ndeg = g.degree(node)

        # Keep only neighbors that come after `node` in a (degree, id) ordering,
        # so each triangle is generated from exactly one of its three vertices.
        neighbors = [neigh for neigh in g.iterNeighbors(node)
                     if (ndeg < g.degree(neigh)) or
                        ((ndeg == g.degree(neigh))
                          and node < neigh)]

        # Note: set() over the dict comprehension keeps only the keys (the node
        # triples); the max edge weight computed here is discarded.
        node_triangles = set({(node, *c): max(g.weight(u,v)
                                              for u,v in combinations([node,*c], 2))
                              for c in combinations(neighbors, 2)
                              if g.hasEdge(*c)})
        triangles = triangles.union(node_triangles)
    return triangles


tris = nk_triangles(G)
tris

The big_graph_url can be switched in to see whether the algorithm actually performs reasonably well at scale. (My graphs are still orders of magnitude larger than this.)

As it stands, this takes ~40 minutes to compute on my machine (single-threaded Python loops calling C backend code in networkit and itertools). The number of triangles in the big network is 455,062.
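For reference, one way such a loop can be made numba-friendly (a sketch only, not benchmarked here): numba.jit cannot see into networkit's graph object, so the graph first has to be flattened into plain numpy arrays (CSR-style neighbor lists), and only then can the hot loop be compiled. The helper names below are mine, node ids are assumed to be contiguous 0..n-1 (as with METIS-read graphs), and the compiled loop only counts triangles; listing the node triples as well would additionally need a preallocated output array or a numba.typed.List.

import numpy as np
import numba

def graph_to_csr(g):
    # Hypothetical helper: copy the networkit graph into CSR-style arrays so the
    # compiled loop below only touches plain numpy data.
    n = g.numberOfNodes()
    indptr = np.zeros(n + 1, dtype=np.int64)
    for u in g.iterNodes():
        indptr[u + 1] = indptr[u] + g.degree(u)
    indices = np.empty(indptr[-1], dtype=np.int64)
    pos = indptr[:-1].copy()
    for u in g.iterNodes():
        for v in g.iterNeighbors(u):
            indices[pos[u]] = v
            pos[u] += 1
    for u in range(n):
        indices[indptr[u]:indptr[u + 1]].sort()  # sorted neighbor lists for merging
    return indptr, indices

@numba.njit(cache=True)
def count_triangles_csr(indptr, indices):
    # For every edge (u, v) with u < v, count common neighbors w > v by merging the
    # two sorted neighbor lists, so each triangle is counted exactly once.
    n_triangles = 0
    n = indptr.shape[0] - 1
    for u in range(n):
        for i in range(indptr[u], indptr[u + 1]):
            v = indices[i]
            if v <= u:
                continue
            a, b = indptr[u], indptr[v]
            while a < indptr[u + 1] and b < indptr[v + 1]:
                if indices[a] == indices[b]:
                    if indices[a] > v:
                        n_triangles += 1
                    a += 1
                    b += 1
                elif indices[a] < indices[b]:
                    a += 1
                else:
                    b += 1
    return n_triangles

indptr, indices = graph_to_csr(G)   # G from the snippet above
print(count_triangles_csr(indptr, indices))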

Upvotes: 3

Views: 865

Answers (1)

Lukas S

Reputation: 3583

Here is a numpy version of your code taking ~1 min for your big graph.

import networkit as nk
import numpy as np
from urllib.request import urlopen
import tempfile

graph_url = "https://raw.githubusercontent.com/networkit/networkit/master/input/tiny_02.graph"
big_graph_url = "https://raw.githubusercontent.com/networkit/networkit/master/input/caidaRouterLevel.graph"

with tempfile.NamedTemporaryFile() as f:
    with urlopen(big_graph_url) as r:
        f.write(r.read())
    f.flush()  # flush the download to disk before networkit reads the file
    G = nk.readGraph(f.name, nk.Format.METIS)

nodes = np.array(tuple(G.iterNodes()))
adjacency_matrix = nk.algebraic.adjacencyMatrix(G, matrixType='sparse').astype('bool')
degrees = np.sum(adjacency_matrix, axis=0)
degrees = np.array(degrees).reshape(-1)



def get_triangles(node, neighbors):
    # Pairs of `node`'s neighbors that are themselves connected; each such pair
    # closes a triangle with `node`.
    buffer = neighbors[np.argwhere(triangle_condition(*np.meshgrid(neighbors, neighbors)))]
    triangles = np.empty((buffer.shape[0], buffer.shape[1]+1), dtype='int')
    triangles[:,0] = node
    triangles[:,1:] = buffer
    return triangles

def triangle_condition(v,w):
    # Boolean mask over the neighbor x neighbor grid: True on the strict upper
    # triangle (each unordered pair once) where the two neighbors share an edge.
    upper = np.tri(*v.shape,-1,dtype='bool').T
    upper[np.where(upper)] = adjacency_matrix[v[upper],w[upper]]
    return upper

def nk_triangles():
    triangles = list()
    for node in nodes:
        ndeg = degrees[node]
        neighbors = nodes[adjacency_matrix[node].toarray().reshape(-1)]
        neighbor_degs = degrees[neighbors]
        # Same (degree, id) ordering trick as the original code, so each triangle
        # is emitted from exactly one of its three vertices.
        neighbors = neighbors[(ndeg < neighbor_degs) | ((ndeg == neighbor_degs) & (node < neighbors))]
        if len(neighbors) >= 2:
            triangles.append(get_triangles(node, neighbors))
    return triangles

tris = np.concatenate(nk_triangles())
print('triangles:', len(tris))

Giving me

triangles: 455062
CPU times: user 50.6 s, sys: 375 ms, total: 51 s
Wall time: 52 s
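As a sanity check, the total can also be cross-checked against networkit's own per-edge triangle counts (a sketch assuming the nk.sparsification.TriangleEdgeScore API; each triangle is counted once per edge, hence the division by 3):

G.indexEdges()                                  # TriangleEdgeScore needs indexed edges
tes = nk.sparsification.TriangleEdgeScore(G)
tes.run()
print(sum(tes.scores()) // 3)                   # each triangle lies on 3 edges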

Upvotes: 4
