Amit Gupta

Reputation: 471

Computing a pagerank on a weighted graph with absolute weights

I am facing the same issue as the one described in this question: Networkx PageRank - Equal Ranks with Different Weights.

Essentially, I am using networkx to compute the pagerank of a graph. The pagerank computation first converts the graph to a right-stochastic matrix (each node's outgoing edge weights are normalised to sum to one).

What I need is a way to not normalise the edge weights. So, if one node has only one outgoing edge with weight 0.1 and another node has only one outgoing edge with weight 0.05, I want this difference to be used in the computation of pagerank (rather than both weights being normalised to 1).
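For illustration, here is a minimal sketch of the behaviour I mean (the two one-edge graphs are made up just for the example):

import networkx as nx

# Two graphs that differ only in the weight of their single edge.
G1 = nx.DiGraph()
G1.add_edge("a", "b", weight=0.1)

G2 = nx.DiGraph()
G2.add_edge("a", "b", weight=0.05)

# Identical output for both graphs: each node's outgoing weights
# are normalised to sum to 1 before the power iteration, so the
# absolute sizes 0.1 vs 0.05 are discarded.
print(nx.pagerank(G1))
print(nx.pagerank(G2))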

Does anyone know the right way to modify pagerank to achieve this?

Thanks in advance, Amit

Upvotes: 0

Views: 2976

Answers (2)

Amit Gupta

Reputation: 471

I have a graph which is not right-stochastic (i.e. the edge weights are absolute and consistent across nodes). I changed the pagerank implementation in networkx so that it does not convert the initial matrix to a right-stochastic one, which gives me the answer I want. This does mean pagerank is no longer guaranteed to converge, since the total outgoing weight of a node can exceed 1, but in practice the rankings are fairly stable after 30-40 iterations.

In essence, removing this line from the networkx source (algorithms/link_analysis/pagerank_alg.py) did the job:

W = nx.stochastic_graph(D, weight=weight)
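For reference, here is a self-contained sketch of the same idea that avoids patching your installed networkx (the function name pagerank_raw and the fixed iteration count are my own choices, not part of networkx):

import networkx as nx

def pagerank_raw(G, alpha=0.85, max_iter=40, weight="weight"):
    # Power iteration using the raw edge weights directly,
    # i.e. skipping the right-stochastic normalisation step.
    # Without that step convergence is not guaranteed, so we run
    # a fixed number of iterations and rescale the scores each
    # round to keep the numbers bounded.
    n = G.number_of_nodes()
    x = {u: 1.0 / n for u in G}
    for _ in range(max_iter):
        x_new = {u: (1.0 - alpha) / n for u in G}
        for u, v, data in G.edges(data=True):
            x_new[v] += alpha * x[u] * data.get(weight, 1.0)
        s = sum(x_new.values())
        x = {u: r / s for u, r in x_new.items()}
    return x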

Upvotes: -1

Aric

Reputation: 25289

Maybe you are thinking of what Larry and Sergey called "Personalized PageRank"? You can adjust the weighting of the nodes in the random jump part of the algorithm to create a bias. E.g.

In [1]: import networkx as nx

In [2]: G = nx.DiGraph()

In [3]: G.add_path([1,2,3,4])

In [4]: nx.pagerank_numpy(G)
Out[4]: 
{1: 0.11615582303660349,
 2: 0.2148882726177166,
 3: 0.29881085476166286,
 4: 0.370145049584017}

In [5]: nx.pagerank_numpy(G,personalization={1:1,2:10,3:1,4:1})
Out[5]: 
{1: 0.031484535189871404,
 2: 0.341607206810105,
 3: 0.3218506609784609,
 4: 0.3050575970215628}
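(A side note: in networkx 3.x pagerank_numpy has been removed, but as far as I know the plain pagerank function accepts the same personalization dict:

nx.pagerank(G, personalization={1: 1, 2: 10, 3: 1, 4: 1})

It normalises the personalization vector internally, so the values only need to be relative.)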

See, for example, the discussion here: http://ilpubs.stanford.edu:8090/422/1/1999-66.pdf

Upvotes: 2
