Reputation: 165
I am seeing some weird behavior with NetworkX that does not match what I would expect from the documentation for the Graph object.
Here is my code:
In [22]: G = nx.Graph()
In [23]: G.add_node("Roasted",attr_dict={"css_color":"#454463"})
In [24]: G["Roasted"]
Out[24]: {}
In [25]: G.nodes(data=True)
Out[25]: [('Roasted', {'css_color': '#454463'})]
At In [23] I added a node with an attribute dictionary. At In [24], I simply indexed the graph G like I would a dictionary and expected it to return {'css_color': '#454463'}, but I got back an empty dictionary instead. I only get to see that dictionary if I print out the list of nodes with their data.
The documentation shows that:
>>> G.add_node(1, time='5pm')
>>> G.add_nodes_from([3], time='2pm')
>>> G.node[1]
{'time': '5pm'}
you should be able to obtain the dictionary simply by indexing with the node name itself. Why does it not work in my case?
EDIT: In case the problem was that I used a string instead of an int for the node name, I tried this:
In [29]: G.add_node(1,attr_dict={"css_color":"#454463"})
In [30]: G[1]
Out[30]: {}
And the problem still persists! Could this be a bug???
Upvotes: 0
Views: 1286
Reputation: 23887
Your expectation that G["Roasted"] should give the attributes of the node "Roasted" is the source of the error. In fact, G["Roasted"] gives information about the neighbors of "Roasted". The empty dictionary simply means that you have not assigned any neighbors to "Roasted".
import networkx as nx

G = nx.Graph()
G.add_edge(1, 2)
G[1]
> {2: {}}   # the neighbor (adjacency) dict of node 1
What you need to use is G.node["Roasted"] rather than G["Roasted"].
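To illustrate the difference, here is a minimal sketch assuming the NetworkX 1.x API used in the question; in NetworkX 2.x the node attribute view is G.nodes[...] (G.node was removed in 2.4) and the attr_dict keyword to add_node is gone, so attributes are passed as keyword arguments. The node "Coffee" is just an illustrative name.

import networkx as nx

G = nx.Graph()
# Passing the attribute as a keyword works in both 1.x and 2.x;
# attr_dict={...} only works in 1.x.
G.add_node("Roasted", css_color="#454463")

# Node attributes live in the node dict, not in G["Roasted"]:
G.node["Roasted"]    # {'css_color': '#454463'}   (G.nodes["Roasted"] in 2.x)

# Indexing the graph itself returns the adjacency (neighbor) dict:
G["Roasted"]         # {} -- "Roasted" has no neighbors yet

G.add_edge("Roasted", "Coffee")
G["Roasted"]         # {'Coffee': {}} once a neighbor exists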
Upvotes: 1