Drawing colored trees with NetworkX
NOTE: This is a complete reworking of my previous question, which I think was too complex. This is a much simpler version which, if I can work it out, should lead to progress.
I adapted my code from this SO answer: Is there a way to guarantee hierarchical output in networkx?
With the following very simple code, I get strange coloring behavior.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.DiGraph()
G.add_node("ROOT")
for i in xrange(1):
    for j in range(1):
        G.add_node(str(j) + "_%i" % i)
        if j == 0:
            G.add_edge("ROOT", str(j) + "_%i" % i)
        else:
            G.add_edge(str(j - 1) + "_%i" % i, str(j) + "_%i" % i)

pos = nx.graphviz_layout(G, prog='dot')
for i in xrange(1):
    nodelist = ['ROOT']
    for j in range(1):
        nodelist.append(str(j) + "_%i" % i)
    nx.draw_networkx_nodes(G, pos, nodelist=nodelist, cmap=plt.get_cmap('Set3'), node_color=[0, 1])
nx.draw_networkx_edges(G, pos, arrows=True)
limits = plt.axis('off')
It doesn't seem to matter which values I give for node_color, only whether the values differ from each other or not. For example, with node_color=[0, 1] I get the same behavior as with [0, .1] or [0, 1000]. (Why? A colormap takes values from 0 to 1.)
However, if I change the colormap, the colors DO change.
And if I set node_color=[3, 3] (or any two values that are the same), I always get the same result: both nodes are colored identically.
Any ideas what I am doing wrong here?
Before being passed to the colormap, the node color values are normalized to the interval [0, 1]. This is intended to use the full range of colors, regardless of the range of the input values. To scale over a different interval, you can set the vmin and vmax options:
nx.draw_networkx_nodes(G, pos, nodelist, cmap=plt.get_cmap('Set3'),
                       node_color=[0, 1], vmin=0, vmax=100)
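You can check the effect of vmin and vmax directly with matplotlib's Normalize class, which is the same scaling machinery that scatter (and hence draw_networkx_nodes) uses. A minimal sketch:

```python
from matplotlib.colors import Normalize

# With explicit vmin/vmax, the color values [0, 1] are scaled
# relative to the interval [0, 100] instead of to their own min/max.
norm = Normalize(vmin=0, vmax=100)
print(list(norm([0, 1])))  # [0.0, 0.01]
```

So instead of landing at the two extremes of the colormap, the two nodes now land at 0.0 and 0.01, i.e. nearly the same color.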
Detailed explanation:
From the description of the node_color parameter in the draw_networkx_nodes() docs:
If numeric values are specified they will be mapped to colors using the cmap and vmin, vmax parameters. See matplotlib.scatter for more details.
Unfortunately, those docs do not describe the behavior of vmin and vmax. But the linked documentation for matplotlib.scatter covers them in more detail:
vmin and vmax are used in conjunction with norm to normalize luminance data. If either is None, the min and max of the color array are used.
So by default, the minimum of the node_color array you pass is mapped to 0 in the colormap and the maximum is mapped to 1. Everything in between is mapped into the interval [0, 1] by a linear mapping (this is known as normalization, or feature scaling). Here are some examples, going from node_color to points in the colormap domain:
- [0, 0.1] → [0, 1]
- [0, 1000] → [0, 1]
- [0, 200, 1000] → [0, 0.2, 1]
- [10, 15, 20] → [0, 0.5, 1]
- [0, 0] → [0, 0] (or it could be [1, 1]; I didn't actually run this)
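The mappings above (apart from the degenerate [0, 0] case) can be verified with Normalize and its autoscale method; autoscale sets vmin and vmax from the data itself, mimicking the default behavior when neither is given. A sketch:

```python
from matplotlib.colors import Normalize

def scaled(values):
    # Mimic the default: vmin/vmax are taken as the min/max of the data.
    norm = Normalize()
    norm.autoscale(values)
    return list(norm(values))

print(scaled([0, 0.1]))        # [0.0, 1.0]
print(scaled([0, 1000]))       # [0.0, 1.0]
print(scaled([0, 200, 1000]))  # [0.0, 0.2, 1.0]
print(scaled([10, 15, 20]))    # [0.0, 0.5, 1.0]
```

This also explains the behavior in the question: any two distinct values normalize to 0 and 1, so the two nodes always get the two extreme colors of the colormap.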