Reputation: 303
I want to show a color gradient as the background of a matplotlib chart. I found some code on the matplotlib site that is close to what I want.
But when I use that example with my own colormap, the values I specify don't line up with the values on the axis.
For example, using the sample code above, I specify my colors and values in a dict.
Notice the 0.8 anchor value: I expect 0 to 0.8 to be black, and 0.8 to 1.0 to be a gradient from red to black.
cdict1 = {
    'red': [
        (0.0, 0.0, 0.0),
        (0.8, 0.0, 0.0),
        (0.8, 1.0, 1.0),
        (1.0, 0.0, 0.0),
    ],
    'blue': [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    'green': [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
}
Converting the dict to a colormap:
from matplotlib.colors import LinearSegmentedColormap
testcm1 = LinearSegmentedColormap('testcm1', cdict1)
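As a quick check, sampling the colormap directly (colormaps are callable with a float in [0, 1] and return an RGBA tuple) suggests the colormap itself honors the 0.8 boundary:
print(testcm1(0.79))  # (0.0, 0.0, 0.0, 1.0) -- black just below the break
print(testcm1(0.81))  # roughly (0.94, 0.0, 0.0, 1.0) -- red just above it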
And changing the call of gradient_image to use my colormap:
gradient_image(ax, direction=0, extent=(0, 1, 0, 1), transform=ax.transAxes,
               cmap=testcm1, cmap_range=(0, 1))
Yet my output shows the red-to-black gradient starting at about 0.72.
Naturally, I expect the gradient to line up exactly with the 0.8 value on the y-axis, and I have no idea what is happening here.
My code:
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap
import numpy as np
np.random.seed(19680801)
def gradient_image(ax, extent, direction=0.3, cmap_range=(0, 1), **kwargs):
    """
    Draw a gradient image based on a colormap.

    Parameters
    ----------
    ax : Axes
        The axes to draw on.
    extent
        The extent of the image as (xmin, xmax, ymin, ymax).
        By default, this is in Axes coordinates but may be
        changed using the *transform* kwarg.
    direction : float
        The direction of the gradient. This is a number in
        range 0 (=vertical) to 1 (=horizontal).
    cmap_range : float, float
        The fraction (cmin, cmax) of the colormap that should be
        used for the gradient, where the complete colormap is (0, 1).
    **kwargs
        Other parameters are passed on to `.Axes.imshow()`.
        In particular useful is *cmap*.
    """
    # Unit vector pointing in the gradient direction.
    phi = direction * np.pi / 2
    v = np.array([np.cos(phi), np.sin(phi)])
    # A 2x2 image whose corner values are the projections of the
    # unit square's corners onto that direction.
    X = np.array([[v @ [1, 0], v @ [1, 1]],
                  [v @ [0, 0], v @ [0, 1]]])
    # Rescale the corner values into the requested colormap range.
    a, b = cmap_range
    X = a + (b - a) / X.max() * X
    im = ax.imshow(X, extent=extent, interpolation='bicubic',
                   vmin=0, vmax=1, **kwargs)
    return im
xmin, xmax = xlim = 0, 10
ymin, ymax = ylim = 0, 1
fig, ax = plt.subplots()
ax.set(xlim=xlim, ylim=ylim, autoscale_on=False)
cdict1 = {
    'red': [
        (0.0, 0.0, 0.0),
        (0.8, 0.0, 0.0),
        (0.8, 1.0, 1.0),
        (1.0, 0.0, 0.0),
    ],
    'blue': [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    'green': [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
}
testcm1 = LinearSegmentedColormap('testcm1', cdict1)
# background image
gradient_image(ax, direction=0, extent=(0, 1, 0, 1), transform=ax.transAxes,
               cmap=testcm1, cmap_range=(0, 1))
ax.set_aspect('auto')
plt.show()
Upvotes: 0
Views: 118
Reputation: 339180
First, I'd say there are only three anchor points; in the cdict format a discontinuity is encoded within a single row as (x, value_below, value_above), so the color dictionary should rather look like
'red': [
    (0.0, 0.0, 0.0),
    (0.8, 0.0, 1.0),
    (1.0, 0.0, 0.0),
],
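Spelled out for all channels (a sketch; cdict_fixed and testcm_fixed are illustrative names, with the blue and green entries taken unchanged from your dict):
from matplotlib.colors import LinearSegmentedColormap

cdict_fixed = {
    'red':   [(0.0, 0.0, 0.0), (0.8, 0.0, 1.0), (1.0, 0.0, 0.0)],
    'blue':  [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    'green': [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
}
testcm_fixed = LinearSegmentedColormap('testcm_fixed', cdict_fixed)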
But that's not really the problem here.
The real problem is that your image consists of only the four corner points. Bicubic interpolation produces the gradient between them, but of course you do not have enough resolution to show the exact gradient you specified. In particular, 0.8 is not part of your image data.
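To see this concretely, here is the 2x2 array that gradient_image builds for direction=0 (reproducing just the corner computation from your code):
import numpy as np

phi = 0 * np.pi / 2                      # direction=0 -> vertical gradient
v = np.array([np.cos(phi), np.sin(phi)])
X = np.array([[v @ [1, 0], v @ [1, 1]],
              [v @ [0, 0], v @ [0, 1]]])
print(X)
# [[1. 1.]
#  [0. 0.]]  -> only the values 0 and 1 exist in the data; everything
#              in between, including 0.8, comes from the interpolation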
One would hence need to define an image that has enough resolution to show the discontinuity. (In the following I also made the colormap creation a bit more intuitive.)
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap

# Build the colormap from (position, color) stops; the repeated 0.8 stop
# creates the discontinuity.
colors = [(0, "black"), (0.8, "black"), (0.8, "red"), (1.0, "black")]
testcm2 = LinearSegmentedColormap.from_list('testcm2', colors)

fig, ax = plt.subplots()
# 301 rows give enough resolution for the jump at 0.8 to fall on real pixels.
X = np.repeat(np.linspace(0, 1, 301), 2).reshape(301, 2)
im = ax.imshow(X, extent=(0, 1, 0, 1), cmap=testcm2, vmin=0, vmax=1,
               interpolation="bicubic", origin="lower")
plt.show()
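For completeness, here is one way to fold this back into the setup from the question (a sketch for the vertical direction=0 case only; gradient_image_hires is my name for the variant):
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap

def gradient_image_hires(ax, extent, cmap_range=(0, 1), n=301, **kwargs):
    # Vertical-only variant of gradient_image: build an (n, 2) image so
    # that the colormap discontinuity falls on actual pixel rows.
    a, b = cmap_range
    X = np.repeat(np.linspace(a, b, n), 2).reshape(n, 2)
    return ax.imshow(X, extent=extent, origin='lower',
                     interpolation='bicubic', vmin=0, vmax=1, **kwargs)

colors = [(0, "black"), (0.8, "black"), (0.8, "red"), (1.0, "black")]
testcm2 = LinearSegmentedColormap.from_list('testcm2', colors)

fig, ax = plt.subplots()
ax.set(xlim=(0, 10), ylim=(0, 1), autoscale_on=False)
gradient_image_hires(ax, extent=(0, 1, 0, 1), transform=ax.transAxes,
                     cmap=testcm2)
ax.set_aspect('auto')
plt.show()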
Upvotes: 1