user9280506


Create a pie chart with matplotlib

Hello, I have this Python code:

import matplotlib.pyplot as plt

total_a = 0.004095232
total_b = 0.05075945
total_c = 0.005425
total_d = 0.022948572
total_e = 0.015012

slices = [total_a,total_b,total_c,total_d,total_e]
activities = ['a', 'b', 'c', 'd','e']
cols = ['gold', 'yellowgreen', 'lightcoral', 'lightskyblue', 'orangered']
plt.pie(slices,labels=activities,autopct='%1.1f%%',colors=cols,startangle=140,shadow=True)
plt.show()

But this is what I get when I run the code:

[Image: the resulting pie chart, which is only partially filled]

I don't understand why I don't get a full pie. Thank you for your help!

Upvotes: 2

Views: 1958

Answers (2)

Jack Schofield

Reputation: 302

As per ImportanceOfBeingErnest's answer below, which quotes the documentation, plt.pie takes an array input and will normalise the values for you as long as the sum of the elements is at least 1.

Since your values sum to less than 1, the array is not normalised. To normalise it yourself, you can divide each element by the largest value in the list; this scales the largest element to 1, so the sum exceeds 1 and pie applies its own normalisation. The following line does the division:

slices = [aSlice/max(slices) for aSlice in slices]

I would put it in your program like so:

import matplotlib.pyplot as plt

total_a = 0.004095232
total_b = 0.05075945
total_c = 0.005425
total_d = 0.022948572
total_e = 0.015012


slices = [total_a,total_b,total_c,total_d,total_e]
slices = [aSlice/max(slices) for aSlice in slices]

activities = ['a', 'b', 'c', 'd','e']
cols = ['gold', 'yellowgreen', 'lightcoral', 'lightskyblue', 'orangered']
plt.pie(slices,labels=activities,autopct='%1.1f%%',colors=cols,startangle=140,shadow=True)
plt.show()

For me this then produces a graph like this:

[Image: the properly rendered full pie chart]
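As a quick sanity check, here is a short sketch (assuming the slices list defined above, run right after the first slices assignment and before the rescaling line) that prints the sums before and after the division, showing why the rescaled pie comes out full:

print(sum(slices))    # ≈ 0.098, below 1, so pie would leave an empty wedge

rescaled = [aSlice/max(slices) for aSlice in slices]
print(sum(rescaled))  # ≈ 1.94, at least 1, so pie normalises by the sum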

Upvotes: 4

ImportanceOfBeingErnest

Reputation: 339705

The x argument to pie(x, ...) distinguishes two cases. As the documentation states:

Make a pie chart of array x. The fractional area of each wedge is given by x/sum(x). If sum(x) < 1, then the values of x give the fractional area directly and the array will not be normalized. The resulting pie will have an empty wedge of size 1 - sum(x).

In the case from the question the sum is indeed smaller than one, sum(x) ≈ 0.098, so the wedges fill only about 10% of the circle and the rest stays empty. An easy workaround is to multiply the input array by some large number, or to divide it by its sum.

import numpy as np

slices = np.array([total_a, total_b, total_c, total_d, total_e]) * 100

or

slices = np.array([total_a, total_b, total_c, total_d, total_e])
slices /= slices.sum()
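To see both cases of the documented behavior side by side, here is a minimal self-contained sketch (the numbers are made up for illustration, they are not the question's data): the left pie's values sum to 0.5 and leave an empty half wedge, while the right pie's values sum to 5 and get normalized to a full circle.

import matplotlib.pyplot as plt

fig, (ax1, ax2) = plt.subplots(1, 2)

# sum(x) = 0.5 < 1: the values are taken as fractions of the circle directly,
# leaving an empty wedge of size 1 - sum(x) = 0.5
ax1.pie([0.2, 0.3], labels=['a', 'b'])
ax1.set_title('sum(x) < 1')

# sum(x) = 5 >= 1: the values are normalized by their sum, so the pie is full
ax2.pie([2, 3], labels=['a', 'b'])
ax2.set_title('sum(x) >= 1')

plt.show()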

Upvotes: 2
