Reputation: 350
Can someone explain why I get this strange output when running this code:
import matplotlib.pyplot as plt
import numpy as np

def x_y():
    return np.random.randint(9999, size=1000), np.random.randint(9999, size=1000)

plt.plot(x_y())
plt.show()
The output:
[plot image: 1000 separate line segments drawn between x=0 and x=1]
Upvotes: 0
Views: 207
Reputation: 2167
Your data is a tuple of two arrays of length 1000.
import numpy as np

def x_y():
    return np.random.randint(9999, size=1000), np.random.randint(9999, size=1000)

xy = x_y()
print(len(xy))
# > 2
print(xy[0].shape)
# > (1000,)
Let's read pyplot's documentation:
plot(y) # plot y using x as index array 0..N-1
Thus pyplot will plot a line between (0, xy[0][i]) and (1, xy[1][i]), for each i in range(1000).
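To make this concrete, here is a minimal sketch (my illustration, not part of the original answer) using arrays of length 3 instead of 1000: matplotlib converts the tuple into a 2xN array and draws one line per column, using the row index as the x value.

import numpy as np
import matplotlib.pyplot as plt

# A tuple of two length-3 arrays becomes a (2, 3) array:
xy = (np.array([10, 20, 30]), np.array([40, 50, 60]))
print(np.asarray(xy).shape)
# > (2, 3)

# plot() draws one line per column, with x taken from the row index:
# (0, 10)-(1, 40), (0, 20)-(1, 50), (0, 30)-(1, 60)
plt.plot(xy)
plt.show()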
You probably meant to do this:
plt.plot(*x_y())
This time, it will plot 1000 points joined by lines: (xy[0][i], xy[1][i]) for i in range(1000).
However, the connecting lines don't represent anything for random data, so you probably want to see individual points instead:
plt.scatter(*x_y())
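As a side note (my addition, not in the original answer), plot with a marker-only format string gives a similar result, since 'o' draws markers without a connecting line:

# Alternative sketch: marker-only plot(), no connecting lines
plt.plot(*x_y(), 'o')
plt.show()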
Upvotes: 1
Reputation: 361
Your function x_y returns a tuple; unpacking it into two variables gives the correct output.
import matplotlib.pyplot as plt
import numpy as np

def x_y():
    return np.random.randint(9999, size=1000), np.random.randint(9999, size=1000)

x, y = x_y()
plt.plot(x, y)
plt.show()
Upvotes: 0