Reputation: 2349
So I have defined a function, and for some reason the terminal is returning the following error:
TypeError: only length-1 arrays can be converted to Python scalars
I'm not sure what exactly I have done wrong.
Here is my self-contained function with the corresponding plotting code:
import matplotlib
import math
import numpy
import matplotlib.pyplot as pyplot
import matplotlib.gridspec as gridspec
def rotation_curve(r):
    v_rotation = math.sqrt((r*(1.33*(10**32)))/(1+r)**2)
    return v_rotation
curve_range = numpy.linspace(0, 100, 10000)
fig = pyplot.figure(figsize=(16,6))
gridspec_layout = gridspec.GridSpec(1,1)
pyplot = fig.add_subplot(gridspec_layout[0])
pyplot.plot(curve_range, rotation_curve(curve_range))
matplotlib.pyplot.show()
Could anyone advise me where I have gone wrong?
Upvotes: 0
Views: 1476
Reputation: 18521
The problem is in the definition of rotation_curve(r). You input and manipulate a numpy array (curve_range), but you do so using the non-vectorized function math.sqrt:
v_rotation = math.sqrt((r*(1.33*(10**32)))/(1+r)**2)
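For context, here is a minimal reproduction of the error (assuming only that the argument is an array with more than one element): math.sqrt expects a single number, so it tries to convert the whole array to one Python scalar and fails.

import math
import numpy

r = numpy.linspace(0, 100, 10000)
try:
    math.sqrt(r)      # math.sqrt wants one number, not a 10000-element array
except TypeError as error:
    print(error)      # "only length-1 arrays can be converted to Python scalars"
                      # (newer numpy versions word this as "only size-1 arrays")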
Instead, use numpy.sqrt, which applies the square root element-wise across the array. The multiplication and exponentiation operators are already overloaded for numpy arrays, so those parts of the expression work fine.
def rotation_curve(r):
    v_rotation = numpy.sqrt((r*(1.33*(10**32)))/(1+r)**2)
    return v_rotation
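As a quick check (assuming the rest of your script is unchanged), the rewritten function now accepts the whole array in a single call, so your plotting code runs without the TypeError:

curve_range = numpy.linspace(0, 100, 10000)
velocities = rotation_curve(curve_range)   # returns a 10000-element array of velocities
print(velocities.shape)                    # (10000,)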
Upvotes: 1