Reputation: 95
I am using scipy's interval function for a normal random variable to calculate a confidence interval. However, there seems to be some confusion about the significance level.
From scipy.stats.norm docs:
Signature: stats.norm.interval(alpha, *args, **kwds)
Docstring:
Confidence interval with equal areas around the median.

Parameters:
alpha : array_like of float
    Probability that an rv will be drawn from the returned range.
    Each value should be in the range [0, 1].
It seems that they are treating the alpha parameter as the confidence level rather than the significance level. In statistics, for example, an alpha of 0.05 means a 5% significance level and a 95% confidence level. However, scipy expects 0.95 to be passed as the value of alpha. This is confusing, since going by statistics terminology it should be 0.05. Am I missing something here?
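For instance, passing 0.95 (and not 0.05) gives the familiar 95% interval for a standard normal variable (a quick check; printed values are approximate):

```python
from scipy.stats import norm

# Passing 0.95 -- the confidence level -- yields the usual 95% interval
# for a standard normal variable.
print(norm.interval(0.95))  # approximately (-1.96, 1.96)

# Passing 0.05, as statistics terminology would suggest, gives a very
# narrow interval around the median instead.
print(norm.interval(0.05))  # approximately (-0.0627, 0.0627)
```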
Upvotes: 0
Views: 732
Yes, the quantity called `alpha` in `scipy.stats.rv_continuous.interval` would be called `1 - alpha` in statistics textbooks. You are not missing anything; it's just a suboptimal name choice.
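In other words, if you have a textbook significance level alpha, pass `1 - alpha`; a minimal sketch of the equivalence:

```python
from scipy.stats import norm

alpha = 0.05                       # significance level in the textbook sense
lo, hi = norm.interval(1 - alpha)  # scipy's "alpha" is the confidence level

# The same endpoints, built directly from the quantile function:
lo2, hi2 = norm.ppf([alpha / 2, 1 - alpha / 2])

print((lo, hi))    # approximately (-1.96, 1.96)
print((lo2, hi2))  # same endpoints
```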
The only discussion of the name of that parameter that I found concerned a different name collision:
*Sigh. Given that both `interval` and `levy_stable` are quite esoteric, [...]*
Upvotes: 1