Reputation: 1011
I am trying to learn latent variables from observations using pymc. A simplified version of my problem is the following:
I want to learn two hidden parameters $\lambda_0, \lambda_1$,
and there are two distributions $X_0, X_1$ that use these parameters, respectively:
$X_0 \sim Expon(\lambda_0)$, $X_1 \sim Expon(\lambda_1)$.
I don't have direct observations of the $X_i$. Instead, I observe linear combinations of draws from them: $x_0^{(0)} + x_1^{(0)} + x_1^{(1)} = 6$ and $x_0^{(1)} = 2$.
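Here $x_i^{(j)}$ denotes the $j$-th draw from $X_i$, so the first observation is one draw from $X_0$ plus both draws from $X_1$, and the second is the other draw from $X_0$; these correspond to the variables x00, x10, x11 and x01 in the code below.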
My initial approach was the following, but I don't think it's working:
import pymc

# Uniform priors on the two rate parameters
lambda0 = pymc.Uniform('lambda0', 0, 10)
lambda1 = pymc.Uniform('lambda1', 0, 10)

# Two latent draws from each exponential distribution
x00 = pymc.Exponential('x00', lambda0)
x01 = pymc.Exponential('x01', lambda0)
x10 = pymc.Exponential('x10', lambda1)
x11 = pymc.Exponential('x11', lambda1)

# Observed linear combinations, modeled with a Normal likelihood
z = pymc.Normal('z', mu=[x00+x10+x11, x01], tau=1.0, value=[6, 2], observed=True)

model = pymc.Model([lambda0, lambda1, x00, x01, x10, x11, z])
mcmc = pymc.MCMC(model)
mcmc.sample(10000)
Could you help me with this toy example?
Upvotes: 1
Views: 497
Reputation: 4203
You should create a deterministic from your exponential variables before using them as arguments. Try this:
mu = [x00+x10+x11, x01]
z = pymc.Normal('z', mu=mu, tau=1.0, value=[6, 2], observed=True)
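If you want the deterministic node to be explicit rather than created implicitly by the arithmetic on the stochastics, a minimal sketch (assuming the classic PyMC2 API used in the question, and reusing the x00, x01, x10, x11 variables defined there) would be:

# Explicit deterministic combining the latent draws
@pymc.deterministic
def mu(x00=x00, x01=x01, x10=x10, x11=x11):
    # Evaluates to the two observed linear combinations
    return [x00 + x10 + x11, x01]

z = pymc.Normal('z', mu=mu, tau=1.0, value=[6, 2], observed=True)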
Also, you don't need to instantiate both a Model and an MCMC object. Just the latter:
mcmc = pymc.MCMC([lambda0, lambda1, mu, z])
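For completeness, a self-contained sketch of the whole model (assuming the classic PyMC2 API from the question; the burn-in and iteration counts are arbitrary choices) might look like this:

import pymc

# Priors on the two rates
lambda0 = pymc.Uniform('lambda0', 0, 10)
lambda1 = pymc.Uniform('lambda1', 0, 10)

# Latent exponential draws
x00 = pymc.Exponential('x00', lambda0)
x01 = pymc.Exponential('x01', lambda0)
x10 = pymc.Exponential('x10', lambda1)
x11 = pymc.Exponential('x11', lambda1)

# Deterministic combinations that the data constrain
mu = [x00 + x10 + x11, x01]

# Observed sums, treated as noisy measurements of the combinations
z = pymc.Normal('z', mu=mu, tau=1.0, value=[6, 2], observed=True)

# Build the sampler directly from the nodes and draw posterior samples
mcmc = pymc.MCMC([lambda0, lambda1, x00, x01, x10, x11, mu, z])
mcmc.sample(10000, burn=2000)

# Posterior samples for the rate parameters
lambda0_trace = mcmc.trace('lambda0')[:]
lambda1_trace = mcmc.trace('lambda1')[:]

Here tau is the precision of the Normal likelihood, so larger values treat the observed sums as tighter constraints.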
Upvotes: 0