archgoon

Reputation: 1608

Generic Mixture Models in pymc

I have a distribution with multiple humps. I would like to try fitting several different types of distributions to each hump: Gaussian, exponential, Weibull, etc. However, as it stands, it seems that I have to manually define a stochastic class for each combination. What I would like to do is something like

@stochastic(model_a, model_b, observed=True)
def mixture(model_a_parameters, model_b_parameters, p, value=observed_time_series):
    def logp(value, model_a_parameters, model_b_parameters, p):
        # mixture density is p * f_a(value) + (1 - p) * f_b(value)
        return np.log(p * np.exp(model_a.logp(value, *model_a_parameters))
                      + (1 - p) * np.exp(model_b.logp(value, *model_b_parameters)))
    def random(model_a_parameters, model_b_parameters, p):
        if np.random.random() < p:
            return model_a.random()
        return model_b.random()

Is delegation like this possible? Is there a standard way to do it? The main thing stopping something like the above is that I can't think of a way to group sets of variables together.

Upvotes: 0

Views: 175

Answers (1)

Chris Fonnesbeck

Reputation: 4203

You are on the right track. Your stochastic decorator can be simplified to:

@observed
def mixture(...):
    ...

Also, you only need to define random if you need to sample from the likelihood.

Another approach for modeling mixtures is to use a latent variable model, where individual observations have indicators corresponding to which distribution they are derived from. These indicators can be modeled with a Categorical distribution, for example. This can then have a Dirichlet prior, etc.
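The generative side of that latent-variable idea can be sketched in plain Python: each observation first draws an indicator for its component, then draws a value from that component. This is a minimal sketch using only the standard library; the two components (a Gaussian and an exponential) and all parameter values are placeholder assumptions, not something from the original post:

```python
import random

def sample_mixture(n, p=0.3, mu_a=0.0, sigma_a=1.0, lam_b=1.0):
    """Draw n values from a two-component mixture.

    With probability p an observation comes from a Gaussian
    component, otherwise from an exponential component. The
    returned indicators record which component produced each value.
    """
    values, indicators = [], []
    for _ in range(n):
        z = 1 if random.random() < p else 0  # latent component indicator
        if z == 1:
            values.append(random.gauss(mu_a, sigma_a))
        else:
            values.append(random.expovariate(lam_b))
        indicators.append(z)
    return values, indicators

values, indicators = sample_mixture(1000, p=0.3)
```

In a model like the one above, the `indicators` would be unobserved Categorical variables to be inferred along with the component parameters, and `p` (the vector of mixture weights in the general K-component case) would get the Dirichlet prior.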

Upvotes: 1
