Peter Dutton

Reputation: 307

JAGS logistic regression models with non-integer weights

For the logistic regression model below I want to be able to sample from the posterior using non-integer values for n (and y). This can occur in this sort of model when only partial data are available or when it is desirable to down-weight some observations.

model <- function() {
    ## Specify likelihood
    for (i in 1:N1) {
        y[i] ~ dbin(p[i], n[i])
        logit(p[i]) <- log.alpha[1] + alpha[2] * d[i]
    }
    ## Specify priors
    alpha[1] <- exp(log.alpha[1])
    alpha[2] <- exp(log.alpha[2])
    Omega[1:2, 1:2] <- inverse(p2[, ])
    log.alpha[1:2] ~ dmnorm(p1[], Omega[, ])
}

dbin requires integer values for n and so returns an error in the case of non-integer n.
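For example (illustrative numbers only, not my real data), down-weighting integer counts by a factor of 0.5 immediately produces fractional n and y:

n_raw <- c(11, 8, 19)   # observed numbers of trials
y_raw <- c(5, 3, 12)    # observed numbers of successes
w     <- 0.5            # down-weighting factor
n     <- w * n_raw      # 5.5, 4.0, 9.5  -- no longer valid for dbin
y     <- w * y_raw      # 2.5, 1.5, 6.0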

I have read that it should be possible to do this with the ones trick but have failed to get it to work correctly. Help appreciated.

Upvotes: 0

Views: 771

Answers (1)

mfidino

Reputation: 3055

As you said, you should be able to do this with the ones trick. The difficult part is correctly coding up the binomial likelihood, because JAGS does not have a binomial coefficient function. However, there are ways around this. The model below should do what you would like. For a more specific explanation of the ones trick, see my answer here.
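To spell out what the code below is doing (restating the ones trick, so the non-integer case is clear): the binomial coefficient is built on the log scale from log-factorials,

\[
\log\binom{n_i}{y_i} = \operatorname{logfact}(n_i) - \bigl(\operatorname{logfact}(y_i) + \operatorname{logfact}(n_i - y_i)\bigr),
\]

which, as I understand it, JAGS evaluates via the log-gamma function and so remains defined for non-integer n_i and y_i. The likelihood contribution for observation i is then

\[
L_i = \binom{n_i}{y_i}\, p_i^{\,y_i}\,(1 - p_i)^{\,n_i - y_i},
\]

and modelling ones[i] ~ dbern(L_i / C), with C a constant large enough that L_i / C <= 1 for every i, contributes the same term to the joint density up to the constant 1/C, so the posterior is unchanged.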

data {
  C <- 10000
  for (i in 1:N1) {
    ones[i] <- 1
  }
}
model {
  for (i in 1:N1) {
    # calculate a binomial coefficient
    bin_co[i] <- exp(logfact(n[i]) - (logfact(y[i]) + logfact(n[i] - y[i])))
    # logit p
    logit(p[i]) <- log.alpha[1] + alpha[2] * d[i]
    # calculate a binomial likelihood using the ones trick
    prob[i] <- (bin_co[i] * (p[i]^y[i])) * ((1 - p[i])^(n[i] - y[i]))
    # put prob in a Bernoulli trial, divided by the large constant
    ones[i] ~ dbern(prob[i] / C)
  }
  ## Specify priors
  alpha[1] <- exp(log.alpha[1])
  alpha[2] <- exp(log.alpha[2])
  Omega[1:2, 1:2] <- inverse(p2[, ])
  log.alpha[1:2] ~ dmnorm(p1[], Omega[, ])
}
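A minimal sketch of how you might run this from R with the rjags package, assuming the data and model blocks above are saved in "ones_trick.txt" (that file name, the data values, and the prior settings below are placeholders, not from your question):

library(rjags)

## illustrative data with non-integer trial counts and successes
N1 <- 5
d  <- c(0.1, 0.5, 1.0, 1.5, 2.0)
n  <- c(9.5, 10.0, 7.25, 12.0, 8.5)
y  <- c(2.0, 3.5, 4.0, 6.0, 5.5)
p1 <- c(0, 0)          # prior mean for log.alpha
p2 <- diag(2) * 10     # prior covariance for log.alpha (Omega is its inverse)

jm <- jags.model(
  "ones_trick.txt",
  data = list(N1 = N1, d = d, n = n, y = y, p1 = p1, p2 = p2),
  n.chains = 3
)
update(jm, 1000)  # burn-in
post <- coda.samples(jm, variable.names = c("alpha"), n.iter = 5000)
summary(post)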

Upvotes: 1
