Mara

Reputation: 31

How to constrain the coefficients of a Poisson regression to be positive in R?

Assume the following situation:
I have count data for a variable Y, which I assume to be Poisson distributed. I also have data for a variable X over the same time period; each observation of X represents a certain event. I assume that the values of Y come from two different impacts, so I split each observation Y_i into two Poisson-distributed components Y_i1 and Y_i2, but I only have observations of the totals Y_i. I also assume that the events (represented by X) have a long-term effect on Y_i1, and I have estimators of the parameters lambda_i2.

So my regression formula is
fml = Y_i ~ b_1*X_i + ... + b_n*X_(i-n+1) + offset(lambda_i2) - 1,   with n >= 24.
That means the last 24 (or more, because of the long-term effect) values of X influence the value of Y_i1 in an additive way, and there is no intercept (b_0 = 0).

I made a matrix m whose rows represent Y_i, all of its 24 (or more) regressors for each observation of Y_i and the corresponding estimator of lambda_i2.

Now I used glm(fml, family=poisson(link="identity"), data=m) and tried it for different values of n (= 24, 36, 48, ...).

Every time, some of the coefficients came out negative, which makes no sense in the interpretation. (The events represented by X can only have a positive effect, or no effect at all, on the value of Y.)

This leads to my question:

How can I use the constraint b_i >=0 in my model?

In my previous research I found the function glmc(), but I'm not sure how to include my constraint here.

As an alternative I also thought of analyzing this model in a Bayesian way, but so far I haven't found a Bayesian version of glm() for the Poisson distribution that lets me specify the prior for the b_i myself. (Then I could encode the positivity in the prior.)
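
To make that idea concrete, here is a rough sketch of what I have in mind, reduced to a maximum-a-posteriori estimate computed with optim() on made-up toy data (Xtoy, ytoy and the half-normal prior scale of 10 are just placeholders, not my real data):

set.seed(1)
Xtoy=matrix(runif(200), 100, 2)    # toy regressors, all non-negative
ytoy=rpois(100, Xtoy %*% c(2, 5))  # toy counts with known positive coefficients
# negative log-posterior: Poisson likelihood plus a half-normal prior on each b_i
neg_log_post=function(b){-sum(dpois(ytoy, Xtoy %*% b, log=TRUE)) - sum(dnorm(b, 0, 10, log=TRUE))}
# positivity comes from the prior's support, enforced here via the L-BFGS-B bounds
optim(c(1, 1), neg_log_post, method="L-BFGS-B", lower=c(0, 0))$par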

Do you have any ideas?

This is an extract of my data and my code:

y=c(279,623, 1025, 1701, 1862, 2544, 2308, 2231, 2234, 2550, 2698, 2805, 3510, 3032, 2746, 2074, 1062,  513,  226,  116,   87,   79,  116, 335,  594, 1081, 1425, 1775, 2056, 2387, 2337, 2354, 2665, 2406, 2433, 2550, 2820, 3655, 4566, 2330, 1267,  531,  280,  148,   92,   89, 141,  458,  852, 1214)
X=c(0, 0, 0,  0,  0,  0, 0, 0,  0, 0,  0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.88349136, 0.54951680, 0.13306912, 0.15321180, 0.00000000, 1.42569128, 0.55808054, 0.65486418, 0.27530564, 0.24813572, 0, 0, 2.09889028, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1.18947898, 0.17347032, 0.94538886, 0.03334654, 0.05593732, 0.00000000, 0.99772264, 0.11121918, 0, 1.41673120, 0.27375384, 0, 0, 0, 0, 5.67487576, 0, 0, 0, 0, 0, 0, 0, 0, 1.55642510, 0.98419866, 0.50992652)
lambda=c(253.5,  562.5, 1053.0, 1645.0, 2064.5, 2215.0, 2503.0, 2443.0, 2514.5, 2701.0, 2972.5, 3035.5, 3422.5, 3295.0, 2882.5, 2094.0, 1211.0,  579.5,  265.5,  155.0,  112.5,   82.5,  117.5,  306.0,  627.0, 1021.0, 1463.5, 1722.5, 2017.5, 2146.5, 2209.0, 2231.5, 2265.0, 2320.0, 2442.0, 2507.0, 2957.0, 3674.0, 3345.5, 2285.0, 1265.0,  555.5,  252.0,  145.5,   86.5,   90.5,  148.0,  362.0, 738.0, 1137.5)

# builds the matrix of lagged regressors: column j holds x lagged by j-1 steps,
# giving one row for each of the length(x)-n+1 usable observations
regressors=function(n,x){
  m=length(x)-n+1;
  r=matrix(0,m,n);
  for (i in 0:(n-1)){ r[,(i+1)]=x[(n-i):(length(x)-i)]}
  return(r);
}
r=regressors(24,X);
reg=cbind(y,data.frame(r),lambda);
fml=as.formula(paste("y~", paste(colnames(reg)[2:25], collapse = "+"), "+offset(lambda)-1"));
g=glm(fml, poisson(link="identity"), data=reg); # this leads to negative coefficients
obj=function(b){-sum(dpois(y, r%*%b, log=TRUE))}
st=coef(lm(fml, data=reg));
opt=optim(st, obj); # this is where the error occurs

regressors() is the function I wrote to compute the regressors (it returns a matrix r with n columns and 50 rows, where row i contains the regressors of y_i).

Upvotes: 3

Views: 1718

Answers (1)

G. Grothendieck

Reputation: 269346

Try first principles.

# generate random input data
set.seed(123)
n <- 100
x <- 1:n
X <- cbind(1, x)
b <- c(0.1, 3)
y <- rpois(n, X %*% b)

# log likelihood objective function
obj <- function(b) -sum(dpois(y, X %*% b, log = TRUE))

# as a check try with no constraints - these two should give the same coefs
glm(y ~ X + 0, family = poisson("identity"))
st <- coef(lm(y ~ X + 0)); optim(st, obj)

# now add lower bounds ensuring starting value is in feasible region
optim(pmax(st, 1), obj, lower = c(0, 0), method = "L-BFGS-B")
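
For example, to put the two solutions side by side (a quick check using only the objects defined above; the boundary behaviour discussed in Note 1 below shows up in the constrained row):

# quick check: unconstrained vs constrained estimates from the fits above
unc <- optim(st, obj)$par
con <- optim(pmax(st, 1), obj, lower = c(0, 0), method = "L-BFGS-B")$par
rbind(unconstrained = unc, constrained = con)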

Note 1: Be careful if your parameter estimates are on the boundary of the feasible region as occurs in the constrained example above.

Note 2: Here is a reworking of the code later added to the question. y, X, lambda and regressors are as in the question. Note that adding the constraint forces many of the coefficients to zero.

r <- regressors(24,X)
reg <- cbind(y,data.frame(r),lambda)

fml <- y ~ . - lambda + offset(lambda) - 1   # no intercept, matching the model in the question

# check that g and opt give the same coefs

g <- glm(fml, poisson(link = "identity"), data = reg)

obj <- function(b)-sum(dpois(y, r%*%b + lambda, log = TRUE))
st <- coef(lm(fml, data=reg))
opt <- optim(st, obj, method = "BFGS", control = list(maxit = 500))

optim(pmax(coef(g), 1), obj, method = "L-BFGS-B", lower = 0 * st)
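
For instance, a quick way to see how many coefficients the constraint pushes onto the zero boundary (using only the objects created above):

fit <- optim(pmax(coef(g), 1), obj, method = "L-BFGS-B", lower = 0 * st)
round(fit$par, 4)     # many coefficients sit exactly at the lower bound of 0
sum(fit$par < 1e-6)   # count of coefficients forced to zero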

Upvotes: 5
