Jautis

Reputation: 469

Don't save p-value/coefficient from glmer if there was a warning

I'm running a for loop through a series of analyses. For some of these tests, glmer returns a warning message (copied below). I would like not to save the p-values or parameter estimates for those sites. I imagine this would be an if-statement, with pseudocode resembling "if no warning, then write beta/p-value to tests", but I can't figure out how to implement it. Any suggestions would be appreciated!

Warning message:
In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model failed to converge with max|grad| = 0.00778865 (tol = 0.001, component 1)

library(lme4)

for (i in 1:10000) {
  test <- glmer(cbind(mcounts[i,], counts[i,] - mcounts[i,]) ~ pred + (1 | indiv),
                family = binomial)
  tests[i, 2] <- summary(test)$coefficients[2, 4]  ## p-value
  tests[i, 1] <- summary(test)$coefficients[2, 1]  ## estimate
}

Upvotes: 3

Views: 257

Answers (1)

Ben Bolker

Reputation: 226027

For what it's worth, these are warnings, not errors (old-school R people tend to be fussy about this distinction: I edited your question accordingly; you can roll back the changes if you like). You can follow @RichardTelford's advice (in the comment above), but you get greater precision if, rather than upgrading all warnings to errors, you upgrade only the gradient-convergence warnings (you also have the option of adjusting the tolerance for this check).
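
Concretely, a minimal sketch of that blanket approach (assuming the suggestion was to promote every warning, not just the convergence check, to an error with options(warn = 2), which try() can then catch):

library(lme4)

options(warn = 2)  ## promote ALL warnings to errors
for (i in 1:10000) {
    test <- try(glmer(cbind(mcounts[i,], counts[i,] - mcounts[i,]) ~
                          pred + (1 | indiv), family = binomial))
    if (inherits(test, "try-error")) next  ## any warning or error: skip this fit
    tests[i, 2] <- summary(test)$coefficients[2, 4]
    tests[i, 1] <- summary(test)$coefficients[2, 1]
}
options(warn = 0)  ## restore the default warning behaviour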

I modified your code slightly for readability and robustness (e.g. refer to the components of the coefficient table you want by name rather than position).

gcontrol <- glmerControl(check.conv.grad = .makeCC("error", tol = 2e-3))
for (i in 1:10000) {
    ## the control object promotes gradient-convergence warnings to errors,
    ## which try() can then catch
    test <- try(glmer(cbind(mcounts[i,], counts[i,] - mcounts[i,]) ~
                          pred + (1 | indiv),
                      family = binomial,
                      control = gcontrol))
    if (inherits(test, "try-error")) next  ## if error, skip to the next iteration
    cc <- coef(summary(test))
    tests[i, ] <- cc["pred", c("Estimate", "Pr(>|z|)")]
}

Make sure to fill the tests structure with NA values before you start: then you can easily see which fits failed, and strip the results down to only the successful ones (e.g. with na.omit(as.data.frame(tests))).
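
For example (a hypothetical setup, assuming 10000 rows and the two result columns used in your loop):

tests <- matrix(NA_real_, nrow = 10000, ncol = 2,
                dimnames = list(NULL, c("Estimate", "Pr(>|z|)")))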

It might be worth reading ?convergence: I would probably set the tolerance for bad convergence to be somewhat higher (e.g. 0.02).
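
With the control object above, that just means relaxing the tolerance, e.g.:

gcontrol <- glmerControl(check.conv.grad = .makeCC("error", tol = 2e-2))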

Upvotes: 3
