Reputation: 373
I am trying to fit a graded response model with the R package ltm. The issue is that the Hessian matrix at convergence contains infinite or missing values, and I don't understand why. Here is the code I use:
dset %>%
  select(Apathy5, Apathy6, Apathy7, Apathy8) %>%
  grm(IRT.param = TRUE, Hessian = TRUE, start.val = "random") %>%
  summary()
This leads to the error message: "Hessian matrix at convergence contains infinite or missing values; unstable solution."
I have added start.val = "random" as suggested on the help page of grm, and I tried scaling the variables as suggested here using standardize(), but without success.
The culprit is the variable Apathy8, as the code works fine without that variable. The printed Hessian matrix also shows that the last five rows and columns break down into NaN values.
[,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12] [,13]
[1,] 112.885423 168.57318 15.121739 29.733845 -43.944168 -30.110538 -48.87855 -9.490120 -5.469535 8.527498 -34.110730 -50.38311 -15.992948
[2,] 168.573183 668.52572 47.524373 93.447066 -187.886153 -45.249649 -112.95289 -28.018970 -16.976693 40.735345 -55.110927 -122.69238 -48.527770
[3,] 15.121739 47.52437 33.518296 15.442993 -22.313874 -3.987572 -11.06158 -3.479726 -2.526640 4.539699 -4.739956 -12.18191 -6.179648
[4,] 29.733845 93.44707 15.442993 145.616224 -53.533768 -7.432701 -20.96739 -7.859794 -7.633892 10.381387 -8.930554 -23.43291 -14.142549
[5,] -43.944168 -187.88615 -22.313874 -53.533768 89.867650 11.921767 38.91777 11.170036 8.049626 -16.939146 15.689355 43.61965 19.947149
[6,] -30.110538 -45.24965 -3.987572 -7.432701 11.921767 141.122622 214.78691 42.922011 24.824655 -43.143223 -32.839612 -45.16608 -14.239393
[7,] -48.878554 -112.95289 -11.061583 -20.967386 38.917771 214.786905 789.77786 122.001160 70.561387 -203.506690 -53.744979 -107.83995 -39.538108
[8,] -9.490120 -28.01897 -3.479726 -7.859794 11.170036 42.922011 122.00116 117.428337 28.904687 -60.202784 -9.767986 -24.52154 -12.093539
[9,] -5.469535 -16.97669 -2.526640 -7.633892 8.049626 24.824655 70.56139 28.904687 96.874701 -43.127315 -5.590817 -14.56299 -8.576025
[10,] 8.527498 40.73535 4.539699 10.381387 -16.939146 -43.143223 -203.50669 -60.202784 -43.127315 110.411328 10.244042 38.94826 17.207054
[11,] -34.110730 -55.11093 -4.739956 -8.930554 15.689355 -32.839612 -53.74498 -9.767986 -5.590817 10.244042 106.281603 145.39262 48.717527
[12,] -50.383106 -122.69238 -12.181912 -23.432914 43.619649 -45.166085 -107.83995 -24.521540 -14.562993 38.948261 145.392624 500.58555 129.222316
[13,] -15.992948 -48.52777 -6.179648 -14.142549 19.947149 -14.239393 -39.53811 -12.093539 -8.576025 17.207054 48.717527 129.22232 157.761913
[14,] -7.434605 -23.13389 -3.528695 -10.950332 11.651912 -6.539343 -18.44108 -6.827938 -6.500651 9.723388 22.443312 59.53046 38.698537
[15,] 16.226144 51.88630 6.025721 13.961658 -19.822990 13.267376 42.38738 12.094652 8.965027 -17.567169 -43.954635 -153.44485 -72.100830
[16,] NaN NaN NaN NaN 2077.264319 NaN NaN NaN NaN 4384.654980 NaN NaN NaN
[17,] NaN NaN NaN NaN 2113.689915 NaN NaN NaN NaN 4419.579473 NaN NaN NaN
[18,] NaN NaN NaN NaN 2070.957253 NaN NaN NaN NaN 4381.196510 NaN NaN NaN
[19,] NaN NaN NaN NaN 2077.112082 NaN NaN NaN NaN 4385.610768 NaN NaN NaN
[20,] NaN NaN NaN NaN 2042.495092 NaN NaN NaN NaN 4355.470331 NaN NaN NaN
[,14] [,15] [,16] [,17] [,18] [,19] [,20]
[1,] -7.434605 16.226144 NaN NaN NaN NaN NaN
[2,] -23.133895 51.886301 NaN NaN NaN NaN NaN
[3,] -3.528695 6.025721 NaN NaN NaN NaN NaN
[4,] -10.950332 13.961658 NaN NaN NaN NaN NaN
[5,] 11.651912 -19.822990 2077.264 2113.690 2070.957 2077.112 2042.4951
[6,] -6.539343 13.267376 NaN NaN NaN NaN NaN
[7,] -18.441078 42.387378 NaN NaN NaN NaN NaN
[8,] -6.827938 12.094652 NaN NaN NaN NaN NaN
[9,] -6.500651 8.965027 NaN NaN NaN NaN NaN
[10,] 9.723388 -17.567169 4384.655 4419.579 4381.197 4385.611 4355.4703
[11,] 22.443312 -43.954635 NaN NaN NaN NaN NaN
[12,] 59.530460 -153.444848 NaN NaN NaN NaN NaN
[13,] 38.698537 -72.100830 NaN NaN NaN NaN NaN
[14,] 100.203004 -41.326820 NaN NaN NaN NaN NaN
[15,] -41.326820 80.464068 1274.630 1312.653 1266.878 1274.224 1235.2047
[16,] NaN 1274.630183 NaN NaN NaN NaN NaN
[17,] NaN 1312.652564 NaN NaN NaN NaN NaN
[18,] NaN 1266.878092 NaN NaN NaN NaN NaN
[19,] NaN 1274.223660 NaN NaN NaN NaN NaN
[20,] NaN 1235.204722 NaN NaN NaN NaN -975.8656
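For reference, this is roughly how I isolated Apathy8 (a sketch against my own column names; the per-item category counts are only there to see whether Apathy8 behaves differently from the other items):

library(ltm)
library(dplyr)

# Response category counts per item (just a descriptive check)
dset %>%
  select(Apathy5, Apathy6, Apathy7, Apathy8) %>%
  sapply(table)

# Refitting without Apathy8 runs without the Hessian problem
dset %>%
  select(Apathy5, Apathy6, Apathy7) %>%
  grm(IRT.param = TRUE, Hessian = TRUE, start.val = "random") %>%
  summary()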
Interestingly, the problem occurs for many subsamples of the data that I have tested. I have 1000 observations, but the problem persists even with 60 random observations, for instance the following (Link to csv):
N Apathy5 Apathy6 Apathy7 Apathy8
1 1 1 2 1
2 1 1 1 1
3 4 1 1 2
4 3 5 4 5
5 1 2 2 1
6 4 5 4 1
7 4 5 4 5
8 4 3 3 4
9 1 2 2 1
10 2 4 3 3
11 5 5 5 5
12 1 1 1 1
13 3 3 3 2
14 2 2 2 2
15 1 1 1 1
16 2 2 2 3
17 1 1 1 2
18 2 2 2 2
19 2 1 3 2
20 1 1 1 1
21 1 2 2 1
22 2 3 4 2
23 1 1 1 1
24 1 1 1 1
25 1 2 1 1
26 4 3 4 1
27 3 3 3 3
28 3 3 3 2
29 2 2 2 2
30 2 2 2 1
31 2 2 2 2
32 1 1 1 1
33 1 1 1 2
34 2 2 2 2
35 2 2 3 2
36 1 1 1 1
37 1 2 1 1
38 1 1 1 1
39 1 1 1 1
40 1 1 1 1
41 2 2 1 1
42 1 1 1 1
43 2 2 3 4
44 1 1 1 1
45 1 1 2 2
46 1 4 1 1
47 2 2 1 1
48 1 1 1 1
49 2 3 3 2
50 1 1 1 1
51 1 2 1 1
52 1 2 1 1
53 1 1 1 1
54 3 4 5 3
55 5 5 4 5
56 1 2 2 2
57 1 1 1 1
58 2 2 2 1
59 1 1 1 1
60 1 2 1 1
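A sketch of how I reproduce the error with this sample (the file name apathy_sample.csv is just a placeholder for the linked csv):

library(ltm)
library(dplyr)

# "apathy_sample.csv" is a placeholder name for the 60-row sample linked above
sub <- read.csv("apathy_sample.csv")

sub %>%
  select(Apathy5, Apathy6, Apathy7, Apathy8) %>%
  grm(IRT.param = TRUE, Hessian = TRUE, start.val = "random") %>%
  summary()
# Hessian matrix at convergence contains infinite or missing values; unstable solution.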
Does anyone know how to solve this?
Upvotes: 1
Views: 592
Reputation: 862
You could try another optimization algorithm via the control parameter of the grm function, for example:
dset %>%
  select(Apathy5, Apathy6, Apathy7, Apathy8) %>%
  grm(IRT.param = TRUE, Hessian = TRUE, start.val = "random",
      control = list(verbose = TRUE, method = "Nelder-Mead", iter.qN = 2000)) %>%
  summary()
verbose just prints what is happening during the optimization, and you can change the number of iterations the method does with iter.qN. Read more about the different optimization algorithms in the optim() documentation; each of them has its pros and cons. Nelder-Mead, used in the example, is a heuristic method and seems to run with the example data you provided. However, it is very inconsistent: with different starting values it gives somewhat different results and might not find the actual global optimum. For the complete data, you will probably need to experiment with which method and how many iterations work, for example L-BFGS-B with different random starting points.
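A rough sketch of that kind of testing might look like this (the loop, the set of candidate methods, and the fit$hessian check are illustrative assumptions, not something grm provides out of the box):

library(ltm)
library(dplyr)

items <- dset %>% select(Apathy5, Apathy6, Apathy7, Apathy8)

# Try a few optim() methods with several random starts each and keep only the
# fits whose Hessian is finite (assuming the fitted grm object stores it as $hessian
# when Hessian = TRUE).
methods <- c("BFGS", "Nelder-Mead", "L-BFGS-B")
fits <- list()
for (m in methods) {
  for (i in 1:5) {
    fit <- try(grm(items, IRT.param = TRUE, Hessian = TRUE, start.val = "random",
                   control = list(method = m, iter.qN = 2000)),
               silent = TRUE)
    if (inherits(fit, "try-error")) next
    if (all(is.finite(fit$hessian))) {
      fits[[paste(m, i, sep = "_")]] <- fit
    }
  }
}

# Compare the log-likelihoods of the usable fits
sapply(fits, logLik)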
Upvotes: 1