Reputation: 2807
I have a linear model (an lm object) and use the margins package to calculate marginal effects of the regressors. As far as I understand, the marginal effect equals the partial effect (the coefficient) if the regressor appears only once in the model. That is the case for kids.
library("car")
library("plm")
data("LaborSupply", package = "plm")
# Regression
lm1 <- lm(lnwg ~ kids + age + I(age^2), data = LaborSupply)
# kids appears only once in the model
summary(lm1) # partial effect of kids -2.182e-02
summary(margins(lm1)) # equals marginal effect -0.0218
Output (coefficients from summary(lm1), then summary(margins(lm1))):
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 1.218e+00 1.228e-01 9.921 < 2e-16 ***
kids -2.182e-02 5.398e-03 -4.043 5.36e-05 ***
age 6.704e-02 6.392e-03 10.488 < 2e-16 ***
I(age^2) -7.465e-04 7.936e-05 -9.406 < 2e-16 ***
factor AME SE z p lower upper
age 0.0089 0.0007 12.0683 0.0000 0.0075 0.0104
kids -0.0218 0.0054 -4.0426 0.0001 -0.0324 -0.0112
But why is the marginal effect of age not equal to:
6.704e-02 + 2 * (-7.465e-04) = 0.065547
I mean, shouldn't it be equal to the partial derivative of my model formula with respect to age?
Upvotes: 1
Views: 383
Reputation: 46908
You are right that it should be the partial derivative of the term, but bear in mind that if you differentiate your formula with respect to age, you get:
beta1 + 2*(beta2)*age
where beta1 is the coefficient for age and beta2 is the coefficient for I(age^2), as obtained from the model.
This means the marginal effect of age varies with age. The margins package therefore returns the average of the marginal effect across all age values in the data, which is equivalent to:
lm1 <- lm(lnwg ~ kids + age + I(age^2), data = LaborSupply)
# average of beta1 + 2*beta2*age over all observed ages
mean(coefficients(lm1)[3] + 2*LaborSupply$age*coefficients(lm1)[4])
[1] 0.008938904
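Since the effect depends on age, you can also plug in specific ages yourself to see how it changes (a quick sketch reusing lm1 from above; 30 and 50 are just illustrative values):
b <- coefficients(lm1)
b["age"] + 2 * b["I(age^2)"] * 30   # roughly 0.022 at age 30
b["age"] + 2 * b["I(age^2)"] * 50   # roughly -0.008 at age 50
# your 0.065547 is what this derivative gives at age = 1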
To see this more clearly, do:
lm2 <- lm(lnwg ~ age + I(age^2), data = LaborSupply)
margins(lm2)
Average marginal effects
lm(formula = lnwg ~ age + I(age^2), data = LaborSupply)
age
0.009625
The coefficients are different (because kids is not included), but you can see from the "Average marginal effects" heading in the output that margins reports the effect averaged over all observations rather than the derivative at any single age.
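If you want to check the averaging by hand for lm2 as well, you can compute the per-observation effects directly; their mean should line up with the 0.009625 above (a quick sketch; margins uses a numerical approximation of the derivative, so tiny differences are possible):
me_age <- coefficients(lm2)["age"] + 2 * coefficients(lm2)["I(age^2)"] * LaborSupply$age
range(me_age)   # the marginal effect of age varies across the sample
mean(me_age)    # average marginal effect, matching margins(lm2)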
Upvotes: 2