coefvlm.Rd

Extracts the estimated coefficients from VLM objects such as VGLMs.
Usage:

coefvlm(object, matrix.out = FALSE, label = TRUE, colon = FALSE, ...)

Arguments:

object
An object for which the extraction of coefficients is meaningful.
This will usually be a vglm object.
matrix.out
Logical. If TRUE then a matrix is returned.
The explanatory variables are the rows;
the linear/additive predictors are the columns.
The constraint matrices are used to compute this matrix.
label
Logical. If FALSE then the names
of the vector of coefficients are set to NULL.
colon
Logical. Explanatory variables that appear in more than one
linear/additive predictor are labelled with a colon,
e.g., age:1, age:2.
However, if a variable appears in only one linear/additive predictor
then the :1 is omitted by default; setting colon = TRUE adds the :1.
...
Currently unused.
Details:

This function works in a similar way to
applying coef() to an lm or glm object.
However, for VGLMs, more options are available.
Value:

Usually a vector; a matrix if matrix.out = TRUE.
References:

Yee, T. W. and Hastie, T. J. (2003). Reduced-rank vector generalized linear models. Statistical Modelling, 3, 15–41.
Examples:

zdata <- data.frame(x2 = runif(nn <- 200))
zdata <- transform(zdata, pstr0 = logitlink(-0.5 + 1*x2, inverse = TRUE),
lambda = loglink( 0.5 + 2*x2, inverse = TRUE))
zdata <- transform(zdata, y2 = rzipois(nn, lambda, pstr0 = pstr0))
fit2 <- vglm(y2 ~ x2, zipoisson(zero = 1), data = zdata, trace = TRUE)
#> Iteration 1: loglikelihood = -326.12618
#> Iteration 2: loglikelihood = -324.79101
#> Iteration 3: loglikelihood = -324.70759
#> Iteration 4: loglikelihood = -324.69898
#> Iteration 5: loglikelihood = -324.69804
#> Iteration 6: loglikelihood = -324.69793
#> Iteration 7: loglikelihood = -324.69791
#> Iteration 8: loglikelihood = -324.69791
coef(fit2, matrix = TRUE) # Always a good idea
#> logitlink(pstr0) loglink(lambda)
#> (Intercept) -0.09140163 0.3102218
#> x2 0.00000000 2.1841194
coef(fit2)
#> (Intercept):1 (Intercept):2 x2
#> -0.09140163 0.31022181 2.18411941
coef(fit2, colon = TRUE)
#> (Intercept):1 (Intercept):2 x2:1
#> -0.09140163 0.31022181 2.18411941
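The label argument is not demonstrated above; a minimal sketch, reusing the fit2 object from the example (the coefficient values are unchanged, only the names are dropped):

```r
# With label = FALSE the returned numeric vector has NULL names,
# which can be convenient when passing coefficients to numeric code.
coef(fit2, label = FALSE)
names(coef(fit2, label = FALSE))  # NULL
```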