logLikvlm.Rd

Calculates the log-likelihood value or the element-by-element contributions of the log-likelihood.

Usage

# S3 method for class 'vlm'
logLik(object, summation = TRUE, ...)

Arguments

object
Some VGAM object, for example, having class vglmff-class.

summation
Logical, apply sum?
If FALSE then an \(n\)-vector or \(n\)-row matrix (with the number of responses as the number of columns) is returned.
Each element is the contribution to the log-likelihood.

...
Currently unused.
In the future, other possible arguments may be fed into logLik in order to compute the log-likelihood.
Details

By default, this function returns the log-likelihood of the object. Thus the code relies on the log-likelihood being defined, and computed, for the object.
Value

Returns the log-likelihood of the object.
If summation = FALSE then an \(n\)-vector or \(n\)-row matrix (with the number of responses as the number of columns) is returned.
Each element is the contribution to the log-likelihood.
The prior weights are assimilated within the answer.
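To see how prior weights enter the element-wise contributions, the following sketch (assuming the VGAM package with the poissonff family, and hypothetical data names wdata, fitw) recomputes each contribution by hand as the prior weight times the Poisson log-density at the fitted mean:

```r
library(VGAM)

set.seed(123)
wdata <- data.frame(x2 = runif(30), w = rep(1:2, length.out = 30))
wdata <- transform(wdata, y = rpois(30, lambda = exp(1 + x2)))

# Fit a Poisson VGLM with prior weights.
fitw <- vglm(y ~ x2, poissonff, weights = w, data = wdata)

# Each contribution should be the prior weight times the log-density,
# and the contributions should add back up to the scalar log-likelihood.
ll.each <- unname(c(logLik(fitw, summation = FALSE)))
manual  <- unname(with(wdata, w * dpois(y, lambda = c(fitted(fitw)), log = TRUE)))
stopifnot(all.equal(ll.each, manual))
stopifnot(all.equal(sum(ll.each), c(logLik(fitw))))
```

If the checks pass silently, the prior weights have indeed been absorbed into the returned contributions.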
Warning

Not all VGAM family functions currently have the summation argument implemented.
Not all VGAM family functions have had the summation checked.
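Given this caveat, one quick consistency check for a particular family function is to verify that the element-wise contributions add back up to the default (summed) log-likelihood. A minimal sketch, assuming the VGAM package and the poissonff family (data names pdata and fit are illustrative):

```r
library(VGAM)

set.seed(1)
pdata <- data.frame(x2 = runif(40))
pdata <- transform(pdata, y = rpois(40, lambda = exp(1 + x2)))
fit <- vglm(y ~ x2, poissonff, data = pdata)

# If summation = FALSE is implemented for this family, the pieces
# should sum to the default scalar log-likelihood.
ll.sum   <- c(logLik(fit))
ll.parts <- logLik(fit, summation = FALSE)
stopifnot(all.equal(sum(ll.parts), ll.sum))
```

A family that has not implemented summation = FALSE will typically signal an error instead, which tryCatch() can be used to detect.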
See also

VGLMs are described in vglm-class;
VGAMs are described in vgam-class;
RR-VGLMs are described in rrvglm-class;
AIC;
anova.vglm.

Examples
zdata <- data.frame(x2 = runif(nn <- 50))
zdata <- transform(zdata, Ps01 = logitlink(-0.5 , inverse = TRUE),
Ps02 = logitlink( 0.5 , inverse = TRUE),
lambda1 = loglink(-0.5 + 2*x2, inverse = TRUE),
lambda2 = loglink( 0.5 + 2*x2, inverse = TRUE))
zdata <- transform(zdata, y1 = rzipois(nn, lambda = lambda1, pstr0 = Ps01),
y2 = rzipois(nn, lambda = lambda2, pstr0 = Ps02))
with(zdata, table(y1)) # Eyeball the data
#> y1
#> 0 1 2 3 4 5
#> 28 9 8 3 1 1
with(zdata, table(y2))
#> y2
#> 0 1 2 3 4 5 6 7 9 11 13
#> 33 3 3 1 2 2 1 1 2 1 1
fit2 <- vglm(cbind(y1, y2) ~ x2, zipoisson(zero = NULL), data = zdata)
logLik(fit2) # Summed over the two responses
#> [1] -131.0854
sum(logLik(fit2, summation = FALSE))  # For checking purposes
#> [1] -131.0854
(ll.matrix <- logLik(fit2, summation = FALSE))  # nn x 2 matrix
#> y1 y2
#> 1 -1.7788391 -0.4551255
#> 2 -0.6068569 -4.2666074
#> 3 -0.2775615 -0.4653479
#> 4 -1.0743810 -2.5490054
#> 5 -2.1705889 -0.4550957
#> 6 -1.6022889 -0.4371667
#> 7 -1.4202158 -0.3346933
#> 8 -0.8077211 -0.4179719
#> 9 -2.6313808 -0.3737048
#> 10 -0.4640544 -0.4610976
#> 11 -0.3471433 -0.3373881
#> 12 -1.0913405 -0.4423398
#> 13 -3.4556299 -3.3103778
#> 14 -0.4074659 -0.3557256
#> 15 -1.9300326 -0.4575258
#> 16 -0.9970949 -0.4501086
#> 17 -0.3170426 -0.3265559
#> 18 -1.0319933 -0.4181447
#> 19 -0.6116250 -0.3960609
#> 20 -2.7416074 -0.3662881
#> 21 -0.4758236 -3.1213778
#> 22 -1.6266301 -3.1535223
#> 23 -0.7447630 -0.4558961
#> 24 -0.7142881 -0.4088163
#> 25 -0.5462736 -0.4595263
#> 26 -3.6788163 -4.0823170
#> 27 -1.5529848 -0.4522423
#> 28 -0.8521433 -2.9127306
#> 29 -2.3256844 -0.3934590
#> 30 -0.3548307 -0.3399595
#> 31 -0.3826318 -3.4900884
#> 32 -1.5451676 -5.4354411
#> 33 -2.1572908 -3.1876624
#> 34 -1.2919888 -0.3582631
#> 35 -0.9188985 -0.4522846
#> 36 -2.1703747 -2.9608889
#> 37 -1.0910099 -0.4422139
#> 38 -0.7967828 -0.4549033
#> 39 -1.4743800 -0.4509816
#> 40 -0.3706857 -0.3450336
#> 41 -0.2009077 -0.4677222
#> 42 -0.4079623 -2.4244236
#> 43 -0.3892025 -3.0649824
#> 44 -0.9079196 -0.4261489
#> 45 -1.1498986 -2.6999145
#> 46 -2.0139713 -6.0888741
#> 47 -0.6817177 -2.5623816
#> 48 -0.3414784 -2.8876591
#> 49 -0.4324802 -0.3622352
#> 50 -1.8489187 -0.4563923
colSums(ll.matrix) # log-likelihood for each response
#> y1 y2
#> -59.21077 -71.87467
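Since AIC is listed under See also, a related check is that AIC for a vglm fit equals \(-2\) times the log-likelihood plus twice the number of estimated coefficients. A sketch under that assumption, using a single zero-inflated Poisson response (data names zdata2 and fit1 are illustrative):

```r
library(VGAM)

set.seed(123)
zdata2 <- data.frame(x2 = runif(50))
zdata2 <- transform(zdata2,
                    y1 = rzipois(50, lambda = exp(-0.5 + 2*x2), pstr0 = 0.3))
fit1 <- vglm(y1 ~ x2, zipoisson, data = zdata2)

# Assumed relationship: AIC = -2*logLik + 2*(number of coefficients).
stopifnot(all.equal(AIC(fit1),
                    -2 * c(logLik(fit1)) + 2 * length(coef(fit1))))
```

This ties the log-likelihood extracted here to the information criteria built on top of it.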