Here is a working example using the ovenbird data set from secr:
Code:
# Subset to one session of oven bird data
testCH <- ovenCH[[1]]
testMask <- ovenmask[[1]]
# Time varying covariate
tmp <- secr::covariates(testCH)
tmp <- data.frame( tmp, matrix(rnorm(prod(dim(testCH)[1:2])), dim(testCH)[1], dim(testCH)[2]))
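# rename the default matrix column names (X1, X2, ...) to scl1, scl2, ...,
# one column per occasion, so they can be declared time-varying below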
dimnames(tmp)[[2]] <- sub("X","scl",dimnames(tmp)[[2]])
secr::covariates(testCH) <- tmp
secr::timevaryingcov(testCH) <- list( scl = c(2:ncol(tmp)))
# Temporal covariate
temporalDf <- data.frame(T = 1:ncol(testCH) - mean(1:ncol(testCH)))
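# NOTE: the column name "T" in temporalDf is what the formula lambda0 ~ T
# refers to, supplied through the timecov argument in the fit call below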
# Fit model
fit <- openCR::openCR.fit(
capthist = testCH,
type = "JSSAsecrb",
model = list(lambda0 ~ T,
b ~ Sex + scl,
phi ~ Sex + scl,
superD ~ 1,
sigma ~ 1),
mask = testMask,
detectfn = "HHN",
movementmodel = "static",
timecov = temporalDf,
trace = TRUE,
ncores = 1
)
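Before fitting, I sanity-check the covariate wiring with the standard secr accessors (this just prints the registered time-varying columns and the covariate data frame):
Code:
# which covariate columns are registered as time-varying
secr::timevaryingcov(testCH)
# per-animal covariates: Sex plus the occasion-indexed scl1, scl2, ... columns
str(secr::covariates(testCH))
# occasion-level covariate passed via timecov
head(temporalDf)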
The above works, but when I attempt the same thing on my real data set, I get the following error:
Code:
> fit <- openCR::openCR.fit(
+ capthist = turtleCH,
+ type = "JSSAsecrb",
+ model = list(lambda0 ~ T,
+ b ~ sex + scl,
+ phi ~ sex + scl,
+ superD ~ 1,
+ sigma ~1),
+ mask = turtleMask,
+ detectfn = "HHN",
+ movementmodel = "static",
+ timecov = capSpline3,
+ trace = TRUE,
+ ncores = parallel::detectCores() - 1
+ )
Preparing design matrices
Maximizing likelihood...
Eval Loglik lambda0 lambda0.T phi phi.sexMale phi.scl b b.sexMale b.scl superD sigma
Error in makegkParalleldcpp(as.integer(data$detectfn), as.integer(.openCRstuff$sigmai[type]), :
negative length vectors are not allowed
When I remove `T` from the lambda0 model, I get a different error:
Code:
Preparing design matrices
Maximizing likelihood...
Eval Loglik lambda0 phi phi.sexMale phi.scl b b.sexMale b.scl superD sigma
Error: cannot allocate vector of size 13.0 Gb
My data set is too large to include here. The dimension of `turtleCH` is 863 x 17 x 1485 (animals x occasions x detectors), and the dimension of `turtleMask` is 1620 x 2. The first six rows of `capSpline3` are:
Code:
> head(capSpline3)
spline3.1 spline3.2 spline3.3 T
1 0.0000000 0.000000000 0.0000000000 -8
2 0.1486626 0.008744856 0.0001714678 -7
3 0.2633745 0.032921811 0.0013717421 -6
4 0.4032922 0.115226337 0.0109739369 -4
5 0.4346708 0.167181070 0.0214334705 -3
6 0.4444444 0.222222222 0.0370370370 -2
The first six rows of `covariates(turtleCH)` are:
Code:
> head(covariates(turtleCH))
sex scl2000 scl2001 scl2002 scl2004 scl2005 scl2006 scl2007 scl2008 scl2009 scl2011 scl2012 scl2013 scl2014 scl2015 scl2016 scl2017 scl2018
3 Female 89.3 89.3 89.3 89.2 89.1 89.0 88.9 88.9 88.8 88.6 88.6 88.5 88.9 89.3 89.0 88.8 88.5
172 Female 89.3 89.3 89.3 89.3 89.3 89.3 89.3 89.3 89.2 89.0 88.9 88.8 88.8 88.7 88.6 88.5 88.4
194 Female 78.7 79.0 79.3 79.9 80.2 80.6 80.9 81.2 81.5 82.1 82.4 82.7 83.0 83.3 83.6 83.9 84.2
213 Male 89.6 89.6 89.6 89.6 89.6 89.6 89.6 89.6 89.6 89.6 89.7 89.5 89.4 89.2 89.0 88.9 88.7
214 Female 65.7 66.5 67.2 68.7 69.5 70.3 71.0 71.8 72.6 74.1 74.8 75.6 76.3 77.1 77.9 78.6 79.4
215 Female 71.2 71.8 72.3 73.5 74.1 74.6 75.2 75.8 76.3 77.5 78.0 78.6 79.2 79.7 80.3 80.9 81.4
Given the difference in errors when I remove the temporal covariate `T` from lambda0, I infer that something goes wrong when individual-by-occasion covariates and temporal covariates are both included in lambda0. Or perhaps my data set is simply too large? Any insight into the `negative length vectors are not allowed` error would be greatly appreciated.
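For what it's worth, a back-of-the-envelope calculation (assuming, purely as a guess about the openCR internals, that some array is sized per animal x occasion x detector x mask point) suggests an integer overflow, which would produce a negative length if computed in 32-bit arithmetic:
Code:
# rough sketch only: product of my data dimensions
n <- 863; occasions <- 17; detectors <- 1485; maskpts <- 1620
len <- n * occasions * detectors * maskpts
len                         # ~3.53e10 elements
len > .Machine$integer.max  # TRUE: would overflow a 32-bit length
len * 8 / 2^30              # ~263 Gb if allocated as doubles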
Finally, any suggestions for reducing the memory requirement, i.e., for getting past the `cannot allocate vector of size 13.0 Gb` error? I have a machine with 64 Gb of RAM and several terabytes free on the hard drive.
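In case it helps frame an answer, the reductions I am considering are a coarser mask, pooled occasions, and a single core (my understanding is that each parallel worker may hold its own copy of large objects). A sketch, with hypothetical buffer and spacing values for my study area:
Code:
# coarser habitat mask: larger spacing means fewer mask points
coarseMask <- secr::make.mask(secr::traps(turtleCH),
                              buffer = 1000, spacing = 500)  # hypothetical values
# pool pairs of sampling occasions to shrink the occasion dimension
# (if I understand reduce(), 'by' gives the number of old occasions per new one)
pooledCH <- secr::reduce(turtleCH, by = 2)
# and refit with ncores = 1 to avoid per-worker copies of the data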