Hello,
I have a large leopard occurrence data set from camera trap surveys done across multiple regions throughout the Western Cape. I am looking to estimate density for each survey area separately, as well as overall. While I could keep things simple and just use the most basic of scripts, I am excited about (and frankly overwhelmed by) all the possibilities that secr has to offer. Unfortunately I am very new to R and struggling to understand which covariates, behavioural response models, Dsurface options, etc. to incorporate, and how, to ultimately come up with a density estimate that makes sense.
I'm trying to estimate density in conjunction with varying land cover and land use, using covariates like human footprint, vegetation status, altitude, some of the WorldClim variables, anthropogenic biomes and GlobCover. Perhaps this is too many?
After much frustration (and likely some dumb luck) I have the script running and providing density estimates, but I don't have much confidence in the results. I'm currently running each regional camera trap survey separately (I made master trap and capture files covering all the surveys, but haven't been able to get that combined version to run).
This is one of the scripts I'm currently using for one survey region with 48 camera stations and 210 captures over 4 months (2 survey sessions):
PP_robCH_allCov <- read.capthist(captfile = 'robertson_CAP.csv',
    trapfile = 'robertson_TRAP.csv',
    detector = 'count', fmt = 'trapID', covnames = 'sex',
    trapcovnames = c('/Hft', 'Veg4_terr_eco_status', 'Veg4_cons_value',
                     'Veg4_prot_value', 'Veg4_eco_status', 'Altitude',
                     'Bio_4', 'Bio_12', 'Bio_15', 'Bio_1',
                     'Anthro_biomes', 'Globcover_2009'))
secr.fit(PP_robCH_allCov, model = g0 ~ 1, trace = FALSE, buffer = 7000)
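I've also tried saving the fit to an object so I can look at it afterwards (I'm not sure this is the recommended workflow, and fit.null is just my name for the object):

```r
# Save the fitted model instead of just printing it, so I can query it later
fit.null <- secr.fit(PP_robCH_allCov, model = g0 ~ 1,
                     trace = FALSE, buffer = 7000)

# If I understand the secr documentation correctly, the density ('D')
# reported here is in animals per hectare
predict(fit.null)
```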
RESULTS:
Fitted (real) parameters evaluated at base levels of covariates
session = RoberstonPhase1
link estimate SE.estimate lcl ucl
D log 8.800222e-05 1.443208e-05 6.394763e-05 1.211052e-04
g0 log 1.969809e-02 2.577642e-03 1.525845e-02 2.542949e-02
sigma log 4.155779e+03 2.106761e+02 3.762952e+03 4.589614e+03
session = RoberstonPhase2
link estimate SE.estimate lcl ucl
D log 8.800222e-05 1.443208e-05 6.394763e-05 1.211052e-04
g0 log 1.969809e-02 2.577642e-03 1.525845e-02 2.542949e-02
sigma log 4.155779e+03 2.106761e+02 3.762952e+03 4.589614e+03
...AND WITH NO BUFFER ARGUMENT (so, if I understand right, secr's default of buffer = 100 m):
secr.fit(PP_robCH_allCov, model = g0 ~ 1, trace = FALSE)
Fitted (real) parameters evaluated at base levels of covariates
session = RoberstonPhase1
link estimate SE.estimate lcl ucl
D log 1.331803e-01 2.177714e-02 9.686607e-02 1.831083e-01
g0 log 6.150240e-04 5.707113e-05 5.129489e-04 7.374117e-04
sigma log 4.548505e+09 6.777801e+01 4.548505e+09 4.548505e+09
session = RoberstonPhase2
link estimate SE.estimate lcl ucl
D log 1.331803e-01 2.177714e-02 9.686607e-02 1.831083e-01
g0 log 6.150240e-04 5.707113e-05 5.129489e-04 7.374117e-04
sigma log 4.548505e+09 6.777801e+01 4.548505e+09 4.548505e+09
The problem is that, with the buffer, 8.8 leopards per hectare makes no sense. Using Capture and other more basic density equations, we have a ballpark idea that leopard density should be maybe 1.3 to 1.6 leopards/100 km², and this density would mean there are 880 leopards/100 km²!?! I chose the 7000 m (7 km) buffer because it's about the average distance moved per day by both sexes. Without the buffer I'm still getting 133 leopards/100 km². Am I using the buffer completely wrong? I tried the suggest.buffer command:
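To sanity-check my unit conversion (assuming I'm right that secr reports D in animals per hectare), I tried the arithmetic directly in R, and it gives 0.88 rather than 880, so possibly I'm just misreading the scientific notation in the output?

```r
# secr reports density in animals per hectare; 100 km^2 = 10000 ha
D_per_ha <- 8.800222e-05          # buffered D estimate from the output above
D_per_100km2 <- D_per_ha * 10000  # leopards per 100 km^2
D_per_100km2                      # 0.8800222
```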
suggest.buffer(PP_robCH_allCov, detectfn = NULL, detectpar = NULL,
               noccasions = 545, ignoreusage = FALSE, RBtarget = 0.001,
               interval = NULL, binomN = NULL)
and got: [1] 19061
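I also came across esa.plot in the help and tried something like the following on a saved fit (fit.null is just my name for the fitted model object). Am I right that the buffer is wide enough once the curve levels off?

```r
# Plot effective sampling area against buffer width; as I understand it,
# the estimate should stabilise once the buffer is wide enough
# (fit.null is a previously saved secr.fit object)
esa.plot(fit.null, max.buffer = 20000)
```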
I'm not sure how to use this value. Apologies for throwing so much into one message, but could you please explain whether and how to use the buffer, and whether you think my script would benefit from adding or omitting any covariates, commands, etc.? Am I using some of these commands incorrectly? Am I interpreting the output wrong?
I'd eventually love to try the mask and Dsurface options, but first I'm desperate to get a density estimate that makes sense!
A huuuuge thank you in advance!!! (apologies for the novel!)