hi!
I am studying the survival patterns of a growing population of Mediterranean gulls in the Netherlands and Belgium. We have two age groups (juveniles and adults).
We are testing the hypothesis that (local) survival depends on the density (d) of breeding pairs. This appears to be the case, and in a non-linear way (model phi(a+d) versus phi(a+d+d^2): ΔAICc = 30.20, χ² = 32.227, df = 1, P < 0.0001).
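In case it is useful, here is a minimal Python sketch of the likelihood-ratio test behind these numbers. The deviance of the reduced model is a hypothetical placeholder (in practice you take both deviances from the MARK output); only the difference of 32.227 is from my results:

```python
from scipy.stats import chi2

# Likelihood-ratio test between the nested models phi(a+d) and phi(a+d+d^2).
dev_reduced = 1234.567           # deviance of phi(a+d); placeholder value
dev_full = dev_reduced - 32.227  # deviance of phi(a+d+d^2); the difference is the LRT

lrt = dev_reduced - dev_full     # chi-square statistic
df = 1                           # one extra parameter (the d^2 term)
p = chi2.sf(lrt, df)             # upper-tail probability

print(f"chi^2 = {lrt:.3f}, df = {df}, P = {p:.2e}")
```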
When I then run the model phi(a+d+d^2+a*d+a*d^2) (fully age-dependent density trends), I get the following results.
To show you, here is the design matrix of a+d+d^2+a*d+a*d^2:
int  age  d      d^2      age*d  age*d^2
1    0    0.096  0.00922  0      0
1    0    0.141  0.01988  0      0
1    0    0.118  0.01392  0      0
1    0    0.018  0.00032  0      0
1    0    0.218  0.04752  0      0
1    0    0.272  0.07398  0      0
1    0    0.426  0.18148  0      0
1    0    0.456  0.20794  0      0
1    0    0.663  0.43957  0      0
1    0    0.878  0.77088  0      0
1    0    1.013  1.02617  0      0
1    0    1.367  1.86869  0      0
1    0    1.280  1.63840  0      0
1    0    1.025  1.05063  0      0
1    1    0.141  0.01988  0.141  0.01988
1    1    0.118  0.01392  0.118  0.01392
1    1    0.018  0.00032  0.018  0.00032
1    1    0.218  0.04752  0.218  0.04752
1    1    0.272  0.07398  0.272  0.07398
1    1    0.426  0.18148  0.426  0.18148
1    1    0.456  0.20794  0.456  0.20794
1    1    0.663  0.43957  0.663  0.43957
1    1    0.878  0.77088  0.878  0.77088
1    1    1.013  1.02617  1.013  1.02617
1    1    1.367  1.86869  1.367  1.86869
1    1    1.280  1.63840  1.280  1.63840
1    1    1.025  1.05063  1.025  1.05063
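For reference, a minimal numpy sketch that builds this design matrix from the density vector; the 0/1 age coding and the column order follow the matrix above, everything else is just illustration:

```python
import numpy as np

# Annual breeding-pair densities (d) taken from the design matrix above;
# the first value (0.096) only occurs for the age-0 group.
d_grp0 = np.array([0.096, 0.141, 0.118, 0.018, 0.218, 0.272, 0.426,
                   0.456, 0.663, 0.878, 1.013, 1.367, 1.280, 1.025])
d_grp1 = d_grp0[1:]  # the age-1 group has one interval fewer

def rows(d, age):
    """Design-matrix rows: int, age, d, d^2, age*d, age*d^2."""
    return np.column_stack([np.ones_like(d), np.full_like(d, age),
                            d, d**2, age * d, age * d**2])

X = np.vstack([rows(d_grp0, 0), rows(d_grp1, 1)])
print(X)
```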
Results from this model:
Both the lower and upper 95% CI for d are positive for juveniles, meaning that at low densities survival goes up with increasing density. For adults, the beta of d is not different from zero (one CI limit is negative, the other positive), suggesting stable survival at low densities.
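The "different from zero" check is just the Wald interval on the logit scale; a tiny sketch, with hypothetical numbers purely for illustration (substitute the betas and SEs from the MARK output):

```python
def wald_ci(beta, se, z=1.96):
    """95% Wald confidence interval for a beta estimate (logit scale)."""
    return beta - z * se, beta + z * se

# Hypothetical values; use your own estimates.
lo, hi = wald_ci(beta=0.4, se=0.5)
print(f"95% CI: [{lo:.2f}, {hi:.2f}]; excludes zero: {lo > 0 or hi < 0}")
```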
This is all pretty straightforward. Now the tricky part...
When I look at the betas of the age-dependent term of d^2, I see that these are negative for both juveniles and adults. This means that the increase in survival at low densities slows down after a while. It does NOT necessarily mean that survival decreases with increasing density at higher densities (a parabolic relation); it could also mean that survival increases and later levels off... I want to find out whether survival at high densities decreases with increasing density or not.
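One way to look at this: on the logit scale a negative d^2 beta always gives a parabola, so whether a decline is actually visible in the data comes down to whether the vertex d* = -beta_d / (2 * beta_d2) falls inside the observed density range (the logit link is monotone, so the optimum on the logit scale is also the optimum of survival itself). A small sketch, with hypothetical betas purely for illustration:

```python
def quadratic_optimum(b_d, b_d2):
    """Density at which the linear predictor (and hence survival, through the
    monotone logit link) peaks: d* = -b_d / (2 * b_d2), valid when b_d2 < 0."""
    return -b_d / (2.0 * b_d2)

# Hypothetical betas; use your own estimates per age group
# (adding the interaction betas where the age coding requires it).
d_star = quadratic_optimum(b_d=3.0, b_d2=-2.2)
d_max_observed = 1.367  # highest density in the data (from the matrix above)
print(f"optimum at d* = {d_star:.3f}; "
      f"decline observable: {d_star < d_max_observed}")
```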
To investigate this, I thought to look at a raw model (a*t, where time is a factor, i.e. a separate estimate for each year for each age group) and plot these survival rates against density. This graph looks like a parabolic function. I can now take the optimum (the density at which survival is highest; 0.663 in my case, graph not shown). Knowing this optimum, I can make a model with ONLY a linear density dependence, including ONLY those densities equal to or higher than the optimum density. The design matrix looks like this:
int  age  d(age=0)  d(age=1)
1    0    0         0
1    0    0         0
1    0    0         0
1    0    0         0
1    0    0         0
1    0    0         0
1    0    0         0
1    0    0         0
1    0    0.663     0
1    0    0.878     0
1    0    1.013     0
1    0    1.367     0
1    0    1.280     0
1    0    1.025     0
1    1    0         0
1    1    0         0
1    1    0         0
1    1    0         0
1    1    0         0
1    1    0         0
1    1    0         0
1    1    0         0.663
1    1    0         0.878
1    1    0         1.013
1    1    0         1.367
1    1    0         1.280
1    1    0         1.025
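Again for reference, a numpy sketch of how this thresholded matrix is built (0.663 being the optimum read off the raw-estimate plot):

```python
import numpy as np

# Densities per interval, as in the matrices above.
d_grp0 = np.array([0.096, 0.141, 0.118, 0.018, 0.218, 0.272, 0.426,
                   0.456, 0.663, 0.878, 1.013, 1.367, 1.280, 1.025])
d_grp1 = d_grp0[1:]
opt = 0.663  # optimum density from the raw-estimate plot

def rows(d, age):
    """int, age, then one density column per age group, zeroed below the
    optimum so that only the high-density years carry a slope."""
    d_high = np.where(d >= opt, d, 0.0)
    return np.column_stack([np.ones_like(d), np.full_like(d, age),
                            (1 - age) * d_high, age * d_high])

X = np.vstack([rows(d_grp0, 0), rows(d_grp1, 1)])
print(X)
```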
This model now gives me an idea of the density trend (for high densities only) for each age group separately.
IS THIS CORRECT? I have the feeling it is not, since the betas do not really match the graph of the raw data... Does not (specifically) modelling the years with low density affect the density relation at higher densities? Would a better way of doing this be to run a linear density model on ONLY the years with high density (i.e. making a new file with only the recapture histories of the high-density years, which is possible since they are simply the later years)?
So... what I need to find out is: does survival at high densities decrease or not with increasing density?
any ideas?