RMark Memory Issue


Postby EmmaRigby » Wed Apr 22, 2009 9:30 am

Hello,
I've been trying to run some multi-state models on a large dataset using MARK, but moved to RMark because the MARK interface kept crashing. I've created my script file and tried to run the models, but I keep getting the following error:

Error : cannot allocate vector of size 500.8 Mb

I'm using a 32-bit Windows XP machine with 4 GB of RAM and a 2.13 GHz processor. I've had a play with the memory.limit and memory.size options and tried looking at the help file, but nothing seems to work. Has anyone else encountered a similar problem? Is there any way around it other than using a 64-bit machine and/or Linux? It will be difficult to get hold of one of those machines, and I presume it would be even harder to work with RMark and MARK if I'm unfamiliar with a non-Windows interface.
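For reference, the sort of thing I tried with the memory options looked roughly like this (just a sketch; the limit value is only an example):

# Check the current limit and usage (Windows-only functions)
memory.limit()            # current limit in MB
memory.size(max = TRUE)   # maximum memory obtained from the OS so far, in MB

# Try raising the limit towards the 32-bit ceiling (value in MB, just an example)
memory.limit(size = 4000)

# Clear the workspace and force garbage collection before running the models
rm(list = ls(all = TRUE))
gc()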

Any help would be much appreciated!

Thanks
Emma

Postby jlaake » Wed Apr 22, 2009 9:55 am

Have you tried external=TRUE? It saves the model object in a file rather than trying to hold it in memory. I think we communicated about that previously. If you are already using external=TRUE, then the only other thing I can suggest is to make sure you start with an empty workspace.
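Something along these lines (just a sketch; the data and ddl object names are placeholders for your processed data and design data):

# Start from an empty workspace so earlier model objects aren't held in memory
rm(list=ls(all=TRUE))
library(RMark)

# external=TRUE writes the fitted model object to a file in the working
# directory rather than keeping it in the R workspace
mymodel=mark(mydata.process, mydata.ddl,
    model.parameters=list(S=list(formula=~1), p=list(formula=~1), Psi=list(formula=~1)),
    external=TRUE)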

--jeff

Postby EmmaRigby » Wed Apr 22, 2009 10:37 am

Hi,
I made sure I deleted everything in the workspace before I started and used:

#REMOVE ALL CURRENT DATA FROM MEMORY:
rm(list=ls(all=TRUE))

at the beginning of my script. For my models I used external=TRUE, like this:


#CREATE MODELS TO RUN
Phi.dot.p.dot.Psi.distance=mark(wharfedale.process, wharfedale.ddl,
    model.parameters=list(S=Phi.dot, p=p.dot, Psi=Psi.distance),
    external=TRUE)


but even though this is my simplest model, I get the following message:


Error : cannot allocate vector of size 500.8 Mb

********Following model failed to run: S(~1)p(~1)Psi(~distance)**********


When I look in the working directory, my markxxx.vcv file is 501 MB. Does this mean the external=TRUE option has not worked, or that it has worked and my computer simply isn't able to run the models?

Thank you!

Postby jlaake » Wed Apr 22, 2009 12:28 pm

The problem is that the vcv matrix of the real parameters being read back into R is too large for the available memory, even though 0.5 GB is not that large relative to your 4 GB of RAM (on 32-bit Windows, R only gets about 2 GB of address space and needs a contiguous block for the vector, so allocations can fail well below the physical RAM). The external=TRUE option won't help if R can't first construct the object in memory to write it out to disk.

The only other suggestion I have is to use the logit link for Psi. That will reduce the number of real parameters, because RMark can simplify the real parameter structure with a logit link but not with an mlogit link.
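Roughly like this (a sketch using your object names, and assuming Psi.distance was defined as list(formula=~distance); the link goes in the parameter specification list):

Psi.distance.logit=list(formula=~distance, link="logit")

Phi.dot.p.dot.Psi.distance=mark(wharfedale.process, wharfedale.ddl,
    model.parameters=list(S=Phi.dot, p=p.dot, Psi=Psi.distance.logit),
    external=TRUE)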

I'm a bit confused because I thought we already covered this ground and you were up and running. What changed?

