Model averaging with >300 parameters

questions concerning analysis/theory using program MARK

Model averaging with >300 parameters

Postby gillian » Mon Aug 20, 2007 4:35 pm

I am wondering whether there is a way to have MARK model average more than 300 parameters when using the non-interactive model-averaging parameter specification window (selected under File->Preferences; I switched to this window because the complexity/size of my models was causing MARK to crash via the usual model-averaging route). The non-interactive window limits the number of parameters to 300, and when I enter any parameter number higher than 300 to be model averaged, MARK crashes.

Thanks for any assistance,
Gillian Hadley
gillian
 
Posts: 4
Joined: Thu Apr 22, 2004 4:32 pm
Location: Lakeview, MT

Model averaging with >300 parameters

Postby gwhite » Tue Aug 21, 2007 11:07 am

Gillian:
There is some kind of memory limitation with Visual Objects such that I could never do more than 300 parameters. I haven't explicitly coded this limitation anywhere that I could find, but I remember an issue with >300 as you describe.

Gary
gwhite
 
Posts: 340
Joined: Fri May 16, 2003 9:05 am

Re: Model averaging with >300 parameters

Postby cooch » Tue Aug 21, 2007 12:22 pm

gwhite wrote: Gillian:
There is some kind of memory limitation with Visual Objects such that I could never do more than 300 parameters. I haven't explicitly coded this limitation anywhere that I could find, but I remember an issue with >300 as you describe.

Gary

One option might be the RMark package, which I *think* can handle this...worth a look, at any rate.

See Appendix C of 'the book'.
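
For anyone wanting to sanity-check what MARK or RMark reports, the model-averaging arithmetic itself (Akaike weights and the unconditional SE of Buckland et al. 1997, as covered in Appendix C) is simple enough to script. A minimal Python sketch, with made-up AICc values, estimates, and SEs standing in for a real candidate model set:

```python
import math

# Hypothetical AICc values and estimates of the SAME parameter
# under each model in a (tiny, made-up) candidate set.
aicc  = [1402.3, 1404.1, 1407.8]
theta = [0.82, 0.79, 0.85]
se    = [0.031, 0.040, 0.052]

# Akaike weights: w_i = exp(-Delta_i / 2) / sum_j exp(-Delta_j / 2)
delta = [a - min(aicc) for a in aicc]
raw   = [math.exp(-d / 2) for d in delta]
w     = [r / sum(raw) for r in raw]

# Model-averaged estimate and unconditional SE (Buckland et al. 1997)
theta_bar = sum(wi * ti for wi, ti in zip(w, theta))
se_uncond = sum(wi * math.sqrt(si**2 + (ti - theta_bar)**2)
                for wi, si, ti in zip(w, se, theta))
print(round(theta_bar, 4), round(se_uncond, 4))
```

Swap in the AICc values and estimates from your own model set; the unconditional SE line is the standard Buckland et al. formula, which inflates the weighted SE by the between-model spread of the estimates.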
cooch
 
Posts: 1654
Joined: Thu May 15, 2003 4:11 pm
Location: Cornell University

model averaging >300 parameters

Postby jlaake » Tue Aug 21, 2007 12:42 pm

The only limitation with RMark will be available computer memory for the var-cov matrix, whose size grows as the square of the number of parameters. If you choose to use RMark and have problems with memory, I can recode it so that it computes only the std errors and not the full var-cov matrix. Right now there is a vcv argument, but if you set it to FALSE, I don't think it will give you std errors.

--jeff
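
To get a feel for the memory Jeff mentions: a dense n × n var-cov matrix of 8-byte doubles needs roughly 8n² bytes. A quick back-of-envelope sketch in Python (just the arithmetic, not RMark code; it ignores R's object overhead and any temporary copies made during the computation, which can multiply the footprint):

```python
# Approximate memory for a dense n x n variance-covariance matrix
# of 8-byte doubles; ignores R object overhead and working copies.
def vcv_bytes(n_params: int) -> int:
    return 8 * n_params * n_params

for n in (300, 1000, 5000):
    print(f"{n:5d} parameters -> {vcv_bytes(n) / 1024**2:8.1f} MB")
```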
jlaake
 
Posts: 1480
Joined: Fri May 12, 2006 12:50 pm
Location: Escondido, CA

Re: model averaging >300 parameters

Postby cooch » Tue Aug 21, 2007 12:53 pm

jlaake wrote: The only limitation with RMark will be available computer memory for the var-cov matrix, whose size grows as the square of the number of parameters. If you choose to use RMark and have problems with memory, I can recode it so that it computes only the std errors and not the full var-cov matrix. Right now there is a vcv argument, but if you set it to FALSE, I don't think it will give you std errors.

--jeff

I've managed to recompile R and related things on a 64-bit Linux-based machine with >16 Gb usable RAM. Things could always be tested there. Gary has a compiled version of the numerical routines RMark calls, so in theory, it should be fairly easy to link RMark and MARK *natively*.

However, things that should be easy are often anything but... I'd try RMark on a Windows machine with as much RAM as you can find first (although, as the p.s. below explains, 32-bit Windows won't actually let a single application use more than 2 Gb of that).

p.s. For you tech-heads who care, in the 32-bit Windows world (meaning Win 2000, XP, and Vista), each application has its own “virtual” 4GB memory space. (This means that each application behaves as if it has a flat 4GB of memory, and the system's memory manager keeps track of memory mapping, which applications are using which memory, page-file management, etc.)

This 4GB "virtual" space is evenly divided into two parts, with 2GB dedicated for kernel usage, and 2GB left for application usage. Each application gets its own 2GB, but all applications have to share the same 2GB kernel space.
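
The split described above is just address-space arithmetic; a trivial Python check of the figures:

```python
# 32-bit address-space arithmetic: 2**32 addressable bytes in total,
# split by default into a 2 GiB kernel half and a 2 GiB user half.
total_bytes  = 2**32
kernel_bytes = total_bytes // 2
user_bytes   = total_bytes - kernel_bytes
print(total_bytes / 1024**3, kernel_bytes / 1024**3, user_bytes / 1024**3)
# prints: 4.0 2.0 2.0
```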
cooch
 
Posts: 1654
Joined: Thu May 15, 2003 4:11 pm
Location: Cornell University

