jlaake wrote: The only limitation with RMark will be available computer memory for the var-cov matrix, whose size is the square of the number of parameters. If you choose to use RMark and have problems with memory, I can recode it so that it only computes the std errors and not the full var-cov matrix. Right now there is a vcv argument, but if you set it to FALSE, I don't think it will give you std errors.
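To get a feel for why the var-cov matrix is the bottleneck, here's a back-of-envelope sketch (an illustration only, not RMark's actual allocation): n parameters give an n x n matrix of 8-byte double-precision values, so memory grows with the square of the parameter count.

```python
# Rough memory estimate for an n x n variance-covariance matrix of doubles.
# This is a hypothetical helper for illustration, not part of RMark.
def vcv_bytes(n_params):
    """Bytes needed for an n x n var-cov matrix of 8-byte doubles."""
    return 8 * n_params ** 2

for n in (1_000, 5_000, 10_000, 20_000):
    gib = vcv_bytes(n) / 2**30
    print(f"{n:>6} parameters -> {gib:6.2f} GiB")
```

A model with 10,000 parameters already needs roughly 0.75 GiB just for the matrix; doubling to 20,000 parameters quadruples that to about 3 GiB, which is why the number of parameters, not the data, is what hits the memory ceiling first.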
--jeff
I've managed to recompile R and related things on a 64-bit Linux-based machine with >16 GB of usable RAM. Things could always be tested there. Gary has a compiled version of the numerical routines RMark calls, so in theory it should be fairly easy to link RMark and MARK *natively*.
However, things that should be easy are often anything but... I'd try RMark on a Windows machine with as much RAM as you can find first (although in fact stupid 32-bit Windows won't actually let a process use more than 4 GB).
p.s. For you tech-heads who care, in the 32-bit Windows world (meaning Win 2000, XP, and Vista), each application has its own "virtual" 4GB memory space. (This means that each application behaves as if it has a flat 4GB of memory to itself, and the system's memory manager keeps track of memory mapping, which applications are using which memory, page file management, etc.)
This 4GB "virtual" space is evenly divided into two parts, with 2GB dedicated for kernel usage, and 2GB left for application usage. Each application gets its own 2GB, but all applications have to share the same 2GB kernel space.
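Putting the two posts together, that 2GB user-space ceiling translates directly into a hard cap on model size. A quick sketch (again just arithmetic, assuming 8-byte doubles and ignoring everything else the process needs):

```python
import math

# Largest n x n matrix of 8-byte doubles that fits in the 2 GiB user-mode
# address space of a 32-bit Windows process (idealized: ignores the R
# interpreter, the data, and address-space fragmentation).
USER_SPACE_BYTES = 2 * 2**30    # 2 GiB available to the application
BYTES_PER_DOUBLE = 8

max_params = math.isqrt(USER_SPACE_BYTES // BYTES_PER_DOUBLE)
print(max_params)
```

That works out to at most 16,384 parameters in the absolute best case, and far fewer in practice, which is why the 64-bit Linux box with >16 GB looks attractive for the big models.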