In case it wasn't clear, I was *not* trying to address the issue of the
relation between the number/size of objects and the size of memory available.
I'm well aware that there are many problems there, *and* very little research
on them (if anybody knows of anything, I want to hear about it). I shouldn't
have said anything about embedded systems; they're irrelevant.
What can and should be done is to formalize the limitations that implementors
wire into the representations of data objects. Those limits will be there
no matter how much or how little memory is available, since they have to
do with the bit patterns of data structures. If you have a program that
wants a 1025 element array in a Lisp that limits arrays to 1024 elements,
making the heap bigger isn't going to help.
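(As a sketch of the kind of formalization I mean: ANSI Common Lisp exposes
several of these wired-in limits as constants, so a program can test them
portably before it tries to build the object. The 1025 figure is just the
example above, not anything special.)

```lisp
;; Representation limits are queryable constants, independent of heap size.
(when (>= 1025 array-dimension-limit)
  (error "This implementation cannot represent a 1025-element array"))

;; Other limits of the same flavor:
;;   array-total-size-limit  -- max total elements in any array
;;   char-code-limit         -- upper bound on character codes
;;   most-positive-fixnum    -- largest immediate integer
```

The point is that these bounds come from the bit patterns chosen for the
representations, so no amount of extra heap changes them.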