[chuck-users] memory exhausted
lieber at princeton.edu
Sun Apr 5 11:15:09 EDT 2009
On Sun, Apr 5, 2009 at 10:37 AM, Gregory Brown <gwbrown at uga.edu> wrote:
> I have written a script that uses a rather large data file to generate a chuck
> file that is about 24000 lines long and 564KB. When I run the script I get a
> "memory exhausted" error. Are there chuck-imposed limits that I am running
> into? Is there a way to allocate more memory to chuck?
> I have tested this on two machines, both running OS X, ChucK 126.96.36.199, 2GB of
> RAM. It does not seem to matter what other processes are running or how much
> memory is available (real or virtual)… it always exits at the same line.
> Also -- when I cut the file down to a few thousand lines, it runs fine.
> Any thoughts would be great. Thanks,
This came up on the forum a while back:
Boils down to inefficiencies in the way the ChucK grammar is parsed:
right recursion instead of left. With right recursion, a yacc-generated
parser has to shift an entire list of items onto its parse stack before
it can reduce any of them, so stack use grows with the length of the
file -- hence the fixed-point failure at the same line no matter how
much system memory is free. The method of solving the problem --
rewriting the grammar -- is introduced near the bottom of the thread,
although it's a little more complicated than that. ChucK likely uses the
"wrong" type of recursion because that way a lot of things can be
easily represented as singly-linked lists rather than doubly-linked.
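In yacc/bison terms, the two shapes look like this (hypothetical rule
and action names -- the actual productions in chuck.y differ):

```yacc
/* right recursion: every stmt is shifted before the first reduction,
   so parse-stack depth grows with the length of the list */
stmt_list
    : stmt
    | stmt stmt_list        { $$ = prepend_stmt( $1, $2 ); }
    ;

/* left recursion: reduces as it goes, so stack depth stays bounded */
stmt_list
    : stmt
    | stmt_list stmt        { $$ = append_stmt( $1, $2 ); }
    ;
```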
If the grammar file chuck.y is altered to use left recursion, all of
the calls to the prepend_*() functions defined in chuck_absyn.cpp need
to be replaced with append_*() calls, which do not currently exist.
An afternoon's work, I think, if only I had an afternoon today. ;p
It's just 8 functions that all need to be changed in the same way, and
a find/replace on chuck.y, in case anyone else is up for it!
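The prepend-vs-append trade-off on a singly-linked list is simple to
sketch. This is a hypothetical illustration of the pattern, not the real
chuck_absyn.cpp code -- the actual node types and function signatures
there differ:

```cpp
#include <cassert>
#include <string>

// hypothetical AST list node; singly linked, as described above
struct Stmt_
{
    std::string code;
    Stmt_ * next;   // only a "next" pointer, no "prev"
};

// right-recursive rules reduce innermost-first, handing you the rest of
// the list, so each new node is prepended in O(1) -- which is why right
// recursion pairs so naturally with singly-linked lists
Stmt_ * prepend_stmt( std::string code, Stmt_ * list )
{
    return new Stmt_{ code, list };
}

// left-recursive rules hand you the list built so far, so you need an
// append -- still doable on a singly-linked list, at the cost of a walk
// to the tail (or an extra tail pointer kept alongside the head)
Stmt_ * append_stmt( Stmt_ * list, std::string code )
{
    Stmt_ * node = new Stmt_{ code, nullptr };
    if( !list ) return node;
    Stmt_ * tail = list;
    while( tail->next ) tail = tail->next;
    tail->next = node;
    return list;
}
```

The O(n) tail walk makes append_stmt() quadratic over a whole program,
but that's a one-time cost at parse time and far cheaper than blowing
the parse stack; a cached tail pointer would make it O(1) as well.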
P.S. There's also a hack mentioned on the forum thread that would
allow you to rebuild your ChucK with a slightly higher tolerance for
large files. If you're very near the limit, that might get you through
this tough time.
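I haven't re-read the thread, but in a bison-generated parser the usual
knob for that kind of hack is the parse-stack ceiling, YYMAXDEPTH
(default 10000) -- defined larger before rebuilding, e.g.:

```cpp
/* assumption: the forum hack raises bison's parse-stack limit this way;
   the exact macro and value used in the thread may differ */
#define YYMAXDEPTH 100000
```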