On 24 Jul 2009, at 01:50, Kassen wrote:
If you have a longer composition that uses a lot of previously recorded instruments that you'd like to mix to a single file using ChucK, you may need an amount of memory that starts to approach the order of magnitude that causes swapping. That's the only case I can think of right now, but a) that's likely not the sort of case that will trigger lots of GC on that data, and b) I think we are already reclaiming space for samples that are no longer used, and so far nobody has complained that this causes huge breakdowns in the realtime paradigm. In fact, I can't remember a single complaint about this at all. Loading huge samples from the HD while using a small latency is likely a much larger issue.
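For illustration, a minimal sketch of such a mix-down might look like this (file names are placeholders; each SndBuf loads its whole file into memory, which is where the memory use comes from):

  // record everything that reaches the dac into a single file
  dac => WvOut mixOut => blackhole;
  "mix.wav" => mixOut.wavFilename;

  // each previously recorded instrument is loaded fully into memory
  SndBuf part1 => dac;
  SndBuf part2 => dac;
  "instrument1.wav" => part1.read;
  "instrument2.wav" => part2.read;

  // play both from the start
  0 => part1.pos;
  0 => part2.pos;

  // let time pass for the length of the longer part, then close the file
  Math.max(part1.length()/second, part2.length()/second)::second => now;
  mixOut.closeFile();

Running it with chuck --silent renders the mix without the real-time audio constraint.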
The GC question might be irrelevant, because music does not require that much computing power relative to graphics, and standard malloc/free is perhaps only a hundred times slower than a fast GC, or even less. By Moore's law (which I checked against Macs), transistor count doubles about every second year, and CPU frequency about every third; combined (using multicore), that gives five doublings in six years. On the other hand, implementing an advanced GC takes up a lot of programming time.

A slow HD might be countered by using an SSD instead. They are quite expensive right now, but prices will surely drop. HDs are buffered, and there are hybrid HD-SSD drives (by Samsung).

Hans
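P.S. Spelling out the doubling arithmetic: six years give 6/2 = 3 doublings in transistor count and 6/3 = 2 doublings in clock frequency, so the combined factor is 2^(3+2) = 2^5 = 32x.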