[chuck-users] Bug; Delay floating point denormalisation

Kassen signal.automatique at gmail.com
Sun Aug 5 18:06:42 EDT 2007

Fellow Chuckists,

"Delay" definately has a floatingpoint denormalisation issue (well, either
Delay or Envelope).

My feedback loop went something like this:

soundsource => Delay del => dac;
del => Envelope feedback => del;
.9 => feedback.gain;

In this form it has to loop for quite a while before the issue shows up in
CPU usage, and it will probably be most evident in situations where other
UGens are also using floating-point operations. A smaller value than .9 for
the gain should speed up the phenomenon.

In my case my "soundsource" had filled the Delay with sound, then went
quiet, after some time the CPU meter rose fairly sharply, because of the
controll using HID that I had over the feedback envelope I could make the
CPU usage go down again with one switch, meaning I'm absolutely positive it
was this loop that was to blame.

This is on Windows, using the unofficial ASIO build that Philippe posted;
delay time was around one second or so, neither of which should matter that
much.

