[chuck-users] Seqs, Design and Usability (Slightly OT)

Kassen signal.automatique at gmail.com
Sat Mar 10 14:45:02 EST 2007


Veli-Pekka Tätilä  wrote:
>
> Hi,
> As you mentioned off-list, some of the accessibility stuff is really OT
> here.
> So I'll skip and snip more than usual.


It all kinda interacts. I imagine there must be more people who want to put
ChucK in clubs.



> Again the heuristics here are that the
> right-click menu is a power-user feature, you should not depend on people
> using it and so the functionality should be mirrored somewhere.


I think you can seriously wonder how suitable sub-menus are for instruments
at all. Various synths -including ChucK, if we treat programming and using
the HID/MIDI/whatever as separate activities- use two different interfaces
for configuring and playing.

It gets trickier with pro-level keyboards and grooveboxes and so on. I
don't think sub-menus are such a good idea there. Generally I think many
instruments would be better with fewer but more carefully planned-out
features. The piano has already lasted centuries; I don't think the MC-303 will.


Accessibility and playability have a huge overlap. You might find this PDF
worthwhile; it's by ChucK co-author Perry Cook and on basically this topic:
http://soundlab.cs.princeton.edu/publications/prc_chi2001.pdf

> You mean a tablet PC? Yeah, I can believe that.


Actually I meant one of those graphic designer tablets that replace a mouse
with a pen and use absolute (not relative) controls. I had hoped the absolute
desk-to-screen mapping would make me faster thanks to muscle memory, but
instead auto-scaling resolutions made it a very unpleasant affair in Ableton.


> And how do you drag and drop
> with a pen interface anyway?


Exactly like with a mouse, or with a chopstick on a dinner plate.
 <grin>

That's not the hard bit, assuming you are used to a mouse anyway.

> I'm sure the screen reader mouse emulation does
> not like it, either. It can drag and drop in software, but only if the
> source and destination show up as different focusable objects to the
> reader.
> Needless to say, dragging an arbitrary amount away from the center of a
> knob
> cannot be emulated with current reader software.


That would probably mean the same effect would be seen when drawing
envelopes.


> Yes, especially as mine has got five buttons. You could make it modulate
> different things depending on which button or combo of buttons is pressed.


There are some examples here:
http://smelt.cs.princeton.edu/
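As a very rough sketch of the button idea, here's what reading mouse buttons with ChucK's Hid looks like; the device number, frequencies, and button-to-target mapping are all my own assumptions, not taken from the smelt examples:

```chuck
// Sketch: map different mouse buttons to different modulation targets.
Hid hi;
HidMsg msg;

// open mouse device 0 (assumption: first mouse on the system)
if( !hi.openMouse( 0 ) ) me.exit();

SinOsc s => dac;
0.3 => s.gain;

while( true )
{
    hi => now;                      // wait for a HID event
    while( hi.recv( msg ) )
    {
        if( msg.isButtonDown() )
        {
            // msg.which identifies the button; mapping is arbitrary here
            if( msg.which == 0 ) 330 => s.freq;       // left button
            else if( msg.which == 1 ) 440 => s.freq;  // right button
            else 550 => s.freq;                       // any other button
        }
    }
}
```

With five buttons you could just as well switch filter cutoff, envelope rate, and so on per button, or check combinations by tracking which buttons are currently held.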


[Quest for Fame V-pick]
> The game is great fun but the controller is not. It is plugged in the
> serial
> port and only transmits simple pulses when the pick is used. So I'd say it
> is about as good as a single keydown on the keyboard for control purposes.


Oh. Right. That's disappointing. I had hoped it would be based on tilt/
inertia sensors.



[ChucK v.s. modulars]
> Yes, all timing is auto-synchronized to the sample the way I understand
> it.
> Still, ChucK does require rethinking common modular synth designs, as you
> say, in terms of how to implement them.


Exactly. Sticking to "modular tactics" will lead to ugly code and a lot of
CPU overhead.



> > back and forth between audio and control signals tends to be CPU
> > intensive.
> is there a difference? I didn't know that. I was kinda hoping I could use
> any audio signal for control or vice versa.


Well, you can if you need to.


> Which raises questions like, how
> do you convert between the two and what's the event rate?


You can convert control signals (the code you write yourself) to audio
using either Step or Impulse; both are UGens you might want to look up.
You can get audio-rate values to base
conclusions/events/modulations/whatever on by polling myugen.last(). The
event rate is whatever you make it; it equates directly to the amount of
time you advance each loop iteration.
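A rough sketch of both directions; the names, the random control value, and the 10 ms rate are my own choices, not anything canonical:

```chuck
Step s => dac;              // Step turns plain numbers into an audio signal
SinOsc osc => blackhole;    // an audio-rate source we only want to read

while( true )
{
    // control -> audio: write our own control value into the graph
    Math.random2f( -0.1, 0.1 ) => s.next;

    // audio -> control: read the most recent sample osc produced
    osc.last() => float x;

    // the "event rate" is simply how far we advance time per iteration
    10::ms => now;
}
```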

However, to detect things like edges and zero crossings you'll need to
advance time by just one sample each iteration, and loops like that eat a
lot of CPU.
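For instance, a zero-crossing detector has to look like this; note the 1::samp advance, which is exactly what makes it expensive:

```chuck
SinOsc osc => blackhole;    // signal to watch (any UGen would do)
0.0 => float prev;

while( true )
{
    osc.last() => float cur;
    // an upward zero crossing: previous sample negative, current one not
    if( prev < 0 && cur >= 0 )
        <<< "upward zero crossing at", now >>>;
    cur => prev;
    1::samp => now;         // one sample per iteration: CPU-hungry
}
```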


> I think the
> manual, though it has example code, lacks formalism in the style of say
> the
> K&R C book, of which I'm a big fan.


Fair, but our manual costs less money than that book (in fact it's free) and
hence it has a volunteer maintainer who's currently busy with more urgent
stuff. To assist in that I started a page on the wiki to list things that are
unclear to people, or outdated, missing, or downright likely to cause nuclear
melt-down or family dramas.

If there's something identifiable that you find confusing or unclear then
you are more than welcome to join in; anybody can join the wiki and add
pages.


> I see, I'll try something like that out. As with so many other coding
> things,
> the best way to learn is to write and make mistakes, <smile>. But I've
> noticed that as a learner I usually like to start with a good book. In
> this
> case, the manual is far from complete, so looking at the examples is
> something I'll have to do eventually.


A book would be nice, yes, but that's far off. I think the closest thing yet
is Ge's academic papers, which basically attempt to convince the reader that
running ChucK is a good idea at all.

C does have books, but C is quite old, proven to be useful, not widely
regarded as likely to explode, nor is it -currently- a very experimental sort
of thing. Perhaps most importantly, when you confess to using C people don't
look at you trying to determine whether you are mad. This is all quite
unlike ChucK <grin>.


> Ah, good info, I think this will get me going, thanks. I read about class
> files but didn't realize I could use the same add operation to include
> code.


You can. Adding a file that has a public class to the VM will instantly make
that class available to other ChucK programs. In fact you could always
start ChucK using:

chuck --loop my_class1.ck my_class2.ck my_class3.ck

perhaps through a batch file, and all your classes would always be there for
you to use.
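For example, a hypothetical my_class1.ck (the class name and its contents are made up for illustration) might look like:

```chuck
// my_class1.ck: once this file is added to the VM, every other
// shred can instantiate Pinger directly.
public class Pinger
{
    fun void ping( dur length )
    {
        SinOsc s => dac;
        0.3 => s.gain;
        length => now;   // let it sound for the requested duration
        s =< dac;        // then disconnect
    }
}
```

After starting the VM with that file, any later program can simply write `Pinger p; p.ping( 100::ms );` without sporking or adding anything else.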


> Yes, and you might also want to add the ability not to quantize user
> input.
> The Alesis SR-16, whose step editing I already dissed, actually is one of
> the few drum machines which support both quantized and unquantized input.


That's a thing for the future. Considering the latency that the Windows
drivers give, I'm quite happy to have implemented realtime input
quantisation... This is still good fun; I can now hammer buttons randomly
and the output will still be in perfect sync.
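The core trick can be sketched in a few lines; the 120 BPM tempo, the 16th-note grid, and the keyboard device number are all my own assumptions:

```chuck
// Real-time input quantisation sketch: delay each keypress to the
// next grid point so random hammering still comes out in sync.
120.0 => float bpm;
(60.0 / bpm)::second / 4 => dur grid;    // 16th-note grid at 120 BPM

Hid hi;
HidMsg msg;
if( !hi.openKeyboard( 0 ) ) me.exit();   // assumption: keyboard device 0

while( true )
{
    hi => now;                           // wait for a key event
    while( hi.recv( msg ) )
    {
        if( msg.isButtonDown() )
        {
            grid - (now % grid) => now;  // snap to the next grid point
            SinOsc s => dac;             // then fire a short tone
            440 => s.freq;
            50::ms => now;
            s =< dac;
        }
    }
}
```

The `grid - (now % grid) => now;` line is the whole quantiser: it advances time by exactly the remainder until the next grid boundary before triggering.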



>
> Yes, that could work. Another thing I'm going to try at some point is tap
> tempo and then the ability to loop a portion of a song when you hit a key,
> such that you can determine the unit in looping, as musical time.


Go for it!
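The tap-tempo half could start from something like this; the keyboard device number and the two-second cutoff for ignoring long pauses are my assumptions:

```chuck
// Tap-tempo sketch: the interval between two consecutive taps
// becomes the beat length.
Hid hi;
HidMsg msg;
if( !hi.openKeyboard( 0 ) ) me.exit();   // assumption: keyboard device 0

now => time last;

while( true )
{
    hi => now;
    while( hi.recv( msg ) )
    {
        if( msg.isButtonDown() )
        {
            now - last => dur beat;      // time since the previous tap
            now => last;
            if( beat < 2::second )       // ignore long pauses between taps
                <<< "tempo:", 60::second / beat, "BPM" >>>;
        }
    }
}
```

Once you have `beat`, looping a portion of a song in musical units is mostly a matter of using multiples of it as your loop length.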


> I've
> noticed that when I use the software-based volume control buttons on my
> laptop, Winamp playback stutters in a cool and half-musical way. If only I
> could sync that and make the process more controllable ...


Most likely you can control it via the "keyboard repeat rate" in your
Windows keyboard settings.



> > one of my prime demands is that it needs to be usable in the dark,
> Which is interesting. One academic paper I read recently on accessibility
> said that in addition to aiding, say, people with no sight, accessibility
> also
> helps in special circumstances like smoky or dark environments. So here we
> are.


That's exactly what I was aiming at. Another thing I learned in an interface
design class was that people can learn a lot, interface-wise, but as soon as
they panic this breaks down. The example given was aircraft design; that's a
good example of a highly complicated interface that needs to stay usable
under high stress....


Kas.

