[chuck-users] UI Idea and Accessibility, HID API and Bug?
signal.automatique at gmail.com
Tue Mar 6 14:38:15 EST 2007
Hi, Veli-Pekka! Wonderfully interesting points!
> [UI for synth panels]
> Cool, I've only seen references to that Mini Audicle thing and tried out
> some binary on Windows. It looks to me like a ChucK IDE, though I've yet to
> figure out how to run scripts in it, <embarrassed emoticon>.
Ah, yes. I'm on Win and Linux as well, but my eyes still work (though they
are deteriorating...), so I saw the screenshot of the Mac version. This
screenshot shows sliders and LEDs and so on, and the idea is of course that
this will be extended. Making those screen-readable now seems like a good
idea, especially since I imagine that in the long run more visually impaired
people will try programming for their electronic music needs.
> That's right. Although I was thinking the panel objects could maybe export
> their serialized state. That info could then be stored in a file directly.
> Do ChucK objects have built-in serialization capabilities as in Java? Or
> maybe a class to inherit from and a bunch of callbacks for preset
> management, kind of like in VST plugs.
Right, yes. I think that here Spencer's point of view is close to my own. As
I understood it, Spencer wants to integrate such things with ChucK itself,
not make them an exception. I think this would make it more versatile
and powerful in the long run. For novice users it might be harder to use at
first, but that trade-off might be worth it. I think we can assume that
anyone who is using it will already be familiar with implementing
functionality in ChucK. The same goes for MIDI learn. You are quite right
that MIDI learn needs a way to indicate which parameter is meant, and right
now (without a graphical interface) that would be a rather primitive affair.
Once the graphics get here it would take a night of clever thinking,
but it is very possible.
This might be a good topic for a future example to be distributed with the
ChucK examples.
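To make the MIDI-learn idea concrete, here is a minimal sketch of how it could be done with ChucK's MidiIn class even without graphics: wait for the user to wiggle a controller, remember its CC number, then map that CC to a synth parameter. The device number (0) and the choice of parameter (a SinOsc's gain) are assumptions for illustration.

```chuck
// minimal MIDI-learn sketch (assumes a MIDI device on port 0)
MidiIn min;
MidiMsg msg;
if( !min.open( 0 ) ) me.exit();

// "learn" phase: wait for a control change, remember its CC number
-1 => int learnedCC;
while( learnedCC < 0 )
{
    min => now;                              // wait for incoming MIDI
    while( min.recv( msg ) )
        if( ( msg.data1 & 0xF0 ) == 0xB0 )   // control change message?
            msg.data2 => learnedCC;
}
<<< "learned CC number:", learnedCC >>>;

// "play" phase: map the learned CC to a synth parameter
SinOsc s => dac;
while( true )
{
    min => now;
    while( min.recv( msg ) )
        if( ( msg.data1 & 0xF0 ) == 0xB0 && msg.data2 == learnedCC )
            msg.data3 / 127.0 => s.gain;     // 0..127 -> 0..1
}
```

With a screen reader, the "learn" phase could announce the captured CC number via the console print, which already reduces cleanly to text.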
> Not necessarily. I didn't mean the GUI would be customizable in the app
> itself. Instead, it would use the colors and fonts you had picked for the
> OS. I'm a low-vision user myself, so I've chosen Windows colors that are
> sufficiently high contrast, but virtually no VST plug ever respects those,
> for example.
Ah! I think I myself run one of the Windows settings meant for disabled
people; I use one of the high-contrast B&W ones because I find them easier
on the eye and clearer.
> Basically yes. My rather technical definition is that they are apps that
> programmatically reduce the graphical user interface to text, which is
> rendered as speech and/or Braille. Other things they often do include
> following focus changes, heuristically guessing which pieces of text are
> labels and augmenting the keyboard interface by providing mouse-emulation
> and virtual focus for apps that either lack a keyboard interface or have a
> broken tab-order.
Got it. This is the sort of thing that explains why we need alt tags for
images, why Flash is of dubious value, and so on.
> It's not so much the interfaces as such, but the fact that they are custom
> controls screen readers are not able to read programmatically. They also
> lack keyboard access, unlike say your average Office app. I'm actually a fan
> of GUIs myself and wouldn't want to go back to a command prompt, though I
> like programming. It's just GUIs that don't cleanly reduce to text that are
> the problem. Usually direct manipulation with the mouse, graphs and such.
> A graphically editable envelope would be a good example, though again
> there's no reason why the points could not be managed via a list view and
> some edit fields.
Absolutely. Now that I think of it, it's quite remarkable that nobody (that I
know of) has pointed out the interesting benefits that livecoding, and
generally using programming as a musical instrument, might have for the
visually impaired.
> >synth programs for people with perfect sight as well; it seems to
> >from the sound.
> Not to mention all the people with high res TFT displays, glasses and/or
Oh, yes, and the need for split-second decisions while in chaotic
environments like when performing in a nightclub.
> Yup, I've noticed. Although I think I found a bug in it. In:
> The keystrokes get delivered to ChucK even when that window has not gotten
> the focus. I find that counter-intuitive and a potential problem if you
> multi-task several keyboard instruments in different consoles. Sure, it is
> expected behavior for MIDI, but not for the keyboard in Windows apps at
> least. Besides, such an implementation likely means system-wide Windows
> keyboard hooks may need to be used, which can be a resource hog at times.
> Especially as the screen reader already has its hooks in the chain.
Personally I like this behavior. Right now I work with ChucK and the Nord
Modular, and I find it very convenient to be able to play ChucK with the
keyboard while the Nord editor is in focus and addressed by the mouse. If
this isn't what you want, you might want to try the non-HID keyboard
interface; that one doesn't do this and demands focus. You would lose key-up
and key-down events, though.
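For reference, here is a sketch of both approaches side by side, using the current class names (Hid/HidMsg for the global HID keyboard, KBHit for the focused console keyboard); the device number 0 is an assumption, and older ChucK releases spelled the HID class differently.

```chuck
// global HID keyboard: delivers key-down AND key-up, even without focus
Hid hi;
HidMsg msg;
if( !hi.openKeyboard( 0 ) ) me.exit();

fun void hidKeys()
{
    while( true )
    {
        hi => now;                        // wait for a HID event
        while( hi.recv( msg ) )
        {
            if( msg.isButtonDown() )      <<< "down:", msg.which >>>;
            else if( msg.isButtonUp() )   <<< "up:", msg.which >>>;
        }
    }
}
spork ~ hidKeys();

// console keyboard via KBHit: demands focus, gives characters only
KBHit kb;
while( true )
{
    kb => now;                            // wait for a keystroke
    while( kb.more() )
        <<< "typed:", kb.getchar() >>>;
}
```

The trade-off is exactly as described above: KBHit behaves like a normal Windows app (focus required, no key-up), while the HID route gives full up/down information globally.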
> How good is the HID API by the way? I've got a gamepad with a mini joystick,
> a v axis, the throttle, POV switch and 6 buttons and have always thought it
> would be cool if I got it to send out MIDI. Also this laptop has a
> touchpad that can send pressure in addition to the xy coords. So I guess I
> could use it as a 3D mini Kaoss pad, if you will.
I have some good news for you. As I mentioned, my eyes are fine for normal
computer usage, but I got tired of staring at the screen all the time in a
performance context, so I'm developing my own instrument (a house-style
sequencer) which is built around two HID gaming devices and is playable
without using the screen at all. I still use screen prints to know what BPM
I'm currently at and how much shuffle I'm using, but only to help me make
"presets". This is very feasible, and I'd like to encourage you to try to do
the same. A good pianist can play blindfolded, so why not a techno producer?
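Getting the gamepad to send MIDI is quite direct in ChucK: open the joystick with Hid, open a MidiOut port, and translate axis motion to control changes and buttons to notes. The device/port numbers and the CC/note mappings below are assumptions for illustration, not a fixed scheme.

```chuck
// sketch: forward gamepad axes and buttons as MIDI
Hid hi;
HidMsg msg;
MidiOut mout;
MidiMsg m;
if( !hi.openJoystick( 0 ) ) me.exit();   // first joystick/gamepad
if( !mout.open( 0 ) ) me.exit();         // first MIDI output port

while( true )
{
    hi => now;                           // wait for a HID event
    while( hi.recv( msg ) )
    {
        if( msg.isAxisMotion() )
        {
            0xB0 => m.data1;             // control change, channel 1
            msg.which => m.data2;        // one CC number per axis
            ( ( msg.axisPosition + 1.0 ) * 63.5 ) $ int => m.data3; // -1..1 -> 0..127
            mout.send( m );
        }
        else if( msg.isButtonDown() )
        {
            0x90 => m.data1;             // note-on, channel 1
            60 + msg.which => m.data2;   // button number -> note
            127 => m.data3;
            mout.send( m );
        }
        else if( msg.isButtonUp() )
        {
            0x80 => m.data1;             // matching note-off
            60 + msg.which => m.data2;
            0 => m.data3;
            mout.send( m );
        }
    }
}
```

The pressure-sensitive touchpad would work the same way, provided the OS exposes the pressure value as a HID axis.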
One thing that I've become very interested in is interface sonification for
musical programs. I'm in the very early stages of testing that idea. My aim
here is to stylise the interface sonification in such a way that it blends
with the music, using some quantisation where necessary. You might want to
experiment in that direction as well. Once ASIO support gets here this will
feel more natural; realtime interface sonification depends heavily on low
latency if you want to make it pleasing. I found that I tend to input
commands rhythmically after listening to a beat for a while, regardless of
whether those commands have anything to do with the beat directly.
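The quantisation part can be sketched very simply in ChucK: a flag is raised whenever a UI event arrives, and a separate loop advancing on a 16th-note grid plays the feedback sound on the next grid point. The 130 BPM tempo, the use of keypresses as the "UI event", and the resonant-click sound are all assumptions for illustration.

```chuck
// sketch: quantise UI feedback clicks to a 16th-note grid
130.0 => float bpm;                        // assumed house tempo
( 60.0 / bpm / 4.0 )::second => dur sixteenth;
0 => int pending;                          // set on UI event, cleared on the grid

// input shred: any keypress counts as a "UI event"
fun void input()
{
    KBHit kb;
    while( true )
    {
        kb => now;
        while( kb.more() ) { kb.getchar(); 1 => pending; }
    }
}
spork ~ input();

// feedback sound: a short resonant click
Impulse imp => ResonZ rez => dac;
1000.0 => rez.freq;
30.0 => rez.Q;

while( true )
{
    sixteenth => now;                      // advance on the grid
    if( pending )
    {
        5.0 => imp.next;                   // fire the click exactly on the grid
        0 => pending;
    }
}
```

Delaying feedback to the grid costs a little immediacy, which is exactly why low-latency audio matters here: any extra buffering on top of the deliberate quantisation quickly makes the feedback feel detached.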