[chuck-users] UI Idea, Uses, Accessibility, HID and MIDI

Veli-Pekka Tätilä vtatila at mail.student.oulu.fi
Wed Mar 7 08:18:48 EST 2007


Hi Kas and Spencer,
As I wanted to comment on both of your replies at once, I'm switching quoting 
style in mid-thread. If this new one is problematic, let me know in a reply 
and I'll change back. At least it makes it clear with screen readers who 
is saying what, without having to parse "greater than x 3" in my head, 
<smile>. Feel free to snip heavily.

V, S, K = Veli-Pekka, Spencer, Kas

[UI for synth panels]
[mini-audicle]
VV: I've yet to figure out how to run scripts in it, <embarrassed emoticon>.
S: It's easy! (hopefully...)  You just have to type or paste some ChucK 
code into the editor window, start the virtual machine (Alt .), and then 
add the shred with Alt + .  Once the virtual machine is active,
V: Yes, that works, thanks. Which reminds me of a problem I found with the 
organ; more about it in the HID section. The step I failed to do here was to 
start the machine: I thought one could add a shred to it, which would 
implicitly start it, or something like that. I actually tried the Audicle 
before the manual, so that explains things, too.

S: you can also use the buttons at the top of the document window to add, 
remove, and replace shreds.
V: Ah, now that you mentioned them, I used virtual focus and found them. The 
problem here is that they are toolbar buttons, and in Windows, toolbars are 
not in the tab order. From an accessibility point of view, buttons near the 
Start Virtual Machine button that you can Tab or Shift+Tab to would be 
better. But it is good enough to also mirror the choices in the menus, so 
there's at least one accessible and discoverable route.

K: screenshot of the Mac version. This screenshot shows sliders and LEDs 
and so on, and the idea is of course that this will be extended.
V: Ah, I see; sounds good to me. A classic accessibility question: does that 
panel UI have the concept of keyboard focus, tab order, and the ability to 
interact with the controls using the same hotkeys for knobs that Mac and 
Windows use for sliders? These are the most frequently overlooked points in 
synth UI design, in terms of accessibility, and they make life hard if your 
primary input medium is the keyboard and output comes via synthetic speech.

K: especially since I imagine that in the long run more visually impaired 
people will try programming for their electronic music needs
V: That's right, and also because sighted folks might like precise keyboard 
control from time to time. Precise adjustment of values, and the ability to 
type in values using numbers or modal dialogs, would rock. I've seen a 
zillion synth UIs in which I'd like to type in an exact cutoff value and 
cannot, though I have 102 keys at my disposal.

VV: was thinking the panel objects could maybe export their serialized 
state. That info could then be stored in a file directly.
K: As I understood it, Spencer wants to integrate such things with ChucK 
itself, not make it into an exception. I think that this would make it more 
versatile.
V: Most definitely. If ChucK had serialization across the board, you could 
also store modules efficiently in files and load them directly. Maybe you 
could even send objects like synths over the network.

K: For novice users it might be harder to use at first, but that trade-off 
might be worth it.
V: Yeah, although I like how much ChucK resembles C-like languages, and Java 
in particular. That made it real easy to learn. I wish the manual had a 
quick conversion guide, too. Regarding serialization, it need not be that 
hard. Many languages have one function to turn an object into binary and 
another to get it back. In scripting languages, that means generating a 
string representation of the data structure and evaluating it on the fly, in 
simple cases.
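
To sketch what I mean in ChucK terms, assuming a ChucK with a FileIO class 
(I don't seem to have one in 1.2.0.7, so treat this as hypothetical): dump 
the state as ChucK source, then "deserialize" by evaluating the generated 
file as a shred. The file name and stored cutoff are made up:

// serialize-as-code sketch; assumes FileIO exists, and the
// file name and stored state are hypothetical
FileIO fout;
fout.open(me.dir() + "/state.ck", FileIO.WRITE);
// write ChucK source that recreates the state
fout <= "440.0 => float cutoff;" <= IO.newline();
fout <= "<<< \"restored cutoff:\", cutoff >>>;" <= IO.newline();
fout.close();
// "deserialize" by evaluating the generated code on the fly
Machine.add(me.dir() + "/state.ck");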

K: quite right that MIDI learn needs a way to indicate what parameter is 
meant, and right now - without a graphical interface - that would be a 
rather primitive affair.
V: Yes, but it could be done. Most old games would have you input controls in 
exactly the same order every time, with no undo or step-back facility. That 
could work for simple MIDI instruments, though it is, of course, very clumsy.
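
To sketch the old-game approach in ChucK, assuming MIDI device 0 and a 
made-up parameter list: each incoming control change binds the next 
parameter in a fixed order, with no undo:

// fixed-order "MIDI learn" sketch; device number and parameter
// names are made up for the example
MidiIn min;
MidiMsg msg;
if (!min.open(0)) me.exit();
["cutoff", "resonance", "volume"] @=> string params[];
int cc[params.cap()]; // learned controller numbers
0 => int next;
while (next < params.cap()) {
    min => now; // wait for MIDI input
    while (next < params.cap() && min.recv(msg)) {
        if ((msg.data1 & 0xF0) == 0xB0) { // a control change
            msg.data2 => cc[next]; // remember the CC number
            <<< params[next], "learned as CC", cc[next] >>>;
            next++;
        }
    }
}

Turning one knob quickly would claim several parameters in a row, which is 
exactly the kind of clumsiness I mean.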

[accessibility]
VV: didn't mean the GUI would be customizable in the app <snip> it would use 
the colors and fonts you had picked for your OS.
K: think that I myself run one of the Windows settings meant for disabled 
people; I use one of the high-contrast B&W ones because I find them easier 
on the eye and clearer.
V: Well, here we go again. And one advantage of TrueType fonts, compared to 
bitmaps burned into the synth panel, is that the text size is changeable for 
high-res or low-vision users, and localizable, too. It's a common observation 
that sometimes accessibility can improve usability, too, especially for 
keyboard power-users. At other times, the two are in conflict, e.g. whether 
to skip over grayed-out menu entries from the keyboard.

[screen readers]
V: apps that programmatically reduce the graphical user interface to text 
rendered as speech and/or Braille. <snip> following focus changes, 
heuristically guessing which pieces of text are labels, and augmenting the 
keyboard interface <snip>
K: Got it. This is the sort of thing that explains why we need alt tags for 
images, why Flash is of dubious value, and so on.
V: Yes, although Flash could be made accessible; it's just that people lack 
the interest and/or knowledge. It's sad that people think of the Web when 
they think accessibility, whereas it is desktop apps your average Joe or 
Jane is using most of the time. Often their accessibility is much worse. 
'Nough said.

[uses for ChucK]
K: it's quite remarkable that nobody - that I know of - has pointed out the 
interesting benefits that livecoding, and generally using programming as a 
musical instrument, might have for the visually impaired.
V: It's funny you mention this. I didn't think of live performance when I 
looked into ChucK, although I imagine it could be of great value in that, 
too. As I input virtually everything via MIDI in my music, and record it 
live in a sequencer, the live aspect of ChucK wasn't my cup of tea, though. 
What I'm looking for is, eventually, a more accessible equivalent to Reaktor, 
whose UI has been going downhill for years in terms of accessibility. Namely, 
the ability to patch modular synths together and process MIDI events 
either in real time or off-line in MIDI files, like you would in the 
Cakewalk Application Language. Another interest is as a simple testbench for 
audio- and MIDI-mangling ideas, comparable to VST but much easier and faster 
to work with.

I've noticed that graphical programming, as in Reaktor, has a limit after 
which it would be faster and more natural to express those ideas in code, 
particularly if you have to use magnification and speech to begin with. 
Laying out math formulae as a series of modules is one obvious example, and 
another is counting stuff. To count a 16-bit integer quantity in Reaktor, I 
would have to worry about the range of events and chain two binary counters 
together to get the precision. Whereas in ChucK I can say:
0 => int myCounter; // and that's that
myCounter++; // to increment...
myCounter % 65536 => myCounter; // ...and modulo to wrap at 16 bits

Similarly, in computer science, as I have to view graphs one node at a time 
magnified, it is much easier to render the stuff on paper (laptop) or in my 
head as matrices, despite my being much more a programmer than a 
mathematician of any kind.

Also, in Reaktor I have to spend time arranging modules on screen, which has 
absolutely nothing to do with how the synth operates or even looks to the 
user. Programming also gives you dynamic control over virtually everything. 
Say I want to build an n-to-1 audio switch: no problem, as the number of 
mixers instantiated can be parameterized. I have never used such a tack, but 
how about a synth which directly alters the patching and set of modules in 
it based on previous input and its current state, a complex Mealy state 
machine if you will.
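
To make the switch concrete, a little sketch: n oscillators feed one Gain, 
and a function picks which input is audible. The oscillator count and 
frequencies are arbitrary:

// an n-to-1 audio switch; n is a parameter, not a constant
4 => int n;
SinOsc src[n];
Gain mix => dac;
for (0 => int i; i < n; i++) {
    src[i] => mix;
    0.0 => src[i].gain; // all inputs start muted
    220.0 * (i + 1) => src[i].freq;
}

// select input k by muting the others
fun void select(int k) {
    for (0 => int i; i < n; i++) {
        if (i == k) 0.3 => src[i].gain;
        else 0.0 => src[i].gain;
    }
}

// cycle through the inputs once per second
0 => int k;
while (true) {
    select(k);
    (k + 1) % n => k;
    1::second => now;
}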

K: One thing that I've become very interested in is interface sonification 
for musical programs.
V: I wonder if that's been studied. As part of my uni graduation work, 
investigating issues of current screen readers from the user's point of 
view, which is just in its infancy, I've read plenty of stuff on 
accessibility. I do know auditory icons have been researched.

K: so I'm developing my own instrument (a house-style sequencer) which is 
built around two HID gaming devices and is playable without using the screen 
at all.
V: Quite nice, actually. But where do you store the sequences, and in what 
kind of a format? The Japanese Music Macro Language (MML) would be 
especially nice for expressing modulation and monophonic parts in 
ASCIIbetical notes. I mean, I've used it to compose mini tunes for the 8-bit 
Nintendo on the PC. I still lack an official spec, though I've written a 
simple MML parser for myself in Perl.

K: found that I tend to input commands rhythmically after listening to a 
beat for a while, regardless of whether those commands have anything to do 
with the beat directly.
V: But what kind of params would you adjust via timing? Tap tempo comes very 
naturally, but I don't know about the rest. Of course, switching patterns 
needs to be done in time, too.
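
Tap tempo itself is pleasantly short to express in ChucK. A sketch, assuming 
the keyboard is HID device 0 (and that your ChucK spells the class Hid; 
older releases may differ):

// tap-tempo sketch: derive BPM from the gap between key-downs
Hid hi;
HidMsg msg;
if (!hi.openKeyboard(0)) me.exit();
now => time last;
while (true) {
    hi => now; // wait for HID activity
    while (hi.recv(msg)) {
        if (msg.isButtonDown()) {
            now - last => dur gap;
            now => last;
            if (gap < 2::second) // ignore stale first taps
                <<< "tempo:", 60::second / gap, "BPM" >>>;
        }
    }
}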

[HID API and MIDI]
VV: think I found a bug in it. In:
.\chuck-1.2.0.7-exe\examples\hid\keyboard-organ.ck
The keystrokes get delivered to ChucK even when that window does not have 
focus. I find that counter-intuitive and a potential problem if you 
multi-task several keyboard instruments <snip>
K: Personally I like this. Right now I work with ChucK and the Nord Modular, 
and I find it very convenient to be able to play ChucK with the keyboard
V: I see; after choosing file/send message in the menu, I realized it all 
depends on the context. Most certainly, if you need to work with several 
music apps at the same time, that's handy. But suppose, if that's 
possible[?], I'd like to test my keyboard instrument, write some more code 
for it, and then reload the instrument without closing the ChucK VM. In that 
case I wouldn't want to hear blips and bleeps while I'm coding. But one 
could remove the shred before coding, admittedly.

S: This is actually a feature, not a bug.  The idea is to abstract the
keyboard as just some device with buttons, <snip> it simply uses
DirectInput.
V: Ah, I see, and I agree that in this case, as there's an alternative API, 
the choice is quite sound actually. Of course, it would be better to have a 
uniform API and a boolean flag to decide how to handle the focus. 
DirectInput is good; I do know those multi-axis, multi-button gamepads are 
well exposed via it.

V: A funny thing about the organ: I've tried playing very fast in Windows, 
and it seems to me as though fast input is auto-quantized to some pretty 
coarse note value. I suppose this is a limitation of the keyboard scanning, 
right?

VV: this laptop has a Synaptics touchpad that can send pressure in 
addition to the xy coords. So I guess one could use it as a 3D mini Kaoss 
pad if you will.
S: ChucK probably won't be able to read the pressure of your touchpad at 
this point.
V: No prob, I really didn't expect it to. I've only looked into this 
minimally, and it seems as though I'll need to download an SDK to begin 
with, to even use the pressure in my apps. It is no wonder the Windows mouse 
API doesn't support a third axis. Here's an odd idea: using pressure to 
navigate the window z-order, i.e. to switch between the children in a 
top-level window, like an MDI app.

[MIDI]
V: Lastly, some feedback about MIDI:
I mentioned MIDI processing as one potential use and find the current API a 
bit raw. It would be great if I could transparently handle either off-line 
MIDI data in a file or stuff that's coming in on the wire in real time. 
Furthermore, if it is off-line, it would be very cool if the data could be 
idealized a little: note on/off pairs to note events with length, a bunch of 
controller messages to 14-bit NRPN and RPN messages, and full sysex messages 
whose checksum would be auto-calculated based on the manufacturer ID; you 
get the picture. I think the current API is about as raw as in VST, which 
means getting the job done but reinventing wheels in the process. Ideally, 
I'd like to work at the musician level without having to know about running 
status, handle channels by masking off nibbles in binary data, or recall 
or define constants for the various event types. A Java-like API with 
dedicated event classes for each one would be great. Even a more primitive 
presentation, as in MIDI-Perl, using arrays would be quite all right.
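
To make the nibble-masking point concrete, this is roughly what channel 
handling looks like with the raw API today, assuming input device 0:

// the raw approach: pick the status byte apart by hand
MidiIn min;
MidiMsg msg;
if (!min.open(0)) me.exit();
while (true) {
    min => now; // wait for MIDI input
    while (min.recv(msg)) {
        msg.data1 & 0xF0 => int kind;    // event type nibble
        msg.data1 & 0x0F => int channel; // channel nibble
        if (kind == 0x90 && msg.data3 > 0) // note on, velocity > 0
            <<< "note", msg.data2, "on channel", channel >>>;
    }
}

A musician-level API would hand me a note-on object with a channel field 
instead.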

-- 
With kind regards Veli-Pekka Tätilä (vtatila at mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/ 


