[chuck-users] Controlling ChucK - building an accessible GUI
rjc at MIT.EDU
Mon Nov 19 16:05:55 EST 2007
There was a thread on this list back in January about controlling ChucK via OSC (using a Python package). I have a few questions:
First, I'm blind and use a screen reader. I want to build something that allows me to control ChucK shreds via a GUI. I don't like Python much for programming, and I'm not sure whether an OSC GUI written in Python would even work with my screen reader (does anyone have a quick demo I could try, just to see whether the screen reader handles the UI toolkit at all)?
What I was thinking of doing is building something in Mozilla's XUL language, which can send network packets and also execute shell commands. So I was thinking of writing the code in ChucK and then passing arguments to each .ck file to change parameters. Of course, this might not work so smoothly if we wanted to change things on the fly; for that I guess I'd need to implement something like OSC's message-passing scheme in XUL.
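For the on-the-fly case, the OSC wire format itself is simple enough to speak directly: a message is just the address string, a comma-prefixed type-tag string, and big-endian arguments, each field padded to a 4-byte boundary, sent over UDP. Here is a minimal sketch in Python (purely for illustration; the same bytes could be assembled from XUL's socket facilities). The host, the port 6449, and the address "/synth/freq" are assumptions; they would have to match whatever the receiving ChucK shred declares.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *ints: int) -> bytes:
    """Encode an OSC message whose arguments are all int32s."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "i" * len(ints)).encode("ascii"))
    for v in ints:
        msg += struct.pack(">i", v)  # big-endian 32-bit int
    return msg

def send_to_chuck(address: str, *ints: int,
                  host: str = "127.0.0.1", port: int = 6449) -> None:
    """Fire one OSC message at a ChucK shred listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(osc_message(address, *ints), (host, port))

# Hypothetical example: a shred mapping /synth/freq to an oscillator.
# send_to_chuck("/synth/freq", 440)
```

On the ChucK side, the receiving shred would set up an OscRecv on the matching port and wait on the corresponding OscEvent; ChucK's bundled OSC examples use port 6449, but any free port works.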
Any advice or suggestions? Anyone interested in rewriting the Audicle so it works with a screen reader? There are so many great software synthesizers on the market now that work both stand-alone and as plugins to popular hosts like Cakewalk's Sonar, but none that I've ever tried gives the screen reader enough control to do real sound design. With effort, one can usually figure out how to change presets, but in many cases even that is not possible.

I was hoping that ChucK could be used to build a fully accessible sound designer's toolbox that would let one do things quickly and easily without having to write too much code. Think of something like PD ("Pure Data"); I've never used it myself because it is very graphically oriented, but apparently you can build all sorts of neat stuff by just plugging things together, without writing a line of code.
Thanx for any thoughts/suggestions...