[chuck-users] UI Idea and Accessibility, HID API and Bug?

Spencer Salazar ssalazar at CS.Princeton.EDU
Tue Mar 6 17:56:09 EST 2007


On Mar 6, 2007, at 6:14 AM, Veli-Pekka Tätilä wrote:

> Kassen wrote:
> [UI for synth panels]
>>> in the future a sufficiently powerful, cross-platform toolkit for
>>> effortlessly building synth panels as in Reaktor?
>> I think the mini-audicle user-interface element thingies that are
>> there on Mac right now and will be here for Win and Linux sometime
>> soon-ish would come quite close to something like that?
> Cool, I've only seen references to that Mini Audicle thing and tried
> out some binary in Windows. It looks to me like a ChucK IDE, though
> I've yet to figure out how to run scripts in it, <embarrassed emoticon>.

It's easy! (hopefully...)  You just have to type or paste some ChucK
code into the editor window, start the virtual machine (Alt .), and
then add the shred with Alt + .  Once the virtual machine is active,
you can also use the buttons at the top of the document window to
add, remove, and replace shreds.
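
For instance, pasting something as small as this into the editor and
adding it as a shred should give you a one-second sine tone:

    // minimal test program: a one-second 440 Hz sine tone
    SinOsc s => dac;
    440 => s.freq;
    0.5 => s.gain;
    1::second => now;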

We hope to soon improve the usefulness of miniAudicle by writing some  
decent documentation.

>> Reaktor does and doesn't do there but I imagine that the more
>> advanced and configurable it gets the less "effortless" it would be...
> Yes, if these user interfaces are built programmatically, it would be
> quite nice if the app could auto-layout panel controls for you.
> Alternatively, I've grown pretty fond of the way Gnome apps are able
> to build complex layouts by nesting panels that lay out their elements
> horizontally or vertically.

This sort of auto-layout functionality is planned, but for now (and
for the near future) manual layout is required.
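
Roughly, with the MAUI elements currently on Mac, a panel is laid out
by hand along these lines (an untested sketch; the MAUI_Slider methods
used here follow the bundled MAUI examples and may differ slightly):

    // manual layout: one slider controlling a sine oscillator's frequency
    MAUI_Slider freq;
    freq.range( 20, 2000 );        // frequency range in Hz
    freq.size( 300, 60 );          // width, height
    freq.position( 20, 20 );       // x, y -- placed by hand for now
    freq.name( "frequency" );
    freq.display();

    SinOsc s => dac;

    while( true )
    {
        freq => now;               // sliders also act as Events
        freq.value() => s.freq;
    }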

> [HID API]
>> Keyboard reading we have already
> Yup, I've noticed. Although I think I found a bug in it. In:
> .\chuck-1.2.0.7-exe\examples\hid\keyboard-organ.ck
> The keystrokes get delivered to ChucK even when that window has not
> gotten the focus. I find that counter-intuitive and a potential
> problem if you multi-task several keyboard instruments in different
> consoles. Sure it is expected behavior for MIDI but not for the
> keyboard in Windows apps at least. Besides, such an implementation
> likely means system-wide Windows keyboard hooks may need to be used,
> which can be a resource hog at times. My screen reader already has
> its hooks in the chain.

This is actually a feature, not a bug.  The idea is to abstract the  
keyboard as just some device with buttons, rather than a producer of  
sequential character input.  Fortunately this approach doesn't  
require much in the way of system-wide hooks, as it simply uses  
DirectInput.
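
Concretely, the HID approach looks something like the bundled keyboard
examples (untested sketch, opening keyboard device 0):

    // treat the keyboard as a generic HID device with buttons
    Hid hi;
    HidMsg msg;

    if( !hi.openKeyboard( 0 ) ) me.exit();
    <<< "keyboard ready:", hi.name() >>>;

    while( true )
    {
        hi => now;                    // wait for HID events
        while( hi.recv( msg ) )
        {
            if( msg.isButtonDown() )
                <<< "key down:", msg.which >>>;
            else if( msg.isButtonUp() )
                <<< "key up:", msg.which >>>;
        }
    }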

As Kassen mentioned, there is the similar KBHit class available,  
which does require the ChucK window to have the focus and may be  
better suited to your preferred usage.  There are a few other  
differences in semantics, such as key repeating when a key is held.   
See examples/event/kb.ck for an example.
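
The KBHit version ends up a bit shorter, roughly along these lines:

    // KBHit: focus-dependent, character-oriented keyboard input
    KBHit kb;

    while( true )
    {
        kb => now;                    // wait for keystrokes
        while( kb.more() )
            <<< "ascii:", kb.getchar() >>>;
    }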

> How good is the HID API by the way? I've got a gamepad with a mini
> joystick, a v axis, the throttle, POV switch and 6 buttons and have
> always thought it would be cool if I got it to send out MIDI. Also
> this laptop has a Synaptics touchpad that can send pressure in
> addition to the xy coords. So I guess one could use it as a 3D mini
> Kaoss pad if you will.

As Kassen said, it's definitely possible to use gamepads (see the hid
examples), but ChucK probably won't be able to read the pressure of
your touchpad at this point.
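
As a rough, untested sketch of the gamepad-to-MIDI idea (the
control-change mapping here is arbitrary, just to show the shape of it):

    // map gamepad axis motion to MIDI control change messages
    Hid hi;
    HidMsg msg;
    MidiOut mout;
    MidiMsg midi;

    if( !hi.openJoystick( 0 ) ) me.exit();   // first joystick/gamepad
    if( !mout.open( 0 ) ) me.exit();         // first MIDI output port

    while( true )
    {
        hi => now;
        while( hi.recv( msg ) )
        {
            if( msg.isAxisMotion() )
            {
                176 => midi.data1;                 // CC, channel 1
                msg.which => midi.data2;           // CC number = axis index
                // scale axis position (-1..1) to 0..127
                ( (msg.axisPosition + 1.0) * 63.5 ) $ int => midi.data3;
                mout.send( midi );
            }
        }
    }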

thanks for your comments!
spencer


