
See, guys, I wouldn't raise a hellcry without having tinkered
enough with what was available on the internet. Far too many
people have suggested that I use PySndObj without having
done any kind of realtime audio analysis with it themselves.
Allow me to restate my objective: I want "to read the sound of a live flute
off the audio port in realtime, and analyse it using Python".
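Just so nobody accuses me of not knowing what I'm asking for, here is the
shape of the thing - a rough sketch using PyAudio and NumPy. Both libraries,
and the assumption that PyAudio can even see my audio port, are my own
guesses; I have not got this running end to end, it only shows what I mean:

    # Rough sketch: grab blocks off the audio input and look at their spectrum.
    # Assumes PyAudio (a PortAudio wrapper) and NumPy are installed.
    import numpy as np
    import pyaudio

    RATE = 44100           # samples per second
    FRAMES = 1024          # samples per analysis block

    pa = pyaudio.PyAudio()
    stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                     input=True, frames_per_buffer=FRAMES)

    try:
        while True:
            # One block of raw 16-bit samples straight off the input device
            raw = stream.read(FRAMES)
            samples = np.frombuffer(raw, dtype=np.int16).astype(np.float64)

            # Magnitude spectrum of the windowed block
            spectrum = np.abs(np.fft.rfft(samples * np.hanning(FRAMES)))

            # Crudest possible "pitch": frequency of the strongest bin
            peak_bin = int(np.argmax(spectrum[1:])) + 1
            print("strongest component: %.1f Hz" % (peak_bin * RATE / FRAMES))
    except KeyboardInterrupt:
        stream.close()
        pa.terminate()

Even something that crude, printing one number per block, would be enough
to get me started.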
Now, this is a reply to me from one of the main PySndObj developers:

> thinking a little more about this, I think there is no pitch tracker
> there (I need to add one...). So you can try csound:

See that? At least he understood my question somewhat.
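For what it's worth, a pitch tracker doesn't have to be black magic either.
This is the naive autocorrelation version I have in mind - plain NumPy,
nothing to do with PySndObj's internals, and the 200-2500 Hz range is just
my guess at a flute's register:

    # Naive autocorrelation pitch tracker for one block of samples.
    # 'samples' is a float NumPy array, 'rate' is the sample rate in Hz.
    import numpy as np

    def estimate_pitch(samples, rate, fmin=200.0, fmax=2500.0):
        samples = samples - samples.mean()
        # Full autocorrelation, keep non-negative lags only
        corr = np.correlate(samples, samples, mode="full")
        corr = corr[len(samples) - 1:]

        lag_min = int(rate / fmax)              # smallest lag we care about
        lag_max = min(int(rate / fmin), len(corr) - 1)

        # Strongest correlation peak inside the plausible lag range
        lag = lag_min + int(np.argmax(corr[lag_min:lag_max]))
        return rate / float(lag)

Crude, but it would at least tell me which note the flute is sitting on.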
And here's the response from the gentleman at PyMedia:

> Can you please check voice_recorder_player.py or voice_recorder.py
> from examples tar ball? May be it will resolve most of the issues.
He is answering a completely different question! I'm talking about
intercepting data off a port, and he's talking about recording it. I had seen
the example he's talking about, but it made no sense in this context.
And I was kinda lucky in that I know what a tarball is - most artists
who dabble in technology come from diverse backgrounds. (I am one
of the 2 or 3 new media artists in India.) So I find it odd that when
newbies ask questions, developers answer very sweetly, but in code.
Perhaps the truth really is that adc => FFT => dac, which is so simple
in ChucK and the like, has no analog in Python, and people are just too
ashamed to admit that they don't know how it's done. To use ChucK to
do this, I will need to learn YET ANOTHER protocol called OSC
or something, which will carry messages from Python (messages
originating in my phone and coming in via Bluetooth), so I can
pretty much give up on realtime.
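And for the record, from what I can make out of the OSC spec, it isn't a
language at all - just an address string, a type tag and some big-endian
numbers padded into a UDP packet. Something like the sketch below, where the
/flute/freq address and port 6449 are purely my own inventions for the sake
of the example, and whatever listens on the other end (a ChucK OscRecv
patch, say) would have to agree with them:

    # Minimal hand-rolled OSC message over UDP: one float argument.
    import socket
    import struct

    def osc_pad(data):
        # Null-terminate and pad to a multiple of 4 bytes, as OSC strings require
        return data + b"\0" * (4 - len(data) % 4)

    def send_osc_float(host, port, address, value):
        # OSC message = padded address + padded type-tag string + big-endian float32
        packet = osc_pad(address.encode("ascii"))
        packet += osc_pad(b",f")
        packet += struct.pack(">f", value)
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(packet, (host, port))
        sock.close()

    # e.g. send the detected flute frequency to a patch on this machine
    send_osc_float("127.0.0.1", 6449, "/flute/freq", 440.0)

If that is really all the "language" amounts to, fine - but nobody has said
so in plain words.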
I hate Python. Ugh, no! I love it, but I hate where I am with this
damn project.
*looks despondently at the wall picture of Lord Shiva,
who has a familiar serpent tied around his neck like a
scarf*
-------
1/f )))
-------
http://www.algomantra.com
On 9/17/07, robin.escalation wrote:
> --- AlgoMantra wrote:
> > If you prepare your files and code in advance and then just chuck
> > the shreds in and out of the VM, it really is a bit like sequencing,
> > rather than livecoding. And if I change the code in the file, save
> > it, then the effects don't appear live, do they?
>
> In my little free time that I am spending with ChucK I am trying to
> figure this out as well! The best I get is editing one file while
> another is playing. This feels more like batch programming than real
> time.
>
> > Maybe I'm missing something freakin obvious, but I'm so frustrated
> > having had to learn Csound, ChucK, SuperCollider and all sorts of
> > new languages just because Python did not provide me with a simple
> > audio processing module. All I wanted to do using Python was
> > analyse the sound of a live flute playing and plot its frequency,
> > and other characteristics, straight off the audio port.
>
> It is annoying that no-one has wrapped a decent library for Python.
> But have you checked out my article on this topic? It could be that
> if you have simple needs, PyMedia or one of the other mentioned
> tools might do.
>
> Surf:
> http://diagrammes-modernes.blogspot.com/2007/08/music-control-tools-python-b...
>
> -- robin
>
> -----
> Robin Parmar
> robinparmar.com
>
> _______________________________________________
> chuck-users mailing list
> chuck-users@lists.cs.princeton.edu
> https://lists.cs.princeton.edu/mailman/listinfo/chuck-users