
Kas, your question about what kind of DSP is happening inside the phone, and what access is available right now, was answered quite nicely by Gatti Lorenzo. That IS how it is for now. I do think, however, that with sufficient knowledge of PIC/ARM or similar microcontrollers, one could add extra sensors (thermistor, accelerometer, gyroscope, piezoelectric) plus a small microprocessor to go with them, to eat all the data where it is collected and sing a nice tune to the intestines of the phone. I should be able to comment more authoritatively on this in the next few months. Modding the phone is the way out, for sure, and that's what I intend to do.

I just did some experiments and have finally decided on the following route. Motion detection on the phone itself is proving too slow (there's a 1-2 second refresh interval, which needs to be cut by a factor of ten), so the phone's camera will watch me and send pictures every few milliseconds via Bluetooth to a buffer on the PC, where they are analysed for motion. I'm not sure yet whether to go with the regular pixel-matching algorithms or try something different like per-pixel R-G-B comparison or edge analysis (there's a rough sketch of the R-G-B idea in the P.S. below). Whatever I settle on, the result of that analysis will drive a graphic board: the ChucK sequencer. The graphical representation of the ChucK sequencer has to appear in sync with my movement, and its dynamics have to be a direct result of it. That's the project, actually... now back to burning my brain up.

Now you might say: why not a webcam, then? I have an answer to that, but it's pretty long-winded and I will provide it if you ask ;) Unless you can guess it first. Heh heh!

- 1/f
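
P.S. For anyone curious, here is a minimal sketch of the per-pixel R-G-B comparison I'm weighing up for the PC side: difference the current frame against the previous one and count how many pixels moved. The threshold, the file names and the NumPy/Pillow choice are all placeholders, not a final design.

```python
# Rough frame-differencing sketch (not the final algorithm).
# Assumes frames arrive as RGB images already pulled out of the
# Bluetooth buffer; two image files stand in for them here.
import numpy as np
from PIL import Image

MOTION_THRESHOLD = 25  # per-channel change that counts as "moved" (made-up value)

def motion_amount(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Return the fraction of pixels whose R, G or B value changed noticeably."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = (diff > MOTION_THRESHOLD).any(axis=2)  # True wherever any channel jumped
    return float(changed.mean())

if __name__ == "__main__":
    prev = np.asarray(Image.open("frame_000.png").convert("RGB"))
    curr = np.asarray(Image.open("frame_001.png").convert("RGB"))
    print("motion:", motion_amount(prev, curr))
```

The nice thing about this approach is that it boils each frame down to a single 0-1 number, which is easy to map onto the sequencer's dynamics; edge analysis would need a bit more machinery.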
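
And a second sketch of how that motion number could be pushed into the ChucK sequencer. I'm assuming OSC over UDP here, with the python-osc package on the PC and an OscIn listener in the ChucK patch; the "/motion" address and port 6449 are just example values, nothing decided.

```python
# Sketch of forwarding the motion reading to the ChucK sequencer over OSC.
# Assumes `pip install python-osc` on the PC and a matching OscIn listener
# inside the ChucK patch; the address and port below are placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 6449)  # ChucK running on the same PC

def send_motion(value: float) -> None:
    """Forward one motion reading (0.0 - 1.0) to the sequencer."""
    client.send_message("/motion", float(value))

if __name__ == "__main__":
    send_motion(0.42)  # dummy value, just to show the call
```

OSC would keep the camera/analysis side and the ChucK side decoupled, so the motion algorithm can be swapped out without touching the sequencer patch.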