A question related to the ongoing changes in the leJOS sensor implementation:
What would be the recommended method/design pattern for integrating remote networked (non-RMI) sensors with leJOS?
E.g.: which interfaces to implement, and which classes to extend.
I've been going over the latest sources in SVN, but I still need guidance to do it the "right way".
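To make the question concrete, here is roughly the shape I had in mind: wrap the remote sensor as a sample provider so it plugs into the usual fetchSample() pattern. The interface below is redeclared locally only so the sketch is self-contained; on the EV3 I would of course implement the real leJOS interface (lejos.robotics.SampleProvider, and possibly extend one of the sensor base classes) instead. RemoteFetcher and the "orientation" name are my own placeholders, not anything from leJOS:

```java
import java.util.Arrays;

public class RemoteOrientationSensor {

    // Local stand-in mirroring leJOS's SampleProvider signature,
    // so this sketch compiles without the leJOS jars.
    interface SampleProvider {
        int sampleSize();
        void fetchSample(float[] sample, int offset);
    }

    // Hypothetical transport: asks the phone for the latest reading
    // (UDP + protobuf in my prototype).
    interface RemoteFetcher {
        float[] fetchLatest(String sensorName);
    }

    static class OrientationProvider implements SampleProvider {
        private final RemoteFetcher fetcher;

        OrientationProvider(RemoteFetcher fetcher) {
            this.fetcher = fetcher;
        }

        @Override
        public int sampleSize() {
            return 3; // azimuth, pitch, roll
        }

        @Override
        public void fetchSample(float[] sample, int offset) {
            // One network round trip per fetch (poll, not stream).
            float[] reading = fetcher.fetchLatest("orientation");
            System.arraycopy(reading, 0, sample, offset, sampleSize());
        }
    }

    public static void main(String[] args) {
        // Stub fetcher standing in for the networked phone.
        OrientationProvider p =
                new OrientationProvider(name -> new float[] {12.5f, -1.0f, 0.25f});
        float[] sample = new float[3];
        p.fetchSample(sample, 0);
        System.out.println(Arrays.toString(sample)); // [12.5, -1.0, 0.25]
    }
}
```

Is this the intended extension point, or is there a remote-device abstraction in the new sensor framework I should hook into instead?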
I've been toying with the idea of using Android phone sensors with the EV3.
I know there have been a number of projects doing similar things, but I haven't found anything that worked for me.
Here's what I have working as a proof-of-concept implementation:
* Server running on Android, serving queries over UDP
* Client-Server protocol using google protocol buffers (protobuf)
* Java client prototype (currently only tested on PC)
* Client fetches Android Orientation, Gyro, Accelerometer, and Magnetic Field sensor readings; additional sensors will be trivial to add.
* Server also serves text-to-speech requests (unlike my previous go at TTS, this one works nicely)
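For reference, the query side is nothing fancy: one datagram out, one datagram back per read. A stripped-down sketch without the protobuf layer (the payload strings and the local stand-in server below are placeholders for illustration, not my actual protocol):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class UdpQueryDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for the Android server: answers one query with a
        // fixed reading. The real server decodes a protobuf request.
        DatagramSocket server = new DatagramSocket(0); // ephemeral port
        Thread serverThread = new Thread(() -> {
            try {
                byte[] buf = new byte[512];
                DatagramPacket req = new DatagramPacket(buf, buf.length);
                server.receive(req);
                byte[] reply = "gyro:0.01,-0.02,9.81".getBytes(StandardCharsets.UTF_8);
                server.send(new DatagramPacket(reply, reply.length,
                        req.getAddress(), req.getPort()));
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
        serverThread.start();

        // Client side (what the EV3 would run): send a query, then block
        // for the answer with a timeout so a dropped packet doesn't hang.
        try (DatagramSocket client = new DatagramSocket()) {
            client.setSoTimeout(1000); // ms
            byte[] query = "GET gyro".getBytes(StandardCharsets.UTF_8);
            client.send(new DatagramPacket(query, query.length,
                    InetAddress.getLoopbackAddress(), server.getLocalPort()));
            byte[] buf = new byte[512];
            DatagramPacket resp = new DatagramPacket(buf, buf.length);
            client.receive(resp);
            System.out.println(new String(resp.getData(), 0, resp.getLength(),
                    StandardCharsets.UTF_8)); // gyro:0.01,-0.02,9.81
        }
        serverThread.join();
        server.close();
    }
}
```

The poll-per-read design is deliberate: it avoids the backlog problem I hit with continuous streaming.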
This prototype hasn't been tested on the EV3 yet. A previous prototype was, but it used continuous streaming of sensor data from Android to the EV3, and the actual read-outs were lagging too much.
The above will be open-sourced as soon as I get the necessary rubber stamp from my workplace.
P.S.: I understand everyone involved with leJOS is very busy and may not have time to go into detail. Any brief guidance would be appreciated, and in the meantime I'll try to figure this out myself.