New project: Dynamically extendable NXT robot platform


Postby emh » Mon Jan 21, 2008 2:26 pm

Hi,

I am currently developing Java software for controlling an NXT robot (in my case, some kind of JohnNXT model).

I am mainly concentrating on the software architecture. Basically, the software is for monitoring and controlling the robot.
The architecture roughly consists of three parts:

Ops (Operations Center): Does not belong to the actual robot intelligence. Serves for monitoring and controlling the robot (possibly remotely).

Master: The robot intelligence (for a start, it runs on a PC). The NXT access is done via Bluetooth and a modified iCommand version. The specific robot intelligence is located in extendable and exchangeable modules. For this, the master provides interfaces, e.g. an interface for accessing the bricks (iCommand), an interface for database access, and so on.

Exchangeable modules: The specific robot intelligence, while the Master itself can be described as a generic framework that contains these modules. Every module encapsulates a specific piece of intelligence, e.g. "voice output", "motion control", ...

Ideally, the platform provides interfaces against which specific robot modules can be programmed. For example, for the different robot skills like "voice output", "motion control", ... there will be interfaces which can be implemented individually for different robots: an implementation for AlphaRex, an implementation for JohnNXT, and so on. These implementations will be wrapped up in a jar file.
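
For concreteness, here is a minimal sketch of what such a skill interface and a robot-specific implementation could look like. The names (MotionControl, JohnNxtMotionControl) are hypothetical, and the actual iCommand calls are only indicated in comments, since the exact API depends on the iCommand version used:

Code: Select all
// MotionControl.java - hypothetical skill interface the Master programs against
public interface MotionControl {
    void forward(int speed);
    void stop();
}

// JohnNxtMotionControl.java - hypothetical JohnNXT-specific implementation,
// packaged in its own jar. The bodies would delegate to iCommand over
// Bluetooth; the calls are only sketched in comments.
public class JohnNxtMotionControl implements MotionControl {
    public void forward(int speed) {
        // e.g. drive the two track motors via iCommand here
        System.out.println("JohnNXT: forward at speed " + speed);
    }
    public void stop() {
        // e.g. stop both track motors via iCommand here
        System.out.println("JohnNXT: stop");
    }
}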

The Master uses these jar files and thus contains the complete robot intelligence. To sum up, the Master can be used to control different robots, depending on the jar file.
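
One way the Master could pick up such a jar at runtime is Java's standard ServiceLoader mechanism. This is just a sketch of that approach, not necessarily how the framework does it; it assumes the module jar declares its implementation class in a META-INF/services/MotionControl file:

Code: Select all
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ServiceLoader;

// Sketch: load a robot-specific module jar and discover its MotionControl
// implementation via the standard ServiceLoader mechanism.
public class ModuleLoader {
    public static MotionControl loadMotionControl(File moduleJar) throws Exception {
        URLClassLoader cl = new URLClassLoader(
                new URL[] { moduleJar.toURI().toURL() },
                ModuleLoader.class.getClassLoader());
        // The jar must list its implementation class in
        // META-INF/services/MotionControl for ServiceLoader to find it.
        for (MotionControl impl : ServiceLoader.load(MotionControl.class, cl)) {
            return impl; // first (and usually only) implementation in the jar
        }
        throw new IllegalStateException("No MotionControl module found in " + moduleJar);
    }
}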

Some (small) parts of the implementation already exist; I am currently working on enhancing the framework. Ideally, I am going to release the Master once it is implemented far enough that specific module implementations can be added. For testing, I will probably implement some AlphaRex functionality and then concentrate on modules for JohnNXT. Of course, it would be great if other people would contribute their own implementations or give input on different functionality.

What do you think of the idea? Any questions? Did it become roughly clear what I was trying to say?
And what kinds of specific, implementable intelligence can you think of?
- Which kinds of interfaces can be shared by different (or even all) robots?
- Which ones are completely generic (e.g. power management)?

I am looking forward to a discussion!
emh
New User
 
Posts: 12
Joined: Sat Dec 15, 2007 6:48 pm

Postby lawrie » Tue Jan 22, 2008 3:19 pm

Is all the intelligence on the PC in this architecture, or do the robots have autonomous control? iCommand uses the LEGO Communications Protocol (LCP), which the leJOS menu supports via the LCP class, but this is limited to the remote commands that LEGO have defined. It does not allow for much intelligence in the robot itself.
lawrie
leJOS Team Member
 
Posts: 909
Joined: Mon Feb 05, 2007 1:27 pm

Postby emh » Tue Jan 22, 2008 4:32 pm

Yeah, the intelligence has to be defined in the modules, which reside on the PC at the moment.
Although it runs on the PC, this intelligence belongs to the robot's "brain". Maybe later I will port this intelligence to a mobile device.


Either the intelligence uses the iCommand commands to actually control the robot, or it may use iCommand to upload programs to the NXT, should that be necessary for performance reasons.
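
As a rough illustration of the first option (remote control from the PC), a module's control logic might look like the sketch below, driving the robot through the hypothetical MotionControl interface from my first post. Every call becomes a Bluetooth round trip, which is exactly why uploading a compiled leJOS NXJ program to the brick can be the faster alternative:

Code: Select all
// Sketch of a PC-side module driving the robot remotely through the
// hypothetical MotionControl interface (which would wrap iCommand underneath).
public class DriveForwardModule {
    private final MotionControl motion;

    public DriveForwardModule(MotionControl motion) {
        this.motion = motion;
    }

    public void run() throws InterruptedException {
        motion.forward(50);   // each call is a Bluetooth round trip to the NXT
        Thread.sleep(2000);   // that latency is the performance argument for
        motion.stop();        // running programs on the brick instead
    }
}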
emh
New User
 
Posts: 12
Joined: Sat Dec 15, 2007 6:48 pm

