Laser sensor and localization


Postby gloomyandy » Mon Aug 09, 2010 2:03 pm

Folks,
I thought you might be interested to see my latest project:
http://www.youtube.com/watch?v=IMI63k5W3sU

I'll try and post some more details later, but if you have any questions ask away...

Andy
gloomyandy
leJOS Team Member
 
Posts: 4184
Joined: Fri Sep 28, 2007 2:06 pm
Location: UK

Postby skoehler » Mon Aug 09, 2010 2:18 pm

Looks like a lot of trigonometry :-)
I hope my Math functions don't let you down ;-)

Does the robot ever miss the reflection because it passes by too fast?
skoehler
leJOS Team Member
 
Posts: 1452
Joined: Thu Oct 30, 2008 4:54 pm

Postby gloomyandy » Mon Aug 09, 2010 4:01 pm

Hi Sven,
Your math functions worked fine, thanks! Though I think my math capabilities made things a little harder!

I had a few problems detecting the reflected light. The main one was that you actually get multiple reflections from the beacons (possibly due to the cheap reflective tape I used - actually one of those bicycle safety bands!), so I need to merge them together and/or select the best. I could probably scan at a faster rate, but things work pretty reliably at this speed so I didn't push it too much. The other problem I had is that I modulate the laser (i.e. turn it on and off!) to let me easily ignore background light, but the module I used takes a relatively long time to power on and off (can't remember the actual numbers, but certainly 10-20ms or so). Also, the 100Hz flicker of most lights (2x50Hz) means that you get some odd peaks in the overall light levels, so I have to sample a number of times. All of this happens while the scan is in progress, which tends to limit the scan rate...
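The modulation idea - count a bright spot only if it is there with the laser on but not with it off - can be sketched in plain Java. Everything below is illustrative: the suppliers stand in for real sensor reads, and all names are made up, not the actual code.

```java
import java.util.function.IntSupplier;

// Sketch: reject ambient light by comparing readings taken with the
// laser on against readings taken with it off. Illustration only --
// the real code drives a leJOS light sensor and a laser module.
public class ModulatedDetector {
    /** Returns true if the peak laser-on reading exceeds the peak
     *  laser-off (background) reading by more than 'threshold'. */
    public static boolean hit(IntSupplier readingWithLaserOn,
                              IntSupplier readingWithLaserOff,
                              int samples, int threshold) {
        int peakOn = Integer.MIN_VALUE, peakOff = Integer.MIN_VALUE;
        // Take several samples in each phase and keep the peak, so a
        // single mains-ripple spike in either phase cannot fool us.
        for (int i = 0; i < samples; i++) {
            peakOn = Math.max(peakOn, readingWithLaserOn.getAsInt());
            peakOff = Math.max(peakOff, readingWithLaserOff.getAsInt());
        }
        return peakOn - peakOff > threshold;
    }

    public static void main(String[] args) {
        // Synthetic test: background ~200 units, reflection adds ~400.
        System.out.println(hit(() -> 600, () -> 200, 10, 100)); // reflector seen
        System.out.println(hit(() -> 210, () -> 200, 10, 100)); // only background
    }
}
```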

Andy

Postby skoehler » Mon Aug 09, 2010 4:47 pm

gloomyandy wrote:also the 100Hz flicker of most lights (2x50Hz), means that you get some odd peaks in the overall light levels


Hmm. I guess one could build a filter - in hardware - to filter out those signals. In software it's probably not doable - at least we don't have an API to sample one of the ADCs at a given rate (which is probably hard to implement).

Postby gloomyandy » Mon Aug 09, 2010 7:40 pm

Hi Sven/Folks,
You might find this graph interesting...
http://www.gloomy-place.com/lejos/lasergraph.gif
It is basically the light sensor reading (after subtracting an average background value). The reading is sampled every 1ms; the laser is turned on every 40ms, left on for 20ms, and then turned off. You can clearly see a few things...
1. The way the laser takes about 8-10ms to turn on and reach full power.
2. The way it takes 5ms or so to turn off.
3. You can clearly see the mains light ripple (from a room light). These readings are with the reflector pretty close to the sensor; with it further away the ripple is a much bigger part of the readings.
4. You can see the odd groups of 2 followed by 4 readings all of the same value, which I think comes from the way that the sensor is read every 3ms by the ATmega and then transferred to the AT91 every 2ms...

In general it is tricky to filter things on the NXT in software, but in this case the frequencies are pretty low and I can sample a few readings (in this case I think I simply read 10ms worth and take the peak reading over that time). Not perfect, but it seems to work pretty well...
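The peak-over-a-window trick works because the 100Hz ripple has a 10ms period, so any 10ms window contains a full ripple cycle, including its crest. A minimal sketch (plain Java; the array stands in for ten consecutive 1ms samples):

```java
// Sketch of "sample 10 ms worth at 1 ms intervals and keep the peak".
// Because every 10 ms window spans one full cycle of the 100 Hz mains
// ripple, the peak is comparable from one window to the next even
// though the individual samples ride up and down on the ripple.
public class PeakSample {
    public static int peak(int[] readings) {
        int best = Integer.MIN_VALUE;
        for (int r : readings) best = Math.max(best, r);
        return best;
    }

    public static void main(String[] args) {
        // Ten 1 ms samples with visible ripple; the crest is 245.
        int[] window = {180, 205, 230, 245, 230, 205, 180, 160, 150, 160};
        System.out.println(peak(window)); // -> 245
    }
}
```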

Andy

Postby acaine » Tue Aug 10, 2010 12:04 am

gloomyandy wrote:Folks,
I thought you might be interested to see my latest project:
http://www.youtube.com/watch?v=IMI63k5W3sU

I'll try and post some more details later, but if you have any questions ask away...

Andy


I really enjoyed watching your professional-looking video. I wish I could do as well with mine!

It would be interesting to know if it were possible for the robot to differentiate between the three posts by using different kinds of reflective tape; i.e. one post is highly reflective, while another is less so.

If the robot could differentiate between the three posts (and perhaps it already can), then the robot would be able to start in any random position in relation to the three posts and navigate around and through them.

It's really wonderful work that you have done.

Your work has inspired me to think of ways of improving the Monte Carlo Localization.
acaine
New User
 
Posts: 5
Joined: Sun Aug 01, 2010 7:22 pm
Location: Ontario Canada

Postby bbagnall » Tue Aug 10, 2010 3:37 pm

I have many questions, some of them banal so I hope they don't demoralize you.

1. What software did you use to edit the video together? And how long does it take to put together a video like that?

2. For the modulation algorithm, is it basically taking a reading, turning off the laser, then taking another reading? If the bright spot is present only when the laser is on, it counts as a hit, but if a bright spot is there both when the laser is on and off, then it ignores it?

3. Is there a laser sensor like yours on the market? If not, is it easy to build? It would be cool to have a laser sensor/actuator class in leJOS. We could include some assembly instructions in the API if it isn't too complex.

4. Are there limitations to the positional detection based on the orientation of your three reflectors? I assume this works a little like GPS. Without the sensor being able to detect unique reflectors (ID 1, 2, or 3) then I assume all three must be up against a wall. Do you think you get 100% room coverage, or are there blind spots and distance limitations for this setup?

5. What is your algorithm for correcting the position after it does a move and then checks position?
bbagnall
Site Admin
 
Posts: 392
Joined: Fri Aug 04, 2006 4:03 pm

Postby gloomyandy » Wed Aug 11, 2010 12:42 pm

Hi Brian...
Nothing wrong with the questions as far as I can see!

1. What software did you use to edit the video together? And how long does it take to put together a video like that?


I used the latest beta of Microsoft Live Movie Maker. I have mixed views of it: on the one hand it is great and lets you do some pretty slick things very easily; on the other hand it drove me crazy by crashing a lot, so I spent much longer on the video than I really needed to. Part of this was down to me. I used my Canon T2i/550d SLR as the camera, which creates full HD H.264 in .mov format. This format is notoriously hard for editing programs to work with (do a few searches on the various Canon forums). Many experts recommend that you transcode the files into a more edit-friendly format first; there are some formats designed especially for this, and I think the high-end editors transcode into those in the background to make editing easier. But I didn't try doing that! As a result I'm still trying to find a good editor...

2. For the modulation algorithm, is it basically taking a reading, turning off the laser, then taking another reading. If the bright spot is present only when the laser is on, it counts it as a hit? But if a bright spot is there both when the laser is on and also off, then it ignores it?

That is pretty much it. It is slightly more complex in that you get multiple reflections from the beacon as the light scans over it, and you also get a fair bit of mains ripple on top (particularly when the beacon is far away). So I use a number of reads during the on and off periods and perform peak detection on the result. I also run a FIR filter over the results to merge multiple reflections together, and again select the peak...
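A sketch of that merge-and-pick step, assuming the scan produces an array of intensities indexed by angle (all names here are illustrative, not Andy's actual code):

```java
// Sketch of merging multiple close reflections with a small FIR
// (moving-average) filter over the angle-indexed intensities, then
// taking the peak as the beacon bearing. The array indices stand in
// for scan angles.
public class BeaconPeak {
    /** Moving-average FIR of odd width 'taps' (edges zero-padded). */
    public static double[] fir(int[] intensity, int taps) {
        double[] out = new double[intensity.length];
        int half = taps / 2;
        for (int i = 0; i < intensity.length; i++) {
            double sum = 0;
            for (int k = -half; k <= half; k++) {
                int j = i + k;
                if (j >= 0 && j < intensity.length) sum += intensity[j];
            }
            out[i] = sum / taps;
        }
        return out;
    }

    /** Index (i.e. scan angle) of the filtered peak. */
    public static int peakIndex(double[] filtered) {
        int best = 0;
        for (int i = 1; i < filtered.length; i++)
            if (filtered[i] > filtered[best]) best = i;
        return best;
    }

    public static void main(String[] args) {
        // Two narrow reflections from the same beacon at indices 4 and 6;
        // the filter bridges the gap so the peak lands between them.
        int[] scan = {0, 0, 0, 0, 90, 20, 80, 0, 0, 0};
        System.out.println(peakIndex(fir(scan, 3))); // -> 5
    }
}
```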

3. Is there a laser sensor like yours on the market? If not, is it easy to build? It would be cool to have a laser sensor/actuator class in leJOS. We could include some assembly instructions in the API if it isn't too complex.


I'm not aware of a commercial version of the sensor. I don't have a step-by-step guide to building it, but this thread:
http://forums.nxtasy.org/index.php?show ... 5&hl=laser
describes one that is very similar. The main difference was my use of a laser pointer rather than a laser module. The only tricky bits are:
1. Opening the Lego sensor case.
2. Extracting the guts of the laser pointer and knowing which bits of the board can safely be removed (a good reason to use a laser module instead).
3. Soldering the wires to the PCB.
4. Getting everything back in the case!

4. Are there limitations to the positional detection based on the orientation of your three reflectors? I assume this works a little like GPS. Without the sensor being able to detect unique reflectors (ID 1, 2, or 3) then I assume all three must be up against a wall. Do you think you get 100% room coverage, or are there blind spots and distance limitations for this setup?

The beacons can actually be in any known position, though there are some robot/beacon positions for which the algorithm does not work. You need to be able to identify which beacon is which (which you can possibly do by using their relative positions). Take a look at this paper for lots of details:
http://repositorium.sdum.uminho.pt/bits ... 003328.pdf
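For a flavour of the trigonometry involved, here is a sketch of a related three-beacon method, the "ToTal" cotangent formulation - not the algorithm from the linked paper, and not necessarily what was used here. Given the known beacon positions and the three measured bearings, it returns the robot position; only bearing differences are used, so the robot's heading need not be known.

```java
// Sketch of three-beacon triangulation ("ToTal" cotangent form).
// Beacons 1..3 are at known positions; a1..a3 are the bearings at
// which the robot sees them, measured from any common reference
// (only the differences a2-a1 and a3-a2 enter the formulas).
public class Triangulate {
    static double cot(double a) { return Math.cos(a) / Math.sin(a); }

    public static double[] total(double x1, double y1, double x2, double y2,
                                 double x3, double y3,
                                 double a1, double a2, double a3) {
        // Shift coordinates so beacon 2 is the origin.
        double x1p = x1 - x2, y1p = y1 - y2;
        double x3p = x3 - x2, y3p = y3 - y2;
        double t12 = cot(a2 - a1), t23 = cot(a3 - a2);
        double t31 = (1 - t12 * t23) / (t12 + t23);
        // Centres of the circles through each beacon pair and the robot.
        double x12 = x1p + t12 * y1p, y12 = y1p - t12 * x1p;
        double x23 = x3p - t23 * y3p, y23 = y3p + t23 * x3p;
        double x31 = (x3p + x1p) + t31 * (y3p - y1p);
        double y31 = (y3p + y1p) - t31 * (x3p - x1p);
        double k31 = x1p * x3p + y1p * y3p + t31 * (x1p * y3p - x3p * y1p);
        double d = (x12 - x23) * (y23 - y31) - (y12 - y23) * (x23 - x31);
        return new double[] { x2 + k31 * (y12 - y23) / d,
                              y2 + k31 * (x23 - x12) / d };
    }

    public static void main(String[] args) {
        // Robot at (5, 3); beacons at (0,0), (10,0) and (5,10).
        double rx = 5, ry = 3;
        double a1 = Math.atan2(0 - ry, 0 - rx);
        double a2 = Math.atan2(0 - ry, 10 - rx);
        double a3 = Math.atan2(10 - ry, 5 - rx);
        double[] p = total(0, 0, 10, 0, 5, 10, a1, a2, a3);
        System.out.printf("%.3f %.3f%n", p[0], p[1]); // -> 5.000 3.000
    }
}
```

The degenerate positions Andy mentions show up here as d approaching zero (the robot on the circle through all three beacons).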


5. What is your algorithm for correcting the position after it does a move and then checks position?


I used the BasicNavigator class (the name may have changed by now) as the basis of my code. To correct the position I simply set the current pose to the pose returned by the scan at the end of each move sequence. I used a simple set of goto commands to perform the move...

All the best

Andy

Postby bbagnall » Fri Aug 13, 2010 5:50 pm

You've come up with a creative beacon system, similar to what Claude Baumann has done, except you've made the hardware much simpler. I recall he had to give each beacon its own power source, but you're using the laser to "excite" the beacon by bouncing light off the reflective tape.

Are there other uses we can get from this laser sensor? Something along the lines of a range sensor with a pencil-thin beam would be superior to the ultrasonic cone.

I'm wondering if we could detect the laser dot without reflective tape. How close does an object need to be in order for it to detect the reflection without reflector tape?

To achieve longer distances, I wonder if the LegoCam would be able to detect the dot? (using the filter algorithm to separate it from other lights)

If there was a way to detect the red dot, then we could mount two sensors at a set length apart (stereo vision). Each sensor could determine the angle towards the dot, and then it could use simple triangle math to determine the distance to the dot.
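The "simple triangle math" here is just the intersection of two bearing rays on a known baseline. A sketch (plain Java; the sensor placement and angle convention are illustrative assumptions):

```java
// Sketch of the two-sensor baseline idea: each sensor measures the
// bearing of the laser dot from the baseline, and the dot position
// follows from intersecting the two rays. Sensors sit at (0,0) and
// (baseline,0); angles are the interior angles, in radians, at each
// end of the baseline.
public class StereoDot {
    public static double[] locate(double baseline, double thetaLeft,
                                  double thetaRight) {
        // Left ray:  y = x * tan(thetaLeft)
        // Right ray: y = (baseline - x) * tan(thetaRight)
        double tl = Math.tan(thetaLeft), tr = Math.tan(thetaRight);
        double x = baseline * tr / (tl + tr);
        return new double[] { x, x * tl };
    }

    public static void main(String[] args) {
        // Dot seen at 45 degrees from each end of a 10 cm baseline:
        // it must sit straight ahead of the midpoint, at (5, 5).
        double[] p = locate(10, Math.PI / 4, Math.PI / 4);
        System.out.printf("%.2f %.2f%n", p[0], p[1]); // -> 5.00 5.00
    }
}
```

As with any stereo setup, range resolution degrades as the rays become near-parallel (distant dots), so the usable distance is limited by how finely the bearings can be measured.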

Postby gloomyandy » Fri Aug 13, 2010 7:50 pm

Hi Brian,
Great minds think alike... my next project is to try to do something like this:
http://sites.google.com/site/todddanko/ ... ser_ranger
I'll be using the NXTCam, probably with custom firmware (as you know I've already created some of that to allow image capture). I think it should be possible to detect the brightest spot in the image. This is actually a little harder than it first seems due to the auto-exposure handling of the camera: the bright spot basically exceeds the dynamic range of the camera, so I'll probably have to take multiple images using different gain settings... All good fun though!
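One way the multiple-gain idea might work (an assumption about a possible approach, not a description of existing code): take frames at several gain settings and use the highest gain whose brightest pixel is still below clipping, so the dot's position is not smeared by saturation.

```java
// Sketch: pick the laser dot out of frames taken at several gain
// settings. Use the highest-gain frame whose brightest pixel is still
// below saturation, so the dot is not clipped. Frames are flat
// greyscale arrays here; the real code would pull them from the NXTCam.
public class BrightSpot {
    static final int SATURATED = 255;

    /** Index of the brightest pixel, or -1 if the frame is clipped. */
    public static int brightest(int[] frame) {
        int best = 0;
        for (int i = 1; i < frame.length; i++)
            if (frame[i] > frame[best]) best = i;
        return frame[best] >= SATURATED ? -1 : best;
    }

    /** Try frames from highest to lowest gain; first unclipped wins. */
    public static int locate(int[][] framesHighGainFirst) {
        for (int[] f : framesHighGainFirst) {
            int i = brightest(f);
            if (i >= 0) return i;
        }
        return -1; // dot clipped at every gain setting
    }

    public static void main(String[] args) {
        int[] highGain = {40, 80, 255, 120, 40}; // dot clipped to 255
        int[] lowGain  = {10, 20, 180, 30, 10};  // dot at index 2, unclipped
        System.out.println(locate(new int[][] { highGain, lowGain })); // -> 2
    }
}
```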

All the best

Andy

Postby gloomyandy » Fri Aug 13, 2010 7:55 pm

ps here is the previous project doing image capture...
http://www.youtube.com/user/gloomyandy# ... ppNCzDb7lU

NXTCam to improve the project

Postby esmetaman » Sat Aug 14, 2010 6:12 pm

Hi Andy,

Did you consider replacing the laser with an NXTCam to detect the beacons directly?
If you center the blob in the image, you would get the right angle.

I found a PDF which explains the calculations:
http://docs.google.com/viewer?a=v&q=cac ... xCfAkOx4pA

Are the calculations shown in that document right?

Nice project

Cheers
Juan Antonio Breña Moral
http://www.juanantonio.info/lejos-ebook/
https://github.com/jabrena/livingrobots
http://www.iloveneutrinos.com/
esmetaman
Advanced Member
 
Posts: 301
Joined: Wed Sep 13, 2006 12:16 am
Location: Madrid, Spain

Postby gloomyandy » Sat Aug 14, 2010 6:49 pm

Hi,
It may be possible to recognise a beacon directly, but I doubt that the NXT combined with the NXTCam has enough CPU to do the necessary image processing. Looking for modulated reflected light is much easier...

I posted a link to the paper I used as the basis of the calculations in a previous post. Here it is again...
http://repositorium.sdum.uminho.pt/bits ... 003328.pdf
This is a PDF version of what I think is the same paper you referred to...

Andy

Postby esmetaman » Sat Aug 14, 2010 7:49 pm

Hi Andy,

Yes, but imagine the same scenario you recorded in your video.

With the NXTCam you could design 3 beacons, each with a different color - for example orange (this year I found orange really useful for detecting objects):
http://www.roboticaenlaescuela.es/blog/ ... obot-leon/

With a single sensor you could:

Detect beacons
Detect other objects (e.g. other robots)

I am going to read your paper. Sorry, I didn't read it. :p
I think that with the latest firmware from Xander (http://www.youtube.com/watch?v=ZvPb8yztpbw) the blob detection is better than with the original firmware from Mindsensors. In my tests with the NXTCam V3 I used the original firmware. In my opinion the key is the color calibration with NXTCamView.

I have some NXC code that establishes the color patterns in code rather than using the NXTCamView tool, but I haven't ported it to leJOS yet. I think it could be a nice improvement to our NXTCam support for leJOS. Do you have code like that?

Anyway, the example using a laser (or any other kind of sensor) is really great because it shows the classical problem of odometry and accumulated error, which localization is able to solve.

Cheers
Juan Antonio Breña Moral
http://www.juanantonio.info/lejos-ebook/
https://github.com/jabrena/livingrobots
http://www.iloveneutrinos.com/

Postby gloomyandy » Sat Aug 14, 2010 10:25 pm

Hi,
The problem with using color detection is that it is really hard to get it to work in a real-world situation (with changing lighting and normal room contents). I don't have the space to set up a special test room for this sort of thing, and I really want my robots to be able to move over a reasonable distance. I suspect that the problem of the beacons varying in apparent size may also make them tricky to identify with an NXTCam. But who knows - it may be possible to make it work...

Andy
