landmark recognition within probability grid

I have 20x20 cell map segments
(ps thank god for the bomb)
how do I convert the coordinates to relative coordinates instead of absolute coordinates?
How do I match relative coordinates to current map information?
I have map information, IE what the robot has already explored.
the robot scans with sensors, gets sensor information, and updates its IMMEDIATE memory of what is around it (it has explored this area before... it just re-scanned an area it has already scanned).
It gets information about surrounding cells.
how to convert this information (absolute coordinates) into relative....
and how to convert previous exploration data into relative.....
and compare?
and relocate the robot within its memory?
I have this idea, convert map information (what has already been explored) into relative coordinates.
convert sensor data into relative coordinates.
compare the 2 relative coordinates.
but how do I match up the two relative coordinates before I start the "search for match"?
I can do the "search for match" with an AND in assembly language. I just want to match up previously explored map data with current sensor data.
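The absolute-to-relative conversion asked about above is just a subtraction of the robot's cell from each map cell, and the reverse is an addition. A minimal sketch (in Python rather than assembly; the function names are illustrative, not the poster's code):

```python
# Sketch: convert between absolute grid coordinates and coordinates
# relative to the robot's current cell.  Names are hypothetical.

def to_relative(abs_x, abs_y, robot_x, robot_y):
    """Absolute cell -> offset from the robot's cell."""
    return (abs_x - robot_x, abs_y - robot_y)

def to_absolute(rel_x, rel_y, robot_x, robot_y):
    """Offset from the robot's cell -> absolute cell."""
    return (rel_x + robot_x, rel_y + robot_y)

# a cell at absolute (12, 7) with the robot at (10, 10)
# is at relative offset (2, -3)
```

In assembly this is two subtractions per cell; doing it to both the explored-map segment and the fresh sensor grid puts them in the same robot-centred frame, which is the precondition for the AND comparison.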
Rich aiiadict AT hotmail DOT com rich1235.tripod.com
Rich J. wrote:

Remember that even if you scan from the exact same position twice, chances are the data won't be exactly the same, so you are probably going to have to come up with some sort of "best match" way of comparing the data.

when I can get the explored map data to align with the sensor data, I will AND the cells together and count the number of matching cells, ending up with a number that tells me how many cells in the sensor grid matched up with the map grid. if the sensor grid is 10x10 cells and has 5 obstacle points (cells with a high probability of being obstructed), and the program spits out a 90 after aligning it to the map data and comparing the two, it is likely that the grids match. 90 out of 100 cells matching is close.
it will be a percentage, and I will always just look for 80 percent or more matches. maybe I'll lower the percentage a bit.
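The cell-by-cell comparison and percentage described above can be sketched like this (Python for clarity; `match_percent`, the grid layout, and the 1/0 cell states are my assumptions, not the poster's code):

```python
# Sketch: overlay the sensor grid on the map at a given offset and
# score the agreement as a percentage.  Grids are lists of rows,
# each cell 1 (obstructed) or 0 (clear).

def match_percent(sensor, gmap, off_x, off_y):
    """Percentage of sensor cells whose state agrees with the map
    cells they land on when shifted by (off_x, off_y)."""
    h, w = len(sensor), len(sensor[0])
    agree = 0
    for y in range(h):
        for x in range(w):
            if sensor[y][x] == gmap[y + off_y][x + off_x]:
                agree += 1
    return 100.0 * agree / (w * h)
```

One note for the assembly version: a plain AND over packed rows only counts cells that are occupied in *both* grids; agreements on empty cells need the complemented rows ANDed as well (or an XNOR), otherwise a mostly-empty room scores near zero even on a perfect match.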
I need help figuring out how to align the sensor grid and map grids...
the sensor grid below shows what the robot just scanned with sonar.. the X represents where the robot is at compared to what he sensed.
the map grid shows where the robot THINKS he is on his map grid.
so he just took a sensor sweep. filled the sensor grid with data.
now he needs to realign himself within his map grid, IE adjust where he thinks he is.
he is a few feet off because the wheels slipped, my room-mate got mad and kicked him, the dog chewed on him a little.
he's already explored the house.
see the last illustration for what I want to do. this is fantasy data. sensor grid is never this clean. neither is the map grid. just trying to get the idea across. to see if sensor data matches map data you first have to get them aligned somehow.
I could detect lines within the map grid, then detect lines within the sensor grid, and match them up that way. it seems like someone would have a neat, clever way to do this.
Rich aiiadict AT hotmail DOT com http://rich12345.tripod.com
sensor grid:

**********
*        .     X = robot's position relative to the cells
* X      .     . = just sensed
*        .     * = obstructions
*.........     ... is just the edge of the grid
map grid:
*************************
*                       *
*  o                    *    X = where robot thinks he is according to
*                       *        odometers and drive motor activity
*                       *    o = where he is really at
*                       *
*          X            *
*                       *
*************************
map grid aligned with sensor grid:
XXXXXXXXX*****************
X       .                *    X = sensor grid data
X       .                *    * = map grid data
X........                *
*                        *
*                        *
**************************
snipped-for-privacy@hotmail.com (Rich J.) wrote in message

You might want to check out the localization paper on the Rossum project website...
http://rossum.sourceforge.net/papers/#r1
I'm not sure it's exactly what you are looking for, but it might give you some ideas.
Jeff Kroll http://home.pacbell.net/jkkroll
snipped-for-privacy@gmail.com (robotguy) wrote in message

Exactly what I was looking for. Very good, easy-to-understand information! it can all be easily coded in assembly.
thanks a bunch for sharing! with dead reckoning as input to the localization routine, the robot can guesstimate where it is fast! and with small amounts of memory.
I wonder what the author's probability grid cell resolution is.
thanks again,
Rich aiiadict AT hotmail DOT com http://rich12345.tripod.com
: Exactly what I was looking for. Very good easy to understand
: information! all can be easily coded in assembly.
At the risk of sounding like an idiot :-) Is there a pseudo-code version of a localization algorithm anywhere ?
While this paper is very straightforward, I don't really see how to implement it from this; the mathematical expressions in other papers remind me of the parts of math I had trouble with back in school.
--
==========================================================
Chris Candreva -- snipped-for-privacy@westnet.com -- (914) 967-7816
wrote:

pseudo-code would be nice.
Rich
snipped-for-privacy@hotmail.com (Rich J.) wrote in message

this is what I've been able to come up with so far. see http://rich12345.tripod.com/woz/local.html for the charts/ diagrams I was starting with.
watch out...this is just me rambling to myself, trying to figure it out in my own head and on paper.
from what I've been able to process so far:
the algorithm requires an orientation input, IE a compass or (dead reckoning with encoders, yuck). it doesn't provide for matching up local and global maps if you don't know what orientation you are in (compass direction), and/or if the global map data was updated in the past with the WRONG orientation data.
local map = small map you create after taking a sensor sweep. this is a probability grid.
global map = map data...this is the road-map the robot has built from past experiences of roaming around. it is a probability grid, each cell representing X inches in width, Y inches in height. the data stored at each X,Y represents the probability that the cell is occupied by an obstruction.
feasibility grid = a grid to store the 'feasibility' that the robot occupies the cells within it.
"global" map data (IE, areas that have already been explored) was processed and stored using the bearing of the compass as a reference. "global" map data therefore has the inaccuracy of the compass bearing, and inaccuracy of the sonar readings.
"local" map data, IE sensor sweep data also has the inaccuracy of compass bearing and sonar.
to add unexplored areas to map:
1) when you reach an area that is unexplored, check how long it has been since you localized... if it is recent, go to step 3
2) back up, localize, and drive forward again to where the unexplored area is.
3) scan the unexplored area, update map (see scan/interpret sonar/update map section)
to localize in previously explored areas:
1) scan with your sonar. for every sonar measurement taken, store 1) the distance of the echo received and 2) the angle at which the reading was taken (this will be the compass direction that the sonar was pointing in when the reading was initiated).
2) interpret sonar and make local map
- the sonar beam spreads out in a "cone" shape, IE the beam scatters wider as the distance from the transmitting transducer increases. this means that an echo at 2 feet away could have come from any number of cells.
- start with reading one of the sonar sweep. use the formula (don't have it yet) to update the probability (in the local, or sensor sweep, grid) of the cells that that echo could have come from. also, zero (IE, mark as unoccupied) the cells between the sonar sensor and the cells that the echo could have come from*. go through each echo you got from your sonar scan, increasing the probability of the cells you may have got an echo from, decreasing (or zeroing?)* the cells between the sensor and the echo points.
*if you get an echo at 2 feet, that means there is no obstruction between the sonar sensor and two feet out. this is why we zero the cells in between.
- convert the cells in the local map into relative coordinates.
(see local map diagram)
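Step 2 above, simplified to a single ray per reading (so ignoring the cone spread the poster describes), might look like this in Python. The 0..100 probability scale, the hit increment of 20, and the function name are all assumptions for illustration:

```python
# Sketch: update a local probability grid from one sonar echo.
# Treats the beam as a single ray (no cone spread); cells hold an
# occupancy probability 0..100.  All names/values are illustrative.
import math

def apply_echo(grid, sx, sy, bearing_deg, dist_cells, hit_inc=20):
    """Walk outward from the sensor cell (sx, sy) along the compass
    bearing.  Cells before the echo distance are marked empty (the
    beam passed through them); the cell at the echo distance gets
    its occupancy probability bumped.  Returns the echo cell."""
    rad = math.radians(bearing_deg)
    dx, dy = math.cos(rad), math.sin(rad)
    for step in range(1, dist_cells):
        cx = int(round(sx + dx * step))
        cy = int(round(sy + dy * step))
        grid[cy][cx] = 0                      # beam passed: mark empty
    ex = int(round(sx + dx * dist_cells))
    ey = int(round(sy + dy * dist_cells))
    grid[ey][ex] = min(100, grid[ey][ex] + hit_inc)   # echo: more likely occupied
    return (ex, ey)
```

Handling the cone properly would mean bumping every cell on the arc at the echo distance (each a little, since the echo could be any of them), which is the part the poster's missing formula would cover.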
3) initialize a "feasibility" grid. this will be the width/height of the area of the "global" map that you want to try to match your "local" map (sensor readings) to. the feasibility grid is X cells wide by Y cells tall, and each cell holds the feasibility that the robot is currently occupying that cell. the cells within the feasibility grid are initialized to 0.
4) start at cell 0,0 in the area of the global map which you want to match your sensor readings (local map) to. variables: GX=0, GY=0
4.1) start at the first coordinate of the local map, IE the cell at bottom left (-3,0 in the diagram).
variables: LX = -3, LY = 0
B = LOCALMAP(LX,LY)    ;B = state of cell at LX,LY in local map
for LX = -3 to 3    ;loop through local X coords
for LY = 0 to 6    ;loop through local Y coords
4.1a) does the cell we are looking at in the global map have the same data as the local map, in the same position? to check this we take the X,Y offsets of the local cell (LX,LY) and add them to the coordinates of the global map cell (GX,GY):
CHECKX = GX+LX: CHECKY = GY+LY
A = GLOBALMAP(CHECKX,CHECKY)
(A = data at cell CHECKX,CHECKY in global map.) does A = B?
-yes: 4.1a.1)
4.1a.1) on the line from pt. (GX,GY) to pt. (CHECKX,CHECKY) in the global map, are there occupied cells between the endpoints?
-no: increase the cell in the feasibility grid which corresponds to this global map cell:
feasiblegrid(GX,GY) = feasiblegrid(GX,GY) + WEIGHT
(WEIGHT is the increase you want to add to the feasibility that this cell (GX,GY) is the one that the robot is at.)
-yes: decrease feasiblegrid(GX,GY) by WEIGHT
-no: decrease the number in the feasibility grid:
feasiblegrid(GX,GY) = feasiblegrid(GX,GY) - WEIGHT
4.1b) go to the next cell in the local map. this involves incrementing the X or Y, with logic to loop through all X's for each Y:
next LY
next LX
now you have to loop back up and check the next spot on the global map .
maybe 4 above should read:
for GX = 0 to (width of area robot could be)
for GY = 0 to ("height" of where robot could be)
<do 4.1>
next GY
next GX
now you've compared each cell where the robot COULD be to the sensor data. scan through the feasibility X,Y's and find the highest number. that is where the robot should be according to this algorithm.
variables:
HIGHESTFEASIBLE = 0    (the highest amount found in the feasible grid so far)
FX = X coordinate on feasible grid with highest amount
FY = Y coordinate on feasible grid with highest amount
for X = 0 to (width of feasible grid)
for Y = 0 to (height of feasible grid)
if feasiblegrid(X,Y) > HIGHESTFEASIBLE then HIGHESTFEASIBLE = feasiblegrid(X,Y): FX = X: FY = Y
next Y
next X
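Steps 3 through the highest-feasibility search above can be sketched as one brute-force routine (Python; the dict-of-offsets representation for the local map is my assumption, chosen so negative offsets like (-3,0) work naturally, and the line-of-sight check of 4.1a.1 is omitted for brevity):

```python
# Sketch of steps 3-5: score every candidate robot cell in a search
# window of the global map against the local (sensor) map, then pick
# the most feasible cell.  Names and data layout are illustrative.

def localize(local_map, global_map, search_w, search_h, weight=1):
    """local_map: dict {(lx, ly): state} of robot-relative offsets.
    global_map: 2-D list of cell states.
    Returns ((best_gx, best_gy), feasibility_grid)."""
    feasible = [[0] * search_w for _ in range(search_h)]
    best_score, best_xy = None, (0, 0)
    for gy in range(search_h):          # candidate robot cell (GX, GY)
        for gx in range(search_w):
            score = 0
            for (lx, ly), b in local_map.items():
                cx, cy = gx + lx, gy + ly       # CHECKX, CHECKY
                if 0 <= cx < len(global_map[0]) and 0 <= cy < len(global_map):
                    # agree: add WEIGHT; disagree: subtract WEIGHT
                    score += weight if global_map[cy][cx] == b else -weight
            feasible[gy][gx] = score
            if best_score is None or score > best_score:
                best_score, best_xy = score, (gx, gy)
    return best_xy, feasible
```

This is O(window area x local map area), which is why the poster wants to seed the search window from dead reckoning rather than scan the whole house.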
to allow for sloppiness (IE, to match local data to global data when the global map has inaccurate compass readings associated with it, and the local map has the same inaccuracies) you can search in a circle around the spot where the local data says there is an obstruction...
local relative coordinate = +2,+6
within "search for match" (4.1a) you would
compare the pts (+2,+6) on local map to global map first, as you usually would,
but also compare the local map data to the cells surrounding the supposed corresponding cell on the global map:

------   --------   ------
|+1,7|   | +2,7 |   |+3,7|
------\  ---|----  /------
       \    |     /
------  \==/ \==/   ------
|+1,6|---<|+2,6|>---|+3,6|
------  /==\ /==\   ------
       /    |     \
------   --------   ------
|+1,5|   | +2,5 |   |+3,5|
------   --------   ------
this way you could possibly get a better match between the global map and the local map even when the two maps contain inaccurate data.
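The neighborhood search drawn above — accept a match if any cell around the expected global cell agrees — could look like this (Python; a hedged sketch, not the poster's code; the square radius-1 window stands in for the "circle" mentioned):

```python
# Sketch: sloppy version of the 4.1a comparison.  Instead of
# requiring the exact global cell (cx, cy) to hold the local value,
# accept a match if any cell within `radius` of it does.  This
# absorbs small compass/sonar errors at the cost of false matches.

def sloppy_match(local_val, gmap, cx, cy, radius=1):
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            nx, ny = cx + dx, cy + dy
            if 0 <= nx < len(gmap[0]) and 0 <= ny < len(gmap):
                if gmap[ny][nx] == local_val:
                    return True
    return False
```

A refinement would be to add a smaller WEIGHT for a neighbor match than for an exact match, so precise alignments still score higher than sloppy ones.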
Rich aiiadict AT hotmail DOT com http://rich12345.tripod.com

Polytechforum.com is a website by engineers for engineers. It is not affiliated with any of manufacturers or vendors discussed here. All logos and trade names are the property of their respective owners.