autonomous guided vehicle control system

Hi,
I am considering making my partly completed project open source, but I do not wish to do this if no one else is interested. The benefits of open source
include more extensive testing, other opinions about design, and possibly help with programming.
The project is a control system for an autonomous guided vehicle. It is spread over several processors networked together over Ethernet and ACE/TAO CORBA. The basic system orientates itself with respect to known visual markers, which are observed by a number of cameras (typically 4, each on a separate computer). A SCAAT Kalman filter is used to work out the vehicle position in 3D (on a separate computer). Another computer controls and monitors the vehicle controls. Most of the computers run Linux, but some would be better run on RTEMS. OpenCV is used for the image processing. The system is written in C++ on GNU/Linux.
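To give a feel for what each camera node does, here is a rough sketch (not the project's actual code; the marker detection is only stubbed out, and in the SCAAT design the individual timestamped sightings would more likely be shipped to the filter than a full pose). It shows the conventional approach of detecting known markers and recovering the camera pose with OpenCV's solvePnP:

// Rough sketch of one camera node: detect known visual markers in a frame and
// recover the camera pose from their surveyed 3D positions with cv::solvePnP.
// Marker detection and the 2D-3D matching step are application specific and
// only stubbed out here.
#include <opencv2/opencv.hpp>
#include <vector>

// Placeholder: in the real system this would find the markers in the frame
// and pair each 2D detection with the surveyed 3D position of that marker.
void detectMarkers(const cv::Mat&,
                   std::vector<cv::Point2f>&,
                   std::vector<cv::Point3f>&) {}

int main()
{
    cv::VideoCapture cap(0);                       // one of the four cameras
    cv::Mat K = (cv::Mat_<double>(3, 3) <<         // intrinsics from calibration
                 800, 0, 320,  0, 800, 240,  0, 0, 1);
    cv::Mat dist = cv::Mat::zeros(5, 1, CV_64F);   // lens distortion coefficients

    cv::Mat frame, rvec, tvec;
    while (cap.read(frame)) {
        std::vector<cv::Point2f> imagePoints;
        std::vector<cv::Point3f> worldPoints;
        detectMarkers(frame, imagePoints, worldPoints);
        if (imagePoints.size() >= 4) {
            // Camera pose relative to the world frame of the markers.
            cv::solvePnP(worldPoints, imagePoints, K, dist, rvec, tvec);
            // Each sighting (or the pose) would then be timestamped and sent
            // over CORBA to the computer running the SCAAT filter.
        }
    }
}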
People who want to work on autonomous robots (perhaps by remembering interesting features in the environment) could use this system as a base on which to add intelligence. Systems that used laser, radar or sonar instead of cameras could probably swap those sensors in with little change to the rest of the system.
Is there a similar project out there? Is anyone interested? Any comments, questions, suggestions or flames are welcome.
Jack
Sure, load it up on SourceForge and tell the DARPA Grand Challenge people about it.
            John Nagle
Jack wrote:

Hi John,
Does everyone in this group repost everything when replying? Anyway...
Thanks for your encouraging response; however, I am looking for something more. So far I have received one expression of interest. Perhaps I should explain a bit more about the project.
I am making a distinction between AUTOMATED and AUTONOMOUS.
By automated I mean that the AGV will work in an environment that it knows about already, probably a structured environment. This is much easier to specify and test, and hence is an ideal environment for developing and testing the navigation and control parts of the system.
In contrast, autonomous guided vehicles must work in new environments and still achieve their goals. These AGVs are much harder to specify and test, but they will contain control systems similar to the automated system, with additional higher-level functionality and probably more sensors.
Hence I hoped that a shared project to robustly implement the automated part would be of general interest. In my opinion this automated part should consist of:
- CORBA interfaces (ACE/TAO)
- a SCAAT Kalman filter (Greg Welch, 1996)
- a hardware interface computer, controlling servos and motors, and reading shaft encoders
- interfaces to environment sensors (cameras, radar, sonar, GPS, laser, whatever).
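To make that list a bit more concrete, the sort of component boundaries I have in mind would look roughly like the C++ below. The real interfaces would be defined in CORBA IDL and compiled for ACE/TAO; the names and fields here are only illustrative guesses:

// Guessed C++ shape of the component boundaries listed above.  In the real
// system these would be CORBA IDL interfaces; names are illustrative only.

// One timestamped sighting of a known marker from one sensor (camera, radar,
// sonar, GPS, laser...).
struct Observation {
    double timestamp;   // seconds, on a clock shared across nodes
    int    sensorId;    // which camera / sensor produced it
    int    markerId;    // which surveyed marker was seen
    double u, v;        // the measurement (e.g. image coordinates)
};

// Environment sensor nodes push observations to the estimator.
class PoseEstimator {                 // the SCAAT Kalman filter computer
public:
    virtual ~PoseEstimator() = default;
    virtual void submit(const Observation& obs) = 0;
};

// The hardware interface computer: servos, motors, shaft encoders.
class VehicleInterface {
public:
    virtual ~VehicleInterface() = default;
    virtual void setSteering(double radians)  = 0;
    virtual void setThrottle(double fraction) = 0;
    virtual void setBrake(double fraction)    = 0;
    virtual double readOdometry() const       = 0;  // from shaft encoders
};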
Since I am already using cameras, they seem a sensible default environment sensor for test purposes (the tests are in a structured environment, so the image processing is straightforward).
Jack Cawkwell

Jack, the way I see it, the autonomous part is very difficult. You can try to understand it from here: http://www.ecu.pwp.blueyonder.co.uk/new/darpagrandchallenge.htm My problem is that I have built robot vehicles, but I can't predict what the grip will be 5 feet away from where I'm parked. I have built vehicles with smooth wheels that ride on a polished surface - but do they corner properly? No! That got me thinking about the wider problem of computing the grip for a vehicle in the DARPA challenge. It seems to be a fundamental error to try to compute cornering grip for a speeding vehicle. (We can always drive slowly, but that is not competitive.)
I'm flummoxed as to what to do about it. I can make a robot, and I can put in sensors, but I can't solve that one final problem.
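To put a rough number on why it is hard: the friction coefficient the tyres must supply to hold a corner is about v^2 / (g * r), and that coefficient is exactly what you can't measure ahead of the vehicle. A toy calculation (made-up numbers, not from any real run):

// Why cornering grip is hard to pre-compute: the friction needed to hold a
// corner of radius r at speed v is roughly v*v / (g*r), and the available
// friction depends on a surface you haven't reached yet.  Illustrative only.
#include <cstdio>

int main()
{
    const double g = 9.81;            // m/s^2
    const double v = 60 * 0.447;      // 60 mph in m/s
    const double r = 100.0;           // corner radius, metres
    const double muNeeded = v * v / (g * r);
    std::printf("need mu >= %.2f\n", muNeeded);  // ~0.73: fine on dry tarmac,
                                                 // marginal in the wet,
                                                 // hopeless on gravel
    return 0;
}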
Anyway, I'm certainly interested in the image processing part of the autonomous vehicle. I have experience of high-speed neural nets working on systems that recognise number plates on speeding cars and people's faces. It doesn't need structured lighting, so I'm bringing to the table something you may not have. The robot bits are easy - I have tools for making custom gearboxes and mechanical prototypes in as short a period as one day.
Over to you - what do you want to do?
J
Jack wrote:

J e7,

Yes, this is why I want to separate out the easier automated part, common to both automated and autonomous systems, leaving the difficult research parts to experimenters, who won't have to waste time on the automated part. (The subject of the posting should have been 'automated', not 'autonomous'; I mistyped it.)

I have very limited experience of actual hardware; my target vehicle is a car or bus with rubber tyres on a tarmac road, so I am not expecting any problems with grip. *BUT*, if you use the SCAAT Kalman filter (SK) with a PV state vector (that is, Position and Velocity), then you do not need to model vehicle motion separately. If it slides about, the SK PV state vector will reflect this, provided the sensors feeding the SK are quick enough. I guess you need to model the vehicle dynamics simply, just enough to move the steering, brake and accelerator to get the vehicle where you want from where the sensors say you are. Obviously if you cannot steer the thing because there is no grip you are lost, but if the steering just needs a lot of correction, then it might work (certainly better than any other method).
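As a rough sketch of what I mean by a simple model (not the project's actual controller, and the gains are made up), steering proportionally towards the target from wherever the filter says we are would look something like this:

// Minimal sketch of "steer towards the target from the estimated pose":
// a proportional correction on heading error, letting the estimator absorb
// any sliding.  Gains, limits and the Pose layout are illustrative only.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Pose { double x, y, heading; };      // as reported by the SCAAT filter

// Returns a steering command in radians, clamped to the hardware limit.
double steerTowards(const Pose& est, double targetX, double targetY,
                    double gain = 1.0, double maxSteer = 0.5)
{
    const double kPi = 3.14159265358979;
    const double desiredHeading = std::atan2(targetY - est.y, targetX - est.x);
    double err = desiredHeading - est.heading;
    while (err >  kPi) err -= 2 * kPi;      // wrap the error to [-pi, pi]
    while (err < -kPi) err += 2 * kPi;
    return std::clamp(gain * err, -maxSteer, maxSteer);
}

int main()
{
    Pose est{0.0, 0.0, 0.0};
    const double cmd = steerTowards(est, 10.0, 2.0);  // target ahead and to the left
    std::printf("steer %.3f rad\n", cmd);
}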
Greg Welch's example SK application is a hat that tracks the motion of a person's head using cameras sighting LEDs in the ceiling. That is, the motion is unpredictable; there can be no model outside of the SK. As a rough guide, my SK filter can currently process about 100 observations per second on a 600 MHz processor. I expect to get perhaps as many as 800 observations per second from 4 cameras running at 25 fps (that is, several marker sightings per frame), so the SK should track the vehicle motion quite accurately. I am aiming to test at 60 mph and operate at up to 30 mph, initially.
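To make the PV idea concrete, here is a toy one-axis version of the predict/update skeleton, folding in one scalar observation at a time in the SCAAT style. The real filter (after Welch's SCAAT work) carries the full 3D pose and per-camera measurement models; the constants below are made up:

// Toy one-axis PV filter: state is [position, velocity], constant-velocity
// process model, one scalar position observation folded in at a time.
#include <cstdio>

struct PVFilter {
    double x[2]    = {0, 0};                 // position, velocity
    double P[2][2] = {{1, 0}, {0, 1}};       // state covariance
    double q       = 0.5;                    // process noise (simplified)

    // Predict the state forward by dt (time since the last observation).
    void predict(double dt) {
        x[0] += dt * x[1];
        // P = F P F^T + Q for F = [[1, dt], [0, 1]], Q taken as diagonal here.
        const double p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1];
        const double p01 = P[0][1] + dt * P[1][1];
        const double p10 = P[1][0] + dt * P[1][1];
        P[0][0] = p00 + q * dt;  P[0][1] = p01;
        P[1][0] = p10;           P[1][1] = P[1][1] + q * dt;
    }

    // Fold in one scalar position observation z with variance r (H = [1, 0]).
    void update(double z, double r) {
        const double s  = P[0][0] + r;       // innovation variance
        const double k0 = P[0][0] / s;       // Kalman gain
        const double k1 = P[1][0] / s;
        const double innov = z - x[0];
        x[0] += k0 * innov;
        x[1] += k1 * innov;
        const double p00 = P[0][0], p01 = P[0][1];
        P[0][0] -= k0 * p00;  P[0][1] -= k0 * p01;
        P[1][0] -= k1 * p00;  P[1][1] -= k1 * p01;
    }
};

int main() {
    PVFilter f;
    // Roughly 100 observations per second, as above: predict 10 ms, then update.
    for (int i = 0; i < 100; ++i) {
        f.predict(0.01);
        f.update(0.3 * i * 0.01, 0.05);      // fake sightings of a 0.3 m/s target
    }
    std::printf("pos=%.3f vel=%.3f\n", f.x[0], f.x[1]);
}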

This is all good stuff for the autonomous part. The project could include both - why not?

This is probably heresy in this newsgroup ;-), but I am hoping to buy a vehicle and get a mechanic friend to put on the controls. The project could include hardware designs as well.

I will think about it for a day or two. If I decide to go open source, then I will start to post stuff on a web site, and let people know about it. More expressions of interest would help me decide.
Jack
