Robots: sharing ideas

I'm building a robot and I think I have a few ideas that are cool. I'd like any feedback ranging from "wow! great idea" to "dude, that's the lamest
thing I ever heard."
First of all, I'm using Linux. The reason is that it is a more transparent and flexible environment than Windows and is easier to configure for a specific task. Also, it comes with all the development tools you would ever need. Lastly, all of that is free. I could use FreeBSD or even NetBSD, but the development momentum is very much on the Linux side of things.
Using a mini-ITX motherboard because it is low power, cheap, has most of the peripherals on-board, and has TV-out that I can use with a cheap LCD TV screen.
Using the wheels and motors from a child's ride-on toy. This is probably temporary, as there is a lot of play in the plastic gear train, but it was cheap and it should work for now.
Using a small 12V ATX power supply to run the computer motherboard.
Using an RF wireless mouse/keyboard for the robot, but I may simply abandon them for a network-only interface.
Using 5V and 12V switching power supplies (9V-18V input) for other peripherals. (www.jameco.com)
Using a Linksys wireless access point (WAP) for communications. The ITX motherboard has an Ethernet port but no wireless. I am using the WAP for now; I may get a Linksys wireless router, because there are a few projects that let you modify its internal software so it can be used as a Linux machine. It may be useful for adding more processing power.
Using a Velleman K-8000 for I/O. It was a bit pricey (~$120). I had to modify the circuit to run on a single external power supply. It uses a serial I2C (I-squared-C) interface off the parallel port, which leaves the standard I/O lines on the port available for further expansion.
I built my own H-bridge PWM circuit using a few op-amps, comparators, and MOSFETs. Basically, an oscillator produces a linear ramp signal, and the K-8000 produces an analog voltage that represents power. The analog voltage is fed into a comparator along with the ramp signal and compared against the ramp: the higher the voltage, the wider the pulse width.
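In software terms, the comparator stage boils down to something like this (a toy model of the circuit, assuming the ramp runs 0-5V; the numbers are illustrative, not measured):

/* Toy model of the ramp-comparator PWM: the output stays high while the
 * control voltage is above the ramp, so duty cycle tracks the control voltage. */
#include <stdio.h>

#define VRAMP_PEAK 5.0   /* assumed peak of the linear ramp, in volts */

static double duty_cycle(double v_control)
{
    if (v_control <= 0.0)
        return 0.0;
    if (v_control >= VRAMP_PEAK)
        return 1.0;
    return v_control / VRAMP_PEAK;   /* fraction of each period spent "on" */
}

int main(void)
{
    for (double v = 0.0; v <= 5.0; v += 1.0)
        printf("Vcontrol = %.1f V -> duty = %3.0f%%\n", v, 100.0 * duty_cycle(v));
    return 0;
}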
I'm currently working on a few motor feedback circuits. Not sure how I'm going to do it.
On Linux there is a program called GnomeMeeting, a two-way video conferencing program. I was thinking of using it, along with the wireless network and a USB video camera, to carry video from the robot to a stationary computer (or laptop).
Since GnomeMeeting is open source and free, I may insert my own code into the video stream and try to process the images to do some vision processing.
Here is my robot so far: http://64.46.156.80/robot/

[snip]
My robot runs Linux also. I wanted to spend more time writing robot software than fighting the operating system.

Several members of our club use mini-ITX boards, and I run something very similar.

The above items are also popular at our club. I think one member used an IR mouse/keyboard rig with similar success.

I use DC/DC converters (basically an encapsulated switcher). Some are up to 90% efficient.

I just plugged a PCI wireless card into the motherboard.
[snip]

I have used xawtv to display output of the robot's webcam on a notebook. The notebook is only 400MHz, so the X server is not fast enough to make the video very useful.
Jeff.
--
Jeff Shirley
snipped-for-privacy@mindspring.com
Jeff Shirley wrote:

Ain't it the truth!

I didn't like I/R because of the whole line-of-sight issue, but they have better range than R/F (if aimed :-)

Yeah, my switching power supplies are DC/DC converters; they are the same thing.

Yeah, that is nice, but I'm actually curious about using a Linksys wireless router with one of the Linux ROMs. I'm wondering if I can put some network interfacing in the router, as well as a dead-man alert: if the router detects that the computer is no longer responding, maybe it can do something.

There are a number of packages that will transmit a digital video stream. If your system is fast enough, it may even use MPEG compression. (A really good camera may already provide an MPEG stream.) That will reduce your bandwidth.
At the lowest level, rip apart xawtv, lop off the display, keep the reader part, and send its output out on a socket. Then take the display part and have it read the socket as if it were the video device.
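A bare-bones version of that idea might look like the following (this assumes a V4L device that supports plain read(), and that something on the base station is already listening; the address and port are placeholders, and error handling is minimal):

/* Crude frame pump: read raw data from the capture device and push it down
 * a TCP socket. The display side reads the socket instead of /dev/video0. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <arpa/inet.h>
#include <sys/socket.h>

int main(void)
{
    int vid  = open("/dev/video0", O_RDONLY);           /* capture device */
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in dst;

    memset(&dst, 0, sizeof dst);
    dst.sin_family = AF_INET;
    dst.sin_port   = htons(5000);                        /* arbitrary port */
    inet_pton(AF_INET, "192.168.1.10", &dst.sin_addr);   /* base station (placeholder) */

    if (vid < 0 || sock < 0 ||
        connect(sock, (struct sockaddr *)&dst, sizeof dst) < 0) {
        perror("setup");
        return 1;
    }

    char buf[65536];
    ssize_t n;
    while ((n = read(vid, buf, sizeof buf)) > 0)         /* raw video, no compression */
        if (write(sock, buf, (size_t)n) != n)
            break;

    close(vid);
    close(sock);
    return 0;
}

Compressing before the write (or letting an MPEG-capable camera do it, as mentioned above) is what keeps the bandwidth sane.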
"mlw" wrote

I'm in exactly the same spot as you: I'm starting the construction of a mobile robot and researching what is going to be the "brain" of my robot. So far I had thought about serial wireless modems (MaxStream), PDAs (Compaq iPAQ), tablet PCs (HP TC1000), and so on, but reading your post, it makes sense to use single-board computers. I want the computing power of a desktop PC in the rover, but with the size/weight of a microcontroller board and reasonable power consumption. Tablet PCs provide that, but with lots of extra components (more weight, more power consumption) that are not necessary for the rover's computations.

Well, I don't like Linux. I think it's personal; it's more prejudice than technical, I don't know. But if it proves MUCH MUCH better than Windows, then I'll probably give it a shot.

Which one are you using? I saw this one in the Mouser catalog: http://www.arbor.com.tw/ETX/EmETX-i701.htm (Intel Pentium M 1.4GHz, 11x9cm, 0.8kg). Not bad, huh?

I'm using an RC monster truck. I will test both electric and nitro powered.

I'm thinking about using battery packs that are used in RC cars (usually 7.2V), they are small, rechargeable, and they don't expect to be recharged while in use.

My goal is to have the robot completely autonomous, no interface at all (except emergency stop). But I may use a wireless network card to do online monitoring of the system and why not teleoperation?
<snipped>

I'm developing a PID controller for a DC motor right now (for school), and I want to avoid it for my rover; that's why I'm going to use an RC car, where everything is already built in. If it is electric, I just have to send a 5V PWM signal to the electric drive. Same thing for the nitro version: a PWM signal to the throttle control servo. (I'm working on suspension and gear reduction; these little things are velocity beasts!)

Why not do the video processing in loco, on board? See, this is the main reason why I'm avoiding wireless. If my robot had only low-throughput sensors (encoders, GPS, accelerometers, gyros, etc.), then I'd go for the completely wireless solution, but because of the video stream (too much data to go wireless), I'm leaning towards putting all the processing power in the rover.

Nice one! Congrats!
I'll start my robot-construction blog very soon, I'll post it here when I do it.
Cheers
Padu wrote:

It just makes sense to use a small PC motherboard, the mini-ITX jobs are great.

I have a lot of experience with both Linux and Windows. I actually published a couple of articles on Windows programming, including 95/NT device drivers.
I can tell you from a technical perspective, if you want to do something cool with your robot, you'll want to use an open source unix. Windows is just a royal PITA to do this sort of thing on.
For instance, on unix I can write application code that talks directly to hardware. It has to be marked with special permissions, but it can be done. You can't do that in NT (XP).
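As a sketch of what I mean (this needs root, and 0x378 is only the usual LPT1 base address, so check your machine), user space can grab an I/O port range and bang on it directly:

/* Minimal user-space port I/O on Linux/x86: grant access with ioperm(),
 * then write the parallel port data register directly. Run as root. */
#include <stdio.h>
#include <unistd.h>
#include <sys/io.h>

#define LPT1_BASE 0x378                  /* typical LPT1 data register */

int main(void)
{
    if (ioperm(LPT1_BASE, 3, 1) < 0) {   /* request access to 3 ports */
        perror("ioperm");
        return 1;
    }
    outb(0xAA, LPT1_BASE);               /* drive D0..D7 with 10101010 */
    ioperm(LPT1_BASE, 3, 0);             /* drop the access when done */
    return 0;
}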
Besides, all the tools, the OS, everything is free. Anyway, I would strongly recommend Linux.

I'm using an EPIA something right now, it is a year old and slower.

The math says NiMH batteries have a very high energy density for the money. A small AA NiMH cell can hold 2.3 Ah (2300 mAh), which is simply amazing. If you can get your power requirements low enough, you should be able to use a few NiMH batteries.
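As a rough back-of-the-envelope (my own illustrative numbers, not measurements): ten 2300 mAh AA cells in series give about 12 V, and at a 2 A average draw that works out to roughly 2.3 Ah / 2 A = 1.15 hours of runtime. Halve the draw and you roughly double the runtime, which is why getting the power budget down matters so much.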
I'm using sealed gel lead-acid batteries, I can get them cheap and my robot is pretty big.

There are times when a direct interface is helpful. Every now and then, computers don't like to talk on the network, and you'll need a keyboard. That's why data centers have crash carts.

Are you using "stepper" type motors, where you send pulses to move specific amounts, or a servo system driven by PWM?

I've never heard of it.

802.11g has great bandwidth.

Well, yes, all the processing is on my robot. The point is that it is cool to get telemetry from the beast.
802.11g has pretty good bandwidth, and an MPEG-encoded image at video camera resolution is pretty manageable.


Well, that's not entirely true: there is a way to give user-level processes access to I/O registers. It requires a small kernel-mode component, but that doesn't mean you have to write it yourself. Many off-the-shelf solutions are available; I personally prefer winio (http://www.internals.com/). That can even give you access to the physical memory address space if you need it for some reason.
Regards, Andras Tantos
Andras Tantos wrote:

Well, I know of a lot of ways around WinNT's various "protections"; I was overly broad, obviously. One of my favorite kernel calls is ObReferenceObjectByHandle; it allows a user-space application to share interesting things like event handles with the kernel. Once in the kernel, one can do *anything*.
The problem is that most of what you *can* do is usually an artifact of poor design decisions or is obscurely documented, and mostly circulates by word of mouth or third-party web sites like Sysinternals. With Linux, or one of the BSDs, you have the source and you can see how it all works. No secrets.
"mlw" wrote <snipped>

Why would I want direct access to hardware on the PC side? To make things more modular, I'm creating a message architecture that sends all the information gathered on the rover out through a serial link (using microcontrollers). The only exception would be the camera system, which would be linked directly to the PC, but for that there are some DirectX libraries that access a video device without too many layers.
I want to be able to easily change "brains". If the only coupling point between the rover and the brain is a serial connection over a well-defined software protocol, then it doesn't matter whether the brain is Windows, Linux, a PDA, or even miles away, connected through some sort of wireless modem. Or is that a utopia?
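To make that concrete, the kind of framing I have in mind looks something like this (field names, sizes, and the checksum are placeholders, not a spec):

/* Sketch of a framed message for the rover <-> brain serial link:
 * a start byte, a message type, a payload length, the payload, and a checksum. */
#include <stdint.h>
#include <stddef.h>

#define MSG_START       0x7E   /* arbitrary start-of-frame marker */
#define MSG_MAX_PAYLOAD 32

struct rover_msg {
    uint8_t start;                     /* always MSG_START */
    uint8_t type;                      /* e.g. 0x01 = encoder ticks, 0x02 = sonar */
    uint8_t len;                       /* number of payload bytes that follow */
    uint8_t payload[MSG_MAX_PAYLOAD];
    uint8_t checksum;                  /* simple sum of type, len, and payload */
};

/* Checksum the receiver recomputes to validate a frame. */
static uint8_t msg_checksum(const struct rover_msg *m)
{
    uint8_t sum = m->type + m->len;
    for (size_t i = 0; i < m->len && i < MSG_MAX_PAYLOAD; i++)
        sum += m->payload[i];
    return sum;
}

As long as both ends agree on frames like these, the "brain" really can be anything with a serial port (or a wireless link carrying one).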
Padu wrote:

Well, I have been thinking about a number of issues along the lines of modularity. For the moment, I believe the single PC has enough CPU and I/O to handle the robot, but I have been thinking that I may separate the system into two computers: a low-level computer and a higher-level one. (Similar to the brain, no?)
Anyway, my EPIA is getting old; it is sub-1GHz, but it should be sufficient to handle the routine functions of the robot. If I want to get more sophisticated, I may need to add computational horsepower in the form of a faster computer.
I started thinking about how to do it, and how to develop the system such that any piece could run anywhere I wanted it to. This sounds like what you are planning.
Fortunately, this problem has already been solved! Are you familiar with any of the cluster computing projects? Basically, a loose computer cluster works with a messaging API (like the one you are working on). One of the more common ones is MPI (Message Passing Interface).
Basically, I will construct the higher-level "main" program as a controller, using MPI, and code the sub-tasks as MPI subroutines. Coded that way, I can start a virtual computer across a number of computers on the robot, and I can even run parts remotely on faster computers (assuming good R/F connectivity).
Take a look at http://www.lam-mpi.org/
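To give the flavor of it (a toy layout, not my actual code), rank 0 plays the controller and everything else runs a sub-task, wherever mpirun happens to have started it:

/* Toy MPI layout: rank 0 is the controller, every other rank runs one
 * sub-task (vision, motor control, ...) wherever it was launched. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        /* controller: hand each worker a task id */
        for (int worker = 1; worker < size; worker++)
            MPI_Send(&worker, 1, MPI_INT, worker, 0, MPI_COMM_WORLD);
        printf("controller: dispatched %d sub-tasks\n", size - 1);
    } else {
        int task;
        MPI_Recv(&task, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank %d: running sub-task %d\n", rank, task);
        /* ... the sub-task's own loop would live here ... */
    }

    MPI_Finalize();
    return 0;
}

With LAM (or any MPI), mpirun decides which machine each rank lands on, which is exactly the "run any piece anywhere" property.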
"mlw" wrote

Exactly. It seems like a good idea, but I've been thinking about implementing a few basic behaviors in hardware, in this "lower substrate". For example, collision detection: if the rover is about to collide, the microcontrollers would command it to stop right away. A few milliseconds later, the "brain" would realize why the rover stopped and then trigger some behaviors to avoid the obstacle. But I still don't know.
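Roughly what I have in mind for that lower substrate, as pseudo-firmware (the sensor and motor calls are placeholders for whatever the microcontroller actually exposes, and the threshold is made up):

/* Reflex loop on the microcontroller: stop on an imminent collision first,
 * then tell the brain why the rover stopped so it can plan around it. */
#define STOP_DISTANCE_CM 20           /* made-up threshold */

extern int  read_range_cm(void);      /* placeholder: front range sensor */
extern void motors_stop(void);        /* placeholder: kill the drive PWM */
extern void send_to_brain(char code); /* placeholder: one byte on the serial link */

void reflex_loop(void)
{
    for (;;) {
        if (read_range_cm() < STOP_DISTANCE_CM) {
            motors_stop();            /* the reflex acts immediately */
            send_to_brain('C');       /* 'C' = stopped to avoid a collision */
        }
        /* otherwise keep executing whatever the brain last commanded */
    }
}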

Loose coupling is almost always a good idea.

Great, I will take a look at that. I hate reinventing the wheel.

Thanks! I will look into it.
"mlw" wrote

Interesting. You may also want to take a look at www.omg.org (I guess that's the address; if not, Google CORBA). I've used CORBA in the past for systems interoperability. It is really nice since it is completely independent of the platform, unlike COM/DCOM/COM+. Just recently I heard that the OMG also has interoperability solutions for embedded and real-time systems. I don't know if it is CORBA (the OMG offers many solutions), but the good thing is that their technologies are very mature, and you can scale up as big as you want very easily.

Hi!
I'm doing the same thing (modifying an RC car platform to become a robot), but I've found that a PID loop is essential. The speed controllers for these cars are not much more than an H-bridge. You will be running at the lower end of the motors' RPM range even if you gear them down, and open-loop control performs miserably there. I ended up developing my own H-bridge with local intelligence for the PID loop. If you're interested, take a look here: http://h-storm.tantos.homedns.org/h-bridge.htm (and some wiki-like info on the hows and whys here: http://h-storm.tantos.homedns.org/news.htm ).
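For anyone who hasn't written one, the control loop itself is short. Something along these lines runs at a fixed rate off the encoder feedback (the gains and the 0..1 duty output are illustrative; you tune the real thing on the bench, and a real loop also wants anti-windup):

/* One fixed-rate update of a velocity PID loop driving a PWM duty value. */
struct pid {
    float kp, ki, kd;     /* gains, found by tuning */
    float integral;
    float prev_error;
};

/* target and measured are, say, encoder counts per period; returns duty 0..1 */
float pid_update(struct pid *c, float target, float measured, float dt)
{
    float error      = target - measured;
    float derivative = (error - c->prev_error) / dt;

    c->integral  += error * dt;
    c->prev_error = error;

    float out = c->kp * error + c->ki * c->integral + c->kd * derivative;

    if (out < 0.0f) out = 0.0f;       /* clamp to a valid duty cycle */
    if (out > 1.0f) out = 1.0f;
    return out;
}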
Regards, Andras Tantos
You have a good start. How far along are you with the software? I spent months putting my hardware together, using spare evening hours. I have spent ten times that on the software. Hardware is easy to do: it acts a certain way according to the datasheets, and instructions, diagrams, etc. are available. There are no unknowns.
When you start writing your software, that is where it gets fun and interesting. There are no datasheets, diagrams, etc. for your brain. How do you transform something we take for granted (walking around a house we've never been in) into computer logic?
You have to dissect the problem into tiny bits to be able to code it.
R
snipped-for-privacy@gmail.com wrote: [...]

That is called Good Old-Fashioned Artificial Intelligence, or GOFAI for short.
It is probably not how real organisms work, as it always remains programmer-dependent: the programmer has to find the bugs and devise the improvements that enable better ways of performing the tasks.
"JGCASEY" wrote

That's not completely true; behavioral robotics also works like this. You dissect the problem into little reactive modules that are independent of each other, and the sum of all these small "behaviors" makes the final desired behavior "emerge".
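The arbitration part of that can be tiny. A sketch with made-up behavior names, using a fixed priority order where the first module with something to say drives the motors:

/* Fixed-priority arbitration over a few reactive behaviors: the first
 * behavior (highest priority) that reports itself active wins. */
#include <stdio.h>

struct command { int active; int left_speed; int right_speed; };

/* Example behaviors, highest priority first; real ones would read sensors. */
static struct command avoid_collision(void) { return (struct command){0,  0,  0}; }
static struct command follow_wall(void)     { return (struct command){0, 40, 50}; }
static struct command wander(void)          { return (struct command){1, 50, 50}; }

int main(void)
{
    struct command (*behaviors[])(void) = { avoid_collision, follow_wall, wander };
    struct command cmd = {0, 0, 0};

    for (size_t i = 0; i < sizeof behaviors / sizeof behaviors[0]; i++) {
        cmd = behaviors[i]();
        if (cmd.active)               /* first active behavior wins */
            break;
    }
    printf("drive: left=%d right=%d\n", cmd.left_speed, cmd.right_speed);
    return 0;
}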
A side track:
How is behavioral robotics doing nowadays? There was a lot of work on it in the late '90s; people like R. Brooks and R. Arkin were doing a lot.
What's happening now besides Kismet? Or is Kismet not behavioral?
But it seems that it has more or less died down now.
Looking at SLAM, a lot of the work (if not all) is on probabilistic (generative) models.
Is behavioral passe?
Padu wrote:

snipped-for-privacy@gmail.com wrote:

To be honest, my first "real" job in technology was at Denning Mobile Robotics back in the early '80s.

Software can be hard, but so can hardware. Often I have the idea in my head but lack the tools to create it. (Don't get me wrong, I have a fair set of tools, ranging from a drill press to a 4-trace oscilloscope.)

Electronics, maybe. Robot hardware has mechanical qualities too: motor torque, speed, stresses, acceleration, etc. are all touchy-feely sorts of things.

Until your battery drops to 11.5 volts and things start getting flaky, or your PWM amplifier is dropping spikes on your power and causing your computer to shut down.

I've done most of the elements of the software before. I've done some vision systems, motor control, and stuff like that. I've even programmed mobile robots before (early '80s). There are some aspects that I am excited about and I want to get my platform built so I can work on the fun things.
mlw wrote: [...]

What do you call fun things?
What kind of vision systems have you done?
Vision systems have interested me for some years but I have found very little desire from others in "sharing ideas". Of course it is possible they have no ideas to share :)
The thing about robots is that there are really three skills required if you start from nothing: mechanics, electronics, and programming.
Due to time limitations I have made the choice to stick to the programming side and buy the hardware where possible.
Because we all use different hardware/OS configurations, sharing without needing to completely rewrite the software is difficult.
John Casey
JGCASEY wrote:

AI and navigation. Phased ultrasonic sensors. Stereo vision.

Just simple defect detection against a fairly constant background.

lol. I've seen that as well. Even the books on the subject seem to dodge any real substance.

Very true.

I HATE buying overly expensive hardware. I started out as a hardware technician, then an engineer, then moved to software, so I can do both. I was also a shop troll in high school, so I have a feel for the mechanics as well.

I really can't imagine how difficult my robot would be on Windows. I've done a fair number of NT device drivers, and while the environment is nice for simple things, any project that requires a reliable interrupt rate is screwed.

That's not true. There's a wealth of info (books, papers) on computer vision in general. There's lots of stuff on SLAM as well.
