Now that's funny.
Actually we don't need different news groups, but the microcontroller guys
need therapy. They remind me of guys who drive hopped-up Japanese cars. OK,
sure you *can* get a few hundred horse out of a 4-cylinder engine, but it
takes a lot of work and nitrous. A bored-out Chevy small block with a good
set of heads and fuel injection will easily produce 400 horse.
All I've ever said is that I don't want to use them and that they are limited.
Both statements are inarguable, and what followed continues to confound me.
Micros are neat because they are understandable
by a single mind, in a lifetime. You can optimize
your own assembly. You can know exactly what
is going on with all of the micro all the time. Register
states, reads, writes, status bytes, and absolute maximum
interrupt-handler latency are all available to
the programmer. Brute force (in the form of CPU
MHz) coding will get things done, but you might
not understand exactly what is going on inside.
It is a preference for me to know exactly what
a machine is doing. My mind wants to know
everything that is going on, and to understand it.
Some people just enjoy programming. The
ability to make a processor/computer do exactly
what you tell it is part of the enjoyment.
If you want to relate it to a car: your 2005 vehicle
breaks down. You cannot find the problem because
there are undocumented black boxes, a computer
which is undocumented, and no schematic of the
car available.
Your 1970 vehicle breaks down. You have a full
schematic and can trace what is going on.
I enjoy programming both. A modern PC
has microcontrollers built into it. They allow the
CPU to execute applications and the OS instead
of counting milliseconds and watching address/data
lines. Examples: serial and USB port buffers,
the IDE drive, the graphics card, etc.
The PC wouldn't even operate if it had to
execute instructions to directly control the
IDE drive (turn H-bridge on. disc spins.
read encoder. too fast. slow PWM.
too slow. increase PWM. too fast. move
head, read encoder. too far. hey,
supposed to be reading the parallel port.
go read the parallel port. now drive speed
is too slow. head needs to be moved.
but USB needs to be checked in case there
is data on the line. oops, mouse could
have moved, so go read it. IDE drive
#2 that is being written to needs to
have its head moved. check its speed.)
You cannot achieve parallel hardware
operation with a standard PC without
each hardware device having its own
microcontroller (or buffer, which is a
very simple, single-purpose micro).
You have too many things going on
to get them done with one CPU.
You nailed it on the head, Rich. These choices are preferences, for any
number of reasons. MLW started this thread with a presumptuous statement
about "hardware geeks" -- "The guys that would do their robot with many
small PICS tend to be hardware geeks that don't like software all that
much." Does this mean software geeks don't like hardware? He doesn't say.
Not everyone understands that for 99.9% of people here, robotics is a
hobby, not a business. So it's all about preferences and choice and
personal enjoyment. That means diversity in ideas and approach. But now
we're being pigeonholed into some kind of partisan mindset, like the
OS-wars crap that dominates so many newsgroups. Sad.
This is a very interesting position. It reminds me of the classical physics vs.
particle physics debate, and Einstein's quote "God does not play dice." How
much predictability is there? How much do you need?
There are some serious limits to what you can know. You can never
guarantee there are no bugs. Even if the software is perfect, the hardware
may have issues. It is best to design around things not being as you expect.
That is part of it. True.
Actually, that isn't technically true. (I'm sort of a car buff). There are
code readers that can tell you what is wrong. As for the undocumented
issues, there are DOT regulations, and they are working on laws to make
this less of an issue.
That may be true, but just as you need a code reader with a newer vehicle,
you need some tools to work on an old carburetor. How would you fix an old
brass float that has holes in it?
Again, I'm not making the argument that microcontrollers are not valuable. I never have.
My position on microcontrollers, as I have tried to articulate multiple
times and in multiple ways, is that as long as the functionality is well
defined and constrained, then sure. Keyboards, microwaves, etc., are all perfect
examples of where and how to use a micro.
Keyboard: matrix of switches. PC keyboard: matrix of switches
connected to a small micro. The micro encodes the input of the
switches into a binary number. This number
is buffered. The micro signals the PC, "DATA READY".
Without a micro: with a 101-key keyboard, you have 101 bits
of information to read. Solder these to the CPU data lines.
OK, now you have to scan to see which key is pressed:

for i = 1 to 101
    read DATA line(i)
    if DATA line(i) = 1 then (you have a keypress)
next i
You just used 101 pins off your CPU. Your CPU
has to scan 101 bits of data (probably have to
load from port, then AND with the proper mask
to find status of bit). This has to happen with
"multitasking" in operation, the IDE drive is
reading, the ethernet card is communicating,
the graphics card is updating.
To get input/output to a CPU you need sub-CPUs:
slaves to the CENTRAL processor.
"As long as the functionality is well defined and constrained"
doesn't mean much to me. How can a function NOT
be well defined? How can a micro NOT be constrained?
It only does what you program it to do.
1) encode 101 bits into 8.
2) buffer the encoded data.
3) signal DATA READY.
4a) reply back to the PC with the requested data.
4b) act on a PC command (turn on the Num Lock light, flush the buffer, etc.).
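To make step 1 concrete, here is a minimal sketch in C of encoding the 101 switch lines (scanned into 13 bytes of port data) down to a single byte. The function name and data layout are my own illustration, not any real keyboard controller's firmware:

```c
#include <stdint.h>

/* Hypothetical sketch: return the first pressed key in a 101-switch
 * matrix as an 8-bit code, or 0xFF when no key is down.  rows[] holds
 * the 13 bytes of scanned switch-line data. */
uint8_t encode_matrix(const uint8_t rows[13])
{
    for (int i = 0; i < 101; i++) {
        if (rows[i / 8] & (1u << (i % 8)))  /* test one switch line */
            return (uint8_t)i;              /* key number fits in 8 bits */
    }
    return 0xFF;                            /* no key pressed */
}
```

That buffered byte is what the micro hands over after raising DATA READY; the PC never sees the 101 raw lines.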
It is absolutely necessary if you want to get any efficiency
out of the CPU. Why would you want to bother a 2 GHz
32-bit processor with MMX with such a task? Most wouldn't.
The CPU is a processor of data. Not a counter of encoder
inputs. Not a buffer of keyboard or comms data. It can be
used for this. But it is not efficient, and in my experience
with coding hardware drivers and applications, is an
absolute NO if you want to get things done.
Look at the PC joystick (analog). Here the PC is counting
cycles: charge capacitor. start timer. is cap discharged?
no --> keep counting. yes --> we know where the joystick is.
Do you want to do this? Or how about this:
CPU: request data from micro
    <continue executing multitasking OS>
Micro: charge capacitor. start timer. is cap discharged?
    no --> keep counting. yes --> store count in RAM.
Micro: signal CPU, "I have the requested information"
CPU: read information from micro
The whole time the joystick was being read, the
CPU was free to operate the OS, which includes
accessing every other hardware device THROUGH
the micros that control them.
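A toy model of what that busy-wait costs, in C. `discharge_at` is a stand-in for the moment the hardware comparator trips; the names are invented for illustration:

```c
#include <stdint.h>

/* Toy model of the analog joystick read: charge the cap, then count
 * loop iterations until it "discharges".  The count IS the position
 * reading, and the CPU can do nothing else while it accumulates. */
uint32_t joystick_busywait(uint32_t discharge_at)
{
    uint32_t count = 0;
    while (count < discharge_at)   /* "is cap discharged? no --> keep counting" */
        count++;
    return count;                  /* proportional to stick position */
}
```

Hand that loop to a micro and the PC's cost drops to a single read once the count is ready.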
"parallel" hardware operation can only be done
with a master and a few slaves.
The fields didn't get plowed with the master
pulling the plow. The master dictated to the slaves.
With a robot, you can get a few tasks
done with only the master. He is fast.
He is strong. He can only do what his
engineered circuits are capable of. As
soon as you tax this built in limitation,
it is time to get some slaves.
We shape our society in the way that
our minds work. It is the only way
we know. Hierarchies of power and
control, throughout government, business,
and social interactions. It works. It is the
way our minds work. Seems like a
good idea to model your experiments the same way.
But that is not the situation. The 2 GHz 32-bit CPU is doing
the job of *many* 20 MHz microcontrollers. If there is a cheap
way to hand the job over to hardware you do it. Once the cpu
did read the keyboard and do all the graphics processing.
Now thanks to mass production and cheaper IC manufacturing
it is *cost effective* to do it another way. You are always
trading the cost of hardware with the cost of cpu cycles
for any given task.
As for your example of reading the joystick, that is not how
it has to be done. You can save the start time, start the process,
and just poll it x number of times a second while you are
doing other things. A 2 GHz machine has plenty of cycles
to spare compared with a 20 MHz uC. When it gets an interrupt
or reads the input (data ready), it notes the current time,
subtracts the starting time, done. You can make it
easier by using an ADC chip. You just trigger the convert
pin and poll (or use an interrupt) until the data-ready pin
indicates the result is ready to read.
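The subtract-the-timestamps bookkeeping is trivial. A sketch, assuming some monotonic tick counter supplied by the OS (all names here are illustrative):

```c
#include <stdint.h>

typedef struct { uint64_t started_at; } rc_read;

/* Record the tick count when the capacitor starts charging. */
void rc_start(rc_read *r, uint64_t now) { r->started_at = now; }

/* Called from the polling loop (or interrupt) once data-ready is seen;
 * the elapsed ticks are the position reading. */
uint64_t rc_elapsed(const rc_read *r, uint64_t now)
{
    return now - r->started_at;
}
```

Between `rc_start` and `rc_elapsed` the CPU is free to run everything else; there is no busy loop anywhere.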
The number of CPU cycles required to control my toy OWI
arm was trivial. To use a PIC would have been just plain
silly. It has 5 motors and 5 encoders and video feedback
on an 800 MHz machine using interpreted QBasic!
And things like the pc keyboard, graphics card, sound card,
hard drive, have their hardware support and are therefore
out of the equation when it comes to "do we need a uC for
this pc interface application".
Now it may be true that a uC is the *best* solution for a
PID controller and very cost effective. And it may be true
that a stand alone robot base with simple behaviors only
requires a uC. But if your application *needs* the computing
power of a modern pc and has power to spare why not use it?
Like the old computers. A thin hardware manual
and you could understand it all in an afternoon.
Getting things done at a price you can afford
is what it is all about. There are jobs best
done with a shovel and other jobs best done
with a backhoe.
Understanding how a piece of software works is
no different than understanding how a piece of
hardware works. Software is flexible but slow.
If the software is really useful we embody it
in hardware. Thus graphics algorithms are
implemented in graphics chips. Same with sound,
video, controllers of various types...
I don't have the means to embody some of the
lower level visual processing algorithms in
silicon, although those that do have the means
are doing just that, so I need a fast cpu with
lots of memory to do the job instead.
And *enjoy* is what it is all about.
MLW enjoys using software to solve a problem,
even if others see a hardware solution as a better fit.
You're being revisionist. What the "microcontroller guys" said was they
prefer to use them as subcontrollers for PC-based robots (the use of
microcontrollers in smaller robots has not been raised). That is *their*
equally inarguable design preference, which they've backed up with
their own experiences. That experience may be different from yours.
If you're confounded by anything, might I suggest it is your habit of
challenging someone else's preferences, when you feel your own
preferences are not subject to discussion.
An interesting question came to my mind: how would someone design a robot
that must interface to at least a dozen sensors and actuators?
Imagine this list of sensors:
- 2 GPS chips speaking CMOS serial
- 3 Gyros (X-Y-Z) speaking I2C
- 4 servos (steering, velocity, camera pan, camera tilt) speaking PWM
- Wheel encoder speaking whatever
- Sonar array speaking I2C
- Temperature sensors speaking 1-wire
- LADAR speaking whatever
In my opinion, it is much easier and conceptually cleaner to have the PC (or
any other computing unit with enough power to do the processing... could be a
mini-itx or a rack of dual Xeons communicating through ATM) act as the
master of a bus (I2C, RS485) to the "sensor system", with each
sensor node having a microcontroller that guarantees the language spoken
between the master and the slaves is the same.
In this case, the mcus are here to simplify, right?
At least, that's what I think is the right solution for this problem, but
I'm always open to new ideas.
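One way to make every node on that bus speak the same language is a tiny fixed frame. This sketch builds [addr][cmd][len][payload...][checksum]; the layout and the XOR checksum are my own assumptions, not any particular standard:

```c
#include <stdint.h>
#include <stddef.h>

/* Build one request/response frame for the shared sensor bus.
 * out must have room for len + 4 bytes; returns the frame length. */
size_t frame_build(uint8_t *out, uint8_t addr, uint8_t cmd,
                   const uint8_t *payload, uint8_t len)
{
    out[0] = addr;
    out[1] = cmd;
    out[2] = len;
    uint8_t chk = addr ^ cmd ^ len;          /* running XOR checksum */
    for (uint8_t i = 0; i < len; i++) {
        out[3 + i] = payload[i];
        chk ^= payload[i];
    }
    out[3 + len] = chk;
    return (size_t)len + 4;
}
```

Each slave's micro only has to parse this one format, whatever its sensor actually speaks on the far side (I2C, 1-wire, PWM, CMOS serial).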
"Eu prefiro ser uma metamorfose ambulante, do que ter aquela velha opiniao
formada sobre tudo"
Raul Seixas, great brazilian singer and prophet
I agree that a microcontroller can often simplify a task, but to me, the
larger issues are:
1. isolating the specifics of hardware
2. offering future flexibility for hardware you might not even think of yet
Neither can be dismissed, but as has been discussed here, there are also
personal preferences involved.
For isolating hardware, you could use software drivers alone, but these
can become progressively complex as interface choices increase. With a
consistent hardware interface, no matter what information the underlying
machine needs or returns, it's handled by a common communications
protocol, and software drivers can be kept lean and simple. This is why
R/C servos are so popular for smaller robots.
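The "consistent hardware interface" idea can be sketched as an ordinary table of function pointers in C: upper layers call the same operations no matter which controller sits underneath. Every name here is invented for illustration, not taken from any real driver API:

```c
#include <stdint.h>

/* A lean driver interface: upper layers see only these operations,
 * never the wire protocol of the actual hardware. */
typedef struct {
    int (*init)(void);
    int (*set_position)(uint8_t channel, uint16_t microseconds);
    int (*shutdown)(void);
} actuator_driver;

/* A stand-in implementation, e.g. for an R/C servo controller board. */
static int stub_init(void) { return 0; }
static int stub_set(uint8_t ch, uint16_t us) { (void)ch; (void)us; return 0; }
static int stub_shutdown(void) { return 0; }

const actuator_driver servo_driver = { stub_init, stub_set, stub_shutdown };
```

Swapping hardware then means supplying a different `actuator_driver` table; the planning code never changes.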
The ability to plan for the future, without knowing what the future is,
is a basic tenet of good engineering design. That's why *some* of us
prefer using a microcontroller for handling subsystems. It's our
inarguable preference to do so.
I have to agree with Gordon here.
After building a primarily PC/Linux controlled robot, my next robot
will attempt to use a more hierarchical approach.
Many of the ideas for this have come out of these discussions.
I want to take a stock PC interface (USB, hopefully) and then connect
that to the high-level controllers. The PC will do the planning
and high-level control, but it will not be very concerned with
the actual control of the sensors/actuators. For example, the
PC might say "turn right" rather than give specific motor commands.
There would be several reflex layers that might affect the outcome
(such as an emergency motor stop) under certain conditions; the
higher level processors would be notified and could override
these reflexes if necessary.
In other words, something like a biological nervous system.
However, that's a discussion (and hopefully a book) for another
time and place.
D. Jay Newman
Rod Brooks started one, but without many hardware specifics. Of course
many of his company's products use this approach.
Though his book (a collection of his papers) lacks a step-by-step
project, there's enough there for anyone interested in subsumption to
give it a whack. It's an area that, as demonstrated by about two
decades of MIT research, is ably handled -- among other ways -- by using
simple individual microcontrollers.
The fact that Brooks designed Genghis, and even COG, out of a
heterogeneous collection of microcontrollers reflects his approach to
robotics. With COG, far more complex than Genghis, he supplemented the
microcontrollers with QNX for real-time vision and sound analysis, and
DSPs as pre-processors. I have yet to read anywhere that he insists it's the
only way it could be done.
I've read a book by one of his students who now works for iRobot and I
believe is the creator of the Roomba. He uses an architecture inspired by
subsumption but more grounded in practical problems such as sensors that
give you false information, wheels that slip, and so on. There are a couple
of practical examples throughout the book, and some very good practical advice.
Info on the book:
Robot Programming : A Practical Guide to Behavior-Based Robotics
by Joe Jones, Daniel Roth
They also have a website with a robot simulator
Yes, this is correct. Joe Jones is one of the lead developers of Roomba.
I left out Roomba because its single microcontroller is not slaved to a
PC, which is the subject of the discussion these many weeks. It's more
of an elaborate LEGO RCX. (In fact, quite a number of folks have
replaced the non-reprogrammable controller in the Roomba with another
MCU, like Javelin or even a Basic Stamp.)
That said, it is certainly a testament of the power of a little 8-bit
MCU with 256 bytes of memory. It's arguably the most successful
utilitarian robot design yet. I am not aware of any other robot that
wasn't basically a child's toy that has sold 1+ million units.
Jones' earlier book, Mobile Robots, also includes some practical
behavior-based projects and code.
You know, I thought you had made some good arguments in this thread, even
if I didn't always agree with them, but now I think you're just being
contrary.
First, you completely ignore the fact that some people ENJOY working on
those "hopped up japanise cars", just like some people like working with
microcontrollers, some like climbing the mountain instead of driving to the
top, and some prefer to make their bread instead of buying it in a store. In
each case, maybe there was a faster, easier way to do the same thing, but
it's the act of doing something a different way that's sometimes the goal,
not the end result.
Second, how will anyone ever know if there is a better way to do something
if they don't experiment? You argue that people should consider your
methods, yet you completely dismiss the other side of the argument as
unreasonable and without merit. Don't be so quick to dismiss other people's
choices. The most powerful computers started off as just a bunch of
switches... you have to start somewhere. And sometimes the simplest answers
ARE the best; not always, but sometimes.
Third, and lastly, you should always pick the right tool for the job. I
think it would have been really silly if they had used a $100 computer board
in a Furby (or robotic toy of your choice) when a $0.50 microcontroller
worked just as well. Then again, I'd be screaming for someone's head on a
platter if I found out that NASA spent millions of tax dollars to build the
Mars rovers entirely out of microcontrollers. Sometimes the job needs big
tools, sometimes small... and sometimes a mix. Use what's best, and what's
available to you.
For myself, I've worked with computers my entire adult life. It would be very
simple for me to build a robot controlled by my laptop, but for my current
project I've chosen to use a microcontroller because I've never used one
before. (Not to mention that I couldn't figure out how to build a tabletop
sumo with a laptop strapped to the top that could move itself... although I
bet no one would be able to push it out of the ring.)
Not "just as well" but far better. How long would furby's batteries last
if it was running a P3 at 500 mhz, plus memory controller, video
controller, IO controller, etc.. and all those other things it would
MLW, when you say "microcontroller guys" you make it sound like a cult.
The only microcontroller guys are the ones who design them. Just because
some guy used one once in some design where it was prudent doesn't mean
he's obsessive and irrational about them.
On the other hand, to people who like to use microcontrollers, using a PC
to drive a small robot is like using a 10 ton dump truck to go grocery
shopping. Sure, you're not limited in how many groceries you can take
home, but you don't really need that much capacity.
People that build robots from scratch either have an interest in
different disciplines or have a budget to stick to. We all have
different reasons/goals/interests but robotics is the common theme.
Why try to stereotype at all?
Would you build a 1-inch-cubed sumo bot from a PC?
Would you build a voice-controlled, voice-recognising, image-recognising,
drink-serving, all-singing, all-dancing robot from an 8-bit micro?
I'm glad you started the discussion about using PC's as robot
controllers but when someone talks too much about the same subject
then others become bored and tend to ignore them.
I think you have quite a bit to add to this group but the odd change
in subject would be nice.
What's the line between someone talking too much and the USA problem of
limited attention span? I'm all about free speech and really believe that
if you don't like something, don't read it. Criticizing someone for speaking
is never helpful.
The subject is a rich one. There are tons of interesting design issues. The
choice between using a microcontroller and a PC with an OS, and which OS,
is something that could be the subject of many many papers. There is no
lack of analysis that can be done.