How about we aim for one world with different teams?
Look at the earth from space; do you see any borders?
The one that gets beer from the fridge?
Unfortunately that's socialism and doesn't really take
human nature into account. No one is going to share a
good idea without reward. Life is like a football game.
Most people like to compete and have an audience see them play.
Now I am all fired up to go. What's the game plan?
Ah, so true; there was a time when I wanted to write everything in Perl
but found that it wasn't quite up to embedded applications...
C is probably the best for embedded programming - it can make for smaller
code, which is important if you've only got a few K of memory to play
with. When we're talking PCs, however, I don't know if the difference
would be noticeable. If you're used to OO programming languages, C++
would probably be more familiar; apparently it's easier to build larger
applications in it than in C, but I'm just taking that on faith.
If you have an MCU looking after each motor, you could do that bit in C
(or assembly if you're that way inclined) and then do the PC stuff in
C++ if you're happier with that than C.
Don't forget that any speed advantage means that you get your beer that
much quicker ;-)
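To make the split concrete, here is a minimal sketch (in C) of the kind of command parsing the per-motor MCU side might do. The one-line "M&lt;index&gt; &lt;speed&gt;" protocol is invented for illustration, not any standard:

```c
#include <stdio.h>

/* Hypothetical one-line command format: "M<index> <speed>\n",
 * e.g. "M1 +120" asks motor 1 for speed +120 (units are up to you). */
typedef struct {
    int motor;   /* which motor the command addresses */
    int speed;   /* signed speed request              */
} motor_cmd;

/* Parse one command line; returns 1 on success, 0 on a malformed line. */
int parse_motor_cmd(const char *line, motor_cmd *out)
{
    int motor, speed;
    if (sscanf(line, "M%d %d", &motor, &speed) != 2)
        return 0;
    if (motor < 0 || motor > 3)   /* assume at most 4 motors */
        return 0;
    out->motor = motor;
    out->speed = speed;
    return 1;
}
```

The PC side in C++ would then only have to print lines like "M1 +120" down the serial port, and all the timing-critical work stays on the MCU.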
This is a common misconception. C++ can produce code just as compact as C,
and in fact, depending on your design methodology (i.e. object oriented),
it can make code more compact, since you don't have to write code by hand
to simulate what the compiler can do for you.
C++ is mostly a superset of C. Anything you can do in C, you can do in C++.
You should read a little about the "free" software community. ("Free" means
freedom, not always absence of cost.) Many people mistake it for socialism
because it isn't as restricted as they are used to. It is actually very
much the opposite.
The system works like this:
You want to accomplish a task. Rather than roll your own or buy everything
you need to do it, you pull from the huge library of GPL applications and
source. If money was involved for doing the task, you still get paid and
you don't have to pay for the tools you used. All the community asks is
that if you alter software that you used, which really doesn't belong to
you in the first place, you make your changes available for other people to
use as well. If you write your own software from scratch, it is yours to do
whatever you want with.
I have a project out there in the "free" software community and I contribute
to a few others as well. Because of my contributions, I can get software
that does what I need "out of the box" and on my "free" projects I have
received a number of bug reports that have let me fix problems I didn't
even know about.
Conceptually, it is no different than a bunch of farmers working together to
build an aqueduct. They still compete with product, but cooperate with
infrastructure. The wealth of GPL software is astounding. You should look
at what's available, absolutely for free!
I'm hoping to have a working prototype of my super-cheap PS/2 mouse based
motor encoder system with source code and pictures by the weekend,
assuming I don't have to spend too much time ferrying children to
instrument lessons and sports, and the wife doesn't have friends come over.
Perhaps, however, much of it is restricted to the advanced
programmer? For example, I used the DJGPP C/C++ compiler
extensively but was unable to use the Allegro graphics
library with it because I couldn't get it to compile.
Free software is not much use if you don't have the time
to learn how to use it.
You gave your pet peeve; now I will give mine:
Imagine how many novels and entertaining stories
we would have if only professors of English (or
French, etc.) had the know-how to write them?
Those who built the electronics of the TV industry
don't automatically have the ability, as a result,
to make good TV programs that use that technology.
Once programming was the province of white coated
mathematicians in the big Universities. Although
the computers were simple they could maintain their
mystique as only they had access to them. And then
along came the microcomputer....
Experts do not have a monopoly on ideas:
An example is when I owned a C64 and bought a touch
tablet for graphics. It was flawed in that a glitch
would put bumps in any hand drawn lines. Because
they gave the hardware details I was able to fix
it with a better algorithm.
Of course I had then to write my own graphics routines.
It was fun working out algorithms to make use of the
multicolor mode but I thought one day we would have
enough memory and CPU power to give our pixels an
RGB value and display them. The day came, but only via a
slow graphics interface or by a complex DirectX that
takes an expert to decipher and implement with loads
of OOP paper work required by a bureaucratic OS.
Now we are reverting to the bad old days. Power
to the people I say. Why can't a language be easy
to use, easy to learn, and able to compile fast code
that can access web cams, printers, graphic/sound
cards, etc., at a price most people can afford?
At the moment I am looking at Power Basic which seems
to be offering all the above. Working as a team means
allotting the things to be learnt according to interest.
Really I don't mind paying experts to write functions
to do things like grab an image from a web cam into
an array or display it in 24bit color. I just don't
have the time to learn to use a complex OS or language
in order to use that function.
or even worse! Imagine how many robots and
entertaining mechanical slaves we would have
if the people who built them typed too long in
a meaningless debate over X vs Y, and neglected
updating the software to control those robots!
There are thousands of packages; most are pretty good, some are less so. It
isn't fair to paint the whole environment with one or two experiences. I
can remember quite a few Windows programs which were utter and total crap.
Imagine if we were forced (in school) to read books published by people with
no more than a 6th grade education! Dick and Jane can only run so much.
I'm not sure how that is important.
This was *never* true, by the way.
It was very much harder to program the old computers. There was no mystique;
it was like expressing complex ideas with a very limited vocabulary.
Yes, a very neat invention.
Experts typically don't have many original ideas, but their expertise is
generally useful in bringing ideas to fruition.
Yup, Windows sucks. Been there, done that. I've written a couple video
drivers for Windows 9x and NT.
That's where "free" (as in freedom) software and open source come into play.
Linux has a great price, $0.00, and can use web cams, printers, graphics
cards, etc.
As for "easy," that is a complex topic. What is "easy?" Seriously, think
about this for a minute. DOS and old CP/M were "easy" because they were
simple. If all you want to do is something simple, then they are enough. If
you wanted to do something more complex, they were very much more
difficult.
Think about a robot or other complex system. "Easy" takes on a new aspect.
Easy to me has come to mean good multitasking, memory management, process
management, file systems, networking, etc. etc. The basic facilities that
are, in their own right, very complex projects. If I want to create a
vision system, I don't want to write a whole video management system. I
want to access a video stream and process it.
Basically, there is an amount of complexity that comes with more advanced
capability.
This is fairly short-sighted thinking. Knowledge is almost always helpful. A
complex OS and language can make your task much easier to accomplish. If it
takes you 2 months to come up to speed on an OS and language, and then
takes you an additional two months to do a project instead of doing a
project in 6 months, the complex OS and language saved you 2 months.
You are thinking that the OS gets in your way, when you should be thinking,
wow!! These guys have done all this work! All I have to do is build on it.
This is so easy!
I was talking computer *language* not OS ...
Yes I think Linux sounds great. I will give it a go.
But ultimately the outcome depends on how easy the
language is and how complete it is with regards to
enabling me to use the OS.
It takes more than 2 months to come up to speed with
a complex OS or language. Do you think I haven't tried?
What you say makes perfect sense to someone just starting
out but my situation is different to yours. It is your job
and you have put a lot more than 2 months into learning
and practicing your skills. It has been a part time, on
and off, hobby for me. I enjoy the subject but I am an
*I rely on experts like you to make it more enjoyable.*
First I set my goal at a height I think I can achieve.
Then I look for the means to achieve that goal. In my
case I have a hammer and in your case you have a nail
gun. So I have to limit my goals to a playhouse for the
kids and you can go out there and build a house. Or you
might loan me your nail gun? Or I might pay you to
do some of the expert bits, just as those people do who
decide to build their own home. They do what they can and hire
experts to pour the cement or wire the house etc.
They might limit their input to laying the carpets and
tiles, painting the house, designing the house etc.
I don't know what your home building skills are like.
If you have none and decided to build yourself a house
on the weekends would you do a 5 year builders course
first and learn all the skills required?
We all have limited time. The knowledge and skills we
need for our job may not be useful in our hobby. You are
fortunate in that your hobby (build a robot) happens
to require the skills you use in your job. I am just
the little guy that likes to look at the stars through
his little telescope and read the latest works of those
lucky enough to do it for real. I might not find a
planetary system but with some luck maybe a new asteroid?
So to sum it up, I cannot build an all-singing, all-dancing
robot and utilize the latest and greatest software systems
to implement the robot. My goals are humble. I have been
able to write simple programs and put together simple
hardware that can control motors, read sensors and do simple
visual processing that I hope will enable my robot to stagger
around the house and maybe get that beer or better still
sweep the carpet/tiles/lino so my wife doesn't think my
hobby is completely useless.
May I point out that top sportsmen are appreciated and
rewarded by the amateurs that play golf or tennis on
the weekend.
I understand it if you want to limit yourself to interacting
with those with an equal footing as regards electronics,
programming, mechanics, AI, etc. and will await your book
on the subject :)
Well, a little effort on your part to learn something that hasn't been
dumbed down will pay dividends in what you are able to accomplish.
What did you try?
Everyone's situation is different.
Nothing worth doing comes easily. The accomplishment is the reward. I know
that sounds like a fortune cookie, but it is true.
We can't make you enjoy anything. You have to enjoy it.
It's like mountain climbing. If you ride the ski-lift to the top, you see
the view. If you climb to the top, you understand the view, and are
improved by it.
I know that isn't what you want to hear, but it is true. "Everything should
be as simple as possible, but no simpler."
If you want to build a house, you learn how to build a house. If you want to
build a playhouse, you learn how to build a playhouse. You don't set your
sights by your abilities; you learn what you need to learn based on where
you've set your sights. It doesn't matter what I know or have.
5 years? Hardly. I have a couple of lesbian friends who are in the process of
building their house. Learn by doing.
Not really. Ripping apart a mouse and making a motor encoder isn't in my job
description.
And amateur astronomers find stars and asteroids too.
"Pro" athletes are useless, college football is better, local baseball is a
blast, and a pickup football game rocks.
Experts in the field did not learn what they know in school. Most of the
greatest contributions to science and technology have not come from the
classroom.
Your problem is that you limit yourself.
No, actually I've tended to choose Windows or Dos.
Linux is nice, but its documentation isn't anything special.
Just saying that you can read the source code doesn't
mean that documentation is good.
I probably will choose Linux for future projects but it
hasn't been able to meet all of the requirements.
Until the 2.6 kernel, Linux could not have been said
to be a real-time OS. Neither could Windows.
Really? Where's the documentation describing how to write drivers in
the 2.6 kernel? "Read the source code" isn't documentation. Info files
are o.k. but they don't tell everything. Good examples and thoughtful
presentation save lots of time.
Cheap video input devices, for example.
The more I do robots, the more I believe that real-time is
significant, especially if the robot is going to be working around
people. If a robot moves, it needs to respond to its sensors in a
real-time fashion which depends on its mass and speed in order to
lower damage to people and property.
Again, the Linux 2.6 kernel is becoming more like a real-time system
and there are definitely more peripherals available for Linux. Since
we've agreed that there probably needs to be a microcontroller
anyways, Linux or Windows only need to communicate over USB, serial
ports or ethernet.
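If the PC and the microcontroller only exchange bytes over USB or serial, a little framing keeps the link honest. A minimal sketch in C, using an invented frame layout (start byte, length, payload, XOR checksum) rather than any established protocol:

```c
#include <stddef.h>
#include <stdint.h>

/* Invented frame layout: 0x7E | len | payload bytes | XOR checksum.
 * Returns bytes written to out, or 0 if the payload won't fit. */
size_t frame_packet(const uint8_t *payload, uint8_t len,
                    uint8_t *out, size_t outcap)
{
    if ((size_t)len + 3 > outcap)
        return 0;
    uint8_t sum = 0;
    out[0] = 0x7E;           /* start-of-frame marker */
    out[1] = len;            /* payload length        */
    for (uint8_t i = 0; i < len; i++) {
        out[2 + i] = payload[i];
        sum ^= payload[i];   /* running XOR checksum  */
    }
    out[2 + len] = sum;
    return (size_t)len + 3;
}
```

The MCU end can then discard anything that fails the checksum instead of acting on a corrupted motor command.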
Google: "HOWTO write linux kernel drivers"
743,000 hits, including:
Porting device drivers to the 2.6 kernel
Write a Linux Hardware Device Driver
And my favorite:
Hop on over to: www.tldp.org (The Linux Documentation Project). Lots and lots
of documentation there.
What? What's cheap? There is lots of support for most USB web cams, video
capture boards, etc. The V4L (Video for Linux) project is amazing.
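Once a capture layer such as V4L hands you a frame buffer, the processing step is just plain C over a pixel array. A tiny sketch, assuming an 8-bit grayscale frame (the function is mine for illustration, not part of the V4L API):

```c
#include <stddef.h>
#include <stdint.h>

/* Mean brightness of an 8-bit grayscale frame -- the kind of
 * first-pass statistic a vision loop might compute per buffer. */
uint8_t mean_brightness(const uint8_t *pix, size_t n)
{
    uint64_t total = 0;
    for (size_t i = 0; i < n; i++)
        total += pix[i];
    return n ? (uint8_t)(total / n) : 0;
}
```

A robot could use something this simple as a cheap "lights on/off" or exposure sanity check before running heavier processing.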
"real-time" is an interesting word. I used to work at Keithley-Metrabyte,
and "real-time" is a very subjective term with almost no meaning outside a
specific discussion. It is largely based on your requirement of maximum
time to response. In some systems this is measured in microseconds, in
others milliseconds, or even seconds.
I wouldn't use Windows for anything that *needs* a computer :)
There may be now, but there hasn't been in the (recent) past. Of
course, Video 4 Windows precedes V4L by many years.
Yes, realtime. Like, able to initiate termination of motion within one
or two milliseconds. Able to come to a complete halt within X
milliseconds. Able to minimize force on an effector within X
milliseconds. I think if a robot is going to be around humans, it needs
to be safe.
Additionally, if the robot is vision guided, then it needs to be able
to perform a certain kind of analysis in fairly "real time."
Then again, I don't think a robot does need an operating system. Do
humans have an operating system? Does a dog have an operating system?
Perhaps the developer needs an operating system to simplify the
development task, but a robot doesn't need one. That is not to say
that since we haven't perfected robot software that we don't need good
development tools. We do. On that basis, operating systems make sense.
Additionally, as multicore and parallel systems become more available,
we'll probably need operating systems to simplify the distribution of
tasks across the CPUs.
That's your preference. I'm willing to consider it or other operating
systems.
This is an excellent point, and it illustrates, all by itself, a couple of
things about how we think. Computer people are so often bound by how
computers are defined that they apply a blanket solution (such as the
operating system) to a problem that a computer must solve.
Organisms have operating systems of a sort but they are unlike computer
operating systems in the following ways. First, they are more or less hard
wired into the organism. Second, they are distributed in a manner that
might best be described as granular. The closest approximation we might
reach is to assign a small, simple processor to each task and link them all
together.
Imagine a hand created in this manner. Each digit of each finger might
have a processor that reads its sensors for touch, position, and damage and
integrates the resulting data into an image of its state. You would end up
with an "overlay" of many maps, each reflecting specific data about that
digit of the finger. One would correspond to the temperature sensed. One
would be the pain data, one would reflect flexion or extension data.
As far as the dedicated processor was concerned, it would simply be
returning a small set of arrays of number values, which when taken together
would tell us all we need to know at a raw level. And, there would very
likely be a round-robin task scheduler in that processor that scanned
rapidly through the raw sensory data to develop the arrays of information.
This is entirely unlike the OS of a real finger digit, but it would in
effect return the same results. In our case, it would not matter because
the outcome of that operation is all important- the pattern matters, not how
it was arrived at.
And we can eliminate some of the inherent differences in the microchip
OS method versus the biological method by having all the processors run
asynchronously- all we care about is the data, and that it is produced
rapidly enough so that the time instants overlap or blend together.
Now we have a bus that takes all that information to a master processor
in the hand that integrates all the finger digit information into a whole
net of information about the hand- but take note (and this is most crucial)
that never does any single "bit" of information get sensed at any higher
level! Only the derived, integrated picture of what the whole hand is doing
and feeling is relayed to the next higher level.
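The "only the integrated picture crosses the bus" idea can be sketched in C. Everything here (the digit count, the map fields, the summary chosen) is invented for illustration; the point is that no single raw bit survives past the integration step:

```c
#include <stdint.h>

#define DIGITS 14   /* assumed: 14 finger segments on one hand */

/* Per-digit raw maps, filled in by each digit's dedicated processor. */
typedef struct {
    uint8_t touch;    /* contact pressure        */
    uint8_t temp;     /* temperature reading     */
    uint8_t pain;     /* damage signal           */
    int8_t  flexion;  /* -128..127 flex/extend   */
} digit_state;

/* The derived picture relayed to the next higher level. */
typedef struct {
    uint8_t max_pain;     /* worst damage anywhere on the hand */
    uint8_t touching;     /* how many digits report contact    */
    int16_t net_flexion;  /* overall hand posture              */
} hand_state;

hand_state integrate_hand(const digit_state d[DIGITS])
{
    hand_state h = {0, 0, 0};
    for (int i = 0; i < DIGITS; i++) {
        if (d[i].pain > h.max_pain)
            h.max_pain = d[i].pain;
        if (d[i].touch > 0)
            h.touching++;
        h.net_flexion += d[i].flexion;
    }
    return h;
}
```

Because only the summary moves up the bus, the per-digit processors can run asynchronously at whatever rate their sensors allow, exactly as described above.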
On this basis, the OS of each chip is immaterial. We subsume all the
data into a whole, smooth image that is not time dependent (except that
there is a fastest possible sensing time) and that is not processor
dependent. The content and the overall image are all that make sense to the
higher levels.
Now, this might sound like a terrible load of overhead, but truly it is
not. It is through small, granular processes that we are most likely to
succeed in duplicating the extremely fine scale of organic sensing and
processing to reveal the entire state image of the organism's limbs and
systems properly. After all, how much processing does a neuron do,
especially in the context of a net of neurons or sensors?
Now, when using a traditional processor, you always have some sort of OS
involved, but the trick is to write code that has the absolute minimum of
system-related overhead. You can completely simulate everything in code,
but you have to make certain that the system does not interrupt what your
robot is doing in any significant manner.
So the contribution that the OS makes is simplicity of using your
particular machine, but the catch is that it takes major chunks of time that
might be critical - imagine if you were to black out for a fraction of a
second every few seconds... that is what Windows would do to your machine.
Linux can do away with a lot of that, but a real basic system like DOS
is always a far better choice. If only it would handle large memory spaces
and large disc spaces...
Sir Charles W. Shults III, K. B. B.
In fact I have been using DOS. You can access as
much memory as you like (with a DPMI host) and
use 32 bit code. That includes some visual
processing using a monochrome CCD-based camera.
I have been forced to use VC++ for some experiments
with a color web cam as the protocols are not made
public (or private).
There was a project on OZ Overclockers using a webcam with an AVR.
You need to be a member (free) to see the guy's working project.
It looks like you need to have Java installed for your browser.
Recent past? Linux has been able to interface with video hardware for many
years.
Is that a "real" requirement? How fast can you really stop? Even the best
controlled servo or stepper can't respond that quickly. Acceleration and
velocity must be carefully planned.
What hardware can come to a complete halt in milliseconds? Certainly not
anything with significant mass.
Again, what is the "real" response time? If you are assembling microchips or
PC boards, you might want to have a very tight loop and very high precision
mechanics, but in general purpose robotics, roaming around imperfect floors
on imperfect wheels, with a non-zero mass, single digit milliseconds is not
necessary.
Yup, and depending on the speed of the beast, you have to carefully plan the
deceleration or it will skid or topple over. How fast can you bring a robot
with greater than zero mass moving at a greater than zero velocity to a
complete stop? Will a single digit millisecond response time *REALLY* be
noticeable over a double digit millisecond response time? No.
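The arithmetic behind that point is simple: the extra distance covered during a control-loop reaction delay is v * dt, and the braking distance from speed v at deceleration a is v^2 / (2a). A quick sketch:

```c
/* Extra distance (metres) a robot covers during a reaction delay,
 * before braking even begins: d = v * dt. */
double reaction_distance_m(double v_mps, double delay_s)
{
    return v_mps * delay_s;
}

/* Distance (metres) needed to brake from v to rest at constant
 * deceleration a: d = v*v / (2*a). */
double braking_distance_m(double v_mps, double a_mps2)
{
    return (v_mps * v_mps) / (2.0 * a_mps2);
}
```

At 0.5 m/s, a 20 ms delay adds 10 mm of travel and a 1 ms delay adds 0.5 mm, while braking at 1 m/s^2 takes 125 mm either way; the response-time difference is lost in the noise.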
Yes, which a micro-controller is certainly not capable of doing.
The more and more you look at genetics, the more and more it looks like
software. Gene sequences behave very much like algorithms. Maybe we do have
an operating system we just don't understand.
However, genetics aside, an operating system is a form of glue that holds
together many different processes, arbitrates conflicting requirements, and
schedules routine operations. So, yeah, in a way perhaps we do.
I think this is short sighted.
I don't understand the resistance to operating systems; it must be a Windows
thing. A UNIX variant makes development, implementation, and operation
easier and more reliable. A good OS implements many useful and necessary
tools on which you can build.
Just because the time a task needs to be done is different between
systems or applications doesn't make the concept of real-time
subjective. If the computer is controlling a two-legged walking robot
it has to keep those legs moving in a precise way within a certain
amount of time, or it falls over.
It could be.
You're exactly right, and that's precisely why it's a bad idea to
control a stepper motor with a general-purpose OS. Each coil must be
switched on and off with precision down to microseconds to keep it
running at a steady speed, or accelerating or decelerating smoothly
without losing steps. A dedicated microcontroller with no OS, only
application code written specifically to ramp and otherwise control
the stepper motor, is ideal for this, and is almost the ONLY thing for
this. A larger processor running custom-written code (no OS) would of
course work, or even running an RTOS if the interrupt latency is kept
low enough. You can be sure interrupt latency is too high with
'desktop' OS's such as Linux/Unix, any version of Windows, or MS-DOS.
You're right again, the speed must be slewed down at a controlled
rate, something better handled by a dedicated microcontroller
executing higher-level commands sent to it from another processor.
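One widely used way for that dedicated microcontroller to ramp a stepper is the constant-acceleration step-delay recurrence from Atmel's AVR446 application note: c(i) = c(i-1) - 2*c(i-1)/(4*i + 1). A sketch (integer-only, so it suits a small MCU; how c0 maps to timer ticks depends on your clock and the acceleration you want):

```c
#include <stdint.h>

/* Fill delays[] with n successive step delays for a constant-
 * acceleration ramp, using the AVR446 approximation
 * c(i) = c(i-1) - 2*c(i-1)/(4*i + 1). c0 is the first delay. */
void ramp_delays(uint32_t c0, uint32_t *delays, int n)
{
    uint32_t c = c0;
    for (int i = 0; i < n; i++) {
        delays[i] = c;
        /* next delay: i+1 is the index of the step being computed */
        c = c - (2 * c) / (4 * (uint32_t)(i + 1) + 1);
    }
}
```

In a real driver each delay would be loaded into a hardware timer whose interrupt toggles the step pin, which is exactly the microsecond-level precision a desktop OS can't guarantee.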
Is the computer running-some-OS giving high-level commands to move
the arm, or is it switching stepper-motor coils (or alternatively,
sending a PWM signal to a DC motor and reading an encoder)? I wouldn't
trust a general-purpose OS to do the latter.
What micro-controller are you thinking of? 8-bit processors from
Moto, er, Freescale, Atmel (AVR) and Microchip (PIC) are certainly not
capable of controlling complex systems with hundreds or even dozens of
high-speed I/O's, and the large bandwidth that a high-resolution
vision system uses. But they are totally appropriate, and even the
best choice, for converting commands such as 'move arm up four inches
at two inches per second' into signals that control motor speeds.
But there are other, much 'larger' (as far as word width,
instruction set and memory address size) and faster microcontrollers,
many 32-bit, and at least one that's been around for a long time, the
MC683xx series. There's also the 32-bit ARM, an architecture used by
many manufacturers. There are DSP's (many of which are technically
microcontrollers due to onchip RAM, ROM and I/O) that can do a large
amount of vision or other signal processing, at a lot lower price and
with a lot less power consumption than a desktop-computer's CPU doing
the same task.
On real-time systems conflicting requirements don't need to be
arbitrated, they just need to be done on time (okay, arbitration is
deciding which one goes next so that each task is done within its
timeframe - if that happens, then nothing is conflicting). This argues
for a RTOS, and furthermore (and implied but needs to be stated), a
design which uses it effectively and guarantees that critical things
get done within the appropriate amount of time.
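The "deciding which one goes next so that each task is done within its timeframe" part is essentially earliest-deadline-first scheduling. A toy sketch (the task set and field names are invented for illustration):

```c
#include <stddef.h>

/* Hypothetical task descriptor: time remaining until its deadline. */
typedef struct {
    const char *name;
    long deadline_us;
} rt_task;

/* Earliest-deadline-first: run the task whose deadline is soonest.
 * Returns the chosen index, or -1 for an empty task list. */
int edf_pick(const rt_task *tasks, size_t n)
{
    if (n == 0)
        return -1;
    size_t best = 0;
    for (size_t i = 1; i < n; i++)
        if (tasks[i].deadline_us < tasks[best].deadline_us)
            best = i;
    return (int)best;
}
```

EDF is known to meet all deadlines on a single processor whenever total utilization allows it, which is why the design, not just the picker, has to guarantee the task set is schedulable in the first place.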