OS-level programming vs. microcontrollers

I was corresponding with D Jay, and some things occurred to me about the microcontroller vs. PC debate.

You guys say that doing things on microcontrollers is easier than on bigger computers. That may be true for you, but it is not true for everybody. I, for one, no longer enjoy that aspect of programming. It has its fun, for sure -- like camping, there is a simplicity and a thrill in doing it all yourself -- but it seems, to me at least, needlessly pedantic.

Yes, if you are making a "mouse" robot, by all means a micro-controller is the only thing that fits.

I have also said that micro-controllers do have a place, as long as they have a well defined function (and I'm not the one developing it). This is my preference, perhaps not yours, but it is the perspective from which I write.

Lastly, in our discussions I have also noted a trepidation about doing things on a computer running an OS like Linux or Windows. I can fully understand this reaction with Windows. I have written more than a few device drivers and data acquisition applications on NT, and found that while the device driver model is pretty damn good, the application layer is very unpredictable (pretty damn bad!). Systems like Linux and the BSDs have a (IMHO) less well designed driver model, but a MUCH better overall system design.

In the end, yes, Linux and BSD are not real-time operating systems. They do have a level of unpredictability, but they are not "unreliable." This is important. I guess this echoes back to the post about not needing an RTOS.

In the realm of motion control, it is impossible to make a blanket statement that an RTOS is not needed, because I can think of quite a few applications where it would be. In the general case, however, a home-made mobile robot does not typically need one.

I had a discussion with my son about a month ago. He said something was 1.536 miles away. Sure, he thought he was being funny, but it opened up a conversation about precision. (1) There is no way he could have known or measured that something was 1.536 miles away. (2) Even if he was 100.0000% accurate in the measurement, it is meaningless in the context of a rolling vehicle that must move on an imperfect surface with imperfect wheels and imperfect alignment.

In most cases, a "reliable" OS can do the same amount (or more) of aggregate processing as an RTOS. The issue is dealing with the variability in response times. While this can complicate the algorithms slightly, it is not unmanageable.
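
To make that concrete, here is a minimal sketch (my illustration only, plain C on a POSIX box) of the usual trick: don't assume the loop runs at a fixed rate; measure how long each pass actually took, and scale the math by the measured interval.

#include <stdio.h>
#include <time.h>
#include <unistd.h>

/* Seconds elapsed, from a monotonic clock that scheduling can't skew. */
static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    double position = 0.0;         /* integrated odometry estimate       */
    double velocity = 0.25;        /* m/s; from an encoder in real life  */
    double last = now_seconds();

    for (;;) {
        usleep(20000);             /* aim for ~50 Hz; the OS may be late */
        double t = now_seconds();
        double dt = t - last;      /* ACTUAL elapsed time, jitter included */
        last = t;

        position += velocity * dt; /* correct no matter how late we woke */
        printf("dt=%.4f s  position=%.3f m\n", dt, position);
    }
}

The loop tolerates being delayed; what it cannot tolerate is not knowing by how much. That is exactly why measurement matters more than predictability.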

Think about Linux or FreeBSD for a minute. One can download a web page, play MP3s or watch a video, burn a CD-ROM, perhaps even chat in an AIM window, all at the same time. The network, video, audio, and CD-writing tasks are all time-critical applications.

A higher-level OS can do most of the things you are worried it can't.

Reply to
mlw

[snip]

I guess I take what could be called a "middle of the road" approach. My robot runs Linux on a Micro-ATX board, and the drive section is controlled by an 8051. The boards talk to each other using a home-brew, message-based serial protocol.

I run Linux for most of the reasons stated by mlw. However, it never even occurred to me to try to control the drive motors in Linux. The division of labor made sense to me, leaving the tasks that required microsecond resolution to the 8051.

I do not run an RTOS on the 8051, though several functions (serial data TX/RX, PWM generation, etc.) are interrupt driven, and essentially run in the background of the main (PID) loop.
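
For anyone unfamiliar with that pattern, a rough skeleton in SDCC-style 8051 C looks something like this. The register names are the standard 8051 SFRs; the message parsing and PID math are elided, and PWM generation would hang off a timer interrupt in the same way:

#include <8051.h>   /* SDCC's 8051 special-function-register definitions */

volatile unsigned char rx_byte;
volatile __bit rx_ready = 0;

/* Serial ISR: runs "in the background" of the main loop. */
void serial_isr(void) __interrupt (4)
{
    if (RI) {             /* byte received from the Linux board */
        rx_byte = SBUF;
        rx_ready = 1;
        RI = 0;
    }
    if (TI)               /* transmit complete */
        TI = 0;
}

void main(void)
{
    ES = 1;               /* enable the serial interrupt */
    EA = 1;               /* global interrupt enable     */

    while (1) {
        if (rx_ready) {   /* a command byte arrived via the ISR */
            rx_ready = 0;
            /* parse the message, update PID setpoints, ... */
        }
        /* ... one iteration of the PID loop goes here ... */
    }
}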

I think all this philosophizing boils down to just comfort zone and personal preference. I like writing software for Linux, but I also like writing embedded code for microcontrollers. If my feelings were different, my implementation might have been different as well.

Jeff.

Reply to
Jeff Shirley

A real-time OS (or at least some kind of real-time extension, be it part of the OS or simply some kind of timer interrupt that you can reliably catch and take advantage of in a device driver) is the only way to go if you need predictable timings. Period.

If you just experiment or "toy around," yes, you can even use Windows. You just have to keep in mind not only that the timings in it are inherently unpredictable, but also the latency involved in most kinds of regular communications (serial, USB...). And without communication with the outside world, no robotics. The latency when using serial communication (or USB) can be on the order of tens of milliseconds. What kind of control can you reliably achieve with that? As I said, there may be some possibilities involving writing specific device drivers, but believe me: if you already know how to deal with microcontrollers, it's just not worth trying the device driver path. Writing a decent device driver on Windows (or MacOS, or Linux), especially one that deals with interrupts and such, is no picnic. Point is, you would lose huge amounts of time.
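
That latency is easy to measure for yourself, by the way. A minimal sketch, assuming a Linux box and a device on /dev/ttyS0 that echoes each byte back -- the port and baud rate are just examples, and error checking is omitted:

#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <time.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);

    struct termios tio;
    tcgetattr(fd, &tio);
    tio.c_lflag &= ~(ICANON | ECHO);   /* raw bytes, no line buffering */
    tio.c_cc[VMIN]  = 1;               /* block until one byte arrives */
    tio.c_cc[VTIME] = 0;
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);

    unsigned char out = 0x55, in;
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    write(fd, &out, 1);
    read(fd, &in, 1);                  /* wait for the echoed byte */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ms = (t1.tv_sec - t0.tv_sec) * 1e3 +
                (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("round trip: %.3f ms\n", ms);

    close(fd);
    return 0;
}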

Now if you're talking about OSs like DOS, yes, that can work. DOS, albeit pretty crude, can be considered a real-time OS, as long as you don't use resident crap that can interfere with timings. And even so, you can always disable interrupts when you really need tight timings -- something absolutely impossible to do in a modern, non-real-time, multithreading OS. So yeah, if it's OK for you to use DOS, there are many things you can do with it. As for me, I don't call that an OS. But if you find it more convenient than microcontrollers, go for it.

As long as you know how to use microcontrollers, and you have whatever is needed to program the ones you intend to use in a practical way, I don't see the point of not using them. From experience, a fully-fledged computer running a "regular" OS might seem enticing at first, but you'll usually lose a heck of a lot of time trying to achieve the same thing with it.

One fair compromise is to at least use a microcontroller for anything that needs tight timings in your project, then control the whole thing from a computer that can send it commands that are not time-critical.
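
In that kind of split, the PC-to-micro link only needs a simple framed command format, nothing time-critical. As a sketch -- the start byte, opcode, and layout here are entirely invented for illustration:

#include <stdio.h>
#include <string.h>

/* Build one frame of a hypothetical command protocol:
 * [0xA5] [opcode] [len] [payload...] [8-bit checksum]            */
static int build_frame(unsigned char *buf, unsigned char opcode,
                       const unsigned char *payload, unsigned char len)
{
    unsigned char sum = 0;
    int n = 0;
    buf[n++] = 0xA5;
    buf[n++] = opcode;
    buf[n++] = len;
    memcpy(buf + n, payload, len);
    n += len;
    for (int i = 0; i < n; i++)
        sum += buf[i];
    buf[n++] = (unsigned char)(~sum + 1);  /* whole frame sums to zero */
    return n;
}

int main(void)
{
    /* "Set left/right wheel speed" -- opcode and payload are made up. */
    unsigned char speeds[2] = { 120, 120 };
    unsigned char frame[16];
    int n = build_frame(frame, 0x01, speeds, 2);

    for (int i = 0; i < n; i++)
        printf("%02X ", frame[i]);
    printf("\n");   /* on the robot this would be written to the port */
    return 0;
}

A start byte plus a checksum is about the minimum the micro needs to resynchronize if a byte gets lost on the wire.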

Reply to
Guillaume

This is getting tiresome.

In your other messages you say you don't want to keep harping on this debate, but you keep bringing it up. By now I'm sure you realize much of this is preference. I think both sides have said all they really can say. Other folks have working robots that use the "other" approach, and you're working on a robot that uses your preferred approach. You can write about the advantages when you're all done.

Do remember that even your PC uses a sub-system architecture, so the rationale is hard to dismiss. Example: The original serial chips used in PCs were, in fact, the first generation PICs -- the name "PIC" coming from peripheral interface controller (time has somewhat muddied the acronym, but no matter what the variation, the idea is the same -- PICs were designed to serve as an interface between a motherboard and an outside device).

While modern PC boards now use different chip sets, it's still basically the same idea. The board designers want to offload certain functions in order to *optimize performance*. The designers of DMA and bus mastering had the same idea. It's not like it HAS to be this way, but it is the *preference* of these particular designers.

The fact that you prefer to use only a single main board is a preference. Other people have their preference. We've all explained our reasons for preferring one over the other. Time to move on.

-- Gordon

Reply to
Gordon McComb

And I wrote exactly that: "predictable" is not the issue; measurable and reliable are.

I wouldn't, but OK.

OK, but is this latency measurable if not predictable?

That is true.

OK.

Really? 10 ms? That's 100 times a second. 20 ms? 50 times a second. In the real, physical world, that is quite a long period of time.

That is where we disagree. I know how to deal with microcontrollers, and with device drivers, and I'd choose a device driver.

Again, I disagree. I think writing an ISR in Linux or Windows NT is largely trivial.
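
To show what I mean, here is just about the entire skeleton of a Linux interrupt handler, packaged as a kernel module. I am using the current request_irq() signature (the exact signature has changed between kernel versions), and the IRQ number and device name are placeholders:

#include <linux/module.h>
#include <linux/init.h>
#include <linux/interrupt.h>

#define MY_IRQ 7  /* hypothetical IRQ line, for illustration only */

static irqreturn_t my_isr(int irq, void *dev_id)
{
    /* do the minimal time-critical work here, defer the rest */
    return IRQ_HANDLED;
}

static int __init my_init(void)
{
    /* claim the interrupt line and attach our handler */
    return request_irq(MY_IRQ, my_isr, 0, "my_device", NULL);
}

static void __exit my_exit(void)
{
    free_irq(MY_IRQ, NULL);
}

module_init(my_init);
module_exit(my_exit);
MODULE_LICENSE("GPL");

Hooking the handler is the trivial part; the real work, as always, is in what the handler must do for your particular hardware.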

DOS is little more than a BIOS extension. It offers no usable process or memory management. It barely does file I/O.

It is not "impossible" to disable interrupts on a "modern" OS; it is just, for the most part, not required.

Like I said, an over-glorified BIOS extension.

I do.

Except for the expense of buying more computing power when you already have a computer.

In your opinion.

If you design the system to require tight timings, then you need to use a microcontroller. If you design the system to run on a PC, then you design around the need for them.

Lots of people use PCs running common operating systems, like Linux and Windows NT (XP), to do high-speed data acquisition. I have written applications and drivers for this sort of environment. The trick is not relying on "tight timings" but on accurate measurement and reliable response. If you have accurate measurement (and Pentiums do), in most cases you can code around response times.
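
The "Pentiums do" part refers to the on-chip time-stamp counter, which counts CPU clock cycles. A minimal sketch of reading it with GCC inline assembly -- x86 only, the cycle rate is machine-specific, and on current systems clock_gettime() is the portable route to the same information:

#include <stdio.h>

/* Read the Pentium time-stamp counter (RDTSC). */
static unsigned long long read_tsc(void)
{
    unsigned int lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((unsigned long long)hi << 32) | lo;
}

int main(void)
{
    unsigned long long t0 = read_tsc();
    /* ... take a sample, or get scheduled away for a while ... */
    unsigned long long t1 = read_tsc();

    /* Even if the OS delayed us, we know EXACTLY how long the gap
     * was (in cycles), so the sample can be timestamped correctly. */
    printf("elapsed: %llu cycles\n", t1 - t0);
    return 0;
}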

Again, I have *never* said that a microcontroller is never needed; however, most of the time it is not.

Reply to
mlw

"mlw" wrote:

I agree! I have spent the last few years building PC-based robots. I have heard a lot of doubt from fellow hobbyists over the years, but I don't care, because I've been getting great results in a fraction of the time. Text-to-speech, speech recognition, vision, high-speed wireless communications, LCD display, audio effects, etc., all without having to write any low-level code. So far I've been using Windows XP with VB and C++.

One big reason I like Windows is the choice of low-cost available hardware, and the built-in Remote Desktop, so that I can simply use my desktop PC to "remote" into him (wirelessly, via 802.11g) as if I had my keyboard/mouse and monitor connected to him. It's nice to be able to get access to the OS at any time from the comfort of my desktop machine in my house, no matter where he is on my property. Been many a time he was outside somewhere on the lawn and had a problem, and I just remoted in, changed some code, and restarted his "brain" app, all from my computer inside my house. If you are unfamiliar with Remote Desktop, take a look: [link]

I'm more interested in programming the robot to do something than in making complex hardware work. When I built my last robot and got him into an operable state, about two days later he could walk around and avoid things, accept voice commands, tell me jokes/news/sports/weather/traffic/etc. (that he downloaded off web pages), wirelessly stream live video/audio to my desktop (and even to any PC over the Internet), play all sorts of sound effects/music, be remote-controlled from another PC via a small, simple client app I wrote in VB, send me emails with a couple of photos when he detects movement via his sensors or video camera, and display a ton of data about what's going on on his LCD (I now use the LCD to draw a mouth), etc. It only took me about four days to build the entire frame, install all the electronics, wire everything up, and install the OS. I've been able to focus my time on programming cool things like AI, image recognition, features, etc. Here is a web site about this robot (although it's very out of date): [link]

Turns out he is too big, so I'm gonna make a smaller frame.

So far I have not had any issues with the OS not being real-time. Processing sensor data, deciding what to do, and then acting on it is extremely easy for today's computers. Even with speech recognition and vision processing running, I've never had an issue with a delay in monitoring sensors or sending motor commands.

-Hoss

Reply to
Hoss

In some ways, perhaps.

In the messages that do not directly involve this debate, absolutely.

I never asserted it was not.

Maybe.

Planning too.

Actually, the original serial port on the PC was a UART (the 8250).

True, but their design goal was a generic desktop computer motherboard. The design goal of robot building is to build a robot. The goals are different.

You know, you snipped the whole post without acknowledging or refuting any specific point, so it is hard to respond precisely.

The point I was trying to make is that the major reasons people state for using microcontrollers -- or for getting pretty emotional that I am not (I'll never understand this) -- are precise timing and real-time response, or a view that an OS is inscrutable or too difficult.

Preference is fine; I have said many times over that I understand that. It is the "trepidation" about using a PC with an OS that I was trying to address. There is real utility in using the PC to do lots of the things little microcontrollers do, but many builders may be afraid or unsure that it can be done.

So, you may disagree with the thrust of the message, but it is certainly a valuable debate for those trying to make up their minds.

Reply to
mlw

Exactly.

I'm so done with Windows.

There are a lot of remote desktop applications for Windows; take a look at VNC. It is pretty slick. There is even VNC for Mac and UNIX.

I know you are familiar with Windows, but the UNIX model is a bit more network centric than Windows. If you are curious about Linux, I can give you some pointers, otherwise, if Windows works for you, great!

That is exactly one of the things I've been saying. There is so much already there, already done. Getting the "non-robotic" stuff (cameras, audio, networking, communication, process management, etc.) working, which would otherwise be the hard part, is a breeze.

And you probably won't if you code the algorithms based on precision instead of predictability.

Reply to
mlw

I snipped it because I'm saying the debate itself has run its course. I don't care to refute anything, because that's already been said to death. Every debate ends...or so the audience hopes.

-- Gordon

Reply to
Gordon McComb

That's cool, sort of like a robotic kiosk. :-)

Here's mine: http://64.46.156.80/robot/

Reply to
mlw

I disagree, obviously.

Your opinion notwithstanding, there are a few reasons to post on Usenet, not the least of which is to share information. There are at least a couple of regulars who agree with me, and maybe one or two who are trying to decide. For them, the debate may not have run its course, and perhaps more information could be helpful.

Secondly, the post was more an observation of people's perspectives, and an attempt to address them.

Reply to
mlw

Boy -- this is really a non-debate. Nobody has said that "doing things in microcontrollers" is easier than with large machines. What we HAVE said is that *certain* tasks are easier, can be implemented more reliably, more cheaply and are generally simpler to accomplish using microcontrollers or DSPs. Other tasks are entirely UN-suited to microcontrollers, and are better left to bigger hardware. Nobody disagrees with this.

What many of us have taken issue with are assertions you've made about microcontrollers that are dead wrong, and clearly evidence complete ignorance about modern hardware and development tools. You've also managed to cheese a couple of us off by making assertions that are simply factually incorrect (as opposed to a matter of opinion) and doggedly defending those assertions to the death (i.e., "all computers have an operating system or close equivalent, even the computer that runs my toaster", "Real time is a relative term"). Stuff like this is simply misinformed, and many of us will feel obliged to correct the public record.

Again, this is a straw man argument. Clearly, the type of computing hardware required by a robot depends on the application.

Really, I have no problem with a "computer on wheels" approach -- a number of people around here have done it, and it has met their needs perfectly. I know of at least one person who simply bolted a PC (case and all) to a frame and supplied the power with an inverter and a large SLA battery. This perfectly met their needs (vision research, I think).

But as a general-purpose platform, I think this is a poor approach for reasons that have already been stated. Moreover, your original post here ("How Much would you pay...") strongly implied that you were considering designing a commercial product -- your goals appear to have changed somewhat since then.

My original response was intended to point out that such a design not only had flaws, but added little value for me personally, as putting a PC on a wheeled platform is a trivial undertaking.

No disagreement here -- although I will reiterate that modern development tools make programming most micros significantly easier than device driver development. I still fail to see what you find so daunting.

As far as simple motion control goes, personally, I think this assertion is probably correct enough of the time that at a hobbyist level you can ignore the response-time issues -- provided that certain critical actions (primarily killing the motors) can happen at the driver level. Of course, writing at the driver level, you lose some of the protective benefits of user-mode programming.

A crashed micro is simply restarted by the watchdog timer and resumes its job without interfering with any other subsystem. In fact, unless you make provisions, you may never know that it went away.

Of course, we're talking motor control (and motor control not requiring a great deal of precision). When it comes to timing pulses (say, for sonar), you WILL need some kind of peripheral to do this reliably. May I recommend a $7.00 micro? You may also need to generate pulses with a fixed width, say for servo control. Again, you're going to want some kind of peripheral -- I dunno -- perhaps a $7.00 micro.
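
For illustration, here is roughly what that looks like on a cheap micro. This sketch happens to use an AVR (ATmega328-class registers, 16 MHz clock assumed) rather than a PIC, but the idea is identical: the timer hardware produces the 1-2 ms pulse every 20 ms, and software never touches the timing.

#include <avr/io.h>

/* Hobby-servo pulse generation using Timer1 hardware PWM:
 * a fixed-width pulse every 20 ms, with zero software timing. */
int main(void)
{
    DDRB |= (1 << PB1);                  /* OC1A pin as output      */

    /* Fast PWM, TOP = ICR1, prescaler 8 -> one tick = 0.5 us      */
    TCCR1A = (1 << COM1A1) | (1 << WGM11);
    TCCR1B = (1 << WGM13) | (1 << WGM12) | (1 << CS11);

    ICR1  = 39999;                       /* 20 ms frame (50 Hz)     */
    OCR1A = 3000;                        /* 1.5 ms pulse = centered */

    for (;;) {
        /* main loop is free; writing OCR1A moves the servo */
    }
}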

[snip]

Most of which are assisted significantly by the peripheral hardware itself.

Reply to
the Artist Formerly Known as K

"the Artist Formerly Known as Kap'n Salty"

In my mind, it is not only trivial, but a good design should isolate this debate entirely. Mechanical and electrical considerations apart (shock, power supply, EMF, etc.), your CPU design should be able to perform the same whether it is physically located in the wheeled platform or wirelessly connected.

The mechanical and electrical considerations will vary greatly according to your application, and they should drive what you install as your robot's "brain," and where. If you are designing a rover that will navigate autonomously for 10 miles, wireless gets complicated. If you are designing a small robot to fit in small places, wireless may be a good idea. And if you are designing robots to explore caves in Afghanistan, maybe neither wireless nor local computing is adequate for you.

In summary, a good designer must always have a "deck" of appropriate methods and technologies for solving a given problem efficiently.

Reply to
Padu

The issue, I think, is the gray area in between.

I don't like statements like this, which call into question things I may or may not have said, without any supporting arguments. You've said I was "dead wrong" about something, and I don't believe I have been; without any reference to what or why, it is merely an ad hominem attack used to enhance your argument. I refuse to accept it.

The "all computers have and operating system or close equivalent" was a conceptual argument that I don't agree was wrong, it may have been too generalized for pedantic level of this group, but conceptually it was spot on. In deference to you guys, "The vast over whelming majority of computers and microprocessors use operating systems or operating system-like code." For the sake of this discussion, "operating system-like code, refers to code which is used to service the computing infrastructure and not specifically your application." If your micro communicates with another, that communications code is "operating system-like."

As for the "real-time" is relative debate, real-time is relative. "real-time" stock quotes are not the same as "real-time" data acquisition of a bullet impact data.

Within a very large range of variability.

OK.

My goals have not changed so much; I never was sure what I wanted to do with the finished product. Selling it or the components was, and still is, a possibility.

Trivial in what way? I think I see now -- I get it!! You guys are hardware geeks! You don't see the art, challenge, or science in software. Of course! These are the same design arguments people have all the time in companies that produce hardware/software products.

It is more complex than:

vi test.cpp
[make changes]
make
gdb ./test

I'm not sure I believe that. You *can* design controller code that is stateless, maybe, but it is more work.

A $1.99 CTC chip, perhaps?

Sound cards are not (unless you count DMA or PCI bursting). Unless you have an MPEG decoder built into your video card, plus a driver and application that use it, video is raw blit speed. Most people have cheap-ass network cards which are really dumb. The CD may be a stand-alone peripheral, but the [E]IDE/ATA interface is pretty dumb as well.

Reply to
mlw

There's a difference between sharing information, and beating a dead horse.

-- Gordon

Reply to
Gordon McComb

If you check out what is available, you'd know that the above statement is incorrect.

dbl click programming environment
[make changes]
FileMenu / "compile and upload to device"

It's very easy to do. Many of the compilers for micros have extensive help files.

Rich

Reply to
aiiadict

Didn't your mother ever teach you "If you can't say anything nice...?"

Reply to
mlw

Gack, I HATE GUI development environments. Too cumbersome and bloated, not to mention slow and inflexible.

Reply to
mlw

I won't rehash the rest of this, but as for the ignorance bit:

For PICs:

test.c or test.bas
ccsc test.c
picdownload test.c

Downloading to the PIC in my case is done through the RS-232 port on the PC, directly to the PIC -- no programmer required.

These tools work either under Linux (with Wine) or Windows. If you use AVRs, everything is native Linux, and there are likely native Linux tools for PICs that I don't know about.

Debugging is marginally more of a pain -- it's generally easiest (without ICD hardware) to just use the serial port or other methods to diagnose problems -- but in practice you're usually not writing massively complex programs, so debugging is pretty trivial. No worse, certainly, than debugging a device driver with the occasional printk(), and with ICD hardware probably easier.

The charge of ignorance is *not* an ad hominem attack -- you really DO clearly suffer from a dearth of knowledge in this area. Sorry -- don't kill the messenger.

Reply to
the Artist Formerly Known as K

Then you have to run it. What about the debugger?

Yup, I've used them.

YATC "Yes Another Tool Chain"

Yes, but one typically designs a device driver merely for access, not functionality. Yes, that is a *very* general statement, and not always true, but if you can, that's how you do it.

It is when you don't substantiate it, and you haven't.

The messenger is making accusations without supporting argument. The fact that we disagree is by no means an indication of ignorance -- or is your opinion gospel? I didn't think so.

Reply to
mlw
