hardware geeks and software geeks?

Jay is in education. Or was, before he took his sabbatical. In any case, I think you must admit that by phrasing it as "your country" it is immediately divisive, regardless of your nationality.

I've taken the time to read your more political threads in some other groups, so I think I understand your comments a little more (and maybe agree with some of them -- horrors! ). But if you don't want to be so misunderstood, maybe try to use less inflammatory terms. Just a thought.

I think there's a difference between criticism of ideas and criticism of approach. I was offering my constructive criticism that I felt we had covered this ground before. What's worse, you couched the new thread in ways implicitly negative to those who like the hardware side -- they are somehow deficient elsewhere.

I don't think you were interested in the debate of the ideas, but in the debate of the debate. I felt this lent itself to criticism, which I tried to frame with some humor (being "bored"). Obviously you took it the wrong way, for which I apologize.

*Constructive* criticism only intended here:

More like this: MLW's posts like his PID code or path planning or suggestions for avoiding brownout resets in PICs

Fewer like this: MLW's posts where he reopens the same subject just so people can argue about it all over again

-- Gordon

Reply to
Gordon McComb

How is it divisive if we are from the same country? Perhaps the wrong pronoun, "our" instead of "your," but, come on now, we're engineers.

In the USA, most people actually agree on the substantive issues. You should find no shame in agreeing with any of my political views. I'm sure my rabid libertarian views are somewhat disagreeable to many.

It's a characteristic that engineers often have. It isn't intentional. You say not to do something, but more often than not, we (people like myself) don't even know we've said something offensive or inflammatory until someone says something. It's sort of a Dilbert thing.

For instance, "That idea sucks" or "That's the stupidest idea ever" are things often said during design sessions. If we worried about offending people, Rt. 128 in Massachusetts would be a single-lane dirt road.

You weren't criticizing an approach. You said: "My opinion is that you've over-posted on this already"

What? Next time I should ask your permission? Get real. Not only that, you did not even address one of the issues. Not that you had to, but it is indicative of criticizing that I wrote something, not what I wrote. It's subtle, but I'm sure you get the difference.

There is a *lot* of ground to cover; it isn't a simple topic at all. If you didn't think it was valuable, you could have moved on...

"No one can make you feel inferior without your concent." Eleanor Roosevelt.

You may not like my choice of words, but what I wrote is a viable perspective. If you have issues with it, address the issues. If you don't want to read it, ignore the thread. If you criticize someone for the act of posting a sincere writing, then you're the person who is wrong.

From an engineering perspective, there is a *lot* of depth to the PC vs. microcontroller debate. It is all trade-offs, but what is VERY debatable is what the exact trade-offs are. (1) "Real-time" is a subjective property and not always needed, even though you may think you need it. (2) The amount of real CPU processing dedicated to control is usually minimal.

It is a conceptual debate and IMHO, an interesting one. It is also a valuable knowledge set in the PC industry. A better understanding of the ways around latency and accomplishing "real-time" applications on the popular non-realtime operating systems is always marketable.
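To make that concrete, here is a minimal sketch of the usual trick on a non-realtime OS, assuming a POSIX system: run the control loop against absolute deadlines so scheduling jitter in one cycle doesn't pile up into the next. The 20 ms period and the empty control_step() are just placeholders, not anything from a real robot, and bumping the thread to a SCHED_FIFO priority is another common refinement.

    #include <time.h>

    #define PERIOD_NS 20000000L   /* hypothetical 20 ms control period */

    /* placeholder for whatever the robot's actual control step is */
    static void control_step(void) { }

    int main(void)
    {
        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (;;) {
            control_step();

            /* schedule the next wakeup at an absolute time so jitter
               in one cycle does not accumulate into the next */
            next.tv_nsec += PERIOD_NS;
            while (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec  += 1;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
        return 0;
    }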

Who knows, it may even spill over into micro-controllers and make them even more useful.

Accepted.

No promises.

Reply to
mlw

This isn't at all what you stated at the beginning of the thread. You wanted opinions to this statement:

This is hardly worth a response. It's divisive from the outset, presumptuous, and historically inaccurate. Small example: since you're from Massachusetts, no doubt you're aware of the work at MIT, where they commonly use "many small PICs" (or other microcontrollers), and they approach their research nearly 100% from software, relying heavily on algorithms and behavioral models. The hardware is close to being irrelevant.

If you really wanted to debate using a PC with a non-RT OS for a robot and no subcontrollers, you'd complete yours, and then compare it with other examples available on the Web that use other methods. In the true spirit of scientific exchange, you could demonstrate precisely what you did to overcome the limitations. If others choose, they can challenge your findings, using their own specifics.

*THAT* is debate, and it exchanges ideas. What you engage in is arguing, and all it exchanges is animosity.

I have taken the time to look up other groups you're in, and there's a common thread among them: you spend a lot of your time re-explaining yourself. I don't think you come across the way you think you do. Perhaps you could take a moment for inner reflection, and make a fresh start here?

-- Gordon

Reply to
Gordon McComb

A lot of non Americans would agree with that :)

However I think that it is true for all nations.

There is also the question of people's freedom to say things that incite racial hatred or antisocial behavior... but that is for another newsgroup.

Most people are not interested in the freedom of speech when they perceive it as a personal attack.

Although most people like to think they are "logical," in reality we are entirely motivated by emotions. That is something robot builders interested in higher AI for their machines need to consider.

John

Reply to
JGCASEY

JGCASEY wrote: [snip]

I really *must* disagree.

I myself am motivated primarily by beer, pie and promises of sex -- in that order.

Reply to
the Artist Formerly Known as K

*This* thread was started because of a similarity I noticed between 'EE' and 'CS' engineers, especially at Metrabyte.

That is a trend I think I recognized, and I was asking if it made sense or anyone else noticed it.

Really, I thought it was an interesting observation.

It was an observation, and I was asking if anyone noticed it.

Their design goals are interesting. One could look at the work in other places. I remember Hans Moravec driving a van around with a generator and a VAX, saying funny things like "To a vision system, a road and a tree look very much alike." (The unstated punch line was the series of events that led up to that observation.)

I notice you are very big on trying to tell people what to do. *That* is rather annoying.

One does not need to build a specific implementation of "X" to debate the doability of "X," especially if it has been done before.

If you don't want to debate, then by all means don't debate. If you don't like the debate, by all means, don't participate. Whether or not I and others wish to debate it, really isn't your call.

I did and I posted code with explanations. I even posted pictures of parts of it.

And I hope some do.

It *only* exchanges animosity when people take a subject personally. Someone is *always* annoyed at some level when a person disagrees with them, that's human nature. A thinking human being *should* be able to see beyond this initial effect, and decide whether or not the points have merit.

If they do not have merit, then you either ignore the debate or counter with your valid points.

Perhaps you should take very good care of your own business, Gordon.

Reply to
mlw

Robots have a long way to go before they can meet those worthy goals :) Most of them get off with a zap of electricity when their batteries get a bit low. Although some have a religious zeal and see the light.

John

Reply to
JGCASEY

On the other hand, to people who like to use microcontrollers, using a PC to drive a small robot is like using a 10 ton dump truck to go grocery shopping. Sure, you're not limited in how many groceries you can take home, but you don't really need that much capacity.

Reply to
Matthew E Cross


Keyboard: a matrix of switches. PC keyboard: a matrix of switches connected to a small micro. The micro encodes the input of the switches into a binary number. This number is buffered. The micro signals the PC, "DATA READY".

Without a micro: with a 101-key keyboard, you have 101 bits of information to read. Solder these to CPU data lines. OK, now you have to scan to see which key is pressed:

    for i = 1 to 101
        read DATA line(i)
        if DATA line(i) = 1 then (you have a keypress)
    next i

You just used 101 pins of your CPU. Your CPU has to scan 101 bits of data (it probably has to load from a port, then AND with the proper mask to find the status of a bit). This has to happen while "multitasking" is in operation: the IDE drive is reading, the Ethernet card is communicating, the graphics card is updating.
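In rough C, that load-then-mask scan looks something like the sketch below; read_port() is just a made-up stand-in (it returns 0 here so the example stands alone) for whatever inb()-style access the hardware really provides.

    #include <stdio.h>

    #define NUM_LINES 101

    /* Hypothetical stand-in for a real port read (e.g. inb() on x86).
       It just returns 0 so the example is self-contained. */
    static unsigned char read_port(int port)
    {
        (void)port;
        return 0;
    }

    int main(void)
    {
        /* 101 lines packed 8 per port: load each port once,
           then AND with a mask to test the individual bit */
        for (int i = 0; i < NUM_LINES; i++) {
            unsigned char byte = read_port(i / 8);
            if (byte & (1u << (i % 8)))
                printf("key %d is pressed\n", i);
        }
        return 0;
    }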

To get input/output to a CPU you need sub-CPUs: slaves to the CENTRAL processor.

"as long as functionaly well defined and constrained" doesn't mean much to me. How can a function NOT be well defined? How can a micro NOT be constrained? It only does what you program it to do.

The keyboard:

1) Encode 101 bits into 8.
2) Buffer.
3) Signal data ready.
4) Communications protocol:
   4a) reply back to PC with requested data
   4b) act on PC command (turn on Num Lock light, flush buffer, etc.)
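In pseudo-C, the controller's main loop amounts to something like the sketch below; every name in it (scan_matrix, enqueue, and so on) is a hypothetical stand-in, stubbed out only so the sketch compiles, not real keyboard firmware.

    #include <stdbool.h>
    #include <stdio.h>

    /* All of these are made-up stand-ins for the real controller routines. */
    static int  scan_matrix(void)      { return -1; }    /* -1 = no key down   */
    static void enqueue(int code)      { printf("buffered %d\n", code); }
    static void raise_data_ready(void) { }               /* assert DATA READY  */
    static bool command_pending(void)  { return false; } /* PC sent a command? */
    static void handle_command(void)   { }               /* e.g. Num Lock LED  */

    int main(void)
    {
        for (;;) {
            int code = scan_matrix();     /* encode 101 switches into one code */
            if (code >= 0) {
                enqueue(code);            /* buffer it                         */
                raise_data_ready();       /* tell the PC something is waiting  */
            }
            if (command_pending())
                handle_command();
        }
    }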

It is absolutely necessary if you want to get any efficiency out of the CPU. Why would you want to bother a 2 GHz 32-bit processor with MMX with such a task? Most wouldn't.

The CPU is a processor of data. Not a counter of encoder inputs. Not a buffer of keyboard or comms data. It can be used for this, but it is not efficient, and in my experience with coding hardware drivers and applications, it is an absolute NO if you want to get things done.

Look to the PC joystick (analog). Here the PC is counting cycles:

    charge capacitor
    start timer
    is cap discharged?
        no  --> keep counting
        yes --> we know where the joystick is

Do you want to do this? Or how about this:

CPU: request data from Micro

    Micro:
        charge cap
        start count
        cap discharged?
            no  ---> keep counting
            yes ---> store count in RAM
                     signal CPU: "I have the requested information"

CPU: read information from Micro

The whole time the joystick was being read, the CPU was free to operate the OS, which includes accessing every other hardware device THROUGH the micros that control them.
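On the PC side, handing the job to the micro can look roughly like the sketch below, assuming the micro hangs off a serial port. The /dev/ttyUSB0 path and the one-byte 'J' request are invented for the example, and real code would also configure the port with termios.

    #include <fcntl.h>
    #include <poll.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Hypothetical protocol: send 'J', the micro replies with one count byte. */
    int main(void)
    {
        int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
        if (fd < 0) { perror("open"); return 1; }

        unsigned char req = 'J';
        write(fd, &req, 1);                 /* CPU: request data from micro    */

        struct pollfd pfd = { .fd = fd, .events = POLLIN };
        poll(&pfd, 1, -1);                  /* block until the micro signals
                                               data ready; the OS runs
                                               everything else in the meantime */

        unsigned char count;
        if (read(fd, &count, 1) == 1)       /* CPU: read information from micro */
            printf("joystick count = %u\n", count);

        close(fd);
        return 0;
    }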

"parallel" hardware operation can only be done with a master and a few slaves.

The fields didn't get plowed with the master pulling the plow. The master dictated to the slaves.

With a robot, you can get a few tasks done with only the master. He is fast. He is strong. He can only do what his engineered circuits are capable of. As soon as you tax this built in limitation, it is time to get some slaves.

We shape our society in the way that our minds work. It is the only way we know. Hierarchies of power and control, throughout government, business, and social interactions. It works. It is the way our minds work. Seems like a good idea to model your experiments accordingly.

Rich

Reply to
aiiadict


But that is not the situation. The 2 GHz 32-bit CPU is doing the job of *many* 20 MHz microcontrollers. If there is a cheap way to hand the job over to hardware, you do it. Once, the CPU did read the keyboard and do all the graphics processing. Now, thanks to mass production and cheaper IC manufacturing, it is *cost effective* to do it another way. You are always trading the cost of hardware against the cost of CPU cycles for any given task.

As for your example of reading the joystick, that is not how it has to be done. You can save the time, start the process, and just poll it x number of times a second while you are doing other things. A 2 GHz machine has plenty of cycles to spare compared with a 20 MHz uC. When it gets an interrupt or reads the input, data ready, it notes the current time, subtracts the starting time, done. You can make it easier by using an ADC chip: you just trigger the convert pin and poll (or use an interrupt) until the data-ready pin indicates the result is ready to read.
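A bare-bones version of that timing approach might look like this; cap_still_charged() is a made-up stand-in for reading the discharge line, and here it returns false so the sketch runs as-is.

    #include <stdbool.h>
    #include <stdio.h>
    #include <time.h>

    /* Hypothetical stand-in for reading the joystick's RC discharge line. */
    static bool cap_still_charged(void) { return false; }

    static double elapsed_ms(struct timespec a, struct timespec b)
    {
        return (b.tv_sec - a.tv_sec) * 1e3 + (b.tv_nsec - a.tv_nsec) / 1e6;
    }

    int main(void)
    {
        struct timespec start, now;
        clock_gettime(CLOCK_MONOTONIC, &start);   /* note the start time       */

        while (cap_still_charged()) {
            /* do other work here, or just check a few hundred times a second */
        }

        clock_gettime(CLOCK_MONOTONIC, &now);     /* data ready: subtract      */
        printf("discharge took %.3f ms\n", elapsed_ms(start, now));
        return 0;
    }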

The amount of CPU cycles required to control my toy OWI arm was trivial. To use a PIC would have been just plain silly. It has 5 motors and 5 encoders and video feedback, on an 800 MHz machine using interpreted QBasic!

And things like the PC keyboard, graphics card, sound card, and hard drive have their hardware support and are therefore out of the equation when it comes to "do we need a uC for this PC interface application".

Now it may be true that a uC is the *best* solution for a PID controller and very cost effective. And it may be true that a stand-alone robot base with simple behaviors only requires a uC. But if your application *needs* the computing power of a modern PC and has power to spare, why not use it?

John

Reply to
JGCASEY

I thought of that, but didn't think subtlety was the best way to go. :)

-- D. Jay Newman


Reply to
D. Jay Newman

OK I give in..... a can of spaghetti ~ = a can of worms ...............

Wink wink. nudge nudge say no more...say no more

|-]

Cheers

Dale

Reply to
DS


" You are making a mistake. My logic is undeniable " V.I.K.I.

Reply to
Dale Stewart

Did VIKI have a logical reason to be logical?

It depends on your goal. If it is to work together happily, we need empathy. Interestingly enough, those without empathy, but intelligent, are usually very successful, at the expense of everyone else. Those without empathy and without intelligence tend to end up in jail and be called psychopaths.

Apparently you can detect a psychopath by monitoring their involuntary reactions while viewing neutral and emotional material. A psychopath doesn't react emotionally to someone else's suffering.

If robots are to care for the increasing percentage of elderly people in many societies I think they would need empathy of some kind.

Cheers

Reply to
JGCASEY

I think that's an extreme over-generalization.

Speaking for myself only (I realize that there are proponents of certain technologies that sometimes tend to favor their preferences for all applications) I try to apply the technology to the task at hand based on what will get the job done most effectively. For some applications, I think a PC is best, while other applications find a number of microcontrollers optimum. Or, maybe some combination of both? I try to start with the question, "What do I want it to do?" & work from there, & let the technology fall out of the requirements.

I also consider software to be the "easy part", & find the manipulation of low-level functionality via electronics to be fascinating, so perhaps I have a bias in that direction.

Does that make me the anti-geek?

JM

Reply to
John Mianowski
