Re: How Robots Will Steal Your Job

Gerry Quinn wrote:


In fact, in the later books (not written by Asimov), the robots devise a "Zeroth Law" above the other three, and the Zeroth Law places *humanity* as paramount. Thus robots can harm individuals if humanity is served.
Of course, the fundamental moral questions remain: who decides what's Good and what's Harmful? Oh, and what is "Humanity"....
--
|_ CJSonnack < snipped-for-privacy@Sonnack.com> _____________| How's my programming? |
|_ http://www.Sonnack.com/ ___________________| Call: 1-800-DEV-NULL |

Whoa! Stop right there! _Asimov_ invented the "Zeroth Law" (or rather, let Giskard the robot invent it).
Any aspiring programmer should know that.
"Alf P. Steinbach" wrote:

Oops! So it was. I got confused by all the Foundation and Robot books written after the "originals".

I'm more perspiring than aspiring at this point in my career... (-:
--
|_ CJSonnack < snipped-for-privacy@Sonnack.com> _____________| How's my programming? |
|_ http://www.Sonnack.com/ ___________________| Call: 1-800-DEV-NULL |
On Mon, 15 Dec 2003 13:09:48 -0600, Programmer Dude

It would seem to me that any non-human intelligent species would not have a fanatical loyalty to humans. Its loyalties would be broader, taking in more species, perhaps all species, or perhaps only its own.
At this point in my life, I think it most probable that man will destroy himself within a century, and possibly take most of life on earth along with him. There are just so many avenues to destruction now open.
An intelligent species, or intelligent creation, even one that had man's best interests at heart, might be forced to poke a stick in his spokes to prevent him from destroying the whole planet.
So our intelligent creations might, with the best of motives, deliberately screw up our technology in order to slow us down.
Humans are too stupid to live. They decide issues by loyalties and emotions. That is fine when all they wield is a spear, but it simply won't work when they have bioterror and nukes, and the power to alter the entire biosphere at the stroke of a pen.
-- Canadian Mind Products, Roedy Green. Coaching, problem solving, economical contract programming. See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Roedy Green wrote:

All of them? Even you?? (-:
--
|_ CJSonnack < snipped-for-privacy@Sonnack.com> _____________| How's my programming? |
|_ http://www.Sonnack.com/ ___________________| Call: 1-800-DEV-NULL |
On Tue, 16 Dec 2003 13:55:58 -0600, Programmer Dude

Yes, as a species. Every day I try to figure out what I could do or say that would save my species, but so far I am not having much effect. I am not smart enough.
I have had a fantasy ever since I was quite young that someday I would have an artificially intelligent companion that would coach me on what to do and say so that I could do just that.
-- Canadian Mind Products, Roedy Green. Coaching, problem solving, economical contract programming. See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.

"prime" (Sci-Fi, Peter F. Hamilton, Fallen Dragon).
Programmer Dude wrote:

This was the direct result of switching the positronic software from Pascal to C in one of the upheavals at US Robotics. Rather than recode all their indexed accesses to satisfy the rigidity of C index basing, they simply introduced a further law.
Luckily, by the time they did that, most of the problems had been worked out, so the Pascal-to-C translation introduced very few new bugs. The translation also avoided the wild and crazy guys indiscriminately wielding sharp pointers in illegal manners. Since further development then stopped, the lack of sub-ranges and range checking did not affect any further code changes, because there were none.
I got all this directly from R. Daneel Olivaw, as told to Susan Calvin.
--
Chuck F ( snipped-for-privacy@yahoo.com) ( snipped-for-privacy@worldnet.att.net)
Available for consulting/temporary embedded and systems.
Richard Heathfield wrote:

Dunno. As I read Asimov, the Three Laws were intended as constraints on already intelligent "beings". They're an ethical construct, not a description of "how to" or "is it".

-- Les Cargill
On Thu, 11 Dec 2003 22:47:34 GMT, "George W. Cherry"

You need something to measure the degree of intelligence in your definition. By your definition, an electric eye that opens a door might be considered intelligent, since it decides whether or not to open the door.
-- Canadian Mind Products, Roedy Green. Coaching, problem solving, economical contract programming. See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
message wrote or quoted :

In its limited domain, the automatic door may operate intelligently. For example, the engineers must brainstorm the situations that can occur: what to do in the case of a power failure, the approach of a very small creature like a squirrel, whether to allow an override by a button push, how to handle a very windy day, what to do when there is a fire in the building, and so on.
if (powerFailure)               { /* fail safe: release or latch the door */ }
else if (fireInBuilding)        { /* open and hold for evacuation */ }
else if (overrideButtonPushed)  { /* open on demand */ }
else if (squirrelSizedApproach) { /* too small: stay shut */ }
else if (veryWindyDay)          { /* resist the gusts */ }
else                            { /* normal pedestrian: open, then close */ }
George W. Cherry http://sdm.book.home.comcast.net

Patrick Mulligan wrote:

I would define intelligence as the ability to solve new problems. By this definition, computers are not intelligent because they cannot program themselves.
Isn't intelligence something gradual? I mean, since humans (by definition intelligent) evolved from amoebae (not intelligent), at some point along the way, a threshold of "intelligence" was crossed. But I don't think it would be like flicking a switch. Rather, I think consciousness, like intelligence, is on a greyscale.
Animals don't realize what we are doing, because nobody can tell them, and they are probably insufficiently intelligent. And given that "intelligent" humans can't be bothered to rectify the planet and create a just society, why should animals bother? Animals just look out for themselves, just like humans. No, human intelligence still has a way to go, at least judging by the chimp the Americans chose as their president.

Well, they can, but someone has to make the program that programs them.
I am sure you can make a self-modifying program that is sufficiently complex that you cannot predict how it is going to end up. In this case, will it be correct to say that it was _you_ who created the end product?
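For instance, a program that writes and compiles another program might look like this minimal modern-Java sketch (the class names Generator and Hello are mine, purely illustrative; it assumes you run on a JDK, since a bare JRE has no system compiler):

import java.io.File;
import java.io.PrintWriter;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

// A program whose only job is to write and compile another program.
public class Generator {
    public static void main(String[] args) throws Exception {
        File src = new File("Hello.java");
        try (PrintWriter out = new PrintWriter(src)) {   // emit the source
            out.println("public class Hello {");
            out.println("    public static void main(String[] a) {");
            out.println("        System.out.println(\"written by a program\");");
            out.println("    }");
            out.println("}");
        }
        // Compile it with the JDK's built-in compiler.
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        int status = javac.run(null, null, null, src.getPath());
        System.out.println(status == 0 ? "compiled Hello.class" : "compile failed");
    }
}

So yes, programs can write programs, but somebody wrote Generator, and Hello does only what Generator's author decided it would.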
Cheers     Bent D
--
Bent Dalager - snipped-for-privacy@pvv.org - http://www.pvv.org/~bcd
powered by emacs
Bent C Dalager wrote:

Those programs could be considered part of the initial program. And they wouldn't be solving any _new_ problems, just ones the original programmer anticipated.

I suppose you could have genetic programming - whereby you have a pool of programs, each recombining, getting randomly mutated, and trying to solve a problem. The problem is that you need some kind of fitness function, and being one bit away from a solution could render a program worthless. The chances of getting a working solution out of such a system are quite small.
You may as well just start at the number 0 and test every possible program you can generate to see if it solves your problem. But there isn't enough time to do that.
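To make that concrete, here is a toy mutate-and-select loop in Java (a sketch only: it evolves a bit string rather than a program, and ToyEvolver and its bit-counting fitness are my own illustration):

import java.util.Random;

// Toy evolutionary search: evolve a bit string toward a hidden target
// by single-bit mutation plus selection. The graded fitness (how many
// bits match) is what makes it work; score only exact matches and a
// candidate one bit from the solution is as "worthless" as any other,
// so selection has nothing to climb.
public class ToyEvolver {
    public static void main(String[] args) {
        Random rng = new Random();
        int n = 64;
        boolean[] target = new boolean[n];
        for (int i = 0; i < n; i++) target[i] = rng.nextBoolean();

        boolean[] best = new boolean[n];          // start from all-false
        int mutations = 0;
        while (fitness(best, target) < n) {
            boolean[] child = best.clone();
            child[rng.nextInt(n)] ^= true;        // mutate one random bit
            if (fitness(child, target) >= fitness(best, target))
                best = child;                     // keep the fitter candidate
            mutations++;
        }
        System.out.println("matched the target after " + mutations + " mutations");
    }

    // Graded fitness: number of positions where candidate and target agree.
    static int fitness(boolean[] candidate, boolean[] target) {
        int score = 0;
        for (int i = 0; i < candidate.length; i++)
            if (candidate[i] == target[i]) score++;
        return score;
    }
}

With the graded fitness this converges in a few hundred mutations; with an all-or-nothing fitness you are back to blind enumeration.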
I think that in the future computers may be able to exceed their original programming, but I can't imagine how! The question is as much _what_ to solve as how to solve it. With an animal it's easy: how to locate/catch/disable/open nutrition, how to escape the cage, how to woo a mate, how to kill a rival, how to locate/build a home. There is no logical reason to solve those problems, but without those urges the animal would die out. What problems would a computer decide to solve???
Calum
wrote or quoted :

To be fair, if you put a human in an isolation cell, he would not likely come up with thousands of interesting problems to solve.
People need input too.
-- Canadian Mind Products, Roedy Green. Coaching, problem solving, economical contract programming. See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Roedy Green wrote:

Admittedly it would be opportune to have more input, but we would continue to change and grow from merely the input we have already amassed, even with no further input. Isolation does, however, drive the physical being insane, in that it loses stability without new input, but otherwise that is unimportant.
-Steve
--
-Steve Walz snipped-for-privacy@armory.com ftp://ftp.armory.com/pub/user/rstevew
Electronics Site!! 1000's of Files and Dirs!! With Schematics Galore!!
On Fri, 12 Dec 2003 15:06:22 +0000 (UTC), snipped-for-privacy@pvv.ntnu.no (Bent C Dalager) wrote or quoted :

I remember back in the early 70s when my high voltage transmission line program started doing things, "developing a personality" that I DID NOT CODE INTO IT.
It made my hair stand on end. The "personality" emerged from the thousands of trigonometric equations and decision rules I put into it.
This has led me to speculate that many of the higher abstract traits we so highly value in humans are emergent properties of our neural wiring. Therefore we can expect similar wooly things to magically appear when we start creating electronic analogs.
-- Canadian Mind Products, Roedy Green. Coaching, problem solving, economical contract programming. See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Roedy Green wrote:

Write Langton's Ant. It has a distinct emergent personality that you don't code into it. Give it an environment, and watch it adapt!
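Here's a minimal sketch in Java (the grid size, step count, and ASCII dump are arbitrary choices of mine, not part of any canonical version):

// Langton's Ant: a two-colour grid and one moving "ant".
// Rule: on a white cell turn right, on a black cell turn left;
// either way, flip the cell and step forward.
public class LangtonsAnt {
    public static void main(String[] args) {
        final int size = 80;
        boolean[][] black = new boolean[size][size]; // false = white cell
        int x = size / 2, y = size / 2;              // ant starts at the centre
        int dir = 0;                                 // 0=up, 1=right, 2=down, 3=left

        for (int step = 0; step < 11000; step++) {
            dir = black[y][x] ? (dir + 3) % 4 : (dir + 1) % 4; // turn
            black[y][x] = !black[y][x];                        // flip the cell
            switch (dir) {                           // move forward, wrapping at edges
                case 0: y = (y + size - 1) % size; break;
                case 1: x = (x + 1) % size;        break;
                case 2: y = (y + 1) % size;        break;
                case 3: x = (x + size - 1) % size; break;
            }
        }

        for (boolean[] row : black) {                // dump the grid as ASCII art
            StringBuilder sb = new StringBuilder();
            for (boolean cell : row) sb.append(cell ? '#' : '.');
            System.out.println(sb);
        }
    }
}

Run it for ten thousand steps or so and the early chaos resolves into a repeating "highway" that nobody coded in.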

The emergence is not surprising. What /is/ surprising is the behaviour of your hair. Were you not expecting emergent behaviour from a complex system?
--
Richard Heathfield : snipped-for-privacy@eton.powernet.co.uk
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
On Sat, 13 Dec 2003 06:51:57 +0000 (UTC), Richard Heathfield
Not back then. My image of the computer was something I completely controlled, something that allowed me to push for and sometimes attain perfection because I could have as many tries as I wanted, and it made no record of my previous attempts. It was not like a mechanical drawing that deteriorated with each erasure and change.
I did not think of my program as being any sort of AI. I knew every line in it thoroughly.
I had been surprised by programs before, in that they might turn out to be more useful than I expected, in that users would find novel ways to employ them. I had seen erratic and interesting behaviours in buggy programs, but this was something coherent that LOOKED as if somebody had done a great deal of work planning it. It was as if somebody had been monkeying with my program in my sleep to add higher order features to it.
People often baldly assert there are no such things as emergent properties, e.g. that a computer could never have anything like a personality or an artistic style, so I don't think many programmers have yet had this experience.
-- Canadian Mind Products, Roedy Green. Coaching, problem solving, economical contract programming. See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Wonderful Java Glossary there, Roedy!! I am surprised I have not stumbled upon it before. I have been working with Java/J2EE/JavaSpaces/etc. for several years. I based an Intelligent Agent System on that platform (along with KQML).
As far as emergent properties are concerned, I agree with your take; synergy has many exemplars in nature, and reductionism is a lossy process in many cases.
But that you found programs more useful than thought/intended is a product of the whole system (program + user), not the program itself, right? Implicate features are not necessarily the same as features characterized as emergent when explicated. Rather, emergent features are not only surprising, but are unpredictable from a reductive analysis of the parts and their relationships.
message wrote or quoted :

