is DNA stupid?

You keep saying that. However, his point is valid. *You* introduced Chomsky and never provided references for the statement. Also, Marcus is right in assuming that you must defend the statements that you make. You are the one that introduced these things.

Both of your statements have been debunked and refuted, yet you refuse to provide any defense. I find it extremely difficult to take your arguments seriously.

-- D. Jay Newman

Reply to
D. Jay Newman

------------------- If THAT'S all you're talking about, you're not so crazy, so quit acting like it.

Both camps can exist in each of us with impunity. The real problem in the pure "philosopher's" view is that we wouldn't be able to analyze and reproduce so many lower, and progressively higher mental functions if they were NOT mechanical in origin, and therefore analyzable, and the worst case is that we will quite likely soon build an aware being!

At some point the philosopher's argument is that even something that appears to entirely replicate human behavior may not of necessity be, within itself, actually self-aware, as even that function MAY, and I repeat MAY be entirely emulatable with or without TOTAL sufficiency.

Of course that is debatable, because anything that does what we do with our own mental personality in being self-aware seems as though it must, itself, truly also be self-aware!!

But of course this boils down to the same old Manichaean Heresy of souls in machines. Can a machine exist which has no "soul", no real inner being? Or is that a religious superstition born of the medieval terror of hubris, the "sin of pride before God"!!

Now unless one is superstitious, and without real cause, given the history of Science (after all, God hasn't come down all pissed off at Science YET!), one actually has NO reason to believe this "soul" business, except that we have not yet achieved it, which proves nothing of importance unless we discover absolutely that we CAN'T do so for some actual theoretical reason!! Until or unless such is found, we have NO reason to give credence to medieval superstitions.

-Steve

Reply to
R. Steve Walz

From:

formatting link
A lecture by Patricia Babbitt, University of California, San Francisco, April 24, 2003

"Using a standard measure for overall amino acid frequencies gives the information content of a random protein sequence as 4.19 bits/residue."

So there goes the 20,000 bits of information idea.

I still don't think 95% of the information we have is encoded in DNA, though. (It's sort of my job.)

Reply to
Alan Kilian

--------------------- You don't get it, no "cybersoul" in software somewhere in hardware either in this dimension or the future or any other is needed, because the Infinite Imagination includes that possibility along with all possible others simultaneously. Nothing you can describe is Infinite as is the Imagination, because it includes ALL possible lives, all possible experiences.

------------------------------------- I know about relative infinities, but the largest is the most general, and that's the Imagination, because it has NO features, and is neither personal, nor mental, nor physical, it probes the question of life right down to row, row, row your boat, life is naught but a dream!!

----------------------------- And much more...

------------------------------ Misquote, "It is a tale told by an idiot, full of sound and fury, signifying nothing." Shakespeare, "Macbeth"

---------------------------- What's the "meta-golden rule"?

-Steve

Reply to
R. Steve Walz

----------------------- No, I actually think this way and talk this way.

-------------- Sure we do, why not, nobody knows how big it is!

-Steve

Reply to
R. Steve Walz

Eastern Hemisphere

- karen715j

Reply to
Karen J

I thought this had generally been agreed upon for ages? That the brain is known to go through distinct phases: first rapid learning during childhood, then a curbing of learning ability so that the information loss bundled with learning (rapidly forgetting learned information that is incorrect) is also shut down, and what was learned in childhood can be applied in adulthood and never forgotten.

The idea that this is a programmed high-level mechanism is, quite frankly, absurd compared to the simpler notion that this change in learning ability could be caused by chemical factors released into the brain during the transition to adulthood, factors that inhibit or change the associative processes of each individual brain cell rather than acting specifically on higher structures. Can it really be true that a specific structure, a partially-programmed language system, exists and is lost as the associative capacity of the neurons is lost, or is it in fact merely one manifestation of a much more general, global loss in learning capacity?

I dunno if this is commonly accepted theory or not, I'm really just thinking aloud here, but I've been seriously thinking about conducting experiments on genetically-hardwired GLOBAL changes in brain behaviour in different stages of life - perhaps constructing two AIs, one programmed to learn behaviour and apply it constantly, the other to learn and apply, then slow down learning (not stop altogether, it should be stressed - as mentioned earlier, an adult can still learn, but it's much, much more difficult), and seeing how they compare in responding to identical situations, or even pitted against each other. There we go, a useful idea from this mostly useless thread, at last! Who said troll-bashing was a waste of time? Heh.

Tom

Reply to
Tom McEwan

Example? That was a single phrase. Examples have DETAILS!

Reply to
Tom McEwan

"Fear the Supercomputer System Operator, who could sack your silly cybersoul at any second."

But since you don't accept (yet) that you are a cybersoul, I guess you are free to blather on, and on, and on....

- karen715j

Reply to
Karen J

information content of a random protein sequence as 4.19 bits/residue."

4.19 bits/residue? Where did that number come from? DNA sequences would be similar to a base-4 numeric system as opposed to binary (base-2), but how can you have a base-4.19?
Reply to
David Harper

The paper explains it far better than I ever could, but here goes....

To tell how many bits are required to store a symbol, take log-base-2(number of possible symbols).

So, for DNA, there are 4 possible symbols. log_2(4) = 2 (bits)

There are 20 amino-acid symbols. log_2(20) = (about) 4.3 (bits)

Since the amino-acid symbols don't occur equally in real proteins, Patricia and others have estimated that there are about 4.2 bits of information in each protein symbol.

Is that an OK explanation?
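To sanity-check those numbers, here is a quick Python sketch of my own. The frequency skew below is made up purely to show the effect; it is not Babbitt's actual amino-acid frequency table:

```python
import math

# Bits per symbol when all symbols are equally likely:
print(math.log2(4))    # DNA: 4 bases -> 2.0 bits
print(math.log2(20))   # proteins: 20 amino acids -> ~4.32 bits

# With unequal symbol frequencies, the Shannon entropy is lower.
# Toy (invented) skew: 5 common, 10 rare, 5 in-between residues.
freqs = [0.10] * 5 + [0.03] * 10 + [0.04] * 5
assert abs(sum(freqs) - 1.0) < 1e-9  # must be a valid distribution

entropy = -sum(p * math.log2(p) for p in freqs)
print(round(entropy, 2))  # 4.11, a bit under log2(20) ~ 4.32
```

Real amino-acid frequencies are skewed differently, but the principle is the same: any non-uniformity pulls the per-residue information below the 4.32-bit maximum, which is how you land on a non-integer figure like 4.19.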

So, I'm going to go with 30,000 genes and a total guess of 400 symbols-per-gene conservatively (actually EXTREMELY conservatively), so I get a pure guess of

30,000 * 400 * 4.2 = Fifty megabits of "information" encoded in JUST THE PROTEINS of the Human genome.
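Spelling out that arithmetic (the gene count and symbols-per-gene are the guesses from the post, not measured values):

```python
genes = 30_000            # rough human gene-count guess
symbols_per_gene = 400    # "EXTREMELY conservative" residues-per-protein guess
bits_per_symbol = 4.2     # Babbitt's estimate for real protein sequences

total_bits = genes * symbols_per_gene * bits_per_symbol
print(round(total_bits))  # 50400000 -- about fifty megabits
```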

There are regulatory regions, structural DNA, promoter regions, multiple splice sites, multiple-exon proteins, forward- and reverse-strand information, and many, many other places for "information" in the human genome.

Also, I would like to reiterate that I personally do not believe that 95% of a human's knowledge is pre-programmed into our DNA.

I have too many friends doing infant-memory studies to believe that.

There are only 20 amino-acid symbols.

Reply to
Alan Kilian

I know. I was *trying* to get this discussion back to robotics.

It doesn't matter from a linguist point of view *how* this happens.

It is known that certain things are learned at certain ages and that some of them are linked.

I believe that this is a fairly simple process of the brain. I never said anything else.

This would be an interesting experiment also.

I was thinking of making a net of neural networks, and at some point reducing the feedback to slow down the learning process to simulate human learning.
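That feedback-reduction idea can be sketched in a few lines. This is a toy illustration I'm adding, not Newman's actual design: a single-weight learner whose learning rate decays over its "lifetime", so early experience dominates and late experience barely registers:

```python
import random

def train(decay):
    """Toy one-weight learner fitting the target function y = 2x.

    `decay` multiplies the learning rate after each step, mimicking
    the proposed slow-down of learning (decay=1.0 means no slow-down).
    """
    rng = random.Random(0)  # same experience stream for every run
    w, lr = 0.0, 0.5
    for _ in range(50):
        x = rng.uniform(-1, 1)
        error = 2.0 * x - w * x   # difference from target output
        w += lr * error * x       # gradient-style update
        lr *= decay               # "critical period" closing
    return w

plastic = train(decay=1.0)  # keeps learning at full rate; w ends near 2.0
slowed = train(decay=0.8)   # learning rate shrinks; w falls short of 2.0
```

Running both on identical inputs and comparing the final weights gives exactly the kind of side-by-side comparison proposed above: the fully plastic learner converges, while the one whose learning curbs early gets frozen partway.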

-- D. Jay Newman

Reply to
D. Jay Newman

These are very different from your proposal.

Your statement is that 95% of the personality (and by that you *seem* to mean knowledge) in the brain is hard-wired.

Therefore a child raised in total isolation should have 95% of the information of a normally raised child. If I am wrong, please tell me where I have misunderstood you.

You're getting off the subject. Deaf children are rarely raised without language. They may not be able to hear, but they can see and know that communication is going on.

Only in a few pathological cases are there major problems with acquiring language. And most of those are from lack of personal feedback.

I was talking about feral children who have attained adolescence without language acquisition. These were studies from your link.

In none of these cases have the children been able to assimilate human society, including language.

If 95% of the personality were built in, then why do these children have such problems?

Strange. I thought you made a statement about posters insulting you? I have been very careful to debate your points and not you personally.

And yes, I think my thought-experiment totally debunks your statement.

And you still have yet to provide a reference.

-- D. Jay Newman

Reply to
D. Jay Newman

You tripped up at the first hurdle of understanding. If it didn't need to be programmed, then who wrote the AI program?

Reply to
e7

Why is it absurd? How do you propose to make a machine that can learn without having put in the program for it to learn?

Reply to
e7

And if you read any/all the works on AI, you will know that emergent behaviour doesn't exist in any software written by anyone. Each behaviour is a function of the program. Emergent behaviour means behaviour outside of programming, which sadly has never been reproduced.

Reply to
e7

I'm afraid none of us are your encyclopedia service. I wouldn't post it even if it were sitting right next to me. You can find it if you want, as any lecturer will tell you. Even better, know your subject.

Reply to
e7

Well they didn't. The first person to do so will be formally recognised by everyone.

Reply to
e7

Poppycock. Emergent behaviour in software systems goes as far back as Conway's Game of Life.
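For anyone who doubts it, the classic demonstration fits in a dozen lines. Nothing in the rules below mentions motion, yet the glider pattern propels itself diagonally across the grid: the movement emerges from purely local birth/survival rules. A minimal Python sketch of my own:

```python
from collections import Counter

def step(live):
    """One Game of Life generation on an unbounded grid.

    `live` is a set of (x, y) cells. Rules: a live cell with 2 or 3
    live neighbours survives; a dead cell with exactly 3 is born.
    """
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: after 4 generations the pattern reappears shifted by
# (1, 1), though no rule anywhere mentions "movement".
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
assert cells == {(x + 1, y + 1) for (x, y) in glider}
```

Whether that counts as behaviour "outside of programming" is exactly the dispute here; the code only shows that behaviour not explicitly written anywhere in the rules can still reliably appear.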

No, don't bother writing a reply. I have seen you in action when I busted you over your claim that the human genome has 20,000 bits when it actually has 3,000,000,000 bits. No need to repeat that performance with your latest error; we already know that you are ineducable.
Reply to
Guy Macon

I have a more plausible explanation. You made it up. You fabricated it. You told a fib. You pulled it out of your hat. You lied. You imagined it. You got the idea from the voices in your head. This explains why nobody can find the "facts" that you claim are true and explains why you refuse to say where you got them. Liar, liar, pants on fire.

Reply to
Guy Macon

PolyTech Forum website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.