I have excerpted a piece of correspondence from a friend here. Hope it makes for lively discussion.
"The particular application was understanding why digital computers
and any digital circuit only has ones and zeroes. A typical integrated
circuit works on 5 volt logic. A lot of people think that a 1 is +5 volts
while a 0 is 0 volts. If this were the case, nothing would ever work
reliably since the slightest voltage drop would be interpreted as in between
a 1 and a 0 which is not allowed. So, the designers set a threshold voltage
about 1/2 way in between or about 2.5 volts. This means that the decision to
decide if an input to an IC is to be interpreted as a 1 is made when the
voltage is between 2.5 and 5.0 while the decision for a 0 is between 0 and
2.5. It's obvious that, if the voltage, or noise on the input, is around 2.5volts (even for an instant), the circuit could see a series of 1s and 0s
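To make his point concrete, here is a minimal sketch in Python of that
single-threshold scheme. The 2.5-volt threshold and the noise amplitude are
just the illustrative numbers from the letter, not any real IC's behavior:

    import random

    THRESHOLD = 2.5  # volts: at or above reads as 1, below reads as 0

    def read_bit(voltage):
        """Interpret an analog voltage as a digital bit."""
        return 1 if voltage >= THRESHOLD else 0

    # An input sitting right at the threshold, plus a trace of noise,
    # produces a jittering stream of 1s and 0s rather than a steady value.
    random.seed(0)
    samples = [2.5 + random.uniform(-0.05, 0.05) for _ in range(10)]
    print([read_bit(v) for v in samples])  # e.g. [1, 1, 0, 1, 0, ...]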
Other designs therefore chose to have a 1 represented by a voltage between 4
and 5 volts, while a 0 would be between 0 and 1 volt.
This meant that voltages in the range between 1 and 4 volts did not change
the detected status; that margin is called "noise immunity". Of course, if
the voltage were about 1.1 volts, it probably would not be interpreted as a
0, but it would not be a 1 either.
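Again as a sketch in Python, assuming the letter's illustrative band edges
of 1 and 4 volts: anything in the middle band is reported as indeterminate
rather than flipping between states.

    def read_bit_with_margin(voltage, low=1.0, high=4.0):
        """Interpret a voltage as 0, 1, or None (indeterminate)."""
        if voltage <= low:
            return 0
        if voltage >= high:
            return 1
        return None  # inside the forbidden band: neither a 0 nor a 1

    # Noise around 2.5 volts no longer produces spurious 1s and 0s;
    # the reading only changes once the input leaves the 3-volt margin.
    for v in (0.2, 1.1, 2.5, 4.7):
        print(v, "->", read_bit_with_margin(v))  # 0, None, None, 1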
The same thing goes for everything else! Answers to questions such as "What
time is it?", "How do you feel?", "Is that heavy?", "How much does it
weigh?", "How old are you?", "Which direction is North?", and on and on.
I started telling everyone who would listen (ha!) that 'there is NO digital
world, only analog'."