20-05-2003, 12:08 AM
Ted Byers
Orchid boarding houses


"Larry Dighera" wrote in message
...
On Thu, 15 May 2003 00:16:07 -0400, "Ted Byers"
wrote:

While I can get a computer to do literally
anything I want, just by writing a program to do it


You can't get a computer to SUCCESSFULLY divide by zero. :-)

Sure you can! The result, in IEEE floating-point arithmetic, is
infinity. Floating-point arithmetic has two useful identities: INF and
NaN. NaN is an acronym for "not a number", and is used whenever
something really bad happens (0.0/0.0, for instance, yields NaN); it
just isn't useful in this context. INF represents infinity, and, while
most FPUs will raise an exception on a divide by zero, all one needs to
do is catch and clear the exception (on Windows this is done by the OS
and converted into a "structured exception") and then set the result to
INF. Of course, how easy or painful this is depends on what language
you're using. This is all very simple, although it isn't very
practical. That said, dividing by zero is something I generally don't
want to do; in fact, I generally try to avoid it! In the software
engineering I do, a divide by zero is generally a consequence of
structural instability in a system given the parameters the user has
selected. In such a case, it is best to test for a denominator of zero
and, when that condition occurs, stop the simulation or analysis and
notify the user about what has happened.
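
To make that concrete, here is a minimal C++ sketch of my own, assuming
an IEEE 754 FPU and a compiler that provides the standard <cfenv>
header; the variable names are just for illustration. It shows the INF
and NaN results, the "catch and clear" of the divide-by-zero flag, and
the zero-denominator guard described above:

#include <cfenv>   // std::feclearexcept, std::fetestexcept, FE_DIVBYZERO
#include <cmath>   // std::isinf, std::isnan
#include <cstdio>

// Strictly, testing the floating-point status flags requires
// "#pragma STDC FENV_ACCESS ON", which not every compiler implements;
// in practice this behaves as shown on mainstream compilers.

int main() {
    // volatile keeps the compiler from folding the divisions away at
    // compile time, so the FPU actually performs them at run time.
    volatile double one = 1.0, zero = 0.0;

    std::feclearexcept(FE_ALL_EXCEPT);

    double q = one / zero;          // IEEE 754: nonzero/0 -> +INF
    std::printf("1.0/0.0 = %g (isinf: %d)\n", q, (int)std::isinf(q));
    std::printf("FE_DIVBYZERO raised: %d\n",
                std::fetestexcept(FE_DIVBYZERO) ? 1 : 0);
    std::feclearexcept(FE_DIVBYZERO);   // the "catch and clear" step

    double r = zero / zero;         // IEEE 754: 0/0 -> NaN
    std::printf("0.0/0.0 = %g (isnan: %d)\n", r, (int)std::isnan(r));

    // The practical alternative: test the denominator first, and stop
    // the run with a message instead of dividing.
    double den = zero;
    if (den == 0.0) {
        std::fprintf(stderr, "Zero denominator: the system is unstable "
                             "for these parameters; stopping the run.\n");
        return 1;
    }
    return 0;
}

Note that by default the IEEE divide-by-zero exception is non-trapping:
it merely sets a status flag and delivers INF, which is why the flag
has to be tested and cleared explicitly. Enabling hardware traps (and
hence Windows structured exceptions) is a separate, platform-specific
step.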

But I guess, in some respects, being able to get a computer to do
anything I want it to do is guaranteed by more than two dozen years'
experience working with them: knowing what they can and can't do
guarantees that I never want a computer to do what I know it can't. For
example, I don't want to spend time making a computer that can speak
English, or any other natural language, because natural language is
multivocal (i.e. there are many sentences that can carry more than one
meaning - a common tool in humour), and computers can handle only
univocal languages. A computer can neither make nor understand a pun.
Of course, down the road, some genius may well prove to me that what I
had thought impossible is both possible and practical; but that is a
different question.

>> (I can even get a computer to exhibit behaviours as complex as those
>> of almost any animal; a fact which raises countless interesting
>> philosophical questions we probably should avoid in this forum),
>
> That would be an interesting topic of discussion.

Indeed it is! I spent a large part of my second doctorate studying this
from different perspectives: philosophy, educational psychology, cognitive
science, software engineering, &c. But it is also a very difficult one and
an area where even experts seem to easily fall into error.

>> I would likely be quite dangerous trying to put together a circuit
>> of any kind.
>
> It's not as difficult as you fear.

I hope you're right.

>> Or maybe it is a question of simply never having done it before. Do
>> you know of any good reference books which would show me how?
>
> These authors can make learning the fundamentals of electronic
> circuits painless:

[snip]

> I'll cross-post this article to some of these newsgroups, and perhaps
> we'll get some additional help in getting you started in electronic
> circuit design:
>
> sci.electronics.basics,sci.electronics.design,rec.radio.amateur.homebrew


Terrific. Thanks a lot. I really appreciate this.

Cheers,

Ted