I can put simply the message I wish you to believe and take away from here:
The value and importance of the qualified
teacher to our civilization becomes ever
higher as the public is beguiled into the
belief that computers can think for them.
They cannot. Every student is different,
but computer programs are the same, and
we have not yet found how to adapt them to
each student as a human teacher does.
In the hierarchy of teachers for our
complex world, no teachers are as critical
as teachers of mathematics. Because that
is where logic is learned, and all human
survival has depended upon using logic.
I realize that popular opinion does not reflect this, with the result
that you are not remunerated like basketball players. But then you
never have been, as I can attest, being the offspring of a mathematics teacher.
Instead, you must accept the honor given you today, for your skills in
imparting knowledge of the field of mathematics, most commonly to
younger people. As I read it, the Tandy Prize is honoring you not so
much for your own knowledge, or your developments, as for your skill and
dedication and success in passing on that knowledge to others. I
applaud the Tandy Corporation for starting and continuing the tradition
that brings you here today. I applaud the principles enunciated by that corporation.
An old saw is appropriate here, and I probably don't have the words just
right, but it's "Give a man bread, and you have fed him once; teach him
to plant, and you have fed him forever". I hope all my fellow computer
programmers forgive me for saying that they mostly belong in the first category.
There is another old saw -- one I dislike immensely. "Those who can, do.
Those who can't, teach." If our supply of teachers should diminish
proportionally, how will anyone ever learn the requisites of "doing"?
Moreover, that might have been a popular position when manual labor and
the production of goods was dominant, but it surely fails now when
information is our dominant asset.
Yes, new knowledge always comes along. But it seldom replaces much of
the old knowledge, which still must be taught to each new human being.
And the old definition of a teacher as someone on the other end of a log
from a student is still valid.
I remember (and that was in 1937) when my father was a superintendent of
schools, and the winter vacation of his school overlapped by a week the
vacation period of my college. He invited me to sit in as a math
teacher for a week. I found the students in the process of learning
binomial expansion, and they were puzzled. Easy, I said. Just use
Pascal's triangle. What's that, they asked? So I put it on the board,
and showed the derivation. End of the difficulties! We moved on. Two
weeks of their learning lives saved by a different teacher.
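The trick in that story takes only a few lines to state. As a sketch (not the blackboard derivation itself), here is how Pascal's triangle yields the binomial coefficients those students needed:

```python
def pascal_row(n):
    """Return row n of Pascal's triangle: the coefficients of (a + b)**n."""
    row = [1]
    for _ in range(n):
        # Each new row is the previous row added to a copy of itself
        # shifted one place -- exactly what the triangle shows visually.
        row = [left + right for left, right in zip([0] + row, row + [0])]
    return row

# Coefficients of (a + b)**4: 1, 4, 6, 4, 1
print(pascal_row(4))
```

Each entry is the sum of the two entries above it, which is why the expansion can be read straight off the board with no memorization.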
NOW ABOUT THOSE COMPUTERS
Mathematics being a basic science, you might be interested in how it
happened that you have me, a computer programmer and not a teacher,
standing here to tell you why you teachers are so important in a world
run by computers. They do run it, you know.
It started with the Wall Street Journal in 1996. I had read an article
about impending difficulties with computer programs where the way the
year was represented was shortcut. Sort of like having only two wheels
on the odometer of a used car you were thinking about buying, and you
didn't know whether it had gone just 97 miles or 100,097 miles.
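The odometer analogy maps directly onto two-digit years. A minimal sketch of the ambiguity (the cutoff value here is my own illustration, not anything from the original programs):

```python
def expand_two_digit_year(yy, pivot=50):
    """Guess a four-digit year from a two-digit one.
    Any pivot is an arbitrary guess -- which is exactly the problem."""
    return 1900 + yy if yy >= pivot else 2000 + yy

# '97' reads naturally as 1997...
print(expand_two_digit_year(97))   # 1997
# ...but arithmetic on the stored two digits gives nonsense across 2000:
print(0 - 97)                      # -97 "years" elapsed, not 3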
It happened that "Time and the Computer" was a hot button of mine. In
fact, I published an article of that name in February 1979. In it I
warned about the consequences of the folly of implying, rather than
representing specifically, the millennium and century values in
computers. Unfortunately nobody paid much attention.
And I was interested even before that. In 1969 I proposed a National
Computer Year of assessment and planning, somewhat like the previous
International Geophysical Year. One of the things I hoped to fix was
this very same problem, based upon the newly issued international time
standard that mandated a 4-digit year representation. Might have
worked, but President Nixon didn't think computers were that important,
and he wouldn't sign such a proclamation.
With these bitter memories of failure, I resurrected a scheme for
effectively storing all 4 digits of the year in the same space that the
previous 2 digits had used, together with a method of using them in
computation, and applied for a patent.
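The speech does not describe the patented method itself, so the following is only a generic illustration of the underlying idea, not Bemer's scheme: two 8-bit character positions that held the digits "0" through "9" have far more bit patterns than ten, so a full four-digit year can ride in the same two bytes:

```python
# Hypothetical illustration only -- NOT the patented method, which the
# speech does not spell out. Two bytes that once held two ASCII digits
# can instead hold century and year-within-century as small integers.

def pack_year(year):
    """Pack a 4-digit year into 2 bytes: century, then year-within-century."""
    century, yy = divmod(year, 100)
    return bytes([century, yy])

def unpack_year(packed):
    return packed[0] * 100 + packed[1]

p = pack_year(1998)
print(len(p), unpack_year(p))   # 2 bytes round-trip back to 1998
```

The point is only that no extra storage is needed; the cost moves into the code that must pack and unpack on every use.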
The Wall Street Journal ran a story on this. Your director Kaye
Thornton appreciated the implications this might have in the coming
years, seeing correctly that by the time of this meeting the problem
would be understood by many more people. And I can promise you that
you're going to be more than understanding it. You're going to be living it.
WHAT'S THE LESSON?
I have been a computer programmer since early 1949 -- 49 years of
computer programming -- and I still love it. So much that I sometimes
feel I was specifically designed to do just that. Ridiculous, I know,
but I get that feeling.
But as I have so many times cautioned, computers are but a tool. They
are not gods. And tools can be used for good or evil. Fire is such a
tool. I like a fire in a fireplace, and to cook a steak. But I don't like it burning my house down.
I'd like you to keep this in mind as I go along, because much of what
I'll cover is about bad use of computers. I am not knocking computers
per se, just their misuse. Banks don't like electronic theft of funds,
even though electronic transfer can be a great boon. People don't like
it at all if a virus is slipped into their PC. And to my mind the
worship of the computer as a teaching tool is not justified.
Many years back Dr. Joseph Weizenbaum of MIT wrote a little program
called "Eliza". From another room it would ask questions, and then give
canned dialogue fragments in reply. It was meant as a joke, actually to
debunk the superstition that computers could think.
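The mechanism was almost trivially simple. A sketch of the canned-reply idea (these keywords and replies are my own illustration, not Weizenbaum's actual script):

```python
import random

# Keyword -> canned reply templates, in the spirit of ELIZA's "DOCTOR"
# script: no understanding, just pattern matching and stock phrases.
RULES = {
    "mother":   ["Tell me more about your family."],
    "always":   ["Can you think of a specific example?"],
    "computer": ["Do machines worry you?"],
}
DEFAULT = ["Please go on.", "I see."]

def eliza_reply(text):
    for keyword, replies in RULES.items():
        if keyword in text.lower():
            return random.choice(replies)
    return random.choice(DEFAULT)

print(eliza_reply("My mother made me do it."))
```

A few dozen such rules were enough to convince people the machine understood them, which was precisely the superstition the program was meant to puncture.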
But greatly to Dr. Weizenbaum's surprise and disgust, the psychiatry
profession took it up as a marvelous breakthrough in patient treatment.
Papers were written about it in scholarly journals. PhD theses
appeared on improvements. It did cause Weizenbaum to write a book
himself, debunking computer thinking.
Beware of this happening in your profession. Or any profession.
My bio can be found elsewhere, but I did learn the basics back then.
The way a computer really worked on the inside. Of course then we went
to COBOL and FORTRAN and such, and more people with fewer skills could
suddenly learn to use computers. I now wonder if I was wise in having
such a large influence in that.
Along came the Year 2000 bug, when, late in the day as it were, it was
found that everybody had put only those two wheels on the year odometer.
Most people have now found that that was a dangerous mistake, albeit
one that I started warning about in 1971.
So who could fix it? People that used only COBOL? Most of them don't
even know that a program called a compiler changes their COBOL programs
to a form the computer understands, in order to run. And if they did
realize this, they would have no idea how a computer translated those
"object" programs into action at the most basic level. COBOL is seldom
taught any more, nor are machine language and its principles.
Fortunately for some, but not for all, my old basic knowledge of
character codes, and of representing time, permitted the creation of a
method that may allow computerized correction of the problem, as opposed
to new people studying programs someone else wrote 30 years ago without
documenting the thought and logic processes that went into them. My
method may give some relief, but certainly not for the computer chip
inside your coffee maker, or your automobile, the satellite from which
you get your television signals, the nuclear power plants, or the air
traffic control system.
In our trade jargon, there is no "silver bullet". My own method was
described in the Wall Street Journal as only a "silver-plated bullet".
Why am I telling you this now? Because the "silver bullet" form of
wishfulness is endemic. It takes its most virulent form in the current
President and Vice President of the United States. They claim that the
fast way to better education is to put a PC on every child's desk.
When this happens, I want you all to alert me when you hear the first
"Wow! Now that I have a computer I can learn
algebra better. Where ARE those algebra
lessons? Point me to them. I can hardly wait.
What's the icon for algebra?"
If there should be an algebra icon in Windows YY, there had better
be an initial decision box for
"Do you want to get a warm and fuzzy feeling
that your answers will always be acceptable,
or do you want to learn how to get CORRECT
answers, just in case you eventually design
a skyscraper or bridge where hundreds could
die if you don't know what you're doing?"
I ask this seriously! I was one of three supervisors in the
computational department of Lockheed Aircraft in 1952. One of the
others was in charge of flutter calculations. It turned out that the
IBM 704, the large computer of that time, employed "fuzzy math" -- that
is, the floating point arithmetic gave answers of minus infinity when
two identical values were subtracted. Not a zero value with a reduced
power exponent to indicate loss of significance, mind you, but minus infinity.
Thus the computer saw no flutter in the wings, but the wings themselves
did! The vibration flutter just built up to a point where the metal
fractured. And then they fell off the airplane. Hundreds of people
flying in Lockheed Electras died. None of them knew, when the aircraft
was plunging to the ground, that some engineer had been badly trained in mathematics.
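Modern IEEE 754 hardware no longer returns minus infinity for a subtraction like that, but the underlying hazard -- silent loss of significance when nearly equal values are subtracted -- is still with us. A small demonstration:

```python
# Catastrophic cancellation: subtracting nearly equal floats wipes out
# most of the significant digits, with no warning from the hardware.
a = 1.0000001
b = 1.0000000
diff = a - b
print(diff)             # close to 1e-07, but not exactly 1e-07

# The relative error of the result dwarfs the relative error of the
# inputs: the leading digits cancelled and only rounding noise remains.
rel_err = abs(diff - 1e-7) / 1e-7
print(rel_err > 1e-10)  # True: significant digits were lost
```

An engineer who knows the mathematics recognizes the loss of significance; one who only trusts the printout does not.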
Also in the Wall Street Journal I saw some Letters to the Editor under
the caption that said we did not need to follow eighteenth century math
teaching any more. It would not suffice for today's world, said one
writer. Of course this was all part of that big debate you math
teachers have had for several years about how best to teach the subject.
A virulent discussion, isn't it?
I won't impose an answer, but I assure you that the users of
eighteenth-century mathematics would not have left out 2 digits of any
argument in their own calculations! And I was trained the same way.
If on January 01 of the year 2000 your parents receive their pension
checks and their Social Security payments -- if your bank allows you to
write a check on, or withdraw, money you know to be yours -- if
airplanes can take you to see your children -- if the transportation
system brings farmers seed, and brings you food and clothing and heating
oil -- if the water supply doesn't quit -- if the stock markets haven't
collapsed completely -- if your house isn't repossessed for nonpayment
of the mortgage -- THEN
You can be grateful that I and some few others were not taught fuzzy
math by computer programs that coddle us, but rather by real mathematics
teachers that passed along the basics that they had learned. People
like you! And like my father, who was so concerned about my education
in mathematics that he taught my courses himself, in addition to being
the school superintendent.
Do I think mathematics teachers are critically important? Do I think
the profession must survive despite computers? Who can doubt that I do?
Do people forget an outstanding math teacher? No, I don't think so,
from my own evidence. When I was proposing the National Computer Year,
I was in the men's room at the Old Executive Office Building here in
Washington, and I recognized the man standing at my right to be Dr. Lee
DuBridge, former President of Cal Tech, and then the President's Science
Advisor. I asked: "Dr. DuBridge, have you ever been to Sault Ste.
Marie, Michigan?" He replied that yes, he had attended high school
there. And did he by any chance remember his math teacher? "Yes, Mr.
Bemer". "Well", I said, "I'm his son". And not only did he say that my
father was a fine teacher, but also that my mother, his English teacher,
was beloved by every student.
So you see that there may be a chain of mathematics teachers that carry
on their knowledge to the next generation of teachers. Which I am sure
is why the Tandy people honor outstanding students as well. Here I must
confess that I did teach a bit of Projective Geometry at UCLA during
World War II.
You are not anachronisms. You're custodians of the human future, and
defenders against inexactnesses that can plague our lives. Why do I
rail against "inexactness"? Aren't "approximately" or "sort of" just as
good? If they were, you would all have credit cards with a "00"
expiration date on them. But you don't! When your decisionmaking and
your lives are given over to computers, you'll just have to accept the
fact that very few of them operate via inequalities.
Despite the fact that the old IBM computer gave inexact answers and hundreds died for it, the worship of computers goes on.
But suppose that we already know all we have to know, or will ever know.
The story goes that the U. S. Patent Commissioner, around the turn of
the last century, said that he might as well resign, for about
everything had been invented that there was to invent. Makes me think
of the song "Everything's up-to-date in Kansas City ... they've gone
about as far as they can go ..."
But that did not turn out to be true. For just a few examples, I
invented the escape sequence in 1960. You wouldn't have laser printers
or the Web without that. Around that time Benoit Mandelbrot invented
fractals, and look what that led to. And Arthur C. Clarke came up with
the geostationary satellite idea. That old patent commissioner didn't
even envision moving satellites.
I've been leading up to my main caution, which is -- if you program our
computers for our present world, and, knowing that they are running that
world, fail to properly educate our children, who will reprogram those
computers when new knowledge is found? The kid who cannot make change
without a cash register? The people that don't know how the computer is
doing its thing? Even if no new knowledge is found, who can reprogram
the computer when the old knowledge is found to be faulty? I certainly
would not depend upon computers to reprogram themselves. Maybe in a
couple of more centuries, but not now.
So again I say, watch these next two years carefully, as those
faultily-programmed computers make a mess of things on our way to the
Year 2000. And leave here with the determination to teach exact
mathematics so that such catastrophes may not again arise.
Fight against dictionaries that record bad usage rather than specify
correct usage, to the detriment of the languages we use, and "dumbing us
down", in the words of Senator Moynihan.
Forget that IBM's "Deep Blue" computer finally bested a human in chess.
That was a specialized aberration. Don't let it be used as argument in
favor of turning the world over to computerized decisions that are not
countermandable. We humans can, and always should, outthink computers.
Reread, perhaps, E. M. Forster's "The Machine Stops". A frightening
prediction, even in the Year 1909, of what can happen when people
abdicate control of their lives to machines.
Let computers be our students, not our teachers.
Let computers be our servants, not our masters.