Computer pioneer blames government for Y2K glitch

December 30, 1999


 
[Photo: Computer pioneer Bob Bemer foresaw the Y2K problem more than 40 years ago when he developed a programming language for computers.]

 
[Photo: Early computers read punched cards; ASCII, the numerical code Bemer helped create, represents English characters as numbers.]

NORTH CENTRAL TEXAS — The Y2K computer bug could be blamed on the federal government, apathy and laziness, according to the father of ASCII, the American Standard Code for Information Interchange.
Bob Bemer, who created the code, said he warned federal officials about the potential problem in the 1950s. He maintains that hardware limitations had nothing to do with the problems computers could encounter after Dec. 31.
Instead, he points to the U.S. Department of Defense as the biggest culprit. Bemer, who lives in North Central Texas, said defense officials demanded that programmers use only two digits for the year in computer code.
The computer pioneers did as they were told.
“There wasn’t any reason for us to not use four digit years back then,” Bemer said.
At the dawn of the Computer Age, punch cards were used to enter information. The machine read the cards, converting the patterns of holes into electrical impulses it could store and process.
However, computers weren’t immune from human error.
“It’s a commonly accepted excuse that the hardware back in those days was too expensive to use four-digit years,” Bemer said.
But that wasn’t true, he said.
Bemer came up with a way to use two holes in a punch card to indicate a four-digit year. But the method was ignored.
Bemer helped create the underlying technology, inventing and developing some of the functions computers still use today, such as ASCII, the great translator between machines and humans.
Pronounced ask-ee, the code represents English characters as numbers, assigning each character a value from 0 to 127.
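
For readers curious to see the mapping, here is a minimal sketch in Python (a modern language chosen purely for illustration; it postdates Bemer's work) showing how ASCII pairs characters with numbers in the 0-to-127 range:

    # ASCII assigns every character a number from 0 to 127.
    for ch in "Y2K":
        print(ch, ord(ch))      # 'Y' -> 89, '2' -> 50, 'K' -> 75

    # The mapping runs both ways; 27 is the ESC (escape) control code.
    print(chr(65))              # prints 'A'
    print(ord("\x1b"))          # prints 27, the code behind the ESC key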
If your computer has an “ESC,” or “escape” key, Bemer put it there.
Forty-one years ago, Bemer said he saw the future while working for IBM and helping the Mormon Church computerize genealogical records.
“And it became readily apparent at that time that you couldn’t use two digits for people who were born in 1363, for example,” he said.
Bemer first alerted the world to the problem of using two-digit years back in 1958. He wrote an article in 1971 saying, “Don’t drop the first two digits. The program may well fail from ambiguity in the year 2000.”
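
The ambiguity Bemer warned about is easy to demonstrate. In this hypothetical Python sketch (not code from the era), a record stamped with the two-digit year "00" sorts ahead of one stamped "99", as though the year 2000 came before 1999:

    # Dates stored with two-digit years, in YYMMDD order
    renewal_1999 = "991231"   # Dec. 31, 1999
    renewal_2000 = "000101"   # Jan. 1, 2000

    # A straightforward comparison gets the order backwards,
    # because "00" is treated as smaller than "99".
    print(renewal_2000 < renewal_1999)   # True -- 2000 appears to come first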
That was then. Now, Bemer said the problem is not faulty programming.
“If we threw away every program we had, we’d still be in trouble because we have these mounds of data — just mountains of data — already stored in computers and it is only a two-digit year.”
Decades of two-digit years have created a tidal wave of records that computers will have to deal with for some time to come. Although most computers are now Y2K compliant, they are still left to guess at the century for billions of records that omit it.
“Doesn’t make any difference if the program is wrong,” Bemer said. “The data is wrong. It has to be corrected.”
For now, computers will have to assume the century. But that quick fix, Bemer said, will come back to haunt us in 20 or 30 years.
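
The "assume the century" fix typically relies on a pivot year: two-digit years below some cutoff are read as the 2000s, the rest as the 1900s. The sketch below, with a hypothetical pivot of 30, shows both the trick and the reason it eventually fails, just as Bemer predicts:

    PIVOT = 30  # hypothetical cutoff: 00-29 -> 2000s, 30-99 -> 1900s

    def guess_century(two_digit_year: int) -> int:
        """Expand a two-digit year by assuming its century."""
        if two_digit_year < PIVOT:
            return 2000 + two_digit_year
        return 1900 + two_digit_year

    print(guess_century(99))  # 1999 -- correct
    print(guess_century(5))   # 2005 -- correct
    print(guess_century(31))  # 1931 -- wrong if the record really meant 2031

Any record whose true year falls on the wrong side of the pivot gets the wrong century, which is the reckoning 20 or 30 years out that Bemer anticipates.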

This story written for the Internet by Alison Sutton, NewsofTexas.com Editor


Related Links


ASCII — Webopedia
http://webopedia.internet.com/TERM/A/ASCII.html

The Computer Garage: a computer museum Web site
http://www.computergarage.org/