
 

Some Quotes from Public Relations Events

Excerpts citing Bemer's credits appear below. The full text of these articles, and more, is available separately.

Medium           Program/Publication              Interviewer        Shown/Issued

Television       News of Texas                    Don Smith          1999 Dec 30
General Press    Wall Street Journal              Tom Petzinger      1997 Jun 20
General Press    Dallas Morning News              Alan Goldstein     1997 Jul 29
General Press    Scientific American              Alden Hayashi      1998 Jun
General Press    Irish Times                      Lucille Redmond    1998 May 11
General Press    Dallas Morning News              Jim Landers        1998 Oct 04
General Press    Vanity Fair Magazine             Robert Sam Anson   1999 Jan
General Press    Time Magazine                    Chris Taylor       1999 Jan 18
General Press    Baltimore Sun                    Mike Stroh         1999 Apr 25
General Press    Washington Post                  Gene Weingarten    1999 Jul 18
General Press    Abilene Reporter-News (Scripps)  Doug Williamson    1999 Jun 29
General Press    Cincinnati Enquirer (Gannett)    Phil Waga          1999 Dec 26
General Press    Arizona Republic Editorial       Keven Willey       1999 Dec 30
General Press    The Australian                   Steve Romei        2000 Jan 10
WEB Only         Pretext Magazine                 Susan Dumett       1998 Feb
WEB Only         Dr. Dobbs Technetcast            Philippe Lourier   1998 Apr 10
WEB Only         WWW.UPSIDE.COM                   Sam Williams       1998 Jul 21
Technical Press  Design News                      Mike Puttré        1997 Oct 20
Technical Press  The COBOL Report                 Edmund Arranga     1997 Nov
Technical Press  Info World                       Lynda Radosevich   1998 Jan 11
Technical Press  Dr. Dobb's Journal               Mike Swaine        1998 May
Technical Press  Federal Computer Week            Margret Johnston   1998 Sep 14
Technical Press  Computerworld (COBOL)            Mary Brandel       1999 Mar 15
Technical Press  Ancestry Magazine                Mark Howells       1999
Technical Press  Computerworld (ASCII)            Mary Brandel       1999 Apr 12
Technical Press  Communications ACM               Hal Berghel        1999 May

Vanity Fair (Anson)

... Robert Bemer, an IBM wizard who had invented the Escape key, and was one of the creators of ASCII, the language that enabled different computer systems to "talk" to one another. During the 50s, Bemer also developed a feature that permitted COBOL programmers to use either two or four digit year dates ...


The COBOL Report (Arranga)

If Grace Hopper is considered the mother of COBOL, Bob Bemer should be considered the father ...

Mr. Bemer's accomplishments resonate so deeply one almost assumes they have been with us always, snatched out of the ether by a collective consciousness, instead of having been the creation of one man. A short list includes:

  • helped create COBOL
  • coined the word COBOL
  • invented the ESCape sequence
  • created the PICTURE clause
  • helped create and standardize the ASCII character set
  • created the backslash
  • helped create the 8-bit per byte standard
CR: Give us some of your background, if you would?

BB: I started out as a wartime mathematician for Douglas Aircraft. In 1949 I got my first look at a computer working for the Rand Corporation and never looked back. In 1955 I went to work at IBM. I was IBM’s chief of programming standards.

CR: What led to the creation of COBOL?

BB: The Department of Defense wanted a standard business language. Charlie Phillips, myself, and others started CODASYL (Conference on Data Systems Languages) to assist in the effort. This was in 1959. Also, there was one improvement in hardware after another at IBM. The opcode (operation code) structure changed every time. There was no way we could build software for every machine going out. So COBOL was in IBM’s interest too.

CR: Was Univac’s Flow-Matic the driving force behind COBOL?

BB: Flow-Matic was part of it. IBM brought to the table a language called COMTRAN, short for Commercial Translator, that contained many of the ideas found in COBOL. We had been working since 1958 on COMTRAN. COMTRAN was a competitor to Flow-Matic.

CR: How did you arrive at the name COBOL?

BB: Cobol to me has a nice round sound - a lyrical quality (drawing an imaginary hourglass in the air). The sound reminds me of a woman’s figure.

CR: Are you saying that Cobol, the language that is often considered the epitome of design by committee and bureaucracy, was named with Venus de Milo musings in mind?

BB: Yes (laughing).

CR: I must say I've been programming for over 20 years in Cobol and never heard that one. What did Grace Hopper have to say about your metaphorical naming?

BB: She just laughed and said okay.

CR: ... What led to the creation of ASCII?

BB: I surveyed the character sets in use and found 60. So I helped form BEMA (Business Equipment Manufacturers Association), which was the beginning of the X3 committee, tasked with defining a standard character set.

CR: Here you are helping create the ASCII standard and IBM remains in the EBCDIC camp. What’s that all about?

BB: Originally IBM was supposed to move to ASCII. We had something called a P-bit that would allow machines to run either ASCII or EBCDIC. Learson was the CEO of IBM and he made the decision to stay with EBCDIC. A terrible mistake.


Communications ACM (Berghel)

One of the most innovative ideas for object-code interception is to be found in Bob Bemer's clever Xday (eXchange day) proposal discussed below.

Bob Bemer is a genuine computer pioneer. Among his many advertised accomplishments are the first use of the ESCape sequence, the creation of ASCII, the coining of the term COBOL, and the invention of its "Picture Clause." Since 1949 he's held prominent positions with IBM, Honeywell, Bull, and General Electric, to name but a few.

That's where XDay comes in. Bemer observed that if we substitute for our Gregorian solar calendar a variant of its Julian parent, we have a unique way of representing each day into which every conceivable date format may be mapped. 1 A.D. is Julian 1721475, 1900 A.D. is 2415021, 2000 A.D. is 2451545, and 2400 A.D. is Julian 2597642. Coincidentally, Bemer observes, the most significant digit in the Julian date stays unchanged for 27 centuries, with 15 centuries remaining (Figure 1). Assuming that Y2K will no longer be a problem in 3500, the first digit can be implied without reservation.

Now the magic comes in. Bemer points out that the next two digits of XDay (the Julian date sans its leading digit) fall within the range of 45 to 99 until August 14, 3501.

As if that isn't clever enough, it also turns out that XDay occupies the same six-digit space used for existing date formats, so it can be used interchangeably. If one has the time and energy, the XDay value could actually be used to replace the problematic date fields (à la the replacement strategy discussed above)...
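To make the scheme concrete, here is a minimal sketch (my own illustration, not from the article): it computes the Julian Day Number of a Gregorian date with the standard Fliegel-Van Flandern arithmetic, then drops the implied leading digit to get the six-digit XDay value. The function names and the modulo step are assumptions of this sketch.

    def julian_day_number(year, month, day):
        # Julian Day Number for a Gregorian date (Fliegel-Van Flandern algorithm)
        a = (14 - month) // 12
        y = year + 4800 - a
        m = month + 12 * a - 3
        return day + (153 * m + 2) // 5 + 365 * y + y // 4 - y // 100 + y // 400 - 32045

    def xday(year, month, day):
        # Six-digit XDay: the Julian Day Number with its implied leading digit dropped
        return julian_day_number(year, month, day) % 1_000_000

    # 2000 Jan 1 -> JDN 2451545 -> XDay 451545.  The first two XDay digits stay
    # within 45-99 from late 1995 until roughly 3501, as the article notes.
    print(julian_day_number(2000, 1, 1), xday(2000, 1, 1))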


Computerworld - ASCII (Brandel)

If it weren't for a particular development in 1963, we wouldn't have e-mail and there would be no World Wide Web. Cursor movement, laser printers and video games -- all of these owe a big debt of gratitude to this technological breakthrough.

What is it? Something most of us take for granted today: ASCII. Yep, plain old ASCII, that simplest of text formats.

To understand why ASCII (pronounced AS-KEE) is such a big deal, you have to realize that before it, different computers had no way to communicate with one another. Each manufacturer had its own way of representing letters in the alphabet, numbers and control codes. "We had over 60 different ways to represent characters in computers. It was a real Tower of Babel," says Bob Bemer, who was instrumental in ASCII's development and is widely known as "the father of ASCII."

All the characters used in e-mail messages are ASCII characters, as are the characters in HTML documents.

In May 1961, Bemer submitted a proposal for a common computer code to the American National Standards Institute (ANSI). The X3.4 Committee -- representing most computer manufacturers of the day and chaired by John Auwaerter, vice president of the former Teletype Corp. -- was established and got right to work.

"It got down to nitpicking," Bemer says. "But finally, Auwaerter and I shook hands outside of the meeting room and said, 'This is it.'" Ironically, the end result bore a strong resemblance to Bemer's original plan.

The story of ASCII wouldn't be complete without mentioning the "escape" sequence. According to Bemer, it's the most important piece of the ASCII puzzle. Early in the game, ANSI recognized that 128 characters were insufficient to accommodate a worldwide communication system. But the seven-bit limitation of the hardware at the time forbade them to go beyond that.

So Bemer developed the escape sequence, which allows the computer to break from one alphabet and enter another. Since 1963, more than 150 "extra-ASCII" alphabets have been defined.
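A reader can still watch an escape sequence do exactly this alphabet-switching today. Here is a minimal sketch (my example, not the article's) using Python's standard ISO-2022-JP codec, which shifts between character sets with ASCII's ESC character:

    # ISO-2022-JP marks a switch to the Japanese character set with ESC $ B and a
    # switch back to ASCII with ESC ( B; ESC is the ASCII escape character (0x1B).
    text = "ASCII and 日本語"
    encoded = text.encode("iso2022_jp")   # codec in Python's standard library
    print(encoded)                        # each escape sequence begins with byte 0x1b
    print(hex(ord("\x1b")))               # 0x1b -- the ESCape character itself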


Computerworld - COBOL (Brandel)

With Hopper, Bemer served as an adviser to Codasyl. He is responsible for coining the term Cobol.


Pretext Magazine (Dumett)

"Sometimes I regret creating Cobol," says Bob Bemer of the still-dominant machine language for which he built a key component in the 1950s. "It allowed lots of people that are less competent and responsible than they should be to get into the computer field."


Gannett Newspapers (Waga)

Mr. Bemer's account of his last appeal to the White House has been reported before. In 1970, he rounded up nearly 90 top scientists and prestigious technical associations to argue the case with Edward David, President Richard Nixon's science adviser. Mr. David agreed with Mr. Bemer, and appealed the matter directly to Mr. Nixon. Rather than acting on Mr. David's information, Mr. Nixon asked him whether he could help in repairing his TV set.

Mr. Bemer was not just any computer programmer.

As part of his work with two and four digits, he worked alongside Grace Murray Hopper, a computing legend. Together they developed a standard programming language called COBOL, for common business-oriented language.

He also developed a technology called ASCII, for American Standard Code for Information Interchange. It opened computing by letting machines communicate with one another. He also invented a technology that enabled laser printers to function.

Mr. Bemer has reams of reports and other papers to document how often he objected to the implementation of a two-digit year field to IBM and others. "This is a computer disaster that should never have been", he said.

Frederick Brooks, an IBM mainframe executive during the 360 era, confirmed that Mr. Bemer warned IBM executives about the potential problems ...


Dallas Morning News (Goldstein)

Mr. Bemer is known for more than 15 critical innovations from the early days of computing. He created the "escape sequence" behind the "esc" key on computers. He is known as the father of ASCII text, making it a worldwide technology standard. (It's on his vanity license plate.)


Scientific American (Hayashi)

A pioneer in the digital world, Bemer is the man who, among other accomplishments, helped to define ASCII characteristics, which allow otherwise incompatible computers to exchange text.


Ancestry Magazine (Howells)

An interesting historical footnote to the Year 2000 problem has recently surfaced. One of the more insightful computer pioneers has issued a well-deserved "I told you so!" regarding the 2-digit date shortcut. Not all system developers of the 1950s were unaware of the need to represent the year fully with four digits. Bob Bemer, a pioneer in business computing, was recently quoted in Time Magazine as having successfully confronted the 2-digit year problem in the late 1950s. The most remarkable thing about Bemer's work on the 2-digit year problem was that genealogy contributed to his solution. The story of how the study of ancestors helped solve the Year 2000 problem in the late 1950s is almost the stuff of legend. This article provides the details that transform that legend into real history.

The Picture Clause

As a pioneer in the development of commercial (rather than strictly scientific) computing, Bob Bemer developed the COMTRAN (COMmercial TRANslator) programming language in 1957. COMTRAN became one of the three forerunner languages of the COBOL (COmmon Business Oriented Language) programming language. Bemer's conversations with Grace Hopper in the prior year had convinced him that programming languages specifically for business had a future. Hopper was a famous computing pioneer and the "mother" of COBOL. It may seem self-evident today, but the concept of a programming language useful for business purposes was revolutionary at the time. Bob Bemer also invented the "Escape" key found on every computer keyboard, the backslash character, and is the "father" of ASCII - but those are other stories.

Bob Bemer's COMTRAN language originally included the Picture Clause. The Picture Clause was the first programming language element to specify data format, size, and type. Much like a dictionary defines the spelling and meaning of a word, a Picture Clause defines the length of a piece of data, whether the data must contain letters or numbers, and other characteristics of the data.

By 1958, Bob Bemer's Picture Clause had provided the flexibility to define a year as a complete, 4-digit representation within COMTRAN and subsequently COBOL.

The Picture Clause from Bemer's COMTRAN language was carried forward as a standard part of the COBOL programming language.

COBOL's use of the Picture Clause gave the programming language the ability to define a 4-digit year. However, with processing and memory being highly expensive, COBOL programmers could still use the Picture Clause just as easily to define a 2-digit date. The ability to make year dates fully flexible for handling any given year thus became optional within COBOL. It was left up to the individual COBOL programmer to decide how to format year data. Some chose 4-digit years, some did not. Today COBOL remains one of the most commonly used business programming languages in the world.
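As a rough illustration of the choice described above, here is a sketch of my own (not COBOL, and the field values are hypothetical) that mirrors the effect of declaring a year as PIC 99 versus PIC 9999: a two-digit field forces the program to guess the century, and that guess is what fails at 2000.

    def age_in_2000(birth_year_field: str) -> int:
        # Hypothetical example: compute an age in the year 2000 from a stored year field.
        year = int(birth_year_field)
        if len(birth_year_field) == 2:   # a 2-digit field (think PIC 99): century is guessed
            year += 1900                 # the assumption that breaks after 1999
        return 2000 - year

    print(age_in_2000("1958"))   # 42  -- a 4-digit field (PIC 9999) is unambiguous
    print(age_in_2000("58"))     # 42  -- right only while "19" is a safe guess
    print(age_in_2000("00"))     # 100 -- a birth in 2000, stored as two digits, reads as 1900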

Early Genealogy Computerization by the LDS Church

The Church formed an all-volunteer data processing group in July of 1958. This group consisted of Church members whose professional employers included IBM, the Rand Corporation, and Hughes amongst others. This volunteer group from Southern California set to work on writing a computer program which could demonstrate how computers might assist the Church's genealogy efforts. As this sort of project had never been done before, the LDS group called on Bob Bemer as a consultant for their project. This is when genealogy provided Bob Bemer with his "a-ha" moment of epiphany regarding 4-digit dates.

The LDS demonstration project selected for computerization was to take microfilmed copies of christening records from the British Isles and manually enter that information onto computer-readable punched cards. The information could then be entered into a computer and sorted in a variety of ways useful for family research. For example, family group sheets could be created from the christening records showing father, mother, and child in a standard format. This project was to be the first automation of what we now know as the International Genealogical Index.

The majority of the British christenings for this project were from before 1900.

This need to show dates in past centuries provided Bob Bemer with what he says was "a big push to think correctly by showing that at least one class of data, with the same name (year), could have alternate characteristics and representations (2-digit or 4-digit year)." As a consultant to the LDS demonstration project, Bob Bemer had been given a concrete user requirement to represent a fully flexible 4-digit date.

The British Christenings Demonstration Project

For Bob Bemer and a few others in the computer industry, 4-digit dates became a cause célèbre of good design practice. The 4-digit dates that solved the problem of pre-1900 dates for the LDS British christening demonstration project in 1958 also provided the means to solve the problem of post-2000 dates.

Early warnings issued from the 1970s onward did little to change common programming practices. If all programmers and system designers had followed Bemer and the LDS project's example, the Year 2000 problem facing us today might have been a non-issue. The irony that a project concerned with the past helped confirm a solution for a future problem is not lost on Bob Bemer.


Federal Computer Week (Johnston)

Bemer, who in addition to leading the effort to establish ASCII as a universal character set for text files, also played a role in the creation of Cobol ...


Dallas Morning News (Landers)

The first article alerting computer programmers to the millennium bug - "Time and the Computer," in the February 1979 issue of Interface Age - was written by Dallas computer pioneer Bob Bemer.

The Pentagon convened a Conference on Data Systems Languages in the late 1950s that produced COBOL, the Common Business Oriented Language.

Bob Bemer, then with IBM, was one of the designers. His COBOL Clause allowed programmers to use either four- or two-digit years for calendar dates.

Computers also needed a way to translate data into numbers as binary code. In 1960, Mr. Bemer developed the American Standard Code for Information Interchange.

ASCII and COBOL were two major standards that spread to computer users around the world, with the Pentagon leading the way. Less noticed was an effort to determine how programmers should enter data elements such as units of measure, time and dates.

Mr. Bemer developed the committee's scope of work. Mr. Henriques was the secretary. Mr. White became chairman of a subcommittee on Representations of Data Elements.


Dr. Dobbs Technetcast (Lourier)

(Note: Hard to excerpt; see the original.)


Wall Street Journal (Petzinger)

(Bemer) saw his first computer in 1949 and never looked back, working for IBM, Univac, GE and others. As IBM's chief of programming standards, his creation of the escape sequence in 1960 allowed computers to break from one alphabet to another, a critical step toward laser printers and cursor movement. He led the effort to establish a universal character set known to millions as ASCII. He created the name COBOL for what still ranks among the world's dominant computer languages. He helped develop the standard by which binary digits (bits) travel in packs of eight.


Design News (Puttré)

... says Bob Bemer, the pioneer programmer who is credited with the invention of ASCII type and coining the term COBOL. Bemer, 77, has come out of retirement to take on an issue of Damoclesian proportions: the so-called "Y2K Problem".


Info World (Radosevich)

Meanwhile, one man is finding an ironic silver lining to all this. Bob Bemer, a 77-year-old former IBM programmer credited with inventing ASCII text and the escape sequence that controls laser printing, came out of retirement last year when year-2000 problems started making headlines.

What drew Bemer out of retirement was ire. He predicted the year-2000 problem back in the 1970s.

"The programmers didn't use the picture clause right," Bemer said. "The picture clause is my invention, and it just ticked me off."

The picture clause in Cobol is where a programmer describes the data characteristics for the compiler. Most Cobol programmers set the end year to 99, which means they defined the year field as two digits.

"It would have been just as simple to say end-year PIC 9999, and we could have avoided all this junk," Bemer said.


Irish Times (Redmond)

Now, don't get nervous. Bemer is actually a good bet; he knows what he's doing. He is the programmer who invented the Escape sequence in 1960, a sequence of special characters that sends a command to a device or program, e.g. when you hit the ESC key on your computer.

He was behind the international acceptance of the ASCII standard and says he coined the term COBOL.

Bemer is a legend, credited with inventing the pivotal notion of `timesharing' in a computer's operating system (in 1957). He also invented the concept of word processing in 1959, made the first load-and-go compiler, and wrote rules and procedures for data processing which are used internationally to this day.


The Australian (Romei)

Mr. Bemer became a computer programmer in early 1949. He is considered the father of ASCII, the method computers use to translate letters and numbers into digital language. He first warned about the date problem in 1971, in an editorial in an in-house publication. Eight years later, he published "Time and the Computer" in Interface Age magazine.


Scripps Newspaper (Williamson)

Bemer, 79, is "The Father of ASCII," the method computers use to translate letters and numbers into digital language they can understand. And the "Esc" escape button on the standard PC - he developed the sequence behind that, too.

Bemer belonged to a group of computer programmers back in the 1950s who, led by Navy Rear Adm. Grace Murray Hopper, developed COBOL - the "common business-oriented language" used by most PCs today. In fact, Bemer coined the name. ...

A group from the Church of Jesus Christ of Latter-day Saints approached IBM in the 1950s to see if the company's new computers could be used to help the Mormons in their genealogical research and record-keeping. Bemer was working at IBM at the time.

So, Bemer created a way to use a four-digit year reference in COBOL.

Bemer doesn't blame the COBOL programmers of four and five decades ago for the Y2K problem.

Bemer had the first published warning of the potential of the Y2K crisis in the early 1970s.

"It was everyone else's fault," he said. "If you place the blame on Grace Hopper and me, you would be wrong. The worst thing we did was making the language so easy that anybody could use it."

And he said it wasn't his fault for not sending out a warning. As for the programmers, "they just did what their employers and the country did."

The real blame, Bemer said, lies in us, as "lazy, fix-it-later people."

Many other computer scientists joined Bemer in an effort to declare 1970 the National Year of the Computer. Through this campaign, Bemer hoped he could get a forum to warn of the approaching Y2K train. He persuaded presidential science adviser Edward E. David to approach President Nixon with the idea. But he was ignored.

Probably the first published warning about Y2K came from Bemer in an editorial he wrote to the 6,500 technologically influential readers of Honeywell Computer Journal in 1971. His first writing for public consumption came in February 1979 when his article "Time and the Computer" was published in Interface Age.

In this article Bemer wrote, "Don't drop the first two digits. The program may well fail from ambiguity in the Year 2000."


News of Texas (Smith)

The Y2K computer bug could be blamed on the federal government, apathy and laziness, according to the father of ASCII, the American Standard Code for Information Interchange.

Bemer helped create technology, inventing and developing some of the functions that computers use today, such as ASCII, the great translator between machines and humans.

Pronounced ask-ee, the code represents English characters as numbers, with each letter a number from 0 to 127.

If your computer has an "ESC" or "escape" key, Bemer put it there.

Bemer first alerted the world to the problem of using two-digit years back in 1958. He wrote an article in 1971 saying, "Don't drop the first two digits. The program may well fail from ambiguity in the year 2000."


Baltimore Sun (Stroh)

Programmer saw Y2K bug coming. Prophet: At age 79, the man who contributed the "Escape" key and other innovations to the computer world works to solve a problem he warned of three decades ago.

When historians someday dissect the long chain of missteps that allowed the year 2000 computer bug to flourish, they will undoubtedly linger over the tale of a little-known programmer named Bob Bemer.

For decades, Bemer has been an unheard prophet, warning anybody who would listen that using two-digit dates in computers was a prescription for trouble.

Thirty years ago he lobbied government agencies to require four digits. He was snubbed. Twenty years ago, he published articles predicting that software polluted with shortened dates would haunt society at century's end. Programmers did it anyway.

Had anyone listened, Bemer figures we could have averted one of the most costly and bizarre screw-ups of the century.

Over the years, he became a star. The International Biographical Dictionary of Computer Pioneers dubs him a "programmer extraordinaire" and ticks off his contributions: the COBOL computer language that still runs many major businesses, the "Escape" key found on almost every computer, and the landmark American Standard Code for Information Interchange (ASCII).

"Without ASCII, you wouldn't have the Internet, you wouldn't have e-mail, you wouldn't have anything," notes computer historian Jean Sammet.

Bemer is proud of these accomplishments -- he has emblazoned "COBOL" and "ASCII" on vanity license plates for his three cars.

In the 1960s Bemer was recruited to help create government standards for the computer industry. There were roughly 6,000 general-purpose electronic computers in the United States then -- most of them crunching data for the government. Each machine had its own way of doing things, which made exchanging data difficult.

"It was like a Tower of Babel," recalls Walter M. Carlson, a retired computer programmer.

The standards makers planned to tackle everything from the layout of computer keyboards to how programmers abbreviated the names of states.

Also part of the effort was a little-noticed committee whose task was to decide how government programmers should represent dates.

Recalling the lesson of the Mormon project, Bemer got involved. He and Harry S. White of the National Bureau of Standards, the government agency overseeing the effort, lobbied to end the practice of using two-digit dates. "We knew there was ambiguity," says Bemer. "We also knew that if we chunked the '19' on there, that was a great help in removing the ambiguity."

He marched into the office of the postmaster general to suggest the postal service use four-digit years in postmarks.

Later, he wrote the first published articles spelling out the dangers to come. "There are many horror stories about programs, working for years, that died on some significant change in the date," he wrote in the February 1979 issue of Interface Age. "Don't drop the first two digits for computer processing, unless you take extreme care otherwise the program may fail from ambiguity in the year 2000."

Nobody is ignoring Bob Bemer anymore. CNN, Time, and Vanity Fair have come knocking. So have lawyers entreating him to testify as an expert witness in future Y2K liability trials. The Defense Department invited him to speak on the Y2K problem.


Dr. Dobb's Journal (Swaine)

(Note: Hard to excerpt; see the original.)


Time Magazine (Taylor)

Conventional wisdom goes something like this: back in the 1950s, when computers were the size of office cubicles and the most advanced data-storage system came on strips of punched cardboard, several scientists, including a Navy officer named Grace Murray Hopper, begat a standard programming language called COBOL (common business-oriented language). To save precious space on the 80-column punch cards, COBOL programmers used just six digits to render the day's date: two for the day, two for the month, two for the year. It was the middle of the century, and nobody cared much about what would happen at the next click of the cosmic odometer. But today the world runs on computers, and older machines run on jury-rigged versions of COBOL that may well crash or go senile when they hit a double-zero date. So the finger of blame for the approaching crisis should point at Hopper and her COBOL cohorts, right?

Wrong. Nothing, especially in the world of computing, is ever that simple. "It was the fault of everybody, just everybody," says Robert Bemer, the onetime IBM whiz kid who wrote much of COBOL. "If Grace Hopper and I were at fault, it was for making the language so easy that anybody could get in on the act." And anybody did, including a group of Mormons in the late '50s who wanted to enlist the newfangled machines in their massive genealogy project--clearly the kind of work that calls for thinking outside the 20th century box. Bemer obliged by inventing the picture clause, which allowed for a four-digit year. From this point on, more than 40 years ahead of schedule, the technology was available for every computer in the world to become Y2K compliant.

Programmers ignored Bemer's fix. And so did his bosses at IBM, who unwittingly shipped the Y2K bug in their System/360 computers, an industry standard every bit as powerful in the '60s as Windows is today. By the end of the decade, Big Blue had effectively set the two-digit date in stone. Every machine, every manual, every maintenance guy would tell you the year was 69, not 1969. "The general consensus was that this was the way you programmed," says an IBM spokesman. "We recognize the potential for lawsuits on this issue."

No one in the computer industry wanted to rock the boat. And no one could alter the course IBM had set, not even the International Standards Organization, which adopted the four-digit date standard in the 1970s. The Pentagon promised to adopt century-friendly dates around 1974, then sat on its hands. Bemer himself wrote the earliest published Y2K warnings--first in 1971, then again in 1979. Greeted by nothing but derision, he retired in 1982. "How do you think I feel about this thing?" says Bemer, now an officer at his own Y2K software firm. "I made it possible to do four digits, and they screwed it up."

And that, at least for the next 13 years, was the attitude De Jager adopted. "We used to joke about this at conferences," he says. "Irresponsible talk, like 'We won't be around then.'" But by 1991, De Jager, a self-described "nobody" in the industry, had decided he would be around. Four years later, he was giving more than 85 lectures a year on the topic and posting regular updates to his site, the Web's first for Y2K warnings.

And here's the curious thing. From 1995 on, Y2K awareness had a kind of critical mass. Congress, the White House and the media all got wind of the bug at about the same time. After making too little of the problem for so long, everybody began to make, if anything, too much of it.

Why then, and not two decades earlier? Why De Jager, and not Bemer? Proximity to the millennium may have had something to do with it as well as the increasingly ominous tone of the warnings. This was Bemer's dry 1979 prophecy of doom: "Don't drop the first two digits. The program may well fail from ambiguity." Twenty years later, here's De Jager's jeremiad: "The economy worldwide would stop ... you would not have water. You would not have power ..."


Washington Post (Weingarten)

The Y2K problem wasn't just foreseeable, it was foreseen.

Writing in February 1979 in an industry magazine called Interface Age, computer industry executive Robert Bemer warned that unless programmers stopped dropping the first two digits of the year, programs "may fail from ambiguity in the year 2000."

This is geekspeak for the Y2K problem ...

We sought the answer from the first man to ask the question.

Robert Bemer, the original Y2K whistleblower, lives in a spectacular home on a cliff overlooking a lake two hours west of a major American city. We are not being specific because Bemer has made this a condition of the interview. We can say the car ride to his town is unrelievedly horizontal. The retail stores most in evidence are fireworks stands and taxidermists.

In his driveway, Bemer's car carries the vanity tag "ASCII." He is the man who wrote the American Standard Code for Information Interchange, the language through which different computer systems talk to each other. He also popularized the use of the backslash, and invented the "escape" sequence in programming. You can thank him, or blaspheme him, for the ESC key.

In the weenieworld of data processing, he is a minor deity ...

Bemer is 79. He looks flinty, like an aging Richard Boone still playing Paladin.

Who, then, is to blame?

Bemer rocks back in his chair and offers a commodious smile.

In one sense, he says, he is.

In the late 1950s, Bemer helped write COBOL, the Esperanto of computer languages. It was designed to combine and universalize the various dialects of programming. It also was designed to open up the exploding field to the average person, allowing people who weren't mathematicians or engineers to communicate with machines and tell them what to do. COBOL's commands were in plain English. You could instruct a computer to MOVE, ADD, SEARCH or MULTIPLY, just like that.

It was a needed step, but it opened the field of programming, Bemer says, to "any jerk."

"I thought it would open up a tremendous source of energy," he says. "It did. But what we got was arson."

There was no licensing agency for programmers. No apprenticeship system. "Even in medieval times," Bemer notes dryly, "there were guilds." When he was an executive at IBM, he said, he sometimes hired people based on whether they could play chess.

There was nothing in COBOL requiring or even encouraging a two-digit year. It was up to the programmers. If they had been better trained, Bemer says, they might have known it was unwise. He knew.

He blames the programmers, but he blames their bosses more, for caving in to shortsighted client demands for cost-saving.

"What can I say?" he laughs. "We're a lousy profession ..."


Arizona Republic (Willey)

This week the Wall Street Journal said American business believes it has conquered the Y2K bug. Business spent gazillions - more than $300 billion, according to Gannett News Service - to fix a mistake Bemer urged them not to make.


Upside.com (Williams)

... what Bemer lacks in button-pushing skills he more than makes up for in street credibility. As co-developer of the Cobol computer language and the main force behind landmark software achievements such as ASCII, the ESCape sequence and computer time-sharing, Bemer, 78, has been instrumental in making computers a ubiquitous part of daily life ...

Down the Toilet: Some people have referred to Y2K as "the coder's revenge." You, however, maintain that programmers shouldn't take the blame for this. What do you think were the major cultural decisions that led to this becoming such a widespread problem?

Bemer: It was just that, a cultural decision. People were lazy. It's like the checks you see where they print a "19" and leave the last two spaces blank. I hear the story that [programmers] used two digits for the year to save storage. That's hogwash. It wasn't storage at all. It was unthinkingness.

As for the revenge part, I admit that programmers are a recalcitrant bunch, and they like to do tricky stuff. But really they had the management behind them telling them to use the two-digit year. Most of them just said OK and did what they were told.

What motivated you to come out of retirement? Was it a sense of guilt or the challenge of coming up with a time-saving solution?

I can't say I felt guilty. I mean, I was talking about this back in 1970. I wrote a paper on it in 1971, titled "What's the Date?" In February 1979, I wrote an article called "Time and the Computer" for Interface Magazine. It addressed this entire problem and how companies were choosing to ignore it.

When nobody paid attention to the articles, I didn't take offense. I've been ignored a lot in my career. This time, I finally figured it was serious enough that somebody had to do something. I also knew that we'd lost a lot of the original source code.

What was it about the institutional mindset that forced companies to go for the quick fix all these years, waiting until the last minute to overhaul their legacy systems?

I blame the business schools. I think the Harvard Business School is treasonous. The mentality coming out of that place says don't plan more than three to six months ahead because somebody else will be handling your job at that point. That's not the way to do things for the good of the country. I think people ought to be responsible for what they do over the long range.

Doesn't the same go for programmers?

Every program I write starts with, "Author = RBemer, Tel = my home phone number." It's like a signature on a painting. I did it. If it doesn't work, it's my fault. Blame nobody else but me.

With management, there's no accountability. In fact, you even have a lot of [management] people retiring now or taking early retirement so as not to be in the top dog position when this whole thing blows. It's irresponsible.

So whom do you blame?

Richard Nixon.

What did he do?

I proposed a national computer year back in 1970. I wanted to model it after the IGY [the International Geophysical Year was from July 1957 to December 1958]. I could see that people were not prepared for the influx of computer usage that was sure to come. I thought that if we all put our minds to it and planned ahead a little bit, maybe it would be easier. Year 2000 was just one of the issues we would have addressed.

President Nixon was very suspicious of computers, though, and wouldn't sign off on it. Without his proclamation we couldn't do it. I think he'll go down in history along with King Canute.
