Two towering figures in computing died this month, besides Steve Jobs. A good case can be made that their contributions are as great as his, if not greater. The first was Dennis Ritchie, who died at the age of 70 on October 13th. Ritchie created the C programming language and, with Ken Thompson, the Unix operating system, two of the technologies that underpin the distributed computing that makes the Internet possible today. The other is John McCarthy, who died yesterday at the age of 84. Like Ritchie, McCarthy was a major innovator in both languages and operating systems. He was the father of time-sharing, the key component of mainframe operating systems that allowed multiple users to access a computer as if each were its exclusive owner. McCarthy went on to become a pioneer of artificial intelligence and created a language called LISP that is widely used in that arena.
When I began working on IBM mainframes in 1970, I was trained in something called TSO, or time-sharing option. This was the framework for submitting test jobs, editing my COBOL programs, debugging, etc. TSO grew out of a project under John McCarthy’s supervision in 1957 on an IBM 704, a predecessor to the 360 generation of computers that I was introduced to in 1970. Before time-sharing, a computer was used on a serial, one-at-a-time basis. An IBM 704 might be loaded with a payroll program to print checks; afterwards, another program would be loaded to print checking account statements, and so on. If you wanted to make a change to the payroll program, you had to wait until these jobs were completed. With time-sharing, all of these tasks, plus many more, could run simultaneously. The only limitation was the size of the computer’s memory. It is astonishing to think that the average mainframe in the early 70s, which cost 115 thousand dollars and required mammoth air-conditioning support, typically shipped with 8 megabytes of memory. By comparison, my wife’s iPod, with 8 gigabytes of memory, a thousand times as much as that IBM 360, costs $199.
I had a brief exposure to LISP about 15 years ago when I enrolled in a computing and education program at Teachers College, Columbia University. One of the classes I took was titled Cognition and Computers, taught by John Black. It was supposed to introduce you to the use of artificial intelligence in primary education. We were required to use a LISP-like meta-language (don’t ask me to explain) for homework exercises, all of which were supposed to “teach” students highly mechanical tasks such as playing billiards or starting a car. I seem to remember his explanation of how the language would be used:
1. INSERT KEY.
2. TURN KEY.
3. PUT GEAR INTO DRIVE.
4. PUT FOOT ON GAS.
I stupidly tried to use our LISP-like language for an exercise on American history that he told me was inappropriate. I tried again with a more mechanical application, only to be told—after 25 years of programming experience—that I still didn’t get it. I dropped out of the class with a great feeling of relief.
One of the books I have on my shelf at work is titled “LISP Lore: A Guide to Programming the LISP Machine”, a gift from its author Hank Bromley, an MIT graduate who was one of our Tecnica volunteers in Nicaragua. I have opened it once or twice without making heads or tails of what I was reading.
In the NY Times obituary on McCarthy, the verdict on AI is mixed at best:
Artificial intelligence is still thought to be far in the future, though tremendous progress has been made in systems that mimic many human skills, including vision, listening, reasoning and, in robotics, the movements of limbs. From the mid-’60s to the mid-’70s, the Stanford lab played a vital role in creating some of these technologies, including robotics and machine-vision natural language.
The last time I heard AI being hyped was in the early 80s, when Reagan was pushing SDI, or “Star Wars,” an anti-ballistic missile system that would supposedly put a shield over the U.S. Breakthroughs in AI were supposed to make the system fail-safe. Fat chance of that, after what I saw at Teachers College.
Opposition to SDI became part of a broader anti-nuclear movement that opposed both nuclear weapons and nuclear power. Three Mile Island in 1979, Chernobyl in 1986, and the Reagan administration’s talk about surviving nuclear war got lots of people into motion, including the Green Party in Germany. In the U.S., you saw the growth of Computer Professionals for Social Responsibility, a group that was formed primarily to oppose SDI. I was a member briefly until my attention shifted to solidarity work in Central America.
I imagine that John McCarthy would have been gung-ho for SDI even though he started out early in life as a Communist. The Times obit recounts:
John McCarthy was born on Sept. 4, 1927, into a politically engaged family in Boston. His father, John Patrick McCarthy, was an Irish immigrant and a labor organizer.
His mother, the former Ida Glatt, a Lithuanian Jewish immigrant, was active in the suffrage movement. Both parents were members of the Communist Party. The family later moved to Los Angeles in part because of John’s respiratory problems.
He entered the California Institute of Technology in 1944 and went on to graduate studies at Princeton, where he was a colleague of John Forbes Nash Jr., the Nobel Prize-winning economist and subject of Sylvia Nasar’s book “A Beautiful Mind,” which was adapted into a movie.
At Princeton, in 1949, he briefly joined the local Communist Party cell, which had two other members: a cleaning woman and a gardener, he told an interviewer. But he quit the party shortly afterward.
In the ’60s, as the Vietnam War escalated, his politics took a conservative turn as he grew disenchanted with leftist politics.
Dennis Ritchie went in the reverse direction politically from McCarthy as the NY Times obit reveals:
While a graduate student at Harvard, Mr. Ritchie worked at the computer center at the Massachusetts Institute of Technology, and became more interested in computing than math. He was recruited by the Sandia National Laboratories, which conducted weapons research and testing. “But it was nearly 1968,” Mr. Ritchie recalled in an interview in 2001, “and somehow making A-bombs for the government didn’t seem in tune with the times.”
I first ran into Ritchie’s writings back in the early 90s when Columbia trained us in the C programming language. Ritchie co-authored a book titled “The C Programming Language” with Brian Kernighan that I still have on my shelf from that time. The university was about to go full blast into client-server computing and C was being considered as a front-end to replace COBOL.
I have to confess that I never developed an appreciation for C and could never understand why it would be considered for business-oriented applications of the sort that had been written in COBOL. In COBOL, you would open a disk or tape file for output with the instruction “OPEN OUTPUT EMPLOYEE-MASTER-FILE”. Here is the rough equivalent in C:
/* fopen example */
#include <stdio.h>
int main(void)
{
    FILE *pFile = fopen("myfile.txt", "w");
    fputs("fopen example", pFile);
    fclose(pFile);
    return 0;
}
Columbia wisely decided against getting involved with C and used FoxPro instead for a client-server system to handle the school’s basic financial functions (purchasing, budgets, etc.). I never got involved with the FoxPro work but was assigned to support the back end of the system using Unix and Perl, a fairly high-level language that was simple to use like COBOL but succinct like C. My job consisted mainly of loading data from the mainframe into Sybase tables that were updated by users during the day and sent back to the mainframe at night. Columbia is about to replace this nearly 20-year-old system with a package from PeopleSoft. I am in the strange position of supporting a “legacy” system that was bleeding-edge when it was introduced, but that’s the way it goes in data processing.
I am glad that I acquired basic skills in Unix, since my MacBook, like all Apple computers, runs on a Unix-based operating system descended from the one Dennis Ritchie helped create. This means that I am able to write software on the computer using Perl, including a program called compare.prl that reconciles my checking account using a file I download from Chase online. The best thing about Unix, of course, is that it is far more stable than Windows, and we have Steve Jobs to thank for that. Jobs went against the grain when he founded NeXT back in 1985, whose workstations ran on Unix. Apple bought NeXT in 1996 and adapted its operating system for the Mac, an architecture that persists to this day.
What a long, strange trip it’s been.