I'm a fan of old UNIX papers; I remember his name mostly in association with Multics. "Introduction and overview of the Multics system", F. J. Corbató (Massachusetts Institute of Technology) and V. A. Vyssotsky (Bell Telephone Laboratories). (https://multicians.org/fjcc1.html)
Seminal in OS development at the time.
If anyone has links to other interesting papers, do share.
Multics (Multiplexed Information and Computing Service) was a mainframe timesharing operating system begun in 1965 and used until 2000. Multics began as a research project and was an important influence on operating system development. The system became a commercial product sold by Honeywell to education, government, and industry.
Multics was a prototype of a Computer Utility, providing secure computing to remote users at their terminals. Multicians still miss the elegant, consistent, and powerful programming environment; some Multics features are only now being added to contemporary systems.
Here's the site history of Dockmaster, the National Security Agency's Multics system (which was mentioned in Cliff Stoll's "The Cuckoo's Egg"):
The MIT GE-645 was installed in the 9th floor machine room of Project MAC, at 545 Technology Square. (1967)
In 1972, a Honeywell 6180 was installed in MIT's building 39, at the Information Processing Services Center. There are publicity pictures of the new machine.
The MIT system was upgraded several times and finished as a dual DPS8/70M.
This article describes MIT's Project MAC, the organization that led the initial creation of Multics. This is not a comprehensive history of Project MAC: my goal is to provide context and motivation for Multics history, and to provide accurate statements and references for deeper investigation.
(The author [Tom Van Vleck] was a part-time undergraduate programmer and then a research staff member at Project MAC during its first seven years.)
That is one of the most worthwhile videos to watch, ever. I also had no idea it was a "precursor" to UNIX, so it kind of amazed me, as if I were understanding time-sharing for the first time in my life.
I also really like how he describes what I know as an OS, before that term was in use. He also mentions magnetic disks and terms like "words" (we use bytes now), "supervisor" (what we would now call the OS), and "alarm clock", which gives me a feeling of the turning point between huge expensive computers and personal computers. The work of this man and his team is truly the birth of computers as we know them.
Exactly, this is why I shared this video. Hearing concepts explained by those who pioneered them really brings them home, because it highlights the context in which they were created: a description of what the trouble with computing was back then, how it was solved, and the excitement of the achievement.
For instance, he situates his breakthrough in a timeline whose main milestones are: first, getting computers to work at all; then high-level languages, which made them much easier to program (Fortran, https://www.youtube.com/watch?v=dDsWTyLEgbk); and then time-sharing, which made them much easier to use.
The birth of the terminal, no less, which is still around and which we all take for granted (I know it goes deeper than this, into multiprocessing, but the terminal is the most visible manifestation)...
Judging from this video, it also seems that he was a very good communicator.
I haven't watched the video yet, so I may be missing some context, but terminals (as in the ASR-33 and its ilk) predate computers. They were used for telegraph/telex services, for example. Old teletypes had no electronics, by the way.
I mean a computer terminal, in the sense of a device from which you can control multiple computations and get real-time feedback, with the illusion that it's all happening on a dedicated computer in spite of the fact that others might be sharing it.
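To make that illusion concrete, here is a toy round-robin sketch in Python (not anything from CTSS or Multics, just an illustration of a supervisor handing each user one time slice in turn):

    import collections

    # Toy "supervisor": each user job is a generator that yields when
    # its time quantum expires, handing the processor back.
    def user_job(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield  # quantum expired; back to the supervisor

    def supervisor(jobs):
        queue = collections.deque(jobs)
        while queue:
            job = queue.popleft()
            try:
                next(job)          # run the job for one quantum
                queue.append(job)  # not finished: back of the line
            except StopIteration:
                pass               # job finished, drop it

    supervisor([user_job("alice", 3), user_job("bob", 2)])

Run it and the output from "alice" and "bob" interleaves, even though only one job ever executes at a time, which is the whole trick.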
But he does say a "word" is the information of 6 letters or 10 numbers, which is a bit curious.
My understanding is that most encodings for letters in the 1960s used 6 bits, which would imply a 36-bit word for Dr. Corbató's computer. But then, if you had to fit ten of them into a word, you could only use 3-bit integers, which doesn't sound right. Unless "10 numbers" means ten decimal digits of a single number: 2^36 is roughly 6.9 × 10^10, so a 36-bit word holds any 10-digit decimal value.
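A quick back-of-envelope check in Python (the 6-bit character code below is made up for illustration; it is not the real BCD table):

    # Six 6-bit characters pack into one 36-bit word, and the same
    # word also holds any 10-digit decimal number.
    def pack(chars):
        word = 0
        for c in chars:
            code = ord(c) - ord('A')  # toy 6-bit code, 0..25
            assert 0 <= code < 64     # must fit in 6 bits
            word = (word << 6) | code
        return word

    w = pack("ABCDEF")                # 6 letters -> one word
    assert w < 2**36
    assert 9_999_999_999 < 2**36      # largest 10-digit number fits too
    print(f"{w:036b}")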
Well, I meant that he used words where we'd use bytes, even though on some architectures a word was the same size as a byte. That's because the term "word" was more widely used back then, from what I can tell. Nowadays it's much more meaningful to give sizes in bytes rather than words, simply because 8-bit bytes are common throughout most modern computer systems.
About 11 years ago, Dr. Corbató was invited to speak at an ACM meeting in the computer science department at UConn.
He came, but he was, seemingly, too elderly to actually speak. So he stood up, turned on that video on YouTube, and then sat down; that was the entirety of his interaction with our students.
I was thrilled to become aware of the video and have watched it several times since, but I feel the in-person appearance was not a great use of Dr. Corbató's time.
Even though it covers much more than Corbató and the time-sharing system, The Dream Machine[0] is a great read if you want to learn more about the history of computers and the Internet.
This man deserves it more than most of those who got one. I don't mean to take anything away from them; they did deserve theirs. It's just that Corbató's contributions to computing are enormous and more deserving than most: for instance, would there have been a Linux without them? He led the Multics project, of which he was possibly the main figure (Multics -> Unix -> Linux). I guess it comes down to popularity rather than actual importance.
See, this is why black bars are a bad idea. Anytime someone dies and doesn’t get a black bar, it becomes a moderator/admin-induced negative judgment on that person. Even if the moderators/admins may not even have been aware of it, that is the effect in practice.
I, too, was surprised at no black bar. This man had tremendous influence on computing. For me it was personal, since I began my career journey in IT in 1967 after being fascinated by a Teletype clacking away interactively with a GE timesharing system at my local school district. Perhaps it has been too many years, and people now are so familiar with the interactivity of personal computers and cell phones, that they have little knowledge of how revolutionary the concept of timesharing was in an era when nearly all computers operated in "batch" mode.