'dec' Episodes

VMS

     2/11/2020

VMS and OpenVMS

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we’re going to talk through the history of VMS.

Digital Equipment Corporation gave us many things. Once upon a time, I used a DEC Alpha running OpenVMS. The PDP-11 had changed the world, introducing us to a number of modern concepts in computers such as time sharing. The PDP was a minicomputer, smaller and more modern than mainframes. But by 1977 it was time for the next generation, and the VAX ushered in the 32-bit era of computers, evolving over time into the VAXserver and helping to usher in the modern era of client-server architectures. It supported branch delay slots and suppressed instructions. The VAX adopted virtual memory and privilege modes, and needed an operating system capable of harnessing all the new innovations packed into the VAX-11 and on. That OS would be the Virtual Memory System, or VMS.

The PDP had an operating system called RSX-11, which had been released in 1972. The architect was Dan Brevik, who had originally called it DEX as a play on DEC. But that was trademarked, so he and Bob Decker over in marketing wrote down a bunch of acronyms and then found one that wasn’t trademarked. Then they had to reverse-engineer a meaning out of the acronym: Real-Time System Executive, or RSX. But for the VAX they needed more, and so Dave Cutler from the RSX team, then in his early 30s, did much of the design work. Dick Hustvedt and Peter Lipman would join him, and they would roll up to Roger Gourd, who worked with DEC’s VP of engineering Gordon Bell to build the environment. The project began as Starlet, named because it was meant to support the Starlet family of processors. A name that still lives on in various files in the operating system.

VMS would support the 32-bit virtual address extension, would eventually support RISC processors, would work with DECnet, and would have virtual memory of course, as the name implies. VMS would bring a number of innovations in the world of clustering. VMS would use a modified Julian Day system to keep track of system time, which subtracts 2,400,000.5 from the Julian Date. Why? Because it begins on November 17th, 1858. That’s not why, that’s just the day it starts. Why, then? Because it’s not Y10,000 compliant, only having four slots for dates. Wait, that’s not a thing. There’s a quick sketch of that date arithmetic a little further down, by the way. Anyway, how did VMS come to be?

One of the killer apps for the system, though, was that DECnet was built on the DIGITAL Network Architecture, or DNA. It first showed up in RSX, where you could link two PDPs, but you could have 32 nodes by the time the VAX showed up and 255 with VMS 2. Suddenly there was a simple way to network these machines, built into the OS. Version 1 was released in 1977 in support of the VAX-11/780. Version 2 would come along in 1980 for the 750 and Version 3 would come in 1982 for the 730. The VAX 8600 would ship in ’84 with version 4. And here’s where it gets interesting.

The advent of what were originally called microcomputers but are now called personal computers had come in the late 70s and early 80s. By 1984, MicroVMS was released as a port for running on the MicroVAX, Digital’s attempt to go down-market. Much as IBM had initially missed minicomputers, Digital had missed the advent of microcomputers, and the platform never took off. Bill Gates would adorn the cover of Time that year. Of course, by ’84, Apple had AppleTalk and DOS was ready to plug in as well. Bill Joy moved BSD away from the VAX in 1986, after having been with the PDP and then VAX for years, before leaving for Sun. At this point the platform was getting a bit long in the tooth.
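A quick aside on that Modified Julian Day bookkeeping before we move on. Here is a minimal Python sketch of the arithmetic; the function name is just for illustration, and OpenVMS itself actually stores time as a count of 100-nanosecond ticks from that same November 17th, 1858 base date:

    from datetime import date

    MJD_EPOCH = date(1858, 11, 17)  # the VMS base date

    def modified_julian_day(d: date) -> int:
        # MJD = Julian Date - 2,400,000.5, which works out to day 0 on 1858-11-17
        return d.toordinal() - MJD_EPOCH.toordinal()

    print(modified_julian_day(date(1858, 11, 17)))  # 0
    print(modified_julian_day(date(2020, 2, 11)))   # 58890 -- the date of this episode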
Intel and Microsoft were just starting to emerge as dominant players in computing, and DEC was the number two software company in the world, with a dominant sales team and world-class research scientists. They released ULTRIX the same year though, as well as the DECstation with a desktop environment called UW, for ULTRIX Workstation. Ultrix was based on BSD 4, and given that most Unixes had been written on PDPs, Bill Joy knew many of the group launched by Bill Munson, Jerry Brenner, Fred Canter and Bill Shannon. Cutler, over on the VMS team, hated Unix. Rather than have a unified approach, the strategy was fragmented. You see a number of times in the history of computing where a company begins to fail not because team members are releasing things that don’t fit within the strategy, but because they release things that compete directly with a core product without telling their customers why, bogging down the sales process and subsequent adoption in confusion.

This led to brain drain. Cutler ended up going to the Windows NT team, bringing all of his knowledge about security and his sincere midwestern charm to Microsoft and managing the initial development after relations with IBM in the OS/2 world soured. He helped make NT available for the Alpha, but also helped NT eclipse the operating system from his old home. Cutler would end up working on XP, Server operating systems, Azure, and getting the Xbox to run as a host for Hyper-V. He’s just that rad, and his experience goes back to the mid 60s, working on IBM 7044 mainframes. Generational changes in software development, like the move to object-oriented programming or microservices, can force a lot of people into new career trajectories. But he was never one of those. That’s the kind of talent you just really, really, really hate to watch leave an organization - someone that even Microsoft name-drops in developer conference sessions to get oohs and aahs.

And there were a lot of them leaving, as DEC shifted into more of a sales and marketing company and less of a product and research company, as it had been founded to be back when Ken Olsen came out of MIT. We saw the same thing happen in other areas of DEC - competing chips coming out of different groups. But still they continued on. And the lack of centralized resources, the failure to innovate quickly, and the new technical debt being created caused the release of version 5 to slip from a two-year horizon to a four-year horizon, shipping in 1988 with Easynet, so you could connect 2,000 computers together. Version 6 took five years to get out the door, in 1993.

In a sign of the times, 1991 saw VMS become OpenVMS and become POSIX compliant. 1992 saw the release of the DEC Alpha, and OpenVMS would quickly get support for the RISC processor, support that would carry through the transition from Alpha to Itanium after Intel bought the rights to the Alpha architecture. Version 7 of OpenVMS shipped in 1996, but by then the company was in a serious period of decline, and corporate infighting and politics killed them. 1998 came along and they practically bankrupted Compaq by being acquired, and then HP swooped in and got both for a steal. Enterprise computing has never been the same. HP made some smart decisions though. They inked a deal with Intel: Alpha would give way to the HP and Intel Itanium, made by Intel, and Intel ended up with a RISC processor and all the IP that goes along with it. Version 8 would not be released until 2003. Seven years without an OS release while the companies were merged and re-merged had been too long.
Market share had all but disappeared. DECnet would go on to live in the Linux kernel until 2010. Use of the protocol was replaced by TCP/IP, much the same way most of the other protocols got replaced. OpenVMS development has since been licensed to VMS Software Inc., or VSI, which employs many former DEC and HP employees.

There are a lot of great, innovative, unique features of OpenVMS. There’s a common language environment that allows for calling functions easily and independently of the various languages. You can basically mix Fortran, C, BASIC, and other languages. It’s kinda’ like my grandma’s okra. She said I’d like it but I didn’t. VMS is built much the same way. They built it one piece at a time. To quote Johnny Cash: “The transmission was a fifty three, And the motor turned out to be a seventy three, And when we tried to put in the bolts all the holes were gone.” You can of course install PHP, Ruby, Java, and other more modern languages if you want. And the System Services, Run Time Libraries, and language support make it easy to use whatever works for a task across them pretty equally, and provide a number of helpful debugging tools along the way. And beyond debugging, OpenVMS pretty much supports anything you find required by the National Computer Security Center and the DoD.

And after giving the middle finger to Intel for decades… as with most operating systems, VMS is finally being ported to the x86 architecture, signaling the end of one of the few holdouts against the dominance of x86. The Itanium has shipped fewer and fewer chips every year, so maybe we’re finally at that point. Once OpenVMS has been ported to x86 we may see the final end of the chip line, as the last Windows versions to support Itanium stopped being supported by Microsoft about a month before this recording. The end of an era.

I hope Dave Cutler looks back on his time on the VMS project fondly. Sometimes a few decades of crushing an old employer can help heal some old wounds. His contributions to computing are immense, as are those of Digital. And we owe them all a huge thanks for the techniques and lessons learned in the development of VMS in the early days, as with the early days of BSD, the Mac, Windows 1, and others. It all helped build a massive body of knowledge that we continue to iterate off of to this day. I also owe them a thank you for the time I got to spend on my first DEC Alpha. I didn’t get to touch another 64-bit machine for over a decade. And I owe them a thanks for everything I learned using OpenVMS on that machine! And to you, wonderful listeners. Thank you for listening. And especially Derek, for reaching out to tell me I should move OpenVMS up in the queue. I guess it goes without saying… I did! Hope you all have a great day!


The Data General Nova

     3/5/2020

Today we’re going to talk through the history of the Data General Nova. Digital Equipment was founded in 1957 and released a game-changing computer, the PDP-8, in 1965. We covered Digital in a previous episode, but to understand the Data General Nova, you kinda’ need to understand the PDP. It was a fully transistorized computer and it was revolutionary in the sense that it brought interactive computing to the masses. Based in part on research done for MIT in the TX-0 era, the PDP made computing more accessible to companies that couldn’t spend millions on computers, and it was easier to program - and the PDP-1 could be obtained for less than a hundred thousand dollars. You could use a screen and type commands on a keyboard for the first time, and it would actually output to the screen rather than to a teletype or punch cards. That interactivity unlocked so much.

The PDP began the minicomputer revolution. The first real computer game, Spacewar!, was played on it, and adoption increased. The computers got faster. They could do as much as large mainframes. The thousands of transistors were faster and less error-prone than the old tubes. In fact, those transistors signaled that the third generation of computers was upon us. And people who liked the PDP were life-long converts. Fanatical even. The PDP evolved until 1965, when the PDP-8 was released. This is where Edson de Castro comes in, acting as the project manager for the PDP-8 development at Digital. Three years later, he, Henry Burkhardt, and Richard Sogge of Digital would be joined by Herbert Richman, a salesperson from Fairchild Semiconductor.

They were proud of the PDP-8. It was a beautiful machine. But they wanted to go even further. And they didn’t feel like they could do so at Digital. They would build a less expensive minicomputer that opened up even more markets. They saw new circuit board manufacturing techniques, new automation techniques, and new reasons to abandon 12-bit CPU designs. Edson had wanted to build a PDP with all of this and the ability to use 8-bit, 16-bit, or 32-bit architectures, but it got shut down at Digital. So they got two rounds of venture capital at $400,000 each and struck out on their own. They wanted the computer to fit into a 19-inch rack mount. That choice would basically make the 19-inch rack the standard from then on.

They wanted the machines to be 16-bit, moving past the 8- or 12-bit computers common in minicomputing at the time. They used an accumulator-based architecture, which is to say that the CPU had a register that held the intermediate results of calculations. This way you weren’t writing every result out to memory and then sending it right back to the CPU - there’s a toy sketch of the idea just below. Suddenly, you could do far more math, far faster! Having someone from Fairchild really unlocked a lot of knowledge about what was happening in the integrated circuit market. They were able to get the price down into the thousands, not tens of thousands.
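To make the accumulator idea a little more concrete, here is a toy sketch in Python rather than actual Nova assembly - the mnemonics in the comments are purely illustrative, not the Nova’s real instruction set:

    # Toy model: intermediate results live in a CPU register (the accumulator)
    # instead of being written back to memory after every operation.
    memory = {"a": 2, "b": 3, "c": 4, "result": 0}

    acc = memory["a"]        # "LOAD a"  -- pull a into the accumulator
    acc = acc + memory["b"]  # "ADD b"   -- the sum stays in the register
    acc = acc * memory["c"]  # "MUL c"   -- still no intermediate stores
    memory["result"] = acc   # "STORE"   -- one write to memory at the very end

    print(memory["result"])  # 20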

You could actually buy a computer for less than four thousand dollars.

The Nova would ship in 1969 and be an instant success with a lot of organizations, especially smaller science labs like the one at the University of Texas that was their first real paying customer. Within six months they sold 100 units, and within the first few years they were over $100 million in sales. They were eating into Digital’s profits. No one would have invested in Digital had they tried to compete head-on with IBM. Digital had become the leader in the minicomputer market, effectively owning the category. But the Nova posed a threat. Until they decided to get into a horse race with Digital and release the SuperNOVA to compete with the PDP-11. They used space-age designs. They were great computers. But Digital was moving faster. And Data General started to have production and supply chain problems, which led to lawsuits and angry customers. Never good.

By 1977 Digital came out with the VAX line, setting the standard at 32 bits. Data General was late to that party and honestly, after being a market leader in low-cost computing, they started to slip. By the end of the 70s, microchips and personal computers would basically kill minicomputers, and while transitioning from minicomputers to servers, Data General never made quite the same inroads that Digital Equipment did. Data General would end up with their own DOS, their own UNIX System V variant like everyone else, and one of the first portable computers. But by the mid-80s IBM had shown up in the market, and Data General would move into databases and a number of other areas to chase what was becoming a server market.

In fact, the eventual home for Data General would be to get acquired by EMC and become CLARiiON under the EMC imprint. It was an amazing rise. Hardware that often looked like it came straight out of Buck Rogers. Beautiful engineering. But you just can’t compete on price and stay in business forever. Especially when you’re competing with your former bosses, who have much, much deeper pockets.

EMC benefited from a lot of these types of acquisitions over the years, to become a colossus by the end of the 2010s. We can thank Data General, and specifically the space-age Nova, for helping set many standards we use today. We can thank them for helping democratize computing in general. And if you’re a heavy user of EMC appliances, you can probably thank them for plenty of the underlying bits of what you do even through to today. But the minicomputer market of that era required companies to make their own chips, and that was destroyed by the dominance of Intel in the microchip industry. It’s too bad.

So many good ideas. But the costs to keep up turned out to be too much for them, as with many other vendors. One way to think about this story: you can pick up on new manufacturing and design techniques and compete with some pretty large players, especially on price. But when the realities of scaling an operation arrive, you can’t stumble, or customer confidence will erode and there’s a chance you won’t get to compete for deals again in the future. But try telling that to your growing sales team.

I hear people say you have to outgrow the growth rate of your category. You don’t. But you do have to do what you say you will and deliver. And when changes in the industry come, you can’t be all over the place. A cohesive strategy will help you weather the storm. So thank you for tuning into this episode of the History of Computing Podcast. We are so lucky you chose to join us and we hope to see you next time! Have a great day!


The Mouse

     2/18/2020

In a world of rapidly changing technologies, few have lasted as long, in as unaltered a fashion, as the mouse. The party line is that the computer mouse was invented by Douglas Engelbart in 1964 and that it was a one-button wooden device that had two metal wheels. Those used an analog to digital conversion to input a location to a computer. But there’s a lot more to tell.

Engelbart had read an article in 1945 called “As We May Think” by Vannevar Bush while he was in the Philippines working as a radio and radar tech. He’d return home, get his degree in electrical engineering, then go to Berkeley for first his master’s and then a PhD, still in electrical engineering. At the time there were a lot of military grants in computing floating around, and a Navy grant saw him work on a computer called CALDIC, short for the California Digital Computer. By the time he completed his PhD he was ready to start a computer storage company, but ended up at the Stanford Research Institute in 1957. He published a paper in 1962 called Augmenting Human Intellect: A Conceptual Framework. That paper would guide the next decade of his life and help shape nearly everything in computing that came after.

Keeping with the theme of “As We May Think,” Engelbart was all about supplementing what humans could do. The world of computer science had been interested in selecting things on a computer graphically for some time, and Engelbart had a number of devices that he wanted to test in order to find the best possible device for humans to augment their capabilities using a computer. He knew he wanted a graphical system and wanted to be deliberate about every aspect in a very academic fashion. And a key aspect was how the people that used the system would interact with it. The keyboard was already a mainstay, but he wanted people pointing at things on a screen.

While Engelbart would invent the mouse, pointing devices certainly weren’t new. Pilots had been using the joystick for some time, and an electrical joystick had been developed at the US Naval Research Laboratory in 1926, with the concept of unmanned aircraft in mind. The Germans would end up building one in 1944 as well. But it was Alan Kotok who brought the joystick to the computer game in the early 1960s, to play Spacewar! on minicomputers. And Ralph Baer would bring it into homes with an early video game system he began designing in 1967, the Magnavox Odyssey.

Another input device that had come along was the trackball. Ralph Benjamin of the British Royal Navy’s Scientific Service invented the trackball, or ball tracker, for radar plotting on the Comprehensive Display System, or CDS. The computers were analog at the time, but they could still use the X-Y coordinates from the trackball, which was patented in 1947. Tom Cranston, Fred Longstaff and Kenyon Taylor had seen the CDS trackball and used it as the primary input for DATAR, a radar-driven battlefield visualization computer. The trackball stayed in radar systems into the 60s, when Orbit Instrument Corporation made the X-Y Ball Tracker and then Telefunken turned it upside down to control the TR 440, making an early mouse-type device.

The last of the options Engelbart decided against was the light pen. Light guns had shown up in the 1930s when engineers realized that a vacuum tube was light-sensitive. You could shoot a beam of light at a tube and it could react. Robert Everett worked with Jay Forrester to develop the light pen, which would allow people to interact with a CRT, using light sensing to cause an interrupt on a computer.
This would move to the SAGE computer system from there, and eke its way into IBM mainframes in the 60s. While the technology used to track the coordinates is not even remotely similar, think of this as conceptually similar to the styluses used with tablets and on Wacom tablets today. Paul Morris Fitts had built a model in 1954, now known as Fitts’s Law, to predict the time required to move to a target on a screen. He defined the difficulty of hitting a target as a function of the ratio between the distance to the target and the width of the target (there’s a quick sketch of the formula a little further down). If you listen to enough episodes of this podcast, you’ll hear a few names repeatedly. One of those is Claude Shannon. He brought a lot of the math to computing in the 40s and 50s and helped with the Shannon-Hartley Theorem, which defined information transmission rates over a given medium.

So these were the main options at Engelbart’s disposal to test when he started ARC. But in looking at them, he had another idea. He’d sketched out the mouse in 1961 while sitting in a conference session about computer graphics. Once he had funding, he brought in Bill English to build a prototype in 1963. The first model used two perpendicular wheels attached to potentiometers that tracked movement. It had one button to select things on a screen. It tracked x,y coordinates as had previous devices.

NASA funded a study to really dig in and decide which was the best device. He, Bill English, and an extremely talented team spent two years researching the question, publishing a report in 1965. They really had the blinders off, too. They looked at the DEC Grafacon, joysticks, light pens and even what amounts to a knee-operated mouse. Two years of what we’d call UX research, or user research, today. Few organizations would dedicate that much time to study something. But the result would be patenting the mouse in 1967, an innovation that would last for over 50 years. I’ve heard Engelbart criticized for taking so long to build the oNLine System, or NLS, which he showcased at the Mother of All Demos. But it’s worth thinking of his research as academic in nature. It was government funded. And it changed the world. His paper on Computer-Aided Display Controls was seminal.

Vietnam caused a lot of those government-funded contracts to dry up. From there, Bill English and a number of others from the Stanford Research Institute, which ARC was a part of, moved to Xerox PARC. English and Jack Hawley iterated on and improved the technology of the mouse, ditching the analog to digital converters, and over the next few years we’d see some of the most substantial advancements in computing. By 1981, Xerox had shipped the Alto and the Star. But while Xerox would be profitable with their basic research, they would miss something that a sandal-clad hippie wouldn’t. In 1979, Xerox let Steve Jobs make three trips to PARC in exchange for the opportunity to buy 100,000 shares of Apple stock pre-IPO. The mouse by then had evolved into a three-button mouse that cost $300. It didn’t roll well and had to be used on pretty specific surfaces. Jobs would call Dean Hovey, a co-founder of IDEO, and demand they design one that would work on anything, including, quote, “blue jeans.” Oh, and he wanted it to cost $15. And he wanted it to have just one button, which would be an Apple hallmark for the next 30ish years. Hovey-Kelley would move to optical encoder wheels, freeing the tracking ball to move however it needed to, and then use injection-molded frames. And thus make the mouse affordable.
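Here is a small Python sketch of Fitts’s Law to make that ratio idea concrete. It uses the commonly cited Shannon formulation, and the constants a and b are placeholders - in practice they are fitted empirically for a given pointing device:

    import math

    def fitts_movement_time(distance, width, a=0.0, b=0.1):
        # Predicted time to acquire a target: MT = a + b * log2(D / W + 1)
        index_of_difficulty = math.log2(distance / width + 1)  # in bits
        return a + b * index_of_difficulty

    # A big, nearby target is quicker to hit than a small, far-away one:
    print(fitts_movement_time(distance=100, width=50))  # ~0.16 seconds
    print(fitts_movement_time(distance=800, width=10))  # ~0.63 seconds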
It’s amazing what can happen when you combine all that user research and academic rigor from Engelbart’s team and the engineering advancements documented at Xerox PARC with world-class industrial design. You see this trend played out over and over with the innovations in computing that are built to last. The mouse would ship with the Lisa and then with the 1984 Mac. Logitech had shipped a mouse in 1982 for $300. After leaving Xerox, Jack Hawley founded a company to sell a mouse for $400 the same year. Microsoft released a mouse for $200 in 1983. But Apple changed the world when Steve Jobs demanded the mouse ship with all Macs. The IBM PC would use a mouse, and from there it would become ubiquitous in personal computing. Desktops would ship with a mouse. Laptops would have a funny little button that could be used as a mouse when the actual mouse was unavailable. The mouse would ship with extra buttons that could be mapped to additional workflows or macros. And even servers were outfitted with switches that shared a single keyboard, video display, and mouse between machines during the rise of the large server farms that would run the upcoming dot-com revolution. Trays would be put into most racks, with a single U, or unit, of the rack being used to see what you’re working on - especially after Windows or windowing servers started to ship.

As various technologies matured, other innovations came along in input devices. The mouse would go optical in 1980 and ship with early Xerox Star computers, but what we think of as an optical mouse wouldn’t really ship until 1999, when Microsoft released the IntelliMouse. Some of that tech came to them via Hewlett-Packard through the HP acquisition of DEC, and some of those same Digital Systems Research Center engineers had been brought in from the original mainstreamer of the mouse, PARC, when Bob Taylor started the lab. The LED sensor on the mouse stuck around. And thus ended the era of the mouse pad, once a hallmark of many a marketing give-away.

Finger tracking devices came along in 1969 but were far too expensive to produce at the time. As capacitive sensing pads, or trackpads, came down in price and the technology matured, they began to replace the previous mouse-type devices. The 1982 Apollo computers were the first to ship with a touchpad, but it wasn’t until Synaptics launched the TouchPad in 1992 that they began to become common, showing up in 1995 on Apple laptops and then becoming ubiquitous over the coming years. In fact, the IBM ThinkPad and many others shipped laptops with little red nubs in the keyboard for people that didn’t want to use the TouchPad, and kept doing so for quite a while.

Some advancements in the mouse didn’t work out. Apple released the hockey-puck-shaped mouse in 1998, when they released the iMac. It was USB, which replaced the ADB interface. USB lasted. The shape of the mouse didn’t. Apple would go to the monolithic surface mouse in 2000, go wireless in 2003, and then release the Mighty Mouse in 2005. The Mighty Mouse would have a capacitive touch sensor and, since people wanted to hear a click, would produce one with a little speaker. This also signified the beginning of Bluetooth as a means of connecting a mouse. Laptops began to replace desktops for many, and so the mouse itself isn’t as dominant today. And with mobile and tablet computing, resistive and then capacitive touchscreens rose to replace many uses for the mouse.
But even today, when I edit these podcasts, I often switch over to a mouse simply because other means of dragging around timelines aren’t as graceful. And using a pen, as Engelbart’s research from the 60s indicated, simply gets fatiguing.

Whether or not it’s always obvious, we have an underlying story we’re often trying to tell with each of these episodes. We obviously love unbridled innovation and a relentless drive towards a technologically utopian multiverse. But taking a step back during that process and researching what people want means less work and faster adoption. Doug Engelbart was a lot of things, but one net-new point we’d like to make is that he was possibly the most innovative in harnessing user research to make sure that his innovations would last for decades to come. Today, we’d love to research every button, heat-map everything, and track eyeballs. But remembering, as he did, that our job is to augment human intellect, and that this is best done when we make our advances genuinely useful, helps keep us, and the forks in technology that branch off from our work, from having to backtrack decades of effort in order to take the next jump forward.

We believe in the reach of your innovations. So next time you’re working on a project, save yourself time, save your code a little cyclomatic complexity, and save your users the frustration of having to relearn a whole new thing: research what you’re going to do first. Because you never know. Something you engineer might end up being touched by nearly every human on the planet the way the mouse has. Thank you, Engelbart. And thank you to NASA and Bob Roberts from ARPA for funding such important research. And thank you to Xerox PARC, for carrying the torch. And to Steve Jobs for making the mouse accessible to everyday humans. As with many an advance in computing, there are a lot of people that deserve a little bit of the credit. And thank you listeners, for joining us for another episode of the History of Computing Podcast. We’re so lucky to have you. Now stop consuming content and go change the world.


Spacewar! (the Game)

     8/25/2019

It really seems like in the last decade video games have gone from a somewhat niche hobby to a widespread part of our culture. Nowadays, there are a multitude of ways to get our gaming fix. Consoles, handheld game systems, and even smartphones make video games more accessible than ever. But when and how exactly did video games start to creep into the modern consciousness?

In this episode we look at some of the earliest video games and how they came to be.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1962: Spacewar! Developed


Bob Taylor: ARPA to PARC to DEC

     1/15/2021

Robert Taylor was one of the true pioneers in computer science. In many ways, he is the string (or glue) that connected the US government’s era of supporting computer science through ARPA to the innovations that came out of Xerox PARC and then to the work done at Digital Equipment Corporation’s Systems Research Center. Those are three critical aspects of the history of computing, and while Taylor didn’t write any of the innovative code or develop any of the tools that came out of those three research environments, he saw people and projects worth funding and made sure the brilliant scientists got what they needed to get things done.

The 31 years in computing that his stops represent were some of the most formative years for the young computing industry, and his ability to inspire helped carry the advances that began with Vannevar Bush’s 1945 article “As We May Think” through to the explosion of the Internet across personal computers.

Bob Taylor inherited a world where computing was waking up with large, crusty, but finally fully digitized mainframes stuck to its eyes in the morning, and he went to bed the year Corel bought WordPerfect because PCs needed applications, the year the Pentium 200 MHz was released, the year the Palm Pilot and eBay were founded, the year AOL started to show articles from the New York Times, the year IBM opened a web shopping mall and the year the Internet reached 36 million people. Excite and Yahoo went public. Sometimes big, sometimes small, all of these can be traced back to Bob Taylor - kinda’ how we can trace all actors to Kevin Bacon. But more like if Kevin Bacon found talent and helped them get started, by paying them during the early years of their careers… 

How did Taylor end up as the glue for the young and budding computing research industry? Going from tween to teenager during World War II, he went to Southern Methodist University in 1948, when he was 16. He jumped into the US Naval Reserves during the Korean War and then got his master’s in psychology at the University of Texas at Austin using the GI Bill. Many of those pioneers in computing in the 60s went to school on the GI Bill. It was a big deal across every aspect of American life at the time - paving the way to home ownership, college educations, and new careers in the trades. From there, he bounced around, taking classes in whatever interested him, before taking a job at Martin Marietta, helping design the MGM-31 Pershing, and ending up at NASA, where he discovered the emerging computer industry. 

Taylor was working on projects for the Apollo program when he met JCR Licklider, known as the Johnny Appleseed of computing. Lick, as his friends called him, had written an article called Man-Computer Symbiosis in 1960 and had laid out a plan for computing that influenced many. One such person was Taylor. And so it was that in 1962 Licklider began, and in 1965 succeeded in, recruiting Taylor away from NASA to take his place running ARPA’s Information Processing Techniques Office, or IPTO. 

Taylor had funded Douglas Engelbart’s research on computer interactivity at Stanford Research Institute while at NASA. He continued to do so when he got to ARPA and that project resulted in the invention of the computer mouse and the Mother of All Demos, one of the most inspirational moments and a turning point in the history of computing. 

They also funded a project to develop an operating system called Multics. This would be a two million dollar project run by General Electric, MIT, and Bell Labs. Run through Project MAC at MIT, there were just too many cooks in the kitchen. Later, some of those Bell Labs cats would just do their own thing. Ken Thompson had worked on Multics and took the best and worst into account when he wrote the first lines of Unix and the B programming language, which Dennis Ritchie would evolve into one of the most important languages of all time, C. 

Interactive graphical computing and operating systems were great, but IPTO, and so Bob Taylor and team, would fund, straight out of the Pentagon, the ability for one computer to process information on another computer. Which is to say they wanted to network computers. It took a few years, but eventually they brought in Larry Roberts, and by late 1968 they’d awarded an RFQ to build a network to a company called Bolt Beranek and Newman (BBN), who would build Interface Message Processors, or IMPs. The IMPs would connect a number of sites and route traffic, and the first one went online at UCLA in 1969, with additional sites coming on frequently over the next few years. That system would become ARPANET, the commonly accepted precursor to the Internet. 

There was another networking project going on at the time that was also getting funding from ARPA as well as the Air Force: PLATO, out of the University of Illinois. PLATO was meant for teaching and had begun in 1960, but by then they were on version IV, running on a CDC Cyber, and the time sharing system hosted a number of courses, as they referred to programs. These included actual courseware, games, content with audio and video, message boards, instant messaging, custom touch screen plasma displays, and the ability to dial into the system over phone lines, making the system another early network. 

Then things get weird. Taylor is sent to Vietnam as a civilian, although his rank equivalent would be a brigadier general. He helped develop the Military Assistance Command in Vietnam. Battlefield operations and reporting were entering the computing era. The only problem is, while Taylor was a war veteran and had been deep in the defense research industry for his entire career, Vietnam was an incredibly unpopular war, and seeing it first hand and getting pulled into the theater of war had him ready to leave. That was combined with interpersonal problems with Larry Roberts, who was running the ARPA project by then and bristled at Taylor being his boss without a PhD or direct research experience. And so Taylor joined a project ARPA had funded at the University of Utah and left ARPA. 

There, he worked with Ivan Sutherland, who wrote Sketchpad and is known as the Father of Computer Graphics, until he got another offer. This time, from Xerox to go to their new Palo Alto Research Center, or PARC. One rising star in the computer research world was pretty against the idea of a centralized mainframe driven time sharing system. This was Alan Kay. In many ways, Kay was like Lick. And unlike the time sharing projects of the day, the Licklider and Kay inspiration was for dedicated cycles on processors. This meant personal computers. 

The Mansfield Amendment in 1973 banned general research by defense agencies. This meant that ARPA funding started to dry up and the scientists working on those projects needed a new place to fund their playtime. Taylor was able to pick the best of the scientists he’d helped fund at ARPA. He helped bring in people from the Stanford Research Institute, where they had been working on the oNLine System, or NLS. 

This new Computer Science Laboratory landed people like Charles Thacker, David Boggs, Butler Lampson, and Bob Sproull, and would develop the Xerox Alto, the inspiration for the Macintosh. The Alto contributed the very ideas of overlapping windows, icons, menus, cut and paste, and word processing. In fact, Charles Simonyi from PARC would work on Bravo before moving to Microsoft to spearhead Microsoft Word.

Bob Metcalfe on that team was instrumental in developing Ethernet, so workstations could communicate with the ARPANET across the growing campus-connected environments. Metcalfe would leave to form 3Com. 

SuperPaint would be developed there and Alvy Ray Smith would go on to co-found Pixar, continuing the work begun by Richard Shoup. 

They developed the laser printer, some of the ideas that ended up in TCP/IP, and their research into page layout languages would end up with Chuck Geschke, John Warnock, and others founding Adobe. 

Kay would bring us the philosophy behind the Dynabook, which decades later would effectively become the iPad. He would also develop Smalltalk with Dan Ingalls and Adele Goldberg, ushering in the era of object-oriented programming. 

They would do pioneering work on VLSI semiconductors, ubiquitous computing, and anything else to prepare the world to mass produce the technologies that ARPA had been spearheading for all those years. Xerox famously did not mass produce those technologies. And nor could they have cornered the market on all of them. The coming waves were far too big for one company alone. 

And so it was that PARC, unable to bring the future to the masses fast enough to impact earnings per share, got a new director in 1983, and William Spencer was yet another of the three bosses that Taylor clashed with. Some resented that he didn’t have a PhD in a world where everyone else did. Others resented the close relationship he maintained with the teams. Either way, Taylor left PARC in 1983 and many of the scientists left with him. 

It’s both a curse and a blessing to learn more and more about our heroes. Taylor was one of the finest minds in the history of computing. His tenure at PARC certainly saw a lot of innovation and one of the most innovative teams ever assembled. But as many of us who have been put into a position of leadership know, it’s easy to get caught up in the politics. I am ashamed every time I look back and see examples of building political capital at the expense of a project, or letting an interpersonal problem get in the way of the greater good for a team. But also, we’re all human, and the people that I’ve interviewed seem to match the accounts I’ve read in other books. 

And so Taylor’s final stop was Digital Equipment Corporation, where he was hired to form their Systems Research Center in Palo Alto. They brought us the AltaVista search engine, the Firefly computer, Modula-3, and a few other advances. Taylor retired in 1996. DEC was acquired by Compaq in 1998, and when Compaq was acquired by HP, the SRC was merged with other labs at HP. 

From ARPA to Xerox to Digital, Bob Taylor certainly left his mark on computing. He had a knack for seeing the forest for the trees and inspired engineering feats the world is still wrestling with how to bring to fruition. Raw, pure science. He died in 2017. He worked with some of the most brilliant people in the world at ARPA. He inspired passion, and sometimes drama, in what Stanford’s Donald Knuth called “the greatest by far team of computer scientists assembled in one organization.” 

In his final email to his friends and former coworkers, he said “You did what they said could not be done, you created things that they could not see or imagine.” The Internet, the Personal Computer, the tech that would go on to become Microsoft Office, object oriented programming, laser printers, tablets, ubiquitous computing devices. So, he isn’t exactly understating what they accomplished in a false sense of humility. I guess you can’t do that often if you’re going to inspire the way he did. 

So feel free to abandon the pretense as well, and go inspire some innovation. Heck, who knows where the next wave will come from. But if we aren’t working on it, it certainly won’t come.

Thank you so much and have a lovely, lovely day. We are so lucky to have you join us on yet another episode. 


(OldComputerPods) ©Sean Haas, 2020