'digital' Episodes

VMS

     2/11/2020

VMS and OpenVMS

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we're going to talk through the history of VMS.

Digital Equipment Corporation gave us many things. Once upon a time, I used a DEC Alpha running OpenVMS. The PDP-11 had changed the world, introducing us to a number of modern concepts in computers, such as time sharing. The PDP was a minicomputer, smaller and more modern than mainframes. But by 1977 it was time for the next generation, and the VAX ushered in the 32-bit era of computers, evolving over time into the VAXserver and helping to usher in the modern era of client-server architectures. It supported branch delay slots and suppressed instructions. The VAX adopted virtual memory and privilege modes, and it needed an operating system capable of harnessing all the new innovations packed into the VAX-11 and its successors. That OS would be the Virtual Memory System, or VMS.

The PDP had an operating system called RSX-11, which had been released in 1972. The architect was Dan Brevik, who had originally called it DEX as a homonym with DEC. But that was trademarked, so he and Bob Decker over in marketing wrote down a bunch of acronyms and found one that wasn't trademarked. Then they had to reverse engineer a meaning out of the acronym, arriving at Real-Time System Executive, or RSX. But the VAX needed more, and so Dave Cutler from the RSX team, then in his early 30s, did much of the design work. Dick Hustvedt and Peter Lipman would join him, and they would roll up to Roger Gourd, who worked with DEC's VP of engineering, Gordon Bell, to build the environment. The project began as Starlet, named because it was meant to support the Starlet family of processors - a name that still lives on in various files in the operating system. VMS would eventually support RISC processors, supported the 32-bit virtual address extension that gave the VAX its name, worked with DECnet, and of course had virtual memory, as the name implies. VMS would also bring a number of innovations in the world of clustering.

VMS uses a Modified Julian Day system to keep track of system time, which subtracts 2,400,000.5 from the Julian Date. Why? Because it begins on November 17th, 1858. That's not why, that's just the day it starts. Why? Because it's not Y10,000 compliant, only having four digits for the year. Wait, that's not a thing. Anyway, how did VMS come to be?

One of the killer apps for the system was that DECnet was built on the DIGITAL Network Architecture, or DNA. It first showed up in RSX, where you could link two PDPs, but you could have 32 nodes by the time the VAX showed up, and 255 with VMS 2. Suddenly there was a simple way to network these machines, built into the OS. Version 1 was released in 1977 in support of the VAX-11/780. Version 2 would come along in 1980 for the 750, and Version 3 would come in 1982 for the 730. The VAX 8600 would ship in '84 with version 4. And here's where it gets interesting. The advent of what were originally called microcomputers, but are now called personal computers, had come in the late 70s and early 80s. By 1984, MicroVMS was released as a port for running on the MicroVAX, Digital's attempt to go down-market. Much as IBM had initially missed minicomputers, Digital had missed the advent of microcomputers, and the platform never took off. Bill Gates would adorn the cover of Time that year. Of course, by '84, Apple had AppleTalk and DOS was ready to plug in as well. Bill Joy moved BSD away from the VAX in 1986, after having been with the PDP and then the VAX for years, before leaving for Sun. At this point the platform was getting a bit long in the tooth.
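Since we brought up that Modified Julian Day trivia, here's the arithmetic as a quick sketch in C. The conversion is the standard definition of a Modified Julian Date; the sample value just happens to be the day this episode dropped.

    /* Modified Julian Date: MJD = JD - 2,400,000.5.
       JD 2,400,000.5 falls at midnight on November 17, 1858 -
       which is also the epoch of the VMS system clock. */
    #include <stdio.h>

    int main(void) {
        double jd  = 2458890.5;        /* midnight, February 11, 2020 */
        double mjd = jd - 2400000.5;   /* days since November 17, 1858 */
        printf("MJD: %.1f\n", mjd);    /* prints MJD: 58890.0 */
        return 0;
    }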
Intel and Microsoft were just starting to emerge as dominant players in computing, and DEC was the number two computer company in the world, with a dominant sales team and world class research scientists. They released ULTRIX the same year though, as well as the DECstation, with a desktop environment called UW, for ULTRIX Workstation. ULTRIX was based on BSD 4, and given that most Unixes had been written on PDPs, Bill Joy knew many of the group launched by Bill Munson, Jerry Brenner, Fred Canter, and Bill Shannon. Cutler, from the VMS team, famously hated Unix. Rather than have a unified approach, the strategy was fragmented. You see a number of times in the history of computing where a company begins to fail, not because team members are releasing things that don't fit within the strategy, but because they release things that compete directly with a core product without telling their customers why, bogging down the sales process and subsequent adoption in confusion.

This led to brain drain. Cutler ended up going to the Windows NT team, bringing all of his knowledge about security and his sincere midwestern charm to Microsoft and managing the initial development after relations with IBM in the OS/2 world soured. He helped make NT available for the Alpha, but he also helped NT come to dominate the operating system market his old home had owned. Cutler would end up working on XP, the server operating systems, Azure, and getting the Xbox to run as a host for Hyper-V. He's just that rad, and his experience goes back to the mid 60s, working on IBM 7044 mainframes. Generational changes in software development, like the move to object oriented programming or microservices, can force a lot of people into new career trajectories. But he was never one of those. That's the kind of talent you just really, really, really hate to watch leave an organization - someone that even Microsoft name-drops in developer conference sessions to get ooohs and aaahs. And there were a lot of them leaving as DEC shifted into more of a sales and marketing company and less of a product and research company, as it had been founded to be back when Ken Olsen was at MIT. We saw the same thing happen in other areas of DEC - competing chips coming out of different groups. But still they continued on.

The lack of centralized resources, slower innovation, and new technical debt caused the release of version 5 to slip from a two year horizon to a four year horizon, shipping in 1988 with Easynet, so you could connect 2,000 computers together. Version 6 took five years to get out the door, in 1993. In a sign of the times, 1991 saw VMS renamed to OpenVMS, which also became POSIX compliant. 1992 saw the release of the DEC Alpha, and OpenVMS would quickly get support for the RISC processor, support it kept through the transition from Alpha to Itanium after Intel bought the rights to the Alpha architecture. Version 7 of OpenVMS shipped in 1996, but by then the company was in a serious period of decline, and corporate infighting and politics killed them. 1998 came along and they practically bankrupted Compaq by being acquired, and then HP swooped in and got both for a steal. Enterprise computing has never been the same. HP made some smart decisions though. They inked a deal with Intel: the Alpha intellectual property went to Intel, and OpenVMS moved to the Itanium line, made by Intel. Intel then had a RISC processor and all the IP that goes along with it. Version 8 would not be released until 2003. Seven years without an OS update, while the companies were merged and re-merged, had been too long.
Market share had all but disappeared. DECnet would go on to live in the Linux kernel until 2010. Use of the protocol was replaced by TCP/IP, much the same way most of the other protocols got replaced. OpenVMS development has since been licensed to VMS Software Inc. (VSI), which employs many former DEC and HP engineers.

There are a lot of great, innovative, unique features in OpenVMS. There's a common language environment that allows for calling functions easily and independently of various languages. You can basically mix Fortran, C, BASIC, and other languages. It's kinda' like my grandma's okra. She said I'd like it but I didn't. VMS is built much the same way. They built it one piece at a time. To quote Johnny Cash: "The transmission was a fifty three, And the motor turned out to be a seventy three, And when we tried to put in the bolts all the holes were gone." You can of course install PHP, Ruby, Java, and other more modern languages if you want. And the System Services, Run Time Libraries, and language support make it easy to use whatever works for a task pretty much equally, and they provide a number of helpful debugging tools along the way. And beyond debugging, OpenVMS pretty much supports anything you find required by the National Computer Security Center and the DoD.

And after giving the middle finger to Intel for decades, VMS, as with most operating systems, is finally being ported to the x86 architecture, signaling the end of one of the few holdouts to the dominance of x86. The Itaniums have shipped fewer and fewer chips every year, so maybe we're finally at that point. Once OpenVMS has been ported to x86, we may see the final end of the chip line, as the last Windows versions to support Itanium stopped being supported by Microsoft about a month before this recording. The end of an era.

I hope Dave Cutler looks back on his time on the VMS project fondly. Sometimes a few decades of crushing an old employer can help heal some old wounds. His contributions to computing are immense, as are those of Digital. And we owe them all a huge thanks for the techniques and lessons learned in the development of VMS in the early days, as with the early days of BSD, the Mac, Windows 1, and others. It all helped build a massive body of knowledge that we continue to iterate off of to this day. I also owe them a thank you for the time I got to spend on my first DEC Alpha. I didn't get to touch another 64-bit machine for over a decade. And I owe them a thanks for everything I learned using OpenVMS on that machine! And to you, wonderful listeners: thank you for listening. And especially Derek, for reaching out to tell me I should move OpenVMS up in the queue. I guess it goes without saying... I did! Hope you all have a great day!
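A quick footnote on that common language environment: every OpenVMS compiler follows one common calling standard, so code in one language can call a routine written in another directly. Here's a minimal C-side sketch of the idea - the ADD_TWO routine and its Fortran source are hypothetical, just for illustration:

    /* Suppose a separately compiled Fortran file provides:
           SUBROUTINE ADD_TWO(A, B, RESULT)
           INTEGER A, B, RESULT
           RESULT = A + B
           END
       Under the common calling standard, Fortran passes arguments
       by reference, so the C prototype takes pointers. */
    #include <stdio.h>

    extern void add_two(int *a, int *b, int *result);

    int main(void) {
        int a = 2, b = 3, result;
        add_two(&a, &b, &result);  /* calls straight into the Fortran routine */
        printf("%d + %d = %d\n", a, b, result);
        return 0;
    }

Compile each piece with its own compiler, link the objects together, and the calling standard handles the rest.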


The Data General Nova

     3/5/2020

Today we're going to talk through the history of the Data General Nova. Digital Equipment was founded in 1957 and released a game-changing computer, the PDP-8, in 1965. We covered Digital in a previous episode, but to understand the Data General Nova, you kinda' need to understand the PDP. It was a fully transistorized computer, and it was revolutionary in the sense that it brought interactive computing to the masses. Based in part on research done for MIT in the TX-0 era, the PDP made computing more accessible to companies that couldn't spend millions on computers, and it was easier to program - and the PDP-1 could be obtained for less than a hundred thousand dollars. You could type commands on a keyboard and see output on a screen for the first time, rather than feeding in punch cards and reading teletype output. That interactivity unlocked so much.

The PDP began the minicomputer revolution. The first real computer game, Spacewar!, was played on it, and adoption increased. The computers got faster. They could do as much as large mainframes. The thousands of transistors were faster and less error-prone than the old tubes. In fact, those transistors signaled that the third generation of computers was upon us. And people who liked the PDP were life-long converts. Fanatical, even. The PDP evolved until 1965, when the PDP-8 was released. This is where Edson de Castro comes in, acting as the project manager for the PDP-8 development at Digital. Three years later, he, Henry Burkhardt, and Richard Sogge of Digital would be joined by Herbert Richman, a salesperson from Fairchild Semiconductor.

They were proud of the PDP-8. It was a beautiful machine. But they wanted to go even further, and they didn't feel they could do so at Digital. They would build a less expensive minicomputer that opened up even more markets. They saw new circuit board manufacturing techniques, new automation techniques, and new reasons to move beyond 12-bit CPUs. Edson had wanted to build a PDP with all of this and the ability to use 8-bit, 16-bit, or 32-bit architectures, but it got shut down at Digital. So they raised two rounds of venture capital at $400,000 each and struck out on their own. They wanted the computer to fit into a 19-inch rack mount. That choice would basically make the 19-inch rack the standard from then on.

They wanted the machines to be 16-bit, moving past the 8- or 12-bit computers common in minicomputing at the time. They used an accumulator-based architecture, which is to say the CPU had a register that held the intermediate results of calculations. This way you weren't writing every result out to memory and then pulling it right back into the CPU. Suddenly, you could do infinitely more math! Having someone from Fairchild really unlocked a lot of knowledge about what was happening in the integrated circuit market. They were able to get the price down into the thousands, not tens of thousands.

You could actually buy a computer for less than 4 thousand dollars.
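To make that accumulator idea concrete, here's a toy sketch in C. This is not real Nova assembly - the actual machine had four accumulators and a different instruction set - it's just the flavor of keeping intermediate results in a register instead of round-tripping through core memory on every operation:

    #include <stdio.h>

    int memory[8] = {5, 7, 3};   /* pretend core memory */
    int acc = 0;                 /* the accumulator register */

    void lda(int addr) { acc = memory[addr]; }    /* load accumulator */
    void add(int addr) { acc += memory[addr]; }   /* add memory into acc */
    void sta(int addr) { memory[addr] = acc; }    /* store accumulator */

    int main(void) {
        lda(0);        /* acc = 5 */
        add(1);        /* acc = 12 */
        add(2);        /* acc = 15; no memory writes in between */
        sta(3);        /* one store at the very end */
        printf("sum = %d\n", memory[3]);
        return 0;
    }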

The Nova would ship in 1969 and be an instant success with a lot of organizations, especially smaller science labs, like the one at the University of Texas that was their first real paying customer. Within six months they sold 100 units, and within the first few years they were over $100 million in sales. They were eating into Digital's profits. No one would have invested in Data General had they tried to compete head-on with IBM; Digital had become the leader in the minicomputer market, effectively owning the category. But the Nova posed a threat. Then they decided to get into a horse race with Digital and released the SuperNOVA to compete with the PDP-11. They used space age designs. They were great computers. But Digital was moving faster. And Data General started to have production and supply chain problems, which led to lawsuits and angry customers. Never good.

By 1977 Digital came out with the VAX line, setting the standard at 32 bits. Data General was late to that party, and honestly, after being a market leader in low-cost computing, they started to slip. By the end of the 70s, microchips and personal computers would basically kill minicomputers, and while transitioning from minicomputers to servers, Data General never made quite the same inroads that Digital Equipment did. Data General would end up with their own DOS, their own UNIX System V variant like everyone else, and one of the first portable computers. But by the mid-80s, IBM had shown up on the market, and Data General would move into databases and a number of other areas to justify its place in what was becoming a server market.

In fact, the eventual home for Data General would be to get acquired by EMC and become CLARiiON under the EMC imprint. It was an amazing rise. Hardware that often looked like it came straight out of Buck Rogers. Beautiful engineering. But you just can't compete on price and stay in business forever. Especially when you're competing with your former bosses, who have much, much deeper pockets.

EMC benefited from a lot of these types of acquisitions over the years, becoming a colossus by the end of the 2010s. We can thank Data General, and specifically the space-age Nova, for helping set many standards we use today. We can thank them for helping democratize computing in general. And if you're a heavy user of EMC appliances, you can probably thank them for plenty of the underlying bits of what you do, even through to today. But the minicomputer market of that era required companies to make their own chips, and that model was destroyed by the dominance of Intel in the microchip industry. It's too bad.

So many good ideas. But the costs to keep up turned out to be too much for them, as with many other vendors. One way to think about this story: you can pick up on new manufacturing and design techniques and compete with some pretty large players, especially on price. But when the realities of scaling an operation arrive, you can't stumble, or customer confidence will erode, and there's a chance you won't get to compete for deals again in the future. But try telling that to your growing sales team.

I hear people say you have to outgrow the growth rate of your category. You don't. But you do have to do what you say you will and deliver. And when changes in the industry come, you can't be all over the place. A cohesive strategy will help you weather the storm. So thank you for tuning into this episode of the History of Computing Podcast. We are so lucky you chose to join us and we hope to see you next time! Have a great day!


Happy Birthday ENIAC

     2/15/2020

Today we're going to celebrate the birthday of the first real multi-purpose computer: the gargantuan ENIAC, which would have turned 74 years old today, on February 15th. That's many generations ago in computing. The year is 1946. World War II raged from 1939 to 1945. We'd cracked Enigma with computers, and scientists were thinking of more and more ways to use them. The press is now running articles about a "giant brain" built in Philadelphia. The Electronic Numerical Integrator and Computer was a mouthful, so they called it ENIAC. It was the first true general-purpose electronic computer. Before that there were electromechanical monstrosities, which had to physically move a part in order to process a mathematical formula. That took time. ENIAC used vacuum tubes instead. A lot of them. To put things in perspective: every hour of processing by the ENIAC was worth 2,400 hours of work calculating formulas by hand. And it's not like you can easily split 2,400 hours of hand calculation across people, or run them back to back. So it made the previously almost impossible, possible. Sure, you could figure out the settings to fire a shell where you wanted it to go in a minute, rather than about a full day of running calculations. But math itself, for the purposes of math, was about to get really, really cool.

The Bush Differential Analyzer, an earlier mechanical computer, had been built in the basement of the building that is now the ENIAC museum. The University of Pennsylvania ran a class on wartime electronics, based on their experience with the Differential Analyzer. John Mauchly and J. Presper Eckert met in 1941 while taking that class, a course that included lots of shiny new or newish things like radar and cryptanalysis. But the class was mostly on ballistics, a core focus at the Moore School of Electrical Engineering at the University of Pennsylvania. More accurate ballistics would be a huge contribution to the war effort. But Eckert and Mauchly wanted to go further, building a multi-purpose computer that could analyze weather and calculate ballistics. Mauchly got all fired up and wrote a memo about building a general purpose computer. But the University shot it down. And so ENIAC began life as Project PX, with Herman Goldstine acting as the main sponsor after seeing their proposal and digging it back up. Mauchly would team up with Eckert to design the computer, and the effort was overseen and orchestrated by Major General Gladeon Barnes of the US Army Ordnance Corps. Thomas Sharpless was the master programmer. Arthur Burks built the multiplier. Robert Shaw designed the function tables. Harry Huskey designed the reader and the printer. Jeffrey Chu built the dividers. And Jack Davis built the accumulators. Ultimately it was just a really big calculator, and not a computer that ran stored programs in the same way we do today - although ENIAC did get an early version of stored programming that used a function table for read-only memory. The project was supposed to cost $61,700. The University of Pennsylvania actually spent half a million dollars' worth of metal, tubes, and wires. And of course the scientists weren't free. That's around six and a half million dollars today. And of course it was paid for by the US Army, specifically the Ballistic Research Laboratory. It was designed to calculate firing tables to make blowing things up a little more accurate.
Herman Goldstine chose a team of programmers that included Betty Jennings, Betty Snyder, Kay McNulty, Fran Bilas, Marlyn Meltzer, and Ruth Lichterman. They were chosen from a pool of 200 and set about writing the formulas the machine would process, based on the requirements provided by the people using time on the machine. In fact, Kay McNulty invented the concept of subroutines while working on the project. They would flip switches and plug in cables as a means of programming the computer. Programming took weeks of working up complex calculations on paper, then days of fiddling with cables, switches, tubes, and panels to input the program. Debugging was done step by step, similar to how we use breakpoints today. They would feed ENIAC input using IBM punch cards and readers. The output was punch cards as well, and these punch cards acted as persistent storage. The machine used standard octal-base radio tubes - 18,000 of them - run at a lower voltage than they were rated for, in order to minimize them blowing out and creating heat. Each digit used in calculations took 36 of those vacuum tubes, and the machine had 20 accumulators that could run 5,000 operations per second. The accumulators used two of those tubes to form a flip-flop, and they got them from the Kentucky Electrical Lamp Company. Given the number that blew every day at first, the team must not have loved life until engineers got it down to blowing only a tube every couple of days. ENIAC was a modular computer that used different panels to perform different tasks, or functions. It used ring counters with ten positions for a lot of operations, making it a decimal computer, as opposed to the binary computational devices we have today. The pulses passed between stages of the rings were used to count.

Suddenly computers were big money. A lot of research had happened in a short amount of time. Some had been government funded and some had been part of corporations, and it became impossible to untangle the two. This was pretty common with technical advances during World War II and the early Cold War years. John Atanasoff and Cliff Berry had ushered in the era of the digital computer in 1939 but hadn't finished; Mauchly had seen their work in 1941. ENIAC was used to run a number of calculations for the Manhattan Project, allowing us to blow more things up than ever. That project took over a million punch cards and took precedence over artillery tables. John von Neumann worked with a number of mathematicians and physicists, including Stanislaw Ulam, who developed the Monte Carlo method, which led to a massive reduction in programming time. Suddenly programming became more about I/O than anything else. To promote the emerging computing industry, the Pentagon had the Moore School of Electrical Engineering at the University of Pennsylvania launch a series of lectures to further computing at large. These were called the Theory and Techniques for Design of Electronic Digital Computers, or just the Moore School Lectures for short. The lectures focused on the various types of circuits and the findings from Eckert and Mauchly on building and architecting computers. Goldstine would talk at length about math, and other developers would give talks, looking forward to the development of the EDVAC and back at how they got where they were with ENIAC. As the University began to realize the potential business impact and monetization, they decided to bring a focus to University-owned patents.
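To picture those ring counters, here's a toy sketch in C: exactly one of ten stages is on at a time, each pulse advances the ring by one position, and wrapping past nine sends a carry to the next digit's ring. This is a loose software model, not ENIAC's actual circuitry:

    #include <stdio.h>

    int ring[10] = {1};   /* stage 0 lit; one stage on at a time */

    /* advance the ring one pulse; return 1 when it wraps (the carry) */
    int pulse(void) {
        for (int i = 0; i < 10; i++) {
            if (ring[i]) {
                ring[i] = 0;
                ring[(i + 1) % 10] = 1;
                return (i == 9);   /* wrapped from 9 back to 0: carry out */
            }
        }
        return 0;
    }

    int main(void) {
        int carries = 0;
        for (int p = 0; p < 23; p++)   /* send 23 pulses */
            carries += pulse();
        for (int i = 0; i < 10; i++)
            if (ring[i])
                printf("digit: %d, carries: %d\n", i, carries);  /* 3, 2 */
        return 0;
    }

Chain one ring per decimal digit, and counting pulses gives you addition - which is how ENIAC's accumulators did their math.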
That focus on patents drove the original designers out of the University of Pennsylvania, and they started the Eckert-Mauchly Computer Corporation in 1946. Eckert-Mauchly would then build the EDVAC, making use of the progress the industry had made since ENIAC construction began. The EDVAC would effectively represent the wholesale move away from decimal and into binary computing, and while it weighed tons, it would become the precursor to the microchip. After the ENIAC was finished, Mauchly filed for a patent in 1947. While a patent was granted, you could still count on your fingers the number of machines built at about the same time, including the Atanasoff-Berry Computer, Colossus, the Harvard Mark I, and the Z3. So luckily the patent was voided, and digital computers became part of the public domain; the patent was formally invalidated in 1973. By then, the Eckert-Mauchly Computer Corporation had been acquired by Remington Rand, which merged with Sperry and is now called Unisys. The next wave of computers would be mainframes built by GE, Honeywell, IBM, and a number of other vendors, and so the era of batch processing mainframes began. The EDVAC begat the UNIVAC, with Grace Hopper brought in to write an assembler for it. Computers would become the big mathematical number crunchers and slowly spread into being data processors from there. Following decades of batch processing mainframes, we would get minicomputers and interactivity, then time sharing, and then the PC revolution. Distinct eras in computing. Today, computers do far more than just the types of math the ENIAC did. In fact, the functionality of ENIAC was duplicated onto a 20 megahertz microchip in 1996. You know, 'cause the University of Pennsylvania wanted to do something to celebrate the 50th birthday, and a birthday party seemed underwhelming at the time. And so the date of release for this episode is February 15th, now ENIAC Day in Philadelphia, dedicated as a way to thank the university, creators, and programmers. And we should all reiterate their thanks. They helped put computers front and center in the thoughts of the next generation of physicists, mathematicians, and engineers, who built the mainframe era. And I should thank you - for listening to this episode. I'm pretty lucky to have ya'. Have a great day!


(OldComputerPods) ©Sean Haas, 2020