
Happy Birthday ENIAC


Today we’re going to celebrate the birthday of the first real multi-purpose computer: the gargantuan ENIAC, which would have turned 74 years old today, on February 15th. That's many generations ago in computing. The year is 1946. World War II had raged from 1939 to 1945. We’d cracked Enigma with computers, and scientists were thinking of more and more ways to use them. The press was now running articles about a “giant brain” built in Philadelphia. The Electronic Numerical Integrator and Computer was a mouthful, so they called it ENIAC. It was the first true electronic computer. Before that there were electromechanical monstrosities, which had to physically move a part in order to process a mathematical formula. That took time. ENIAC used vacuum tubes instead. A lot of them. To put things in perspective: every hour of processing by the ENIAC was worth 2,400 hours of calculating formulas by hand. And it’s not like you could split those 2,400 hours between people in parallel, or run them back to back, of course. So it made the previously almost impossible, possible. Sure, you could figure out the settings to fire a shell where you wanted it to go in a minute rather than in about a full day of running calculations. But math itself, for the purposes of math, was about to get really, really cool. The Bush Differential Analyzer, an earlier mechanical computer, had been built in the basement of the building that is now the ENIAC museum. The University of Pennsylvania ran a class on wartime electronics, based on their experience with the Differential Analyzer. John Mauchly and J. Presper Eckert met in 1941 while taking that class, which covered lots of shiny new or newish things like radar and cryptanalysis. The class was mostly on ballistics, a core focus at the Moore School of Electrical Engineering at the University of Pennsylvania. More accurate ballistics would be a huge contribution to the war effort. 
But Eckert and Mauchly wanted to go further, building a multi-purpose computer that could analyze weather and calculate ballistics. Mauchly got all fired up and wrote a memo about building a general purpose computer. But the University shot it down. And so ENIAC began life as Project PX, after Herman Goldstine dug the proposal back up and acted as its main sponsor. Mauchly teamed up with Eckert to design the computer, and the effort was overseen and orchestrated by Major General Gladeon Barnes of the US Army Ordnance Corps. Thomas Sharpless was the master programmer. Arthur Burks built the multiplier. Robert Shaw designed the function tables. Harry Huskey designed the reader and the printer. Jeffrey Chu built the dividers. And Jack Davis built the accumulators. Ultimately it was just a really big calculator, not a computer that ran stored programs the same way we do today, although ENIAC did get an early version of stored programming that used a function table for read-only memory. The project was supposed to cost $61,700. The University of Pennsylvania in Philadelphia actually spent half a million dollars’ worth of metal, tubes, and wires. And of course the scientists weren’t free. That’s around six and a half million dollars’ worth of cash today. And of course it was paid for by the US Army. Specifically, the Ballistic Research Laboratory. It was designed to calculate firing tables, to make blowing things up a little more accurate. Herman Goldstine chose a team of programmers that included Betty Jennings, Betty Snyder, Kay McNulty, Fran Bilas, Marlyn Meltzer, and Ruth Lichterman. They were chosen from a pool of 200 and set about writing the necessary formulas for the machine to process the requirements provided by people using time on the machine. In fact, Kay McNulty invented the concept of subroutines while working on the project. 
They would flip switches and plug in cables as a means of programming the computer. And programming took weeks of figuring out complex calculations on paper. Then it took days of fiddling with cables, switches, tubes, and panels to input the program. Debugging was done step by step, similar to how we use breakpoints today. They would feed ENIAC input using IBM punch cards and readers. The output was punch cards as well, and these punch cards acted as persistent storage. The machine used standard octal-base radio tubes. 18,000 tubes, and they ran at a lower voltage than they could have, in order to minimize blowouts and heat. Each digit used in calculations took 36 of those vacuum tubes, and there were 20 accumulators that could run 5,000 operations per second. The accumulators used two of those tubes to form a flip-flop, and they got them from the Kentucky Electrical Lamp Company. Given the number that blew every day, the engineers must have loved life once they got it down to blowing a tube only every couple of days. ENIAC was a modular computer and used different panels to perform different tasks, or functions. It used ring counters with 10 positions for a lot of operations, making it a decimal computer as opposed to the binary computational devices we have today. The pulses between the rings were used to count. Suddenly computers were big money. A lot of research had happened in a short amount of time. Some had been government funded and some had been part of corporations, and it became impossible to untangle the two. This was pretty common with technical advances during World War II and the early Cold War years. John Atanasoff and Cliff Berry had ushered in the era of the digital computer in 1939 but hadn’t finished their machine. Mauchly had seen it in 1941. ENIAC was used to run a number of calculations for the Manhattan Project, allowing us to blow more things up than ever. That project took over a million punch cards and took precedence over artillery tables. 
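Those decimal ring counters are easy to sketch in a few lines of modern code. This is a toy illustration of the counting principle only, not ENIAC's actual circuitry: each digit is a ten-position ring, a pulse advances it one position, and a wrap-around carries a pulse into the next ring.

```python
class RingCounter:
    """One decimal digit stored as a position in a 10-state ring."""
    def __init__(self):
        self.position = 0

    def pulse(self):
        """Advance one position; return True when the ring wraps (a carry)."""
        self.position = (self.position + 1) % 10
        return self.position == 0


class Accumulator:
    """A row of ring counters, least significant digit first."""
    def __init__(self, digits=10):
        self.rings = [RingCounter() for _ in range(digits)]

    def add_pulses(self, n):
        """Feed n pulses into the lowest digit, propagating carries."""
        for _ in range(n):
            i = 0
            while i < len(self.rings) and self.rings[i].pulse():
                i += 1  # a wrap in ring i carries a pulse into ring i+1

    def value(self):
        return sum(r.position * 10 ** i for i, r in enumerate(self.rings))


acc = Accumulator()
acc.add_pulses(1234)
print(acc.value())  # 1234
```

The real machine did this with vacuum-tube flip-flops and electrical pulses rather than objects and loops, but counting in rings of ten is why ENIAC was decimal rather than binary.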
John von Neumann worked with a number of mathematicians and physicists, including Stanislaw Ulam, who developed the Monte Carlo method. That led to a massive reduction in programming time. Suddenly programming became more about I/O than anything else. To promote the emerging computing industry, the Pentagon had the Moore School of Electrical Engineering at the University of Pennsylvania launch a series of lectures to further computing at large. These were called the Theory and Techniques for Design of Electronic Digital Computers, or just the Moore School Lectures for short. The lectures focused on the various types of circuits and the findings from Eckert and Mauchly on building and architecting computers. Goldstine would talk at length about math, and other developers would give talks, looking forward to the development of the EDVAC and back at how they got where they were with ENIAC. As the University began to realize the potential business impact and monetization, they decided to bring a focus to University-owned patents. That drove the original designers out of the University of Pennsylvania, and they started the Eckert-Mauchly Computer Corporation in 1946. Eckert-Mauchly would then build the EDVAC, making use of the progress the industry had made since ENIAC construction had begun. EDVAC would effectively represent the wholesale move away from decimal and into binary computing, and while it weighed tons, it would become a precursor to the modern stored-program computer. After ENIAC was finished, Mauchly filed for a patent in 1947. While a patent was granted, you could still count on your fingers the number of machines built at about the same time, including the Atanasoff-Berry Computer, Colossus, the Harvard Mark I, and the Z3. So luckily the patent was voided, in 1973, and digital computers are a part of the public domain. By then, the Eckert-Mauchly Computer Corporation had been acquired by Remington Rand, which merged with Sperry and is now called Unisys. 
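The Monte Carlo method Ulam developed boils down to answering a hard question by repeated random sampling. A classic toy illustration (not the neutron calculations Ulam and von Neumann actually ran) is estimating pi by throwing random points at a square:

```python
# Estimate pi by sampling random points in the unit square and counting
# how many land inside the quarter circle of radius 1. The fraction that
# lands inside approximates the quarter-circle area, pi/4.
import random

def estimate_pi(samples=100_000, seed=42):
    rng = random.Random(seed)  # seeded so the run is reproducible
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4 * inside / samples

print(estimate_pi())  # roughly 3.14
```

More samples mean a better estimate, which is exactly why the method and the machine were such a good fit: a computer can draw millions of samples where a human with a desk calculator cannot.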
The next wave of computers would be mainframes built by GE, Honeywell, IBM, and a number of other vendors, and so the era of batch processing mainframes began. The EDVAC begat the UNIVAC, with Grace Hopper brought in to write an assembler for it. Computers would become the big mathematical number crunchers and slowly spread into being data processors from there. Following decades of batch processing mainframes we would get minicomputers and interactivity, then time sharing, and then the PC revolution. Distinct eras in computing. Today, computers do far more than just the types of math ENIAC did. In fact, the functionality of ENIAC was duplicated onto a 20 megahertz microchip in 1996. You know, ‘cause the University of Pennsylvania wanted to do something to celebrate the 50th birthday, and a birthday party seemed underwhelming at the time. And so the date of release for this episode is February 15th, now ENIAC Day in Philadelphia, dedicated as a way to thank the university, creators, and programmers. And we should all reiterate that thanks. They helped put computers front and center in the thoughts of the next generation of physicists, mathematicians, and engineers who built the mainframe era. And I should thank you, for listening to this episode. I’m pretty lucky to have ya. Have a great day!

Polish Innovations In Computing


Computing In Poland

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we’re going to do something a little different. Based on a recent trip to Katowice and Krakow, and a great visit to the Museum of Computer and Information Technology in Katowice, we’re going to look at the history of computing in Poland. Something they are proud of, and should be proud of. And I’m going to mispronounce some words. Because they are averse to vowels. But not really; instead because I’m just not too bright. Apologies in advance. First, let’s take a stroll through an overly brief history of Poland itself. Attila the Hun and other conquerors pushed Germanic tribes from Poland in the fourth century, which led to a migration of Slavs from the East into the area. After a long period of migration, Duke Mieszko established the Piast dynasty in 966, and they created the Kingdom of Poland in 1025, which lasted until 1370, when Casimir the Great died without an heir. That was replaced by the Jagiellonian dynasty, which expanded until it eventually developed into the Polish-Lithuanian Commonwealth in 1569. Turns out they overextended themselves until the Russians, Prussians, and Austrians invaded and finally took control in 1795, partitioning Poland. Just before that, Polish clockmaker Jewna Jakobson built a mechanical computing machine, a hundred years after Pascal, in 1770. And innovations in mechanical computing continued on with Abraham Izrael Stern and his son through the 1800s, and with Bruno Abdank-Abakanowicz’s integraph, which could solve complex differential equations. And so the borders changed as Prussia gave way to Germany, until World War I, when the Second Polish Republic was established. And the Poles got good at cracking codes as they struggled to stay sovereign against Russian attacks. Just as they’d struggled to stay sovereign for well over a century. 
Then the Germans and Soviets formed a pact in 1939 and took the country again. During the war, Polish scientists not only assisted with work on the Enigma but also with the nuclear program in the US, the Manhattan Project. Stanislaw Ulam was recruited to the project and helped with ENIAC by developing the Monte Carlo method along with John von Neumann. The country remained partitioned until Germany fell in WWII, and the Soviets were able to effectively rule the Polish People’s Republic until a social-democratic movement swept the country in 1989, resulting in the current government and Poland moving from the Eastern Bloc to NATO and eventually the EU, around the same time the wall fell in Berlin. Able to put the Cold War behind them, Polish cities are now bustling with technical innovation, and the country is now home to some of the best software developers I’ve ever met. Polish contributions to a more modern computer science began in 1924, when Jan Lukasiewicz developed Polish Notation, a way of writing mathematical expressions such that the operator comes first. Then, during World War II, the Polish Cipher Bureau became the first to break the Enigma encryption, at different levels from 1932 to 1939. They had been breaking codes since using them to thwart a Russian invasion in the 1920s, and had a pretty mature operation at this point. But it was a slow, manual process, so Marian Rejewski, one of the cryptographers, developed a card catalog of permutations and used a mechanical computing device he had invented a few years earlier, called a cyclometer, to decipher the codes. The combination led to the bomba kryptologiczna, which was shown to the Allies five weeks before the war started, and in turn led to the Ultra program and eventually Colossus, once Alan Turing got a hold of it, conceptually, after meeting Rejewski. After the war, Rejewski became an accountant to avoid being forced into slave cryptographic work by the Russians. 
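Polish Notation is worth a quick illustration: because the operator comes first, expressions need no parentheses and can be evaluated with a simple recursive reader. A minimal sketch:

```python
# A minimal evaluator for (prefix) Polish notation, where the operator
# comes first: "+ 3 * 4 5" means 3 + (4 * 5). No parentheses needed.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a / b,
}

def eval_polish(expression):
    tokens = iter(expression.split())

    def parse():
        token = next(tokens)
        if token in OPS:
            left = parse()   # each operator's two operands follow it
            right = parse()
            return OPS[token](left, right)
        return float(token)

    return parse()

print(eval_polish("+ 3 * 4 5"))   # 23.0
print(eval_polish("* + 1 2 3"))   # 9.0
```

Reverse Polish Notation, the operator-last variant, later powered HP calculators and stack machines for exactly the same reason: the order of the symbols alone determines the order of evaluation.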
In 1948 the Group for Mathematical Apparatus of the Mathematical Institute in Warsaw was formed, and with it the academic field of computer research in Poland. Computing continued in Poland during the Soviet-controlled era. Work on the EMAL-1 started in 1953, but it was never finished. The XYZ computer came along in 1958. Jacek Karpiński built the first real vacuum tube mainframe in Poland, called the AAH, in 1957, to analyze weather patterns and improve forecasts. He then worked with a team to build the AKAT-1 to simulate lots of labor-intensive calculations like heat transfer mechanics. Karpiński founded the Laboratory for Artificial Intelligence of the Polish Academy of Sciences. He would win a UNESCO award and receive a six-month scholarship to study in the US, which the Polish government used to spy on American progress in computing. He came home armed with some innovative ideas from the West, and by 1964 built what he called the Perceptron, a computer that could be taught to identify shapes and even some objects. Nothing like that had existed in Poland, or anywhere else controlled by communist regimes, at the time. From '65 to '68 he built the KAR-65, even faster, to study CERN data. By then there was a rising mainframe and minicomputer industry outside of academia in Poland. Production of the Odra mainframe-era computers had begun in 1959 in Wroclaw, and Elwro saw his work as a threat, so they banned him from publishing for a time. Elwro built a new factory in 1968, copying IBM standardization. In 1970, Karpiński realized he had to play ball with the government and got backing from officials. He then designed the K-202 minicomputer in 1971. Minicomputers were on the rise globally, and he introduced the concept of paging to computer science, key in virtual memory. This time he recruited 113 programmers and hardware engineers, and by '73 they were using Intel 4004 chips to build faster computers than the DEC PDP-11. 
But the competitors shut him down. They only sold 30, and by 1978 he retired to Switzerland (that sounds better than fled) - but he returned to Poland following the end of communism in the country and the closing of the Elwro plant in 1989. By then the personal computing revolution was upon us. That had begun in Poland with the Meritum, a TRS-80 clone, back in 1983. More copying. But the Elwro 800 Junior shipped in 1986, and by 1990, when the communists split, the country could benefit from computers being mass produced and from the removal of export restrictions that had been stifling innovation and keeping Poles from participating in the exploding economy around computers. Energized, the Poles quickly learned to write code and now graduate over 40,000 people in IT from universities, by some counts making Poland a top-five tech country. And as an era of developers graduates, they are founding museums to honor those who built their industry. It has been my privilege to visit two of them at this point. The description of the one in Krakow reads: The Interactive Games and Computers Museum of the Past Era is a place where adults will return to their childhood and children will be drawn into lots of fun. We invite you to play on more than 20 computers / consoles / arcade machines and to watch our collection of 200 machines and toys from the '70s-'90s. The second is the Museum of Computer and Information Technology in Katowice, the most recent that I had the good fortune to visit. Both have systems found at other types of computer history museums, such as a Commodore PET, but showcase the locally developed systems. Looking at them on a timeline, it's quickly apparent that while Poland had begun to fall behind by the '80s, that was more a reflection of why the strikes throughout the country helped cause the Eastern Bloc to fall: Russian influence couldn't sustain it. Much as the Polish-Lithuanian Commonwealth couldn't support Polish control of Lithuania in the late 1700s. 
There were other accomplishments, such as the ZAM-2. And the first fully Polish machine, the BINEG. And rough set theory. And ultrasonic mercury memory.



BASIC

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate the future! Today we’re going to look at the history of the BASIC programming language. We say BASIC, but really BASIC is more than just a programming language. It’s a family of languages, and it stands for Beginner’s All-purpose Symbolic Instruction Code. As the name implies, it was written to help students who weren’t math nerds learn how to use computers. When I was selling a house one time, someone was roaming around in my back yard; apparently they’d been to an open house, and they asked if I’m a computer scientist after they saw a dozen books I’d written on my bookshelf. I really didn’t know how to answer that question. We’ll start this story with Hungarian John George Kemeny. This guy was pretty smart. He was born in Budapest and moved to the US with his family in 1940, when his family fled anti-Jewish sentiment and laws in Hungary. Some of his family would go on to die in the Holocaust, including his grandfather. But safely nestled in New York City, he would graduate high school at the top of his class and go on to Princeton. Check this out: he took a year off to head out to Los Alamos and work on the Manhattan Project under Nobel laureate Richard Feynman. That’s where he met fellow Hungarian immigrant John von Neumann - two of a group George Marx wrote about in his book on great Hungarian emigrant scientists and thinkers, called The Martians. When he got back to Princeton he would get his doctorate and act as an assistant to Albert Einstein. Seriously, THE Einstein. Within a few years he was a full professor at Dartmouth, and he would go on to publish great works in mathematics. But we’re not here to talk about those contributions to the world as an all-around awesome place. You see, by the ’60s math was evolving to the point that you needed computers. 
And Kemeny and Thomas Kurtz would do something special. Now Kurtz was another Dartmouth professor, who got his PhD from Princeton. He and Kemeny got thick as thieves and wrote the Dartmouth Time-Sharing System (keep in mind that time sharing was all the rage in the ’60s, as it gave more and more budding computer scientists access to those computer-things that, prior to the advent of Unix and the PC revolution, had mostly been reserved for the high priests of places like IBM). So time sharing was cool, but the two of them would go on to do something far more important. In 1956, they would write DARSIMCO, or Dartmouth Simplified Code. As with Pascal, you can blame Algol. Wait, no one has ever heard of DARSIMCO? Oh… I guess they wrote that other language you’re here to hear the story of as well. So in ’59 they got a half million dollar grant from the Alfred P. Sloan Foundation to build a new department building. That’s when Kurtz actually joined the department full time. Computers were just going from big batch-processed behemoths to interactive systems. They tried teaching with DARSIMCO, FORTRAN, and the Dartmouth Oversimplified Programming Experiment, a classic acronym for 1960s-era DOPE. But they didn’t love the command structure, nor the fact that the languages didn’t produce feedback immediately. What was it called? Oh, so in 1964, Kemeny wrote the first iteration of the BASIC programming language and Kurtz joined him very shortly thereafter. They did it to teach students how to use computers. It’s that simple. And as most software was free at the time, they released it to the public. We might think of this as open source-ish by today’s standards. I say -ish, as Dartmouth actually chose to copyright BASIC. Kurtz has said that the name BASIC was chosen because “We wanted a word that was simple but not simple-minded, and BASIC was that one.” The first program I wrote was in BASIC. BASIC used line numbers and read kinda’ like the English language. 
The first line of my program said: 10 print “Charles was here” And the computer responded that “Charles was here” - the second program I wrote just added a second line that said: 20 goto 10 Suddenly “Charles was here” took up the whole screen, and I had to ask the teacher how to terminate the signal. She rolled her eyes and handed me a book. And that, my friend, was the end of me for months. That was on an Apple IIc. But a lot happened with BASIC between 1964 and then. As with many technologies, it took some time to float around and evolve. The syntax was kinda’ like a simplified FORTRAN, making my FORTRAN classes in college a breeze. That initial distribution evolved into Dartmouth BASIC, and they received a $300k grant and used student slave labor to write the initial BASIC compiler. Mary Kenneth Keller was one of those students, and went on to finish her doctorate in ’65 along with Irving Tang, becoming the first two PhDs in computer science. After that she went off to Clarke College to found their computer science department. The language is pretty easy. I mean, like Pascal, it was made for teaching. It spread through universities like wildfire during the rise of minicomputers like the PDP from Digital Equipment and the resultant Data General Nova. This led to the first text-based games in BASIC, like Star Trek. And then came the Altair and one of the most pivotal moments in the history of computing: the porting of BASIC to the platform by Microsoft co-founders Bill Gates and Paul Allen. But Tiny BASIC had appeared a year before, and suddenly everyone needed “a BASIC.” You had Commodore BASIC, BBC BASIC, BASIC for the Trash-80, the Apple II, Sinclair, and more. Programmers from all over the country had learned BASIC in college on minicomputers, and when the PC revolution came, a huge part of that was the explosion of applications, most of which were written in… you got it, BASIC! 
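That line-number-and-GOTO style is easy to sketch: a program is just a map from line numbers to statements, and GOTO changes which line runs next. This toy interpreter (a tiny invented subset, not any real BASIC) even caps the number of steps, playing the role of the teacher who finally terminated my program:

```python
# A toy interpreter for a two-statement, line-numbered BASIC subset.
def run_basic(program, max_steps=10):
    lines = sorted(program)          # statements execute in line-number order
    output = []
    index = 0
    steps = 0
    while index < len(lines) and steps < max_steps:
        statement = program[lines[index]]
        if statement.startswith("PRINT "):
            output.append(statement[6:].strip('"'))
            index += 1               # fall through to the next line
        elif statement.startswith("GOTO "):
            index = lines.index(int(statement[5:]))  # jump to that line
        steps += 1
    return output

# 10 PRINT "Charles was here"
# 20 GOTO 10
program = {10: 'PRINT "Charles was here"', 20: "GOTO 10"}
print(run_basic(program, max_steps=6))
```

Without the step cap, this loops forever, which is exactly what filled my screen on that Apple IIc.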
I typically think of the end of BASIC coming in 1991, when Microsoft bought Visual Basic off of Alan Cooper and object-oriented programming became the standard. But the things I could do with a simple IF-THEN-ELSE statement. Or a FOR-TO statement, or a WHILE or REPEAT or DO loop. Absolute values, exponential functions, cosines, tangents, even super-simple random number generation. And input and output was just INPUT and PRINT, or LIST for source. Of course, functional programming was always simpler and more approachable. So there, you now have Kemeny as a direct connection between Einstein and the modern era of computing. Two immigrants who helped change the world. One famous, the other with a slightly more nuanced but probably no less important impact in a lot of ways. Those early BASIC programs opened our eyes. Games, spreadsheets, word processors, accounting, human resources, databases. Kemeny would go on to chair the commission investigating Three Mile Island, a partial nuclear meltdown that was a turning point for nuclear power. I wonder what Kemeny thought when he read the following on the Statue of Liberty: Give me your tired, your poor, Your huddled masses yearning to breathe free, The wretched refuse of your teeming shore. Perhaps, like many before and after, he thought that he would breathe free and, with that breath, do something great, helping bring the world into the nuclear era and preparing thousands of programmers to write software that would change the world. When you wake up in the morning, you have crusty bits in your eyes and things seem blurry at first. You have to struggle just a bit to get out of bed and see the sunrise. BASIC got us to that point. And for that, we owe them our sincerest thanks. And thank you, dear listeners, for your contributions to the world in whatever way they may be. You’re beautiful. And of course thank you for giving me some meaning on this planet by tuning in. We’re so lucky to have you, have a great day!

Agile Software Development


Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on agile software development. Agile software development is a methodology, or anti-methodology, or approach to software development that evolves the requirements a team needs to fulfill and the solutions they need to build in a collaborative, self-organized, and cross-functional way. Boy, that’s a lot to spit out there. I was in an elevator the other day and I heard someone say: “That’s not very agile.” And at that moment, I knew that I just couldn’t help but do an episode on agile. I’ve worked in a lot of teams that use a lot of variants of agile: scrum, Kanban, scrumban, Extreme Programming, Lean Software Development. Some of these are almost polar opposites, and you still hear people talk about what is agile, and if they want to make fun of people doing things an old way, they’ll say something like “waterfall.” Nothing ever was waterfall, given that you learn on the fly, find re-usable bits, or hit a place where you just say that’s not possible. But that’s another story. The point here is that agile is, well, weaponized to back up what a person wants someone to do. Or how they want a team to be run. And it isn’t always done from an informed point of view. Why is agile an anti-methodology? Think of it more like a classification, maybe. There were a number of methodologies, like Extreme Programming, Scrum, Kanban, Feature Driven Development, Adaptive Software Development, RAD, and Lean Software Development, that had come out to bring shape around a very similar idea. But over the course of 10-20 years, each had been developed in isolation. In college, I had a computer science professor who talked about “adaptive software development” from his days at a large power company in Georgia back in the ’70s. 
Basically, you are always adapting what you’re doing based on speculation about how long something will take, collaboration on that observation, and what you learn while actually building. This shaped how I view software development for years to come. He was already making fun of waterfall methodologies, or a cycle where you write a large set of requirements and stick to them. Waterfall worked well if you were building a computer to land people on the moon. Moving away from it was a way of saying “we’re not engineers, we’re software developers.” Later in college, with the rapid proliferation of the Internet and computers into dorm rooms, I watched the emergence of rapid application development, where you let the interface requirements determine how you build. But once someone weaponized that by putting a label on it, or worse, forking the label into spiral and unified models, it became much less useful and the next hot thing had to come along. Kent Beck built a methodology called Extreme Programming - or XP for short - in 1996, and that was the next hotness. Here, we release software in shorter development cycles, and software developers, like police officers on patrol, work in pairs, reviewing and testing code and not writing each feature until it’s required. The idea of unit testing and rapid releasing really came out of the fact that the explosion of the Internet in the ’90s meant people had to ship fast, and this was also during the rise of really mainstream object-oriented programming languages. The nice thing about XP was that you could show a nice graph where you planned, managed, designed, coded, and tested your software. The rules of Extreme Programming included things like “Code the unit test first” and “A stand up meeting starts each day.” Extreme Programming is one of these methodologies. Scrum is probably the one most commonly used today. But the rest, as well as the Crystal family of methodologies, are now classified as agile software development methodologies. 
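“Code the unit test first” is simple enough to show in miniature. In this sketch (a made-up example, not from XP’s literature), the test is written before the function it exercises, and then just enough code is written to make the test pass:

```python
# XP's test-first rule in miniature: the test exists before the code.
def test_fahrenheit_to_celsius():
    assert fahrenheit_to_celsius(32) == 0
    assert fahrenheit_to_celsius(212) == 100

# Only then is just enough code written to make the test pass.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

test_fahrenheit_to_celsius()
print("tests pass")
```

The discipline isn’t the conversion formula, of course; it’s that the failing test defines “done” before a line of production code exists.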
So it’s like a parent. Is agile really just a classification, then? No. So where did agile come from? By 2001, Kent Beck, who developed Extreme Programming, met with Ward Cunningham (who built WikiWikiWeb, the first wiki), Dave Thomas, a programmer who has since written 11 books, and Jeff Sutherland and Ken Schwaber, who designed Scrum. Jim Highsmith, who developed that Adaptive Software Development methodology, and many others were at the time involved in trying to align on an organizational methodology that allowed software developers to stop acting like people who built bridges or large buildings. Most had day jobs, but they were like-minded and decided to meet at a quaint resort in Snowbird, Utah. They might have all wanted to push the methodologies that each of them had developed. But if they had all been jerks, then they might not have had a shift in how software would be written for the next 20+ years. They decided to start with something simple: a statement of values. Instead of bickering and being dug into specific details, they were all able to agree that software development should not be managed in the same fashion as engineering projects are run. So they gave us the Manifesto for Agile Software Development. The Manifesto reads: We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value: * Individuals and interactions over processes and tools * Working software over comprehensive documentation * Customer collaboration over contract negotiation * Responding to change over following a plan That is, while there is value in the items on the right, we value the items on the left more. But additionally, the principles dig into and expand upon some of that adjacently. The principles behind the Agile Manifesto: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software. Welcome changing requirements, even late in development. 
Agile processes harness change for the customer's competitive advantage. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale. Business people and developers must work together daily throughout the project. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation. Working software is the primary measure of progress. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely. Continuous attention to technical excellence and good design enhances agility. Simplicity--the art of maximizing the amount of work not done--is essential. The best architectures, requirements, and designs emerge from self-organizing teams. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly. Many of the words here are easily weaponized. For example, “satisfy the customer.” Who’s the customer? The product manager? The end user? The person in an enterprise who actually buys the software? The person in that IT department who made the decision to buy the software? In the scrum methodology, the customer is not known; the product owner is their representative. But the principles don’t need to identify that; they just use the word, and each methodology makes sure to cover it. Now take “continuous delivery.” People frequently just lump CI in there with CD. I’ve heard continuous design, continuous improvement, continuous deployment, continuous podcasting. Wait, I made the last one up. We could spend hours going through each of these and identifying where they aren’t specific enough. 
Or, again, we could revel in their lack of specificity, which points us in the direction of a methodology where these words get much more specific meanings. Ironically, I know accounting teams at very large companies that have scrum masters, engineering teams for big projects with both a project manager and a scrum master, and even a team of judges that use agile methodologies. There are now scrum masters embedded in most software teams of note. But once you see Agile on the cover of the Harvard Business Review (and you hate to say this, given all the classes in agile/XP/scrum), you have to start wondering what’s next. For 20 years, we’ve been saying “stop treating us like engineers” or “that’s waterfall.” Every methodology seems to grow. Right after I finished my PMP I was on a project with someone else who had just finished theirs. I think they tried to implement the entire Project Management Body of Knowledge. If you try to have every ceremony from Scrum, you’re not likely to have even half a day left over to write any code. But you also don’t want to be like the person on the elevator, weaponizing only small parts of a larger body of work just to get your way. And more importantly, we should admit that none of us have all the right answers and be ready to, as they say in Extreme Programming, fix XP when it breaks - which is similar to Boyd’s Destruction and Creation, or the sustenance and destruction in Lean Six Sigma. Many of us forget that last part: be willing to walk away from the dogma and start over. Thomas Jefferson called for a revolution every 20 years. We have two years to come up with a replacement! And until you replace me, thank you so very much for tuning into another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day!



Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on the history of Wikipedia. The very idea of a single location that could store all the known information in the world began with Ptolemy I, founder of the Greek dynasty that ruled Egypt following the death of Alexander the Great. He and his son amassed hundreds of thousands of scrolls in the Library of Alexandria from 331 BC on. The Library was part of a great campus, the Musaeum, where they also supported great minds, starting with Ptolemy I’s patronage of Euclid, the father of geometry, and later including Archimedes, the father of engineering, Hipparchus, the founder of trigonometry, Hero, the father of math, and Herophilus, who gave us the scientific method, along with countless other great Hellenistic thinkers. The Library entered into a slow decline that began with the expulsion of intellectuals from Alexandria in 145 BC; Ptolemy VIII was responsible for that. Always be wary of people who attack those they can’t win over, especially when they start blaming the intellectual elite for the problems of the world. The decline continued until the Library burned, first in a small fire accidentally set by Caesar in 48 BC and then for good in the 270s AD. In the centuries since, there have been attempts here and there to gather great amounts of information. The first known encyclopedia was the Naturalis Historia by Pliny the Elder, never completed because he was killed in the eruption of Vesuvius. One of the better known is the Encyclopedia Britannica, which started off in 1768. Mass production of these was aided by the printing press, but the cost of producing those materials, and the margin to be made in selling them, encouraged a somewhat succinct exploration of certain topics. 
The advent of the computer era of course led to encyclopedias on CD and then to online encyclopedias. Encyclopedias at the time employed experts in certain fields and paid them for compiling and editing articles for volumes that would then be sold. As we say these days, this was a business model just waiting to be disrupted. Jimmy Wales was moderating an online discussion board on Objectivism when he happened across Larry Sanger in the early 90s. They debated and became friends. Wales started Nupedia, which was supposed to be a free encyclopedia funded by advertising revenue. As it was to be free, they were to recruit thousands of volunteer editors: people of the caliber that had previously been hired to research and write articles for encyclopedias. Sanger, who was pursuing a PhD in philosophy from Ohio State University, was hired on as editor-in-chief. This was a twist on the old model of compiling an encyclopedia, and a twist that didn’t work out as intended. Volunteers were slow to sign up, but Nupedia went online in 2000. By later in the year, only two articles had made it through the review process. When Sanger told Ben Kovitz about this, he recommended looking at the emerging wiki culture. This had been started with WikiWikiWeb, developed by Ward Cunningham in 1994 and named after a shuttle bus that ran between airport terminals at the Honolulu airport. WikiWikiWeb had been inspired by HyperCard but needed to be multi-user so people could collaborate on web pages, quickly producing content on new patterns in programming. He wanted to make non-writers feel OK about writing. Sanger proposed using a wiki to accept submissions for articles and edits from anyone, while still having a complicated review process to accept changes. The reviewers weren’t into that, so they started a side project they called Wikipedia in 2001, with a user-generated model for content, or article, generation. 
The plan was to generate articles on Wikipedia and then move or copy them into Nupedia once they were ready. But Wikipedia got mentioned on Slashdot. In 2001 there were nearly 30 million websites and half a billion people using the web. Back then a mention on the influential Slashdot could make a site. And it certainly helped. They grew, and more and more people started to contribute. They hit 1,000 articles in March of 2001, a number that increased tenfold by September, and another fourfold the next year. Wikipedia started working independently of Nupedia. The dot-com bubble had burst in 2000, and by 2002 Nupedia had to lay Sanger off; he left both projects. Nupedia slowly died and was finally shut down in 2003. Eventually the Wikimedia Foundation was built to help unlock the world’s knowledge, and it now owns and operates Wikipedia. Wikimedia also includes Commons for media; Wikibooks, which includes free textbooks and manuals; Wikiquote for quotations; Wikiversity for free learning materials; MediaWiki, the source code for the site; Wikidata, for pulling large amounts of data from Wikimedia properties using APIs; Wikisource, a library of free content; Wikivoyage, a free travel guide; Wikinews, free news; and Wikispecies, a directory containing over 687,000 species. Many of the properties have very specific ways of organizing data, making it easier to work with en masse. The properties have grown because people like to be helpful and Wales allowed self-governance of articles. To this day he rarely gets involved in the day-to-day affairs of the Wikipedia site, other than the occasional puppy-dog looks in banners asking for donations. You should donate. He does have 8 principles the site is run by: 1. Wikipedia’s success to date is entirely a function of our open community. 2. Newcomers are always to be welcomed. 3. “You can edit this page right now” is a core guiding check on everything that we do. 4. Any changes to the software must be gradual and reversible. 5. The open and viral nature of the GNU Free Documentation License and the Creative Commons Attribution/Share-Alike License is fundamental to the long-term success of the site. 6. Wikipedia is an encyclopedia. 7. Anyone with a complaint should be treated with the utmost respect and dignity. 8. Diplomacy consists of combining honesty and politeness. This culminates in the five pillars Wikipedia is built on: 1. Wikipedia is an encyclopedia. 2. Wikipedia is written from a neutral point of view. 3. Wikipedia is free content that anyone can use, edit, and distribute. 4. Wikipedia’s editors should treat each other with respect and civility. 5. Wikipedia has no firm rules. Sanger went on to found Citizendium, which uses real names instead of handles, thinking maybe people will contribute better content if their name is attached to it. The web is global, and throughout history there have been encyclopedias produced around the world, with the Four Great Books of Song coming out of 11th-century China and the Encyclopedia of the Brethren of Purity coming out of 10th-century Persia. When Wikipedia launched, it was in English. Wikipedia then launched a German version using the deutsche.wikipedia.com subdomain; it now lives at de.wikipedia.org. Wikipedia has gone from being 90% English to being almost 90% non-English, meaning that Wikipedia is able to pull in even more of the world’s knowledge. Wikipedia picked up nearly 20,000 English articles in 2001 and over 75,000 new articles in 2002, and that number steadily climbed, reaching over 3,000,000 by 2010; we’re closing in on 6 million today. The English version is 10 terabytes of data uncompressed. If you wanted to buy a printed copy of Wikipedia today, it would be over 2,500 books. By 2009 Microsoft Encarta had shut down. By 2010 Encyclopedia Britannica stopped printing their massive set of books and went online. You can still buy encyclopedias from specialty makers, such as World Book. 
Ironically, Encyclopedia Britannica does now put the real names of people on articles they produce on their website, in an ad-driven model. There are a lot of ads. And the content isn’t linked to in as many places, nor is it as thorough. Creating a single location that could store all the known information in the world seems like a pretty daunting task. Compiling the non-copyrighted works of the world is now the mission of Wikipedia. The site receives the fifth most views of any on the web and is read by nearly half a billion people a month, generating over 15 billion page views. Anyone who has gone down the rabbit hole of learning about Ptolemy I’s involvement in developing the Library of Alexandria, and then read up on his children and how his dynasty lasted until Cleopatra and how… well, you get the point… can understand how it gets so much traffic. Today there are over 48,000,000 articles and over 37,000,000 registered users who have contributed to them, meaning that if we set 160 Great Libraries of Alexandria side by side, we would have about the same amount of information Wikipedia has amassed. And it’s done so because of the contributions of so many dedicated people: people who spend hours researching and building pages, taking care to provide references to cite the data in the articles (by the way, Wikipedia is not supposed to contain original research); more people who patrol for content contributed by someone on a soapbox or with an agenda rather than just reporting the facts; and another team looking for articles that need more information. And they do these things for free. While you can occasionally see frustration from contributors, it is truly one of the best things humanity has done. 
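The 160-libraries comparison is easy to sanity-check with back-of-the-envelope arithmetic. This is just a sketch under stated assumptions: it takes 300,000 as a round figure for a Great Library holding “hundreds of thousands” of scrolls, and treats one scroll as roughly equivalent in information to one article.

```python
# Rough sanity check of the "160 Great Libraries of Alexandria" comparison.
# Assumptions (not from the episode): ~300,000 scrolls per library, and one
# scroll treated as roughly equivalent to one article.
articles = 48_000_000          # Wikipedia article count cited in the episode
scrolls_per_library = 300_000  # assumed scrolls held by one Great Library

libraries_equivalent = articles / scrolls_per_library
print(libraries_equivalent)    # 160.0
```

With those round numbers, the division lands exactly on the 160 figure quoted above.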
This allows us to rediscover our own history, effectively compiling all the facts that make up the world we live in, often linked to the opinions that shape them in the reference materials, which include the over 200 million works housed at the US Library of Congress and the over 25 million books scanned into Google Books (out of about 130 million). As with the Great Library of Alexandria, we do have to keep away those who seek to throw out the intellectuals of the world, and keep the great works being compiled from falling to waste due to inactivity. Wikipedia keeps a history of pages, to avoid revisionist history. The servers need to be maintained, but the database can be downloaded, and it routinely is, by plenty of people. I think the idea of providing an encyclopedia for free, sponsored by ads, was sound. Pivoting the business model to make it open was revolutionary. With the availability of the data for machine learning and the ability to enrich it with other sources like genealogical research, actual books, maps, scientific data, and anything else you can manage, I suspect we’ll see contributions we haven’t even begun to think about! And thanks to all of this, we now have a real compendium of the world’s knowledge, getting more accurate and holistic by the day. Thank you to everyone involved, from Jimbo and Larry, to the moderators, to the staff, and of course to the millions of people who contribute pages about all the history that makes up the world as we know it today. And thanks to you for listening to yet another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day! Note: This work was produced in large part due to the compilation of historical facts available at https://en.wikipedia.org/wiki/History_of_Wikipedia

Open Atari


In this episode of ANTIC The Atari 8-bit Computer Podcast: Mike Maginnis of the Open Apple and Drop III Inches podcasts joins the Antic crew and starts a computer war, Nir Dary tells us about disk drive upgrades, we catch up with Curt Vendel about his projects including the 2nd Atari history book, and more Atari news than you can possibly imagine!


Recurring Links

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”

New Atari books scans at archive.org

ANTIC feedback at AtariAge

Atari interview discussion thread on AtariAge

ANTIC Facebook Page


Eaten By a Grue  

What we’ve been up to



YouTube videos this month


End of Show Music

Possible side effects of listening to the Antic podcast include stuffy nose, sneezing, sore throat; drowsiness, dizziness, feeling nervous; mild nausea, upset stomach, constipation;

increased appetite, weight changes; insomnia, decreased sex drive, impotence, or difficulty having an orgasm; dry mouth, intense hate of Commodore, and Amiga lust. Certain conditions apply. Offer good for those with approved credit. Member FDIC. An equal housing lender.

The Atari 8-bit Podcast - Curt Vendel & Dennis Harkins



On this episode of Antic, the Atari 8-bit Podcast: an interview with Curt Vendel, Atari historian and co-author of “Atari, Inc: Business Is Fun” … and  an interview with Dennis Harkins, author of the APX program Message Display Program … and how a pack of bubble gum led to life with an Atari and a career in computers.


Links mentioned in this episode:


Recurring Links

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”


New Atari books scans at archive.org

ANTIC feedback at AtariAge


What We’ve Been Up To

"Compute's Atari Collection Volume 1"

"CoCo: The Colorful History of Tandy's Underdog Computer" by Boisy G Pitre and Bill Loguidice

“Sophistication and Simplicity, The Life and Times of the Apple II Computer” by Steven Weyhrich



Vintage Computer Festival Southeast (VCFSE) 2.0

Intellivisionaries Podcast

Movie Musical Madness

Kevin's black metal 850 interface

"A Mind Forever Voyaging - a history of storytelling in video games"

Atari User Magazine

Kevin's Atari 400/800 Posters



VCF East 9.1

Atari Gamer Magazine

Atari Gamer Promotion on YouTube

30th Anniversary Edition of Boulder Dash

TapStar Interactive

Terry Stewart (Tez) HD remake  of Atari 400 Video on YouTube

Band Of Outsiders Atari Clothing Article

Band of Outsiders Website

Atari 800 mentioned on Colbert Report Video

Atari 800 on Colbert Report Discussion on AtariAge

Google and YouTube Atari Easter Eggs

Learning Curve programming articles

Learning Curve Discussion on AtariAge

Nolan Bushnell interview

NOMAM 2014 programming contest for 10 line games

Bill Kendrick - Paddleship Entry for NOMAM

Archive.org Computer Magazines

Archive.org Computer Newsletters

Archive.org Game Magazines

Archive.org Manuals

Archive.org The Business Case: Applications and Programs for the Home Office




Atari Encyclopedia

History of Atari Computers from CIO Magazine

Bits of the Past store


Interview - Curt Vendel

Atari Museum

Atari History Book Website

“Atari Inc.: Business is Fun (Volume I)” by Curt Vendel, Marty Goldberg at Amazon


Interview - Dennis Harkins

Unedited Dennis Harkins Interview



Taste My Beeper 1-bit GTIA Music

Mash-up of the Beastie Boys and the music from Ballblazer


The Atari 8-bit Podcast - Chris Crawford



In this episode of Antic, the Atari 8-bit Podcast, an interview with Chris Crawford, author of Eastern Front 1941; we rescue Atari hardware and TI 99/4a hardware; we find a new source for reliable Atari power supplies; and we take a look at an Atari emulator that works in your web browser.

Links mentioned in this episode:

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”

Atlanta Historical Computing Society

Vintage Computer Festival MW 8.0

New Atari books scans at archive.org

ANTIC feedback at AtariAge

Atari vintage commercial at YouTube

Phoenix Art Museum - Art of Video Games Exhibit

Commodore Computer Club

New book projects announced

Chris Crawford Eastern Front Source Code and More

Follow-up to Atari Bankruptcy Saga

JSMESS Atari Emulation in a Browser

4MB Flash MegaCart Web Site

4MB Flash MegaCart Discussion on AtariAge

Atari Party 2013 Pictures

More Atari Party 2013 Pictures

Atari Computer USB Power Adapter Cable on eBay

Atari Computer Replacement Power Supply on eBay

Atari Computer Power Supply Discussion on AtariAge

GTIABlast! Demo Site

GTIABlast! GTIA Mode 10 Video on YouTube

GTIABlast! GTIA Mode 11 Video on YouTube

Atari Software Competition 2013 Web site

Atari Software Competition 2013 Discussion on AtariAge

Atari Box Art Article on The Verge

Atari User Magazine Site

Atari User Magazine at Magcloud

Atari User Magazine at Lulu

Starring the Computer

B&C ComputerVision

Atari Legacy Group on LinkedIn

Full Chris Crawford Interview






The Atari 8-bit Podcast - JD Casten & Steve Wilds



On this episode of Antic, the Atari 8-bit Podcast: an interview with JD Casten, Antic magazine’s prolific game author, an interview with Steve Wilds, editor of Atari User Magazine … and lots of retrogaming news and reviews.


Links mentioned in this episode:


Recurring Links

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”

New Atari books scans at archive.org

ANTIC feedback at AtariAge


What We’ve Been Up To


Turbo-BASIC XL at Page6.org

Turbo-BASIC XL Expanded Documentation

More Turbo-BASIC XL Information


Retro Gamer Magazine

Vintage Computer Festival Southeast (VCFSE) 2.0



VCF East 9.1

Seattle Retro Gaming Expo

Classic Gaming Expo

Portland Retro Gaming Expo

Retro Gamer Magazine picks top 10 Atari 8-bit games

Floppy Bird Article

Floppy Bird Download

Stampede Article

Stampede Download

Perplexity Article

Perplexity Download

Retro Gaming Magazine

Nolan Bushnell Article on using Games to Teach

Atari Dump Dig Update

Tablet-Friendly Revamp for Hitchhiker's Guide to the Galaxy

“Vintage Game Consoles: An Inside Look at Apple, Atari, Commodore, Nintendo, and the Greatest Gaming Platforms of All Time” by Bill Loguidice and Matt Barton

TitanFall Arcade

Discussion on AtariAge about Buying Atari

Archive.org BusinessCase


Software of the Month

Synapse Software Syn Business Application Series


Hardware of the Month

Atari XE Game System (XEGS)


Website of the Month

Atari Museum

Facebook Page for Atari Museum


Interview - JD Casten

JD Casten Website


Interview - Steve Wilds

Atari User Magazine


The Atari 8-bit Podcast - Gray Chang & Jonathan Halliday


On this episode of Antic, the Atari 8-bit Podcast: we delve into the SIDE2 compact flash interface, look at arcade games ported to the 8-bits, discuss another new Atari podcast, and feature interviews with Gray Chang, author of Claim Jumper, and Jonathan Halliday, creator of the new Atari GUI.


Links mentioned in this episode:


Recurring Links

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”

New Atari books scans at archive.org

ANTIC feedback at AtariAge


What We’ve Been Up To

VCF Midwest 9.0

Jim Brain Retro Innovations

iTalk II Video on YouTube

Atari 800 with Encore Video Productions Info Display System

Covox VoiceMaster Video on YouTube



Atlanta Maker Faire




Retro Gamer Magazine

New Atari 8-bit Podcast Inverse Atascii

Mini Atari 800XL with Atari 1050 disk drive (3D printed) at MakerBot

Mini Atari 800XL with Atari 1050 disk drive (3D printed) Blog

Mini Atari 400 (3D printed) at MakerBot

Mini Atari 400 (3D printed) Blog

ABBUC 2014 Hardware contest entries

SIO2BT (SIO to Bluetooth) at YouTube

SIO2BT Discussion at AtariAge

New keyboard interface for Atari 8-bit

WUDSN Atari 8-bit cross-compiling

New Cover for the 2nd Edition of Atari Inc. - Business Is Fun

Nolan Bushnell Reddit AMA

Atari User Magazine

HTML5 version of the classic Star Raiders that runs in your browser 


Bill’s Modern Segment

Asteroids Emulator at AtariMania

Norbert's Emulators page: Asteroids Emulator for the Atari 800XL

YouTube: Asteroids emulator on the Atari 800XL

Pac-Man Arcade Orders at AtariAge

AtariAge Forum: "Pac-man Update for Atari 8-bit"

The Pac-Man Dossier


Software of the Month



Hardware of the Month



Website of the Month

Lotharek’s Lair



McDonald’s Atari Commercial

AtariBBS by Thom Cherryhomes

AtariBBS ATA and ASC welcome screens

AtariBBS BBSConf status

AtariBBS User Module

AtariBBS filemenu functionality

AtariBBS flatmsg board functionality 


Interview - Gray Chang

Gray Chang Website

another interview with Gray

archive.org full version

Download APX programs


Interview - Jonathan Halliday

GUI Videos

Jonathan’s Website





The Atari 8-bit Podcast - Live from VCFSE 2.0!



On this episode of Antic, the Atari 8-bit podcast, we broadcast live from Vintage Computer Festival Southeast 2.0, interview attendees with Atari stories, find out who's going to win the grand prize for the quiz show (hint: It's someone you may know!), and answer questions from the audience.  Come join us for the most fun-packed show we've had yet!  READY

Links mentioned in this episode:

Recurring Links

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”

New Atari books scans at archive.org

ANTIC feedback at AtariAge


Vintage Computer Festival Southeast 2.0
Serge's Boxed Atari Collection

The Atari 8-bit Podcast - Darren Doyle & Michael Current


On this episode of Antic, the Atari 8-bit Podcast: Randy does a horrible impersonation of Rod Serling, we talk with Darren Doyle of Atari Gamer Magazine, have a discussion with Michael Current of the Atari 8-bit FAQ, and give you the scoop on Vintage Computer Festival Southeast 2.0. Also, Kevin gives excuses about why his alien voice box isn’t working...still.

Links mentioned in this episode:


Recurring Links

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”

New Atari books scans at archive.org

ANTIC feedback at AtariAge


What We’ve Been Up To

VCF Southeast 2.0

VCFSE 2.0 Kickstarter

The Future Was Here: The Commodore Amiga by Jimmy Maher

Finding The Next Steve Jobs by Nolan Bushnell and Gene Stone

The Making of Karateka: Journals 1982-1985 by Jordan Mechner

Kevin's 10-line Contest Entry: Abduction

Kevin's 10-line Contest Entry: Joy Joy Revolution

CoCoFest 2014

Atari 5200 Information on Wikipedia




The Art of Atari: From Pixels to Paintbrush

City Updates Agreement for Atari Dump Dig

Retro Gamer Magazine
Southern-Fried Gameroom Expo

Classic Console & Arcade Gaming Show 2014

Video Game Summit

High Score Club (HSC) on AtariAge  - 11th season

Article: Learn more about the legends of game design from GDC 1997

Video: Learn more about the legends of game design from GDC 1997

New ACUSOL language being developed for the Atari 8-bit, discussion on AtariAge

Action! Language for the Atari

Atari Casino

More Atari Casino

Bushnell could have been rich!

ColecoVisions Podcast Forum

Colecovisions Podcast Show Notes

Dennis Harkins Atari Papers

Archive.org - MicroTimes magazine

Atari 800 on v1n1 - interview with Free Fall (Archon)’s creators Jon Freeman and Anne Westfall

BBS land

Atari 520 ST First Impressions, Preview of Amiga

Brøderbund Software interview

Mindset computer


Website of the Month

Atari Mail Archive


Software of the Month (Software Automatic Mouth, SAM)

SAM Manual, disk image, and MP3s

SAM Online simulator

SAM Creator SoftVoice



Hardware of the Month (VoiceBox Speech Synthesizer by The Alien Group)

Ad for VoiceBox

AtariAge discussion


Listener Feedback

Ten Pence Arcade Podcast

Atari Technical Information Maintained by Dan


Interview - Darren Doyle

Atari Gamer Magazine

Homebrew Heroes Magazine


Interview - Michael Current

Unedited version of the interview (1 hour)

Michael Current’s web site

Atari 8-Bit Computers: Frequently Asked Questions

Atari 8-Bit Computers: Vendors and Developers list

Welcome to comp.sys.atari.8bit!

Atari History Timelines

St. Paul Atari Computer Enthusiasts (SPACE)

The Atari 8-bit Podcast - Kieren Hawken & Dale Yocum


On this episode of Antic, the Atari 8-bit Podcast: interviews with Atari author and enthusiast Kieren Hawken, and Dale Yocum, the guy who thought up the Atari Program Exchange. And Bill Kendrick complains and ends up with his own segment, reviewing Space Harrier. . . And we don’t talk about the Atari Dump Dig.


Links mentioned in this episode:


Recurring Links

Floppy Days Podcast



Kevin’s Book “Terrible Nerd”

New Atari books scans at archive.org

ANTIC feedback at AtariAge


What We’ve Been Up To

Maker Faire Atlanta




Briel Computers

Ten Pence Arcade


Pro(c) Magazine -Euro 5,00 / World incl. postage. Payment by PayPal to 8bit@proc-atari.de



Nolan Bushnell interview on Retro Obscura - Discussion on AtariAge

Nolan Bushnell interview on Retro Obscura

Dump Dig movie trailer. movie to be titled “Atari: Game Over”

RetroChallenge 2014

RetroChallenge 2014 - Earl Evans’ entry

Atari SAP Music Archive

Classic Gaming Expo

VCF Midwest

Portland Retro Gaming Expo

Player/Missile Atari Podcast

Translating ATASCII text files to ASCII text files on AtariAge

"Invenies Verba" for Atari 8-bit by Bill Kendrick on YouTube

Archive.org Atari Emulator Screenshots


Bill Kendrick’s Modern Segment

Chris Hutt's website (Wayback Machine archive)
Chris Hutt's YouTube channel
Release announcement on AtariAge forums (with video and download link)
AtariMania entry


Software of the Month

Atari 800 Best Game Pack


Hardware of the Month

Atari-styled USB Joystick


Website of the Month

Once Upon Atari



Listener Feedback

James Hague’s DaisyPop iPhone Game on iTunes

Computer Art and Animation: A User's Guide to Atari LOGO



Interview - Kieren Hawken

Retro Video Gamer

Homebrew Heroes

Revival Retro Event

ROM Retro Event

Retro Gamer Magazine

Atari User

Nolan Bushnell Interview by Kieren on YouTube


Interview - Dale Yocum

Unedited version of Interview at Archive.org



Atari Tape Music


The Simpsons Screen Saver


I didn't do it. Wait, no, I finally did. This week: Berkeley Systems' Simpsons After Dark Screensaver.



This week: Chris Pirih's SkiFree and some wonderful listener e-mail!

Castle of the Winds


This week we discuss Rick Saada's role-playing game Castle of the Winds, and listen to some wonderful listener e-mail!



This week we discuss the Maxis simulator classic SimFarm, as well as story errata from a prior podcast and listener mail.

This Episode is Under Construction


Three days of jackhammering and concrete cutting outside of my window means episode 6 is delayed until the madness stops.

Indiana Jones and his Desktop Adventures


This is a big fat episode! In this episode you'll learn about the poorly-selling LucasArts roguelite, warezing over ISDN lines using DCC bots, and I summarize an interview with creator Hal Barwood.

San Diego Zoo's - The Animals


San Diego Zoo Presents: The Animals! The Multimedia PC specification, a short discussion of Blender magazine, and memories of Video for Windows.

A Father's Day Story


A short episode to wish you all a happy Father's Day, and share a little fatherly story of my own.

The Adventures of MicroMan


Our first foray into the world of Windows 3.1 gaming: The Adventures of MicroMan. I reflect on the history of this much loved cult classic, and talk a little about its creator. You can download the original shareware version of and even play it in-browser. Read technical information on and check out Brian Goble's

An Introduction to Windows 3.1


In my inaugural episode I talk about my first Windows 3.1 computer and sketch out some ideas for future episodes.

The History of After Dark


Feeling totally twisted? I know I am! This week: The history of Berkeley Systems and its After Dark suite.

KansasFest Diary


KansasFest 2014 through the eyes of a first-timer!

The Apple II (Part III)


Third part on the Apple II:

  • News

  • New acquisitions

  • Feedback

  • Books, Software, Modern Upgrades, Online Stores, Emulation, Current Web Sites

  • Special guest host Carrington Vanston!!

Items mentioned in this episode:


New Acquisitions

Vintage Computer Shows


  • Compute’s First, Second and Third Book of Apple

  • Apple II User’s Guide by Lon Poole

  • Programming Surprises & Tricks for your Apple II/IIe Computer by David L. Heiserman

  • AppleSoft Tutorial from Apple, Inc. - based on Apple II BASIC Programming Manual by Jef Raskin; rewritten for AppleSoft by Caryl Richardson

  • Beneath Apple DOS by Don Worth and Peter Lechner - Beneath Apple DOS is intended to serve as a companion to Apple's DOS Manual, providing additional information for the advanced programmer or the novice Apple user who wants to know more about the structure of diskettes.

  • Apple II/IIe Computer Graphics by Ken Williams, founder and CEO of Sierra On-Line Inc

  • AppleSoft BASIC Toolbox by Larry Wintermeyer

  • Apple Graphics Games by Paul Coletta

  • Machine Language for Beginners by Richard Mansfield

  • Micro Adventure is the title of a series of books for young adult readers, published by Scholastic, Inc.

  • Golden Flutes & Great Escapes by Delton Horn

  • Sophistication and Simplicity, the Life and Times of the Apple II Computer by Steve Weyhrich, 2013 - http://www.amazon.com/dp/0986832278/?tag=flodaypod-20

  • The New Apple II User’s Guide by David Finnegan, 2012 - http://www.amazon.com/dp/0615639879/?tag=flodaypod-20

  • iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It,  by Steve Wozniak and Gina Smith, 2006 - http://www.amazon.com/dp/0393061434/?tag=flodaypod-20

  • Steve Jobs by Walter Isaacson, 2011 - http://www.amazon.com/dp/1451648537/?tag=flodaypod-20

  • WOZPAK Special Edition - http://www.amazon.com/dp/1304231321/?tag=flodaypod-20

  • What’s Where in the Apple by Prof. William F. Luebbert - http://www.whatswhereintheapple.com/


Modern Upgrades & Connectivity Options

Online Stores


Current Web Sites & Other Forums

Other Books/Sites Used for Reference


The APF Imagination Machine


News, upcoming vintage computer shows, feedback.


Main topic: The APF Imagination Machine


Links Mentioned in the Show:








Current Web Sites/Links/Mail Lists





The Exidy Sorcerer - Live from VCFSE 2.0


Recorded live from VCFSE 2.0 in Roswell, GA!  Main topic: The Exidy Sorcerer.  Special guest host David Greelish!

Links Mentioned in the Show:



Current Web Sites



Interview with Apple II Fan Ken Gagne


Bonus episode this month. Interview with Apple II Enthusiast Ken Gagne about KFest, Open Apple Podcast, Juiced.GS and more.



Ohio Scientific Challenger


Main Topic: The OSI Challenger series of computers


At this point in the podcast run, we are still in the late 1970s, and the OSI machines fall into that time frame for their release.  No vintage computer historical journey would be complete without including these very important machines.  As usual, we’ll cover the history, technical specs, peripherals, Web sites, books, emulation and much, much more.  I am joined by special guest host Terry Stewart of the Classic Computers website, who will help me cover these machines.  In addition, we are joined by OSI aficionados Mark Csele and David Fenyes, who share their first-hand memories of the OSI.  But first, I’ll cover new acquisitions, news, and feedback before diving into the OSI.


Links Mentioned in the Show:


New Acquisitions/What I’ve Been up to

Buying and Using One Today (eBay and replicas)


Current Web Sites 

The TRS-80 Model I (Part I)


News, reviews, and a discussion of the TRS-80 Model I:

  • personal memories

  • history up to its introduction

  • interview with David and Theresa Welsh (Part I), authors of "Priming the Pump: How TRS-80 Enthusiasts Helped Start the PC Revolution"



Links Mentioned in the Show:

The Apple II (Part II)


Second part on the Apple II:

  • New acquisitions

  • News

  • Tech specs, peripherals, magazines, user groups, shows

  • Special guest host Carrington Vanston!!

Items mentioned in this episode:

New Acquisitions

Vintage Computer Shows


User Groups


Other Books/Sites Used for Reference

The Apple II, Part I, History with Steve Weyhrich


First part on the Apple II:

  • Personal memories of the Apple II.

  • New acquisitions.

  • News.

  • Feedback.

  • History of the Apple II.

  • Special guest host Steve Weyhrich, the man who literally wrote the book on Apple II history!!

 Links mentioned in this episode:

 New Acquisitions

 Vintage Computer Shows



Other News



Vintage Computer Festival Midwest 8.0


News and a complete discussion of VCF Midwest 8.0:

  • personal memories

  • feedback

  • new acquisitions

  • upcoming shows

  • new vintage computer books

  • overview of VCFMW

  • interviews with attendees of the show (Jim Leonard and Jason Timmons) 


Links Mentioned in the Show:

VCF East 9.1 Preview w/Evan Koblentz


1 year anniversary of Floppy Days!!  Special Bonus Episode!  I talk with Evan Koblentz of MARCH, who gives us a preview of the upcoming VCF East 9.1.

Links and Info:

VCF East 9.1 - http://www.midatlanticretro.org
Evan Koblentz email - evan@snarc.net

NOTE: Evan misspoke about Bil Herd's Friday session. It is not just CRT repair. It's overall video issues, of which CRT is just one part.

CoCo Book Interview w/Boisy Pitre & Bill Loguidice


Special Bonus episode!  Interview with Boisy Pitre and Bill Loguidice about their new book "CoCo: The Colorful History of Tandy's Underdog Computer."


VCFSE 2.0 Preview


Hello, welcome to Floppy Days Episode #15.  This is a bonus episode and the topic of this show is the upcoming (as of this podcast) 2014 Vintage Computer Festival Southeast 2.0 near Atlanta, Georgia, on May 3rd and 4th.  I interview Lonnie Mimms and Flash Corliss about the show and they give you the highlights about what you can expect to see there.  I hope you enjoy it.


The History of Computing Ep 10: Computers and the Space Race


We go knee-deep into available computing technology in the late 1950's and what it was used for: Missiles and Satellites.  We see the creation of the NASA RTCC in a muddy field and revisit what IBM is up to.

Magnetic: The History of Computing ep 4


This episode looks at some really interesting inventions involving the light bulb and electricity that played a major role in the development of the electronic computer.  This includes vacuum tubes and Cathode Ray Tubes (CRT's).

Magnetic: The History of Computing Ep 11: Fly By Wire


The early 1960's were full of gigantic leaps in computing technology, and a lot of it was used by NASA to get astronauts into space!  We see the first use of the word Mainframes, and see how Neil Armstrong used a new invention called "secondary storage" to try to save his own life!

Magnetic: The History of Computing


In this episode, we solve the problem of calculating from inaccurate tables, learn about NYU Art Professor and inventor of the first electrical language: Samuel Morse, and we discover how the Tabulating Machine company got its start and made one man very rich (and the census a lot easier).  Tune in!

Magnetic: the History of Computing


In this first episode, I go over the beginning of computing: why did we start this thing in the first place?  We review the Abacus, the plague, and the loom, and see why those factored into the device you're reading this on. 

DOS Prompt: Betrayal at Krondor (Part 2)


Welcome back for the second half of our DOS Prompt series on Betrayal at Krondor, where I discuss the development history of the game.

DOS Prompt: Betrayal at Krondor (Part 1)


Welcome all you Northwarden Piggies! Today's episode is a first: we're dropping down to a DOS Prompt to talk about Betrayal at Krondor.

Mercury Memories


This episode we take a look at the earliest days of computing, and one of the earliest forms of computer memory. Mercury delay lines, originally developed in the early 40s for use in radar, are perhaps one of the strangest technologies I've ever encountered. Made primarily from liquid mercury and quartz crystals, these devices store digital data as a recirculating acoustic wave. They can only be sequentially accessed. Operations are temperature dependent. And, well, they can also be dangerous to human health. So how did mercury find its way into some of the first computers?

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing

The Innovations Of Bell Labs


What is the nature of innovation? Is it overhearing a conversation as with Morse and the telegraph? Working with the deaf as with Bell? Divine inspiration? Necessity? Science fiction? Or given that the answer to all of these is yes, is it really more the intersectionality between them and multiple basic and applied sciences with deeper understandings in each domain? Or is it being given the freedom to research? Or being directed to research? Few have as storied a history of innovation as Bell Labs and few have had anything close to the impact.

Bell Labs gave us 9 Nobel Prizes and 5 Turing awards. Their alumni have even more, but those were the ones earned while at Bell. And along the way they gave us 26,000 patents. They researched, automated, and built systems that connected practically every human around the world - moving us all into an era of instant communication. It’s a rich history that goes back in time from the 2018 Ashkin Nobel for applied optical tweezers and 2018 Turing award for Deep Learning to an almost steampunk era of tophats and the dawn of the electrification of the world.

Those late 1800s saw a flurry of applied and basic research. One reason was that governments were starting to fund that research. Alessandro Volta had come along and given us the battery and it was starting to change the world. So Napoleon’s nephew, Napoleon III, gave us the Volta Prize in 1852, during the Second French Empire.

One of those great researchers to receive the Volta Prize was Alexander Graham Bell. He invented the telephone in 1876 and was awarded the Volta Prize, getting 50,000 francs. He used the money to establish the Volta Laboratory, a precursor to the research lab that would come to be called Bell Labs. He also formed the Bell Patent Association in 1876. They would research sound: recording, transmission, and analysis - so science.

There was a flurry of business happening in preparation to put a phone in every home in the world. We got the Bell System, the Bell Telephone Company, the American Bell Telephone Company, patent disputes with Elisha Gray over the telephone (and so the acquisition of Western Electric), and finally American Telephone and Telegraph, or AT&T. Think of all this as Ma Bell. Not Pa Bell, mind you - as Graham Bell gave all of his shares except 10 to his new wife when they were married in 1877. And her dad ended up helping build the company and later creating National Geographic, even going international with the International Bell Telephone Company. Bell’s assistant Thomas Watson sold off his shares to become a millionaire in the 1800s, embarking on a life as a Shakespearean actor.

But Bell wasn’t done contributing. He still wanted to research all the things. Hackers gotta hack. And the company needed him to - keep in mind, they were a cutting edge technology company (then as now). That thirst for research would infuse AT&T - with Bell Labs paying homage to the founder’s contribution to the modern day. Over the years they’d be on West Street in New York and expand to have locations around the US. Think about this: it was becoming clear that automation would be able to replace human efforts where electricity is concerned. The next few decades gave us the vacuum tube, flip flop circuits, mass deployment of radio. The world was becoming ever so slightly interconnected. And Bell Labs was researching all of it. From physics to the applied sciences.

By the 1920s, they were doing sound synchronized with motion and shooting that over long distances and calculating the noise loss. They were researching encryption. Because people wanted their calls to be private. That began with things like one-time pad ciphers but would evolve into speech synthesizers and even SIGSALY, the first encrypted (or scrambled) speech transmission that led to the invention of the first computer modem. They had engineers like Harry Nyquist, whose name is on dozens of theories, frequencies, even noise. He arrived in 1917 and stayed until he retired in 1954. One of his most important contributions was to move beyond the printing telegraph to paper tape and to helping transmit pictures over electricity - and Herbert Ives from there sent color photos, thus the fax was born (although it would be Xerox who commercialized the modern fax machine in the 1960s).
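The sampling limit that bears Nyquist's name is easy to demonstrate: sample a tone at fewer than two samples per cycle and it becomes mathematically indistinguishable from a lower-frequency alias. Here's a small Python sketch of that effect (my own illustration, not anything from Bell Labs; the rates are arbitrary):

```python
import math

fs = 8.0  # sampling rate in samples per second (hypothetical)

# A 7 Hz tone exceeds the Nyquist limit of fs/2 = 4 Hz, so at these
# sample instants it is identical to an inverted 1 Hz tone:
# sin(2*pi*7*n/fs) == -sin(2*pi*1*n/fs) for every integer n.
for n in range(16):
    t = n / fs
    high = math.sin(2 * math.pi * 7 * t)
    alias = -math.sin(2 * math.pi * 1 * t)
    assert abs(high - alias) < 1e-9

print("a 7 Hz tone aliases to 1 Hz at 8 samples/second")
```

This is why a channel of a given bandwidth can only carry signals up to half its sampling rate, a constraint every modem designer after Nyquist had to live with.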

Nyquist and others like Ralph Hartley worked on making audio better, able to transmit over longer lines, reducing feedback, or noise. While there, Hartley gave us the oscillator, developed radio receivers, parametric amplifiers, and then got into servomechanisms before retiring from Bell Labs in 1950. The scientists who’d been in their prime between the two world wars were titans and left behind commercializable products, even if they didn’t necessarily always mean to.

By the 40s a new generation was there and building on the shoulders of these giants. Nyquist’s work was extended by Claude Shannon, who we devoted an entire episode to. He did a lot of mathematical analysis like writing “A Mathematical Theory of Communication” to birth Information Theory as a science.
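Shannon's central measure, the entropy of a source in bits, fits in a few lines of Python (a toy illustration of the formula from his paper, not code from the era):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries much less.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

That single quantity told engineers how much a message could be compressed and how much data a noisy channel could carry, which is why the paper birthed a field.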

They were researching radio because secretly I think they all knew those leased lines would some day become 5G. But also because the tech giants of the era included radio and many could see a day coming when radio and telephony would converge. They were also researching how electrons diffracted, work that saw Bell Labs’ Clinton Davisson share the Nobel Prize with George Paget Thomson and that began the race for solid state storage.

Much of the work being done was statistical in nature. And they had William Edwards Deming there, whose work on statistical analysis while he was in Japan following World War II inspired a global quality movement that continues to this day in the form of frameworks like Six Sigma and TQM. Imagine a time when Japanese manufacturing was of such low quality that he couldn’t stay on a phone call for more than a few minutes or use a product for long. His work in Japan’s reconstruction, paired with dedicated founders like Akio Morita, who co-founded Sony, led to one of the greatest productivity increases, without sacrificing quality, of any time in the world. Deming would change the way Ford worked, giving us the “quality culture.”

Their scientists had built mechanical calculators going back to the 30s (Shannon had worked on the differential analyzer while still at MIT) - first for the calculations needed to do better science, then for ballistic trajectories, then with the Model V in 1946, general computing. But these were slow; electromechanical at best.

Mary Torrey was another statistician of the era who, along with Harold Dodge, gave us the theory of acceptance sampling and thus quality control for electronics. And basic electronics research to do flip-flop circuits fast enough to establish a call across a number of different relays was where much of this was leading. We couldn’t use mechanical computers for that, and tubes were too slow. And so in 1947 John Bardeen, Walter Brattain, and William Shockley invented the transistor at Bell Labs, which would be paired with Shannon’s work to give us the early era of computers as we began to weave Boolean logic in ways that allowed us to skip moving parts and move to a purely transistorized world of computing.
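To see why the transistor mattered so much for computing: a NAND gate is just a couple of transistors, and NAND alone is universal - every other Boolean operation can be wired from it. A toy Python sketch of that universality (my illustration, not anything from Bell Labs):

```python
def nand(a, b):
    """One NAND gate - in hardware, roughly a pair of transistors."""
    return not (a and b)

# NAND is "universal": every other Boolean operation can be built from it.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# Truth table for XOR built purely from NAND gates:
for a in (False, True):
    for b in (False, True):
        print(a, b, xor_(a, b))
```

Stack a few billion of those transistor pairs on a wafer and you have a modern processor, which is exactly the trajectory the next paragraph describes.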

In fact, they all knew one day soon, everything that monster ENIAC and its bastard stepchild UNIVAC was doing would be done on a single wafer of silicon. But there was more basic research to get there. The types of wires we could use, the Karnaugh map from Maurice Karnaugh, zone melting so we could do level doping. And by 1959 Mohamed Atalla and Dawon Kahng gave us metal-oxide semiconductor field-effect transistors, or MOSFETs - which were a step on the way to large-scale integration, or LSI chips. Oh, and they’d started selling those computer modems as the Bell 101 after perfecting the tech for the SAGE air-defense system.

And the research to get there gave us the basic science for the solar cell, electronic music, and lasers - just in the 1950s. The 1960s saw further work on microphones and communication satellites like Telstar, which saw Bell Labs outsource launching satellites to NASA. Those transistors were coming in handy, as were the solar panels. The 14 watts produced certainly couldn’t have moved a mechanical computer wheel. Blaise Pascal would be proud of the research his country’s funds inspired, and Volta would have been perfectly happy to have his name still on the lab, I’m sure. Again, shoulders and giants. Telstar relayed its first television signal in 1962. The era of satellites was born later that year when Cronkite televised coverage of Kennedy manipulating world markets on this new medium for the first time and IBM 1401 computers encrypted and decrypted messages, ushering in an era of encrypted satellite communications. Sputnik may have beaten the US into orbit, but the Telstar program has been an enduring system through to the Telstar 19V launched in 2018 - now outsourced to a Falcon 9 rocket from SpaceX.

It might seem like Bell Labs had done enough for the world. But they still had a lot of the basic wireless research to bring us into the cellular age. In fact, they’d plotted out what the cellular age would look like all the way back in 1947!

The increasing use of computers to do all the acoustics and physics meant they were working closely with research universities during the rise of computing. They were involved in a failed experiment to create an operating system in the late 60s. Multics influenced so much but wasn’t what we might consider a commercial success. It was the result of yet another of DARPA’s J.C.R. Licklider’s wild ideas in the form of Project MAC, which included Marvin Minsky and John McCarthy. Big names in the scientific community collided in cooperation with GE and Bell Labs, and Multics would end up inspiring many a feature of a modern operating system.

The crew at Bell Labs knew they could do better and so set out to take the best of Multics and implement a lighter, easier operating system. So they got to work on Uniplexed Information and Computing Service, or Unics, which was a pun on Multics. Ken Thompson, Dennis Ritchie, Doug McIlroy, Joe Ossanna, Brian Kernighan, and many others wrote Unix originally in assembly and then rewrote it in C once Dennis Ritchie wrote that language to replace B. Along the way, Alfred Aho, Peter Weinberger, and Kernighan gave us AWK, and with all this code they needed a way to keep the source under control, so Marc Rochkind gave us SCCS, or Source Code Control System, first written for an IBM System/370 and then ported to C - which would be how most environments maintained source code until CVS came along in 1986. And Robert Fourer, David Gay, and Brian Kernighan wrote A Mathematical Programming Language, or AMPL, while there.

Unix began as a bit of a shadow project but would eventually go to market as Research Unix when Don Gillies left Bell to go to the University of Illinois at Urbana-Champaign. From there it spread, and after it fragmented into System V it led to the rise of IBM’s AIX, HP-UX, SunOS/Solaris, BSD, and many other variants - including those that have evolved into macOS through Darwin, and Android through Linux. But Unix wasn’t all they worked on - it was a tool to enable other projects. They gave us the charge-coupled device, which resulted in yet another Nobel Prize. That is an image sensor built on MOS technology. While fiber optics goes back to the 1800s, they worked out attenuation over fiber and thus could stretch cables to only need repeaters every few dozen miles - again reducing the cost to run the ever-growing phone company.

All of these electronics allowed them to finally start reducing their reliance on electromechanical and human-operated relays in favor of transistor-to-transistor logic, and less mechanical meant less energy, less labor to repair, and faster service. Decades of innovation gave way to decades of profit - in part because of automation. The 5ESS was a switching system that went online in 1982, and some of what it did, its descendants still do today. Long distance billing, switching modules, digital line trunk units, line cards - the grid could run with less infrastructure because the computer managed distributed switching. The world was ready for packet switching.

5ESS was 100 million lines of code, mostly written in C. All that source was managed with SCCS. Bell continued with innovations. They produced that modem up into the 70s but allowed Hayes, Rockwell, and others to take it to a larger market - coming back in from time to time to help improve things, like when Bell Labs, branded as Lucent after the breakup of AT&T, helped bring the 56k modem to market.

The presidents of Bell Labs were as integral to the success and innovation as the researchers. Frank Baldwin Jewett from 1925 to 1940, Oliver Buckley from 40 to 51, the great Mervin Kelly from 51 to 59, James Fisk from 59 to 73, William Oliver Baker from 73 to 79, and a few others since gave people like Bishnu Atal the space to develop speech processing algorithms and predictive coding and thus codecs. And they let Bjarne Stroustrup create C++ and gave a start to Eric Schmidt, who would go on to become CEO of Google - the list goes on. Nearly every aspect of technology today is touched by the work they did.

All of this research. Jon Gertner wrote a book called The Idea Factory: Bell Labs and the Great Age of American Innovation. He chronicles the journey of multiple generations of adventurers from Germany, Ohio, Iowa, Japan, and all over the world to the Bell campuses. The growth and contraction of the basic and applied research and the amazing minds that walked the halls. It’s a great book and a short episode like this couldn’t touch the aspects he covers. He doesn’t end the book as hopeful as I remain about the future of technology, though.

But since he wrote the book, plenty has happened. After the hangover from the breakup of Ma Bell they’re now back to being called Nokia Bell Labs - following a $16.6 billion acquisition by Nokia. I sometimes wonder if the world has the stomach for the same level of basic research. And then Alfred Aho and Jeffrey Ullman from Bell end up sharing the Turing Award for their work on compilers. And other researchers hit terabit-per-second speeds. A storied history that will be a challenge for Marcus Weldon’s successor. He arrived as a post-doc in 1995 and rose to lead the labs and become the CTO of Nokia - he said the next regeneration of a Doctor Who doctor would come in after him. We hope they are as good stewards as those who came before them.

The world is looking around after these decades of getting used to the technology they helped give us. We’re used to constant change. We’re accustomed to speed increases from 110 bits a second to now terabits. The nature of innovation isn’t likely to be something their scientists can uncover. My guess is Prometheus is guarding that secret - if only to keep others from suffering the same fate after giving us the fire that sparked our imaginations. For more on that, maybe check out Hesiod’s Theogony.

In the meantime, think about the places where various sciences and disciplines intersect and think about the wellspring of each and the vast supporting casts that gave us our modern life. It’s pretty phenomenal when ya’ think about it.

Sage: The Semi-Automatic Ground Environment Air Defense


The Soviet Union detonated their first nuclear bomb in 1949, releasing 20 kilotons worth of an explosion and sparking the nuclear arms race. A weather reconnaissance mission confirmed that the Soviets did so and Klaus Fuchs was arrested for espionage, after passing blueprints for the Fat Man bomb that had been dropped on Japan. A common name in the podcast is Vannevar Bush. At this point he was the president of the Carnegie Institute and put together a panel to verify the findings.

The Soviets were catching up to American science. Not only did they have a bomb but they also had new aircraft that were capable of dropping a bomb. People built bomb shelters, schools ran drills to teach students how to survive a nuclear blast and within a few years we’d moved on to the hydrogen bomb. And so the world lived in fear of nuclear fall-out.

Radar had come along during World War II and we’d developed Ground Controlled Intercept, an early radar network. But that wouldn’t be enough to protect against this new threat. If one of these Soviet bombers, like the Tupolev Tu-16 “Badger,” were to come into American airspace, the prevailing thought was that we needed to shoot it down before the payload could be delivered.

The Department of Defense started simulating what a nuclear war would look like. And they asked the Air Force to develop an air defense system. Given the great work done at MIT, much under the careful eye of Vannevar Bush, they reached out to George Valley, a professor in the Physics Department who had studied nuclear weapons. He also sat on the Air Force Scientific Advisory Board, and toured some of the existing sites and took a survey of the US assets.

He sent his findings and they eventually made their way to General Vandenberg, who assigned General Fairchild to assemble a committee which would become the Valley Committee, or more officially the Air Defense Systems Engineering Committee, or ADSEC.

ADSEC dug in deeper and decided that we needed a large number of radar stations with a computer that could aggregate and then analyze data to detect enemy aircraft in real time. John Harrington had worked out how to convert radar into code and could send that over telephone lines. They just needed a computer that could crunch the data as it was received. And yet none of the computer companies at the time were able to do this kind of real time operation. We were still in a batch processing mainframe world.

Jay Forrester at MIT was working on the idea of real-time computing. Just one problem: the Servomechanisms Lab, where he was working on Project Whirlwind for the Navy for flight simulation, was over budget, and while they’d developed plenty of ground-breaking technology, they needed more funding. So Forrester was added to ADSEC and Whirlwind gained the ability to process the digital radar information. By the end of 1950, the team was able to complete successful tests of sending radar information to Whirlwind over the phone lines.

Now it was time to get funding, which was proposed at $2 million a year to fund a lab. Given that Valley and Forrester were both at MIT, they decided it should be at MIT. Here, they saw a way to help push the electronics industry forward, and the Navy’s Chief Scientist Louis Ridenour knew that wherever that lab was built would become the next scientific hotspot. The president of MIT at the time, James Killian, wasn’t exactly jumping at the idea of MIT becoming an arm of the Department of Defense, so he put together 28 scientists to review the plans from ADSEC. That review became Project Charles and threw its support behind forming the new lab.

They had measured twice and were ready to cut. There were already projects being run by the military during the arms buildup named after other places surrounding MIT, so they picked Project Lincoln as the name. They appointed F. Wheeler Loomis as the director with a mission to design a defense system. As with all big projects, they broke it up into five small projects, or divisions: things like digital computers, aircraft control and warning, and communications. A sixth handled the business administration for the five technical divisions and a seventh delivered technical services as needed.

They grew to over 300 people by the end of 1951 and over 1,300 in 1952. They moved offsite and built a new campus - thus establishing Lincoln Lab. By the end of 1953 they had written a memo called A Proposal for Air Defense System Evolution: The Technical Phase. This called for a net of radars to be set up that would track the trajectory of all aircraft in the US airspace and beyond. And to build communications to deploy the weapons that could destroy those aircraft.

The Manhattan Project had brought in the nuclear age, but this project grew to be larger, as now we had to protect ourselves from the potential devastation we wrought. We were firmly in the Cold War, with America testing the hydrogen bomb in 52 and the Soviets doing so in 55. That was the same year the prototype of the AN/FSQ-7 arrived to replace Whirlwind.

To protect the nation from these bombs they would need hundreds of radars, 24 centers to receive data, and 3 combat centers. They planned for direction centers to have a pair of AN/FSQ-7 computers, an evolution of Whirlwind. That meant half a million lines of code, which was by far the most ambitious software ever written. Forrester had developed magnetic-core memory for Whirlwind. That doubled the speed of the computer. They hired IBM to build the AN/FSQ-7 computers, and from there we started to see commercial applications as well when IBM added core memory to the 704 mainframe in 1955.

Stalin was running labor camps and purges. An estimated nine million people died in Gulags or from hunger. Chairman Mao visited Moscow in 1957, sparking the Great Leap Forward policy that saw 45 million people die. All in the name of building a utopian paradise. Americans were scared. And Stalin was distrustful of computers for any applications beyond scientific computing for the arms race. By contrast, people like Ken Olsen from Lincoln Lab left to found Digital Equipment Corporation and sell modular mini-computers on the mass market, with DEC eventually rising to be the number two computing company in the world.

The project also needed software, and so that was farmed out to RAND, who would have over 500 programmers work on it. And a special display was needed to watch planes as they were flying, which began as a Stromberg-Carlson Charactron cathode ray tube. IBM got to work building the 24 FSQ-7s, with each coming in at a whopping 250 tons and nearly 50,000 vacuum tubes - and of course that magnetic core memory.

All this wasn’t just theoretical. Given the proximity, they deployed the first net of around a dozen radars around Cape Cod as a prototype. They ran dedicated phone lines from Cambridge and built the first direction center, equipping it with an interactive display console that showed an x for each object being tracked, adding labels and then Robert Everett came up with the idea of a light gun that could be used as a pointing device, along with a keyboard, to control the computers from a terminal.

They tested the Cape Cod installation in 1953 and added long range radars in Maine and New York by the end of 1954, working out bugs as they went. The Suffolk County Airfield in Long Island was added so Strategic Air Command could start running exercises for response teams. By the end of 1955 they put the system to the test and it passed all requirements from the Air Force. The radars detected the aircraft and were able to then control manned antiaircraft operations.

By 1957 they were adding logic and capacity to the system, having fine tuned over a number of test runs until they got to a 100 percent interception rate. They were ready to build out the direction centers. The research and development phase was done - now it was time to produce an operational system. Western Electric built a network of radar and communication systems across Northern Canada that became known as the DEW line, short for Distant Early Warning.

They added increasingly complicated radar and layers of protection, like Buckminster Fuller joining for a bit to develop a geodesic dome to protect the radars using fiberglass. They added radar to platforms that looked like oil rigs - the Texas Towers - off the Atlantic coast, experimented with radar on planes and ships, and worked out how to connect those back to the main system. By the end of 1957 the system was ready to move into production and integration with live weapons into the code and connections.

This is where MIT was calling it done for their part of the program. The only problem was that when the Air Force looked around for companies willing to take on such a large project, none could. So the MITRE Corporation was spun out of Lincoln Labs, pulling in people from a variety of other government contractors, and it continues on to this day working on national security, GPS, election integrity, and health care.

They took the McChord airfield online as DC-12 in 1957, then Syracuse, New York in 1958, and started phasing in automated response. Andrews, Dobbins, Geiger Field, Los Angeles Air Defense Sector, and others went online over the course of the next few years. The DEW line went operational in 1962, extending from Iceland to the Aleutians. By 1963, NORAD had a Combined Operations Center where the war room became reality.

Burroughs eventually won a contract to deploy new D825 computers to form a system called BUIC II, and with the rapidly changing release of new solid state technology those got replaced with a Hughes AN/TSQ-51. With the rise of Airborne Warning and Control Systems (AWACS), the ground systems started to slowly get dismantled in 1980, being phased out completely in 1984, the year after WarGames was released.

In WarGames, Matthew Broderick plays David Lightman, a young hacker who happens upon a game. One John von Neumann himself might have written as he applied game theory to the nuclear threat. Lightman almost starts World War III when he tries to play Global Thermonuclear War. He raises the level of DEFCON and so inspires a generation of hackers who founded conferences like DEFCON and to this day war dial, or war drive, or war whatever.

The US spent countless tax dollars on advancing technology in the buildup for World War II and the years after. The Manhattan Project, Project Whirlwind, SAGE, and countless others saw increasing expenditures. Kennedy continued the trend in 1961 when he started the process of putting humans on the moon. And the unpopularity of the Vietnam war, which US soldiers had been dying in since 1959, caused a rollback of spending.

The legacy of these massive projects was huge spending to advance the sciences required to produce each. The need for the computers in SAGE and other critical infrastructure to withstand a nuclear war led to ARPANET, which over time evolved into the Internet. The subsequent privatization of these projects, the rapid advancement in chip making, and the drop in costs alongside a frequent doubling of speeds - findings from each discipline finding their way into others - then gave us personal computing and the modern era of PCs and mobile devices. But it all goes back to projects like ENIAC, Whirlwind, and SAGE. Here, we can see generations of computing evolve with each project.

I’m frequently asked what’s next in our field. It’s impossible to know exactly. But we can look to mega projects, many of which are transportation related - and we can look at grants from the NSF. And DARPA and many major universities. Many of these produce new standards so we can also watch for new RFCs from the IETF. But the coolest tech is probably classified, so ask again in a few years!

And we can look to what inspires - sometimes that’s a perceived need, like thwarting nuclear war. Sometimes mapping human genomes isn’t a need until we need to rapidly develop a vaccine. And sometimes, well… sometimes it’s just returning to some sense of normalcy. Because we’re all about ready for that. That might mean not being afraid of nuclear war as a society any longer. Or not being afraid to leave our homes. Or whatever the world throws at us next.

Project MAC and Multics


Welcome to the history of computing podcast. Today we’re going to cover a cold war-era project called Project MAC that bridged MIT with GE and Bell Labs.

The Russians beat the US to space when they launched Sputnik in 1957. Many in the US felt the nation was falling behind, so president Dwight D. Eisenhower appointed then-president of MIT James Killian as his Special Assistant for Science and Technology and, in 1958, created ARPA. The office was lean and funded a few projects without much oversight. One was Project MAC at MIT, which helped cement the university as one of the top in the field of computing as it grew.

Project MAC, short for Project on Mathematics and Computation, was a 1960s collaborative endeavor to develop a workable timesharing system. The concept of timesharing first emerged during the late 1950s. Scientists and researchers finally went beyond batch processing with Whirlwind and its spiritual successors, the TX-0 through TX-2 computers at MIT. We had computer memory now, and so we had interactive computing. That meant we could explore different ways to connect directly with the machine.

In 1959, British mathematician Christopher Strachey gave the first public presentation on timesharing at a UNESCO meeting, and John McCarthy distributed an internal memo regarding timesharing at MIT. Timesharing was first demonstrated at the MIT Computational Center in November 1961, under the supervision of Fernando Corbato, an MIT professor. J.C.R. Licklider at ARPA had been involved with MIT for most of his career in one way or another and helped provide vision and funding along with contacts and guidance, including getting the team to work with Bolt, Beranek & Newman (BBN).

Yuri Alekseyevich Gagarin went to space in 1961. The Russians were still lapping us. Money. Governments spend money. Let’s do that.

Licklider assisted in the development of Project MAC, also read as machine-aided cognition, led by Professor Robert M. Fano. He then funded the project with $3 million per year, and it became the most prominent initiative in timesharing. In 1967, the Information Processing Techniques Office invested more than $12 million in over a dozen timesharing programs at colleges and research institutions. Timesharing then enabled the development of new software and hardware separate from that used for batch processing. Thus, one of the most important innovations to come out of the project was an operating system capable of supporting multiple parallel users - all of whom could have complete control of the machine.
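To make the core idea concrete, here is a toy sketch, in Python rather than anything period-accurate, of what timesharing does: the machine rotates through users' jobs in fixed time slices, so each user experiences a machine of their own. The job names and quantum are invented for illustration.

```python
from collections import deque

def timeshare(jobs, quantum):
    """Run (name, units_of_work) jobs round-robin, `quantum` units per turn.

    Returns the order in which time slices were granted - a toy model of a
    timesharing system's scheduler, not actual CTSS or Multics logic.
    """
    queue = deque(jobs)
    schedule = []
    while queue:
        name, remaining = queue.popleft()
        schedule.append(name)          # this user gets the machine for one slice
        remaining -= quantum
        if remaining > 0:
            queue.append((name, remaining))  # back of the line for another turn
    return schedule

print(timeshare([("alice", 3), ("bob", 1), ("carol", 2)], quantum=1))
# -> ['alice', 'bob', 'carol', 'alice', 'carol', 'alice']
```

Each user gets a slice in turn until their work is done; run the slices fast enough and every user feels like they have the whole machine.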

The operating system they created would be known as Multics, short for Multiplexed Information and Computing Service. It was created for a GE 645 computer but was modular in nature and could be ported to other computers. The project was a collaborative effort between MIT, GE, and Bell Labs. Multics pioneered the single-level store, where files were mapped into memory as segments for processing and then written back to disk. They developed the concepts of dynamic linking, daemons, procedure calls, hierarchical file systems, process stacks, a split between user land and the system, and much more.
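One of those ideas, the hierarchical file system, is easy to sketch. Below is a hypothetical toy in Python, not Multics source: names resolve level by level through nested directories. Multics wrote absolute paths with ">" as the separator where Unix later used "/"; the directory names here are invented for illustration.

```python
# A toy directory tree as nested dicts; leaves are file contents.
tree = {
    "udd": {                                  # user directories, hypothetical names
        "ProjA": {"alice": {"notes": "hello"}},
    }
}

def resolve(root, path):
    """Walk a '>'-separated absolute path through nested dicts, level by level.

    Raises KeyError if any component is missing, just as a real hierarchical
    lookup fails partway down the tree.
    """
    node = root
    for part in path.strip(">").split(">"):
        node = node[part]
    return node

print(resolve(tree, ">udd>ProjA>alice>notes"))  # -> hello
```

The point is the recursion: a directory is just a name that contains more names, which is the structure every modern file system inherited.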

Within six months of Project MAC's creation, 200 users in 10 different MIT departments had secured access to the system. By 1967 the Project MAC laboratory had separated from the Department of Electrical Engineering and become its own interdepartmental laboratory.

Multics progressed from computer timesharing to a networked computer system, integrating file sharing, administration capabilities, and security mechanisms into its architecture. The sophisticated design, which within a couple more years could serve 300 daily active users at 1,000 MIT terminals, inspired engineers Ken Thompson and Dennis Ritchie to create their own system at Bell Labs, which evolved into the Unix operating system and the C programming language.

See, all the stakeholders, with all the things they wanted in the operating system, had built something slow and fragile. Solo developers don’t tend to build amazing systems, but neither do large inter-organizational bureaucracies.

GE never did commercialize Multics because they exited the computer hardware business in 1970. Bell Labs dropped out of the project as well. Honeywell acquired the General Electric computer division, and with it the rights to the Multics project. In addition, Honeywell possessed several other operating systems, each supported by its own internal organization.

In 1976, Project MAC was renamed the Laboratory for Computer Science (LCS) at MIT, broadening its scope. Michael L. Dertouzos, the lab's director, advocated developing intelligent computer programs. To increase computer use, the laboratory studied how to construct cost-effective, user-friendly systems, along with the theoretical underpinnings of computer science needed to reason about space and time constraints. Some of their projects ran for decades afterwards. In 2000, the last Multics sites were shut down.

The concept of buying corporate “computer utilities” was a large area of research from the late 60s into the 70s. Scientists bought time on computers that universities purchased. Companies did the same. The pace of research at both increased dramatically. Companies like Tymshare and IBM made money selling time or processing credits, and then after an anti-trust case, IBM handed that business over to Control Data Corporation, who developed training centers to teach people how to lease time. These helped prepare a generation of programmers when the microcomputers came along, often taking people who had spent their whole careers on CDC Cybers or Burroughs mainframes by surprise. That seems to happen with the rapid changes in computing. But it was good to those who invested in the concept early. And the lessons learned about scalable architectures were skills that transitioned nicely into a microcomputer world. In fact, many environments still run on applications built in this era.

The Laboratory for Computer Science (LCS) accomplished other ground-breaking work, including playing a critical role in advancing the Internet. It was often larger but less opulent than the AI lab at MIT. And their role in developing applications that would facilitate online processing and evaluation across various academic fields, such as engineering, medicine, and library science, led to advances in each. In 2003, LCS merged with MIT's AI laboratory to establish the Computer Science and Artificial Intelligence Laboratory (CSAIL), one of the flagship research labs at MIT. And in the meantime countless computer scientists who contributed at every level of the field flowed through MIT - some because of the name made in those early days. And the royalties from patents have certainly helped the university's endowment.

The Cold War thawed. The US reduced ARPA spending after the Mansfield Amendment was passed in 1969. The MIT hackers flowed out to the world, changing not only how people thought of automating business processes, but how they thought of work and collaboration. And those hackers were happy to circumvent all the security precautions put on Multics, and so cultural movements evolved from there. And the legacy of Multics lived on in Unix, which evolved to influence Linux and is in some way now a part of iOS, Mac OS, Android, and Chrome OS.

(OldComputerPods) ©Sean Haas, 2020