'computers' Episodes

Happy Birthday ENIAC

     2/15/2020

Today we’re going to celebrate the birthday of the first real multi-purpose computer: the gargantuan ENIAC, which would have turned 74 years old today, on February 15th. That’s many generations ago in computing terms. The year is 1946. World War II had raged from 1939 to 1945. We’d cracked Enigma with computers and scientists were thinking of more and more ways to use them. The press is now running articles about a “giant brain” built in Philadelphia. The Electronic Numerical Integrator and Computer was a mouthful, so they called it ENIAC. It was the first true general-purpose electronic computer. Before that there were electromechanical monstrosities. Those had to physically move a part in order to process a mathematical formula. That took time. ENIAC used vacuum tubes instead. A lot of them. To put things in perspective: every hour of processing by the ENIAC was worth 2,400 hours of work calculating formulas by hand. And it’s not like you could split those 2,400 hours across people working in parallel, or run them back to back, of course. So it made the previously almost impossible, possible. Sure, you could figure out the settings to fire a bomb where you wanted it to go in a minute rather than spending about a full day running the calculations. But math itself, for the purposes of math, was about to get really, really cool. The Bush Differential Analyzer, an earlier mechanical computer, had been built in the basement of the building that is now the ENIAC museum. The University of Pennsylvania ran a class on wartime electronics, based on their experience with the Differential Analyzer. John Mauchly and J. Presper Eckert met in 1941 while taking that class, a topic that had included lots of shiny new or newish things like radar and cryptanalysis. That class was mostly on ballistics, a core focus at the Moore School of Electrical Engineering at the University of Pennsylvania. More accurate ballistics would be a huge contribution to the war effort. But Eckert and Mauchly wanted to go further, building a multi-purpose computer that could analyze weather and calculate ballistics. Mauchly got all fired up and wrote a memo about building a general purpose computer. But the University shot it down. And so ENIAC began life as Project PX, with Herman Goldstine acting as the main sponsor after seeing their proposal and digging it back up. Mauchly would team up with Eckert to design the computer and the effort was overseen and orchestrated by Major General Gladeon Barnes of the US Army Ordnance Corps. Thomas Sharpless was the master programmer. Arthur Burks built the multiplier. Robert Shaw designed the function tables. Harry Huskey designed the reader and the printer. Jeffrey Chu built the dividers. And Jack Davis built the accumulators. Ultimately it was just a really big calculator and not a computer that ran stored programs in the same way we do today. Although ENIAC did get an early version of stored programming that used a function table for read-only memory. The project was supposed to cost $61,700. The Moore School at the University of Pennsylvania in Philadelphia actually spent half a million dollars’ worth of metal, tubes, and wires. And of course the scientists weren’t free. That’s around six and a half million dollars’ worth of cash today. And of course it was paid for by the US Army. Specifically the Ballistic Research Laboratory. It was designed to calculate firing tables to make blowing things up a little more accurate. 
Herman Goldstine chose a team of programmers that included Betty Jennings, Betty Snyder, Kay McNulty, Fran Bilas, Marlyn Meltzer, and Ruth Lichterman. They were chosen from a pool of 200 and set about writing the necessary formulas for the machine to process the requirements provided by people using time on the machine. In fact, Kay McNulty invented the concept of subroutines while working on the project. They would flip switches and plug in cables as a means of programming the computer. And programming took weeks of figuring up complex calculations on paper. Then it took days of fiddling with cables, switches, tubes, and panels to input the program. Debugging was done step by step, similar to how we use breakpoints today. They would feed ENIAC input using IBM punch cards and readers. The output was punch cards as well, and these punch cards acted as persistent storage. The machine used standard octal-base radio tubes. 18,000 tubes, and they ran at a lower voltage than they could have in order to minimize them blowing out and creating heat. Each decimal digit used in calculations took 36 of those vacuum tubes, and there were 20 accumulators that could run 5,000 operations per second. The accumulators used two of those tubes to form a flip-flop, and they got them from the Kentucky Electrical Lamp Company. Given the number that blew every day, they must have loved life once engineers got it down to only blowing a tube every couple of days. ENIAC was a modular computer and used different panels to perform different tasks, or functions. It used ring counters with 10 positions for a lot of operations, making it a decimal computer as opposed to the binary computational devices we have today. The pulses between the rings were used to count. Suddenly computers were big money. A lot of research had happened in a short amount of time. Some had been government funded and some had been part of corporations, and it became impossible to untangle the two. This was pretty common with technical advances during World War II and the early Cold War years. John Atanasoff and Cliff Berry had ushered in the era of the digital computer in 1939 but hadn’t finished their machine. Mauchly had seen it in 1941. ENIAC was used to run a number of calculations for the Manhattan Project, allowing us to blow more things up than ever. That project took over a million punch cards and took precedence over artillery tables. John von Neumann worked with a number of mathematicians and physicists including Stanislaw Ulam, who developed the Monte Carlo method. That led to a massive reduction in programming time. Suddenly programming became more about I/O than anything else. To promote the emerging computing industry, the Pentagon had the Moore School of Electrical Engineering at the University of Pennsylvania launch a series of lectures to further computing at large. These were called the Theory and Techniques for Design of Electronic Digital Computers, or just the Moore School Lectures for short. The lectures focused on the various types of circuits and the findings from Eckert and Mauchly on building and architecting computers. Goldstine would talk at length about math and other developers would give talks, looking forward to the development of the EDVAC and back at how they got where they were with ENIAC. As the University began to realize the potential business impact and monetization, they decided to bring a focus to University-owned patents. 
That drove the original designers out of the University of Pennsylvania, and they started the Eckert-Mauchly Computer Corporation in 1946. Eckert and Mauchly had also designed the EDVAC, making use of the progress the industry had made since ENIAC construction had begun. EDVAC would effectively represent the wholesale move away from decimal and into binary computing, and while it weighed tons, it would become a precursor to the microchip. After the ENIAC was finished, Mauchly filed for a patent in 1947. While a patent was granted, you could still count on your fingers the number of machines that were built at about the same time, including the Atanasoff-Berry Computer, Colossus, the Harvard Mark I, and the Z3. So luckily the patent was voided in 1973, and the digital computer is a part of the public domain. By then, the Eckert-Mauchly Computer Corporation had been acquired by Remington Rand, which merged with Sperry and is now called Unisys. The next wave of computers would be mainframes built by GE, Honeywell, IBM, and a number of other vendors, and so the era of batch processing mainframes began. The EDVAC begat the UNIVAC, and Grace Hopper was brought in to write an assembler for it. Computers would become the big mathematical number crunchers and slowly spread into being data processors from there. Following decades of batch processing mainframes we would get minicomputers and interactivity, then time sharing, and then the PC revolution. Distinct eras in computing. Today, computers do far more than just the types of math the ENIAC did. In fact, the functionality of ENIAC was duplicated onto a 20 megahertz microchip in 1996. You know, ‘cause the University of Pennsylvania wanted to do something to celebrate the 50th birthday. And a birthday party seemed underwhelming at the time. And so the date of release for this episode is February 15th, now ENIAC Day in Philadelphia, dedicated as a way to thank the university, creators, and programmers. And we should all reiterate their thanks. They helped put computers front and center in the thoughts of the next generation of physicists, mathematicians, and engineers, who built the mainframe era. And I should thank you - for listening to this episode. I’m pretty lucky to have ya’. Have a great day!


Stewart Brand: Hippy Godfather of the Interwebs

     12/7/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to look at the impact Stewart Brand had on computing. Brand was one of the greatest muses of the interactive computing and then the internet revolutions. This isn’t to take anything away from his capacity to create, but the inspiration he provided gave him far more reach than nearly anyone in computing. There’s a decent chance you might not know who he is. There’s even a chance that you’ve never heard of any of his creations. But you live and breathe some of his ideas on a daily basis. So who was this guy and what did he do? Well, Stewart Brand was born in 1938, in Rockford, Illinois. He would go on to study biology at Stanford, enter the military, and then study design and photography at other schools in the San Francisco area. This was a special time in San Francisco. Revolution was in the air. And one of the earliest scientific studies of LSD had him legitimately dosing. One of my all-time favorite books is The Electric Kool-Aid Acid Test, by Tom Wolfe. In the book, Wolfe follows Ken Kesey and his band of Merry Pranksters along a journey of LSD and Benzedrine riddled hippy goodness, riding a converted school bus across the country and delivering a new kind of culture straight out of Haight-Ashbury to the heart of middle America. All while steering clear of the shoes FBI agents of the day wore. Here he would have met members of the Grateful Dead, Neal Cassady, members of the Hells Angels, Wavy Gravy, Paul Krassner, and maybe even Kerouac and Ginsberg. This was a transition from the Beat Generation to the hippies of the 60s. Then he started the Whole Earth Catalog. Here, he showed the first satellite imagery of the planet Earth, which he’d begun campaigning NASA to release two years earlier. In the 5 years he made the catalog, he spread ideals like ecology, a do-it-yourself mentality, self-sufficiency, and what the next wave of progress would look like. People like Craig Newmark of Craigslist would see the catalog and it would help to form a new world view. In fact, the Whole Earth Catalog was a direct influence on Craigslist. Steve Jobs compared the Whole Earth Catalog to a 60s-era Google. It inspired Wired Magazine. Earth Day would be created two years later. Brand would loan equipment to and inspire spinoffs of dozens of magazines and books. And even serve as an inspiration for many early websites. The catalog put him in touch with so, so many influential people. One of the first was Doug Engelbart, and The Mother of All Demos got Brand involved with the debut of the mouse and the first video conferencing. In fact, Brand helped produce the Mother of All Demos! As we moved into the 70s he chronicled the oncoming hacker culture, and the connection to the 60s-era counterculture. He inspired and worked with Larry Brilliant, Lee Felsenstein, and Ted Nelson. He basically invented being a “futurist,” founding CoEvolution Quarterly and spreading the word of digital utopianism. The Whole Earth Software Review would come along with the advent of personal computers. The end of the 70s would also see him become a special advisor to then-California governor Jerry Brown. In the 70s and 80s, he saw the Internet form and went on to found one of the earliest Internet communities, called The WELL, or Whole Earth ‘Lectronic Link. 
Collaborations in the WELL gave us John Perry Barlow’s Electronic Frontier Foundation, a safe haunt for Kevin Mitnick while on the run, Grateful Dead tape trading, and many other digerati. There would be other virtual communities and innovations on the concept like social networks, eventually giving us online forums, 4chan, Yelp, Facebook, LinkedIn, and corporate virtual communities. But it started with The WELL. He would go on to become a visiting scientist in the MIT Media Lab, organize conferences, and found the Global Business Network with Peter Schwartz, Jay Ogilvy, and other great thinkers to help promote values and practices like scenario planning, a corporate strategy that involves thinking from the outside in. This is now a practice inside Deloitte. The decades proceeded on and Brand inspired whole new generations to leverage humor to push the buttons of authority. Much as the Pranksters inspired him on the bus. But it wasn’t just anti-authority. It was a new and innovative approach in an upcoming era of maximizing short-term profits at the expense of the future. Brand founded The Long Now Foundation with an outlook that looks 10,000 years into the future. They started a clock on Jeff Bezos’ land in Texas, they started archiving languages approaching extinction, Brian Eno led seminars about long-term thinking, and the foundation inspired Anathem, a novel from one of my favorite authors, Neal Stephenson. Peter Norton, Pierre Omidyar, Bruce Sterling, Chris Anderson of the Economist, and many others are also involved. But Brand inspired other counter-cultures as well. In the era of e-zines, he inspired Jesse Dryden, who Brand knew as Jefferson Airplane drummer Spencer Dryden’s kid. The kid turned out to be dFx, who would found HoHoCon, an inspiration for DefCon. Stewart Brand wrote 5 books in addition to the countless hours he spent editing books, magazines, web sites, and papers. Today, you’ll find him pimping blockchain and cryptocurrency, in an attempt to continue decentralization and innovation. He inherited a playful counter-culture. He watched the rise and fall and has since both watched and inspired the innovative iterations of countless technologies, extending of course into bio-hacking. He’s hobnobbed with the hippies, the minicomputer timesharers, the PC hackers, the founders of the internet, the tycoons of the web, and then helped set strategy for industry, NGOs, and governments. He left something with each. Urania was the muse of astronomy, some of the top science in ancient Greece. And he would probably giggle if anyone compared him to a muse. Both on the bus in the 60s, and in his 80s today. He’s one of the greats and we’re lucky he graced us with his presence on this rock - that he helped us see from above for the first time. Just as I’m lucky you elected to listen to this episode. So next time you’re arguing about silly little things at work, think about what really matters and listen to one of his TED Talks. Context. 10,000 years. Have a great week and thanks for listening to this episode of the History of Computing Podcast.


BASIC

     11/24/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate the future! Today we’re going to look at the history of the BASIC programming language. We say BASIC but really BASIC is more than just a programming language. It’s a family of languages and stands for Beginner’s All-purpose Symbolic Instruction Code. As the name implies, it was written to help students that weren’t math nerds learn how to use computers. When I was selling a house one time, someone roaming around in my back yard, who had apparently been to the open house, asked if I’m a computer scientist after they saw a dozen books I’d written on my bookshelf. I really didn’t know how to answer that question. We’ll start this story with Hungarian John George Kemeny. This guy was pretty smart. He was born in Budapest and moved to the US with his family in 1940, fleeing anti-Jewish sentiment and laws in Hungary. Some of his family would go on to die in the Holocaust, including his grandfather. But safely nestled in New York City, he would graduate high school at the top of his class and go on to Princeton. Check this out, he took a year off to head out to Los Alamos and work on the Manhattan Project under Richard Feynman, who would later become a Nobel laureate. That’s where he met fellow Hungarian immigrant John von Neumann - two of a group George Marx wrote about in his book on great Hungarian emigrant scientists and thinkers called The Martians. When he got back to Princeton he would get his doctorate and act as an assistant to Albert Einstein. Seriously, THE Einstein. Within a few years he was a full professor at Dartmouth and would go on to publish great works in mathematics. But we’re not here to talk about all of those contributions that made the world an all-around more awesome place. You see, by the 60s math was evolving to the point that you needed computers. And Kemeny and Thomas Kurtz would do something special. Now Kurtz was another Dartmouth professor who got his PhD from Princeton. He and Kemeny got thick as thieves and wrote the Dartmouth Time-Sharing System (keep in mind that time sharing was all the rage in the 60s, as it gave more and more budding computer scientists access to those computer-things that, prior to the advent of Unix and the PC revolution, had mostly been reserved for the high priests of places like IBM). So time sharing was cool, but the two of them would go on to do something far more important. In 1956, they would write DARSIMCO, or Dartmouth Simplified Code. As with Pascal, you can blame Algol. Wait, no one has ever heard of DARSIMCO? Oh… I guess they wrote that other language you’re here to hear the story of as well. So in ‘59 they got a half million dollar grant from the Alfred P. Sloan Foundation to build a new department building. That’s when Kurtz actually joined the department full time. Computers were just going from big batch-processed behemoths to interactive systems. They tried teaching with DARSIMCO, FORTRAN, and the Dartmouth Oversimplified Programming Experiment, a classic acronym for 1960s-era DOPE. But they didn’t love the command structure nor the fact that the languages didn’t produce feedback immediately. What was it called? Oh, so in 1964, Kemeny wrote the first iteration of the BASIC programming language and Kurtz joined him very shortly thereafter. They did it to teach students how to use computers. It’s that simple. 
And as most software was free at the time, they released it to the public. We might think of this as open source-ish by today’s standards. I say ish because Dartmouth actually chose to copyright BASIC. Kurtz has said that the name BASIC was chosen because “We wanted a word that was simple but not simple-minded, and BASIC was that one.” The first program I wrote was in BASIC. BASIC used line numbers and read kinda’ like the English language. The first line of my program said: 10 PRINT “Charles was here” And the computer responded that “Charles was here” - the second program I wrote just added a second line that said: 20 GOTO 10 Suddenly “Charles was here” took up the whole screen and I had to ask the teacher how to terminate the signal. She rolled her eyes and handed me a book. And that, my friend, was the end of me for months. That was on an Apple IIc. But a lot happened with BASIC between 1964 and then. As with many technologies, it took some time to float around and evolve. The syntax was kinda’ like a simplified FORTRAN, making my FORTRAN classes in college a breeze. That initial distribution evolved into Dartmouth BASIC, and they received a $300k grant and used student slave labor to write the initial BASIC compiler. Mary Kenneth Keller was one of those students and went on to finish her doctorate in ‘65 along with Irving Tang, the two of them becoming the first PhDs awarded in computer science in the US. After that she went off to Clarke College to found their computer science department. The language is pretty easy. I mean, like Pascal, it was made for teaching. It spread through universities like wildfire during the rise of minicomputers like the PDP from Digital Equipment and the resultant Data General Nova. This led to the first text-based games in BASIC, like Star Trek. And then came the Altair and one of the most pivotal moments in the history of computing, the porting of BASIC to the platform by Microsoft co-founders Bill Gates and Paul Allen. But Tiny BASIC had appeared a year before and suddenly everyone needed “a BASIC.” You had Commodore BASIC, BBC BASIC, BASIC for the Trash-80, the Apple II, Sinclair and more. Programmers from all over the country had learned BASIC in college on minicomputers, and when the PC revolution came, a huge part of that was the explosion of applications, most of which were written in… you got it, BASIC! I typically think of the end of BASIC coming in 1991 when Microsoft bought Visual Basic off of Alan Cooper and object-oriented programming became the standard. But the things I could do with a simple if/then/else statement. Or a for/to statement, or a while, repeat, or do loop. Absolute values, exponential functions, cosines, tangents, even super-simple random number generation. And input and output was just INPUT and PRINT, or LIST for source. Of course, functional programming was always simpler and more approachable. So there, you now have Kemeny as a direct connection between Einstein and the modern era of computing. Two immigrants that helped change the world. One famous, the other with a slightly more nuanced but probably no less important impact in a lot of ways. Those early BASIC programs opened our eyes. Games, spreadsheets, word processors, accounting, Human Resources, databases. Kemeny would go on to chair the commission investigating Three Mile Island, a partial nuclear meltdown that was a turning point for nuclear power. 
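If you’ve never seen one of those early programs, here’s a minimal sketch of the kind of thing described above - the two-line infinite loop, plus a few of the statements mentioned, like INPUT, FOR, IF/THEN, and super-simple random numbers. Everything past line 20 is just my illustration rather than anything from the episode, and details like ELSE support varied from dialect to dialect, but something very close to this would run on most home-computer BASICs of the era:

10 PRINT "Charles was here"
20 GOTO 10
30 REM Nothing below ever runs unless you delete line 20, since the GOTO loops forever
40 INPUT "How many dice rolls"; N
50 FOR I = 1 TO N
60 PRINT "Roll:"; INT(RND(1) * 6) + 1
70 NEXT I
80 IF N > 10 THEN PRINT "That is a lot of rolls"
90 END

The line numbers doubled as both the editor and the control flow, which is part of why GOTO-heavy spaghetti became a lasting part of the language’s reputation.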
I wonder what Kemeny thought when he read the following on the Statue of Liberty: Give me your tired, your poor, Your huddled masses yearning to breathe free, The wretched refuse of your teeming shore. Perhaps, like many before and after, he thought that he would breathe free and with that breath, do something great, helping bring the world into the nuclear era and preparing thousands of programmers to write software that would change the world. When you wake up in the morning, you have crusty bits in your eyes and things seem blurry at first. You have to struggle just a bit to get out of bed and see the sunrise. BASIC got us to that point. And for that, we owe them our sincerest thanks. And thank you dear listeners, for your contributions to the world in whatever way they may be. You’re beautiful. And of course thank you for giving me some meaning on this planet by tuning in. We’re so lucky to have you, have a great day!


The Altair 8800

     9/19/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on Agile software development. Agile software development is a methodology, or anti-methodology, or approach to software development that evolves the requirements a team needs to fulfill and the solutions they need to build in a collaborative, self-organized, and cross-functional way. Boy, that’s a lot to spit out there. I was in an elevator the other day and I heard someone say: “That’s not very agile.” And at that moment, I knew that I just couldn’t help but do an episode on agile. I’ve worked in a lot of teams that use a lot of variants of agile: scrum, Kanban, scrumban, Extreme Programming, Lean Software Development. Some of these are almost polar opposites, and you still hear people talk about what is agile, and if they want to make fun of people doing things an old way, they’ll say something like waterfall. Nothing ever really was waterfall, given that you learn on the fly, find re-usable bits, or hit a place where you just say that’s not possible. But that’s another story. The point here is that agile is, well, weaponized to back up what a person wants someone to do. Or how they want a team to be run. And it isn’t always done from an informed point of view. Why is agile an anti-methodology? Think of it more like a classification, maybe. There were a number of methodologies like Extreme Programming, Scrum, Kanban, Feature Driven Development, Adaptive Software Development, RAD, and Lean Software Development. These had come out to bring shape around a very similar idea. But over the course of 10-20 years, each had been developed in isolation. In college, I had a computer science professor who talked about “adaptive software development” from his days at a large power company in Georgia back in the 70s. Basically, you are always adapting what you’re doing based on speculation about how long something will take, collaboration on that observation, and what you learn while actually building. That shaped how I would view software development for years to come. He was already making fun of waterfall methodologies, or a cycle where you write a large set of requirements and stick to them. Waterfall worked well if you were building a computer to land people on the moon. Adaptive development was a way of saying “we’re not engineers, we’re software developers.” Later in college, with the rapid proliferation of the Internet and computers into dorm rooms, I watched the emergence of rapid application development, where you let the interface requirements determine how you build. But once someone weaponized that by putting a label on it, or worse, forking the label into spiral and unified models, then they became much less useful and the next hot thing had to come along. Kent Beck built a methodology called Extreme Programming - or XP for short - in 1996 and that was the next hotness. Here, we release software in shorter development cycles and software developers, like police officers on patrol, work in pairs, reviewing and testing code and not writing each feature until it’s required. The idea of unit testing and rapid releasing really came out of the fact that the explosion of the Internet in the 90s meant people had to ship fast, and this was also during the rise of really mainstream object-oriented programming languages. 
The nice thing about XP was that you could show a nice graph where you planned, managed, designed, coded, and tested your software. The rules of Extreme Programming included things like “Code the unit test first” and “A stand up meeting starts each day.” Extreme Programming is one of these methodologies. Scrum is probably the one most commonly used today. But the rest, as well as the Crystal family of methodologies, are now classified as Agile software development methodologies. So it’s like a parent. Is agile really just a classification then? No. So where did agile come from? By 2001, Kent Beck, who developed Extreme Programming, met with Ward Cunningham (who built WikiWikiWeb, the first wiki), Dave Thomas, a programmer who has since written 11 books, Jeff Sutherland and Ken Schwaber, who designed Scrum. Jim Highsmith, who developed that Adaptive Software Development methodology, and many others were at the time involved in trying to align on an organizational methodology that allowed software developers to stop acting like people that built bridges or large buildings. Most had day jobs but they were like-minded and decided to meet at a quaint resort in Snowbird, Utah. They might have all wanted to use the methodologies that each of them had developed. But if they had all been jerks then they might not have had a shift in how software would be written for the next 20+ years. They decided to start with something simple: a statement of values. Instead of bickering and being dug into specific details, they were all able to agree that software development should not be managed in the same fashion as engineering projects are run. So they gave us the Manifesto for Agile Software Development… The Manifesto reads: We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value: * Individuals and interactions over processes and tools * Working software over comprehensive documentation * Customer collaboration over contract negotiation * Responding to change over following a plan That is, while there is value in the items on the right, we value the items on the left more. But additionally, the principles dig into and expand upon some of that adjacently. The principles behind the Agile Manifesto: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale. Business people and developers must work together daily throughout the project. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation. Working software is the primary measure of progress. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely. Continuous attention to technical excellence and good design enhances agility. Simplicity--the art of maximizing the amount of work not done--is essential. The best architectures, requirements, and designs emerge from self-organizing teams. 
At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly. Many of the words here are easily weaponized. For example, “satisfy the customer.” Who’s the customer? The product manager? The end user? The person in an enterprise who actually buys the software? The person in that IT department that made the decision to buy the software? In the scrum methodology, the customer is not known. The product owner is their representative. But the principles don’t need to identify that; they just use the word so each methodology can make sure to cover it. Now take “continuous delivery.” People frequently just lump CI in there with CD. I’ve heard continuous design, continuous improvement, continuous deployment, continuous podcasting. Wait, I made the last one up. We could spend hours going through each of these and identifying where they aren’t specific enough. Or, again, we could revel in their lack of specificity by letting them point us in the direction of a methodology where these words get much more specific meanings. Ironically, I know accounting teams at very large companies that have scrum masters, engineering teams for big projects with a project manager and a scrum master, and even a team of judges that use agile methodologies. There are now scrum masters embedded in most software teams of note. But once you see Agile on the cover of the Harvard Business Review - and you hate to do this given all the classes in agile/XP/scrum - you have to start wondering what’s next. For 20 years, we’ve been saying “stop treating us like engineers” or “that’s waterfall.” Every methodology seems to grow. Right after I finished my PMP I was on a project with someone else that had just finished theirs. I think they tried to implement the entire Project Management Body of Knowledge. If you try to have every ceremony from Scrum, you’re not likely to even have half a day left over to write any code. But you also don’t want to be like the person on the elevator, weaponizing only small parts of a larger body of work, just to get your way. And more importantly, we need to admit that none of us have all the right answers and be ready to, as they say in Extreme Programming, fix XP when it breaks - which is similar to Boyd’s Destruction and Creation, or the sustenance and destruction in Lean Six Sigma. Many of us forget that last part: be willing to walk away from the dogma and start over. Thomas Jefferson called for a revolution every 20 years. We have two years to come up with a replacement! And until you replace me, thank you so very much for tuning into another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day!


Wikipedia

     9/2/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on the history of Wikipedia. The very idea of a single location that could store all the known information in the world began with Ptolemy I, founder of the Greek dynasty that ruled Egypt following the death of Alexander the Great. He and his son amassed hundreds of thousands of scrolls in the Library of Alexandria from 331 BC on. The Library was part of a great campus, the Musaeum, where they also supported great minds, starting with Ptolemy I’s patronage of Euclid, the father of geometry, and later including Archimedes, the father of engineering, Hipparchus, the founder of trigonometry, Hero, the father of math, and Herophilus, who gave us the scientific method, and countless other great Hellenistic thinkers. The Library entered into a slow decline that began with the expulsion of intellectuals from Alexandria in 145 BC. Ptolemy VIII was responsible for that. Always be wary of people who attack those that they can’t win over, especially when they start blaming the intellectual elite for the problems of the world. This began a slow decline of the library until it burned, first with a small fire accidentally set by Caesar in 48 BC and then for good in the 270s AD. In the centuries since there have been attempts here and there to gather great amounts of information. The first known encyclopedia was the Naturalis Historia by Pliny the Elder, never completed because he was killed in the eruption of Vesuvius. One of the better known is the Encyclopedia Britannica, which started off in 1768. Mass production of these was aided by the printing press, but given that there’s a cost to producing those materials and a margin to be made in their sale, that encouraged a somewhat succinct exploration of certain topics. The advent of the computer era of course led to encyclopedias on CD and then to online encyclopedias. Encyclopedias at the time employed experts in certain fields and paid them for compiling and editing articles for volumes that would then be sold. As we say these days, this was a business model just waiting to be disrupted. Jimmy Wales was moderating an online discussion board on Objectivism and happened across Larry Sanger in the early 90s. They debated and became friends. Wales started Nupedia, which was supposed to be a free encyclopedia, funded by advertising revenue. As it was to be free, they were to recruit thousands of volunteer editors. People of the caliber that had been previously hired to research and write articles for encyclopedias. Sanger, who was pursuing a PhD in philosophy from Ohio State University, was hired on as editor-in-chief. This was a twist on the old model of compiling an encyclopedia, and a twist that didn’t work out as intended. Volunteers were slow to sign up, but Nupedia went online in 2000. By later in the year, only two articles had made it through the review process. When Sanger told Ben Kovitz about this, he recommended looking at the emerging wiki culture. This had been started with WikiWikiWeb, developed by Ward Cunningham in 1994, named after a shuttle bus that ran between airport terminals at the Honolulu airport. WikiWikiWeb had been inspired by HyperCard but needed to be multi-user so people could collaborate on web pages, quickly producing content on new patterns in programming. He wanted to make non-writers feel OK about writing. 
Sanger proposed using a wiki to be able to accept submissions for articles and edits from anyone, but still having a complicated review process to accept changes. The reviewers weren’t into that, so they started a side project they called Wikipedia in 2001 with a user-generated model for content, or article, generation. The plan was to generate articles on Wikipedia and then move or copy them into Nupedia once they were ready. But Wikipedia got mentioned on Slashdot. In 2001 there were nearly 30 million websites but half a billion people using the web. Back then a mention on the influential Slashdot could make a site. And it certainly helped. They grew and more and more people started to contribute. They hit 1,000 articles in March of 2001, and that increased ten-fold by September, and another four-fold the next year. It started working independent of Nupedia. The dot-com bubble burst in 2000 and by 2002 Nupedia had to lay Sanger off and he left both projects. Nupedia slowly died and was finally shut down in 2003. Eventually the Wikimedia Foundation was built to help unlock the world’s knowledge, and it now owns and operates Wikipedia. Wikimedia also includes Commons for media, Wikibooks that includes free textbooks and manuals, Wikiquote for quotations, Wikiversity for free learning materials, MediaWiki, the source code for the site, Wikidata for pulling large amounts of data from Wikimedia properties using APIs, Wikisource, a library of free content, Wikivoyage, a free travel guide, Wikinews, free news, and Wikispecies, a directory containing over 687,000 species. Many of the properties have very specific ways of organizing data, making it easier to work with en masse. The properties have grown because people like to be helpful and Wales allowed self-governance of articles. To this day he rarely gets involved in the day-to-day affairs of the Wikipedia site, other than the occasional puppy dog looks in banners asking for donations. You should donate. He does have 8 principles the site is run by: 1. Wikipedia’s success to date is entirely a function of our open community. 2. Newcomers are always to be welcomed. 3. “You can edit this page right now” is a core guiding check on everything that we do. 4. Any changes to the software must be gradual and reversible. 5. The open and viral nature of the GNU Free Documentation License and the Creative Commons Attribution/Share-Alike License is fundamental to the long-term success of the site. 6. Wikipedia is an encyclopedia. 7. Anyone with a complaint should be treated with the utmost respect and dignity. 8. Diplomacy consists of combining honesty and politeness. This culminates in 5 pillars Wikipedia is built on: 1. Wikipedia is an encyclopedia. 2. Wikipedia is written from a neutral point of view. 3. Wikipedia is free content that anyone can use, edit, and distribute. 4. Wikipedia’s editors should treat each other with respect and civility. 5. Wikipedia has no firm rules. Sanger went on to found Citizendium, which uses real names instead of handles, thinking maybe people will contribute better content if their name is attached to something. The web is global. Throughout history there have been encyclopedias produced around the world, with the Four Great Books of Song coming out of 11th century China and the Encyclopedia of the Brethren of Purity coming out of 10th century Persia. When Wikipedia launched, it was in English. Wikipedia launched a German version using the deutsche.wikipedia.com subdomain. 
It now lives at de.wikipedia.org, and Wikipedia has gone from being 90% English to being almost 90% non-English, meaning that Wikipedia is able to pull in even more of the world’s knowledge. Wikipedia picked up nearly 20,000 English articles in 2001, over 75,000 new articles in 2002, and that number has steadily climbed, reaching over 3,000,000 by 2010, and we’re closing in on 6 million today. The English version is 10 terabytes of data uncompressed. If you wanted to buy a printed copy of Wikipedia today, it would run over 2,500 books. By 2009 Microsoft Encarta shut down. By 2010 Encyclopedia Britannica stopped printing their massive set of books and went online. You can still buy encyclopedias from specialty makers, such as the World Book. Ironically, Encyclopedia Britannica does now put the real names of people on articles they produce on their website, in an ad-driven model. There are a lot of ads. And the content isn’t linked to from as many places, nor is it as thorough. Creating a single location that could store all the known information in the world seems like a pretty daunting task. Compiling the non-copyrighted works of the world is now the mission of Wikipedia. The site receives the fifth most views per month on the web and is read by nearly half a billion people, with over 15 billion page views per month. Anyone who has gone down the rabbit hole of learning about Ptolemy I’s involvement in developing the Library of Alexandria and then read up on his children and how his dynasty lasted until Cleopatra and how… well, you get the point… can understand how they get so much traffic. Today there are over 48,000,000 articles and over 37,000,000 registered users who have contributed articles, meaning if we set 160 Great Libraries of Alexandria side-by-side we would have about the same amount of information Wikipedia has amassed. And it’s done so because of the contributions of so many dedicated people. People who spend hours researching and building pages, taking on the need to provide references to cite the data in the articles (by the way, Wikipedia is not supposed to represent original research), more people who patrol and look for content contributed by people on a soapbox or with an agenda rather than just reporting the facts. Another team looking for articles that need more information. And they do these things for free. While you can occasionally see frustrations from contributors, it is truly one of the best things humanity has done. This allows us to rediscover our own history, effectively compiling all the facts that make up the world we live in, often linked to the opinions that shape them in the reference materials, which include the over 200 million works housed at the US Library of Congress and over 25 million books scanned into Google Books (out of about 130 million). As with the Great Library of Alexandria, we do have to keep away those who seek to throw out the intellectuals of the world, and keep the great works being compiled from falling to waste due to inactivity. Wikipedia keeps a history of pages, to avoid revisionist history. The servers need to be maintained, but the database can be downloaded and is routinely downloaded by plenty of people. I think the idea of providing an encyclopedia for free that was sponsored by ads was sound. Pivoting the business model to make it open was revolutionary. 
With the availability of the data for machine learning and the ability to enrich it with other sources like genealogical research, actual books, maps, scientific data, and anything else you can manage, I suspect we’ll see contributions we haven’t even begun to think about! And thanks to all of this, we now have a real compendium of the world’s knowledge, getting more and more accurate and holistic by the day. Thank you to everyone involved, from Jimbo and Larry, to the moderators, to the staff, and of course to the millions of people who contribute pages about all the history that makes up the world as we know it today. And thanks to you for listening to yet another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day! Note: This work was produced in large part due to the compilation of historical facts available at https://en.wikipedia.org/wiki/History_of_Wikipedia


Learning Along the Oregon Trail

     9/20/2020

We've all played The Oregon Trail, but what do you know about its origins? First developed as a mainframe program all the way back in 1971, The Oregon Trail was intended as an educational game first and foremost. In fact, it traces its lineage to some of the first efforts to get computers into the classroom. Today we are following the trail back to its source and seeing how the proper environment was built to create this classic game.

You can play the 1975 version here: https://archive.org/details/OregonTrailMainframe 

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


Brad Chase Interview, Marketing Lead for Windows 95 and Much More

     7/5/2020

I recently got the chance to sit down and talk with Microsoft alumnus Brad Chase. He was the product manager for Microsoft Works on the Macintosh, DOS 5, and DOS 6, and the marketing lead for Windows 95, as well as much more. We talk about the Apple-Microsoft relationship, the groundbreaking launch of Windows 95, and what it takes to sell software.

Editing for this episode was handled by Franck; you can follow him on Instagram: www.instagram.com/frc.audio/

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


Floppy Days Episode 1

     2/17/2013

Discussion of what's to come in this new vintage computing podcast.  Also, a discussion of the vintage computers in the host's collection.


The History of Computing Ep 10: Computers and the Space Race

     3/25/2020

We go knee-deep into the computing technology available in the late 1950s and what it was used for: missiles and satellites.  We see the creation of the NASA RTCC in a muddy field and revisit what IBM is up to.


Magnetic: The History of Computing ep 4

     12/9/2019

This episode looks at some really interesting inventions involving the light bulb and electricity that played a major role in the development of the electronic computer.  This includes vacuum tubes and cathode ray tubes (CRTs).


ITS: Open Computing

     1/11/2021

Modern operating systems adhere to a pretty rigid formula. They all have users with password-protected accounts and secure files. They all have restrictions to keep programs from breaking stuff. That design has been common for a long time, but that doesn't make it the best solution. In the late 60s ITS, the Incompatible Timesharing System, was developed as a more exciting alternative. ITS was built for hackers to play: there were no passwords, and anyone who could find ITS was welcome to log in.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


Lars Brinkhoff Interview, Preserving ITS

     1/18/2021

Lars Brinkhoff has been spearheading the effort to keep the Incompatible Timesharing System alive. Today we sit down to talk about the overall ITS restoration project, software preservation, and how emulation can help save the past.

You can find the full restoration project at github: https://github.com/PDP-10/its

And follow Lars on twitter: @larsbrinkhoff


On Chariots of the Gods?

     2/6/2021

Humanity is searching for meaning. We binge tv shows. We get lost in fiction. We make up amazing stories about super heroes. We hunt for something deeper than what’s on the surface. We seek conspiracies or... aliens.

I recently got around to reading a book that had been on my list for a long time. Not because I thought I would agree with its assertions - but because it came up from time to time in my research. 

Chariots of the Gods? is a book written in 1968 by the German author Erich Von Daniken. He goes through a few examples to, in his mind, prove that aliens not only had been to Earth but that they destroyed Sodom with fire and brimstone, which he said was a nuclear explosion. He also says the Ark of the Covenant was actually a really big walkie-talkie for calling space. 

Ultimately, the thesis centers around the idea that humans could not possibly have made the technological leaps we did, and so our technology must have been given to us by the gods. I find this to be a perfectly satisfactory science fiction plot. In fact, various alien conspiracy theories seemed to begin soon after Orson Welles’ 1938 live adaptation of H.G. Wells’ War of the Worlds and, like a virus, they mutated. But did this alien virus start in a bat in Wuhan, or in Roman Syria? 

The ancient Greeks and then Romans had a lot of gods, and Lucian of Samosata, writing in what is now Syria in the second century AD, thought they should have a couple more. He wove together a story, which he called “A True Story.” In it, he says up front that it’s all make-believe. In the satire, Lucian and crew get taken to the Moon, where they get involved in a war between the kings of the Moon and the Sun for the rights to colonize the Morning Star. They then get eaten by a whale, escape, and travel on, meeting great Greeks across time including Pythagoras, Homer, and Odysseus. And they find the new world. Think of how many modern plots are wrapped up in that book from the second century, one made to effectively make fun of storytellers like Homer.

The 1800s was one of the first centuries in which humanity saw a rapid merger and explosion of scientific understanding, and Edgar Allan Poe again took us to the moon in "The Unparalleled Adventure of One Hans Pfaall" in 1835. Then came Jules Verne, Mary Shelley, and H.G. Wells with that War of the Worlds in 1898. By then we’d mapped the surface of the moon with telescopes, so they wrote of Mars and further. H.P. Lovecraft gave us the Call of Cthulhu. These authors predicted the future - but science fiction became a genre that did more. It helped us create satire or allegory or just comparisons to these rapid global changes in ways that called out the social impact to consider before or after we invent. And to just cope with evolving social norms. The magazine Amazing Stories came in 1926, and the greatest work of science fiction premiered in 1942 with Isaac Asimov’s Foundation. Science fiction was opening our eyes to what was possible and opened the minds of scientists to study what we might create in the future. But it wasn’t real. 

Von Daniken and French author Robert Charroux seemed to influence one another in taking history and science and turning them into pseudohistory and pseudoscience. And both got many of their initial ideas from the 1960 book The Morning of the Magicians. But Chariots of the Gods? was a massive success and a best seller. And rather than being dismissed, it has since spread into conspiracy and other theories. Which is fine as fiction, but not as non-fiction. 

Let’s look at some other specific examples from Chariots of the Gods? Von Daniken claims that Japanese Dogu figures were carvings of aliens. He claims there were alien helicopter carvings in an Egyptian temple. He claims the Nazca lines in Peru were a way to call aliens, and that a map from 1513 actually showed the earth from space, rather than thinking it possible that cartography in the Age of Discovery was capable of a somewhat accurate representation of the world. He claimed stories in the Bible were often inspired by alien visits, much as some First Nations peoples and cargo cults thought people in ships visiting their lands for the first time might be gods. 

The one thing I’ve learned researching these episodes is that technology has been a constant evolution. Many of our initial discoveries like fire, agriculture, and using the six simple machines could be observed in nature. From the time we learned to make fire, it was only a matter of time before humanity discovered that stones placed in or around fire might melt in certain ways - and so metallurgy was born. We went through population booms as we discovered each of these.

We used the myths and legends that became religions to hand down knowledge, as I was taught to use mnemonics to memorize the seven layers of the OSI model. That helped us preserve knowledge of astronomy across generations so we could explore further and better maintain our crops. 

The ancient Sumerians and then Babylonians gave us writing. But we had been drawing on caves for thousands of years. Which seems more likely: that we were gifted this advance, or that as we began to settle in denser urban centers we needed to scale our operations and so tracked the number of widgets we had with markings that, over time, evolved into a written language? First through pictures and then through words that evolved into sentences and then epics. We could pass down information more reliably across generations. 

Trade and commerce and then ziggurats and pyramids helped hone our understanding of mathematics. The study of logic and automata allowed us to build bigger and faster and process more raw materials. Knowledge of all of these discoveries spread across trade routes. 

So ask yourself this. Which is more likely, the idea that humans maintained a constant, ever-evolving stream of learned ingenuity that was passed down for tens of thousands of years until it accelerated when we learned to write, or do you think aliens from outer space instead gave us technology? 

I find it revokes our very agency to assert anything but the idea that humans are capable of the fantastic feats we have achieved, and I find it insulting to take away from the great philosophers, discoverers, scientists, and thinkers that got us where we are today. 

Our species has long made up stories to explain that which the science of the day cannot. Before we understand the why, we make up stories about the how. This allowed us to pass knowledge down between generations. We see this in ancient explanations of the movements of stars before we had astrolabes. We see humans wanting to leave something behind that helps the next generations, or burial sites like Stonehenge - not a way to summon Thor from an alien planet, as Marvel has rewritten its own epics to indicate, in part based on rethinking these mythos in the context of Chariots of the Gods?

Ultimately, the greater our gaps in understanding, the more disconnected from ourselves I find most people are. We listen to talking heads rather than think for ourselves. We get lost in theories of cabals. We seek a deeper, missing knowledge because we can’t understand everything in front of us. 

Today, if we know where to look, and can decipher the scientific jargon, all the known knowledge of science and history is at our fingertips. But it can take a lifetime to master one of thousands of fields of scientific research. If we don’t have that specialty then we can perceive it as unreachable and think maybe this pseudohistorical account of humanity is true, and maybe aliens really did give us our technology. 

If we feel left behind then it becomes easier to blame others when we can’t get below the surface of complicated concepts. Getting left behind might mean that jobs don’t pay what they paid our parents. We may perceive others as getting attention or resources we feel we deserve. We may feel isolated and alone. And all of those are valid feelings. When they’re heard then maybe we can look to the future instead of accepting pseudoscience and pseudohistory and conspiracies. Because while they make for fun romps on the big screen, they’re dangerous when taken as fact.


(OldComputerPods) ©Sean Haas, 2020