
Agile Software Development

     9/19/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on Agile software development. Agile software development is a methodology, or anti-methodology, or approach to software development that evolves the requirements a team needs to fulfill and the solutions they need to build in a collaborative, self-organized, and cross-functional way. Boy, that’s a lot to spit out there. I was in an elevator the other day and I heard someone say: “That’s not very agile.” And at that moment, I knew that I just couldn’t help but do an episode on agile. I’ve worked in a lot of teams that use a lot of variants of agile: scrum, Kanban, scrumban, Extreme Programming, Lean Software Development. Some of these are almost polar opposites, and you still hear people talk about what is agile, and if they want to make fun of people doing things an old way, they’ll say something like “waterfall.” Nothing ever really was waterfall, given that you learn on the fly, find reusable bits, or hit a place where you just say that’s not possible. But that’s another story. The point here is that agile is, well, weaponized to back up what a person wants someone to do, or how they want a team to be run. And it isn’t always done from an informed point of view. Why is agile an anti-methodology? Think of it more like a classification, maybe. There were a number of methodologies, like Extreme Programming, Scrum, Kanban, Feature Driven Development, Adaptive Software Development, RAD, and Lean Software Development. These had come out to bring shape around a very similar idea. But over the course of 10-20 years, each had been developed in isolation. In college, I had a computer science professor who talked about “adaptive software development” from his days at a large power company in Georgia back in the 70s. 
Basically, you are always adapting what you’re doing based on speculation about how long something will take, collaboration on that observation, and what you learn while actually building. This shaped how I view software development for years to come. He was already making fun of waterfall methodologies, or a cycle where you write a large set of requirements and stick to them. Waterfall worked well if you were building a computer to land people on the moon. Rejecting it was a way of saying “we’re not engineers, we’re software developers.” Later in college, with the rapid proliferation of the Internet and computers into dorm rooms, I watched the emergence of rapid application development, where you let the interface requirements determine how you build. But once someone weaponized that by putting a label on it, or worse, forking the label into spiral and unified models, then they became much less useful and the next hot thing had to come along. Kent Beck built a methodology called Extreme Programming, or XP for short, in 1996, and that was the next hotness. Here, we release software in shorter development cycles, and software developers, like police officers on patrol, work in pairs, reviewing and testing code and not writing each feature until it’s required. The ideas of unit testing and rapid releases really came out of the fact that the explosion of the Internet in the 90s meant people had to ship fast, and this was also during the rise of really mainstream object-oriented programming languages. The nice thing about XP was that you could show a graph where you planned, managed, designed, coded, and tested your software. The rules of Extreme Programming included things like “Code the unit test first” and “A stand up meeting starts each day.” Extreme Programming is just one of these methodologies. Scrum is probably the one most commonly used today. But the rest, as well as the Crystal family of methodologies, are now classified as Agile software development methodologies. 
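To make that “Code the unit test first” rule concrete, here’s a minimal sketch of the test-first rhythm. The `slugify` function and its test are hypothetical examples of mine, not anything from the episode: the test is written before the code it exercises, and the implementation is the simplest thing that makes it pass.

```python
# XP's rule: code the unit test first. The test below is written before
# any implementation of slugify exists; slugify itself is a hypothetical
# example function, not something from the episode.

def test_slugify():
    # The behavior we want, written down before writing the code.
    assert slugify("The History Of Android") == "the-history-of-android"
    assert slugify("  Once Upon A Friendster  ") == "once-upon-a-friendster"

# Only now do we write the simplest implementation that passes the test.
def slugify(title: str) -> str:
    """Turn an episode title into a lowercase, hyphen-separated slug."""
    return "-".join(title.lower().split())

test_slugify()  # a failing run here would drive the next change
```

The point of the ordering is that the test pins down the requirement first, so you never write a feature until something demands it.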
So it’s like a parent classification. Is agile really just a classification, then? No. So where did agile come from? By 2001, Kent Beck, who developed Extreme Programming, met with Ward Cunningham (who built WikiWikiWeb, the first wiki), Dave Thomas, a programmer who has since written 11 books, Jeff Sutherland and Ken Schwaber, who designed Scrum, and Jim Highsmith, who developed the Adaptive Software Development methodology. They and many others were at the time involved in trying to align on an organizational methodology that allowed software developers to stop acting like people who built bridges or large buildings. Most had day jobs, but they were like-minded and decided to meet at a quaint resort in Snowbird, Utah. They might have all wanted to use the methodologies that each of them had developed. But if they had all been jerks, then they might not have had a shift in how software would be written for the next 20+ years. They decided to start with something simple: a statement of values. Instead of bickering and being dug into specific details, they were all able to agree that software development should not be managed in the same fashion as engineering projects are run. So they gave us the Manifesto for Agile Software Development. The Manifesto reads: We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
* Individuals and interactions over processes and tools
* Working software over comprehensive documentation
* Customer collaboration over contract negotiation
* Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more. But additionally, the principles dig into and expand upon some of that adjacently. The principles behind the Agile Manifesto: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software. Welcome changing requirements, even late in development. 
Agile processes harness change for the customer's competitive advantage. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference for the shorter timescale. Business people and developers must work together daily throughout the project. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation. Working software is the primary measure of progress. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely. Continuous attention to technical excellence and good design enhances agility. Simplicity--the art of maximizing the amount of work not done--is essential. The best architectures, requirements, and designs emerge from self-organizing teams. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly. Many of the words here are easily weaponized. For example, “satisfy the customer.” Who’s the customer? The product manager? The end user? The person in an enterprise who actually buys the software? The person in that IT department that made the decision to buy the software? In the scrum methodology, the customer is not known; the product owner is their representative. But the principles don’t need to identify that; they just use the word so each methodology makes sure to cover it. Now take “continuous delivery.” People frequently just lump CI in there with CD. I’ve heard continuous design, continuous improvement, continuous deployment, continuous podcasting. Wait, I made the last one up. We could spend hours going through each of these and identifying where they aren’t specific enough. 
Or, again, we could revel in their lack of specificity by pointing us in the direction of a methodology where these words get much more specific meanings. Ironically, I know accounting teams at very large companies that have scrum masters, engineering teams for big projects with a project manager and a scrum master, and even a team of judges that use agile methodologies. There are now scrum masters embedded in most software teams of note. But once you see Agile on the cover of the Harvard Business Review, you have to start wondering what’s next (and you hate to do this, given all the classes in agile/XP/scrum). For 20 years, we’ve been saying “stop treating us like engineers” or “that’s waterfall.” Every methodology seems to grow. Right after I finished my PMP, I was on a project with someone else who had just finished theirs. I think they tried to implement the entire Project Management Body of Knowledge. If you try to have every ceremony from Scrum, you’re not likely to even have half a day left over to write any code. But you also don’t want to be like the person on the elevator, weaponizing only small parts of a larger body of work just to get your way. And more importantly, we should admit that none of us have all the right answers and be ready to, as they say in Extreme Programming, fix XP when it breaks, which is similar to Boyd’s Destruction and Creation, or the sustenance and destruction in Lean Six Sigma. Many of us forget that last part: be willing to walk away from the dogma and start over. Thomas Jefferson called for a revolution every 20 years. We have two years to come up with a replacement! And until you replace me, thank you so very much for tuning into another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day!


Wikipedia

     9/2/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on the history of Wikipedia. The very idea of a single location that could store all the known information in the world began with Ptolemy I, founder of the Greek dynasty that ruled Egypt following the death of Alexander the Great. He and his son amassed hundreds of thousands of scrolls in the Library of Alexandria from 331 BC on. The Library was part of a great campus, the Musaeum, where they also supported great minds, starting with Ptolemy I’s patronage of Euclid, the father of geometry, and later including Archimedes, the father of engineering, Hipparchus, the founder of trigonometry, Hero, the father of math, and Herophilus, who gave us the scientific method, and countless other great Hellenistic thinkers. The Library entered into a slow decline that began with the expulsion of intellectuals from Alexandria in 145 BC; Ptolemy VIII was responsible for that. Always be wary of people who attack those they can’t win over, especially when they start blaming the intellectual elite for the problems of the world. The decline continued until the Library burned, first in a small fire accidentally set by Caesar in 48 BC and then for good in the 270s AD. In the centuries since, there have been attempts here and there to gather great amounts of information. The first known encyclopedia was the Naturalis Historia by Pliny the Elder, never completed because he was killed in the eruption of Vesuvius. One of the better known is the Encyclopedia Britannica, which started off in 1768. Mass production of these was aided by the printing press, but given that there’s a cost to producing those materials and a margin to be made in their sale, that encouraged a somewhat succinct exploration of certain topics. 
The advent of the computer era of course led to encyclopedias on CD and then to online encyclopedias. Encyclopedias at the time employed experts in certain fields and paid them for compiling and editing articles for volumes that would then be sold. As we say these days, this was a business model just waiting to be disrupted. Jimmy Wales was moderating an online discussion board on Objectivism and happened across Larry Sanger in the early 90s. They debated and became friends. Wales started Nupedia, which was supposed to be a free encyclopedia funded by advertising revenue. As it was to be free, they were to recruit thousands of volunteer editors, people of the caliber that had previously been hired to research and write articles for encyclopedias. Sanger, who was pursuing a PhD in philosophy from Ohio State University, was hired on as editor-in-chief. This was a twist on the old model of compiling an encyclopedia, and a twist that didn’t work out as intended. Volunteers were slow to sign up, but Nupedia went online in 2000. By later in the year, only two articles had made it through the review process. When Sanger told Ben Kovitz about this, he recommended looking at the emerging wiki culture. This had been started with WikiWikiWeb, developed by Ward Cunningham in 1994 and named after a shuttle bus that ran between airport terminals at the Honolulu airport. WikiWikiWeb had been inspired by HyperCard but needed to be multi-user so people could collaborate on web pages, quickly producing content on new patterns in programming. He wanted to make non-writers feel OK about writing. Sanger proposed using a wiki to accept submissions for articles and edits from anyone, while still having a complicated review process to accept changes. The reviewers weren’t into that, so they started a side project they called Wikipedia in 2001, with a user-generated model for content, or article, generation. 
The plan was to generate articles on Wikipedia and then move or copy them into Nupedia once they were ready. But Wikipedia got mentioned on Slashdot. In 2001 there were nearly 30 million websites and half a billion people using the web. Back then a mention on the influential Slashdot could make a site. And it certainly helped. They grew, and more and more people started to contribute. They hit 1,000 articles in March of 2001; that increased tenfold by September, and another fourfold the next year. It started working independently of Nupedia. The dot-com bubble had burst in 2000, and by 2002 Nupedia had to lay Sanger off, and he left both projects. Nupedia slowly died and was finally shut down in 2003. Eventually the Wikimedia Foundation was built to help unlock the world’s knowledge; it now owns and operates Wikipedia. Wikimedia also includes Commons for media; Wikibooks, which includes free textbooks and manuals; Wikiquote for quotations; Wikiversity for free learning materials; MediaWiki, the source code for the site; Wikidata, for pulling large amounts of data from Wikimedia properties using APIs; Wikisource, a library of free content; Wikivoyage, a free travel guide; Wikinews, free news; and Wikispecies, a directory containing over 687,000 species. Many of the properties have very specific ways of organizing data, making it easier to work with en masse. The properties have grown because people like to be helpful and Wales allowed self-governance of articles. To this day he rarely gets involved in the day-to-day affairs of the Wikipedia site, other than the occasional puppy-dog looks in banners asking for donations. You should donate. He does have 8 principles the site is run by: 1. Wikipedia’s success to date is entirely a function of our open community. 2. Newcomers are always to be welcomed. 3. “You can edit this page right now” is a core guiding check on everything that we do. 4. Any changes to the software must be gradual and reversible. 5. 
The open and viral nature of the GNU Free Documentation License and the Creative Commons Attribution/Share-Alike License is fundamental to the long-term success of the site. 6. Wikipedia is an encyclopedia. 7. Anyone with a complaint should be treated with the utmost respect and dignity. 8. Diplomacy consists of combining honesty and politeness. This culminates in the five pillars Wikipedia is built on: 1. Wikipedia is an encyclopedia. 2. Wikipedia is written from a neutral point of view. 3. Wikipedia is free content that anyone can use, edit, and distribute. 4. Wikipedia’s editors should treat each other with respect and civility. 5. Wikipedia has no firm rules. Sanger went on to found Citizendium, which uses real names instead of handles, thinking maybe people will contribute better content if their name is attached to it. The web is global. Throughout history there have been encyclopedias produced around the world, with the Four Great Books of Song coming out of 11th century China and the Encyclopedia of the Brethren of Purity coming out of 10th century Persia. When Wikipedia launched, it was in English. Wikipedia launched a German version using the deutsche.wikipedia.com subdomain; it now lives at de.wikipedia.org. Wikipedia has gone from being 90% English to being almost 90% non-English, meaning that Wikipedia is able to pull in even more of the world’s knowledge. Wikipedia picked up nearly 20,000 English articles in 2001 and over 75,000 new articles in 2002, and that number steadily climbed, reaching over 3,000,000 by 2010; we’re closing in on 6 million today. The English version is 10 terabytes of data uncompressed. If you wanted to buy a printed copy of Wikipedia today, it would be over 2,500 books. By 2009 Microsoft Encarta had shut down. By 2010 Encyclopedia Britannica stopped printing their massive set of books and went online. You can still buy encyclopedias from specialty makers, such as the World Book. 
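The episode mentions that Wikimedia properties expose APIs for pulling data en masse. As a small, hedged illustration: the endpoint below is the well-known MediaWiki Action API, but the specific query parameters are my recollection of the common intro-extract request rather than anything stated in the episode, and the response is canned so the sketch runs offline.

```python
import json
from urllib.parse import urlencode

# The MediaWiki Action API endpoint for English Wikipedia.
API = "https://en.wikipedia.org/w/api.php"

def build_extract_url(title: str) -> str:
    """Build an Action API URL asking for a plain-text intro extract.
    The parameter set here is an assumption from memory, not from the text."""
    params = {
        "action": "query",
        "prop": "extracts",
        "exintro": 1,        # only the lead section
        "explaintext": 1,    # plain text, no HTML
        "titles": title,
        "format": "json",
    }
    return API + "?" + urlencode(params)

def first_extract(payload: str) -> str:
    """Pull the extract text out of an API JSON response."""
    pages = json.loads(payload)["query"]["pages"]
    return next(iter(pages.values()))["extract"]

# A canned response, standing in for a live call, so nothing touches the network:
sample = '{"query": {"pages": {"1": {"extract": "Wikipedia is a free online encyclopedia."}}}}'
print(first_extract(sample))
```

In practice you would fetch `build_extract_url(...)` with any HTTP client and feed the body to `first_extract`; Wikidata and the other properties have their own, differently shaped APIs.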
Ironically, Encyclopedia Britannica does now put the real names of people on articles they produce on their website, in an ad-driven model. There are a lot of ads. And the content isn’t linked to from as many places, nor is it as thorough. Creating a single location that could store all the known information in the world seems like a pretty daunting task. Compiling the non-copyrighted works of the world is now the mission of Wikipedia. The site receives the fifth most views of any site and is read by nearly half a billion people a month, with over 15 billion page views per month. Anyone who has gone down the rabbit hole of learning about Ptolemy I’s involvement in developing the Library of Alexandria, and then read up on his children and how his dynasty lasted until Cleopatra, and how… well, you get the point… can understand how they get so much traffic. Today there are over 48,000,000 articles and over 37,000,000 registered users who have contributed articles, meaning if we set 160 Great Libraries of Alexandria side by side we would have about the same amount of information Wikipedia has amassed. And it’s done so because of the contributions of so many dedicated people: people who spend hours researching and building pages, taking on the need to provide references to cite the data in the articles (by the way, Wikipedia is not supposed to represent original research); more people who patrol and look for content contributed by people on a soapbox or with an agenda rather than just reporting the facts; and another team looking for articles that need more information. And they do these things for free. While you can occasionally see frustrations from contributors, it is truly one of the best things humanity has done. 
This allows us to rediscover our own history, effectively compiling all the facts that make up the world we live in, often linked to the opinions that shape them in the reference materials, which include the over 200 million works housed at the US Library of Congress and over 25 million books scanned into Google Books (out of about 130 million). As with the Great Library of Alexandria, we do have to keep away those who seek to throw out the intellectuals of the world, and keep the great works being compiled from falling to waste due to inactivity. Wikipedia keeps a history of pages, to avoid revisionist history. The servers need to be maintained, but the database can be downloaded and is routinely downloaded by plenty of people. I think the idea of providing an encyclopedia for free, sponsored by ads, was sound. Pivoting the business model to make it open was revolutionary. With the availability of the data for machine learning and the ability to enrich it with other sources like genealogical research, actual books, maps, scientific data, and anything else you can manage, I suspect we’ll see contributions we haven’t even begun to think about! And thanks to all of this, we now have a real compendium of the world’s knowledge, getting more and more accurate and holistic by the day. Thank you to everyone involved, from Jimbo and Larry, to the moderators, to the staff, and of course to the millions of people who contribute pages about all the history that makes up the world as we know it today. And thanks to you for listening to yet another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day! Note: This work was produced in large part due to the compilation of historical facts available at https://en.wikipedia.org/wiki/History_of_Wikipedia


Once Upon A Friendster

     8/17/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on a former social networking pioneer, Friendster. Today when you go to friendster.com you get a page saying the social network is taking a break. The post was put up in 2018. How long did Rip Van Winkle sleep? But what led to the rise of the first big social network, and, well, what happened? The story begins in 1973. Talkomatic was a chat room that was a hit in the PLATO (Programmed Logic for Automatic Teaching Operations) community at the University of Illinois, an educational learning system that had been running since 1960. Dave Woolley and Douglas Brown at the University of Illinois brought chat, and then the staff built TERM-Talk the same year, adding screen sharing; PLATO Notes would be added later, where you could add notes to your profile. This was the inspiration for the name of Lotus Notes. Then in the 80s came Bulletin Board Systems; 84 brought FidoNet, 88 brought IRC, 96 brought ICQ, and in 96 we got Bolt.com, the first social networking and video website, with SixDegrees coming in 1997 as the first real social media website. AOL Instant Messenger showed up the same year, and AOL bought ICQ in 99. It was pretty sweet that I didn’t have to remember all those ICQ numbers any more! In 1999, Yahoo! and Microsoft got in the game, launching tools called Messenger at about the same time, and LiveJournal came along, as well as Habbo, a social networking site for games. By 2001 SixDegrees shut down and Messenger shipped with XP. But 2002. That was the year the Euro hit the street, before England dissed it. That was the year Israeli and Palestinian conflicts escalated. Actually, that’s a lot of years, regrettably. 
I remember scandals at Enron and Worldcom well that year, ultimately resulting in Sarbanes-Oxley to counter the more than 5 trillion dollars in corporate scandals that sent the economy into a tailspin. My Georgia Bulldogs football team beat Arkansas to win the SEC title and then beat Florida State in the Sugar Bowl. Nelly released Hot In Herre and Eminem released Lose Yourself and Without Me. In film, Harry Potter was searching for the Chamber of Secrets and Frodo was on a great trek to the Two Towers. Eminem was in the theaters as well with 8 Mile. And Friendster was launched by Jonathan Abrams in Mountain View, California. They wanted to get people making new friends and meeting in person. It was an immediate hit and people flocked to the site. They grew to three million users in just a few months, catching the attention of investors. As a young consultant, I loved keeping track of friends I never got to see in person using Friendster. Napster was popular at the time, and the name Friendster came from a mashup of friends and Napster. With this early success, Friendster took $12 million in funding from VC firms Kleiner Perkins Caufield & Byers and Benchmark Capital the next year. That was the year a Harvard student named Mark Zuckerberg launched FaceMash with his roommate Eduardo Saverin for Harvard students, in a kinda’ “Hot or Not” game. They would later buy Instagram as a form of euphoric recall, looking back on those days. Google has long wanted a social media footprint and tried to buy Friendster in 2003, but when rejected launched Orkut in 2004, which mostly just ran in Brazil; tried Google Friend Connect in 2008, which lasted until 2012; Google Buzz, which launched in 2010 and only lasted a year; Google Wave, which launched in 2009 and also only lasted a year; and of course Google+, which ran from 2011 to 2019. Google is back at it again with a new social network called Shoelace out of their Area 120 incubator. 
The $30 million in Google stock that Friendster turned down would be worth a billion dollars today. MySpace was also launched in 2003, by Chris DeWolfe and Tom Anderson, growing to have more traffic than Google over time. But Facebook launched in 2004, and after having problems keeping the servers up and running, Friendster’s board replaced Abrams as CEO and moved him to chairman of the board. He was replaced by Scott Sassa. Then in 2005 Sassa was replaced by Taek Kwon, who was replaced by Kent Lindstrom, who was replaced by Richard Kimber. Such rapid churn in the top spot means problems: a rudderless ship. In 2006 they added widgets to keep up with MySpace. They didn’t. They also opened up a developer program and opened up APIs. They still had 52 million unique visitors worldwide in June 2008. But by then, MySpace had grown to 7 times their size. MOL Global, an online payments processor from Malaysia, bought the company in 2009 and relaunched the site. All user data was erased, and Friendster provided an export tool to move data to other popular sites at the time, such as Flickr. In 2009 Friendster had 3 million unique visitors per day, but after the relaunch that dropped to less than a quarter million by the end of 2010. People abandoned the network. What happened? Facebook eclipsed Friendster’s traffic in 2009. Friendster became something more used in Asia than the US. Really, though, I remember early technical problems. I remember not being able to log in, so moving over to MySpace. I remember slow loading times. And I remember more and more people spending time on MySpace, customizing their MySpace page. Facebook did something different. Sure, you couldn’t customize the page, but the simple layout loaded fast and was always online. This reminds me of the scene in the show Silicon Valley, when they have to grab the fire extinguisher because they set the house on fire from having too much traffic! 
In 2010, Facebook acquired Friendster’s portfolio of social networking patents for $40 million. In 2011, Newscorp sold MySpace for $35 million, after it had been valued at far more at its peak in 2008. After continuing its decline, Friendster was sold to a social gaming site in 2015, trying to capitalize on the success Facebook had with online gaming. But after an immediate burst of users, it too was not successful. In 2018 the site finally closed its doors. Today Friendster is the 651,465th ranked site in the world. There are a few things to think about when you look at the Friendster story: 1. The Internet would not be what it is today without sites like Friendster to help people want to be on it. 2. The first company on a new thing isn’t always the one that really breaks through. 3. You have to, and I mean have to, keep your servers up. This is a critical aspect of maintaining your momentum. I was involved with one of the first 5 Facebook apps, and we had no idea 2 million people would use that app in the weekend it was launched. We moved mountains to get more servers and clusters brought online and refactored SQL queries on the fly, working over 70 hours in a weekend. And within a week we hit 10 million users. That app paid for dozens of other projects and was online for years. 4. When investors move in, the founder usually gets fired at the first sign of trouble. Many organizations simply can’t find their equilibrium after that and flounder. 5. Last but not least: don’t refactor every year, but if you can’t keep your servers up, you might just have too much technical debt. I’m sure everyone involved with Friendster wishes they could go back and do many things differently. But hindsight is always 20/20. They played their part in the advent of the Internet. Without early pioneers like Friendster we wouldn’t be where we are today. 
As Heinlein said, they were “yet another crew of Rip Van Winkles.” But Buck Rogers eventually did actually wake back up, and maybe Friendster will as well. Thank you for tuning into another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day!


The History Of Android

     8/22/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because by understanding the past, we’re able to be prepared for the innovations of the future! Today we’re going to look at the emergence of Google’s Android operating system. Before we look at Android, let’s look at what led to it. Frank Canova built a device he showed off as “Angler” at COMDEX in 1992. This would be released as the Simon Personal Communicator by BellSouth and manufactured as the IBM Simon by Mitsubishi. The Palm, Newton, Symbian, and Pocket PC (or Windows CE) would come out shortly thereafter and rise in popularity over the next few years. CDMA would slowly come down in cost over the next decade. Now let’s jump to 2003. At the time, you had Microsoft Windows CE; the Palm Treo was maturing and supported dual-band GSM; Handspring merged into the Palm hardware division; and Symbian could be licensed, but I never met a phone of theirs I liked. The Nokia phones looked about the same as many printer menu screens. One other device that is more relevant because of the humans behind it was the T-Mobile Sidekick, which actually had a cool flippy motion to open the keyboard! Keep that Sidekick in mind for a moment. Oh, and let’s not forget a fantastic name. The mobile operating systems were limited. Each was proprietary. Most were menu driven and reminded us more of an iPod, released in 2001. I was a consultant at the time and remember thinking it was insane that people would pay hundreds of dollars for a phone. At the time, flip phones were all the rage. A cottage industry of applications sprang up, like Notify, that made use of app frameworks on these devices to connect my customers to their Exchange accounts so their calendars could sync wirelessly. The browsing experience wasn’t great. The messaging experience wasn’t great. The phones were big and clunky. 
And while you could write apps for Symbian in Qt Creator or Flash Lite or Python for S60, few bothered. That’s when Andy Rubin left Danger, the company he cofounded that made the Sidekick, and joined up with Rich Miner, Nick Sears, and Chris White in 2003 to found a little company called Android Inc. They wanted to make better mobile devices than were currently on the market, and they set out to write an operating system based on Linux that could rival anything out there. Rubin was no noob when cofounding Danger. He had been a robotics engineer in the 80s and a manufacturing engineer at Apple for a few years, and then got his first mobility engineering gig when he bounced to General Magic to work on Magic Cap, a spinoff from Apple, from 92 to 95. He then helped build WebTV from 95 to 99. Many in business academia have noted that Android existed before Google and that’s why it’s as successful as it is today. But Google bought Android in 2005, years before the actual release of Android. Apple had long been rumor-milled to be building a phone, which would mean a mobile operating system as well. Android was sprinting toward a release that was somewhat Blackberry-like, focused on competing with similar devices on the market at the time, like the Blackberries that were all the rage. Obama and Hillary Clinton were all about theirs. As a consultant, I was stoked to become a Blackberry Enterprise Server reseller and used that to deploy all the things. The first iPhone was released in 2007. I think we sometimes think that along came the iPhone and Blackberries started to disappear. It took years. But the fall was fast. While the iPhone was also impactful, the Android-based devices were probably more so. That release of the iPhone kicked Andy Rubin in the keister and he pivoted over from the Blackberry-styled keyboard to a touch screen, which changed… everything. Suddenly this weird innovation wasn’t yet another frivolous expensive Apple extravagance. 
The logo helped grow the popularity as well, I think. Internally at Google, Dan Morrill started creating what were known as Dandroids. But the bugdroid, as it’s known, was designed by Irina Blok on the Android launch team. It was eventually licensed under Creative Commons, which resulted in lots of different variations of the logo; a sharp contrast to the control Apple puts around the usage of its own logo.

The first version of the shipping Android code came along in 2008, and the first phone that really shipped with it wasn’t until the HTC Dream in 2009. This device had a physical keyboard you could press but also had a touch screen, although we hadn’t gotten a virtual keyboard yet. It shipped with an ARM11, 192MB of RAM, and 256MB of storage. But you could expand it up to 16 gigs with a microSD card. Oh, and it had a trackball. It had 802.11b and g, Bluetooth, and shipped with Android 1.0. But it could be upgraded up to 1.6, Donut. The hacker in me just… couldn’t help but mod the thing, much as I couldn’t help but jailbreak the iPhone back before I got too lazy not to. Of course, the Dev Phone 1, which didn’t require you to hack it, shipped soon after; something Apple waited until 2019 to copy.

The screen was smaller than that of an iPhone. The keyboard felt kinda junky. The app catalog was lacking. It didn’t really work well in an office setting. But it was open source. It was a solid operating system and it showed promise as to the future of not-Apple in a post-BlackBerry world. Note: any time a politician uses a technology, it’s about 5 minutes past being dead tech. Of BlackBerry, iOS, and Android, Android was last in devices sold across those platforms in 2009, although the G1, as the Dream was also known, took 9% market share quickly. But then came Eclair. Unlike sophomore efforts from bands, there’s something about a 2.0 release of software. By the end of 2010 there were more Androids than iOS devices.
2011 was the peak year of BlackBerry sales, with over 50 million sold, but those were the laggards spinning out of the buying tornado, and that buying funded the R&D for the fruitless next few BlackBerry releases. BlackBerry market share would zero out in just 6 short years. The iPhone continued a nice climb over the next 8 years. But Android sales are now in the billions per year. Ultimately, to quote Time, BlackBerry’s “failure to keep up with Apple and Google was a consequence of errors in its strategy and vision.” If you had to net-net that, touch vs menus was a substantial part of it. By 2017 the combined Android and iOS market share was 99.6%.

In 2013, Sundar Pichai, now Google’s CEO, took over Android; Andy Rubin, who was later embroiled in sexual harassment allegations, now acts as CEO of Playground Global, an incubator for hardware startups. The open source nature of Android, and it being ready to fit into devices from manufacturers like HTC, led to advancements that inspired and were inspired by the iPhone, leading us to the state we’re in today.

Let’s look at the releases per year and per innovation:

* 1.0, API 1, 2008: Included early Google apps like Gmail, Maps, Calendar, of course a web browser, a media player, and YouTube.
* 1.1, 2009: Came in February the next year and was code-named Petit Four.
* 1.5 Cupcake, 2009: Gave us an on-screen keyboard and third-party widgets, then apps on the Android Market, now known as the Google Play Store. Thus came the HTC Dream. Open source everything.
* 1.6 Donut, 2009: Customizable screen sizes and resolutions, CDMA support. And the short-lived Dell Streak! Because of this resolution work we got the joy of learning all about the tablet. Oh, and Universal Search and more emphasis on battery usage!
* 2.0 Eclair, 2009: The advent of the Motorola Droid, turn-by-turn navigation, real-time traffic, live wallpapers, speech to text. But the pinch-to-zoom from iOS sparked a war with Apple. We also got the ability to limit accounts.
Oh, and new camera modes that would have impressed even George Eastman, plus Bluetooth 2.1 support.

* 2.2 Froyo, 2010: Four months later came Froyo, with under-the-hood tuning, voice actions, and Flash support, something Apple has never had. And here came the HTC Incredible S as well as one of the most popular mobile devices ever built: the Samsung Galaxy S2. This was also the first hotspot option, and we got 3G and better LCDs. That whole tethering thing? It took a year for the iPhone to copy it.
* 2.3 Gingerbread, 2010: The green from the robot came into Gingerbread, with the black and green motif moving front and center. More sensors, NFC, a new download manager, and copy and paste got better.
* 3.0 Honeycomb, 2011: The most important thing was when Matias Duarte showed up and reinvented the Android UI. The holographic design traded out the green and blue and gave you more screen space. This kicked off a permanent overhaul and brought a card UI for recent apps. Enter the Galaxy S9 and the Huawei Mate 2.
* 4.0 Ice Cream Sandwich, later in 2011: Duarte’s designs started really taking hold. For starters, let’s get rid of buttons. That’s important and has been a critical change for other devices as well. It reunited tablets and phones with a single vision. On-screen buttons brought the card-like appearance into app switching. Smarter swiping added swipe-to-dismiss, which changed everything for how we handle email and texts with gestures. You can thank this design for Tinder.
* 4.1 to 4.3 Jelly Bean, 2012: Added some sweet, sweet fine tuning to the foundational elements from Ice Cream Sandwich. Google Now was supposed to give us predictive intelligence; we got interactive notifications, expanded voice search, and advanced search, still with the card-based everything for results. We also got multiuser support for tablets. And the Android Quick Settings pane. We also got widgets on the lock screen - but those are a privacy nightmare and didn’t last for long.
Automatic widget resizing, wireless display projection support, and restricted profiles on multiple user accounts, making it a great device for parents. Enter the Nexus 10. AND TWO-FINGER DOWN SWIPES.

* 4.4 KitKat, 2013: Ended the era of a dark screen; lighter screens and neutral highlights moved in. I mean, The Matrix was way before that, after all. “OK, Google” showed up, furthering the competition with Apple and Siri. Hands-free activation, a panel on the home screen, and a stand-alone launcher. AND EMOJIS ON THE KEYBOARD. Increased NFC security.
* 5.0 Lollipop, 2014: Brought 64-bit support, Bluetooth Low Energy, and a flatter interface. But more importantly, we got annual releases, like iOS.
* 6.0 Marshmallow, 2015: Gave us Doze mode, sticking it to the iPhone with even more battery-saving features. App security and prompts to grant apps access to resources like the camera and phone were added. The Nexus 5X and 6P phones brought fingerprint scanners and USB-C.
* 7.0 Nougat, 2016: Gave us quick app switching, a different lock screen and home screen wallpaper, split-screen multitasking, and gender- and race-inclusive emojis.
* 8.0 Oreo, 2017: Gave us floating video windows, which got kinda cool once app makers started adding support for them in their apps. We also got a new file browser, which came to iOS in 2019. And more battery enhancements with prettied-up battery menus. Oh, and notification dots on app icons, borrowed from Apple.
* 9.0 Pie, 2018: Brought notch support and navigation similar to that of the iPhone X, adapting to a soon-to-be bezel-free world. And of course, the battery continues to improve. This brings us into the world of the Pixel 3.
* 10: Likely sometime in 2019.

While the initial release of Android shipped with the Linux 2.6 kernel, that has been updated as appropriate over the years, with version 3 in Ice Cream Sandwich and version 4 in Nougat. Every release of Android tends to come with an increment in the Linux kernel. Now, Android is open source. So how does Google make money?
Let’s start with what Google does best: advertising. Google makes a few cents every time you click on an ad in messages or web pages or any other little spot they’ve managed to drop one into. Then there’s the Google Play Store. Apple makes 70% more revenue from apps than Android, despite the fact that Android apps have twice the number of installs. The old adage is that if you don’t pay for a product, you are the product. I don’t tend to think Google goes overboard with all that, though. And Google is probably keeping Caterpillar in business just to buy big enough equipment to move their gold bars from one building to the next on campus. Any time someone’s making money, lots of other people want a taste. Like Oracle, who owns a lot of open source components used in Android. And the competition between iOS and Android makes both products better for consumers!

Now look out for Android Auto, Android Things, Android TV, Chrome OS, the Google Assistant, and others - given that other types of vendors can make use of Google’s open source offerings to cut R&D costs and get to market faster! But more importantly, Android has contributed substantially to the rise of ubiquitous computing no matter how much money you have. I like to think the long-term impact of such a democratization of mobility and the Internet will make the world a little less Idiocracy and a little more Wikipedia.

Thank you so very much for tuning into another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day!


Coherent Is Not UNIX!

     5/17/2020

In the current day, Linux is the most widely used UNIX-like operating system. Its rise to prominence has been an amazing success story. From its humble beginnings Linux has grown to power everything from supercomputers to car stereos. But it's not the first UNIX clone. A much earlier system existed, called Coherent. And as it turns out, Linux and Coherent share a lot of similarities. The biggest difference is that Coherent was closed source.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1949: DOJ Sues AT&T Over Antitrust Violations
1973: AT&T UNIX V4 Goes Public
1975: AT&T UNIX V6 Released
1977: First Version of BSD Circulates
1977: XYBASIC Released by Mark Williams Company
1980: Coherent Released for PDP/11
1983: Coherent Comes to the IBM PC/XT
1995: Mark Williams Company Closes


Spam, Email, and Best Intentions

     10/4/2020

Spam emails are a fact of modern life. Who hasn't been sent annoying and sometimes cryptic messages from unidentified addresses? To understand where spam comes from we need to look at the origins of email itself. Email has had a long and strange history, and so too have some of its most dubious uses.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


JOVIAL, the Evolution of Programming

     9/6/2020

The creation of FORTRAN and early compilers set the stage to change computing forever. However, they were just the start of a much longer process. Just like a spoken language, programming languages have morphed and changed over time. Today we are looking at an interesting case of this slow evolution. JOVIAL was developed during the Cold War for use in the US military, and it's been in constant small-scale use ever since. Its story gives us a wonderful insight into how programming languages change over time, and why some stick around while others die out.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


Making Disks Flexible, Part 2

     3/8/2020

The floppy disk is one of the most iconic pieces of technology. While not in use in the modern day, there was a period of 40 years where the floppy disk was synonymous with data storage. Today we pick up where we finished in the last episode, with the rise and fall of the 5 1/4 inch disk. We will be looking at the creation and spread of the 3 1/2 inch floppy disk. How did Sony, a non-player in the computer market, create this runaway success? And how did Apple contribute to its rise?

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1980: Sony Invents Microfloppy Disk
1983: Apple Builds Prototype Mac with 3 1/2 Inch Floppy


Applesoft BASIC, Microsoft and Apple's First Collaboration

     4/19/2020

It's easy to think of Apple and Microsoft as bitter rivals, but that's not always the case. The two companies have a very complicated relationship, and a very long history. This connection goes all the way back to the 1970s and a product called Applesoft BASIC. It would become stock software on nearly every Apple II computer ever sold, it kept Apple competitive in the early home computer market, and it may have saved Microsoft from bankruptcy.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1976: Apple I hits shelves, Integer BASIC soon follows
1977: Apple II Released
1978: AppleSoft BASIC Ships
1997: Bill Gates saves Apple from Bankruptcy


Road to Transistors, Part II

     6/14/2020

In this episode we finish up our look at the birth of the transistor. But to do that we have to go back to 1880, the crystal radio detector, and examine the development of semiconductor devices. Once created, the transistor would change not just how computers worked, but how they could be used. That change didn't happen overnight, and it would take even longer for the transistor to move from theory to reality.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1939: Russell Ohl Discovers P-N Junction
1947: Point Contact Transistor Invented at Bell Labs
1954: TRADIC, First Transistorized Computer, Built


Learning Along the Oregon Trail

     9/20/2020

We've all played The Oregon Trail, but what do you know about its origins? First developed as a mainframe program all the way back in 1971, The Oregon Trail was intended as an educational game first and foremost. In fact, it traces its lineage to some of the first efforts to get computers into the classroom. Today we are following the trail back to its source and seeing how the proper environment was built to create this classic game.

You can play the 1975 version here: https://archive.org/details/OregonTrailMainframe 

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


Vectrex, Playing With Vectors

     4/5/2020

The 1980s were a turbulent and fast-moving decade for the video game industry. There were huge success stories, rapid advancements in technology, and the North American Video Game Crash. Caught up in all of this was an ambitious machine called the Vectrex. In an era dominated by pixelated graphics, the Vectrex brought higher-resolution vector images and early 3D to market. But ultimately it would be swept away during the market's crash. Today we are taking a dive into the development of the Vectrex, what made it different, and how it survives into the modern day.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


Becoming Portable

     6/28/2020

Portable computing is now totally ubiquitous. There's a good chance you are listening to this episode on a tiny portable computer right now. But where did it all come from? As it turns out, the first portable computer was designed all the way back in 1972. This machine, the DynaBook, only ever existed on paper. Despite that handicap, in the coming years it would inspire a huge shift in both personal and portable computing.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1972: DynaBook designed by Alan Kay

1976: NoteTaker project starts

1982: GRiD Compass released


PCM, Origins of Digital Audio

     5/3/2020

Every day we are inundated with digital audio: phone calls, music, even this podcast. Digitized sound has become so ubiquitous that it often fades into the background. What makes this all possible is a technology called Pulse Code Modulation, or PCM. This isn't new technology; its roots trace all the way back to 1937. So how exactly did digital audio come into being well before the first digital computers?

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1937: PCM Developed by Alec Reeves
1941: Germany Cracks A-3 Code
1943: Bell Labs Develops SIGSALY (aka The Green Hornet)
1957: First PCM Synthesizer, MUSIC I, Programmed by Max Mathews


A Guided Tour of the Macintosh

     5/10/2020

In this byte-sized episode I take a look at a pack-in that came with the first Macintosh. Alongside Apple stickers, manuals, and the computer itself there was a single cassette tape labeled "A Guided Tour of the Macintosh". It's a strange addition to the Mac's packaging, but a great example of Apple's attention to detail and ingenuity.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1984: A Guided Tour of the Macintosh Released


8080 VS Z80

     7/12/2020

In 1974 Intel released the 8080 processor, a chip long in the making. It was the first microprocessor that had the right combination of power and price to make personal computers viable. But that same year a small group of employees defected and formed their own company called Zilog. Among this group were Masatoshi Shima and Federico Faggin, two of the principal architects behind the 8080 as well as Intel's other processors. Zilog would go on to release a better chip, the Z80, that blew Intel out of the water. Today we continue our Intel series with a look into this twisting story.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important Dates:

1974: Intel 8080 hits shelves

1976: Zilog Z80 goes on sale


Making Disks Flexible, Part 1

     2/24/2020

The floppy disk was a ubiquitous technology for nearly 40 years. From mainframes to home computers, the plastic disk was everywhere. And in the decades it was around there were very few changes made to how it fundamentally worked. So how did it get so popular? What made the floppy disk so flexible? And how did it finally fall out of favor? In this episode we will look at the technology's early days.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1971: 8 Inch Floppy Disk (Minnow) Created at IBM
1976: Shugart Invents 5 1/4 Inch Floppy Disk


Brad Chase Interview, Marketing Lead for Windows 95 and Much More

     7/5/2020

I recently got the chance to sit down and talk with Microsoft alumnus Brad Chase. He was the product manager for Microsoft Works on the Macintosh, DOS 5, and DOS 6, and the marketing lead for Windows 95, as well as much more. We talk about the Apple-Microsoft relationship, the groundbreaking launch of Windows 95, and what it takes to sell software.

Editing for this episode was handled by Franck; you can follow him on Instagram: www.instagram.com/frc.audio/

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


Memex and Hyperlinks

     3/22/2020

The widespread use of the internet has shaped our world; it's hard to imagine the modern day without it. One of its biggest features would have to be the hyperlink. But despite the modern net feeling so new, links actually date back as far as the 1930s and the creation of the Memex: a machine that was never built but would influence coming generations of dreamers.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1927: Differential Analyzer Built at MIT
1938: Rapid Selector Built by Vannevar Bush
1945: As We May Think Published


The Rise of CP/M

     8/9/2020

The IBM PC and MS-DOS, the iconic duo of the early 80s. The two are so interconnected that it's hard to mention one without the other. But in 1980 DOS wasn't IBM's first choice for their soon-to-be flagship hardware. IBM had wanted to license Gary Kildall's CP/M, but in a strange series of events the deal fell through. Legend states that Kildall lost the contract because he was too busy flying his private plane to talk business with IBM, but is that true? Today we look at the development of CP/M, why it was a big deal, and why the PC ultimately shipped with Microsoft software.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing


Analog Computing and the Automatic Totalisator

     7/26/2020

A lot of the technology we associate with the modern day started on anachronistic machines. I'm not talking about mainframes, I'm talking older. Today we are looking at George Julius's Automatic Totalisator, an analog computer used to manage betting at horse tracks around the world. These were massively complex machines, some networked over 200 input terminals, and they did it all mechanically.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important Dates:

1913: Premier Tote installed in Auckland


Road to Transistors: Part I

     5/31/2020

The transistor changed the world. It made small, complex, and cheap computing possible. But it wasn't the first attempt to crack the case. There is a long and strange lineage of similar devices leading up to the transistor. In this episode we take a look at two of those devices. First the vacuum tube, one of the first components that made computing possible. Then the cryotron, the first device purpose built for computers.

You can find the full audio of Atanasoff's talk here: https://www.youtube.com/watch?v=Yxrcp1QSPvw

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1880: Thomas Edison Rediscovers Thermionic Emission
1904: Ambrose Fleming Invents the Vacuum Tube
1906: Lee de Forest Patents the Audion Triode Tube
1937: George Stibitz Creates First Binary Adding Circuit from Spare Relays
1938: John Atanasoff Visits a 'Honky-Tonk'
1941: ABC, First Vacuum Tube Calculator, is Completed
1953: Cryotron Invented by Dudley Allen Buck


Evolution of the Mouse

     12/2/2019

The computer mouse is a ubiquitous device; it's also one of the least changed devices we use with a computer. The mice we use today have only seen small incremental improvements since the first mouse was developed. So how did such a long-lasting design take shape, and how did it travel the decades up to now?

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1961: First Mouse Developed at Engelbart's ARC Lab
1972: Xerox Develops Rollerball Mouse for Alto
1979: Apple LISA Mouse Designed


Bill's Problem with Piracy

     11/25/2019

In this mini-episode we look at a strange event in Microsoft's early history and their first case of piracy. Along the way you will learn about the best advertising campaign in history: the MITS MOBILE Computer Caravan!

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1976: 'Open Letter to Hobbyists' Written by Bill Gates

http://tee.pub/lic/4jnwv7m_ZPw


Spacewar! (the Game)

     8/25/2019

It really seems like in the last decade video games have gone from a somewhat niche hobby to a widespread part of our culture. Nowadays, there are a multitude of ways to get our gaming fix. Consoles, handheld game systems, and even smartphones make video games more accessible than ever. But when and how exactly did video games start to creep into the modern consciousness?

In this episode we look at some of the earliest video games and how they came to be.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1962: Spacewar! Developed


Minitel, the French Network Connection

     9/22/2019

Today we are dipping back into the deep and complex history of the proto-internet. We are going to be looking at Minitel, a France-Wide-Web that was built in the 1980s as a way to help the country stay relevant in the digital age.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1980: Minitel Program Networks France


Minitel Research Lab Interview, with Julien Mailland and Kevin Driscoll

     9/29/2019

Today I am joined by Julien Mailland and Kevin Driscoll, co-authors of Minitel: Welcome to the Internet and proprietors of the Minitel Research Lab (minitel.us). We talk about their book, how they first started working on Minitel terminals, and the ongoing work to preserve Minitel.


Acorn and the BBC

     7/14/2019

The Raspberry Pi has been a huge success at its stated goals, and continues to be. But this isn't the first time a British company designed and developed a computer as an accessible platform for learning programming. In fact, if you've read much about the Pi then you've probably seen people calling it a "BBC Micro 2".

 

So what was the BBC Micro? What did the BBC have to do with creating a new computer? And how is any of this connected to the 21st century version?

 

Today I want to share the story from a slice of a somewhat forgotten age: the BBC's involvement with Acorn Computers and how they worked together to educate a generation of programmers. Along the way we will see how a small UK company created an impressive series of computers whose legacy may not be well known in the States, but has had a surprising impact on the world.

 

Special thanks to Neil from Retro Man Cave for sharing his memories of the BBC Micro. You can find him on YouTube here: https://www.youtube.com/channel/UCLEoyoOKZK0idGqSc6Pi23w


Digital Voices

     6/16/2019

What are the origins of our modern day text-to-speech systems? In this episode we will dive into the rich history of electronic talking machines. Along the way I will tell you the story of the vocoder, the first singing computer, and a little about the father of modern synthesized speech.


Attack of the PC Clones

     6/30/2019

Today, I want to share with you the story of the first PC clones and how they cemented the rise of the x86 chipset.

 

Most of this story takes place between 1981 and 1984, but I think it's fair to say that these 3 years are some of the most influential for the PC's rise to domination. So let's start the story with a discussion of the IBM PC, how it was special, and then examine how reverse engineering it led to the current x86 monoculture we see today.


Creeping Towards Viruses

     10/6/2019

Computer viruses today pose a very real threat. However, it turns out that their origins are actually very non-threatening. Today, we are going to look at some of the first viruses. We will see how they developed from technical writing, to pulp sci-fi, to traveling code.

I talk about The Scarred Man by Gregory Benford in this episode, you can read the full short story here: http://www.gregorybenford.com/extra/the-scarred-man-returns/

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1949: John von Neumann Writes 'Theory and Organization of Complicated Automata'
1969: 'The Scarred Man' Written by Gregory Benford, Coined Term 'Virus'
1971: Creeper Virus Unleashed


Space Travel!

     5/27/2019

In this mini-episode we talk about Space Travel, an obscure video game from 1969.


Journey to Altair

     9/8/2019

Today we are going to be traveling back to the late 1970s to take a look at the early days of the home computer. And specifically how Microsoft found a foothold at just the right time and place. And for Bill Gates and Paul Allen that would come in the form of BASIC.

Along the way we will cover the Altair 8800, vaporware, and how Bill Gates violated Harvard student conduct.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1974: Altair 8800 Released
1975: Microsoft BASIC Released


Cooking in Y2K

     1/6/2020

In this mini episode we will look at the Y2K bug, and some of the recipes it spawned. That's right, we are talking about Y2K cookbooks!

You can find more Y2K compliant food here: https://web.archive.org/web/19991012032855/http://y2kkitchen.com/

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1999: Y2K Kitchen Hits Shelves


Networking for a Nuclear War, the Soviets

     7/28/2019

Oftentimes people assume the US is the homeland of the internet. Funded by the US Department of Defense, the first attempts at a large-scale network were started during the height of the Cold War, and a large part of its design was redundancy and robustness. Some of the researchers were quite frank about its purpose: to create a network that could survive an upcoming nuclear war. This military-hardened infrastructure was known as ARPANET.


But that's only part of the story, and the US wasn't the first to the party. The fact is, the internet was born during the Cold War. This was an era that saw huge advancements in science, both for better and for worse. The space race put humans on the moon, and the nuclear arms race put humans dangerously close to annihilation. So it should be no surprise that America's counterpart in this age, the Soviet Union, was working towards their own proto-internet.


4004: The First Microprocessor

     11/4/2019

Intel is one of the dominant forces in the computer industry today, and they may be best known for their line of microprocessors. These chips have powered computers going back to the early days of microcomputers. How did Intel become so entrenched in the field? Well, it all started with the 4004 CPU, the first "one-chip" computer.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1971: Intel 4004 Released


Networking for a Nuclear War, the Americans

     8/11/2019

In this episode we are going to explore the ARPANET. This is a companion to the last episode, which covered contemporary Soviet attempts to create an early internet.

Like last time, today we are still in the Cold War era. Now, this won't be a point-by-point comparison of Soviet and US networks. They are totally different beasts. Instead, what I want to do is look at how ARPANET was developed, what influenced it, and how it would kick start the creation of the internet.


Lost in the Colossal Cave

     10/20/2019

Colossal Cave Adventure is one of the most influential video games of all time. Originally written for the DEC PDP-10 mainframe in 1975, the game has not only spread to just about any computer out there, but it has inspired the entire adventure/RPG genre. In this episode we are going to look at how Adventure got its start, how it evolved into a full game, and how it came to be a launch title for the IBM PC.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1975: Colossal Cave Adventure Developed

http://tee.pub/lic/MKt4UiBp22g


Unix for the People, Part 2

     6/2/2019

Now, as the name suggests, this is the second part of a series on the history of UNIX. Part 1 mainly covers the background leading up to UNIX. If you haven't listened to it yet, I strongly suggest you go do that now. A lot of what was covered in part 1 provides needed context for our discussion today.

 

Just as a quick recap, last time I told you about CTSS and Multics, two of the earliest time-sharing operating systems. Today, we are going to be picking up where we left off: Bell Labs just left Project MAC and decided to start their own time-sharing project. What they didn't realize was that this new project, called UNIX, would soon outshine all of its predecessors. But when this all started, in 1969 on a spare mainframe at Bell Labs, there was no hint at its amazing future.


Going Rogue

     1/26/2020

Many video games today make use of randomized content, some more than others. It may seem like an obvious feature, but it turns out that procedural generation didn't really catch on in video games until the 1980 release of Rogue. The game itself never saw much commercial success, but was wildly popular among UNIX users. In this episode we look at Rogue, how it was created, and the legacy that we still see today.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1980: Rogue Written for PDP-11
1984: Rogue Ported to PC, Macintosh, Atari ST


Return of Viruses: The Spread

     10/18/2020

It's time to round out spook month with a return to one of last year's topics: the computer virus. Malicious code traveling over networks is actually a relatively new phenomenon; early viruses were much different. In this episode we examine ANIMAL and Elk Cloner, two early viruses that were meant as practical jokes and spread by hapless computer users. Along the way we will see cases of parallel evolution, name-calling, and find out if there is any one origin to the word "virus".

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


IBM Gets Personal

     11/2/2020

This episode is not about the IBM PC. In 1981 the Personal Computer would change the world. Really, it's hard to talk about home computing without diving into it. But I've always had an issue with the traditional story. The PC didn't come out of left field; IBM had actually been trying to make a home computer for years. In 1981 those efforts would pay off, but the PC wasn't revolutionary hardware for Big Blue, it was evolutionary. So today we are looking at that run up with SCAMP, the 5100, and the Datamaster.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


Landing the Eagle

     4/30/2020

In this episode we wrap up the greatest achievements and challenges in the computing era of the 1960s. We learn about Howard Tindall, and how MIT became a dominant force in Apollo. We also learn about core rope memory and how seamstresses helped program Apollo capsules. This is the last episode of Season 1; it's time to go offline and research how a new era of computing arrives: microcomputing.


The History of Computing Ep 10: Computers and the Space Race

     3/25/2020

We go knee-deep into available computing technology in the late 1950s and what it was used for: missiles and satellites. We see the creation of the NASA RTCC in a muddy field and revisit what IBM is up to.


Magnetic: The History of Computing ep 8

     2/24/2020

Episode 8 covers the amazing achievements of women in early computing, including an excellent mathematician who makes sense of programming languages, and a bombshell actress who invented something that makes it possible for us to use computers wirelessly.


Magnetic: the History of Computing

     11/11/2019

In this first episode, I go over the beginning of computing: why did we start this thing in the first place?  We review the Abacus, the plague, and the loom, and see why those factored into the device you're reading this on. 


Keeping Things BASIC

     12/14/2020

BASIC is a strange language. During the early days of home computing it was everywhere you looked; pretty much every microcomputer in the 70s and early 80s ran BASIC. For a time it filled a niche almost perfectly: it was a usable language that anyone could learn. That didn't happen by accident. Today we are looking at the development of BASIC, how two mathematicians started a quest to expose more students to computers, and how their creation got away from them.


Hacker Folklore

     12/28/2020

Hacker hasn't always been used to describe dangerous computer experts with ill intent. More accurately, it should be used to describe those enamored with computers, programming, and trying to push machines to do interesting things. The values, ethics, morals, and practices around those people make up what's known as hacker culture. Today we are digging into the Jargon File, a compendium of all things hackish and hackable, to take a look at hacker culture through its folklore.
 
Huge thanks to some of my fellow podcasters for doing readings for me this episode. In order of appearance they are:
 
Randall Kindig of the FloppyDays Vintage Computing Podcast (floppydays.com)
Charles Edge from The History of Computing (thehistoryofcomputing.libsyn.com)
Sebastian Major of Our Fake History (ourfakehistory.com)
 
Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


Lars Brinkhoff Interview, Preserving ITS

     1/18/2021

Lars Brinkhoff has been spearheading the effort to keep the Incompatible Timesharing System alive. Today we sit down to talk about the overall ITS restoration project, software preservation, and how emulation can help save the past.

You can find the full restoration project on GitHub: https://github.com/PDP-10/its

And follow Lars on Twitter: @larsbrinkhoff


8086: The Unexpected Future

     2/22/2021

The Intel 8086 may be the most important processor ever made. Its descendants are central to modern computing, while retaining an absurd level of backwards compatibility. For such an important chip, it had an unexpected beginning. The 8086 was meant as a stopgap measure while Intel worked on bigger and better projects. This episode we are looking at how Intel was trying to modernize, how the 8086 fit into that larger plan, and its pre-IBM life.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


The IBM PC

     3/8/2021

Released in August 1981, the IBM PC is perhaps one of the most important computers in history. It originated the basic architecture computers still use today, flung the doors open to a thriving clone market, and created an ad-hoc set of standards. The heart of the operation, Intel's 8088, solidified the x86 architecture as the computing platform of the future. IBM accomplished this runaway success by breaking all their own rules, heavily leveraging third-party hardware and software, and by cutting as many corners as possible. The PC was designed in less than a year, so how did it become the most enduring design in the industry?
 
Some ad clips this episode were from this fabulous PC ad compilation: https://www.youtube.com/watch?v=kQT_YCBb9ao
 
Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


THE SOURCE

     3/21/2021

One of the great things about the modern Internet is the wide range of services and content available on it. You have news, email, games, even podcasts. And in each category you have a wide range of choices. This wide diversity makes the Internet so compelling and fun to explore. But what happens when you take away that freedom of choice? What would a network look like if there were only one news site, or one place to get email? Look no further than THE SOURCE. Formed in 1979 and marketed as the information utility for the information age, THE SOURCE looked remarkably like the Internet in a more closed-off format. The key word here is: looked.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


C Level, Part I

     4/4/2021

C is easily one of the most influential programming languages in the world, and it's also one of the most popular languages in the world. Even after close to 50 years it remains in widespread and sustained use. In this series we are going to look at how C was developed, how it spread, and why it remains so relevant. To do that we need to start with background, and look at what exactly influenced C. This episode we are diving into some more ALGOL, CPL, BCPL, and eventually B.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


(OldComputerPods) ©Sean Haas, 2020