The Episodes

The Brief History Of The Battery

     3/9/2020

Most computers today have multiple batteries. Going way, way back, most had a CMOS or BIOS battery used to run the clock and keep BIOS configurations when the computer was powered down. These have mostly centered around the CR2032 lithium button cell, also common in things like garage door openers and many of my kids' toys!

 

Given the transition to laptops for a lot of people now that families, schools, and companies mostly deploy one computer per person, there’s a larger battery in a good percentage of machines made. Laptops mostly use lithium-ion batteries, which have a much longer history than you might expect.

 

The oldest known batteries are the “Baghdad batteries,” dating back to about 200 BC. They could have been used for a number of things, like electroplating. But it would take 2,000 years for us to get back to the idea. As is often the case, things humans already knew became much, much more once they were backed up with science. First, scientists studied positive and negative elements and formed an understanding that electricity flowed between them. The English natural scientist William Gilbert established some of the basics of electricity and magnetism, and Sir Thomas Browne continued to refine the theories and was the first to call it “electricity.” Then another British scientist, Peter Collinson, sent Benjamin Franklin an electricity tube of the sort these early experiments had begun to produce.

 

Benjamin Franklin spent some time writing back and forth with Collinson. He linked capacitors together in 1749, and in 1752 he flew his famous kite, showing that electrical current flowed down the kite string and that a metal key could conduct that electricity - evidence that electricity behaved like a fluid. That same year, Thomas-Francois Dalibard proved the same hypothesis using a tall metal rod struck by lightning.

 

Budding scientists continued to study electricity and refine the theories. In 1799, Alessandro Volta built a battery by stacking alternating discs of zinc and silver separated by cloth soaked in brine. This became known as a voltaic pile, and it released a steady current. The batteries corroded fast, but Volta's name stuck: a volt is the potential difference that drives a current of one amp through a resistance of one ohm. Suddenly we were creating electricity from an electrochemical reaction.

 

People continued to experiment with batteries and electricity in general. Giuseppe Zamboni, another Italian physicist, invented the Zamboni pile in 1812, switching to zinc foil and manganese oxide. Completely unconnected, Swedish chemist Johan August Arfwedson discovered lithium in 1817. Lithium. Atomic number 3. Lithium is an alkali metal found all over the world. It can be used to treat bipolar disorder, once known as manic depression. And it powers today's smart-everything, Internet-of-Things world. But no one knew that yet.

 

The English chemist John Frederic Daniell invented the Daniell cell in 1836, building on the concept by placing a copper plate in a copper sulfate solution and hanging a zinc plate in the same jar or beaker. Each plate had a wire: the zinc plate became the negative terminal, the copper plate the positive terminal, and suddenly we could reliably produce electricity.

 

Robert Anderson would build the first battery-powered electric car at around the same time, but Gaston Planté would build the first rechargeable battery in 1859, one that very much resembles the ones in our cars today. He gave us the lead-acid battery, switching to lead oxide in sulfuric acid.

 

In the 1860s the Daniell cell would be improved by Callaud, and a lot of different experiments continued on. The Gassner dry cell came from Germany in 1886, mixing ammonium chloride with plaster of Paris and adding zinc chloride. Shelf life shot up. The National Carbon Company would swap out the plaster of Paris for coiled cardboard, and that Columbia dry cell would be sold commercially throughout the United States. National Carbon Company would become Eveready, which makes the Energizer batteries that power the weird bunny with the drum.

 

Swedish scientist Waldemar Jungner would give us nickel-cadmium, or NiCd, in 1899, but those cells were a bit too leaky. So Thomas Edison would patent a new model in 1901, and iterations of those are still pretty common today. Lithium would start being used shortly after by G.N. Lewis, but it would not become standard until the 1970s, when lithium button cells started to be put in cameras. Asahi Chemical out of Japan would then give us the lithium-ion battery in 1985, brought to market by Sony in 1991, leading to John B. Goodenough, M. Stanley Whittingham, and Akira Yoshino winning the Nobel Prize in Chemistry in 2019.

 

Those lithium-ion batteries are used in most computers and smartphones today. The Osborne 1 came in 1981. It was what we now look back on as a luggable computer: a 25 pound machine that could be taken on the road, but one you plugged directly into the wall. The Epson HX-20 would ship the same year with a battery, opening the door to batteries powering computers.

 

Solar storage and other larger batteries require much larger amounts of lithium. This causes an exponential increase in demand and thus a jump in the price, making it more lucrative to mine.

 

Mining lithium to create these batteries is, as with all other large-scale operations taken on by humans, destroying entire ecosystems, such as those in Argentina, Bolivia, Chile, and the Tibetan plateau. Each ton of lithium takes half a million gallons of water, another resource that’s becoming more precious. And the waste is usually released back into the ecosystem. Other areas mine lithium out of rock using more traditional methods, but there’s certainly still an environmental impact. There are similar impacts to mining cobalt and nickel, the other two metals used in most batteries.

 

So I think we’re glad we have batteries. Thank you to all these pioneers who brought us to the point that we have batteries in pretty much everything. And thank you, listeners, for sticking through to the end of this episode of the History of Computing Podcast. We’re lucky to have you. 


Dungeons && Dragons

     12/27/2019

What do insurance, J.R.R. Tolkien, H.G. Wells, and the Civil War have in common? They created a perfect storm for the advent of Dungeons and Dragons. Sure, D&D might not be directly impactful on the history of computing. But its impacts are far and wide. The mechanics have inspired many a game. And the cultural impact can be seen expansively across the computer gaming universe. D&D came of age during the same timeframe that the original PC hackers were bringing their computers to market. But how did it all start? We’ll leave the history of board games to the side, given that chess sprang up in northern India over 1,500 years ago, spreading first to the Persian empire and then to Spain following the Moorish conquest of that country. And given that card games go back to a time before the Tang Dynasty in 9th century China. And Gary Gygax, the co-creator and creative genius behind D&D, loved playing chess, going back to games with his grandfather as a young boy.

Instead, we’ll start this journey in 1780 with Johann Christian Ludwig Hellwig, who invented the first true wargame to teach military strategy. It was good enough to go commercial. Then Georg Julius Venturini made a game in 1796, then Opiz in 1806, then Kriegsspiel in 1824, which translates from German as “wargame.” And thus the industry was born. There were a few dozen other board games, but in 1913 Little Wars, by H.G. Wells, added ornately painted hollow lead figures and movement distances to bring us into the era of miniature wargaming. Infantry moved a foot, cavalry moved two, and artillery required other troops to be around it. You fought with spring-loaded cannons, and other combat usually resulted in a one-to-one loss, making the game about trying to knock troops out while they were setting up their cannons. It was cute, but in the years before World War II, many sensed that the release of a war game by the pacifist Wells was a sign of oncoming doom. Indeed it was. But each of these inventors had brought their own innovations to the concept. And each impacted real war, with wargaming being directly linked to the blitzkrieg.

Not a lot happened in innovative new wargames between Wells and the 1950s. Apparently the world was busy fighting real wars. But Jack Scruby started making figures in 1955 and connecting communities, writing a book called All About Wargames in 1957. Then Gettysburg was created by Charles Roberts and released in 1958 by Avalon Hill, which he founded. It was a huge success and attracted a lot of enthusiastic, if not downright obsessed, players. In the game you could play the commanders of the battle, like Robert E. Lee, Stonewall Jackson, Meade, and many others. You had units of varying sizes, and a number of factors could impact the odds of battle. The game mechanics were complex, and it sparked a whole movement of war games that slowly rose through the 60s and 70s.

One of those obsessed gamers was Gary Gygax, an insurance underwriter who started publishing articles and magazines. Gygax started the Lake Geneva Wargames Convention in 1968, which has since moved to Indianapolis after a pitstop in Milwaukee and now brings in upwards of 30,000 attendees. Gygax collaborated with his friend Jeff Perren on a game they released in 1970 called Chainmail. Chainmail got a supplement that introduced spells, magic items, dwarves, and hobbits - which seems based on Tolkien novels, but according to Gygax was more a composite of a lot of pulp novels, including one of his favorites, the Conan series.
1970 turned out to be a rough year, as Gygax got laid off from the insurance company and had a wife and five kids to support. That’s when he started making games as a career. At first it didn’t pay too well, but he kept making games and published Chainmail with Guidon Games, which started selling a whopping 100 copies a month. At the time they were using six-sided dice, but other numbering systems worked better. They started doing 1-10 or 1-20 random number generation by throwing poker chips in a coffee can, but then Gary found weird dice in a school supply catalog and added the crazy idea of a 20-sided die. It’s now a symbol found on t-shirts and a universal calling card of tabletop gamers.

At about the same time, University of Minnesota history student Dave Arneson met Gygax at Gen Con, took Chainmail home to the Twin Cities, and started improving the rules, releasing his own derivative game called Blackmoor. He came back to Gen Con the next year after testing the system, and he and Gygax would go on to collaborate on an updated and expanded set of rules. Gygax would codify much of what Arneson didn’t want to codify, as Arneson found heavily lawyered rules to be less fun from a gameplay perspective. But Gary, the former underwriter, was a solid rule-maker, and thus role-playing games were born, in a game first called The Fantasy Game. Gary wrote a 50-page instruction book, which by 1973 had evolved into a 150-page book. He shopped it to a number of game publishers, but none had a book that thick or could really grok the concept of role-playing, especially one with concepts borrowed from across the pulps.

In the meantime, Gygax had been writing articles, helping others with games, and doing a little cobbling on the side. Because everyone needs shoes. And so in 1973, Gygax teamed up with childhood friend Don Kaye and started Tactical Studies Rules, which would evolve into TSR, with each investing $1,000. They released Cavaliers and Roundheads on the way to raising the capital to publish the game they were now calling… Dungeons and Dragons. The game evolved further, and in 1974 they put out 1,000 copies in a boxed set. To raise more capital they brought in Brian Blume, who invested $2,000 more. Sales of that first run were great, but Kaye passed away in 1975 and Blume’s dad stepped in to buy his shares. They started Dragon magazine, opened The Dungeon Hobby Shop, and started hiring people. The game continued to grow, with Advanced Dungeons & Dragons being released with a boatload of books. They entered what we now call a buying tornado, and by 1980 sales were well over 8 million dollars.

But in 1979 James Egbert, a Michigan State student, disappeared. A private eye blamed Dungeons and Dragons. Egbert later popped up in Louisiana, but the negative publicity had already started. Another teen, Irving Pulling, committed suicide in 1982, and his mom blamed D&D and started a group called Bothered About Dungeons and Dragons, or BADD. There’s no such thing as bad publicity though, and sales hit $30 million by '83. In fact, part of the allure for many, including the crew I played with as a kid, was that it got a bad rap in some ways… At this point Gary was in Hollywood getting cartoons made of Dungeons and Dragons and letting the Blumes run the company. But they’d overspent, and with TSR nearing bankruptcy, Gygax had to return to Lake Geneva to save the company, which he did by releasing the first book in a long time, one of my favorite D&D books, Unearthed Arcana.
Much drama running the company ensued, which isn’t pertinent to the connection D&D has to computing, but basically Gary got forced out and the company lost touch with players because it was being run by people who didn’t really like gamers or gaming. 2nd edition D&D wasn’t a huge success. But in 1996, Wizards of the Coast bought TSR. They had made a bundle off of Magic: The Gathering, and now that TSR was in the hands of people who loved games and gamers again, they immediately started looking for ways to reinvigorate the brand - which their leadership had loved. With 3rd edition, Wizards of the Coast published an open gaming license that allowed third-party publishers to make material compatible with D&D products using what was known as the d20 System Trademark License. Fourth edition came along in 2008, but that Open Gaming License was irrevocable, so most publishers kept using it over the new, more restrictive Game System License. By the time 5th edition came along, this all felt similar to what we’ve seen with Apache, BSD, and MIT licenses, with Wizards moving back to the Open Gaming License that had been so popular.

Now let’s connect Dungeons and Dragons to its impact on computing. In 1975, Will Crowther was working at Bolt, Beranek, and Newman. He’d been playing some of those early copies of Dungeons and Dragons and working on natural language processing. The two went together like peanut butter and chocolate, and out popped something that tasted a little like each: a game called Colossal Cave Adventure. If you played Dungeons and Dragons, you’ll remember drawing countless maps on graph paper. Adventure was like that and loosely followed Kentucky’s Mammoth Cave system, given that Crowther was an avid caver. It ran on a PDP-10, and as those spread, so spread the fantasy game, getting updated by Stanford grad student Don Woods in 1976. Now virtual worlds weren’t just on tabletops; they sprouted up in Rogue, and by the time I got to college there were countless MUDs, or Multi-User Dungeons, where you could kill other players.

Mattel shipped the Dungeons & Dragons Computer Fantasy Game in 1981, then Dungeon! for the Apple II, and another dozen or so games over the years. These didn’t directly reflect the game mechanics of D&D, though. But Pool of Radiance, set in the Forgotten Realms campaign setting of D&D, popped up for Nintendo and PCs in 1988, with dozens of D&D games shipping across a number of campaign settings. You didn’t have to have your friends over to play D&D any more. Out of that evolved Massively Multiplayer Online RPGs, including EverQuest, Ultima Online, Second Life, Dungeons & Dragons Online, Dark Age of Camelot, Runescape, and more. Even more closely aligned with the Dungeons and Dragons game mechanics, you also got The Matrix Online, Star Wars: The Old Republic, Age of Conan, and the list goes on. In the meantime, Wizardry had shipped in 1981, Dragon Warrior shipped in 1986, and The Legend of Zelda shipped in 1986 as well. These represented an evolution toward a simpler set of rules built on the same concepts. Dragon Warrior had started as Dragon Quest after its creators played Wizardry for the first time. And these are only a fraction of the games that use the broad concepts of hit points, damage, and probability of attack, including practically every first-person shooter ever made, linking nearly every video game that includes combat back to Dungeons and Dragons, if not through direct inspiration then through aspects of game mechanics.
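To make those shared mechanics a bit more concrete, here is a minimal sketch in Python of a d20-style attack roll of the kind D&D popularized: roll a 20-sided die, add a modifier, compare against a target's armor class, and roll damage dice on a hit. The numbers and names here are illustrative only, not taken from any particular edition's rulebook.

```python
import random

def roll(sides: int, count: int = 1) -> int:
    """Roll `count` dice with `sides` faces each and return the total."""
    return sum(random.randint(1, sides) for _ in range(count))

def attack(attack_bonus: int, armor_class: int, damage_dice=(1, 8), damage_bonus: int = 0) -> int:
    """Resolve one d20-style attack: d20 + bonus vs. armor class, damage dice on a hit."""
    if roll(20) + attack_bonus >= armor_class:
        count, sides = damage_dice
        return roll(sides, count) + damage_bonus  # a hit: e.g. 1d8 + 3 damage
    return 0  # a miss deals no damage

# Example: a fighter with +5 to hit swings at a goblin with armor class 13.
print(attack(attack_bonus=5, armor_class=13, damage_dice=(1, 8), damage_bonus=3))
```

Swap a d20 for hit probabilities and dice pools for damage tables and you have the skeleton of combat in most of the games named above.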
Dungeons and Dragons also impacted media, appearing in movies like Mazes and Monsters, an almost comedic look at playing the game, and E.T., where I think I first encountered the game. It helped inspire Peter Jackson to bring nearly the full pantheon of important Tolkien works to the screen, and its fingerprints are on Krull, The Dark Crystal, The Princess Bride, Pathfinder, Excalibur, Camelot, and even The Last Witch Hunter, based off a Vin Diesel D&D character he had separation anxiety over. The genre unlocked the limits placed on creativity by allowing nearly unlimited personalization of characters. It has touched every genre of fiction and non-fiction. And the game mechanics are used not only for D&D; derivatives are used across a variety of other industries.

The impact Dungeons and Dragons had on geek culture stretches far and wide. The fact that D&D rose to popularity just as many felt the geeks were taking over, with the rise of computing in general and the reinvention of entire economies, certainly connects it to so many aspects of our lives, whether we realize it or not. So next time you pick up that controller and hit someone in a game to do a few points of damage, next time you sit in a fantasy movie, next time you watch Game of Thrones, think about this. Once upon a time, there was a game called Chainmail. And someone came up with slightly better game mechanics. And that collaboration led to D&D. Now it is our duty to further innovate those mechanics in our own way. Innovation isn’t replacing manual human actions with digital actions in a business process; it’s upending the business process or industry with a whole new model. Yet the business process usually needs to be automated to free us to rethink the model. Just like the creators of D&D did. If an insurance underwriter could have such an outsized impact on the world in the 1970s, what kind of impact could you be having today? Roll a d20 and find out! If you roll a 1, repeat the episode. Either way, have a great day. We’re lucky you decided to listen in!


Happy Birthday ENIAC

     2/15/2020

Today we’re going to celebrate the birthday of the first real multi-purpose computer: the gargantuan ENIAC, which would have turned 74 years old today, on February 15th. That’s many generations ago in computing. The year is 1946. World War II had raged from 1939 to 1945. We’d cracked Enigma with computers, and scientists were thinking of more and more ways to use them. The press is now running articles about a “giant brain” built in Philadelphia. The Electronic Numerical Integrator and Computer was a mouthful, so they called it ENIAC. It was the first true electronic computer. Before that there were electromechanical monstrosities, which had to physically move a part in order to process a mathematical formula. That took time. ENIAC used vacuum tubes instead. A lot of them.

To put things in perspective: every hour of processing by the ENIAC was worth 2,400 hours of work calculating formulas by hand. And it’s not like you can do 2,400 hours in parallel between people, or in a row of course. So it made the previously almost impossible, possible. Sure, you could figure out the settings to fire a bomb where you wanted it to go in a minute rather than spending about a full day running calculations. But math itself, for the purposes of math, was about to get really, really cool.

The Bush Differential Analyzer, an earlier mechanical computer, had been built in the basement of the building that is now the ENIAC museum. The University of Pennsylvania ran a class on wartime electronics, based on their experience with the Differential Analyzer. John Mauchly and J. Presper Eckert met in 1941 while taking that class, which covered lots of shiny new or newish things like radar and cryptanalysis. But the class was mostly on ballistics, a core focus at the Moore School of Electrical Engineering at the University of Pennsylvania. More accurate ballistics would be a huge contribution to the war effort. But Eckert and Mauchly wanted to go further, building a multi-purpose computer that could analyze weather and calculate ballistics. Mauchly got all fired up and wrote a memo about building a general purpose computer. But the University shot it down. And so ENIAC began life as Project PX, with Herman Goldstine acting as the main sponsor after seeing their proposal and digging it back up.

Mauchly would team up with Eckert to design the computer, and the effort was overseen and orchestrated by Major General Gladeon Barnes of the US Army Ordnance Corps. Thomas Sharpless was the master programmer. Arthur Burks built the multiplier. Robert Shaw designed the function tables. Harry Huskey designed the reader and the printer. Jeffrey Chu built the dividers. And Jack Davis built the accumulators. Ultimately it was just a really big calculator and not a computer that ran stored programs in the same way we do today, although ENIAC did get an early version of stored programming that used a function table for read-only memory.

The project was supposed to cost $61,700. The University of Pennsylvania actually spent half a million dollars’ worth of metal, tubes, and wires. And of course the scientists weren’t free. That’s around six and a half million dollars today. And of course it was paid for by the US Army - specifically the Ballistic Research Laboratory. It was designed to calculate firing tables to make blowing things up a little more accurate.
Herman Goldstine chose a team of programmers that included Betty Jennings, Betty Snyder, Kay McNulty, Fran Bilas, Marlyn Meltzer, and Ruth Lichterman. They were chosen from a pool of 200 and set about writing the formulas the machine needed to process the requirements provided by the people using time on it. In fact, Kay McNulty invented the concept of subroutines while working on the project. They would flip switches and plug in cables as a means of programming the computer. Programming took weeks of working out complex calculations on paper, then days of fiddling with cables, switches, tubes, and panels to input the program. Debugging was done step by step, similar to how we use breakpoints today. They would feed ENIAC input using IBM punch cards and readers. The output was punch cards as well, and those punch cards acted as persistent storage.

The machine used standard octal radio tubes, 18,000 of them, run at a lower voltage than they were rated for in order to minimize blowouts and heat. Each digit used in calculations took 36 of those vacuum tubes, and there were 20 accumulators that could run 5,000 operations per second. The accumulators used two of those tubes to form a flip-flop, and they got them from the Kentucky Electrical Lamp Company. Given the number that blew every day, the engineers must have loved life once they got it down to blowing only a tube every couple of days. ENIAC was a modular computer and used different panels to perform different tasks, or functions. It used ring counters with 10 positions for a lot of operations, making it a decimal machine as opposed to the binary computational devices we have today. The pulses passed between the rings were used to count.

Suddenly computers were big money. A lot of research had happened in a short amount of time. Some had been government funded and some had been part of corporations, and it became impossible to untangle the two. This was pretty common with technical advances during World War II and the early Cold War years. John Atanasoff and Cliff Berry had ushered in the era of the digital computer in 1939 but hadn’t finished; Mauchly had seen their work in 1941. ENIAC was used to run a number of calculations for the Manhattan Project, allowing us to blow more things up than ever. That project took over a million punch cards and took precedence over artillery tables. John von Neumann worked with a number of mathematicians and physicists, including Stanislaw Ulam, who developed the Monte Carlo method, which led to a massive reduction in programming time. Suddenly programming became more about I/O than anything else.

To promote the emerging computing industry, the Pentagon had the Moore School of Electrical Engineering at the University of Pennsylvania launch a series of lectures to further computing at large. These were called the Theory and Techniques for Design of Electronic Digital Computers, or just the Moore School Lectures for short. The lectures focused on the various types of circuits and the findings from Eckert and Mauchly on building and architecting computers. Goldstine would talk at length about math, and other developers would give talks, looking forward to the development of the EDVAC and back at how they got where they were with ENIAC. As the University began to realize the potential business impact and monetization, they decided to bring a focus to University-owned patents.
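Before moving on, here is a minimal sketch in Python of the decimal counting idea described above: a decade (10-position) ring counter that carries into the next ring when it wraps. This is only an illustration of the concept, not a model of ENIAC's actual circuitry, and the class and method names are invented for the example.

```python
class DecadeCounter:
    """A 10-position ring: one pulse advances it by one digit."""
    def __init__(self):
        self.position = 0  # which of the 10 positions is currently "on"

    def pulse(self) -> bool:
        """Advance one position; return True when the ring wraps around (a carry)."""
        self.position = (self.position + 1) % 10
        return self.position == 0

class DecimalAccumulator:
    """A chain of decade counters: a carry from one ring pulses the next."""
    def __init__(self, digits: int = 10):
        self.rings = [DecadeCounter() for _ in range(digits)]  # least significant digit first

    def add_pulses(self, n: int) -> None:
        for _ in range(n):           # every pulse bumps the ones digit
            for ring in self.rings:  # propagate carries up the chain
                if not ring.pulse():
                    break            # no carry, so stop propagating

    def value(self) -> int:
        return sum(ring.position * 10 ** i for i, ring in enumerate(self.rings))

acc = DecimalAccumulator(digits=4)
acc.add_pulses(1234)
print(acc.value())  # 1234, counted entirely by pulses and carries
```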
That new focus on patents drove the original designers out of the University of Pennsylvania, and they started the Eckert-Mauchly Computer Corporation in 1946. Eckert-Mauchly would then build the EDVAC, taking advantage of the progress the industry had made since ENIAC construction had begun. EDVAC would effectively represent the wholesale move away from decimal and into binary computing, and while it weighed tons, it would become the precursor to the microchip.

After the ENIAC was finished, Mauchly filed for a patent in 1947. While a patent was granted, you could still count on your fingers the number of machines built at about the same time, including the Atanasoff-Berry Computer, Colossus, the Harvard Mark I, and the Z3. Luckily that patent was voided in 1973, and so digital computers are a part of the public domain. By then, the Eckert-Mauchly Computer Corporation had been acquired by Remington Rand, which merged with Sperry and is now called Unisys. The next wave of computers would be mainframes built by GE, Honeywell, IBM, and a number of other vendors, and so the era of batch processing mainframes began. The EDVAC begat the UNIVAC, and Grace Hopper was brought in to write an assembler for it. Computers would become the big mathematical number crunchers and slowly spread into being data processors from there. Following decades of batch processing mainframes we would get minicomputers and interactivity, then time sharing, and then the PC revolution. Distinct eras in computing.

Today, computers do far more than just the types of math the ENIAC did. In fact, the functionality of ENIAC was duplicated onto a 20 megahertz microchip in 1996. You know, ‘cause the University of Pennsylvania wanted to do something to celebrate the 50th birthday, and a birthday party seemed underwhelming at the time. And so the date of release for this episode is February 15th, now ENIAC Day in Philadelphia, dedicated as a way to thank the university, creators, and programmers. And we should all reiterate their thanks. They helped put computers front and center in the minds of the next generation of physicists, mathematicians, and engineers, who built the mainframe era. And I should thank you, for listening to this episode. I’m pretty lucky to have ya’. Have a great day!


The Mouse

     2/18/2020

In a world of rapidly changing technologies, few have lasted as long, in as unaltered a fashion, as the mouse. The party line is that the computer mouse was invented by Douglas Engelbart in 1964 and that it was a one-button wooden device with two metal wheels, which used an analog to digital conversion to input a location to a computer. But there’s a lot more to tell.

Engelbart had read an article in 1945 called “As We May Think” by Vannevar Bush while he was in the Philippines working as a radio and radar tech. He’d return home, get his degree in electrical engineering, then go to Berkeley for first his master's and then a PhD, still in electrical engineering. At the time there were a lot of military grants in computing floating around, and a Navy grant saw him work on a computer called CALDIC, short for the California Digital Computer. By the time he completed his PhD he was ready to start a computer storage company, but he ended up at the Stanford Research Institute in 1957. He published a paper in 1962 called Augmenting Human Intellect: A Conceptual Framework. That paper would guide the next decade of his life and help shape nearly everything in computing that came after. Keeping with the theme of “As We May Think,” Engelbart was all about supplementing what humans could do.

The world of computer science had been interested in selecting things on a computer graphically for some time. And Engelbart had a number of devices that he wanted to test in order to find the best possible device for humans to augment their capabilities using a computer. He knew he wanted a graphical system and wanted to be deliberate about every aspect in a very academic fashion. And a key aspect was how people that used the system would interact with it. The keyboard was already a mainstay, but he wanted people pointing at things on a screen.

While Engelbart would invent the mouse, pointing devices certainly weren’t new. Pilots had been using the joystick for some time, and an electrical joystick had been developed at the US Naval Research Laboratory in 1926, with the concept of unmanned aircraft in mind. The Germans would end up building one in 1944 as well. But it was Alan Kotok who brought the joystick to computer gaming in the early 1960s, to play Spacewar on minicomputers. And Ralph Baer brought it into homes in 1967 for an early video game system, the Magnavox Odyssey.

Another input device that had come along was the trackball. Ralph Benjamin of the British Royal Navy’s Scientific Service invented the trackball, or ball tracker, for radar plotting on the Comprehensive Display System, or CDS. The computers were analog at the time, but they could still use the X-Y coordinates from the trackball, which was patented in 1947. Tom Cranston, Fred Longstaff, and Kenyon Taylor had seen the CDS trackball and used it as the primary input for DATAR, a radar-driven battlefield visualization computer. The trackball stayed in radar systems into the 60s, when Orbit Instrument Corporation made the X-Y Ball Tracker and then Telefunken turned it upside down to control the TR 440, making an early mouse type of device.

The last of the options Engelbart decided against was the light pen. Light guns had shown up in the 1930s when engineers realized that a vacuum tube was light-sensitive: you could shoot a beam of light at a tube and it could react. Robert Everett worked with Jay Forrester to develop the light pen, which would allow people to interact with a CRT using light sensing to cause an interrupt on a computer.
This would move to the SAGE computer system from there and eke its way into the IBM mainframes in the 60s. While the technology used to track the coordinates is not even remotely similar, think of this as conceptually similar to the styluses used with tablets and on Wacom tablets today. Paul Morris Fitts had built a model in 1954, now known as Fitts’s Law, to predict the time required to move a pointer to a target on a screen. He modeled the difficulty of hitting a target as a function of the ratio between the distance to the target and the width of the target. If you listen to enough episodes of this podcast, you’ll hear a few names repeatedly. One of those is Claude Shannon. He brought a lot of the math to computing in the 40s and 50s and helped with the Shannon-Hartley Theorem, which defined information transmission rates over a given medium.

So these were the main options at Engelbart’s disposal to test when he started ARC. But in looking at them, he had another idea. He’d sketched out the mouse in 1961 while sitting in a conference session about computer graphics. Once he had funding, he brought in Bill English to build a prototype in 1963. The first model used two perpendicular wheels attached to potentiometers that tracked movement. It had one button to select things on a screen. It tracked x,y coordinates as had previous devices. NASA funded a study to really dig in and decide which was the best device. He, Bill English, and an extremely talented team spent two years researching the question, publishing a report in 1965. They really had the blinders off, too. They looked at the DEC Grafacon, joysticks, light pens, and even what amounts to a knee-operated mouse. Two years of what we’d call UX research or user research today. Few organizations would dedicate that much time to study something. But the result would be patenting the mouse in 1967, an innovation that would last for over 50 years.

I’ve heard Engelbart criticized for taking so long to build the oNline System, or NLS, which he showcased at the Mother of All Demos. But it’s worth thinking of his research as academic in nature. It was government funded. And it changed the world. His paper on Computer-Aided Display Controls was seminal. Vietnam caused a lot of those government funded contracts to dry up. From there, Bill English and a number of others from the Stanford Research Institute, which ARC was a part of, moved to Xerox PARC. English and Jack Hawley iterated and improved the technology of the mouse, ditching the analog to digital converters, and over the next few years we’d see some of the most substantial advancements in computing. By 1981, Xerox had shipped the Alto and the Star.

But while Xerox would be profitable with their basic research, they would miss something that a sandal-clad hippie wouldn’t. In 1979, Xerox let Steve Jobs make three trips to PARC in exchange for the opportunity to buy 100,000 shares of Apple stock pre-IPO. The mouse by then had evolved to a three-button mouse that cost $300. It didn’t roll well and had to be used on pretty specific surfaces. Jobs would call Dean Hovey, a co-founder of IDEO, and demand they design one that would work on anything, including quote “blue jeans.” Oh, and he wanted it to cost $15. And he wanted it to have just one button, which would be an Apple hallmark for the next 30ish years. Hovey-Kelley would move to optical encoder wheels, freeing the tracking ball to move however it needed to, and then use injection molded frames. And thus make the mouse affordable.
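As a side note before we go on: as a rough illustration of the Fitts's Law model mentioned earlier, here is a small Python sketch using the common Shannon formulation of the law. The constants a and b are placeholder values for the example; in real studies they are fit to measured data for a given device.

```python
import math

def fitts_movement_time(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
    """Predict pointing time (seconds) with Fitts's Law, Shannon formulation.

    distance: how far the pointer must travel to the target
    width:    size of the target along the axis of motion
    a, b:     device-dependent constants, normally fit from experiments
    """
    index_of_difficulty = math.log2(distance / width + 1)  # harder when far away or small
    return a + b * index_of_difficulty

# A small, far-away target takes longer to hit than a big, nearby one.
print(fitts_movement_time(distance=800, width=20))   # roughly 0.9 seconds
print(fitts_movement_time(distance=100, width=100))  # roughly 0.25 seconds
```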
It’s amazing what can happen when you combine all that user research and academic rigor from Engelbart’s team and the engineering advancements documented at Xerox PARC with world-class industrial design. You see this trend played out over and over with the innovations in computing that are built to last. The mouse would ship with the Lisa and then with the 1984 Mac. Logitech had shipped a mouse in 1982 for $300. After leaving Xerox, Jack Hawley founded a company to sell a mouse for $400 the same year. Microsoft released a mouse for $200 in 1983. But Apple changed the world when Steve Jobs demanded the mouse ship with all Macs. The IBM PC would use a mouse, and from there it would become ubiquitous in personal computing. Desktops would ship with a mouse. Laptops would have a funny little button that could be used as a mouse when the actual mouse was unavailable. The mouse would ship with extra buttons that could be mapped to additional workflows or macros. And during the rise of the large server farms that would run the upcoming dot-com revolution, even servers were outfitted with KVM switches that shared one keyboard, video display, and mouse between machines. Trays would be put into most racks, with a single U, or unit, of the rack used to see what you’re working on, especially after Windows and other windowing servers started to ship.

As various technologies matured, other innovations came along in input devices. The mouse would go optical in 1980 and ship with early Xerox Star computers, but what we think of as an optical mouse wouldn’t really ship until 1999, when Microsoft released the IntelliMouse. Some of that tech came to them via Hewlett-Packard through the HP acquisition of DEC, and some of those same Digital Research Institute engineers had been brought in from PARC, the original mainstreamer of the mouse, when Bob Taylor started DRI. The LED sensor on the mouse stuck around. And thus ended the era of the mouse pad, once a hallmark of many a marketing give-away.

Finger tracking devices came along in 1969 but were far too expensive to produce at the time. As capacitive sensing pads, or trackpads, came down in price and the technology matured, those began to replace the previous mouse-types of devices. The 1982 Apollo computers were the first to ship with a touchpad, but it wasn’t until Synaptics launched the TouchPad in 1992 that they began to become common, showing up in 1995 on Apple laptops and then becoming ubiquitous over the coming years. In fact, for a while the IBM ThinkPad and many others shipped laptops with little red nubs in the keyboard for people that didn’t want to use the TouchPad.

Some advancements in the mouse didn’t work out. Apple released the hockey puck shaped mouse in 1998, when they released the iMac. It was USB, which replaced the ADB interface. USB lasted. The shape of the mouse didn’t. Apple would go to the monolithic surface mouse in 2000, go wireless in 2003, and then release the Mighty Mouse in 2005. The Mighty Mouse had a capacitive touch sensor, and since people wanted to hear a click, it would produce one with a little speaker. This also signified the beginning of Bluetooth as a means of connecting a mouse. Laptops began to replace desktops for many, and so the mouse itself isn’t as dominant today. And with mobile and tablet computing, touchscreens rose to replace many uses for the mouse.
But even today, when I edit these podcasts, I often switch over to a mouse simply because other means of dragging around timelines aren’t as graceful. And using a pen, as Engelbart’s research from the 60s indicated, simply gets fatiguing. Whether or not it’s always obvious, we have an underlying story we’re often trying to tell with each of these episodes. We obviously love unbridled innovation and a relentless drive towards a technologically utopian multiverse. But taking a step back during that process and researching what people want means less work and faster adoption. Doug Engelbart was a lot of things, but one net-new point we’d like to make is that he was possibly the most innovative in harnessing user research to make sure that his innovations would last for decades to come. Today, we’d love to research every button and heat map and track eyeballs. But remembering, as he did, that our job is to augment human intellect, and that this is best done by making our advances useful, helps keep us, and the forks in technology that branch off from our work, from having to backtrack decades in order to take the next jump forward. We believe in the reach of your innovations. So next time you’re working on a project, save yourself time, save your code a little cyclomatic complexity, and save users the frustration of having to relearn a whole new thing. Research what you’re going to do first. Because you never know. Something you engineer might end up being touched by nearly every human on the planet the way the mouse has. Thank you, Engelbart. And thank you to NASA and Bob Roberts from ARPA for funding such important research. And thank you to Xerox PARC, for carrying the torch. And to Steve Jobs for making the mouse accessible to everyday humans. As with many an advance in computing, there are a lot of people that deserve a little bit of the credit. And thank you listeners, for joining us for another episode of the History of Computing Podcast. We’re so lucky to have you. Now stop consuming content and go change the world.


Before The Web, There Was Gopher

     10/23/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to talk about Gopher. Gopher was in some ways a precursor to the world wide web, or more specifically, to HTTP.

The University of Minnesota was founded in 1851. It gets cold in Minnesota. Like really cold. And sometimes, it’s dangerous to walk around outside. As the University grew, they needed ways to get students between buildings on campus. So they built tunnels. But that’s not where the name came from. The name actually comes from a political cartoon. In the cartoon a bunch of not-cool railroad tycoons were pulling a train car to the legislature. The rest of the country just knew it was cold in Minnesota and there must be gophers there. That evolved into the Gopher State moniker, the Gopher mascot of the U, and later the Golden Gophers. The Golden Gophers were once a powerhouse in college football. They have won the 8th-most national titles of any university in college football, although they haven’t nailed one since 1960.

Mark McCahill turned 4 years old that year. But by the late 80s he was in his thirties. McCahill had graduated from the U in 1979 with a degree in chemistry. By then he managed the Microcomputer Center at the University of Minnesota–Twin Cities. The University of Minnesota had been involved with computers for a long time. The Minnesota Educational Computing Consortium had made software for schools, like The Oregon Trail. And even before then they’d worked with Honeywell, IBM, and a number of research firms. At this point, the University of Minnesota had been connected to the ARPANET, which was evolving into the Internet, and everyone wanted it to be useful. But it just wasn’t yet. Maybe TCP/IP wasn’t the right way to connect to things. I mean, maybe BITNET was. But by then we knew it was all about TCP/IP. They’d used FTP. And they saw a lot of promise in the tidal wave you could just feel coming of this Internet thing.

There was just one little problem. A turf war had been raging for a time, with the suit-and-tie batch-processed mainframe crowd thinking that big computers were the only place real science could happen, and the personal computer kids thinking that the computer should be democratized and that everyone should have one. So McCahill writes a tool called POPmail to make it easy for people to access this weird thing called email on the Macs that were starting to show up at the University. This led to his involvement writing tools for departments.

1991 rolls around, and some of the department heads around the University meet for months to make a list of things they want out of a network of computers around the school. Enter Farhad Anklesaria. He’d been working with those department heads and reduced their demands to something he could actually ship: a server that hosted some files and a client that accessed the files. McCahill added a search option and combined the two. They brought in four other programmers to help finish the coding. They finished the first version in about three weeks. Of those original programmers, Bob Alberti, who’d already helped write an early online multiplayer game, named his Gopher server Indigo after the Indigo Girls. Paul Lindner named one of his Mudhoney. They coded between taking support calls in the computing center.
They’d invented bookmarks and hyperlinks, which led McCahill to coin the term “surf the internet.” Computers at the time didn’t come with the software necessary to access the Internet, but Apple was kind enough to include a networking library at the time. People could get on the Internet and pretty quickly find some documents. Modems weren’t fast enough to add graphics yet. But using Gopher you could search the internet and retrieve information linked from all around the world. Wacky idea, right? The world wanted it.

They gave it the name of the school’s mascot to keep the department heads happy. It didn’t work. It wasn’t a centralized service hosted on a mainframe. How dare they. They were told not to work on it any more but kept going anyway. They posted an FTP repository of the software. People downloaded it and even added improvements. And it caught fire underneath the noses of the University. This was one of the first rushes on the Internet. These days you’d probably be labeled a decacorn for the type of viral adoption they got. The White House jumped on the bandwagon. MTV veejay Adam Curry wore a gopher shirt when they announced their Gopher site. There were GopherCons. Al Gore showed up. He wasn’t talking about the Internet as though it were a bunch of tubes yet.

Then Tim Berners-Lee put the first website up in 1991, introducing HTML and HTTP, and what we now know as the web was slowly growing. McCahill worked with Berners-Lee, Marc Andreessen of Netscape, Alan Emtage, and former MIT whiz kid Peter J. Deutsch. Oh, and the czar of the Internet, Jon Postel. McCahill needed a good way of finding things on his new Internet protocol. So he invented something that we still use constantly: URLs, or Uniform Resource Locators. You know when you type http://www.google.com, that’s a URL. The http indicates the protocol to use. Every computer has a default handler for those protocols. Everything following the :// is the address on the Internet of the object. Gopher of course was gopher://. FTP was ftp://, and so on. There’s of course more to the spec, but that’s the first part.

Suddenly there were competing standards. And as with many rapid rushes to adopt a technology, Gopher started to fall off and the web started to pick up. Gopher went through the hoops. It went to an IETF RFC in 1993 as RFC 1436, The Internet Gopher Protocol (a distributed document search and retrieval protocol). I first heard of Mark McCahill when I was on staff at the University of Georgia and had to read up on how to implement this weird Gopher thing. I was tasked with deploying Gopher to all of the Macs in our labs. And I was fascinated, as were so many others, with this weird new thing called the Internet. The internet was decentralized. The Internet was anti-authoritarian. The Internet was the Sub Pop Records of the computing world. But bands come and go.

And the University of Minnesota wanted to start charging a licensing fee. That started the rapid fall of Gopher and the rise of the HTML-driven web from Berners-Lee. It backfired. People were mad. The team hadn’t grown or gotten headcount or funding. The team got defensive publicly, and while Gopher traffic continued to grow, the traffic on the web grew 300 times faster. The web came with no licensing. Yet. Modems got faster. The web added graphics. In 1995 an accounting disaster came to the U, and the team got reassigned to work on building a modern accounting system. At a critical time, they didn’t add graphics. They didn’t further innovate.
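As an aside, the protocol they’d built was strikingly simple. In the spirit of RFC 1436, mentioned above, a Gopher client opens a TCP connection to port 70, sends a selector string terminated by a carriage return and line feed, and reads back either a document or a menu of tab-separated items. Here is a minimal, illustrative sketch in Python, not the original team's code; the host gopher.floodgap.com is just one example of a public Gopher server and can be swapped for any other.

```python
import socket

def gopher_fetch(host: str, selector: str = "", port: int = 70) -> str:
    """Send a Gopher selector and return the server's raw response."""
    with socket.create_connection((host, port), timeout=10) as sock:
        # A Gopher request is just the selector string terminated by CRLF.
        sock.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

if __name__ == "__main__":
    # An empty selector asks for the root menu. Each menu line is a tab-separated
    # record: an item-type character plus display text, a selector, a host, a port.
    print(gopher_fetch("gopher.floodgap.com"))
```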
The air was taken out of their sails by the licensing drama and the lack of funding. Things were easier back then. You could spin up a server on your computer and other people could communicate with it without fear of your identity being stolen. There was no credit card data on the computer. There was no commerce. But by the time I left the University of Georgia we were removing the Gopher apps in favor of NCSA Mosaic and then Netscape. McCahill has since moved on to Duke University. Perhaps his next innovation will be called Document Devil or World Wide Devil. Come to think of it, that might not be the best idea. Wouldn’t wanna’ upset the Apple Cart. Again.

The web as we know it today wasn’t just some construct that happened in a vacuum. Gopher was the most popular protocol to come before it, but there were certainly others. In those three years, people saw the power of the Internet and wanted to get in on that. They were willing it into existence. Gopher was first, but the web built on top of the wave that Gopher started. Many browsers still support Gopher, either directly or using an extension to render documents. But Gopher itself is no longer much of a thing.

What we’re really getting at is that the web as we know it today was deterministic. Which is to say that it was almost willed into being. It wasn’t a random occurrence. The very idea of a decentralized structure was being willed into existence by people who wanted to supplement human capacity, or by a variety of other motives, including “cause it seemed cool at the time, man.” It was almost independent of the actions of any specific humans. It was just going to happen, as though the free will of any individual actors had been removed from the equation. Bucking authority, like the department heads at the U, hackers from around the world just willed this internet thing into existence. And all these years later, many of us are left in awe at their accomplishments. So thank you to Mark and the team for giving us Gopher, and for the part it played in the rise of the Internet.


BASIC

     11/24/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate the future! Today we’re going to look at the history of the BASIC programming language. We say BASIC, but really BASIC is more than just a programming language. It’s a family of languages, and it stands for Beginner’s All-purpose Symbolic Instruction Code. As the name implies, it was written to help students that weren’t math nerds learn how to use computers. When I was selling a house one time, someone was roaming around in my back yard; apparently they’d been to an open house, and they asked if I’m a computer scientist after they saw a dozen books I’d written on my bookshelf. I really didn’t know how to answer that question.

We’ll start this story with Hungarian John George Kemeny. This guy was pretty smart. He was born in Budapest and moved to the US with his family in 1940, when his family fled anti-Jewish sentiment and laws in Hungary. Some of his family would go on to die in the Holocaust, including his grandfather. But safely nestled in New York City, he would graduate high school at the top of his class and go on to Princeton. Check this out: he took a year off to head out to Los Alamos and work on the Manhattan Project under Nobel laureate Richard Feynman. That’s where he met fellow Hungarian immigrant John von Neumann - two of a group George Marx wrote about in his book on great Hungarian emigrant scientists and thinkers, The Martians. When he got back to Princeton he would get his doctorate and act as an assistant to Albert Einstein. Seriously, THE Einstein. Within a few years he was a full professor at Dartmouth and would go on to publish great works in mathematics. But we’re not here to talk about those contributions to making the world an all-around awesome place.

You see, by the 60s math was evolving to the point that you needed computers. And Kemeny and Thomas Kurtz would do something special. Now Kurtz was another Dartmouth professor who got his PhD from Princeton. He and Kemeny got thick as thieves and wrote the Dartmouth Time-Sharing System (keep in mind that time sharing was all the rage in the 60s, as it gave more and more budding computer scientists access to those computer-things that, prior to the advent of Unix and the PC revolution, had mostly been reserved for the high priests of places like IBM). So time sharing was cool, but the two of them would go on to do something far more important. In 1956, they wrote DARSIMCO, or Dartmouth Simplified Code. As with Pascal, you can blame Algol. Wait, no one has ever heard of DARSIMCO? Oh… I guess they wrote that other language you’re here to hear the story of as well.

So in '59 they got a half million dollar grant from the Alfred P. Sloan foundation to build a new department building. That’s when Kurtz actually joined the department full time. Computers were just going from big batch-processed behemoths to interactive systems. They tried teaching with DARSIMCO, FORTRAN, and the Dartmouth Oversimplified Programming Experiment, a classic acronym for 1960s-era DOPE. But they didn’t love the command structure, nor the fact that the languages didn’t produce feedback immediately. What was it called? Oh, so in 1964, Kemeny wrote the first iteration of the BASIC programming language, and Kurtz joined him very shortly thereafter. They did it to teach students how to use computers. It’s that simple.
And as most software was free at the time, they released it to the public. We might think of this as open source-ish by today's standards. I say ish, as Dartmouth actually chose to copyright BASIC. Kurtz has said that the name BASIC was chosen because “We wanted a word that was simple but not simple-minded, and BASIC was that one.”

The first program I wrote was in BASIC. BASIC used line numbers and read kinda’ like the English language. The first line of my program said 10 PRINT “Charles was here”, and the computer responded that “Charles was here”. The second program I wrote just added a second line that said: 20 GOTO 10. Suddenly “Charles was here” took up the whole screen, and I had to ask the teacher how to terminate the signal. She rolled her eyes and handed me a book. And that, my friend, was the end of me for months. That was on an Apple IIc. But a lot happened with BASIC between 1964 and then.

As with many technologies, it took some time to float around and evolve. The syntax was kinda’ like a simplified FORTRAN, making my FORTRAN classes in college a breeze. That initial distribution evolved into Dartmouth BASIC, and they received a $300k grant and used student labor to write the initial BASIC compiler. Mary Kenneth Keller was one of those students and went on to finish her doctorate in '65 along with Irving Tang, the two becoming the first PhDs in computer science. After that she went off to Clarke College to found their computer science department.

The language is pretty easy. I mean, like Pascal, it was made for teaching. It spread through universities like wildfire during the rise of minicomputers like the PDP from Digital Equipment and the resultant Data General Nova. This led to the first text-based games in BASIC, like Star Trek. And then came the Altair and one of the most pivotal moments in the history of computing: the porting of BASIC to the platform by Microsoft co-founders Bill Gates and Paul Allen. Tiny BASIC appeared at about the same time, and suddenly everyone needed “a BASIC.” You had Commodore BASIC, BBC BASIC, BASIC for the Trash-80, the Apple II, Sinclair, and more. Programmers from all over the country had learned BASIC in college on minicomputers, and when the PC revolution came, a huge part of it was the explosion of applications, most of which were written in… you got it, BASIC!

I typically think of the end of BASIC coming in 1991, when Microsoft bought Visual Basic off of Alan Cooper and object-oriented programming became the standard. But the things I could do with a simple IF, THEN, ELSE statement. Or a FOR, TO statement, or a WHILE or REPEAT or DO loop. Absolute values, exponential functions, cosines, tangents, even super-simple random number generation. And input and output was just INPUT and PRINT, or LIST for source. Of course, that style of programming was always simpler and more approachable.

So there, you now have Kemeny as a direct connection between Einstein and the modern era of computing. Two immigrants that helped change the world: one famous, the other with a slightly more nuanced but probably no less important impact in a lot of ways. Those early BASIC programs opened our eyes. Games, spreadsheets, word processors, accounting, human resources, databases. Kemeny would go on to chair the commission investigating Three Mile Island, a partial nuclear meltdown that was a turning point for nuclear power.
I wonder what Kemeny thought when he read the following on the Statue of Liberty: Give me your tired, your poor, Your huddled masses yearning to breathe free, The wretched refuse of your teeming shore. Perhaps, like many before and after, he thought that he would breathe free and with that breath, do something great, helping bring the world into the nuclear era and preparing thousands of programmers to write software that would change the world. When you wake up in the morning, you have crusty bits in your eyes and things seem blurry at first. You have to struggle just a bit to get out of bed and see the sunrise. BASIC got us to that point. And for that, we owe them our sincerest thanks. And thank you dear listeners, for your contributions to the world in whatever way they may be. You’re beautiful. And of course thank you for giving me some meaning on this planet by tuning in. We’re so lucky to have you, have a great day!


Once Upon A Friendster

     8/17/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on former social networking pioneer, Friendster. Today when you go to friendster.com you get a page that says the social network is taking a break. The post was put up in 2018. How long did Rip Van Winkle sleep? But what led to the rise of the first big social network and well, what happened? The story begins in 1973. Talkomatic was a chat room and was a hit in the PLATO, or Programmed Logic for Automatic Teaching Operations, community at the University of Illinois, an educational learning system that had been running since 1960. Dave Woolley and Douglas Brown at the University of Illinois brought chat and then the staff built TERM-Talk the same year, adding screen sharing, and PLATO Notes, a message board, would be added as well. This was the inspiration for the name of Lotus Notes. Then in the 80s came Bulletin Board Systems, 84 brought FidoNet, 88 brought IRC, 96 brought ICQ, and in 96 we got Bolt.com, the first social networking and video website, with SixDegrees coming in 1997 as the first real social media website. AOL Instant Messenger showed up the same year and AOL bought ICQ in 98. It was pretty sweet that I didn’t have to remember all those ICQ numbers any more! 1999 - Yahoo! and Microsoft got in the game, launching tools called Messenger at about the same time, and LiveJournal came along, as well as Habbo, a social networking site for games. By 2001 SixDegrees shut down and Messenger was shipped with XP. But 2002. That was the year the Euro hit the street. Before England dissed it. That was the year Israeli and Palestinian conflicts escalated. Actually, that’s a lot of years, regrettably. I remember the scandals at Enron and Worldcom well that year, ultimately resulting in Sarbanes-Oxley to counter the more than 5 trillion dollars in corporate scandals that sent the economy into a tailspin. My Georgia Bulldogs football team beat Arkansas to win the SEC title and then beat Florida State in the Sugar Bowl. Nelly released Hot in Herre and Eminem released Lose Yourself and Without Me. In film, Harry Potter was searching for the Chamber of Secrets and Frodo was on a great trek to the Two Towers. Eminem was in the theaters as well with 8 Mile. And Friendster was launched by Jonathan Abrams in Mountain View, California. They wanted to get people making new friends and meeting in person. It was an immediate hit and people flocked to the site. They grew to three million users in just a few months, catching the attention of investors. As a young consultant, I loved keeping track of my friends who I never got to see in person using Friendster. Napster was popular at the time and the name Friendster came from a mashup of friends and Napster. With this early success, Friendster took $12 million dollars in funding from VC firms Kleiner Perkins Caufield & Byers and Benchmark Capital the next year. That was the year a Harvard student named Mark Zuckerberg launched FaceMash with his classmate Eduardo Saverin for Harvard students in a kinda’ “Hot or Not” game. They would later buy Instagram as a form of euphoric recall, looking back on those days. 
Google has long wanted a social media footprint and tried to buy Friendster in 2003, but when rejected launched Orkut in 2004 - which ended up being popular mostly in Brazil - tried Google Friend Connect in 2008, which lasted until 2012, Google Buzz, which launched in 2010 and only lasted a year, Google Wave, which launched in 2009 and also only lasted a year, and of course, Google+, which ran from 2011 to 2019. Google is back at it again with a new social network called Shoelace out of their Area 120 incubator. The $30 million dollars in Google stock Friendster turned down would be worth a billion dollars today. MySpace was also launched in 2003 by Chris DeWolfe and Tom Anderson, growing to have more traffic than Google over time. But Facebook launched in 2004 and after having problems keeping the servers up and running, Friendster's board replaced Abrams as CEO and moved him to chairman of the board. He was replaced by Scott Sassa. And then in 2005 Sassa was replaced by Taek Kwon and then he was replaced by Kent Lindstrom who was replaced by Richard Kimber. Such rapid churn in the top spot means problems. A rudderless ship. In 2006 they added widgets to keep up with MySpace. They didn’t. They also opened up a developer program and opened up APIs. They still had 52 million unique visitors worldwide in June 2008. But by then, MySpace had grown to 7 times their size. MOL Global, an online payments processor from Malaysia, bought the company in 2009 and relaunched the site. All user data was erased and Friendster provided an export tool to move data to other popular sites at the time, such as Flickr. In 2009 Friendster still had 3 million unique visitors per day, but after the relaunch that dropped to less than a quarter million by the end of 2010. People abandoned the network. What happened? Facebook eclipsed Friendster's traffic in 2009. Friendster became something more used in Asia than the US. Really, though, I remember early technical problems. I remember not being able to log in, so moving over to MySpace. I remember slow loading times. And I remember more and more people spending time on MySpace, customizing their MySpace page. Facebook did something different. Sure, you couldn’t customize the page, but the simple layout loaded fast and was always online. This reminds me of the scene in the show Silicon Valley, when they have to grab the fire extinguisher because they set the house on fire from having too much traffic! In 2010, Facebook acquired Friendster's portfolio of social networking patents for $40 million dollars. In 2011, Newscorp sold MySpace for $35 million dollars after it had been worth far more at its peak in 2008. After continuing its decline, Friendster was sold to a social gaming site in 2015, trying to capitalize on the success that Facebook had with online gaming. But after an immediate burst of users, it too was not successful. In 2018 the site finally closed its doors. Today Friendster is the 651,465th ranked site in the world. There are a few things to think about when you look at the Friendster story: 1. The Internet would not be what it is today without sites like Friendster to help people want to be on it. 2. The first company on a new thing isn’t always the one that really breaks through. 3. You have to, and I mean have to, keep your servers up. This is a critical aspect of maintaining your momentum. I was involved with one of the first 5 Facebook apps. And we had no idea 2 million people would use that app in the weekend it was launched. 
We moved mountains to get more servers and clusters brought online and refactored SQL queries on the fly, working over 70 hours in a weekend. And within a week we hit 10 million users. That app paid for dozens of other projects and was online for years. 4. When investors move in, the founder usually gets fired at the first sign of trouble. Many organizations simply can’t find their equilibrium after that and flounder. 5. Last but not least: Don’t refactor every year, but if you can’t keep your servers up, you might just have too much technical debt. I’m sure everyone involved with Friendster wishes they could go back and do many things differently. But hindsight is always 20/20. They played their part in the advent of the Internet. Without early pioneers like Friendster we wouldn’t be where we are today. As Heinlein said, “yet another crew of Rip Van Winkles.” But Buck Rogers eventually did actually wake back up, and maybe Friendster will as well. Thank you for tuning into another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day!


The Mother Of All Demos

     10/17/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to cover a special moment in time. Picture this if you will. It’s 1968. A collection of some 1,000 of the finest minds in computing is sitting in the audience of the San Francisco Civic Center. They’re at a joint conference of the Association for Computing Machinery and the IEEE, or the Institute of Electrical and Electronics Engineers: the Fall Joint Computer Conference in San Francisco. They’re waiting to see a session called A research center for augmenting human intellect. Many had read Vannevar Bush’s “As We May Think” Atlantic article from 1945 that signified the turning point that inspired so many achievements over the previous 20 years. Many had witnessed the evolution from the mainframe to the transistorized computer to timesharing systems. The presenter for this session would be Douglas Carl Engelbart. ARPA had strongly recommended he come to finally make a public appearance. Director Bob Taylor in fact was somewhat adamant about it. The talk was six years in the making and ARPA and NASA were ready to see what they had been investing in. ARPA had funded his Augmentation Research Center Lab at SRI, or the Stanford Research Institute. The grand instigator J.C.R. Licklider had started the funding in 1963, back when DARPA was still just called ARPA, based on a paper Engelbart published in 1962. But it had really been going since Engelbart got married in 1950 and realized computers could be used to improve human capabilities, to harness the collective intellect, to facilitate truly interactive computing and to ultimately make the world a better place. Engelbart was 25. He was from Oregon, where he got his Bachelors in 48 after serving in World War II as a radar tech. He then came to Berkeley in 53 for his Masters, staying through 1955 to get his PhD. He ended up at Stanford’s SRI. There, he hired people like Don Andrews, Bill Paxton, Bill English, and Jeff Rulifson. And today Engelbart was ready to show the world what his team had been working on. The computer was called the oNLine System, or NLS. Bill English would direct things onsite. Because check this out, not all presenters were onsite on that day in 1968. Instead, some were at ARC in Menlo Park, 30 miles away. To communicate between the two sites they used two 1200 baud modems connecting over a leased line to their office. But they would also use two microwave links. And that was for something crazy: video. The lights went dark. The oNLine System was projected onto a 22 foot high screen using an Eidophor video projector. Bill English would flip the screen up as the lights dimmed. The audience was expecting a tall, thin man to come out to present. Instead, they saw Doug Engelbart on the screen in front of them. The one behind the camera, filming Engelbart, was Stewart Brand, the infamous editor of the Whole Earth Catalog. It seems Engelbart was involved in more than just computers. But people destined to change the world have always travelled in the same circles, I suppose. Engelbart’s face came up on the screen, streaming in from all those miles away. And the screen they would switch back and forth to was the NLS itself. The camera would come in from above Engelbart’s back and the video would be superimposed with the text being entered on the screen. This was already crazy. 
But when you could see where he was typing, there was something… well, extra. He was using a pointing device in his right hand. This was the first demo of a computer mouse, which he had applied for a patent for in 1967. He called it that because it had a tail, which was the cable that connected the wooden contraption to the computer. Light pens had been used up to this point, but it was the first demonstration of a mouse, and the team had actually considered mounting it under the desk and using a knee to move the pointer. But they decided that would just be too big a gap for normal people to imagine and that the mouse would be simpler. Engelbart also used a device we might think of more like a macro pad today. It was modeled after piano keys. We’d later move this type of functionality onto the keyboard using various keystrokes, F keys, and in the case of Apple, command keys. He then opened a document on his screen. Now, people didn’t do a lot of document editing in 1968. Really, computers were pretty much used for math at that point. At least, until that day. That document he opened. He used hyperlinks to access content. That was the first real demo of clickable hypertext. He also copied text in the document. And one of the most amazing aspects of the presentation was that you kinda’ felt like he was only giving you a small peek into what he had. You see, before the demo, they thought he was crazy. Many were probably only there to see a colossal failure of a demo. But instead they saw pure magic. Inspiration. Innovation. They saw text highlighted. They saw windows on screens that could be resized. They saw the power of computer networking. Video conferencing. A stoic Engelbart was clearly pleased with his creation. Bill Paxton and Jeff Rulifson were on the other side, helping with some of the text work. His style worked well with the audience, and of course, it’s easy to win over an audience when they have just been wowed by your tech. But more than that, his inspiration was so inspiring that you can still feel it just watching the videos, all these decades later. Engelbart and the team would receive a standing ovation. And to show it wasn’t smoke and mirrors, ARC let people actually touch the systems and Engelbart took questions. Many people involved would later look back as though it was an unfinished work. And it was. Andy van Dam would later say “Everybody was blown away and thought it was absolutely fantastic and nothing else happened. There was almost no further impact. People thought it was too far out and they were still working on their physical teletypes, hadn't even migrated to glass teletypes yet.” But that’s not really fair or telling the whole story. In 1969 we got the Mansfield Amendment, which slashed military funding for pure scientific research. After that, the budget was cut and the team began to disperse, as was happening with a lot of the government-backed research centers. Xerox was lucky enough to hire Bob Taylor, and many others migrated to Xerox PARC, or the Palo Alto Research Center, which was able to take the concept and actually ship a device in 1973, although not one as mass marketable as later devices would be. That device was the Alto. The Alto would be the machine that inspired the Mac and therefore Windows - so his ideas live on today. His own team got spun out of Stanford and sold, becoming Tymshare and then part of McDonnell Douglas. 
He continued to have more ideas but his concepts were rarely implemented at McDonnell Douglas, so he finally left in 1986, starting the Bootstrap Alliance, which he founded with his daughter. But he succeeded. He wanted to improve the plight of man and he did. Hypertext and movable screens directly influenced a young Alan Kay, who was in the audience and was inspired to write Smalltalk. The demo also inspired Andy van Dam, who built the FRESS hypertext system based on many of the concepts from the talk as well. It also did multiple windows, version control on documents, intradocument hypertext linking, and more. But, it was hard to use. Users needed to know complex commands just to get into the GUI screens. He was also still really into minicomputers and timesharing, and kinda’ missed that the microcomputer revolution was about to hit hard. The hardware hacker movement that was going on all over the country, but most concentrated in the Bay Area, was about to start the long process of putting a computer, and now mobile device, in every home in the world. With smaller and smaller and faster chips, the era of the microcomputer would transition into the era of the client and server. And that was the research we were transitioning to as we moved into the 80s. Charles Irby was a presenter as well, being a designer of NLS. He would go on to lead the user interface design work on the Xerox Star before founding a company, then moving on to VP of development for General Magic, a senior leader at SGI and then the leader of the engineering team that developed the Nintendo 64. Bob Sproull was in the audience watching all this and would go on to help design the Xerox Alto and the first laser printer, and write the Principles of Interactive Computer Graphics before becoming a professor at Carnegie Mellon and then ending up helping create Sun Microsystems Laboratories, becoming the director and helping design asynchronous processors. Butler Lampson was also there, a founder of Xerox PARC, where the Alto was built, and co-creator of Ethernet. Bill Paxton (not the actor) would join him at PARC and later go on to be an early member of the team at Adobe. In 2000, Engelbart would receive the National Medal of Technology for his work. He also got the Turing Award in 1997 and the Lovelace Medal in 2001. He would never lose his belief in the collective intelligence. He wrote Boosting Our Collective IQ in 1995. Engelbart passed away in 2013. He will forever be known as the inventor of the mouse. But he gave us more. He wanted to augment the capabilities of humans, allowing us to do more, rather than replace us with machines. This was in contrast to SAIL and the MIT AI Lab where they were just doing research for the sake of research. The video of his talk is on YouTube, so click on the links in the show notes if you’d like to access it and learn more about such a great innovator. He may not have brought a mass produced system to market, but as with Vannevar Bush’s article more than 20 years before, the research done is a turning point in history; a considerable milestone on the path to the gleaming world we now live in today. The NLS teaches us that while you might not achieve commercial success with years of research, if you are truly innovative, you might just change the world. Sometimes the two simply aren’t mutually exclusive. And when you’re working on a government grant, they really don’t have to be. So until next time, dare to be bold. 
Dare to change the world, and thank you for tuning in to yet another episode of the History of Computing Podcast. We’re so lucky to have you. Have a great day! https://www.youtube.com/watch?v=yJDv-zdhzMY


The Evolution Of The Microchip

     9/13/2019

The Microchip Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on the history of the microchip, or microprocessor. This was a hard episode, because it was the culmination of so many technologies. You don’t know where to stop telling the story - and you find yourself writing a chronological story in reverse chronological order. But few advancements have impacted humanity the way the introduction of the microprocessor has. Given that most technological advances are a convergence of otherwise disparate technologies, we’ll start the story of the microchip with the obvious choice: the light bulb. Thomas Edison first demonstrated the carbon filament light bulb in 1879. William Joseph Hammer, an inventor working with Edison, then noted that if he added another electrode to a heated filament bulb it would glow around the positive pole in the vacuum of the bulb and blacken the wire and the bulb around the negative pole. 25 years later, John Ambrose Fleming demonstrated that if that extra electrode is made more positive than the filament, the current flows through the vacuum, and that the current could only flow from the filament to the electrode and not the other direction. This converted AC signals to DC and represented a boolean gate. In 1904, Fleming was granted Great Britain’s patent number 24850 for the vacuum tube, ushering in the era of electronics. Over the next few decades, researchers continued to work with these tubes. Eccles and Jordan invented the flip-flop circuit at London’s City and Guilds Technical College in 1918, receiving a patent for what they called the Eccles-Jordan Trigger Circuit in 1920. Now, English mathematician George Boole back in the earlier part of the 1800s had developed Boolean algebra. Here he created a system where logical statements could be made in mathematical terms. Those could then be performed using math on the symbols. Only a 0 or a 1 could be used. It took a while, but John Vincent Atanasoff and grad student Clifford Berry harnessed the circuits in the Atanasoff-Berry computer in 1938 at Iowa State University and, using Boolean algebra, successfully solved linear equations but never finished the device due to World War II, when a number of other technological advancements happened, including the development of the ENIAC by John Mauchly and J Presper Eckert from the University of Pennsylvania, funded by the US Army Ordnance Corps, starting in 1943. By the time it was taken out of operation, the ENIAC had 20,000 of these tubes. Each digit in an algorithm required 36 tubes. Ten digit numbers could be multiplied at 357 per second, showing the first true use of a computer. John Von Neumann was the first to actually use the ENIAC when they used one million punch cards to run the computations that helped propel the development of the hydrogen bomb at Los Alamos National Laboratory. The creators would leave the University and found the Eckert-Mauchly Computer Corporation. Out of that later would come the Univac and the ancestor of today’s Unisys Corporation. These early computers used vacuum tubes to replace gears that were in previous counting machines and represented the First Generation. But the tubes for the flip-flop circuits were expensive and had to be replaced way too often. The second generation of computers used transistors instead of vacuum tubes for logic circuits. 
The integrated circuit is basically a wire set into silicon or germanium that can be set to on or off based on the properties of the material. These replaced vacuum tubes in computers to provide the foundation of the boolean logic. You know, the zeros and ones that computers are famous for. As with most modern technologies the integrated circuit owes its origin to a number of different technologies that came before it was able to be useful in computers. This includes the three primary components of the circuit: the transistor, resistor, and capacitor. The silicon that chips are so famous for was actually discovered by Swedish chemist Jöns Jacob Berzelius in 1824. He heated potassium chips in a silica container and washed away the residue and voila - an element! The transistor is a semiconducting device with three connections that can amplify or switch a signal. One connection is the source, the second is the drain, and the third is the gate; when a voltage is applied to the gate, current is allowed to flow from the source to the drain. A transistor can therefore act as an on/off switch. The fact they can be on or off is the foundation for Boolean logic in modern computing. The resistor controls the flow of electricity and is used to control the levels and terminate lines. An integrated circuit is also built using silicon, but you print the pattern into the circuit using lithography rather than painstakingly putting little wires where they need to go like radio operators did with the cat’s whisker all those years ago. The idea of the transistor goes back to the mid-30s when William Shockley took the idea of a cat’s whisker, or fine wire touching a galena crystal. The radio operator moved the wire to different parts of the crystal to pick up different radio signals. Solid state physics was born when Shockley, who first studied at Cal Tech and then got his PhD in Physics, started working on a way to make these useable in every day electronics. After a decade in the trenches, Bell gave him John Bardeen and Walter Brattain, who successfully finished the invention in 1947. Shockley went on to design a new and better transistor, known as a bipolar transistor, and helped move us from vacuum tubes, which were bulky and needed a lot of power, to first germanium, which they used initially, and then to silicon. Shockley got a Nobel Prize in physics for his work and was able to recruit a team of extremely talented young PhDs to help work on new semiconductor devices. He became increasingly frustrated with Bell and took a leave of absence. Shockley moved back to his hometown of Palo Alto, California and started a new company called the Shockley Semiconductor Laboratory. He had some ideas that were way before his time and wasn’t exactly easy to work with. He pushed the chip industry forward but in the process spawned a mass exodus of employees, whom he dubbed the “Traitorous 8,” who left in 1957 to create what would become Fairchild Semiconductor. The alumni of Shockley Labs ended up spawning 65 companies over the next 20 years that laid the foundation of the microchip industry to this day, including Intel. If he had been easier to work with, we might not have seen all of that innovation - so in a strange way we can thank Shockley’s abrasiveness! All of these silicon chip makers clustering in one small area of California is what earned the region its Silicon Valley moniker. 
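To make the transistor-as-switch idea a little more concrete, here is a minimal sketch in Python - purely a toy model, not real electronics - that treats each switch as an ideal on/off value, builds a NOR gate out of it, and then cross-couples two NOR gates into the kind of flip-flop Eccles and Jordan patented. The function names and the single-pass update are my own simplifications, not anything from the historical designs.

def nor(a: bool, b: bool) -> bool:
    # A NOR gate: the output is high only when both inputs are low.
    return not (a or b)

def sr_latch(s: bool, r: bool, q: bool) -> bool:
    # One idealized update of a set-reset latch made of two cross-coupled NOR gates.
    # q is the bit currently stored; the return value is the new stored bit.
    q_bar = nor(s, q)      # the complementary output
    return nor(r, q_bar)   # which feeds back into the other gate

state = False
state = sr_latch(s=True,  r=False, q=state)   # set   -> True
state = sr_latch(s=False, r=False, q=state)   # hold  -> still True: that is memory
state = sr_latch(s=False, r=True,  q=state)   # reset -> False
print(state)                                  # prints False

The point is simply that once you have a reliable on/off switch - tube, relay, or transistor - you can build gates, and once you have gates you can build memory and everything else.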
At this point, people were starting to experiment with computers using transistors instead of vacuum tubes. The University of Manchester created the Transistor Computer in 1953. The first fully transistorized computer came in 1955 with the Harwell CADET, MIT started work on the TX-0 in 1956, and the THOR guidance computer for ICBMs came in 1957. But the IBM 608 was the first commercial all-transistor solid-state computer. The RCA 501, Philco Transac S-1000, and IBM 7070 took us through the age of transistors, which continued to get smaller and more compact. At this point, we were really just replacing tubes with transistors. But the integrated circuit would bring us into the third generation of computers. The integrated circuit is an electronic device that has all of the functional blocks put on the same piece of silicon. So the transistor, or multiple transistors, is printed into one block. Jack Kilby of Texas Instruments patented the first miniaturized electronic circuit in 1959, which used germanium and external wires and was really more of a hybrid integrated circuit. Later in 1959, Robert Noyce of Fairchild Semiconductor invented the first truly monolithic integrated circuit, which he received a patent for. Because they did so independently, both are considered creators of the integrated circuit. The third generation of computers was from 1964 to 1971, and saw the introduction of metal-oxide-silicon and printing circuits with photolithography. In 1965 Gordon Moore, also of Fairchild at the time, observed that the number of transistors, resistors, diodes, capacitors, and other components that could be shoved into a chip was doubling about every year and published an article with this observation in Electronics Magazine, forecasting what’s now known as Moore’s Law. The integrated circuit gave us the DEC PDP and later the IBM S/360 series of computers, making computers smaller, and brought us into a world where we could write code in COBOL and FORTRAN. A microprocessor is one type of integrated circuit. They’re also used in audio amplifiers, analog integrated circuits, clocks, interfaces, etc. But in the early 60s, the Minuteman missile program and the US Navy contracts were practically the only ones using these chips, at this point numbering in the hundreds, bringing us into the world of the MSI, or medium-scale integration chip. Moore and Noyce left Fairchild and founded NM Electronics in 1968, later renaming the company to Intel, short for Integrated Electronics. Federico Faggin came over in 1970 to lead the MCS-4 family of chips. These along with other chips that were economical to produce started to result in chips finding their way into various consumer products. In fact, the MCS-4 chips, which split RAM, ROM, CPU, and I/O, were designed for the Nippon Calculating Machine Corporation and Intel bought the rights back, announcing the chip in Electronic News with an article called “Announcing A New Era In Integrated Electronics.” Together, they built the Intel 4004, the first microprocessor that fit on a single chip. They buried the contacts in multiple layers and introduced 2-phase clocks. Silicon oxide was used to layer integrated circuits onto a single chip. Here, the microprocessor, or CPU, splits the arithmetic and logic unit, or ALU, the bus, the clock, the control unit, and registers up so each can do what they’re good at, but live on the same chip. The 1st generation of the microprocessor was from 1971, when these 4-bit chips were mostly used in guidance systems. 
This boosted the speed by five times. The forming of Intel and the introduction of the 4004 chip can be seen as one of the primary events that propelled us into the evolution of the microprocessor and the fourth generation of computers, which lasted from 1972 to 2010. The Intel 4004 had 2,300 transistors. The Intel 4040 came in 1974, giving us 3,000 transistors. It was still a 4-bit data bus but jumped to 12-bit ROM. The architecture was also from Faggin but the design was carried out by Tom Innes. We were firmly in the era of LSI, or Large Scale Integration chips. These chips were also used in the Busicom calculator, and even in the first pinball game controlled by a microprocessor. But getting a true computer to fit on a chip, or a modern CPU, remained an elusive goal. Texas Instruments ran an ad in Electronics with a caption that the 8008 was a “CPU on a Chip” and attempted to patent the chip, but couldn’t make it work. Faggin went to Intel and they did actually make it work, giving us the first 8-bit microprocessor, the 8008, in 1972. It was then redesigned as the 8080, which was fabricated and put on the market in 1974. Intel made the R&D money back in 5 months and sparked the idea for Ed Roberts to build The Altair 8800. Motorola and Zilog brought competition in the 6800 and Z-80, which was used in the Tandy TRS-80, one of the first mass produced computers. N-MOS transistors on chips allowed for new and faster paths and MOS Technology soon joined the fray with the 6501 and 6502 chips in 1975. The 6502 ended up being the chip used in the Apple I, Apple II, NES, Atari 2600, BBC Micro, Commodore PET and Commodore VIC-20. The MOS 6510 variant was then used in the Commodore 64. The 8086 was released in 1978 with 29,000 transistors and marked the transition to Intel’s x86 line of chips, setting what would become the standard in future chips. But the IBM PC wasn’t the only place you could find chips. The Motorola 68000 was used in the Sun-1 from Sun Microsystems, the HP 9000, the DEC VAXstation, the Commodore Amiga, the Apple Lisa, the Sinclair QL, the Sega Genesis, and the Mac. The chips were also used in the first HP LaserJet and the Apple LaserWriter and used in a number of embedded systems for years to come. As we rounded the corner into the 80s it was clear that the computer revolution was upon us. A number of computer companies were looking to do more than what they could do with the existing Intel, MOS, and Motorola chips. And ARPA was pushing the boundaries yet again. Carver Mead of Caltech and Lynn Conway of Xerox PARC saw the density of transistors in chips starting to plateau. So with DARPA funding they went out looking for ways to push the world into the VLSI era, or Very Large Scale Integration. The VLSI project resulted in the concept of fabless design houses, such as Broadcom, 32-bit graphics, BSD Unix, and RISC processors, or Reduced Instruction Set Computer processors. Out of the RISC work done at UC Berkeley came a number of new options for chips as well. One company, Acorn Computers, evaluated a number of chips and decided to develop their own, using VLSI Technology (a company founded by more Fairchild Semiconductor alumni) to manufacture the chip in their foundry. Sophie Wilson (then Roger Wilson) worked on an instruction set for the new RISC chip. Out of this came the Acorn RISC Machine, or ARM chip. Over 100 billion ARM processors have been produced, well over 10 for every human on the planet. You know that fancy new A13 that Apple announced? It uses a licensed ARM core. 
Another chip that came out of the RISC family was the SUN SPARC. Sun being short for Stanford University Network and co-founder Andy Bechtolsheim having come from Stanford, they were close to the action, and they released the SPARC in 1986. I still have a SPARC 20 I use for this and that at home. Not that SPARC has gone anywhere. They’re just made by Oracle now. The Intel 80386 chip was a 32-bit microprocessor released in 1985. The first chip had 275,000 transistors, taking plenty of pages from the lessons learned in the VLSI projects. Compaq built a machine on it, but really the IBM PC/AT made it an accepted standard, although this was the beginning of the end of IBM’s hold on the burgeoning computer industry. And AMD, yet another company founded by Fairchild defectors, created the Am386 in 1991, ending Intel’s nearly 5 year monopoly on the PC clone industry and ending an era where AMD was a second source of Intel parts but instead was competing with Intel directly. We can thank AMD’s aggressive competition with Intel for helping to keep the CPU industry going along Moore’s law! At this point transistors were only 1.5 microns in size. Much, much smaller than a cat’s whisker. The Intel 80486 came in 1989 and, again tracking against Moore’s Law, we hit the first 1 million transistor chip. Remember how Compaq helped end IBM’s hold on the PC market? When the Intel 486 came along they went with AMD. This chip was also important because we got L1 caches, meaning that chips didn’t need to send instructions to other parts of the motherboard but could do caching internally. From then on, the L1 and later L2 caches would be listed on all chips. We’d finally broken 100MHz! Motorola released the 68040 in 1990, hitting 1.2 million transistors, and giving Apple the chip that would define the Quadra and also that L1 cache. The DEC Alpha came along in 1992, also a RISC chip, but really kicking off the 64-bit era. While the most technically advanced chip of the day, it never took off and after DEC was acquired by Compaq and Compaq by HP, the IP for the Alpha was sold to Intel in 2001, with the PC industry having just decided they could have all their money. But back to the 90s, ‘cause life was better back when grunge was new. At this point, hobbyists knew what the CPU was but most normal people didn’t. The concept that there was a whole Univac on one of these never occurred to most people. But then came the Pentium. Turns out that giving a chip a name and some marketing dollars not only made Intel a household name but solidified their hold on the chip market for decades to come. While the Intel Inside campaign started in 1991, after the Pentium was released in 1993, the case of most computers would have a sticker that said Intel Inside. Intel really one-upped everyone. The first Pentium, the P5 or 586 or 80501, had 3.1 million transistors built on an 800 nanometer process. Computers kept getting smaller and cheaper and faster. Apple answered by moving to the PowerPC chip from IBM, which owed much of its design to RISC. Exactly 10 years after the famous 1984 Super Bowl Commercial, Apple was using a CPU from IBM. Another advance came in 2001 when IBM shipped the POWER4 chip and gave the world multi-core processors, or a CPU that had multiple CPU cores inside the CPU. Once parallel processing caught up to being able to have processes that consumed the resources on all those cores, we saw Intel's Pentium D and AMD's Athlon 64 X2 released in May 2005, bringing multi-core architecture to the consumer. 
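Since this episode keeps tracking chips against Moore’s Law, here is a quick back-of-the-envelope sketch in Python. It is only a rough model: I am assuming a doubling roughly every two years from the 4004’s 2,300 transistors in 1971, and the comparison figures are simply the ones cited in this episode.

BASE_YEAR, BASE_COUNT = 1971, 2_300   # the Intel 4004

def projected_transistors(year, doubling_years=2.0):
    # Project a transistor count under a simple fixed-doubling-period model.
    return int(BASE_COUNT * 2 ** ((year - BASE_YEAR) / doubling_years))

# Transistor counts cited in this episode, for comparison.
cited = {
    1978: 29_000,           # Intel 8086
    1985: 275_000,          # Intel 80386
    1993: 3_100_000,        # Pentium P5
    2019: 32_000_000_000,   # AMD Epyc Rome
}

for year, actual in cited.items():
    print(f"{year}: projected ~{projected_transistors(year):,} vs cited {actual:,}")

The simple model lands within a factor of two or so of every figure above, which is about as good as one can hope for from a doubling rule stretched across five decades.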
Multi-core designs led to even more parallel processing and an explosion in the number of cores helped us continue on with Moore’s Law. There are now custom chips that reach into the thousands of cores today, although most laptops have maybe 4 cores in them. Setting multi-core architectures aside for a moment, back to Y2K, when Justin Timberlake was still a part of NSYNC. Then came the Pentium Pro, Pentium II, Celeron, Pentium III, Xeon, Pentium M, Xeon LV, Pentium 4. On the IBM/Apple side, we got the G3 with 6.3 million transistors, the G4 with 10.5 million transistors, and then the G5 with 58 million transistors and 1,131 feet of copper interconnects, shipping at up to 2GHz in 2003 - so much copper that NSYNC had broken up the year before. The Pentium 4 in 2002 ran at 2.4 GHz and sported 50 million transistors. This is about 1 transistor per dollar made off Star Trek: Nemesis in 2002. I guess Attack of the Clones was better because it grossed over 300 million that year. Remember how we broke the million transistor mark in 1989? In 2005, Intel started testing Montecito with certain customers: an Itanium 2 64-bit CPU with 1.72 billion transistors, shattering the billion mark and hitting a billion two years earlier than projected. Apple CEO Steve Jobs announced Apple would be moving to the Intel processor that year. NeXTSTEP had been happy as a clam on Intel, SPARC or HP’s PA-RISC, so given the rapid advancements from Intel, this seemed like a safe bet and allowed Apple to tell directors in IT departments “see, we play nice now.” And the innovations kept flowing for the next decade and a half. We packed more transistors in, more cache, cleaner clean rooms, faster bus speeds, with Intel owning the computer CPU market and ARM slowly growing out of the ashes of Acorn Computers into the powerhouse that ARM cores are today, embedded in all sorts of other chip designs. I’d say not much interesting has happened, but it’s ALL interesting, except the numbers just sound stupid they’re so big. And we had more advances along the way of course, but it started to feel like we were just miniaturizing more and more, allowing us to do much more advanced computing in general. The fifth generation of computing is all about technologies that we today consider advanced. Artificial Intelligence, Parallel Computing, Very High Level Computer Languages, the migration away from desktops to laptops and even smaller devices like smartphones. ULSI, or Ultra Large Scale Integration, not only tells us that chip designers really have no creativity outside of chip architecture, but also means millions up to tens of billions of transistors on silicon. At the time of this recording, the AMD Epyc Rome is the single chip package with the most transistors, at 32 billion. Silicon is the eighth most abundant element in the universe and the second most in the crust of the planet earth. Given that there’s more chips than people by a huge percentage, we’re lucky we don’t have to worry about running out any time soon! We skipped RAM in this episode. But it kinda’ deserves its own, since RAM is still following Moore’s Law, while the CPU is kinda’ lagging again. Maybe it’s time for our friends at DARPA to get the kids from Berkeley working on VeryUltra Large Scale chips, or VULSIs! Or they could sign on to sponsor this podcast! And now I’m going to go take a VeryUltra Large Scale nap. Gentle listeners I hope you can do that as well. Unless you’re driving while listening to this. Don’t nap while driving. But do have a lovely day. 
Thank you for listening to yet another episode of the History of Computing Podcast. We’re so lucky to have you!


Mavis Beacon Teaches Typing

     10/5/2019

Mavis Beacon Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to give thanks to a wonderful lady. A saint. The woman that taught me to type. Mavis Beacon. Over the years I often wondered what Mavis was like. She took me from a kid that was filled with wonder about these weird computers we had floating around school to someone that could type over a hundred words a minute. She always smiled. I never saw her frown once. I thought she must be a teacher somewhere. She must be a kind lady whose only goal in the world was to teach young people how to type. And indeed, she’s taught over a million people to type in her days as a teacher. In fact she’d been teaching for years by the time I first encountered her. Mavis Beacon Teaches Typing was initially written for MS-DOS in 1987 and released by The Software Toolworks. Norm Worthington and Mike Duffy joined Walt Bilofsky, who started the company out of Sherman Oaks, California in 1980, and the team also made Chessmaster in 1986. They started with software for HDOS, the operating system for Heathkit computers, and machines like the Osborne 1. They worked on Small C and Grogramma, releasing a conversation simulation tool from Joseph Weizenbaum in 1981. They wrote Mavis Beacon Teaches Typing in 1987 for IBM PCs. It took "Three guys, three computers, three beds, in four months”. It was an instant success. They went public in 1988 and were acquired by Pearson in 1994 for around half a billion dollars, becoming Mindscape in 1994. By 1998 she’d taught over 6,000,000 kids to type. Today, Encore Software produces the software and Software MacKiev distributes a version for the Mac. The software integrates with iTunes, supports competitive typing games, and still tracks words-per-minute. But who was Mavis? What inspired her to teach generations of children to type? Why hasn’t she aged? Mavis was named after Mavis Staples, but she was a beacon to anyone looking to learn to type, thus Mavis Beacon. Mavis was initially portrayed by Haitian-born Renée L'Espérance, who was discovered working behind the perfume counter at Saks Fifth Avenue Beverly Hills by talk-show host Les Crane in 1985. He then brought her in to be the model. Featuring an African-American woman regrettably caused some marketing problems but didn’t impact the success of the release. So until the next episode, think about this: Mavis Beacon, real or not, taught me and probably another 10 million kids to type. She opened the door for us to do more with computers. I could never write code or books or even these episodes at the rate I do if it hadn’t been for her. So I owe her my sincerest of gratitude. And Norm Worthington, for having the idea in the first place. And I owe you my gratitude, for tuning into another episode of the History of Computing Podcast. We’re lucky to have you. Have a great day!


The MIT Tech Model Railroad Club

     9/22/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to look at the Tech Model Railroad Club, an obsessive group of young computer hackers that helped to shape a new vision for the young computer industry through the late 50s and early 60s. We’ve all seen parodies of it in the movies. Cue up a montage. Iron Man just can’t help but tinker with new models of his armor. Then voila, these castaway hack jobs are there when a new foe comes along. As is the inspiration to finish them. The Lambda Lambda Lambda guys get back at the jock frat boys in Revenge of the Nerds. The driven inventor in Honey I Shrunk the Kids just can’t help himself but build the most insane inventions. Peter Venkman in Ghostbusters. There’s a drive. And those who need to understand, to comprehend, to make sense of what was non-sensical before. I guess it even goes back to Dr Frankenstein. Some science just isn’t meant to be conquered. But trains. Those are meant to be conquered. They’re the golden spike into the engineering chasm that young freshmen who looked like the cast of Stand By Me, but at MIT, wanted to conquer. You went to MIT in the 50s and 60s because you wanted a deeper understanding of how the world worked. But can you imagine a world where the unofficial motto of the MIT math department was that “there’s no such thing as computer science. It’s witchcraft!” The Tech Model Railroad Club, or TMRC, had started in 1946. World War II had ended the year before and the first UN General Assembly and Security Council met, with Iran filing the first complaint against the Soviet Union and UNICEF being created. Syria got their independence from France. Jordan got their independence from Britain. The Philippines gained their independence from the US. Truman created the Central Intelligence Group, the forerunner of the CIA, Stalin announced a 5 year plan for Russia, ushering in the era of Soviet reconstruction and signaling the beginning of the cold war, which would begin in earnest the next year. Anti-British protests exploded in India, and Attlee agreed to their independence. Ho Chi Minh became president of the Democratic Republic of Vietnam and France recognized their statehood days later, with war between his forces and the French breaking out later that year resulting in French martial law. Churchill gave his famous Iron Curtain Speech. Italy and Bulgaria abolished their monarchies. The US Supreme Court ordered desegregation of interstate buses and Truman created the Committee on Civil Rights using an executive order; he would order the desegregation of the armed forces two years later. And there was no true computer industry. But the ENIAC went into production in 1946. And a group of kids at the Massachusetts Institute of Technology weren’t thinking much about the new world order being formed nor about the ENIAC, which was being installed just a 5 or 6 hour drive away. They were thinking about model trains. And over the next few years they would build, paint, and make these trains run on model tracks. The club was started by Walter Marvin and John Fitzallen Moore - Moore would end up with over a dozen patents after earning his PhD from Columbia and having a long career at Lockheed and then EMI Medical, the company that invented the CT scan. By the mid-50s the club had grown and there were a few groups of people who were really in it for different things. Some wanted to drink Coca-Cola while they painted trains. But the thing that drew many a student was the ARRC, or Automatic Railroad Running Computer. 
This was built by the Signals and Power Subcommittee, who used relays from telephone switches to make the trains do all kinds of crazy things, even cleaning the tracks. Today we’re hacking genes, going to lifehacker.com, and sometimes regrettably getting hacked, or losing data in a breach. But the term came from one who chops or cuts, going back to the 1200s. But on a cool day in 1955, on the third floor of Building 20, known as the Plywood Palace, that would change. Minutes of a meeting at the Tech Model Railroad Club note “Mr. Eccles requests that anyone working or hacking on the electrical system turn the power off to avoid fuse blowing.” Maybe they were chopping parts of train tracks up. Maybe the term was derived from something altogether separate. But this was the beginning of a whole new culture. One that survives and thrives today. Hacking began to mean to do technical things for enjoyment in the club. And those who hacked became hackers. The OG hacker was Jack Dennis, an alumnus of the TMRC. Jack Dennis had gotten his bachelors from MIT in 1953 and moved on to get his Masters then Doctorate by 1958, staying until he retired in 1987, teaching and influencing many subsequent generations of young hackers. You see, he studied artificial intelligence, or taking these computers built by companies like IBM to do math, and making them… intelligent. These switches and relays under the table of the model railroad were a lot of logical circuits strung together and in the days before what we think of as computers now, these were just a poor college student’s way of building a computer. Having skipped two grades in high school, Alan Kotok was drawn to the TMRC by this “computer” in 1958. As was incoming freshman Peter Samson. And Bob Saunders, a bit older than the rest. Then grad student Jack Dennis introduced the TMRC to the IBM 704. A marvel of human engineering. It was like your dad’s shiny new red 1958 Corvette. Way too expensive to touch. But you just couldn’t help it. The young hackers didn’t know it yet, but Marvin Minsky had shown up to MIT in 1958. John McCarthy was a research fellow there. Jack Dennis got his PhD that year. Outside of MIT, Robert Noyce and Jack Kilby were giving us the Integrated Circuit, we got FORTRAN II, and that McCarthy guy. He gave us LISP. No, he didn’t speak with a LISP. He spoke IN LISP. And then President Eisenhower established ARPA in response to Sputnik, to speed up technological progress. Fernando Corbato got his PhD in physics in 1956 and stayed on with the nerds until he retired as well. Kotok ended up writing one of the first chess programs with McCarthy on the IBM 7090 while still a teenager. Everything changed when the TX-0 came over from Lincoln Lab, lovingly referred to as the tikso. Suddenly, they weren’t loading cards into batch processing computers. The old IBM way was the enemy. The new machines allowed them to actually program. They wrote calculators and did work for courses. But Dennis kinda’ let them do most anything they wanted. So of course we ended up with very early computer games as well, with tic tac toe and Mouse in the Maze. These kids would write anything. Compilers? Sure. Assemblers? Got it. They would hover around the signup sheet for access to the tikso and consume every minute that wasn’t being used for official research. At this point, the kids were like the budding laser inventors in Real Genius. They were driven, crazed. And young Peter Deutsch joined them, writing the Lisp 1.5 implementation for the PDP at 12. 
Can you imagine being a 12 year old and holding your own around a group of some of the most influential people in the computer industry? Bill Gosper got to MIT in 1961 and so did the second PDP-1 ever built. Steve Russell joined the team and ended up working on Spacewar! when he wasn’t working on Lisp. Speaking of video games. They made Spacewar during this time with a little help from Kotok, Steve Piner, Samson, Saunders, and Dan Edwards. In fact, Kotok and Saunders created the first gamepad, later made popular by Nintendo, so they could play Spacewar without using the keyboard. This was work that would eventually be celebrated by the likes of Rolling Stone, and Spacewar would in fact later become the software used to smoke test the PDP once it entered into the buying tornado. Ricky Greenblatt got to MIT in 1962. And this unruly, unkempt, and extremely talented group of kids hacked their way through the PDP, with Greenblatt becoming famous for his hacks, hacking away the first FORTRAN compiler for the PDP and spending so much time at the terminal that he didn’t make it through his junior year at MIT. These formative years in their lives were consumed with Coca-Cola, Chinese food, and establishing many paradigms we now consider fundamental in computer science. The real shift from a batch process mode of operations, fed by paper tape and punchcards, to an interactive computer was upon us. And they were the pioneers who, through countless hours of hacking away, found “the right thing.” Project MAC was established at MIT in 1963 using an ARPA grant pushed through by the legendary J. C. R. Licklider, who would himself later run the project. MAC would influence operating systems with Multics, which served as the inspiration for Unix, and the forming of what we now know as computer science through the 1960s and 70s. This represented a higher level of funding and a shift towards the era of development that led to the Internet and many of the standards we still use today. More generations of hackers would follow and continue to push the envelope. But that one special glimpse in time, let’s just say if you listen at just the right frequency you can hear screaming at terminals when a game of Spacewar didn’t go someone’s way, or when something crashed, or with glee when you got “the right thing.” And if you listen hard enough at your next hackathon, you can sometimes hear a Kotok or a Deutsch or a Saunders whisper in your ear exactly what “the right thing” is - but only after sufficient amounts of trial, error, and Spacewar. This free exploration gives way to innovation. That’s why Google famously gives employees free time to pursue their passions. That’s why companies run hackathons. That’s why everyone from DARPA to Netflix has run bounty programs. These young mathematicians, scientists, physicists, and engineers would go on to change the world in their own ways. Uncle John McCarthy would later move to Stanford, where he started the Stanford Artificial Intelligence Laboratory. From there he influenced Sun Microsystems (the S in Sun is for Stanford), Cisco, and dozens of other Silicon Valley powerhouses. Dennis would go on to help found Multics and be an inspiration for Ken Thompson and the first versions of Unix. And after retiring he would go to NASA and then Acorn Networks. Slug Russell would go on to a long career as a developer and then executive, including a stop mentoring two nerdy high school kids at Lakeside School in Seattle. They were Paul Allen and Bill Gates, who would go on to found Microsoft. 
Alan Kotok would go on to join DEC, where he would work for 30 years, influencing much of the computing through the 70s and into the 80s. He would work on the Titan chip at DEC and in the various consortiums around the emergent Internet. He would be a founding member of the World Wide Web Consortium. Ricky Greenblatt ended up spending too much of his time hacking. He would go on to found Lisp Machines, coauthor the time sharing software for the PDP-6 and PDP-10, write Maclisp, and write the first computer chess program to play in human tournaments, famously beating the AI critic Hubert Dreyfus. Peter Samson wrote the Tech Model Railroad Club’s official dictionary, which would evolve into the now-famous Jargon File. He wrote the Harmony compiler, a FORTRAN compiler for the PDP-6, made music for the first time with computers, became an architect at DEC, would oversee hardware engineering at NASA, and continues to act as a docent at the Computer History Museum. Bob Saunders would go on to be a professor at the University of California, becoming president of the IEEE, and Chairman of the Board during some of the most influential years in that great body of engineers and scientists. Peter Deutsch would go on to get his PhD from Berkeley, found Aladdin Enterprises, write Ghostscript, create free PostScript and PDF alternatives, work on Smalltalk, work at Sun, and be an influential mind at Xerox PARC. He is now a composer. We owe a great deal to them. So thank you to these pioneers. And thank you, listeners, for sticking through to the end of this episode of the History of Computing Podcast. We’re lucky to have you.


The Internet Tidal Wave

     8/15/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Todays episode is going to be just a little bit unique. Or not unique as the case may be. Bill Gates sent a very important memo on May 26th, 1995. It’s so important because of how well it foreshadows what was about to happen with this weird thing called the Internet. So we’re going to simply provide the unaltered transcript and if you dig it, read a book or two of his. He is a surprisingly good writer. To: Executive Staff and direct reports From: Bill Gates Date: May 26, 1995 The Internet Tidal Wave Our vision for the last 20 years can be summarized in a succinct way. We saw that exponential improvements in computer capabilities would make great software quite valuable. Our response was to build an organization to deliver the best software products. In the next 20 years the improvement in computer power will be outpaced by the exponential improvements in communications networks. The combination of these elements will have a fundamental impact on work, learning and play. Great software products will be crucial to delivering the benefits of these advances. Both the variety and volume of the software will increase. Most users of communications have not yet seen the price of communications come down significantly. Cable and phone networks are still depreciating networks built with old technology. Universal service monopolies and other government involvement around the world have kept communications costs high. Private networks and the Internet which are built using state of the art equipment have been the primary beneficiaries of the improved communications technology. The PC is just now starting to create additional demand that will drive a new wave of investment. A combination of expanded access to the Internet, ISDN, new broadband networks justified by video based applications and interconnections between each of these will bring low cost communication to most businesses and homes within the next decade. The Internet is at the forefront of all of this and developments on the Internet over the next several years will set the course of our industry for a long time to come. Perhaps you have already seen memos from me or others here about the importance of the Internet. I have gone through several stages of increasing my views of its importance. Now I assign the Internet the highest level of importance. In this memo I want to make clear that our focus on the Internet is crucial to every part of our business. The Internet is the most important single development to come along since the IBM PC was introduced in 1981. It is even more important than the arrival of the graphical user interface (GUI). The PC analogy is apt for many reasons. The PC wasn't perfect. Aspects of the PC were arbitrary or even poor. However a phenomena grew up around the IBM PC that made it a key element of everything that would happen for the next 15 years. Companies that tried to fight the PC standard often had good reasons for doing so but they failed because the phenomena overcame any weaknesses that resisters identified. The Internet Today The Internet's unique position arises from a number of elements. TCP/IP protocols that define its transport level support distributed computing and scale incredibly well. 
The Internet Engineering Task Force (IETF) has defined an evolutionary path that will avoid running into future problems even as eventually everyone on the planet connects up. The HTTP protocols that define HTML Web browsing are extremely simple and have allowed servers to handle incredible traffic reasonably well. All of the predictions about hypertext - made decades ago by pioneers like Ted Nelson - are coming true on the Web. Although other protocols on the Internet will continue to be used (FTP, Gopher, IRC, Telnet, SMTP, NNTP). HTML with extensions will be the standard that defines how information will be presented. Various extensions to HTML, including content enhancements like tables, and functionality enhancements like secure transactions, will be widely adopted in the near future. There will also be enhanced 3D presentations providing for virtual reality type shopping and socialization. Another unique aspect of the Internet is that because it buys communications lines on a commodity bid basis and because it is growing so fast, it is the only "public" network whose economics reflect the latest advances in communications technology. The price paid for corporations to connect to the Internet is determined by the size of your "on-ramp" to the Internet and not by how much you actually use your connection. Usage isn't even metered. It doesn't matter if you connect nearby or half way around the globe. This makes the marginal cost of extra usage essentially zero encouraging heavy usage. Most important is that the Internet has bootstrapped itself as a place to publish content. It has enough users that it is benefiting from the positive feedback loop of the more users it gets, the more content it gets, and the more content it gets, the more users it gets. I encourage everyone on the executive staff and their direct reports to use the Internet. I've attached an appendix, which Brian Flemming helped me pull together that shows some hot sites to try out. You can do this by either using the .HTM enclosure with any Internet browser or, if you have Word set up properly, you can navigate right from within this document. Of particular interest are the sites such as "YAHOO" which provide subject catalogs and searching. Also of interest are the ways our competitors are using their Websites to present their products. I think SUN, Netscape and Lotus do some things very well. Amazingly it is easier to find information on the Web than it is to find information on the Microsoft Corporate Network. This inversion where a public network solves a problem better than a private network is quite stunning. This inversion points out an opportunity for us in the corporate market. An important goal for the Office and Systems products is to focus on how our customers can create and publish information on their LANs. All work we do here can be leveraged into the HTTP/Web world. The strength of the Office and Windows businesses today gives us a chance to superset the Web. One critical issue is runtime/browser size and performance. Only when our Office - Windows solution has comparable performance to the Web will our extensions be worthwhile. I view this as the most important element of Office 96 and the next major release of Windows. One technical challenge facing the Internet is how to handle "real-time" content - specifically audio and video. The underlying technology of the Internet is a packet network which does not guarantee that data will move from one point to another at a guaranteed rate. 
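(A quick editorial aside before the memo continues: the "extremely simple" HTTP that Gates describes above really was, and largely still is, just a short piece of text sent over a TCP connection. Below is a minimal sketch in modern Python of a 1995-style HTTP/1.0 request; the host name is only a placeholder, and none of this code comes from the memo itself.)

# Editorial aside, not part of Gates's memo: a rough sketch of the kind of
# "extremely simple" HTTP exchange described above, written as a raw
# HTTP/1.0 request in modern Python. The host name is just a placeholder.
import socket

def fetch(host, path="/"):
    """Send a bare-bones HTTP/1.0 GET and return the raw response bytes."""
    with socket.create_connection((host, 80)) as sock:
        request = "GET {} HTTP/1.0\r\nHost: {}\r\n\r\n".format(path, host)
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # an HTTP/1.0 server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    response = fetch("example.com")
    # Print just the response headers; the HTML body follows the blank line.
    print(response.split(b"\r\n\r\n", 1)[0].decode("ascii", "replace"))

(The whole exchange is human-readable text over a TCP connection, which is a big part of why early Web servers were quick to write and, as the memo notes, handled traffic reasonably well. Back to the memo.)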
The congestion on the network determines how quickly packets are sent. Audio can be delivered on the Internet today using several approaches. The classic approach is to simply transmit the audio file in its entirety before it is played. A second approach is to send enough of it to be fairly sure that you can keeping playing without having to pause. This is the approach Progressive Networks Real Audio (Rob Glaser's new company) uses. Three companies (Internet Voice Chat, Vocaltec, and Netphone) allow phone conversations across the Internet but the quality is worse than a normal phone call. For video, a protocol called CU-SeeMe from Cornell allows for video conferencing. It simply delivers as many frames per second as it sees the current network congestion can handle, so even at low resolution it is quite jerky. All of these "hacks" to provide video and audio will improve because the Internet will get faster and also because the software will improve. At some point in the next three years, protocol enhancements taking advantage of the ATM backbone being used for most of the Internet will provide "quality of service guarantees". This is a guarantee by every switch between you and your destination that enough bandwidth had been reserved to make sure you get your data as fast as you need it. Extensions to IP have already been proposed. This might be an opportunity for us to take the lead working with UUNET and others. Only with this improvement and an incredible amount of additional bandwidth and local connections will the Internet infrastructure deliver all of the promises of the full blown Information Highway. However, it is in the process of happening and all we can do is get involved and take advantage. I think that virtually every PC will be used to connect to the Internet and that the Internet will help keep PC purchasing very healthy for many years to come. PCs will connect to the Internet a variety of ways. A normal phone call using a 14.4k or 28.8k baud modem will be the most popular in the near future. An ISDN connection at 128kb will be very attractive as the connection costs from the RBOCs and the modem costs come down. I expect an explosion in ISDN usage for both Internet connection and point-to-point connections. Point-to-point allows for low latency which is very helpful for interactive games. ISDN point-to-point allows for simultaneous voice data which is a very attractive feature for sharing information. Example scenarios include planning a trip, discussing a contract, discussing a financial transaction like a bill or a purchase or taxes or getting support questions about your PC answered. Eventually you will be able to find the name of someone or a service you want to connect to on the Internet and rerouting your call to temporarily be a point-to-point connection will happen automatically. For example when you are browsing travel possibilities if you want to talk to someone with expertise on the area you are considering, you simply click on a button and the request will be sent to a server that keeps a list of available agents who can be working anywhere they like as long as they have a PC with ISDN. You will be reconnected and the agent will get all of the context of what you are looking at and your previous history of travel if the agency has a database. The reconnection approach will not be necessary once the network has quality of service guarantees. 
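(Another editorial aside: the "send enough of it to be fairly sure that you can keep playing" approach Gates credits to RealAudio is what we now just call buffered streaming. The toy simulation below, with invented numbers, shows the idea: start playback only after a safety margin has been downloaded, and keep downloading while playing.)

# Editorial aside, not from the memo: a toy simulation of buffered streaming -
# download ahead of playback, start playing once a safety margin is buffered,
# and keep filling the buffer while playing. All numbers are illustrative.
from collections import deque
import random

def stream_and_play(total_seconds=30, prebuffer_seconds=5):
    buffer = deque()          # seconds of audio downloaded but not yet played
    downloaded = played = 0
    while played < total_seconds:
        # The network delivers a variable amount each tick (congestion varies).
        if downloaded < total_seconds:
            arrived = random.choice([0, 1, 1, 2])
            for _ in range(min(arrived, total_seconds - downloaded)):
                buffer.append(downloaded)
                downloaded += 1
        # Playback consumes one second per tick once the prebuffer is filled.
        if buffer and (played > 0 or len(buffer) >= prebuffer_seconds):
            buffer.popleft()
            played += 1
        elif played > 0 and not buffer:
            print("rebuffering...")  # the pause this approach tries to avoid
    print("finished playing {} seconds of audio".format(played))

if __name__ == "__main__":
    stream_and_play()

(Back to the memo.)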
Another way to connect a PC will be to use a cable-modem that uses the coaxial cable normally used for analog TV transmission. Early cable systems will essentially turn the coax into an Ethernet so that everyone in the same neighborhood will share a LAN. The most difficult problem for cable systems is sending data from the PC back up the cable system (the "back channel"). Some cable companies will promote an approach where the cable is used to send data to the PC (the "forward channel") and a phone connection is used for the back channel. The data rate of the forward channel on a cable system should be better than ISDN. Eventually the cable operators will have to do a full upgrade to an ATM-based system using either all fiber or a combination of fiber and Coax - however, when the cable or phone companies will make this huge investment is completely unclear at this point. If these buildouts happen soon, then there will be a loose relationship between the Internet and these broadband systems. If they don't happen for some time, then these broadband systems could be an extension of the Internet with very few new standards to be set. I think the second scenario is very likely.

Three of the biggest developments in the last five years have been the growth in CD titles, the growth in On-line usage, and the growth in the Internet. Each of these had to establish critical mass on their own. Now we see that these three are strongly related to each other and as they come together they will accelerate in popularity. The On-line services business and the Internet have merged. What I mean by this is that every On-line service has to simply be a place on the Internet with extra value added. MSN is not competing with the Internet although we will have to explain to content publishers and users why they should use MSN instead of just setting up their own Web server. We don't have a clear enough answer to this question today. For users who connect to the Internet some way other than paying us for the connection we will have to make MSN very, very inexpensive - perhaps free. The amount of free information available today on the Internet is quite amazing. Although there is room to use brand names and quality to differentiate from free content, this will not be easy and it puts a lot of pressure to figure out how to get advertiser funding. Even the CD-ROM business will be dramatically affected by the Internet. Encyclopedia Brittanica is offering their content on a subscription basis. Cinemania type information for all the latest movies is available for free on the Web including theater information and Quicktime movie trailers.

Competition

Our traditional competitors are just getting involved with the Internet. Novell is surprisingly absent given the importance of networking to their position however Frankenberg recognizes its importance and is driving them in that direction. Novell has recognized that a key missing element of the Internet is a good directory service. They are working with AT&T and other phone companies to use the Netware Directory Service to fill this role. This represents a major threat to us. Lotus is already shipping the Internotes Web Publisher which replicates Notes databases into HTML. Notes V4 includes secure Internet browsing in its server and client. IBM includes Internet connection through its network in OS/2 and promotes that as a key feature. Some competitors have a much deeper involvement in the Internet than Microsoft. 
All UNIX vendors are benefiting from the Internet since the default server is still a UNIX box and not Windows NT, particularly for high end demands, SUN has exploited this quite effectively. Many Web sites, including Paul Allen's ESPNET, put a SUN logo and link at the bottom of their home page in return for low cost hardware. Several universities have "Sunsites" named because they use donated SUN hardware. SUN's Java project involves turning an Internet client into a programmable framework. SUN is very involved in evolving the Internet to stay away from Microsoft. On the SUN Homepage you can find an interview of Scott McNealy by John Gage where Scott explains that if customers decide to give one product a high market share (Windows) that is not capitalism. SUN is promoting Sun Screen and HotJava with aggressive business ads promising that they will help companies make money. SGI has also been advertising their leadership on the Internet including servers and authoring tools. Their ads are very business focused. They are backing the 3D image standard, VRML, which will allow the Internet to support virtual reality type shopping, gaming, and socializing. Browsing the Web, you find almost no Microsoft file formats. After 10 hours of browsing, I had not seen a single Word .DOC, AVI file, Windows .EXE (other than content viewers), or other Microsoft file format. I did see a great number of Quicktime files. All of the movie studios use them to offer film trailers. Apple benefited by having TCP support before we did and is working hard to build a browser built from OpenDoc components. Apple will push for OpenDoc protocols to be used on the Internet, and is already offering good server configurations. Apple's strength in education gives them a much stronger presence on the Internet than their general market share would suggest. Another popular file format on the Internet is PDF, the short name for Adobe Acrobat files. Even the IRS offers tax forms in PDF format. The limitations of HTML make it impossible to create forms or other documents with rich layout and PDF has become the standard alternative. For now, Acrobat files are really only useful if you print them out, but Adobe is investing heavily in this technology and we may see this change soon. Acrobat and Quicktime are popular on the network because they are cross platform and the readers are free. Once a format gets established it is extremely difficult for another format to come along and even become equally popular. A new competitor "born" on the Internet is Netscape. Their browser is dominant, with 70% usage share, allowing them to determine which network extensions will catch on. They are pursuing a multi-platform strategy where they move the key API into the client to commoditize the underlying operating system. They have attracted a number of public network operators to use their platform to offer information and directory services. We have to match and beat their offerings including working with MCI, newspapers, and other who are considering their products. One scary possibility being discussed by Internet fans is whether they should get together and create something far less expensive than a PC which is powerful enough for Web browsing. This new platform would optimize for the datatypes on the Web. Gordon Bell and others approached Intel on this and decided Intel didn't care about a low cost device so they started suggesting that General Magic or another operating system with a non-Intel chip is the best solution. 
Next Steps

In highlighting the importance of the Internet to our future I don't want to suggest that I am alone in seeing this. There is excellent work going on in many product groups. Over the last year, a number of people have championed embracing TCP/IP, hyperlinking, HTML, and building client, tools and servers that compete on the Internet. However, we still have a lot to do. I want every product plan to try and go overboard on Internet features. One element that will be crucial is coordinating our various activities. The challenge/opportunity of the Internet is a key reason behind the recent organization. Paul Maritz will lead the Platform group to define an integrated strategy that makes it clear that Windows machines are the best choice for the Internet. This will protect and grow our Windows asset. Nathan and Pete will lead the Applications and Content group to figure out how to make money providing applications and content for the Internet. This will protect our Office asset and grow our Office, Consumer, and MSN businesses. The work that was done in the Advanced Technology group will be extremely important as it is integrated in with our products.

We must also invest in the Microsoft home page, so it will be clear how to find out about our various products. Today it's quite random what is on the home page and the quality of information is very low. If you look up speeches by me all you find are a few speeches over a year old. I believe the Internet will become our most important promotional vehicle and paying people to include links to our home pages will be a worthwhile way to spend advertising dollars. First we need to make sure that great information is available. One example is the demonstration files (Screencam format) that Lotus includes on all of their products organized by feature. I think a measurable part of our ad budget should focus on the Internet. Any information we create - white papers, data sheets, etc., should all be done on our Internet server. ITG needs to take a hard look at whether we should drop our leasing arrangements for data lines to some countries and simply rely on the Internet.

The actions required for the Windows platform are quite broad. Pual Maritz is having an Internet retreat in June which will focus on coordinating these activities. Some critical steps are the following:

1. Server. BSD is working on offering the best Internet server as an integrated package. We need to understand how to make NT boxes the highest performance HTTP servers. Perhaps we should have a project with Compaq or someone else to focus on this. Our initial server will have good performance because it uses kernel level code to blast out a file. We need a clear story on whether a high volume Web site can use NT or not becaues SUN is viewed as the primary choice. Our plans for security need to be strengthened. Other Backoffice pieces like SMS and SQL server also need to stay out in front in working with the Internet. We need to figure out how OFS can help perhaps by allowing pages to be stored as objects and having properties added. Perhaps OFS can help with the challenge of maintaining Web structures. We need to establish distributed OLE as the protocol for Internet programming. Our server offerings need to beat what Netscape is doing including billing and security support. There will be substantial demand for high performance transaction servers. We need to make the media server work across the Internet as soon as we can as new protocols are established. 
A major opportunity/challenge is directory. If the features required for Internet directory are not in Cairo or easily addable without a major release we will miss the window to become the world standard in directory with serious consequences. Lotus, Novell, and AT&T will be working together to try and establish the Internet directory. Actually getting the content for our directory and popularizing it could be done in the MSN group.

2. Client. First we need to offer a decent client (O'Hare) that exploits Windows 95 shortcuts. However this alone won't get people to switch away from Netscape. We need to figure out how to integrate Blackbird, and help browsing into our Internet client. We have made the decision to provide Blackbird capabilities openly rather than tie them to MSN. However, the process of getting the size, speed, and integration good enough for the market needs works and coordination. We need to figure out additional features that will allows us to get ahead with Windows customers. We need to move all of our Internet value added from the Plus pack into Windows 95 itself as soon as we possible can with a major goal to get OEMs shipping our browser preinstalled. This follows directly from the plan to integrate the MSN and Internet clients. Another place for integration is to eliminate today's Help and replace it with the format our browser accepts including exploiting our unique extensions so there is another reason to use our browser. We need to determine how many browsers we promote. Today we have O'Hare, Blackbird, SPAM MediaView, Word, PowerPoint, Symettry, Help and many others. Without unification we will lose to Netscape/HotJava. Over time the shell and the browser will converge and support hierarchical/list/query viewing as well as document with links viewing. The former is the structured approach and the later allows for richer presentation. We need to establish OLE protocols as the way rich documents are shared on the Internet. I am sure the OpenDoc consortium will try and block this.

3. File sharing/Window sharing/Multi-user. We need to give away client code that encourages Windows specific protocols to be used across the Internet. It should be very easy to set up a server for file sharing across the Internet. Our PictureTel screen sharing client allowing Window sharing should work easily across the Internet. We should also consider whether to do something with the Citrix code that allows you to become a Windows NT user across the Network. It is different from the PictureTel approach because it isn't peer to peer. Instead it allows you to be a remote user on a shared NT system. By giving away the client code to support all of these scenarios, we can start to show that a Windows machine on the Internet is more valuable than an artitrary machine on the net. We have immense leverage because our Client and Server API story is very strong. Using VB or VC to write Internet applications which have their UI remoted is a very powerful advantage for NT servers.

4. Forms/Languages. We need to make it very easy to design a form that presents itself as an HTML page. Today the Common Gateway Interface (CGI) is used on Web servers to give forms 'behavior' but its quite difficult to work with. BSD is defining a somewhat better approach they call BGI. However we need to integrate all of this with our Forms3 strategy and our languages. If we make it easy to associate controls with fields then we get leverage out of all of the work we are doing on data binding controls. 
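(One more editorial aside: the Common Gateway Interface mentioned in item 4 is what gave mid-90s HTML forms their 'behavior'. The Web server simply runs a program, hands it the submitted fields through environment variables or standard input, and sends whatever the program prints back to the browser. Below is a minimal sketch of such a script in modern Python; the form field name is invented for illustration and nothing here comes from the memo.)

#!/usr/bin/env python3
# Editorial aside, not from the memo: a minimal sketch of the kind of CGI
# script that gave 1995-era HTML forms their 'behavior'. The Web server runs
# the program, passes the submitted fields in through environment variables
# (or standard input for a POST), and returns whatever it prints to the
# browser. The form field name used here is invented for illustration.
import os
import sys
from html import escape
from urllib.parse import parse_qs

def main():
    # For a GET form submission the fields arrive in QUERY_STRING;
    # a POST would instead be read from stdin (CONTENT_LENGTH bytes).
    fields = parse_qs(os.environ.get("QUERY_STRING", ""))
    name = fields.get("name", ["stranger"])[0]

    # A CGI response is just headers, a blank line, then the document.
    sys.stdout.write("Content-Type: text/html\r\n\r\n")
    sys.stdout.write("<html><body><p>Hello, {}.</p></body></html>\n".format(escape(name)))

if __name__ == "__main__":
    main()

(Back to the memo.)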
Efforts like Frontier software's work and SUN's Java are a major challenge to us. We need to figure out when it makes sense to download control code to the client including a security approach to avoid this being a virus hole.

5. Search engines. This is related to the client/server strategies. Verity has done good work with Notes, Netscape, AT&T and many others to get them to adopt their scalable technology that can deal with large text databases with very large numbers of queries against them. We need to come up with a strategy to bring together Office, Mediaview, Help, Cairo, and MSN. Access and Fox do not support text indexing as part of their queries today which is a major hole. Only when we have an integrated strategy will we be able to determine if our in-house efforts are adequate or to what degree we need to work with outside companies like Verity.

6. Formats. We need to make sure we output information from all of our products in both vanilla HTML form and in the extended forms that we promote. For example, any database reports should be navigable as hypertext documents. We need to decide how we are going to compete with Acrobat and Quicktime since right now we aren't challenging them. It may be worth investing in optimizing our file formats for these scenarios. What is our competitor to Acrobat? It was supposed to be a coordination of extended metafiles and Word but these plans are inadequate. The format issue spans the Platform and Applications groups.

7. Tools. Our disparate tools efforts need to be brought together. Everything needs to focus on a single integrated development environment that is extensible in a object oriented fashion. Tools should be architected as extensions to this framework. This means one common approach to repository/projects/source control. It means one approach to forms design. The environment has to support sophisticated viewing options like timelines and the advanced features SoftImage requires. Our work has been separated by independent focus on on-line versus CD-ROM and structured display versus animated displays. There are difficult technical issues to resolve. If we start by looking at the runtime piece (browser) I think this will guide us towards the right solution with the tools.

The actions required for the Applications and Content group are also quite broad. Some critical steps are the following:

1. Office. Allowing for collaboration across the Internet and allowing people to publish in our file formats for both Mac and Windows with free readers is very important. This won't happen without specific evangelization. DAD has written some good documents about Internet features. Word could lose out to focused Internet tools if it doesn't become faster and more WYSIWYG for HTML. There is a critical strategy issue of whether Word as a container is strict superset of our DataDoc containers allowing our Forms strategy to embrace Word fully.

2. MSN. The merger of the On-line business and Internet business creates a major challenge for MSN. It can't just be the place to find Microsoft information on the Internet. It has to have scale and reputation that it is the best way to take advantage of the Internet because of the value added. A lot of the content we have been attracting to MSN will be available in equal or better form on the Internet so we need to consider focusing on areas where we can provide something that will go beyond what the Internet will offer over the next few years. 
Our plan to promote Blackbird broadly takes away one element that would have been unique to MSN. We need to strengthen the relationship between MSN and Exchange/Cairo for mail, security and directory. We need to determine a set of services that MSN leads in - money transfer, directory, and search engines. Our high-end server offerings may require a specific relationship with MSN.

3. Consumer. Consumer has done a lot of thinking about the use of on-line for its various titles. On-line is great for annuity revenue and eliminating the problems of limited shelf-space. However, it also lowers the barriers to entry and allows for an immense amount of free information. Unfortunately today an MSN user has to download a huge browser for every CD title making it more of a demo capability than something a lot of people will adopt. The Internet will assure a large audience for a broad range of titles. However the challenge of becoming a leader in any subject area in terms of quality, depth, and price will be far more brutal than today's CD market. For each category we are in we will have to decide if we can be #1 or #2 in that category or get out. A number of competitors will have natural advantages because of their non-electronic activities.

4. Broadband media applications. With the significant time before widescale iTV deployment we need to look hard at which applications can be delivered in an ISDN/Internet environment or in a Satellite PC environment. We need a strategy for big areas like directory, news, and shopping. We need to decide how to persue local information. The Cityscape project has a lot of promise but only with the right partners.

5. Electronic commerce. Key elements of electronic commerce including security and billing need to be integrated into our platform strategy. On-line allows us to take a new approach that should allow us to compete with Intuit and others. We need to think creatively about how to use the Internet/on-line world to enhance Money. Perhaps our Automatic teller machine project should be revived. Perhaps it makes sense to do a tax business that only operates on on-line. Perhaps we can establish the lowest cost way for people to do electronic bill paying. Perhaps we can team up with Quickbook competitors to provide integrated on-line offerings. Intuit has made a lot of progress in overseas markets during the last six months. All the financial institutions will find it very easy to buy the best Internet technology tools from us and others and get into this world without much technical expertise.

The Future

We enter this new era with some considerable strengths. Among them are our people and the broad acceptance of Windows and Office. I believe the work that has been done in Consumer, Cairo, Advanced Technology, MSN, and Research position us very well to lead. Our opportunity to take advantage of these investments is coming faster than I would have predicted. The electronic world requires all of the directory, security, linguistic and other technologies we have worked on. It requires us to do even more in these ares than we planning to. There will be a lot of uncertainty as we first embrace the Internet and then extend it. Since the Internet is changing so rapidly we will have to revise our strategies from time to time and have better inter-group communication than ever before. Our products will not be the only things changing. The way we distribute information and software as well as the way we communicate with and support customers will be changing. 
We have an opportunity to do a lot more with our resources. Information will be disseminated efficiently between us and our customers with less chance that the press miscommunicates our plans. Customers will come to our "home page" in unbelievable numbers and find out everything we want them to know. The next few years are going to be very exciting as we tackle these challenges are opportunities. The Internet is a tidal wave. It changes the rules. It is an incredible opportunity as well as incredible challenge I am looking forward to your input on how we can improve our strategy to continue our track record of incredible success.

HyperLink Appendix

Related reading, double click to open them On-line! (Microsoft LAN only, Internet Assistant is not required for this part):

* "Gordon Bell on the Internet" email by Gordon Bell
* "Affordable Computing: advertising subsidized hardware" by Nicholas Negroponie
* "Brief Lecture Notes on VRML & Hot Java" email by William Barr
* "Notes from a Lecture by Mark Andresson (Netscape)" email by William Barr
* "Application Strategies for the World Wide Web" by Peter Pathe (Contains many more links!)

Below is a hotlist of Internet Web sites you might find interesting. I've included it as an embedded .HTM file which should be readable by most Web Browsers. Double click it if you're using a Web Browser like O'Hare or Netscape. HotList.htm

A second copy of these links is below as Word HTML links. To use these links, you must be running the World Internet Assistant, and be connected to the Web.

Cool, Cool, Cool..
* The Lycos Home Page
* Yahoo
* RealAudio Homepage
* HotWired - New Thinking for a New Medium

Competitors
* Microsoft Corporation World-Wide-Web Server
* Welcome To Oracle
* Lotus on the Web
* Novell Inc. World Wide Web Home Page
* Symantec Corporation Home Page
* Borland Online
* Disney/Buena Vista
* Paramount Pictures
* Adobe Systems Incorporated Home Page
* MCI
* Sony Online

Sports
* ESPNET SportsZone
* The Gate Cybersports Page
* The Sports Server
* Las Vegas Sports Page

News
* CRAYON
* Mercury Center Home Page

Travel/Entertainment
* ADDICTED TO NOISE
* CDnow The Internet Music Store
* Travel & Entertainment Network home page
* Virtual Tourist World Map
* C(?) Net
* Auto Dealernet
* Popular Mechanics


(OldComputerPods) ©Sean Haas, 2020