'apple' Episodes

General Magic Was Almost Magical

     1/18/2020

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate in (and sometimes cope with) the future! Today’s episode is on a little-known company called General Magic, which had a substantial impact on the modern, mobile age of computing.

Imagine if you had some of the best and brightest people in the world. And imagine if they were inspired by a revolutionary idea. The Mac changed the way people thought about computers when it was released in 1984. And very quickly thereafter, many of the people who built it left Apple. What happened to them? They got depressed and many moved on. The Personal Computer Revolution was upon us, and people who have changed the world can be hard to inspire. Especially at a big company like the one Apple was becoming, where it’s easy to lose the ability to innovate.

Marc Porat had an idea. The mobile device was going to be the next big thing. The next wave. I mean, Steve Jobs had talked about mobile computing all the way back in 1983. It had been researched at PARC before that, and philosophically the computer science research community had already conceptualized ubiquitous computing. But Porat knew they couldn’t build something like this inside Apple. So in 1990 John Sculley, then CEO at Apple, worked with Porat and got the Apple board of directors to invest in the idea, which they spun out into a company called General Magic. Porat kept his ideas in a book he called Pocket Crystal.

Two of the most important members of the original Mac team, Bill Atkinson and Andy Hertzfeld, were inspired by the vision and joined as well. Now legends, everyone wanted to work with them. It was an immediate draw for the best and brightest in the world: Megan Smith, Dan Winkler, Amy Lindbergh, Joanna Hoffman, Scott Knaster, Darin Adler, Kevin Lynch - big names in software. They were ready to change the world. Again.

They would build a small computer into a phone. A computer... in your pocket. It would be described as a telephone, a fax, and a computer. They went to Fry’s. A lot. USB didn’t exist yet, so they made it. ARPANET was a known quantity, but the Internet as we know it hadn’t been born yet. Still, they envisioned a pocket computer with the notes from your refrigerator, files from your computer, contacts, schedules, calculators. They had a vision. They wanted expressive icons, so they invented emoticons. And animated them. There was no data network to connect computers on phones with, so they reached out to AT&T and, go figure, they signed on. Sony, Philips, Motorola, and Mitsubishi gave them $6 million each, and they created an alliance of partners.

Frank Canova built a device he showed off as “Angler” at COMDEX in 1992. Mobile devices were on the way. By 1993, the Apple board of directors was pressuring Sculley for the next Mac-type of visionary idea. So the Newton shipped in 1993, with the General Magic team feeling betrayed by Sculley. And General Magic got shoved out of the nest of stealth mode. After a great announcement, they got a lot of press. They went public without having a product.

The devices were trying to do a lot. Maybe too much. The devices were slow. Some aspects of the devices worked; for other aspects, they faked demos. The web showed up and they didn’t embrace it. In fact, Pierre Omidyar, of AuctionWeb, was on the team. He thought the web was way cooler than the mobile device, but the name needed work, so it became eBay. The team didn’t embrace management or working together. They weren’t finishing projects.
They were scope creeping the projects. The delays started. Some of the team had missed deadlines on the Mac and it had worked out then. But other devices shipped first. After four years, they shipped the Sony Magic Link in 1994. The devices were $800. People weren’t ready to be connected all the time. The network was buggy. They sold fewer than 3,000. The stock tumbled, and by 1995 the Internet miss was huge. They were right: the future was in mobile computing. They needed the markets to be patient. They weren’t. They had inspired a revolution in computing, and it slipped through their fingers. AT&T killed the devices, Marc Porat was ousted as CEO, and after massive losses, they laid off nearly a quarter of the team and ultimately filed Chapter 11. They weren’t the only ones. Sculley had invested so much into the Newton that he got sacked from Apple.

But the vision and the press inspired a wave of technology. Rising like a phoenix from the post-PC, ubiquitous ashes, CDMA would slowly come down in cost over the next decade and evolve connectivity through 3G and the upcoming 5G revolution. And out of their innovations came the Simon Personal Communicator, sold by BellSouth and manufactured by Mitsubishi as the IBM Simon. The Palm, Symbian, and Pocket PC, or Windows CE, would come out shortly thereafter and rise in popularity over the next few years. Tony Fadell repeated the exercise when helping invent the iPod as well, and Steve Jobs even mentioned he had considered some of the tech from Magic Cap. Fadell would later found Nest. And Andy Rubin, one of the creators of Android, also came from General Magic.

Next time you read that Samsung and Apple combined control 98% of the mobile market, or that Android overtook Windows for market share by double digits, you can thank General Magic for at least part of the education that shaped those outcomes. The alumni include the head of speech recognition from Google; VPs from Google, Samsung, Apple, BlackBerry, and eBay; the CTOs of Twitter, LinkedIn, Adobe, and the United States. Alumni also include the lead engineers of the Safari browser and AI at Apple, cofounders of WebTV, leaders from Pinterest, and the creator of Dreamweaver. And now there’s a documentary about their journey, appropriately called General Magic.

Their work and vision inspired the mobility industry. They touch nearly every aspect of mobile devices today, and we owe them for bringing us forward into one of the most transparent and connected eras of humanity. Next time you see a racist slur recorded from a cell phone, next time a political gaffe goes viral, next time the black community finally shows proof of the police shootings they’ve complained about for decades, next time political dissenters show proof of mass killings, next time abuse at the hands of sports coaches is caught, and next time all the other horrible injustices of humanity are forced upon us, thank them. Just as I owe you my thanks. I am so, so lucky you chose to listen to this episode of the History of Computing Podcast. Thank you so much for joining me. Have a great day!


TidBITS and The Technology of Publishing

     12/17/2019

Today we’re going to look at publishing from a different perspective than the normal History of Computing Podcast episodes: we’ll actually interview someone who has been living and breathing the publishing of technical content since, well, the inception - and so is one of the most qualified people to have that conversation: Adam Engst of TidBITS.


One Year Of History Podcasts

     7/7/2020


The first episode of this podcast went up on July 7th, 2019. One year later, we’ve managed to cover a lot of ground, but we’re just getting started. We’re over 70 episodes in, and so far my favorite was the one on Mavis Beacon Teaches Typing.

They may seem disconnected at times, but they’re not. There’s a large outline and it’s all research being included in my next book.

The podcast began with an episode on the prehistory of the computer. And we’ve had episodes on the history of batteries, electricity, superconductors, and more - to build up to what was necessary in order for these advances in computing to come to fruition.

We’ve celebrated Grace Hopper and her contributions. But we’d like to also cover a lot of other diverse voices in computing. 

There was a series on Windows, covering Windows 1, 3, and 95. But we plan to complete that series with a look at 98, Millennium, NT, 2000, and on. We covered Android, CP/M, OS/2, and VMS, but want to get into the Apple operating systems, Sun, Linux, etc.

Speaking of Apple… We haven’t gotten started with Apple. We covered the lack of an OS story in the 90s - but there’s a lot to unpack around the founding of Apple, Steve Jobs and Woz, and the re-emergence of Apple and their impact there. 

And since that didn’t happen in a vacuum, there were a lot of machines in that transition from the PC being a hobbyist market to being a full-blown industry. We talked through RadioShack, Commodore, the Altair, the Xerox Alto, and others.

We have covered some early mainframes like the Atanasoff-Berry Computer, ENIAC, the story of the Z1 and Zuse, and even supercomputers like Cray, but still need to tell the later story, bridging the gap between the mainframe, the minicomputer, and the traditional servers we might find in a data center today.

We haven’t told the history of the Internet. We’ve touched on bits and pieces, but want to get into those first nodes that got put onto ARPAnet, the transition to NSFnet, and the merging of the nets into the Internet. And we covered sites like Friendster, Wikipedia, and even the Netscape browser, but the explosion of the Internet has so many other stories left to tell. Literally a lifetime’s worth. 

For example, we covered Twitter and Snapchat, but not yet Google and Facebook.

We covered the history of object-oriented languages. We also covered BASIC, Pascal, FORTRAN, ALGOL, and Java, but still want to look at AWS and the modern web service architecture that’s allowed for an explosion of apps and web apps.

Mobility. We covered the Palm Pilot and a little on device management, but still need to get into the iPhone and Samsung and the underlying technology that enabled mobility. 

And enterprise software and compliance.

Knowing the past informs each investment thesis. We covered Y Combinator, but there are a lot of other VC/private equity firms to look at.

But what I thought I knew of the past isn’t always correct. As an example, coming from the Apple space, we have a hero worship of Steve Jobs that reading, for example, the Walter Isaacson book often conflicts with. He was a brilliant man, but complicated. And the more I read and research, the more I need to unpack many of my own assumptions across the industry.

I was here for a lot of this, yet my understanding is still not what it could be.

There have been interviews with people who wrote code to put on lunar landers, and with people who invented technology like spreadsheets.

I wish more people could talk about their experiences openly, but even 40 years later, some are still bound by NDAs.

I’ve learned so much and I look forward to learning so much more!


The Evolution (and De-Evolution) of the Mac Server

     2/28/2020

Today’s episode is on one of the topics I am probably the most intimate with of any we’ll cover: the evolution of the Apple servers and then the rapid pivot towards a much more mobility-focused offering.

Early Macs in 1984 shipped with AppleTalk. These could act as a server or workstation. But after a few years, engineers realized that Apple needed a dedicated server platform. Apple has had a server product, starting in 1987, that lives on in some form to today. The old AppleShare (later called AppleShare IP) was primarily used to provide network resources to the Mac from 1986 to 2000, with file sharing being the main service offered. There were basically two options: At Ease, which ran on the early Mac operating systems and had some file and print sharing options, and A/UX, or Apple Unix. A/UX brought paged memory management and could run on the Macintosh II through the Centris Macs. Apple Unix shipped from 1988 to 1995 and had been based on System V. It was a solidly performing TCP/IP machine and introduced many Mac admins to the world of POSIX. Apple Unix could emulate Mac apps, and once you were under the hood, you could do pretty much anything you might do in another Unix environment.

Apple also took a stab at early server hardware in the form of the Apple Network Server, announced in 1995 as Apple Unix went away. It followed server configurations of the Quadra 950 and was a PowerPC server sold from 1996 to 1997, although the name was used all the way until 2003. While these things were much more powerful and came with modern hardware, they didn’t run the Mac OS but another Unix-type operating system, AIX, which had begun life at about the same time as Apple Unix and was another System V variant, but had had much more work done on it. Given the financial issues at Apple, and the Taligent relationship between Apple and IBM to build a successor to Mac OS and OS/2, it made sense to work together on the project.

Meanwhile, At Ease continued to evolve, and Apple eventually shipped a new offering in the form of AppleShare IP, which ran up until Mac OS 9.2.2. In an era before you needed to require SMTP authentication, for example, AppleShare IP was easily used for everything from file sharing services to mail services. An older Quadra made for a great mail server, so your company could stop paying an ISP for some weird email address like that AOL address you got in college and get your own domain in 1999! And if you needed more, you could easily slap some third party software on the host - if you actually wanted SMTP authentication so your server didn’t get used to route this weird thing called spam, you could install CommuniGate or later CommuniGate Pro.

Keep in mind that many of the engineers from NeXT had remained friends with engineers from Apple after Steve Jobs left. Some still actually work at Apple. Serving services was a central need for NEXTSTEP and OPENSTEP systems. The UNIX underpinnings made it possible to compile a number of open source software packages, and the first web server was hosted by Tim Berners-Lee on a NeXTcube. During the transition over to Apple, AppleShare IP and the services from NeXT were made to look and feel similar, turning into Rhapsody from around 1999 and then Mac OS X Server from around 2000. The first few releases of Mac OS X Server represented a learning curve for many classic Apple admins, and in fact caused a generational shift in who administered the systems. John Welch wrote books in 2000 and 2002 that helped administrators get up to speed.
The Xserve was released in 2002 and the Xserve RAID in 2003. It took time, but a community began to form around these products. The Xserve would go from a G3 to a G4. The late Michael Bartosh compiled a seminal work in “Essential Mac OS X Panther Server Administration” for O’Reilly Media in 2005. I released my first book, called The Mac Tiger Server Black Book, in 2006. The server was enjoying a huge upswing in use. Schoun Regan and Kevin White wrote a Visual QuickStart for Panther Server. Schoun wrote one for Tiger Server. The platform was growing. People were interested: small businesses, schools, universities, art departments in bigger companies. The Xserve would go from a G4 to an Intel processor, and we would get cluster nodes to offload processing power from more expensive servers. Up until this point, Apple had never publicly acknowledged that businesses or enterprises used their devices, so the rise of Xserve advertising was the first time we saw that acknowledgement.

Apple continued to improve the product with new services up until 2009 with Mac OS X Server 10.6. At this point, Apple included most services necessary for running a standard IT department for small and medium sized businesses in the product, including web (in the form of Apache), mail, groupware, DHCP, DNS, directory services, file sharing, and even wiki services. There were also edge case services such as Podcast Producer, for automating video and content workflows, and Xsan, a clustered file system, and in 2009 Apple even purchased a company called Artbox, whose product was rebranded as Final Cut Server. Apple now had multiple awesome, stable products. Dozens of books and websites were helping build a community and growing knowledge of the platform.

But that was a turning point. Around that same time, Apple had been working towards the iPad, released in 2010 (although arguably the Knowledge Navigator, conceptualized in 1987, was the first iteration). The skyrocketing sales of the iPhone led to some tough decisions. Apple no longer needed to control the whole ecosystem with their server product, and instead began transitioning as many teams as possible to work on higher profit margin areas, reducing focus on areas that took attention away from valuable software developers who were trying to solve problems many other vendors had already solved better. In 2009 the Xserve RAID was discontinued, and the Xserve went away the following year. By then, the Xserve RAID was lagging, and for the use cases it served there were other vendors whose sole focus was storage - vendors Apple actively helped point customers towards, namely the Promise array for Xsan.

A few other things were happening around the same time. Apple could have bought Sun for less than 10% of their cash reserves in 2010, but allowed Oracle to buy the tech giant. Instead, Apple released the iPad. Solid move. They also released the Mac Mini server, which, while it lacked rack-and-stack options like an IPMI interface to remotely reboot the server and dual power supplies, was actually more powerful. The next few years saw services slowly peeled off the server. Today, the Mac OS X Server product has been migrated to just an app on the App Store. Today, macOS Server is meant to run Profile Manager and act as a metadata controller for Xsan, Apple’s clustered file system. Products that used to compete with the platform are now embraced by most in the community.
For the most part, this is because Apple let Microsoft or Linux-based systems own the market for providing features that are often unique to each enterprise and not about delighting end users. Today, building server products that try to do everything for everyone seems like a distant memory for many at Apple. But there is still a keen eye towards making the lives of the humans that use Apple devices better, as has been the case since Steve Jobs mainstreamed the GUI and Apple made the great user experience advocate Larry Tesler their Chief Scientist. How services make a better experience for end users can be seen in the Caching service built into macOS (moved there from macOS Server) and in products, such as Apple Remote Desktop, that are still very much alive and kicking. But the focus on profile management, and the desire to open up everything Profile Manager can do to third party developers who often serve niche markets or look more to scalability, is certainly front and center.

I think this story of the Apple server offering is really much more about Apple branching into areas where they needed to be at various points in time, then keeping a constant focus on iterating to a better, newer offering. Growing with the market. Helping the market get to where it needed to be. Serving the market, and then, when the needs of the market could be better served elsewhere, pulling back so other vendors could serve it. Not looking to grow a billion dollar business unit in servers, but instead looking to provide them just until the market didn’t need them to. In many ways Apple paved the way for billion dollar businesses to host services, and the SaaS ecosystem is as vibrant for the Apple platform as ever.

My perspective on this has changed a lot over the years. As someone who wrote a lot of books about the topic, I might have been harsh at times. But that’s one great reason not to be judgmental: you don’t always know the full picture, and it’s super-easy to miss big strategies like that when you’re in the middle of it. So thank you to Apple for putting user experience into servers, as with everything you do. And thank you, listeners, for tuning into this episode of the History of Computing Podcast. We’re certainly lucky to have you and hope you join us next time!


The Evolution Of Wearables

     5/4/2020

Mark Weiser was the Chief Technologist at the famed Xerox Palo Alto Research Center, or Xerox PARC, in 1988 when he coined the term "ubiquitous computing.” Technology hadn’t entered every aspect of our lives at the time like it has now. But the concept of wearable technology probably kicks off way earlier than you might think.

Humans have long sought to augment ourselves with technology. This includes eyeglasses, which came along in 1286, and wearable clocks, an era kicked off with the Nuremberg eggs in 1510. The technology got smaller and more precise as our capacity for precision grew. And not all wearable technology is meant to be worn by humans: we strapped cameras to pigeons in 1907.

In the 15th century, Leonardo da Vinci would draw up plans for a pedometer, a concept that would go on the shelf until Thomas Jefferson picked it back up during his tinkering days. And we would get an abacus ring in 1600. But computers began by needing a lot of electricity to light up the vacuum tubes that replaced the operations of an abacus, so when the transistor came along in the 40s, we’d soon start looking for ways to augment our capabilities with those instead.

Akio Morita and Masaru Ibuka began the wearable technology craze in 1953 when they started developing what would become the TR-55, released in 1955. It was Japan’s first transistor radio, and when they changed their company’s name to Sony, they would introduce the first of their many disruptive technologies. We don’t think of radios as technology as much as we once did, but they were certainly an integral part of getting the world ready to accept the other technological advances to come!

Manfred Clynes gave us the term cyborg in the article “Cyborgs and Space” in 1960. The next year, Edward Thorp and mathematician and binary algebra guru Claude Shannon wanted to try their hands at cheating at roulette, so they built a small computer that timed when the ball would land - their own version of wearable technology, a computer small enough to fit into a shoe. This would stay a secret until Thorp released his book “Beat the Dealer,” telling readers the device gave them a 44 percent improvement in making bets. By 1969, though, Seiko gave us the first quartz watch.

Other technologies were coming along at about the same time that would later revolutionize portable computing, once they had time to percolate for awhile. In the 1960s, liquid crystal displays were being researched at RCA. The technology goes back further, but George H. Heilmeier of RCA Laboratories gets credit for operationalizing the LCD in 1964.

And Hatano developed a mechanical pedometer to track progress toward 10,000 steps a day, a number that by 1985 he was defining as the count of steps a person should reach in a day. But back to electronics.

Moore’s law was doing its thing, and by 1975 devices were getting smaller and smaller. The digital camera traces its roots to that year, but Kodak didn’t really pursue it. Another device we don’t think of as a computer all that much any more is the calculator. Kits were being sold by then, and suddenly components had gotten small enough that you could get a calculator in your watch, initially introduced by Pulsar. And those radios were cool, but what if you wanted to listen to what you wanted rather than what the radio played? Sony would again come along with another hit: the Walkman in 1979, selling over 200 million over the ensuing decade. Akio Morita was a genius, also bringing us digital hearing aids and putting wearables into healthcare. Can you imagine the healthcare industry without wearable technology today?

You could do more and more, and by 1981 Seiko would release the UC 2000 Wrist PC. By then portable computers were a thing, but not wearables. You could put a whopping 2 kilobytes of data on your wrist and use a keyboard that got strapped to an arm. Computer watches continued to improve, and by 1984 you could play games on them, like on the Nelsonic Space Attacker Watch.

Flash memory arguably came along in 1984 and would iterate and get better, providing many, many more uses for tiny devices, and flash media cards by 1997. As for those calculator watches, Marty McFly would sport one in 1985’s Back to the Future, and by the time I was in high school they were so cheap you could get them for $10 at the local drug store. A few years later, Nintendo would release the Power Glove in 1989, sparking the imagination of many a nerdy kid who would later build actually functional technology. Which, regrettably, the Power Glove was not.

The first portable MP3 player came along in 1998: the MPMan. Prototypes had come along in 1979 with the IXI digital audio player. The Audible player, Diamond Rio, and Personal Jukebox came along in 1998 as well, and on the heels of their success the NOMAD Jukebox came in Y2K. But the Apple iPod exploded onto the scene in 2001, and suddenly the Walkman and Discman were dead and the era of mainstream humans carrying a library of music was upon us - with Creative releasing the Zen in 2004 and Microsoft the Zune in 2006.

And those watches. Garmin brought us their first portable GPS in 1990, which continues to be one of the best such devices on the market.

The webcam would come along in 1994, when Canadian researcher Steve Mann built the first wearable wireless webcam. That was the spark that led to the era of the Internet of Things. Suddenly we weren’t just wearing computers; we were wearing computers connected to the inter webs.

All of these technologies brought to us over the years were converging. Bluetooth hit the market in 2000.

By 2006, it was time for the iPod and fitness tracking to converge. Nike+iPod was announced, and Nike would release a small transmitter that fit into a notch in certain shoes. I’ve always been a runner and jumped on that immediately! You needed a receiver at the time, for an iPod Nano. Sign me up, said my 2006 self! I hadn’t been into the cost of the Garmin, but soon I was tracking everything. Later I’d get an iPhone and just have it connect. But it was always a little wonky. Then came the Nike+ FuelBand in 2012. I immediately jumped on that bandwagon as well. You had to plug it in at first, but eventually a model came out that synced over Bluetooth, and life got better. I would sport that thing until it got killed off in 2014, and a little beyond… Turns out Nike knew about Apple coming into their market, and between Apple, Fitbit, and Android Wear, they just didn’t want to compete in what was no longer a blue ocean, no matter how big the ocean would be.

Speaking of Fitbit: they were founded in 2007 by James Park and Eric Friedman, with a goal of bringing fitness trackers to market, and they capitalized on an exploding market for tracking fitness. But it wasn’t until the era of the app that they achieved massive success, and in 2014 they released apps for iOS, Android, and Windows Phone, which was still a thing. And the watch and the mobile device came together in 2017 when they released their smartwatch. They are now the 5th largest wearables company.

Android Wear had been announced at Google I/O in 2014. Now called Wear OS, it’s a fork of Android Lollipop, that pairs with Android devices and integrates with the Google Assistant. It can connect over Bluetooth, Wi-Fi, and LTE and powers the Moto 360, the LG G and Samsung Gear. And there are a dozen other manufacturers that leverage the OS in some way, now with over 50 million installations of the apps. It can use Hangouts, and leverages voice to do everything from checking into Foursquare to dictating notes. 

But the crown jewel among the smartwatches is definitely the Apple Watch. That came out of hiring former Adobe CTO Kevin Lynch to bring a Siri-powered watch to market, which happened in 2015. With over 33 million sold and, as of this recording, the watch in its 5th series, it can now connect over LTE or Wi-Fi, or through a phone using Bluetooth. There are apps, complications, and a lot of sensors on these things, giving them almost limitless uses.

Those glasses from 1286? Well, they got a boost in 2013 when Google put images on them. Long a desire of science fiction, Google Glass brought us into the era of the heads-up display. But Sega had introduced their virtual reality headset in 1991, and the technology actually dates back to the 70s, from JPL and MIT. Nintendo experimented with the Virtual Boy in 1994. Apple released QuickTime VR shortly thereafter, but it wasn’t that great. I even remember some VGA “VR” headsets in the early 2000s; they weren’t that great either. It wasn’t until the Oculus Rift came along in 2012 that VR seemed all that ready. These days, that’s become the gold standard in VR headsets. The sign to the market was when Facebook bought Oculus for $2.3 billion in 2014, and the market has steadily grown ever since.

Given all of these things that came along in 2014, I guess it did deserve the moniker “The Year of Wearable Technology.” And with a few years to mature, now you can get wearable sensors built into yoga pants, like the Nadi X yoga pants; smartwatches ranging from just a few dollars to hundreds or thousands, from a variety of vendors; sleep trackers; posture trackers; sensors in everything, bringing a convergence between the automated home and wearables in the Internet of Things; wearable cameras like the GoPro; smart glasses from dozens of vendors; VR headsets from dozens of vendors; smart gloves; wearable onesies; sports clothing to help measure and improve performance; smart shoes; and even an Alexa-enabled ring.

Apple waited pretty late to come out with Bluetooth headphones, releasing AirPods in 2016. These bring sensors into the ear - the main reason I think of them as wearables, where I didn’t think of a lot of the devices that came before them in that way. Now on their second generation, they are some of the best headphones you can buy. And the market seems poised to just keep growing, especially as we get more and more sensors and more and more transistors packed into the tiniest of spaces. It truly is ubiquitous computing.


The Homebrew Computer Club

     5/23/2020

Today we’re going to cover the Homebrew Computer Club.

Gordon French and Fred Moore started the Homebrew Computer Club. French hosted the club’s first meeting in his garage in Menlo Park, California on March 5th, 1975. I can’t help but wonder if they knew they were about to become the fuse that lit a powder keg. If they knew they would play a critical role in inspiring generations to go out and buy personal computers and automate everything. If they knew they would inspire the next generation of Silicon Valley hackers. Heck, it’s hard to imagine they didn’t, with everything going on at the time: Hunter S. Thompson rolling around deranged, Patty Hearst robbing banks in the area, the new 6800 and 8008 chips shipping…

Within a couple of weeks they were printing a newsletter. I hear no leisure suits were damaged in the making of it. The club would meet in French’s garage three times until he moved to Baltimore to take a job with the Social Security Administration. The group would go on without him until late in 1986. By then, the club had played a substantial part in spawning companies like Cromemco, Osborne, and most famously, Apple.

The members of the club traded parts, ideas, rumors, and hacks. The first meeting was really all about checking out the Altair 8800, by an Albuquerque calculator company called MITS, which would fan the flames of the personal computer revolution by inspiring hackers all over the world to build their own devices. It was the end of an era of free love and free information. Thompson described it as a high water mark. Apple would help to end the concept of free, making its founders rich beyond their working-class dreams. 

A newsletter called the People’s Computer Company had gotten an early Altair. Bob Albrecht would later turn the publication into Dr. Dobb’s. That first, fateful meeting inspired Steve Wozniak to start working on one of the most important computers of the PC revolution, the Apple I. The club would bounce around until they pretty much moved into Stanford for good.

I love a classic swap meet, and after meetings, some members of the group would reconvene at a parking lot or a bar to trade parts. They traded ideas, concepts, stories, hacks, schematics, and even software. Which inspired Bill Gates to write his “Open Letter to Hobbyists” - which he sent to the club’s newsletter.  

Many of the best computer minds in the late 70s were members of this collective. 

  • George Morrow would make computers, mostly through his company Morrow Designs, for 30 years.
  • Jerry Lawson invented cartridge-based gaming. 
  • Lee Felsenstein built the Sol, a computer based on the Intel 8080, and the Pennywhistle Modem, and he designed the Osborne 1, the first real portable computer. He did that with Adam Osborne, who he met at the club. 
  • Li-Chen Wang developed Palo Alto Tiny BASIC.
  • Todd Fischer would help design the IMSAI.
  • Paul Terrell would create the Byte Shop, a popular store for hobbyists that bought the first 50 Apple 1 computers to help launch the company. It was also the only place to buy the Altair in the area. 
  • Dan Werthimer founded the SETI@home project. 
  • Roger Melen would found Cromemco with Harry Garland. They named the company after Crothers Memorial, the graduate student engineering dorm at Stanford. They built computers and peripherals for the Z80 and S-100 bus. They gave us the Cyclops digital camera, the JS-1 joystick, and the Dazzler color graphics interface - all for the Altair. They would then build the Z-1 computer, using the same chassis as the IMSAI, iterating new computers until 1987 when they sold to Dynatech. 
  • John Draper, also known as Captain Crunch, had become a famous phreaker in 1971, having figured out that a whistle from a box of Captain Crunch cereal would mimic the 2600 hertz frequency used to route calls. His Blue Box design was then shared with Steve Wozniak, who set up a business selling them with his buddy from high school, Steve Jobs. 
  • And of course, Steve Wozniak would design the Apple I using what he learned at the meetings and team up with his buddy Steve Jobs to create Apple Computer and launch the Apple I - Woz wanted to give the schematics away for free, while Jobs wanted to sell the boards. That led to the Apple II, which made both wealthy beyond their wildest imaginations and paved the way for the Mac and every innovation to come out of Apple since. 

Slowly the members left to pursue their various companies. When the club ended in 1986, the personal computing revolution had come, and IBM was taking the industry over. A number of members continued to meet for decades under a new name, the 6800 club, after the Motorola 6800 chip.

This small band of pirates and innovators changed the world. Their meetings produced the concepts and designs that would be used in computers from Atari, Texas Instruments, Apple, and every other major player in the original personal computing hobbyist market. The members would found companies that went public and inspired IBM to enter what had been a hobbyist market and turn it into a full-fledged industry. They would democratize the computer, and their counter-culture personalities would humanize computing, even steering computing to benefit humans in an era when computers were considered part of the military-industrial complex and therefore evil.

They were open with one another, leading to faster sharing of ideas, faster innovation. Until suddenly they weren’t, and the high water mark of open ideas was replaced with innovation that was financially motivated. They capitalized on a recession in chips as war efforts spun down. And they changed the world. And for that, we thank them. And I thank you, listener, for tuning in to this episode of the History of Computing Podcast. We are so, so lucky to have you. Now tune in to innovation, drop out of binge watching, and go change the world. 


The App Store

     1/6/2020

Picture this. It’s 1983. The International Design Conference in Aspen has a special speaker: Steve Jobs from Apple. He’s giving a talk called “The Future Isn’t What It Used To Be.” He has a scraggly beard and really, really wants to recruit some industrial designers. In this talk, he talked about software. He talked about dealers. After watching the rise of small computer stores across the country, and seeing them selling - and frequently helping people pirate - apps for the iconic Apple II, Steve Jobs predicted that the dealers were adept at selling computers, but not software. There weren’t categories of software yet. But there were radio stations and television programs. And there were record stores. And he predicted we would transmit software electronically over the phone line. And that we’d pay for it with a credit card if we liked using it. If you haven’t listened to the talk, it’s fascinating: https://www.youtube.com/watch?v=KWwLJ_6BuJA

In that talk, he parlayed Alan Kay’s research into the Dynabook at Xerox PARC to talk about what would later be called tablet computers and ebooks. Jobs thought Apple would do so in the 80s. And they did dabble with the Newton MessagePad in 1993, so he wasn’t too far off. I guess the writers from Inspector Gadget were tuned into the same frequency, as they gave Penny a book computer in 1983. Watching her use it with her watch changed my life. Or maybe they’d used GameLine, a service that let Atari 2600 owners rent video games using a cartridge with a phone connection. Either way, it took awhile, but Jobs would eventually ship both the App Store and the iPad to the masses. He alluded to the rise of the local area network, email, the importance of design in computers, voice recognition, maps on devices (which came true with Google and then Apple Maps), maps with photos, DVDs (which he called video disks), the rise of object-oriented programming, and the ability to communicate with a portable device over a radio link.

So flash forward to 1993, 10 years after that brilliant speech. Jobs is shown the Electronic AppWrapper, built by Paget Press, at NeXTWORLD. Similar to the Whole Earth Catalog, EAW had begun life as a paper catalog of all the software available for the NeXT computers, but evolved into a CD-based tool that could later transmit software over the Internet. Social, legal, and logistical issues needed to be worked out. They built digital rights management. They would win the Content and Information Best of Breed award, and there are even developers from that era still designing software in the modern era.

That same year, we got Debian package managers and rpm. Most of this software was free and open source, but suddenly you could build a binary package and distribute it. By 1995 we had CPAN, the Comprehensive Perl Archive Network, an important repository for anyone who has worked with Perl. 1998 saw the rise of apt-get.

But it was 10 years after Jobs saw the Electronic AppWrapper, and 20 years after he had publicly discussed what we now call an app store, that Apple launched the iTunes Store in 2003, so people could buy songs to transfer from their Mac to their iPod, which had been released in 2001. Suddenly you could buy music like you used to in a record store, but on the Internet. Now, the first online repository of songs you could download had come about back in 93, and the first store to sell songs had come along in 98 - selling MP3 files. But the iTunes Store was primarily there to facilitate those objects going to a mobile device.
And so 2007 comes along, and Jobs announces the first iPhone at Macworld. A year later, Apple would release the App Store the day before the iPhone 3G dropped, bringing apps to phones wirelessly in 2008, 25 years after Jobs had predicted it in 1983. It began with 500 apps. A few months later the Google Play store would ship as well, although it was originally called the Android Market.

It’s been a meteoric rise. 10 years later, in 2018, app revenue on the iOS App Store would hit 46.6 billion dollars, and revenue on the Google Play store would hit 24.8 billion, for a combined haul between $71 billion and $101 billion depending on where you look. And in 2019 we saw a continued double-digit rise in revenues, likely topping $120 billion - and a triple-digit rise in China. The global spend is expected to double by 2023, with Africa and South America expected to see a 400% rise in sales in that same time frame.

There used to be shelves of software in boxes at places like Circuit City and Best Buy. The first piece of software I ever bought was Civilization. Those boxes at big box stores are mostly gone now. Kinda like how I bought Civilization on the App Store and have never looked back. App developers used to sell a copy of a game, just like that purchase. But game makers don’t just make money off of purchases any more. Now they make money off of in-app advertising and in-app purchases, many of which are for subscriptions. You can even buy a subscription for streaming media to your devices, obviating the need to buy music and sometimes video content. Everyone seems to be chasing that sweet, sweet monthly recurring revenue now. As with selling devices, Apple sells less but makes much, much more.

Software development started democratically, with anyone who could learn a little BASIC able to write a tool or game that could make them millions. That dropped off for awhile as software distribution channels matured, but was again democratized with the release of the App Store. Operating systems, once distributed on floppies, have even moved over to the App Store - and with Apple and Google, the net result is that they’re now free. And you can even buy physical things using in-app purchases, Apple Pay through an Apple credit card, and digital currency, closing the loop and blurring the line between the virtual and the physical.

And today, any company looking to become a standard - or what we like to call in software, a platform - will have an app store. Most follow the same type of release strategy. They begin with a catalog, move to facilitating the transactions, add a fee to do so, and ultimately facilitate subscription services. If a strategy ain’t broke, don’t fix it. The innovations are countless. Amazon builds services for app developers and sells them a tie to wear at their pitches to angels and VCs. Since 1983, the economy has moved on from paying cash for a box of software. And we’re able to conceptualize disrupting just about anything thanks to the innovations that sprang forth in that time when those early PCs were transitioning into the PC revolution. Maybe it was inevitable even without Steve Jobs right in the thick of it; technological determinism is impossible to quantify. Either way, app stores and the resultant business models have made our lives better. And for that we owe Apple, and all of the other organizations and individuals that helped make them happen, our gratitude.
Just as I owe you mine for tuning in to yet another episode of the History of Computing Podcast. We are so lucky to have you. Have a great day!


The Origin Of The Blue Meanies

     10/8/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to look at an alternative story of how the Blue Meanies formed, from Greg Marriott, a Blue Meanie: https://web.archive.org/web/19991013005722/http://spies.com/greg/bluemeanies.html How Did The Blue Meanies Come To Be? The "Blue Meanies" was the name of a group of generalists in the system software group at Apple. I was a member of the group for three and a half years. We were experts at Mac programming and debugging, and we guided the architecture of Mac system software for several years. People often ask how the Blue Meanies got started. The truth was pretty mundane, so I made up this story a few years ago. By the way, we had a hamster mascot named Gibbly. The stooped figure in the bloodstained lab coat scurried around the lab, checking his instruments. All was in readiness. Tonight, finally, he would silence the skeptics. He would show them his theories weren't those of a crack-pot, but those of a genius! He turned and surveyed the eleven tiny figures strapped to the tables in the center of the cavernous laboratory. The frightened rodents twisted and squirmed, but could not break free. Their sharp teeth had no effect on the stainless steel straps holding them in place. A twelfth hamster in a cage, marked with a nameplate that said "Gibbly," watched in horror as her brothers and sisters were subjected to this unthinkable torture. Their wide frightened eyes beheld their tormentor as he performed some last minute adjustments on the huge panel filling the far wall of the lab. Had they any intelligence at all they would have recognized the eleven identical sets of medical monitors. Gauges, meters, and dials reflected respiration, heart rate, and blood pressure. Eleven long streamers of paper inched their way out of the EEGs, leaving twisted little piles on the floor. The mad scientist paused and remembered the laughter of his peers when he presented his ideas to them. His face hardened as he recalled their ridicule when he proposed his "Theory of Transfiguration." His carefully documented research clearly showed that one mammal could be turned into another, yet they jeered and hooted until he was forced off stage, humiliated. He decided then to continue with his plan to prove his theories by turning rodents into monkeys. The old man smiled grimly and faced his subjects. He crossed the room and sat before his Macintosh. The desk was covered with documentation, leaving barely enough room to move the mouse. He shoved TechNotes and volumes of Inside Macintosh out of the way to make more room. He briefly checked that his control program was ready, then reached for the mouse. A couple of clicks later, relays closed deep inside the complex machines and the process began. Right at that moment, lightning struck the power lines just outside the lab windows. The Mac exploded in a shower of sparks. The blast propelled the old man backwards, his wheeled chair racing across the lab floor. He crashed into the panels and slid out of the chair onto the shiny floor. The piles of loose paper and manuals vaporized, filling the air with a fine mist. At the same time, enormous amounts of power surged through the machines, the tables, and the poor helpless hamsters. Automatic safety devices failed, fused by the jolt of electricity. The transformation raced out of control. 
The straps holding the hamsters snapped open, but the stunned animals still could not move. A pulsating aura surrounded them and permeated their tiny bodies, growing stronger and stronger. As the acrid smoke from the Mac and the remnants of the manuals swirled through the aura, it began to shimmer violently. The transformation continued. The eleven rodents began to shudder uncontrollably as the immense energy surrounding them intensified further. Had the scientist been conscious, he would have noted their change in size and form. They grew longer and wider and their fur (mostly) disappeared, replaced by Reeboks, Levis and t-shirts. Critical components in the complicated machinery finally succumbed to the outrageous current. Sparks flew from the panels and tiny lights winked out as the transformation process ground to a halt. The aura subsided and suddenly the air was very still. The old man stirred and groaned as his abused bones protested their treatment. He shook his head to clear it and was immediately forced to wonder why anyone would do such a thing after being slammed into a wall. He rose and looked at the stainless steel tables, expecting to see years of research blown to bits. He gasped in astonishment at the scene his eyes beheld. Eleven pairs of human eyes looked back at him. Unfortunately, the shock of such an overwhelming success was too much for him. His aging heart stopped beating and he fell heavily to the floor. Equally unfortunate was the intense paranoia which caused him to encrypt all of his notes. The eleven Blue Meanies [the way they got their name is another story... -ed.] looked at each other and smiled. The knowledge fused into their very structure by the aura made them giddy with excitement. They desperately wanted to use this newfound information in some way but didn't quite know what to do. They milled around the lab in confusion, looking for some clue that would tell them what to do next. One of them noticed a charred scrap of paper on the floor and picked it up. He showed it to the others, and soon they decided what to do. They grabbed Gibbly and filed out of the lab, not looking back, and set out on a long journey to Apple Computer, Inc., 20525 Mariani Ave, Cupertino, CA 95014.


Wikipedia

     9/2/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today’s episode is on the history of Wikipedia.

The very idea of a single location that could store all the known information in the world began with Ptolemy I, founder of the Greek dynasty that ruled Egypt following the death of Alexander the Great. He and his son amassed hundreds of thousands of scrolls in the Library of Alexandria from 331 BC on. The Library was part of a great campus, the Musaeum, where they also supported great minds, starting with Ptolemy I’s patronage of Euclid, the father of geometry, and later including Archimedes, the father of engineering, Hipparchus, the founder of trigonometry, Hero, the father of math, and Herophilus, who gave us the scientific method, along with countless other great Hellenistic thinkers.

The Library entered into a slow decline that began with the expulsion of intellectuals from Alexandria in 145 BC. Ptolemy VIII was responsible for that. Always be wary of people who attack those they can’t win over, especially when they start blaming the intellectual elite for the problems of the world. The decline continued until the Library burned, first in a small fire accidentally set by Caesar in 48 BC and then for good in the 270s AD.

In the centuries since, there have been attempts here and there to gather great amounts of information. The first known encyclopedia was the Naturalis Historia by Pliny the Elder, never completed because he was killed in the eruption of Vesuvius. One of the better known is the Encyclopedia Britannica, starting off in 1768. Mass production of these was aided by the printing press, but given that there’s a cost to producing those materials and a margin to be made in their sale, that encouraged a somewhat succinct exploration of certain topics. The advent of the computer era of course led to encyclopedias on CD and then to online encyclopedias. Encyclopedias at the time employed experts in certain fields and paid them for compiling and editing articles for volumes that would then be sold. As we say these days, this was a business model just waiting to be disrupted.

Jimmy Wales was moderating an online discussion board on Objectivism and happened across Larry Sanger in the early 90s. They debated and became friends. Wales started Nupedia, which was supposed to be a free encyclopedia, funded by advertising revenue. As it was to be free, they were to recruit thousands of volunteer editors - people of the caliber that had previously been hired to research and write articles for encyclopedias. Sanger, who was pursuing a PhD in philosophy from Ohio State University, was hired on as editor-in-chief. This was a twist on the old model of compiling an encyclopedia, and a twist that didn’t work out as intended.

Volunteers were slow to sign up, but Nupedia went online in 2000. By later in the year, only two articles had made it through the review process. When Sanger told Ben Kovitz about this, Kovitz recommended looking at the emerging wiki culture. That had started with WikiWikiWeb, developed by Ward Cunningham in 1994 and named after a shuttle bus that ran between airport terminals at the Honolulu airport. WikiWikiWeb had been inspired by HyperCard but needed to be multi-user so people could collaborate on web pages, quickly producing content on new patterns in programming. Cunningham wanted to make non-writers feel OK about writing. 
Sanger proposed using a wiki to accept submissions for articles and edits from anyone, while still having a complicated review process to accept changes. The reviewers weren’t into that, so in 2001 they started a side project they called Wikipedia, with a user-generated model for content, or article, generation. The plan was to generate articles on Wikipedia and then move or copy them into Nupedia once they were ready. But Wikipedia got mentioned on Slashdot. In 2001 there were nearly 30 million websites and half a billion people using the web. Back then, a mention on the influential Slashdot could make a site. And it certainly helped. They grew, and more and more people started to contribute. They hit 1,000 articles in March of 2001, a number that increased tenfold by September and another fourfold the next year. Wikipedia started working independently of Nupedia. The dot-com bubble had burst in 2000, and by 2002 Nupedia had to lay Sanger off, and he left both projects. Nupedia slowly died and was finally shut down in 2003.

Eventually the Wikimedia Foundation was built to help unlock the world’s knowledge; it now owns and operates Wikipedia. Wikimedia also includes Commons for media; Wikibooks, which includes free textbooks and manuals; Wikiquote for quotations; Wikiversity for free learning materials; MediaWiki, the source code for the site; Wikidata for pulling large amounts of data from Wikimedia properties using APIs; Wikisource, a library of free content; Wikivoyage, a free travel guide; Wikinews, free news; and Wikispecies, a directory containing over 687,000 species. Many of the properties have very specific ways of organizing data, making it easier to work with en masse.

The properties have grown because people like to be helpful, and Wales allowed self-governance of articles. To this day he rarely gets involved in the day-to-day affairs of the Wikipedia site, other than the occasional puppy dog looks in banners asking for donations. You should donate. He does have 8 principles the site is run by:

1. Wikipedia’s success to date is entirely a function of our open community.
2. Newcomers are always to be welcomed.
3. “You can edit this page right now” is a core guiding check on everything that we do.
4. Any changes to the software must be gradual and reversible.
5. The open and viral nature of the GNU Free Documentation License and the Creative Commons Attribution/Share-Alike License is fundamental to the long-term success of the site.
6. Wikipedia is an encyclopedia.
7. Anyone with a complaint should be treated with the utmost respect and dignity.
8. Diplomacy consists of combining honesty and politeness.

This culminates in the 5 pillars Wikipedia is built on:

1. Wikipedia is an encyclopedia.
2. Wikipedia is written from a neutral point of view.
3. Wikipedia is free content that anyone can use, edit, and distribute.
4. Wikipedia’s editors should treat each other with respect and civility.
5. Wikipedia has no firm rules.

Sanger went on to found Citizendium, which uses real names instead of handles, thinking maybe people will contribute better content if their name is attached to it.

The web is global. Throughout history there have been encyclopedias produced around the world, with the Four Great Books of Song coming out of 11th century China and the Encyclopedia of the Brethren of Purity coming out of 10th century Persia. When Wikipedia launched, it was in English. Wikipedia then launched a German version using the deutsche.wikipedia.com subdomain. 
It now lives at de.wikipedia.org, and Wikipedia has gone from being 90% English to being almost 90% non-English, meaning that Wikipedia is able to pull in even more of the world’s knowledge. Wikipedia picked up nearly 20,000 English articles in 2001 and over 75,000 new articles in 2002, and that number has steadily climbed, reaching over 3,000,000 by 2010; we’re closing in on 6 million today. The English version is 10 terabytes of data uncompressed. If you wanted to buy a printed copy of Wikipedia today, it would run over 2,500 books.

By 2009, Microsoft Encarta had shut down. By 2010, Encyclopedia Britannica stopped printing their massive set of books and went online. You can still buy encyclopedias from specialty makers, such as World Book. Ironically, Encyclopedia Britannica does now put the real names of people on the articles they produce on their website, in an ad-driven model. There are a lot of ads. And the content isn’t linked to from as many places, nor is it as thorough.

Creating a single location that could store all the known information in the world seems like a pretty daunting task. Compiling the non-copyrighted works of the world is now the mission of Wikipedia. The site receives the fifth most views of any on the web and is read by nearly half a billion people a month, with over 15 billion page views per month. Anyone who has gone down the rabbit hole of learning about Ptolemy I’s involvement in developing the Library of Alexandria, and then read up on his children and how his dynasty lasted until Cleopatra and how… well, you get the point… can understand how they get so much traffic.

Today there are over 48,000,000 articles and over 37,000,000 registered users who have contributed articles, meaning that if we set 160 Great Libraries of Alexandria side-by-side, we would have about the same amount of information Wikipedia has amassed. And it’s done so because of the contributions of so many dedicated people. People who spend hours researching and building pages, taking on the need to provide references to cite the data in the articles (by the way, Wikipedia is not supposed to represent original research); more people who patrol and look for content contributed by people on a soapbox or with an agenda, rather than just reporting the facts; another team looking for articles that need more information. And they do these things for free. While you can occasionally see frustrations from contributors, it is truly one of the best things humanity has done.

This allows us to rediscover our own history, effectively compiling all the facts that make up the world we live in, often linked to the opinions that shape them in the reference materials, which include the over 200 million works housed at the US Library of Congress and the over 25 million books scanned into Google Books (out of about 130 million). As with the Great Library of Alexandria, we do have to keep away those who seek to throw out the intellectuals of the world, and keep the great works being compiled from falling to waste through inactivity. Wikipedia keeps a history of pages, to avoid revisionist history. The servers need to be maintained, but the database can be downloaded, and is routinely downloaded, by plenty of people. I think the idea of providing an encyclopedia for free, sponsored by ads, was sound. Pivoting the business model to make it open was revolutionary. 
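And since the data really is open, pulling it programmatically is simple. Here's a minimal sketch in Python - not anything from the episode, just an illustration - that grabs the machine-readable summary of an article over Wikimedia's public REST API. The endpoint and JSON field names are assumptions based on Wikimedia's published documentation:

    # Minimal sketch: fetch the machine-readable summary of a Wikipedia article.
    # Endpoint and fields assumed from Wikimedia's public REST API docs.
    import json
    import urllib.parse
    import urllib.request

    def page_summary(title: str) -> dict:
        url = ("https://en.wikipedia.org/api/rest_v1/page/summary/"
               + urllib.parse.quote(title))
        # Wikimedia asks clients to identify themselves with a User-Agent header.
        req = urllib.request.Request(url, headers={"User-Agent": "hopc-example/0.1"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    if __name__ == "__main__":
        summary = page_summary("Library_of_Alexandria")
        print(summary["title"])          # the article's display title
        print(summary["extract"][:200])  # first 200 characters of the intro text

The same openness applies at bulk scale: those routinely downloaded database dumps are what make the machine learning uses mentioned next possible.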
With the availability of the data for machine learning and the ability to enrich it with other sources like genealogical research, actual books, maps, scientific data, and anything else you can manage, I suspect we'll see contributions we haven't even begun to think about! And thanks to all of this, we now have a real compendium of the world's knowledge, getting more and more accurate and holistic by the day. Thank you to everyone involved, from Jimbo and Larry, to the moderators, to the staff, and of course to the millions of people who contribute pages about all the history that makes up the world as we know it today. And thanks to you for listening to yet another episode of the History of Computing Podcast. We're lucky to have you. Have a great day! Note: This work was produced in large part due to the compilation of historical facts available at https://en.wikipedia.org/wiki/History_of_Wikipedia


The History of Symantec

     8/11/2019

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today's episode is on the history of Symantec. This is really part one of a two-part series. Broadcom announced they were acquiring Symantec in August of 2019, the day before we recorded this episode. Who is this Symantec, what do they do - and why does Broadcom want to buy them for $10.7 billion? For starters, by themselves Symantec is a Fortune 500 company with over $4 billion in annual revenues, so $10.7 billion is a steal for an enterprise software company. Except they're just selling the enterprise software division and keeping Norton in the family. With just shy of 12,000 employees, Symantec has twisted and turned and bought and sold companies for a long time. But how did they become a Fortune 500 company? It all started with Eisenhower. ARPA, or the Advanced Research Projects Agency, would later add the word Defense to their name, become DARPA, and build a series of tubes called the interweb. While originally commissioned so Ike could counter Sputnik, ARPA continued working to fund projects in computers, and in the 1970s this kid out of the University of Texas named Gary Hendrix saw that they were funding natural language understanding projects. This went back to Turing, and DARPA wanted to give AI-complete problems a leap forward, trying to make computers as intelligent as people. This was obviously before Terminator told us that was a bad idea (pro-tip: it's a good idea). Our intrepid hero Gary saw that sweet, sweet grant money and got his PhD from the UT Austin Computational Linguistics Lab. He wrote some papers on robotics and ended up at the Stanford Research Institute, or SRI for short. Yes, that's the same SRI that invented the hosts.txt file and was responsible for keeping DNS for the first decade or so of the internet. So our pal Hendrix joined SRI and chased that grant money, leaving SRI in 1980 with about 15 other Stanford researchers to start a company they called Machine Intelligence Corporation. That went bust, so he started Symantec Corporation in 1982 and got a grant from the National Science Foundation to build natural language processing software; it turns out syntax and semantics make for a pretty good mashup. So the new company Symantec built out a database and some advanced natural language code, but by 1984 the PC revolution was on and that code had been built for a DEC PDP, so it could not run on the emerging PCs in the industry. Symantec was then acquired by C&E Software, short for the names of its founders, Dennis Coleman and Gordon Eubanks. The Symantec name stayed and Eubanks became the chairman of the board of the new company. C&E had been working on PC software called Q&A, which the new team finished, then added natural language processing to make the tools easier to use. They called that "The Intelligent Assistant" and they now had a tool that would take them through the 80s. People swapped roles, and due to a sharp focus on sales they did well. During the early days of the PC, dealers - the small computer stores that were popping up all over the country - were critical to selling hardware and software. Every Symantec employee would go on the road six days a week, visiting 6 dealers a day. It was grueling but kept them growing and building.
They became what we now call a "portfolio" company in 1985 when they introduced NoteIt, a natural language processing tool used to annotate docs in Lotus 1-2-3. Lotus was in the midst of eating the lunch of previous tools. They added another division and made SQZ, a Lotus 1-2-3 spreadsheet tool. This is important: they were now a 3-product company with divisions when, in 1987, they got even more aggressive and purchased Breakthrough Software, who made an early project management tool called TimeLine. And this is when they did something unique for a PC software company: they split each product into groups that leveraged a shared pool of resources. Each product had a GM who was responsible for the P&L. The GM ran development, Quality Assurance, Tech Support, and Product Marketing - those teams reported directly to the GM, who reported to then-CEO Eubanks. But there was a shared sales, finance, and operations team. This laid the framework for massive growth, increased sales, and took Symantec to their IPO in 1989. Symantec purchased what was at the time the most popular CRM app, ACT!, in 1993. Meanwhile, Peter Norton had a great suite of tools for working with DOS. Things that, well, maybe should have been built into operating systems (and mostly now are). Norton could compress files, do file recovery, etc. The cash Symantec raised allowed them to acquire The Peter Norton Company in 1990, which would completely change the face of the company. This gave them development tools for PC and Mac, as Norton had been building those. This led to the introduction of Symantec Antivirus for the Macintosh, and they called the anti-virus for PC Norton Antivirus because people already trusted that name. Within two years, with the added sales and marketing air cover that the Symantec sales machine provided, the Norton group was responsible for 82% of Symantec's total revenues. So much so that Symantec dropped building Q&A because Microsoft was winning in their market. I remember this moment pretty poignantly. Sure, there were other apps for the Mac like Virex, and other apps for Windows, like McAfee. But the Norton tools were the gold standard. At least until they later got bloated. The next decade was fast, from the outside looking in, except when Symantec acquired Veritas in 2004. This made sense, as Symantec had become a solid player in the security space, and before the cloud, backup seemed somewhat related. I'd used Backup Exec for a long time and watched Veritas products go from awesome to, well, not as awesome. John Thompson was the CEO through that decade and Symantec grew rapidly - purchasing systems management solution Altiris in 2007 and getting a Data Loss Prevention solution that year in Vontu. Application Performance Management, or APM, wasn't very security focused, so that business unit was picked up by Vector Capital in 2008. They also picked up MessageLabs and AppStream in 2008. Enrique Salem replaced Thompson, and Symantec bought VeriSign's CA business in 2010. If you remember from our encryption episode, that was already spun off of RSA. Certificates are security-focused. Email encryption tool PGP and GuardianEdge were also picked up in 2010, providing key management tools for all those, um, keys the CA was issuing. These tools were never integrated properly, though. They also picked up Rulespace in 2010 to get what's now their content filtering solution. Symantec acquired LiveOffice in 2012 to get enterprise vault and instant messaging security - continuing to solidify the line of security products.
They also acquired Odyssey Software for SCCM plugins, to get better at managing embedded, mobile, and rugged devices. Then came Nukona, to get a MAM product, also in 2012. During this time, Steve Bennett was hired as CEO and fired in 2014. Then came Michael Brown, although in the interim Veritas was demerged in 2014 and, as their products started getting better, sold to The Carlyle Group in 2016 for $8B. Then Greg Clark became CEO in 2016, when Symantec purchased Blue Coat. Greg Clark then orchestrated the LifeLock acquisition for $2.3B of that $8B. Thoma Bravo then bought the CA business to merge it with DigiCert in 2017. Then in 2019 Rick Hill became CEO. Does this seem like a lot of buying and selling? It is. But it also isn't. If you look at what Symantec has done, they have a lot of things they can sell customers for various needs in the information security space. At times, they've felt like a holding company. But ever since the Norton acquisition, they've had very specific moves that continue to solidify them as one of the top security vendors in the space. Their sales teams don't spend six days a week on the road visiting six customers a day anymore, but they have a sales machine. And they've managed to leverage that to get inside what we call the buying tornado of many emergent technologies and then sell the company before the tornado ends. They still have Norton, of course. Even though practically every other product in the portfolio has come and gone over the years. What does all of this mean? The Broadcom acquisition of the enterprise security division maybe tells us that Symantec is about to leverage that $10+ billion to buy more software companies. And sell more companies after a little integration and incubation, getting out before the ocean gets too red, the tech too stale, or before Microsoft sherlocks them. Because that's what they do. And they do it profitably every single time. We often think of how an acquiring company gets a new product - but next time you see a company buying another one, think about this: that company probably had multiple offers. What did the team at the company being acquired get out of the deal? And we'll work on that in the next episode, when we explore the history of Broadcom. Thank you for sticking with us through this episode of the History of Computing Podcast, and have a great day!


The Blue Meanies of Apple, IBM, and the Pinks

     10/11/2019

Apple Lore: The Pinks Versus The Blue Meanies. Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we're going to cover two engineering groups at Apple: the Pinks and the Blue Meanies. Mac OS System 6 had been the sixth operating system released in five years. By 1988, Apple was keeping up an unrealistic release cadence, especially given that the operating system had come along at an interesting time, when a lot of transitions were happening in IT and there were a lot of increasingly complex problems from trying to code around earlier learning opportunities. After sweeping the joint for bugs, Apple held an offsite engineering meeting in Pescadero and split the ideas for the next operating system onto colored cards: pink, red, green, and blue. The most important of these for this episode were pink, or future-release stuff, and blue, or next-release stuff. The architects who took the blue notecards were horrible, arrogant, self-proclaimed bastards. They'd all seen Yellow Submarine, and so they went with the evil Pepperland Blue Meanies. As architects, they were the ones who often said no to things. The Blue Meanies ended up writing much of the core of System 7. They called this OS, which took 3 years to complete, The Big Bang. It would last on the market for 6 years - longer than any operating system from Apple prior or since. System 7 gave us CDs and File Sharing, began the migration to a 32-bit OS, replaced MacroMaker with AppleScript and Apple Events, and added the Extensions Manager, which we're likely to see a return of given the pace Apple's going these days. System 7.0.1 came with an Easter egg. If you typed in "Help! Help! We're being held prisoner in a system software factory!" you got a list of names: Darin Adler, Scott Boyd, Chris Derossi, Cynthia Jasper, Brian McGhie, Greg Marriott, Beatrice Sochor, and Dean Yu. The later iterations of the file ended "Who dares wins." Pink was meant to get more than incremental gains. They wanted preemptive multitasking. The people who really pushed for this were senior engineers Bayles Holt, David Goldsmith, Gene Pope, Erich Ringewald, and Gerard Schutten, referred to as the Gang of Five. They had their pink cards and knew that what was on them was critical, or Apple might have to go out and buy some other company to get the next great operating system. They insisted that they be given the time to build this new operating system, and traded their managers to the Blue Meanies for the chance to build the preemptive multitasking and a more component-based, or object-oriented, application design. They got Mike Potel as their manager. They worked in a separate location, looking to launch their new operating system in two years. The codename was Defiant, given that Pink just wasn't awesome. They shared space with the Newton geeks. Given that they had two years and they saw the technical debt in System 6 as considerable, they had to decide if they were going to build a new OS from the ground up, or build on top of System 6. They pulled in the Advanced Technology Group, another team at Apple, and got up to 11 people. They ended up starting over with a new microkernel they called Opus. Big words. The Pink staff ended up pulling in ideas from other cards and got up to about 25 people. From there, it went a little off the rails and turf wars set in. It kept growing. 100 engineers. They were secretive.
They eventually grew to 150 people by 1990. Remember, two years. And the further out they got, the less likely it was that the code would ever be backwards compatible. The Pink GUI used isometric icons, rounded windows, drop shadows, and beveling, was fully internationalized, and was a huge influence on Mac OS 8 and Copland. Even IBM was impressed by the work being done on Pink, and in 1991 they entered an alliance with Apple to help take on what was quickly becoming a Microsoft monopoly. They planned to bring this new OS to the market through a new company called Taligent in the mid-90s. Just two more years. In 1992, Taligent moved out of Apple with 170 employees and Joe Guglielmi, who had once led the OS/2 team and had been a marketing exec at IBM for 30 years. By then, this was one of 5 partnerships between Apple and IBM, something that starts and stops every now and then, up to today. It was an era of turf wars and empire building. But it was also the era of object orientation. Since Smalltalk, this had been a key aspect in higher-level languages such as Java and in the AS/400. IBM had already done it with OS/2 and AIX. By 1993 there was suspicion. Again they grew, now to over 250 people, but they really just needed two more years, guys. Apple actually released an object-oriented SDK called Bedrock to migrate from System 7 to Pink, which could also work with Windows 3.1, NT, and OS/2. Before you know it they were building a development environment on AIX and porting frameworks to HP-UX, OS/2, and Windows. By 1994 the apps could finally run on an IBM RS/6000 running AIX. The buzz continued. Ish. 1994 saw HP take on 15% of the company and add Smalltalk into the mix. HP brought new compilers into the portfolio, and needed native functionality. The development environment was renamed to cq professional and the user interface builder was changed to cqconstructor. TalAE became CommonPoint. TalOS was scheduled to ship in 1996. Just two more years. The world wanted to switch away from monolithic apps and definitely away from procedural apps. It still does. Every attempt to do so just takes two more years. Then and now. That's what we call "Enterprise Software," and as with anyone who's OK with such a pace, Joe Guglielmi left Taligent in 1995. Let's review where we are. There's no real shipping OS. There's an IDE, but C++ programmers would need 3 months of training to get up to speed on Taligent. Most needed a week or two of class to learn Java, if that. Steve Jobs had aligned with Sun on OpenStep, so Apple was getting closer and closer to IBM. But System 7 was too big a dog to run Taligent. Debbie Coutant became CEO towards the end of the year. HP and Apple sold their stakes in the company, which was then up to 375 employees. Over half were laid off, and the organization was wrapped into IBM, which would be focusing on… Java. CommonPoint would be distributed across IBM products where possible. Taligent themselves would be key to the Java work done at IBM. By then IBM was a services-first organization anyways, so it kinda' all makes sense. TalOS was demoed in 1996 but never released. It was unique. It was object oriented from the ground up. It was an inspiration for a new era of interfaces. It was special. But it never shipped. Mac OS 8 was released in 1997. Better late than never. But it was clear that there was no more runway left in the code, which had been getting bigger and meaner. They needed a strategy. The final Taligent employees got sucked into IBM that year, ending a fascinating drama in operating systems and frameworks.
Whatever the inside baseball story, Apple decided to bring Steve Jobs back in, in 1997. And he brought NeXT, which gave the Mac all the object-oriented goodness they wanted. They got Objective-C, Mach (through Avie Tevanian of Carnegie Mellon), Property Lists, AppWrappers (.app), Workspace Manager (which begat the Finder), the Dock, and NetInfo. And they finally retired the Apple Bonkers server. But as importantly as anything else, they got Bertrand Serlet and Craig Federighi, who, as the next major VPs of Software, were able to keep the ship pointed in the right direction, and by 2001 they gave us 10.0: Cheetah.

  • Darwin (kinda' like Unix) with Terminal
  • Mail, Address Book, iTunes
  • AppleScript survived, AppleTalk didn't
  • Aqua UI, Carbon and Cocoa APIs
  • AFP over TCP/IP, HTTP, SSH, and FTP server/client
  • Native PDF support

It began a nearly 20-year journey that we are still on. So in the end, the Pinks never shipped an operating system, despite their best intentions. And the Blues never paid down their technical debt. Despite their best intentions. As engineers, we need a plan. We need to ship incrementally. We need good, sane cultures that can work together. We need to pay down technical debt - but we don't need to run amok building technology that's a little ahead of our time. Even if it's always just two more years ahead of our time. And I think we're at time, gentle listeners. I hope it doesn't take me two years to ship this. But if it does or doesn't, thanks for tuning into another episode of the History of Computing Podcast. We're lucky to have you. Have a great day!


Making Disks Flexible, Part 2

     3/8/2020

The floppy disk is one of the most iconic pieces of technology. While it's not in use in the modern day, there was a period of 40 years where the floppy disk was synonymous with data storage. Today we pick up where we finished in the last episode, with the rise and fall of the 5 1/4 inch disk. We will be looking at the creation and spread of the 3 1/2 inch floppy disk. How did Sony, a non-player in the computer market, create this runaway success? And how did Apple contribute to its rise?

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1980: Sony Invents Microfloppy Disk
1983: Apple Builds Prototype Mac with 3 1/2 Inch Floppy


Applesoft BASIC, Microsoft and Apple's First Collaboration

     4/19/2020

It's easy to think of Apple and Microsoft as bitter rivals, but that hasn't always been the case. The two companies have a very complicated relationship, and a very long history. This connection goes all the way back to the 1970s and a product called Applesoft BASIC. It would become stock software on nearly every Apple II computer ever sold, it kept Apple competitive in the early home computer market, and it may have saved Microsoft from bankruptcy.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1997: Bill Gates saves Apple from Bankruptcy
1976: Apple I hits shelves, Integer BASIC soon follows
1977: Apple II Released
1978: AppleSoft BASIC Ships


A Guided Tour of the Macintosh

     5/10/2020

In this byte-sized episode I take a look at a pack-in that came with the first Macintosh. Alongside Apple stickers, manuals, and the computer itself there was a single cassette tape labeled "A Guided Tour of the Macintosh". The purpose? It's a strange addition to the Mac's packaging, but a great example of Apple's attention to detail and ingenuity.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1984: A Guided Tour of the Macintosh Released


Evolution of the Mouse

     12/2/2019

The computer mouse is a ubiquitous device; it's also one of the least changed devices we use with a computer. The mice we use today have only seen small incremental improvements since the first mouse was developed. So how did such a long-lasting design take shape, and how did it travel the decades up to now?

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:

1961: First Mouse Developed at Engelbart's ARC Lab
1972: Xerox Develops Rollerball Mouse for Alto
1979: Apple LISA Mouse Designed


The Apple II (Part III)

     5/13/2014

Third part on the Apple II:

  • News

  • New acquisitions

  • Feedback

  • Books, Software, Modern Upgrades, Online Stores, Emulation, Current Web Sites

  • Special guest host Carrington Vanston!!

Items mentioned in this episode:

News

New Acquisitions

Vintage Computer Shows

Books

  • Compute’s First, Second and Third Book of Apple

  • Apple II User’s Guide by Lon Poole

  • Programming Surprises & Tricks for your Apple II/IIe Computer by David L. Heiserman

  • AppleSoft Tutorial from Apple, Inc. - based on Apple II BASIC Programming Manual by Jef Raskin; rewritten for AppleSoft by Caryl Richardson

  • Beneath Apple DOS by Don Worth and Peter Lechner - Beneath Apple DOS is intended to serve as a companion to Apple's DOS Manual, providing additional information for the advanced programmer or the novice Apple user who wants to know more about the structure of diskettes.

  • Apple II/IIe Computer Graphics by Ken Williams, founder and CEO of Sierra On-Line Inc

  • AppleSoft BASIC Toolbox by Larry Wintermeyer

  • Apple Graphics Games by Paul Coletta

  • Machine Language for Beginners by Richard Mansfield

  • Micro Adventure is the title of a series of books for young adult readers, published by Scholastic, Inc.

  • Golden Flutes & Great Escapes by Delton Horn

  • Sophistication and Simplicity, the Life and Times of the Apple II Computer by Steve Weyhrich, 2013 - http://www.amazon.com/dp/0986832278/?tag=flodaypod-20

  • The New Apple II User’s Guide by David Finnegan, 2012 - http://www.amazon.com/dp/0615639879/?tag=flodaypod-20

  • iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It, by Steve Wozniak and Gina Smith, 2006 - http://www.amazon.com/dp/0393061434/?tag=flodaypod-20

  • Steve Jobs by Walter Isaacson, 2011 - http://www.amazon.com/dp/1451648537/?tag=flodaypod-20

  • WOZPAK Special Edition - http://www.amazon.com/dp/1304231321/?tag=flodaypod-20

  • What’s Where in the Apple by Prof. William F. Luebbert - http://www.whatswhereintheapple.com/

Software

Modern Upgrades & Connectivity Options

Online Stores

Emulation

Current Web Sites & Other Forums

Other Books/Sites Used for Reference



Interview with Apple II Fan Ken Gagne

     7/20/2014

Bonus episode this month. Interview with Apple II enthusiast Ken Gagne about KFest, the Open Apple Podcast, Juiced.GS, and more.




The Apple II (Part II)

     3/26/2014

Second part on the Apple II:

  • New acquisitions

  • News

  • Tech specs, peripherals, magazines, user groups, shows

  • Special guest host Carrington Vanston!!

Items mentioned in this episode:

New Acquisitions

Vintage Computer Shows

Magazines

User Groups

Shows

Other Books/Sites Used for Reference



The Apple II, Part I, History with Steve Weyhrich

     2/5/2014

First part on the Apple II:

  • Personal memories of the Apple II.

  • New acquisitions.

  • News.

  • Feedback.

  • History of the Apple II.

  • Special guest host Steve Weyhrich, the man who literally wrote the book on Apple II history!!

 Links mentioned in this episode:

 New Acquisitions

 Vintage Computer Shows

 Feedback


Other News

History



The Apple 1

     6/8/2013

Links mentioned in the show:


Return of Viruses: The Spread

     10/18/2020

It's time to round out spook month with a return to one of last year's topics: the computer virus. Malicious code traveling over networks is actually a relatively new phenomenon; early viruses were much different. In this episode we examine ANIMAL and Elk Cloner, two early viruses that were meant as practical jokes and spread by hapless computer users. Along the way we will see cases of parallel evolution, name calling, and find out if there is any one origin to the word "virus".

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing


The Immutable Laws of Game Mechanics In A Microtransaction-Based Economy

     12/23/2020

Once upon a time, we put a quarter in a machine and played a game for a while. And life was good. The rise of personal computers and the subsequent fall in the cost of microchips allowed some of the same chips found in early computers, such as the Zilog Z80, to bring video game consoles into homes across the world. That one chip could be found in the ColecoVision, Nintendo Game Boy, and the Sega Genesis. Given that many of the cheaper early computers came with joysticks for gaming at the time, the line between personal computer and video game console seemed blurry.

Then came the iPhone, which brought an explosion of apps. Apps were anywhere from a buck to a hundred. We weren't the least bit surprised by the number of games that exploded onto the platform. Nor by the creativity of the developers. When the Apple App Store and Google Play added in-app purchasing, and later in-app subscriptions, it all just seemed natural. But it has profoundly changed the way games are purchased and distributed, and the entire business model of apps.

The Evolving Business Model of Gaming

Video games were originally played in arcades, similar to pinball. The business model was simple: each game cost a quarter or a token. With the advent of PCs and video game consoles, games were bought in stores, as were the records or cassettes that held music. The business model was that the store made money (40-50%), the distributor who got the game into a box and onto the shelf made money, and the company that made the game got some as well. And discounts to sell more inventory usually came out of someone not called the retailer. By the time everyone involved got a piece, it was common for the maker of the game to get between $5 and $10 per unit sold on a $50 game.

No one was surprised that there was a whole cottage industry of software piracy. Especially given that most games could be defeated in 40 to 100 hours. This of course spawned a whole industry to thwart piracy, eating into margins but theoretically generating more revenue per game created. 

Industries evolve. Console and computer gaming split (although arguably consoles have always just been computers) and the gamer-verse further schism'd between those who played various types of games. Some games were able to move to subscription models, and some companies sprang up to deliver games through subscriptions or as rentals (game rentals over a modem was the business model that originally inspired the AOL founders). And that was OK for the gaming industry, which slowly grew to the point that gaming was a larger industry than the film industry.

Enter Mobile Devices and App Stores

Then came mobile devices, disrupting the entire gaming industry. Apple began the App Store model, establishing that the developer got 70% of the sale - much better than 5%. Steve Jobs had predicted the coming App Store in a 1985 interview, and when the iPhone was released he tried to keep the platform closed, but eventually capitulated and opened up the App Store to developers.

Those first developers made millions. Some developers were able to port games to mobile platforms and try to maintain a similar pricing model to the computer or console versions. But the number of games created a downward pressure that kept games cheap, and often free. 

The number of games in the App Store grew (today there are over 5 million apps between Apple and Google). With a constant downward pressure on price, the profits dropped. Suddenly, game developers forgot that they often used to get just 10 percent of the sale of a game, and started to blame the companies that owned the stores their games were distributed in: Apple, Google, and in some cases, Steam.

The rise and subsequent decrease in popularity of Pokémon Go was the original inspiration for this article in 2016, but since then a number of games have validated these perspectives. These free games provide a valuable case study into how the way we design a game to be played (known as game mechanics) impacts our ability to monetize the game in various ways. And there are lots and lots of bad examples in games (and probably legislation on the way to remedy abuses) that also tell us what not to do.

The Microtransaction-Based Economy

These days, game developers get us hooked on the game early, get us comfortable with the pace of the game and give us an early acceleration. But then that slows down. Many a developer then points us to in-app purchases in order to unlock items that allow us to maintain the pace of a game, or even to hasten the pace. And given that we're playing against other people a lot of the time, they try and harness our natural competitiveness to get us to buy things. These in-app purchases are known as microtransactions. And the aggregate of these in-app purchases can be considered as a microtransaction-based economy.

As the microtransaction-based economy has arrived in full force, certain standards are emerging as cultural norms for these economies. And violating these rules causes vendors to get blasted on message boards and, more importantly, lose rabid fans of the game. As such, I've decided to codify my own set of laws for these, which are as follows:

All items that can be purchased with real money should be available for free. 

For example, when designing a game that has users building a city, if we develop a monument that users can pay $1 for and place in their city to improve the morale of those who live there, that monument should be able to be earned in the game as well. Otherwise, you're able to pay for an in-app purchase that gives some players an advantage for doing nothing more than spending money.

In-app purchases do not replace game play, but hasten the progression through the game. 

For example, when designing a game that has users level up based on earning experience points for each task they complete, we never want to just gift experience points based on an in-app purchase. Instead, in-app purchases should provide a time-bound amplification to experience (such as doubling experience for 30 minutes in Pokémon Go or keeping anyone else from attacking a player for 24 hours in Clash of Clans so we can save enough money to buy that one Town Hall upgrade we just can’t live without). 
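To make the distinction concrete, here's a minimal sketch in Python of selling a time-bound amplifier rather than gifting experience outright. The class and method names are hypothetical, invented for illustration; no real game's code is being quoted.

    import time

    class Player:
        def __init__(self):
            self.xp = 0
            self.boost_expires = 0.0   # epoch seconds; 0 means no boost active

        def buy_xp_boost(self, duration_seconds=1800):
            # The purchase sells time, not experience: a 30-minute
            # window during which earned XP is doubled.
            self.boost_expires = time.time() + duration_seconds

        def earn_xp(self, amount):
            # XP is only ever granted for completing tasks in the game;
            # the boost amplifies play, it never replaces it.
            multiplier = 2 if time.time() < self.boost_expires else 1
            self.xp += amount * multiplier

    player = Player()
    player.earn_xp(50)       # normal play: +50 XP
    player.buy_xp_boost()    # buy a 30-minute double-XP window
    player.earn_xp(50)       # the same task now yields +100 XP
    print(player.xp)         # 150

Note that there is no method that adds XP directly from a purchase; the only thing money buys is the multiplier's clock.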

The amount paid for items in a game should correlate to the amount of time saved in game play. 

For example, say we get stuck on a level in Angry Birds. We could pay a dollar for a pack of goodies that will get us past that level (and probably 3 more), so we can move on. Or we could keep hammering away at that level for another hour. Thus, we saved an hour, but lost pride points in the fact that we didn't conquer that level. Later in the game, we can go back and get three stars without paying to get past it.

Do not allow real-world trading. 

This is key. If it's possible to build an economy outside the game, players can then break your game mechanics. For example, in World of Warcraft, you can buy gold and magic items online for real money and then log into the game only to have another shady character add those items to your inventory. This leads to people writing programs known as bots (short for robots) to mine gold or find magic items on their behalf so they can sell them in the real world. There are a lot of negative effects to such behavior, including the need to constantly monitor for bots (which wastes a lot of developer cycles), an in-game economy that practically crashes when a game update (e.g. a new map) breaks the bots, and games that become both more confusing for users and less controllable by the developer.

Establish an in-game currency.

 You don’t want users of the game buying things with cash directly. Instead, you want them to buy a currency, such as gold, rubies, gems, karma, or whatever you’d like to call that currency. Disassociating purchases from real world money causes users to lose track of what they’re buying and spend more money. Seems shady, and it very well may be, but I don’t write games so I can’t say if that’s the intent or not. It’s a similar philosophy to buying poker chips, rather than using money in a casino (just without the free booze).

Provide multiple goals within the game.

Players will invariably get bored with the critical path in your game. When they do, it's great for players to find other aspects of the game to keep them engaged. For example, in Pokémon Go, you might spend 2 weeks trying to move from level 33 to level 34. During that time, you might as well go find that last Charmander so you can evolve it into a Charizard. That's two different goals: one to locate a creature, the other to gain experience. Or you can go take over some gyms in your neighborhood. Or you can power-level by catching hundreds of Pidgeys. The point is, to keep players engaged during long periods with no progression, having choose-your-own-adventure-style gameplay is important. For massively multiplayer games (especially role-playing games) this is critical, as players will quickly tire of mining for gold and want to go, for example, jump into the latest mass land war. To place a little context around this, there are also 28 medals in Pokémon Go (that I'm aware of), which keep providing more and more goals in the game.

Allow for rapid progression early in the game in order to hook users, so they will pay for items later in the game.

We want people to play our games because they love them. Less than 3% of players will transact an in-app purchase in a given game, but that number skyrockets as time is invested in a game. Quickly progressing through levels early in a game keeps users playing. Once users have played a game for 8 or 9 hours, if you tell them that for a dollar it will seem like they kept playing for another 8 or 9 hours, based on the cool stuff they'll earn, they're likely to give up that dollar and keep playing for another couple of hours rather than get that much-needed sleep! We should never penalize players that don't pay up. In fact, players often buy things that simply change the look of their character in games like Among Us. There is no need to impact game mechanics with purchases if we build an awesome enough game.

Create achievable goals in discrete amounts of time. 

Boom Beach villages range from level 1 to level 64. As players rise through the levels, reaching the next stage becomes logarithmically more difficult, given other players are paying to play. Goals against computer players (or NPCs, or AI, depending on how we want to think of it) are similar. All should be achievable, though. The game Runeblade for the Apple Watch was based on fundamentally sound game mechanics that could enthrall a player for months; however, there was no way to get past a certain point. Therefore, players lost interest, Eric Cartman-style, and went home.

Restrict the ability to automate the game.

If we had the choice to run every day to lose weight, or to eat donuts and watch people run and still lose weight, which would most people choose? Duh. The problem is that when players automate your game, they end up losing interest as their time investment in the game diminishes, as does the skill level necessary to shoot up through the levels. Evony Online was such a game; I'm pretty sure I still get an email every month chastising me for botting the game, 8-10 years after anyone remembers that the game existed. When World of Warcraft became too dependent on resources obtained by gold-mining bots, the in-game economy could practically crash when they were knocked offline. Having said this, such drama adds to the intrigue - which can be a game inside a game for many.

Pit players against one another.

Leaderboards. Everyone wants to be in 1st place, all the time. Or to see themselves moving up in the rankings. By providing a ranking system, we increase engagement and drive people towards making in-app purchases. Those purchases just shouldn't directly buy a leg up. It's a slippery slope to allow a player to jump the 30 people in front of them to get to #1,000 in the rankings, only to see those people make in-app purchases of their own, creating an addiction to in-app purchases in order to maintain position in the rankings. It's better to make smaller amounts and keep players around than to have them hate a developer once they've realized the game was making money off addiction. Sounds a bit like gambling.

Don’t pit weak players against strong players unnecessarily. 

In Clash of Clans a player builds a village. As they build more cool stuff in the village, the village levels up. The player can buy gems to complete buildings faster, so you can basically buy village levels. But since a player can basically buy levels, the levels can exceed the player's skill. Therefore, in order to pit matched players in battles, a second metric was introduced, based on the won/lost ratios of battles. By ensuring that players of similar skill duel one another, the skill of players is more likely to progress organically, and therefore they remain engaged with the game. The one exception to this rule that I've seen actually work well so far has been in Pokémon Go, where a player needs to be physically close to a gym, rather than just close to it while sitting in their living room playing on a console. That geographical alignment really changes this dynamic, as does the great way that gym matches heavily favor attackers, driving fast turnover in gyms and keeping the game accessible to lower-level players.
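A minimal sketch of that two-metric idea in Python, with hypothetical field names (real games use something closer to Elo ratings or trophy counts, but the principle is the same): match on a skill number derived from battle history, and ignore the buyable level entirely.

    from dataclasses import dataclass

    @dataclass
    class Player:
        name: str
        village_level: int   # can be bought, so a poor proxy for skill
        wins: int = 0
        losses: int = 0

        @property
        def skill(self):
            # Win ratio as a crude skill rating.
            total = self.wins + self.losses
            return self.wins / total if total else 0.5

    def find_opponent(player, pool, max_skill_gap=0.1):
        # Pick the closest skill match; village_level never enters into it.
        candidates = [p for p in pool
                      if p is not player
                      and abs(p.skill - player.skill) <= max_skill_gap]
        return min(candidates,
                   key=lambda p: abs(p.skill - player.skill),
                   default=None)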

Add time-based incentives. 

If a player logs into a game every day, they should get a special incentive for the day, one that amplifies the more days they log in in a row. Or, if they don't log in, another player can steal all their stuff. Players get a push alert when another player attacks them. There are a number of different ways to incentivize players to keep logging into an app. The more we keep players in an app, the more likely they are to make a purchase. Until they get so many alerts that they delete your app. Don't do that.
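One common shape for this, sketched in Python with hypothetical fields and reward numbers: a daily bonus that grows with the streak and resets when a day is missed.

    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class Player:
        last_login: date
        streak: int = 0
        coins: int = 0

    def daily_reward(streak_days, base=10):
        # The bonus amplifies with consecutive days, capped at a week
        # so the incentive never dwarfs actual gameplay rewards.
        return base * min(streak_days, 7)

    def on_login(player, today):
        gap = (today - player.last_login).days
        if gap == 0:
            return               # already rewarded today
        player.streak = player.streak + 1 if gap == 1 else 1
        player.last_login = today
        player.coins += daily_reward(player.streak)

    p = Player(last_login=date.today() - timedelta(days=1), streak=3)
    on_login(p, date.today())
    print(p.streak, p.coins)     # 4 40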

Incentivize pure gameplay. 

 It might seem counter-intuitive to incentivize players to not use in-app purchases. But not allowing for a perfect score on an in-app purchase (e.g. not allowing for a perfect level in Angry Birds if you used an in-app purchase) will drive more engagement in a game, while likely still allowing for an in-app purchase and then a late-game strategy of finding perfection to unlock that hidden extra level, or whatever the secret sauce is for your game.

Apply maximum purchasing amounts.

Games can get addictive for players. We want dolphins, not whales. This is to say that we want people to spend what they would have spent on a boxed game, say $50, or even that per month. But when players get into spending thousands per day, they're likely to at some point realize their error in judgement and contact Apple or Google for a refund. And they should get one. Don't take advantage of people. 

Make random returns on microtransactions transparent.

There has been talk of regulating randomized loot boxes. Why? Because the numbers don't add up. Rampant abuse of in-app purchases for random gear means that developers who publish the algorithm or source code for how those rewards are derived will have a certain level of non-repudiation when the lawsuits start. Again, if those rewards can also be earned during the game (maybe at a lower likelihood), then we're not abusing game mechanics.
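What publishing the odds can look like in practice - a minimal Python sketch with a hypothetical drop table (the item names and probabilities here are made up for illustration):

    import random

    # Published drop table: exposing these exact odds to players is the
    # transparency the law above is after.
    DROP_TABLE = [
        ("common sword",    0.70),
        ("rare shield",     0.25),
        ("legendary mount", 0.05),
    ]

    def open_loot_box(rng=random.random):
        roll, cumulative = rng(), 0.0
        for item, probability in DROP_TABLE:
            cumulative += probability
            if roll < cumulative:
                return item
        return DROP_TABLE[-1][0]   # guard against floating-point drift

    print(open_loot_box())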

Conclusion

The above list might seem manipulative at times, especially to those who don't write code for a living. And to some degree it is. But it can be done ethically, and when it is, the long-term returns are greater. If nothing else, these laws are a code of ethics of sorts.

These are lessons that hundreds of companies are out there learning by trial and error, and hopefully documenting them can help emergent companies not have to repeat some of the same mistakes of others. 

We could probably get up to 100 of these (with examples) if we wanted to! What laws have you noticed?


Apple: The Apple I computer to the ///

     1/30/2021

I've been struggling with how to cover a few different companies, topics, or movements for a while. The lack of coverage of their stories thus far has little to do with their impact, and everything to do with trying to find where to put them in the history of computing. One of the most challenging is Apple. This is because there isn't just one Apple. Instead there are stages of the company, each with their own place in the history of computers.

Today we can think of Apple as one of the Big 5 tech companies, which include Amazon, Apple, Google, Facebook, and Microsoft. But there were times in the evolution of the company where things looked bleak. Like maybe they would get gobbled up by another tech company. To oversimplify the development of Apple, we’ll break up their storied ascent into four parts:

  • Apple Computers: This story runs from the mid-1970s to the mid-1980s and covers Apple rising out of the hobbyist movement and into a gangbuster IPO. The Apple I through III families all centered on one family of chips and took the company into the 90s.
  • The Macintosh: The rise and fall of the Mac covers the introduction of the now-iconic Mac through to the Power Macintosh era. 
  • Mac OS X: This part of the Apple story begins with the return of Steve Jobs to Apple and the acquisition of NeXT, looks at the introduction of the Intel Macs and takes us through to the transition to the Apple M1 CPU.
  • Post PC: Steve Jobs announced the "post-PC" era in 2007, and in the coming years the sales of PCs fell for the first time, while tablets, phones, and other devices emerged as the primary devices people used.

We'll start with the early days, which I think of as the first of the four key Apple stages of development. And those early days go back far past the days when Apple was hawking the Apple I. They go back to high school.

Jobs and Woz

Bill Fernandez and Steve Wozniak built a computer they called “The Cream Soda Computer” in 1970 when Bill was 16 and Woz was 20. It was a crude punch card processing machine built from some parts Woz got from the company he was working for at the time.

Fernandez introduced Steve Wozniak to a friend from middle school because they were both into computers and both had a flair for pranky rebelliousness. That friend was Steve Jobs.

By 1972, the pranks turned into their first business. Wozniak designed Blue Boxes, initially conceived by Cap'n Crunch John Draper, who got his phreaker name from a whistle in a Cap'n Crunch box that made a tone at 2600 Hz, which sent AT&T phones into operator mode. Draper would actually be an Apple employee for a bit. They designed a digital version and sold a few thousand dollars' worth.

Jobs went to Reed College. Wozniak went to Berkeley. Both dropped out.

Woz got a sweet gig at HP designing calculators, where Jobs had worked a summer job in high school. Jobs, meanwhile, went to India to find enlightenment. When Jobs became employee number 40 at Atari, he got Wozniak to help create Breakout. That was the year the Altair 8800 was released, and Wozniak went to the first meeting of a little club called the Homebrew Computer Club in 1975, when they got an Altair so the People's Computer Company could review it. And that was the inspiration. Having already built one computer with Fernandez, Woz designed schematics for another. Going back to the Homebrew meetings to talk through ideas and nerd out, he got it built and, proud of his creation, returned to Homebrew with Jobs to give out copies of the schematics for everyone to play with. This was the age of hackers and hobbyists. But that was about to change, ever so slightly.

The Apple I 

Jobs had this idea. What if they sold the boards? They came up with a plan. Jobs sold his VW Microbus and Wozniak sold his HP-65 calculator, and they got to work. Simple math: they could sell 50 boards for $40 each and make some cash like they'd done with the blue boxes. But you know, a lot of people didn't know what to do with the board. Sure, you just needed a keyboard and a television, but that still seemed a bit much.

Then came a little bigger plan - what if they sold 50 full computers? They went to the Byte Shop and talked them into buying 50 for $500 each. They dropped $20,000 on parts and netted a $5,000 return. They'd go on to sell about 200 of the Apple Is between 1976 and 1977.

It came with a MOS 6502 chip running at a whopping 1 MHz and with 4KB of memory, which could go to 8. They provided Apple BASIC, as most vendors did at the time. That MOS chip was critical. Before it, many used an Intel or the Motorola 6800, which went for $175. But the MOS 6502 was just $25. It was an 8-bit microprocessor designed by a team that Chuck Peddle ran after leaving the 6800 team at Motorola. Armed with that chip at that price, and with Wozniak’s understanding of what it needed to do and how it interfaced with other chips to access memory and peripherals, the two could do something new. 

They started selling the Apple I, and to quote an ad: "the Apple comes fully assembled, tested & burned-in and has a complete power supply on-board, initial set-up is essentially "hassle free" and you can be running in minutes." This really tells you something about the computing world at the time. There were thousands of hobbyists and many had been selling devices. But this thing had on-board RAM, and you could just add a keyboard and video and not have to read LEDs to get output. The marketing descriptions were pretty technical by modern Apple standards, telling us something of the users. It sold for $666.66.

They got help from Patty Jobs building logic boards. Jobs’ friend from college Daniel Kottke joined for the summer, as did Fernandez and Chris Espinosa - now Apple’s longest-tenured employee. It was a scrappy garage kind of company. The best kind. 

They made the Apple I until a few months after they released its successor. But the problem with the Apple I was that there was only one person who could actually support it when customers called: Wozniak. And he was slammed, busy designing the next computer and all the components needed to take it to the mass market, like monitors, disk drives, etc. So they offered a discount for anyone returning their Apple I, and destroyed most of the returned units. Those Apple I computers have since been auctioned for hundreds of thousands of dollars, all the way up to $1.75 million.

The Apple II

They knew they were on to something. But a lot of people were building computers. They needed capital if they were going to bring in a team and make a go at things. But Steve Jobs wasn’t exactly the type of guy venture capitalists liked to fund at the time.

Mike Markkula was a product-marketing manager at chip makers Fairchild and Intel who retired early after making a small fortune on stock options. That is, until he got a visit from Steve Jobs. He brought money, but more importantly, the kind of assistance only a veteran of a successful corporation who'd ridden that wave could bring. He brought in Michael "Scotty" Scott, employee #4, to be the first CEO, and they got to work on mapping out an early business plan. If you notice the overlapping employee numbers, Scotty might have had something to do with that…

As you may notice by Wozniak selling his calculator, at the time computers weren’t that far removed from calculators. So Jobs brought in a calculator designer named Jerry Manock to design a plastic injection molded case, or shell, for the Apple II. They used the same chip and a similar enough motherboard design. They stuck with the default 4KB of memory and provided jumpers to make it easier to go up to 48. They added a cassette interface for IO. They had a toggle circuit that could trigger the built-in speaker. And they would include two game paddles. This is similar to bundles provided with the Commodore and other vendors of the day. And of course it still worked with a standard TV - but now that TVs were mostly color, so was the video coming out of the Apple II. And all of this came at a starting price of $1,298.

The computer initially shipped with a version of BASIC written by Wozniak, but Apple later licensed Microsoft's 6502 BASIC to ship what they called Applesoft BASIC - short for Apple and Microsoft. Here, they turned to Randy Wigginton, who was Apple's employee #6 and had gotten rides to the Homebrew Computer Club from Wozniak as a teenager (since he lived down the street). He and others added features onto Microsoft BASIC to free Wozniak to work on other projects. They then decided they needed a disk operating system, or DOS. Here, rather than license CP/M, the industry standard at the time, Wigginton worked with Shepardson, who did various projects for CP/M and Atari.

The motherboard on the Apple II remains an elegant design. There were certain innovations that Wozniak made, like cutting down the number of DRAM chips by sharing resources between other components. The design was so elegant that Bill Fernandez had to join them as employee number four, in order to help take the board and create schematics to have it silkscreened. The machines were powerful.

All that needed juice. Jobs asked his former boss Al Alcorn for someone to help out with that. Rod Holt, employee number 5, was brought in to design the power supply. By implementing a switching power supply, as Digital Equipment had done in the PDP-11, rather than a transformer-based power supply, the Apple II ended up being far lighter than many other machines. 

The Apple II was released in 1977 at the West Coast Computer Faire. It, along with the TRS-80 and the Commodore PET, would become the 1977 Trinity, which isn't surprising. Remember Peddle, who ran the 6502 design team? He designed the PET. And Steve Leininger was also a member of the Homebrew Computer Club, who happened to work at National Semiconductor when Radio Shack/Tandy started looking for someone to build them a computer.

The machine was stamped with an Apple logo. Jobs hired Rob Janoff, a local graphic designer, to create it. This was a picture of an apple made out of a rainbow, showing that the Apple II had color graphics. This rainbow Apple stuck and remained the logo for Apple Computer until 1998, after Steve Jobs returned to Apple, when the logo went all-black. The silhouette is now iconic, serving Apple for 45 years and counting.

The computers were an instant success and sold quickly. But others were doing well in the market, some incumbents and some new. Red oceans mean we have to improve our effectiveness. So this is where Apple had to grow up to become a company. Markkula made a plan to get Apple to $500 million in sales in 10 years on the back of his $92,000 investment and another $600,000 in venture funding.

They did $2.7 million in sales in 1977. This idea of selling a pre-assembled computer to the general public was clearly resonating. Parents could use it to help teach their kids. Schools could use it for the same. And when we were done with all that, we could play games on it. Write code in BASIC. Or use it for business. Make some documents in WordStar, spreadsheets in VisiCalc, or use one of the thousands of titles available for the Apple II. Sales grew 150x by 1980.

Given that many thought cassettes were for home machines and floppies were for professional machines, it was time to move away from tape. Markkula realized this and had Wozniak design a floppy disk drive for the Apple II, which went on to be known as the Disk II. Wozniak had experience with disk controllers and studied the latest available. Wozniak again managed to come up with a value-engineered design that allowed Apple to produce a good drive for less than any other major vendor at the time. Wozniak would later go on to say that it was one of his best designs (and many contemporaries agreed).

Markkula filled gaps as well as anyone. He even wrote free software programs under the name of Johnny Appleseed, a name also used for years in product documentation. He was a classic hacker type of entrepreneur on their behalf, sitting in the guerrilla marketing chair some days, acting as president of the company on others, and mentoring Jobs on still others.

From Hobbyists to Capitalists

Here's the thing - I've always been a huge fan of Apple. Even in their darkest days, which we'll get to in later episodes, they represented an ideal. But going back to the Apple I, they were nothing special. Even the Apple II. Osborne, Commodore, Vector Graphics, Atari, and hundreds of other companies were springing up, inspired first by that Altair and then by the rapid drop in the prices of chips.

The impact of the 1 megahertz barrier and the cost of those MOS 6502 chips was profound. The MOS 6502 chip would be used in the Apple II, the Atari 2600, the Nintendo NES, and the BBC Micro. And along with the Zilog Z80 and Intel 8080, it would spark a revolution in personal computers. Many of those companies would disappear - in what we'd think of as a personal computer bubble, if there had been more money in it. But those that survived took things an order of magnitude higher. Instead of making millions they were making hundreds of millions. Many would even go to war in a race to the bottom on prices. And this is where Apple started to differentiate themselves from the rest.

For starters, due to how anemic the default Altair was, most of the hobbyist computers were all about expansion. You can see it on the Apple I schematics and you can see it in the minimum of 7 expansion slots in the Apple II lineup of computers. Well, all of them except the IIc, marketed as a more portable type of device, with a handle and an RCA connection to a television for a monitor. 

The media seemed to adore them. In an era of JR Ewing of Dallas, Steve Jobs was just the personality to emerge and still somewhat differentiate the new wave of computer enthusiasts. Coming at the tail end of an era of social and political strife, many saw something of themselves in Jobs. He looked the counter-culture part. He had the hair, but also this drive. The early 80s were going to be all about the yuppies, though - and Jobs was putting on a suit. Many identified with that as well.

Fueled by the 150x sales performance shooting them up to $117M in sales, Apple filed for an IPO, going public in 1980 and creating hundreds of millionaires, including at least 40 of their own employees. It was the biggest IPO since Ford in 1956, the year after Steve Jobs was born. The stock was filed at $14 and shot up to $29 on the first day alone, leaving Apple sitting pretty on a $1.778 billion valuation.

Scotty, who brought the champagne, made nearly $100M. One of the venture capitalists, Arthur Rock, made over $21M on a $57,600 investment. Rock had been the one to convince the team leaving Shockley Semiconductor to found Fairchild, a key turning point in putting the silicon in Silicon Valley. When Noyce and Moore left there to found Intel, he was involved. And he stayed in touch with Markkula, who was so enthusiastic about Apple that Rock invested and began a stint on Apple’s board of directors in 1978. Rock is often portrayed as the villain in the story of Steve Jobs. But let’s think about something for a moment. Rock was a backer of Scientific Data Systems, purchased by Xerox in 1969 to become Xerox Data Systems. Certainly not Xerox PARC - in fact, the anti-PARC - but it helped connect Jobs to Xerox later, as Rock served on the board of Xerox.

The IPO Hangover

Money is great to have but also causes problems. Teams get sidetracked trying to figure out what to do with their hauls - like Rod Holt’s $67M haul that day. It’s a distraction at a time when executional excellence is critical. We have to bring in more people fast, which creates the scenario Mike Scott referred to as a “bozo explosion.” Suddenly, more people actually make us less effective.

Growing teams all want a seat at a limited table. Innovation falls off as we rush to keep up with the orders and needs of existing customers. Bugs, bigger code bases to maintain, issues with people doing crazy things. 

Taking our eyes off the ball and normalizing the growth can be hard. By 1981, Scotty was out after leading some substantial layoffs.  Apple stock was down. A big IPO also creates investments in competitors. Some of those would go on a race to the bottom in price. 

Apple didn’t compete on price. Instead, they started to plan the next revolution, a key piece of Steve Jobs emerging as a household name. They would learn what the research and computer science communities had been doing and bring a graphical interface and mouse to the world with the Lisa - and with a smaller project brought forward at the time by Jef Raskin, one that Jobs tried to kill but that Markkula not only approved, but kept Jobs from killing: the Macintosh.

Fernandez, Holt, Wigginton, and even Wozniak just drifted away or got lost in the hyper-growth of the company, as is often the case. Some came back. Some didn’t. Many of us go through the same in rapidly growing companies. 

Next (but not yet NeXT)

But a new era of hackers was on the way. And a new movement as counter to the big computer culture as Jobs himself. But first, they needed to take a trip to Xerox. In the meantime, the Apple III was an improvement but proved that the Apple line of computers had run its course. They released it in 1980, recalled the first 14,000 machines, and never passed 75,000 machines sold, killing off the line in 1984. A special year.


The Apple Lisa

     2/2/2021

Apple found massive success on the back of the Apple II. They went public like many of the late-70s computer companies, and the story could have ended there, as it did for many computer companies of the era - some of whom were potentially bigger, had better technology or better go-to-market strategies, or were even far more innovative.

But it didn’t. The journey to the next stage began with the Apple IIc, Apple IIgs, and other incrementally better, faster, or smaller models. Those funded the research and development of a number of projects. One was a new computer: the Lisa. I bet you thought we were jumping into the Mac next. Getting there. But twists and turns, as the title suggests. 

The success of the Apple II led to many of the best and brightest minds in computers wanting to go work at Apple. Jobs came to be considered a visionary. The pressure to actually become one has been the fall of many a leader. And Jobs almost succumbed to it as well. 

Some go down due to a lack of vision, others because they don’t have the capacity for executional excellence. Some lack lieutenants they can trust. The story isn’t clear with Jobs. He famously sought perfection. And sometimes he got close. 

The Xerox Palo Alto Research Center, or PARC for short, had been a focal point of raw research and development since 1970. They inherited many great innovations, outlandish ideas, amazing talent, and decades of research from academia and Cold War-inspired government grants. Ever since Sputnik, the National Science Foundation and the US Advanced Research Projects Agency had funded raw research. During Vietnam, that funding dried up and private industry moved in to take products to market.

Arthur Rock had come into Xerox in 1969, on the back of an investment in Scientific Data Systems. While on the board of Xerox, he got to see the advancements being made at PARC. PARC hired some of the oNLine System (NLS) team, who helped ship the Xerox Alto in 1973 - a couple thousand were built. Xerox followed that up with the Xerox Star in 1981, selling about 20,000. And PARC had been at it the whole time, inventing all kinds of goodness.

And so always thinking of the next computer, Apple started the Lisa project in 1978, the year after the release of the Apple II, when profits were just starting to roll in. 

Story has it that Steve Jobs secured a visit to PARC and made off with the idea for a windowing personal computer GUI, complete with a desktop metaphor. But not so fast. Apple had already begun the Lisa and Macintosh projects before Jobs visited Xerox. And the Alto had been shown off internally at Xerox in 1977, complete with Mother of All Demos-esque theatrics on stage using remote computers. They had the GUI, the mouse, and networking - while the other computers released that year, the Apple II, the Commodore PET, and the TRS-80, were still doing what Dartmouth, the University of Illinois, and others had been doing since the 60s - just at home instead of on time-sharing computers.

In other words, enough people in computing had seen the oNLine System from SRI. The graphical interface was coming and wouldn’t be stopped. The mouse had been written about in scholarly journals. But it was all pretty expensive. The visits to PARC, and hiring some of its engineers, helped the teams at Apple figure out some of the problems they didn’t even know they had. They helped make things better and helped the team get there a little quicker. But by then the coming evolution in computing was inevitable.

Still, the Xerox Star was considered a failure. But Apple said “hold my beer” and got to work on a project that would become the Lisa. It started off simply enough: some ideas from Apple executives like Steve Jobs, and then 10 people, led by Ken Rothmuller, tasked with developing a system with windows and a mouse. Rothmuller got replaced with John Couch, Apple’s 54th employee. Trip Hawkins got a great education in marketing on that team; he would later found Electronic Arts, one of the biggest video game publishers in the world.

Larry Tesler, from the Stanford AI Lab and then Xerox PARC, joined the team to run the system software group. He’d been on the ARPANET since writing Pub, an early markup language, and was instrumental in the Gypsy word processor, Smalltalk, and inventing copy and paste. Makes you feel small to think of some of this stuff.

Bruce Daniels, one of the Zork creators from MIT, joined the team from HP as the software manager. 

Wayne Rosing, formerly of Digital and Data General, was brought in to design the hardware. He’d later lead the SPARC team at Sun and then become a VP of Engineering at Google.

The team grew. They brought in Bill Dresselhaus as a principal product designer for the look, the feel, and even the packaging. They started with a user interface and then created the hardware and applications.

Eventually there would be nearly 100 people working on the Lisa project, and it would run over $150 million in R&D. After 4 years they were still facing delays, and while Jobs had become more and more involved, he was removed from the project. The personal accounts I’ve heard sound closer to other large, out-of-control projects I’ve seen at companies, though.

The Apple II used that MOS 6502 chip. And life was good. The Lisa used the Motorola 68000 at 5 MHz. This was a new architecture to replace the 6800. It was time to go 32-bit. 

The Lisa was supposed to ship with between 1 and 2 megabytes of RAM. It had a built-in 12 inch screen that was 720 x 364. 

They got to work building applications, releasing LisaWrite, LisaCalc, LisaDraw, LisaGraph, LisaGuide, LisaList, LisaProject, and LisaTerminal. They translated it to British English, French, German, Italian, and Spanish. 

All the pieces were starting to fall into place. But the project kept growing. And the delays kept coming. Jobs got booted from the Lisa project amidst concerns it was bloated, behind schedule, wasting company resources, and that his perfectionism would result in a product that could never ship. The machine would cost $9,995 when it shipped.

Thing is, as we’ll get into later, every project went over budget and ran into delays for the next decade. Great ideas could then be capitalized on by others - even if a bit watered down. Some projects need to teach us how not to do projects - improve our institutional knowledge about the project or product discipline. That didn’t exactly happen with Lisa. 

We see times in the history of computing, and technology for that matter, when a product is just too far ahead of its time. That was the Xerox Alto. As costs come down, ideas like it can be brought to a larger market. That should have been the Lisa. But it wasn’t. While it cost nearly half as much as a Xerox Star, it sold less than half the number of units.

Following the release of the Lisa, we got other desktop metaphors and graphical interfaces: the Agat out of the Soviet Union, SGI, Visi On from VisiCorp (makers of VisiCalc), GEM from Digital Research, DeskMate from Tandy, Amiga Intuition, the Acorn Master Compact, Arthur for the ARM, and the initial releases of Microsoft Windows. By the late 1980s the graphical interface was ubiquitous, and computers were easier for the novice to use than they’d ever been before.

But developers didn’t flock to the system as they’d done with the Apple II. You needed a specialized development workstation, so why would they? People didn’t understand the menuing system yet. As someone who’s written command line tools, sometimes they’re just easier than burying buttons in complicated graphical interfaces.

“I’m not dead yet… just… badly burned. Or sick, as it were.” Apple released the Lisa 2 in 1984. It went for about half the price and was a little more stable. One reason was that the Twiggy disk drives Apple built for the Lisa were replaced with Sony microfloppy drives. This looked much more like what we’d get with the Mac, only with expansion slots. 

The end of the Lisa project was more of a fizzle. After the original Mac was released, Lisa shipped as the Macintosh XL, for $4,000. Sun Remarketing built MacWorks to emulate the Macintosh environment and that became the main application of the Macintosh XL. 

Sun Remarketing bought 5,000 of the Mac XLs and improved them somewhat. The last 2,700 Lisa computers were buried in a landfill in Utah in 1989. Like the whole project, they ended up being a write-off - Apple traded them out for a deep discount on the Macintosh Plus. By then Steve Jobs was long gone, Apple was all about the Mac, and the next year General Magic would begin ushering in the era of mobile devices.

The Lisa was a technical marvel at the time and a critical step in the evolution of the desktop metaphor - by then nearly twenty years old, beginning at SRI on NASA and ARPA grants, evolving further at PARC when members of the team went there, and continuing on at Apple. The lessons learned in the Lisa project were immense and helped inform the evolution of the next project, the Mac. But might the product have actually gained traction in the market if Steve Jobs had not been telling people within Apple and outside it that the Mac was the next big thing, while the Apple II line was still accounting for most of the revenue of the company? There’s really no way to tell. The Mac used a newer Motorola 68000 at nearly 8 megahertz, so it was faster; the OS was cleaner; the machine was prettier. It was smaller and boxier, like the newer Japanese cars at the time. It was just better. But it probably couldn’t have been if not for the Lisa.

Lisa was slower than it was supposed to be. The operating system tended to be fragile. There were recalls. Steve Jobs was never afraid to cannibalize a product to make the next awesome thing. He did so with Lisa. If we step back and look at the Lisa as an R&D project, it was a resounding success. But as a public company, the shareholders didn’t see it that way at the time. 

So next time there’s an R&D project running amok, think about this. The Lisa changed the world, ushering in the era of the graphical interface - all for the low cost of $50 million after sales of the device are taken out of it. But they had to start anew with the Mac and only bring in the parts that worked. They had built up too much technical debt while developing the product to do anything else. While it can be painful, sometimes it’s best to start with a fresh circuit board and a blank editor. Then we can truly step back and figure out how we want to change the world.


Apple and NeXT Computer

     2/15/2021

Steve Jobs had an infamous split with the board of directors of Apple and left the company shortly after the release of the original Mac. He was an innovator who, at 21 years old, had started Apple in the garage with Steve Wozniak and who, at 30 years old and already plenty wealthy, felt he still had more to give and do. We can say a lot of things about him, but he was arguably one of the best product managers ever.

He told Apple he’d be taking some “low-level staffers” and ended up taking Rich Page, Bud Tribble, Dan'l Lewin, George Crow, and Susan Barnes, who would be the CFO. They also took Susan Kare and Joanna Hoffman. The team had their eyes on a computer that specifically targeted higher education. They wanted to build computers for researchers and universities.

Companies like CDC and Data General had done well in universities. The team knew there was a niche that could be carved out there. There were some gaps in the Mac that made it a hard sell in research environments. Computer scientists needed object-oriented programming and protected memory. Having seen the work on object-oriented languages at PARC, Jobs knew it was a powerful, future-proof approach.

Unix System V had branched a number of times and it was a bit more of a red ocean than I think they realized. But Jobs put up $7 million of his own money to found NeXT Computer. He’d add another $5 million and Ross Perot would add another $20 million. The pay bands were among the most straightforward of any startup ever founded: the senior staff made $75,000 and everyone else got $50,000. Simple.

Ironically, so soon after the 1984 Super Bowl ad where Jobs bashed IBM, they hired the man who designed the IBM logo, Paul Rand, to design a logo for NeXT. They paid him $100,000 flat. Imagine the phone call when Jobs called IBM to get them to release Rand from a conflict of interest in working with them.

They released the first computer in 1988. The NeXT Computer, as it was called, was expensive for the day, coming in at $6,500. It sported a Motorola 68030 CPU and clocked in at a whopping 25 MHz. And it came with a special operating system called NeXTSTEP.

NeXTSTEP was based on the Mach kernel, with some of the source code coming from BSD. If we go back a little: Unix was started at Bell Labs in 1969 and by the late 70s had forked into BSD, Unix Version 7, and PWB - with each of those resulting in other forks that would eventually become OpenBSD, SunOS, NetBSD, Solaris, HP-UX, AIX, and countless others, as well as inspiring Linux.

Mach was developed at Carnegie Mellon University and is one of the earliest microkernels. With Mach, Richard Rashid (who would later found Microsoft Research) and Avie Tevanian were looking specifically at distributed computing. The Mach project was kicked off in 1985, the same year Jobs left Apple.

Mach was backwards-compatible with BSD 4.2 and so could run a pretty wide variety of software. It allowed for threads, or units of execution, and tasks, the containers of resources that threads run inside. It provided support for messages - which, for object-oriented languages, are typed data objects that fall outside the scope of tasks and threads - and protected message queues, called ports, to manage the messages between tasks and the rights to access them. They stood it up on a DEC VAX and released it publicly in 1987.
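
To make that task, port, and message model concrete, here is a minimal sketch in C using the Mach primitives as they survive in macOS today, which inherited them by way of NeXTSTEP. mach_port_allocate and mach_msg are the real calls, but treat the program as an illustration of the model rather than 1987-era Mach code: it allocates a port (a protected message queue), sends itself a message, and receives it back.

    #include <mach/mach.h>
    #include <stdio.h>

    /* A Mach message is a header plus an optional body. This one is header-only. */
    typedef struct {
        mach_msg_header_t header;
    } simple_msg_t;

    /* Received messages carry a kernel-appended trailer, so the
       receive buffer needs room for it. */
    typedef struct {
        mach_msg_header_t  header;
        mach_msg_trailer_t trailer;
    } simple_rcv_msg_t;

    int main(void) {
        mach_port_t port;

        /* Ask the kernel for a receive right on a new port in our own task. */
        if (mach_port_allocate(mach_task_self(), MACH_PORT_RIGHT_RECEIVE,
                               &port) != KERN_SUCCESS) {
            fprintf(stderr, "mach_port_allocate failed\n");
            return 1;
        }

        /* Build a message addressed to that port. MAKE_SEND mints a send
           right from our receive right for this message. */
        simple_msg_t msg = {0};
        msg.header.msgh_bits        = MACH_MSGH_BITS(MACH_MSG_TYPE_MAKE_SEND, 0);
        msg.header.msgh_size        = sizeof(msg);
        msg.header.msgh_remote_port = port;
        msg.header.msgh_local_port  = MACH_PORT_NULL;
        msg.header.msgh_id          = 42;

        /* Enqueue the message on the port. */
        mach_msg(&msg.header, MACH_SEND_MSG, sizeof(msg), 0,
                 MACH_PORT_NULL, MACH_MSG_TIMEOUT_NONE, MACH_PORT_NULL);

        /* Pull the message back off the queue the kernel protects for us. */
        simple_rcv_msg_t rcv = {0};
        mach_msg(&rcv.header, MACH_RCV_MSG, 0, sizeof(rcv),
                 port, MACH_MSG_TIMEOUT_NONE, MACH_PORT_NULL);

        printf("received message id %d\n", rcv.header.msgh_id);
        return 0;
    }

In a real system the sender and receiver would be separate tasks, and the kernel would check the port rights on both ends - that rights checking is what made the message queues “protected.”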

Here’s the thing: Unix licensing from Bell Labs was causing problems, so it was important to everyone that the license be open. And this would be important to NeXT as well. NeXT needed a next-generation operating system, and so Avie Tevanian was recruited to join NeXT as Vice President of Software Engineering. There, he designed NeXTSTEP with a handful of engineers.

The computers had custom boards and were fast. And they were a sleek black like nothing I’d seen before. Bill Gates was not impressed, quipping, “If you want black, I’ll get you a can of paint.” But some people loved the machines, and especially some of the tools NeXT developed for programmers.

They got a factory to produce the machines, but it only needed to crank out 100 a month as opposed to the thousands it was built to produce. In other words, the price tag was keeping universities from buying the machines. So they pivoted a little. They went up-market with the NeXTcube in 1990, which ran NeXTSTEP, OPENSTEP, or NetBSD and came with the Motorola 68040 CPU. This machine came in at $8,000 to almost $16,000. It came with a hard drive. For the lower end of the market they also released the NeXTstation in 1990, which shipped for just shy of $5,000.

The new models helped but by 1991 they had to lay off 5 percent of the company and another 280 by 1993. That’s when the hardware side got sold to Canon so NeXT could focus exclusively on NeXTSTEP.  That is, until they got acquired by Apple in 1997.

By the end, they’d sold around 50,000 computers. Apple bought NeXT for $429 million in cash plus 1.5 million shares of Apple stock, which was trading at about $17 a share at the time - worth another $25 and a half million dollars. That makes the deal worth roughly $454 million, or about $9,080 per machine NeXT had ever built. But it wasn’t about the computer business, which had already been spun down. It was about Jobs and getting a multi-tasking, object-oriented powerhouse of an operating system: the grandparent of OS X and the derivative macOS, iOS, iPadOS, watchOS, and tvOS forks.
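
As a quick back-of-the-envelope check on those numbers, using the cash, share count, share price, and unit figures above:

\[ \$429\text{M} + 1{,}500{,}000 \times \$17 \;\approx\; \$454\text{M}, \qquad \frac{\$454\text{M}}{50{,}000 \text{ machines}} \;\approx\; \$9{,}080 \text{ per machine.} \]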

The work done at NeXT has had a long-term impact on the computer industry as a whole. For one, the spinning pinwheel on a Mac. And the Dock. And the App Store. And Objective-C. But also Interface Builder as an IDE was revolutionary. Today we use Xcode. But many of the components go back all the way. And so much more. 

After the acquisition, NeXT became Mac OS X Server in 1999 and by 2001 was Mac OS X. The rest there is history. But the legacy of the platform is considerable. Just on NeXTSTEP we had a few pretty massive successes.

Tim Berners-Lee developed the first web browser, WorldWideWeb, in NeXTSTEP on a NeXT machine. Other browsers for other platforms would come, but his work became the web as we know it today. The machine he developed the web on is now on display at the National Museum of Science and Media in the UK.

We also got games like Quake, Heretic, Strife, and Doom, built with the help of Interface Builder. And WebObjects. And the people.

Tevanian came with NeXT to Apple as the Senior Vice President of Software Engineering. Jobs became an advisor, then CEO. Craig Federighi came with the acquisition as well - now Apple’s Senior Vice President of Software Engineering. And I know dozens of others who came in from NeXT and helped reshape the culture at Apple.

Next.com still redirects to Apple.com. It took three years to ship that first computer at NeXT. It took 2 1/2 years to develop the iPhone. The Apple II, iPod, iPad, and first iMac each took much less, and the original Mac nearly 5 years. Some things take a little more time to flesh out than others. Some need the price of components to come down, or new components to show up, before you know a product can be insanely great. Some need false starts. Steve Jobs famously said in 1983 that Apple wanted to create a computer in a book. That finally came out with the release of the iPad in 2010, 27 years later.

And so the final component of the Apple acquisition of NeXT to mention is Steve Jobs himself. He didn’t initially come in. He’d just become a billionaire off Pixar and was doing pretty darn well. His arrival back at Apple signified the end of a long drought for the company, and all those products we mentioned, plus the iTunes music store and the App Store (both initially built on WebObjects), would change the way we consume content forever. His impact was substantial. For one, without him, after factoring in stock splits, the company might still be trading at 22 cents a share. Instead they’re the most highly valued company in the world. But that pales in comparison to the way he, his teams, and that relentless eye for product and design actually changed the world. And the way his perspectives on privacy help protect us today, long after he passed.

The hero’s journey, as described by Joseph Campbell, is a storytelling template that follows a hero from disgrace, through learning the mistakes of their past and reinventing themselves amidst a crisis on a grand adventure, to returning home transformed. NeXT and Pixar represent part of that journey here. Which makes me wonder: what is my own monomyth? Where will I return to? What is or was my abyss? These can be large or small. And while very few people in the world will have one like Steve Jobs did, we should all reflect on ours and learn from them. And yes, that was plural, because life is not so simple that there is only one.

The past, and our understanding of it, predicts the future. Good luck on your journey. 


Apple's Lost Decade

     2/12/2021

I often think of companies in relation to their contribution to the next evolution in the forking and merging of disciplines in computing that brought us to where we are today. Many companies have multiple contributions. Few have as many such contributions as Apple. But there was a time when they didn’t seem so innovative. 

This lost decade began about halfway through the tenure of John Sculley and can be seen through the lens of the CEOs. There was Sculley, CEO from 1983 to 1993. The co-founders and spiritual centers of Apple, Steve Jobs and Steve Wozniak, left Apple in 1985 - Jobs to create NeXT, and Wozniak to jump into a variety of ventures, like making universal remotes, wireless GPS trackers, and other adventures.

This meant Sculley was finally in a position to be fully in charge of Apple. His era would see sales grow tenfold, from $800 million to $8 billion. Operationally, he was one of the more adept CEOs at cash management, putting $2 billion in the bank by 1993. Suddenly the vision of Steve Jobs was paying off. That original Mac started to sell and grow markets. But during this time, first the IBM PC and then the clones, all powered by the Microsoft operating system, completely took the operating system market for personal computers. Apple had high margins yet struggled for relevance.

Under Sculley, Apple released HyperCard and funded a skunkworks team in General Magic - arguably the beginning of ubiquitous computing - and, using many of those same ideas, he backed the Newton, coining the term personal digital assistant. Under his leadership, Apple marketing sent 200,000 people home with a Mac to try it out. Putting the device in the hands of the people is probably one of the more important lessons they still teach newcomers who work in Apple Stores.

Looking at the big financial picture, it seems like Sculley did alright. But in Apple’s fourth-quarter earnings call in 1993, they announced a 97 percent drop in profit from the same quarter in 1992. This was also when a serious technical debt problem began to manifest itself.

The Mac operating system grew from the system those early pioneers built in 1984, with Macintosh System Software going from version 1 to version 7. But after the annual releases leading to version 6, it took 3 years to develop System 7, and the direction to take with the operating system caused a schism in Apple engineering around what would happen once 7 shipped. It seems like most companies go through almost the exact same schism. Microsoft quietly grew NT to resolve their issues with Windows 3 and 95 until it finally became the thing in 2000. IBM had invested heavily into basically that same code with OS/2 Warp - but wanted something new.

Something happened while Apple was building System 7. They lost Jean-Louis Gassée, who had been head of development since Steve Jobs left. When Sculley gave everyone a copy of his memoir, Gassée provided a copy of The Mythical Man-Month, from Fred Brooks’ experience with the IBM System/360. It’s unclear today if anyone read it. To me this is really the first big sign of trouble. Gassée left to build another OS, BeOS.

By the time System 7 was released, it was clear that the operating system was bloated and needed a massive object-oriented overhaul. Under Sculley the teams were split, with one team eventually getting spun off into its own company that later became part of IBM to help with their OS woes. The team at Apple took 6 years to release the next operating system. Meanwhile, one of Sculley’s most defining decisions was to avoid licensing the Macintosh operating system - probably because it was just too big a mess to do so. And yet everyday users didn’t notice all that much, and most loved it.

But third-party developers left. And that was at one of the most critical times in the history of personal computers, because Microsoft was gaining a lot of developers for Windows 3.1 and then released the wildly popular Windows 95.

The Mac accounted for most of the revenue of the company, but under Sculley the company dumped a lot of R&D money into the Newton. As with other big projects, the device took too long to ship and when it did, the early PDA market was a red ocean with inexpensive competitors. The Palm Pilot effectively ended up owning that pen computing market. 

Sculley was a solid executive. And he played the part of visionary from time to time. But under his tenure Apple faced operating system problems, rumors about Windows 95, and developers leaving Apple behind for the Windows ecosystem - and whether those technical issues were on his lieutenants or on him, the buck stops there. The Windows clone industry led to PC price wars that caused Apple revenues to plummet. And so Markkula was off to find a new CEO.

Michael Spindler became CEO from 1993 to 1996. The failures of the Newton and Copland operating systems are placed at his feet, even though both began in the previous regime. Markkula had hired Digital Equipment and Intel veteran Spindler to assist in European operations; he rose to President of Apple Europe and then ran all international operations. He would become the only CEO to have no new Mac operating system released in his tenure. Missed deadlines abounded with Copland and then Tempo, which would become Mac OS 8.

And those aren’t the only products that came out at the time. We also got the PowerCD, the Apple QuickTake digital camera, and the Apple Pippin, for which Bandai had begun trying to develop a video game system around a scaled-down version of the Mac. The Pippin realized Markkula’s idea, from when the Mac was first conceived, of an Apple video game system.

There were a few important things that happened under Spindler though. First, Apple moved to the PowerPC architecture. Second, he decided to license the Macintosh operating system to companies wanting to clone the Macintosh. And he held discussions with IBM, Sun, and Philips about acquiring Apple. Dwindling reserves, increasing debt. Something had to change, and within three years Spindler was gone.

Gil Amelio was CEO from 1996 to 1997. He moved from Apple’s board, where he sat while CEO at National Semiconductor, to become CEO of Apple. He inherited a company short on cash and high on expenses. He quickly began pushing OS 8 forward, cutting a third of the staff, streamlining operations, dumping some poor-quality products, and releasing new products Apple needed to be competitive, like the Apple Network Server.

He also tried to acquire BeOS for $200 million, which would have brought Gassée back, but instead acquired NeXT for $429 million. And despite the good trajectory he had the company on, the stock was still dropping, Apple continued to lose money, and an immovable force was back - now with another decade of experience launching two successful companies: NeXT and Pixar.

The end of the lost decade can be seen as the return of Steve Jobs. Apple didn’t have a next-generation operating system. They were in a lurch, so to speak. I’ve seen it portrayed that Steve Jobs intended to take control of Apple. And I’ve seen it portrayed that he was happy digging up carrots in the backyard but came back because he was inspired by Jony Ive. But I remember the feel around Apple changed when he showed back up on campus. As with other companies that dug themselves out of a lost decade, there was a renewed purpose. There was inspiration.

By 1997, one of the heroes of the personal computing revolution, Steve Jobs, was back. But not quite… He became interim CEO in 1997 and immediately turned his eye to making Apple profitable again. Over the previous decade, the product line had expanded to include dozens of models of the Mac. Anyone who’s read Geoffrey Moore’s Crossing the Chasm, Inside the Tornado, and Zone To Win knows this story all too well. We grow, we release new products, and then we eventually need to take a look at the portfolio and make some hard cuts.

Apple released the Macintosh II in 1987, then the Macintosh Portable in 1989, then the IIcx and IIci in ’89 along with the Apple IIgs, the last of that series. Facing competition in different markets, we saw the LC line come along in 1990 and the Quadra in 1991, the same year three models of the PowerBook were released. Different printers, scanners, and CD-ROM drives had come along by then, and in 1993 we got the Macintosh TV, the Apple Newton, and more models of the LC. By 1994 there were even more of those, plus the QuickTake, the Workgroup Server, and the Pippin. By 1995 there were a dozen Performas, half a dozen Power Macintosh 6400s, the Apple Network Server, and yet another version of the Performa 6200. And we added the eMate and the beige G3 in 1997. The SKU list was a mess. Cleaning that up took time but helped prepare Apple for a simpler sales process. Today we have a good, better, best with each device, with many a computer being build-to-order.

Jobs restructured the board, ending the long tenure of Mike Markkula, who’d been so impactful at each stage of the company so far. One of the forces behind the rise of the Apple computer and the Macintosh was about to change the world again, this time as the CEO. 

