'appleii' Episodes

SimCity

     7/29/2020

SimCity is one of those games that helped expand the public's collective consciousness. 

I have a buddy who works on traffic flows in Minneapolis. When I asked how he decided to go into urban planning, he quickly responded with “playing SimCity.” Imagine that, a computer game inspiring a generation of people that wanted to make cities better. How did that come to be?

Will Wright was born in 1960. He went to Louisiana State University, then Louisiana Tech, and then to The New School in New York. By then, he was able to get an Apple II+ and start playing computer games, including Life, a game conceived by mathematician John Conway in 1970 that expanded the mind of everyone who came in contact with it. Life had begun on the PDP, then spread through BBC BASIC and beyond. It let players set an initial configuration of cells and watch them mutate over time. 
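Part of why Life spread across so many machines is that its rules fit in a few lines of code. Here's a minimal sketch in Python (Wright's port would have been in Applesoft BASIC; everything here, including the function name, is just illustrative):

```python
from collections import Counter

def step(live):
    """Advance one generation of Conway's Life. `live` is a set of (x, y) cells."""
    # Count how many live neighbors each cell on the grid has.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbors,
    # or 2 live neighbors and was already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker": three cells in a row oscillate between horizontal and vertical.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(blinker))        # vertical: {(1, 0), (1, 1), (1, 2)}
print(step(step(blinker)))  # back to horizontal
```

Set any starting pattern and keep calling `step` to watch it mutate, which is all the original game was.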

After reading about Life, Wright wanted to port it to his Apple, so he learned Applesoft BASIC and Pascal. He tinkered, and by 1984 was able to produce a game called Raid on Bungeling Bay. As many a Minecrafter can tell you, part of the fun was building the islands in a map editor he'd built for the game. He happened to also be reading about urban planning and system dynamics, and he just knew there was something there. Something that could take part of the fun of Life and of editing maps in games, combine it with this newfound love of urban planning, and give it to regular humans. Something that just might expand our mental models about where we live and about games. 

This led him to build software that gamified editing maps, where every choice we made impacted the map over time, and where it was on us to build the perfect map. That game was called Micropolis and would become SimCity. One problem: none of the game publishers wanted to produce it when it was ready for the Commodore 64 in 1985. After Brøderbund turned him down, he had to go back to the drawing board. 

So Wright teamed up with his friend Jeff Braun, and together they founded Maxis Software in 1987. They released SimCity in 1989 for Mac and Amiga and, once it had been ported, for the Atari ST, DOS-based PCs, and the ZX Spectrum. Brøderbund did eventually agree to distribute it as it matured. 

And people started to get this software, staring at a blank slab of land where we zone areas as commercial and residential. We tax those areas and can adjust the rates, giving us money to zone other areas; provide electricity, water, and other services; and then build parks, schools, hospitals, police stations, and more. The denser and more populous the city becomes, the more difficult the game gets. The population fluctuates and we can tweak settings to grow and shrink the city. I was always playing to grow, until I realized sometimes it's nice to stabilize and look for harmony instead.

And we see the evolution over time. The initial choices we made could impact the ability to grow forever. But unlike Life we got to keep making better and better (or worse and worse) choices over time. We delighted in watching the population explode. In watching the city grow and flourish. And we had to watch parts of our beloved city decay. We raised taxes when we were running out of money and lowered them when population growth was negatively impacted. We built parks and paid for them. We tried to make people love our city. 
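That loop of taxing, spending, and watching the population respond is classic system dynamics: taxes fund services, services attract residents, and high taxes drive them away. A toy sketch of that feedback, with coefficients invented purely for illustration (they bear no relation to SimCity's actual model):

```python
def simulate(years, tax_rate, population=1000.0, funds=0.0):
    """A toy city-growth feedback loop. All coefficients are made up."""
    history = []
    for _ in range(years):
        funds += population * tax_rate       # tax revenue scales with population
        funds -= population * 0.05           # per-capita cost of services
        services = max(funds, 0.0) / 1000.0  # funded services attract residents
        # Growth rises with services and falls as the tax rate climbs.
        growth = 0.02 + 0.01 * services - 0.2 * tax_rate
        population *= 1.0 + growth
        history.append(round(population))
    return history

# Modest taxes let the toy city grow; a punishing rate shrinks it.
print(simulate(10, tax_rate=0.07))
print(simulate(10, tax_rate=0.20))
```

Even this crude loop shows the tension the game teaches: raise taxes too far and growth goes negative before the service budget can catch up.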

We're limited in how great a city we can build only by our own creativity, and by our ability to place the parts of the city alongside the features that let people live in harmony with the economic and ecological impacts of other buildings and zones. For example, build a power plant as far from residential buildings as you can, because people don't want to live right next to one. But running power lines is expensive, so it can't be too far away in the beginning. 

The game mechanics motivate us to push the city into progress. To build. To develop. People choose to move to our cities based on how well we build them. It was unlike anything else out there. And it was a huge success. 

SimCity 2000 came along in 1993. Graphics had come a long way, and you could now see decay in the icons of buildings. It expanded the types of power plants we could build and added churches, museums, prisons, and zoos, each with an impact on the way the city grows. As the development team's understanding of both programming and urban planning grew, they added city ordinances. The game got more and more popular. 

SimCity 3000, the third installment in the series, came out in 1999. By then, the game had sold over 5 million copies. That's when they added slums and median incomes to create class distinctions. And large malls, which negatively impact smaller commercial zones. And toxic waste conversion plants. And prisons, which hit residential areas. And casinos, which increase crime. But each has a huge upside as well. As graphics cards continued to get better, the simulation grew richer, giving us waterfalls, different types of trees, more realistic grass, and even snow. 

Maxis even dabbled in using their software to improve businesses. Maxis Business Simulations built simulations for refineries and health care as well. 

And then came The Sims, which Wright thought of after losing his house to a fire in 1991. Here, instead of simulating a whole city of people at once, we simulated a single person, or Sim, and attempted to lead a fulfilling life by satisfying our Sim's needs and desires: buying furniture, building larger homes, having a family, and just… well, living life. But the board at Maxis didn't like the idea. Maxis was acquired by Electronic Arts in 1997, and EA was far more into the idea, so The Sims was released in 2000. It has sold nearly 200 million copies and raked in over $5 billion in sales, making it one of the best-selling games of all time. Even though now it's free on mobile devices with tons of in-app purchases… 

And after the acquisition of Maxis, SimCity is now distributed by EA. SimCity 4 came along in 2003, continuing to improve the complexity and gameplay. And with processors getting faster, cities could get far bigger and more complex. The 2013 reboot, simply titled SimCity, came from lead designer Stone Librande and team. They added a Google Earth style zoom effect for viewing cities and some pretty awesome road creation tools. The sounds of cars honking on streets, birds chirping, airplanes flying over, and fans cheering in stadiums were amazing. They added layers, so you could look at a colorless model of the city highlighting crime or pollution, making each of the main aspects of the game easier to track. Like layers in Photoshop. It was pretty CPU and memory intensive, but came with some pretty amazing gameplay. In fact, some of its neighborhood planning has been used to simulate neighborhood development efforts in real cities. 

And the game spread to consoles and other platforms as well, coming to the iPhone and web browsers in 2008. I personally choose not to play anymore because I'm not into in-app purchasing. 

A lot of science fiction centers on two major themes: societies entering a phase of either utopia or dystopia. The spread of computing, first into our living rooms in the form of PCs and then into our pockets via mobile devices, has helped push us in the utopian direction. 

SimCity inspired a generation of city planners and was itself inspired by increasingly mature research on urban planning. A great step on the route to a utopia, and eye-opening as to the impact our city planning has on any slide toward a dystopian future. We were all suddenly able to envision better city planning and design, making cities friendlier for walking, biking, and being outdoors. Living better. Which is important in a time of continued mass urbanization. 

Computer games could now be about more than moving a dot with a paddle or controlling a character to shoot other characters. Other games with eye-opening, mind-expanding gameplay were feasible, like Sid Meier's Civilization, which came along in 1991. But SimCity, like Life, was another major step on the way to where we are today. And it's so relatable now that I've owned multiple homes and seen the impact of tax rates and of the services the governments in those areas provide. 

So thank you to Will Wright, for inspiring better cities. And thank you to the countless developers, designers, and product managers for continuing the great work at Maxis and then EA. 


Apple: The Apple I computer to the ///

     1/30/2021

I've been struggling with how to cover a few different companies, topics, and movements for a while. The lack of coverage thus far has little to do with their impact; it's about finding where to put them in the history of computing. One of the most challenging is Apple. That's because there isn't just one Apple. Instead there are stages of the company, each with its own place in the history of computers. 

Today we can think of Apple as one of the Big 5 tech companies, which include Amazon, Apple, Google, Facebook, and Microsoft. But there were times in the evolution of the company where things looked bleak. Like maybe they would get gobbled up by another tech company. To oversimplify the development of Apple, we’ll break up their storied ascent into four parts:

  • Apple Computers: This story runs from the mid-1970s to the mid-1980s and covers Apple rising out of the hobbyist movement and into a gangbuster IPO. The Apple I through III families all centered on one family of chips and took the company into the 90s.
  • The Macintosh: The rise and fall of the Mac covers the introduction of the now-iconic Mac through to the Power Macintosh era. 
  • Mac OS X: This part of the Apple story begins with the return of Steve Jobs to Apple and the acquisition of NeXT, looks at the introduction of the Intel Macs and takes us through to the transition to the Apple M1 CPU.
  • Post PC: Steve Jobs announced the "post-PC" era in 2007, and in the coming years PC sales fell for the first time, while tablets, phones, and other devices emerged as the primary computing devices people used. 

We'll start with the early days, which I think of as the first of the four key stages of Apple's development. And those early days go back far past the days when Apple was hawking the Apple I. They go back to high school.

Jobs and Woz

Bill Fernandez and Steve Wozniak built a computer they called "The Cream Soda Computer" in 1970, when Bill was 16 and Woz was 20. It was a crude punch-card processing machine built from parts Woz got from the company he was working for at the time.

Fernandez introduced Steve Wozniak to a friend from middle school because they were both into computers and both had a flair for pranky rebelliousness. That friend was Steve Jobs. 

By 1972, the pranks turned into their first business. Wozniak designed blue boxes, devices initially conceived by John Draper, aka Cap'n Crunch, who got his phreaker name from a whistle in a Cap'n Crunch cereal box that produced a 2600 Hz tone, the frequency that put AT&T phones into operator mode. Draper would actually be an Apple employee for a bit. Jobs and Wozniak designed a digital version and sold a few thousand dollars' worth. 
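At its core, the trick was just signal generation: play a pure 2600 Hz sine wave down the line. Here's a sketch of synthesizing a second of that tone in Python; the 8 kHz sample rate and 8-bit sample format are arbitrary modern choices for illustration, nothing Draper or Wozniak actually used:

```python
import math

# One second of a 2600 Hz sine tone as 8-bit unsigned PCM samples.
SAMPLE_RATE = 8000  # samples per second (arbitrary illustrative choice)
FREQ = 2600         # the in-band frequency AT&T used for trunk signaling

samples = bytes(
    int(127 + 127 * math.sin(2 * math.pi * FREQ * t / SAMPLE_RATE))
    for t in range(SAMPLE_RATE)  # one second of audio
)

print(len(samples))  # 8000
```

The `samples` buffer could be written out as a .wav with the standard-library `wave` module; Wozniak's insight was generating the tones digitally in hardware, which made them far more precise than analog oscillators.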

Jobs went to Reed College. Wozniak went to Berkeley. Both dropped out. 

Woz got a sweet gig at HP designing calculators; Jobs had worked a summer job there in high school. Jobs, meanwhile, traveled to India to find enlightenment. When Jobs became employee number 40 at Atari, he got Wozniak to help create Breakout. That was the year the Altair 8800 was released, and Wozniak went to the first meeting of a little club called the Homebrew Computer Club in 1975, when they got an Altair so the People's Computer Company could review it. That was the inspiration. Having already built one computer with Fernandez, Woz designed schematics for another. Going back to the Homebrew meetings to talk through ideas and nerd out, he got it built and, proud of his creation, returned to Homebrew with Jobs to give out copies of the schematics for everyone to play with. This was the age of hackers and hobbyists. But that was about to change, ever so slightly. 

The Apple I 

Jobs had this idea: what if they sold the boards? They came up with a plan. Jobs sold his VW Microbus, Wozniak sold his HP-65 calculator, and they got to work. Simple math: they could sell 50 boards for $40 each and make some cash, like they'd done with the blue boxes. But a lot of people didn't know what to do with a bare board. Sure, you just needed a keyboard and a television, but that still seemed a bit much. 

Then a slightly bigger plan: what if they sold 50 full computers? They went to the Byte Shop and talked them into buying 50 at $500 each. They dropped $20,000 on parts and netted a $5,000 return. They'd go on to sell about 200 of the Apple Is between 1976 and 1977.

It came with a MOS 6502 chip running at a whopping 1 MHz and with 4KB of memory, expandable to 8KB. They provided Apple BASIC, as most vendors did at the time. That MOS chip was critical. Before it, many builders used an Intel chip or the Motorola 6800, which went for $175. The MOS 6502 was just $25. It was an 8-bit microprocessor designed by a team Chuck Peddle ran after leaving the 6800 team at Motorola. Armed with that chip at that price, and with Wozniak's understanding of what it needed to do and how it interfaced with other chips to access memory and peripherals, the two could do something new. 

They started selling the Apple I, and to quote an ad: "the Apple comes fully assembled, tested & burned-in and has a complete power supply on-board, initial set-up is essentially 'hassle free' and you can be running in minutes." This really tells you something about the computing world at the time. There were thousands of hobbyists, and many had been selling devices. But this thing had on-board RAM, and you could just add a keyboard and video rather than read LEDs to get output. The marketing descriptions were pretty technical by modern Apple standards, telling us something about the users. It sold for $666.66.

They got help from Patty Jobs building logic boards. Jobs’ friend from college Daniel Kottke joined for the summer, as did Fernandez and Chris Espinosa - now Apple’s longest-tenured employee. It was a scrappy garage kind of company. The best kind. 

They made the Apple I until a few months after they released its successor. But the problem with the Apple I was that only one person could actually support it when customers called: Wozniak. And he was slammed, busy designing the next computer and all the components needed to take it to the mass market: monitors, disk drives, and so on. So they offered a discount to anyone returning an Apple I and destroyed most of the returns. The surviving Apple I computers have since been auctioned for hundreds of thousands of dollars, all the way up to $1.75 million. 

The Apple II

They knew they were on to something. But a lot of people were building computers. They needed capital if they were going to bring in a team and make a go at things. But Steve Jobs wasn’t exactly the type of guy venture capitalists liked to fund at the time.

Mike Markkula was a product-marketing manager at chip makers Fairchild and Intel who retired early after making a small fortune on stock options. That is, until he got a visit from Steve Jobs. He brought money, but more importantly the kind of assistance only a veteran of a successful corporation who'd ridden that wave could bring. He brought in Michael "Scotty" Scott, employee #4, to be the first CEO, and they got to work mapping out an early business plan. If you notice the overlapping employee numbers, Scotty might have had something to do with that…

As Wozniak selling his calculator suggests, computers at the time weren't that far removed from calculators. So Jobs brought in a calculator designer named Jerry Manock to design a plastic injection-molded case, or shell, for the Apple II. They used the same chip and a similar enough motherboard design. They stuck with the default 4KB of memory and provided jumpers to make it easier to go up to 48KB. They added a cassette interface for I/O, a toggle circuit that could trigger the built-in speaker, and two game paddles, similar to the bundles provided by Commodore and other vendors of the day. And of course it still worked with a standard TV - but now that TVs were mostly color, so was the video coming out of the Apple II. All of this came at a starting price of $1,298.

The computer initially shipped with a version of BASIC written by Wozniak, but Apple later licensed Microsoft's 6502 BASIC to ship what they called Applesoft BASIC, short for Apple and Microsoft. Here, they turned to Randy Wigginton, who was Apple's employee #6 and had gotten rides to the Homebrew Computer Club from Wozniak as a teenager (since he lived down the street). He and others added features onto Microsoft BASIC to free Wozniak to work on other projects. They also decided they needed a disk operating system, or DOS. Here, rather than license CP/M, the industry standard at the time, Wigginton worked with Shepardson, who did various projects for CP/M and Atari. 

The motherboard of the Apple II remains an elegant design. There were certain innovations Wozniak made, like cutting down the number of DRAM chips by sharing resources between other components. The design was so elegant that Bill Fernandez had to join them as employee number four, in order to take the board and create schematics to have it silkscreened. The machines were powerful.

All that needed juice. Jobs asked his former boss Al Alcorn for someone to help out with that. Rod Holt, employee number 5, was brought in to design the power supply. By implementing a switching power supply, as Digital Equipment had done in the PDP-11, rather than a transformer-based power supply, the Apple II ended up being far lighter than many other machines. 

The Apple II was released in 1977 at the West Coast Computer Faire. It, along with the TRS-80 and the Commodore PET, would become the 1977 Trinity - which isn't surprising. Remember Peddle, who ran the 6502 design team: he designed the PET. And Steve Leininger was also a member of the Homebrew Computer Club, who happened to work at National Semiconductor when Radio Shack/Tandy started looking for someone to build them a computer. 

The machine was stamped with an Apple logo. Jobs hired Rob Janoff, a local graphic designer, to create it: an apple with rainbow stripes, showing off the Apple II's color graphics. The rainbow Apple stuck and remained the logo for Apple Computer until 1998, after Steve Jobs returned, when the Apple went monochrome - but the silhouette is now iconic, serving Apple for 45 years and counting.

The computers were an instant success and sold quickly. But others were doing well in the market too, some incumbents and some new. Red oceans mean we have to improve our effectiveness. So this is where Apple had to grow up and become a company. Markkula made a plan to get Apple to $500 million in sales in 10 years on the back of his $92,000 investment and another $600,000 in venture funding. 

They did $2.7 million in sales in 1977. The idea of selling a pre-assembled computer to the general public was clearly resonating. Parents could use it to help teach their kids. Schools could use it for the same. And when we were done with all that, we could play games on it, write code in BASIC, or use it for business: documents in WordStar, spreadsheets in VisiCalc, or one of the thousands of other titles available for the Apple II. Sales grew 150x by 1980.

Given that many thought cassettes were for home machines and floppies were for professional machines, it was time to move away from tape. Markkula realized this and had Wozniak design a floppy disk drive for the Apple II, which went on to be known as the Disk II. Wozniak had experience with disk controllers and studied the latest designs available. He again managed a value-engineered design that let Apple produce a good drive for less than any other major vendor at the time. Wozniak would later say it was one of his best designs (and many contemporaries agreed).

Markkula filled gaps as well as anyone. He even wrote free software programs under the name Johnny Appleseed, a name also used for years in product documentation. He was a classic hacker type of entrepreneur on their behalf: sitting in the guerrilla marketing chair some days, acting as president of the company on others, and mentoring Jobs on others still. 

From Hobbyists to Capitalists

Here's the thing - I've always been a huge fan of Apple. Even in their darkest days, which we'll get to in later episodes, they represented an ideal. But going back to the Apple I, they were nothing special. Even the Apple II. Osborne, Commodore, Vector Graphic, Atari, and hundreds of other companies were springing up, inspired first by that Altair and then by the rapid drop in the prices of chips. 

The impact of the 1 megahertz barrier and the cost of those MOS 6502 chips was profound. The MOS 6502 would be used in the Apple II, the Atari 2600, the Nintendo NES, and the BBC Micro, and along with the Zilog Z80 and Intel 8080 would spark a revolution in personal computers. Many of those companies would have disappeared in what we'd think of as a personal computer bubble, had there been more money in it. But those that survived took things an order of magnitude higher. Instead of making millions they were making hundreds of millions. Many would even go to war in a race to the bottom on prices. And this is where Apple started to differentiate themselves from the rest. 

For starters, because the default Altair was so anemic, most of the hobbyist computers were all about expansion. You can see it in the Apple I schematics and in the minimum of 7 expansion slots across the Apple II lineup - all of them except the IIc, marketed as a more portable device, with a handle and an RCA connection to a television for a monitor. 

The media seemed to adore them. In the era of JR Ewing on Dallas, Steve Jobs was just the personality to emerge and still somewhat differentiate the new wave of computer enthusiasts. Coming at the tail end of an era of social and political strife, many saw something of themselves in Jobs. He looked the counter-culture part: he had the hair, but also the drive. The early 80s were going to be all about the yuppies, though - and Jobs was putting on a suit. Many identified with that as well.

Fueled by the 150x sales performance shooting them up to $117M in sales, Apple filed for an IPO, going public in 1980 and creating hundreds of millionaires, including at least 40 of their own employees. It was the biggest IPO since Ford in 1956, the year after Steve Jobs was born. The stock was priced at $14 and shot up to $29 on the first day alone, leaving Apple sitting pretty on a $1.778 billion valuation. 

Scotty, who brought the champagne, made nearly $100M that day. One of the venture capitalists, Arthur Rock, made over $21M on a $57,600 investment. Rock had been the one to convince the Shockley Semiconductor team to found Fairchild, a key turning point in putting the silicon in Silicon Valley. When Noyce and Moore left there to found Intel, he was involved again. And he stayed in touch with Markkula, who was so enthusiastic about Apple that Rock invested and began a stint on Apple's board of directors in 1978. He's often portrayed as the villain in the story of Steve Jobs. But consider something for a moment: Rock was a backer of Scientific Data Systems, purchased by Xerox in 1969 to become the Xerox 500. Certainly not Xerox PARC - in fact, the anti-PARC - but since Rock served on the board of Xerox, it certainly helped connect Jobs to Xerox later.

The IPO Hangover

Money is great to have, but it also causes problems. Teams get sidetracked trying to figure out what to do with their hauls - like Rod Holt's $67M haul that day. It's a distraction at a time when executional excellence is critical. The company has to bring in more people fast, which created a scenario Mike Scott referred to as a "bozo explosion." Suddenly, more people actually makes the company less effective. 

Growing teams all want a seat at a limited table. Innovation falls off as we rush to keep up with the orders and needs of existing customers. Bugs, bigger code bases to maintain, issues with people doing crazy things. 

Taking your eyes off the ball while normalizing that growth is hard. By 1981, Scotty was out after leading some substantial layoffs. Apple stock was down. A big IPO also drives investment into competitors, and some of those would go on a race to the bottom in price. 

Apple didn’t compete on price. Instead, they started to plan the next revolution, a key piece of Steve Jobs emerging as a household name. They would learn what the research and computer science communities had been doing - and bring a graphical interface and mouse to the world with Lisa and a smaller project brought forward at the time by Jef Raskin that Jobs tried to kill - but one that Markkula not only approved, but kept Jobs from killing, the Macintosh. 

Fernandez, Holt, Wigginton, and even Wozniak just drifted away or got lost in the hyper-growth of the company, as is often the case. Some came back. Some didn’t. Many of us go through the same in rapidly growing companies. 

Next (but not yet NeXT)

But a new era of hackers was on the way. And a new movement, as counter to big computer culture as Jobs himself. But first, they needed to take a trip to Xerox. In the meantime, the Apple III was an improvement, but proved that the Apple computer line had run its course. They released it in 1980, recalled the first 14,000 machines, never passed 75,000 units sold, and killed off the line in 1984. A special year. 


The Apple Lisa

     2/2/2021

Apple found massive success on the back of the Apple II. They went public like many of the late-70s computer companies, and the story could have ended there, as it did for many companies of the era that were potentially bigger, had better technology or go-to-market strategies, or were even far more innovative. 

But it didn't. The journey to the next stage began with the Apple IIc, Apple IIGS, and other incrementally better, faster, or smaller models, which funded the research and development of a number of projects. One was a new computer: the Lisa. I bet you thought we were jumping into the Mac next. We're getting there. But twists and turns, as the title suggests. 

The success of the Apple II led to many of the best and brightest minds in computers wanting to go work at Apple. Jobs came to be considered a visionary. The pressure to actually become one has been the fall of many a leader. And Jobs almost succumbed to it as well. 

Some go down due to a lack of vision, others because they don’t have the capacity for executional excellence. Some lack lieutenants they can trust. The story isn’t clear with Jobs. He famously sought perfection. And sometimes he got close. 

The Xerox Palo Alto Research Center, or PARC for short, had been a focal point of raw research and development since 1970. They inherited many great innovations, outlandish ideas, amazing talent, and decades of research from academia and Cold War-inspired government grants. Ever since Sputnik, the National Science Foundation and the US Advanced Research Projects Agency had funded raw research. During Vietnam, that funding dried up and private industry moved in to take products to market. 

Arthur Rock had come into Xerox in 1969, on the back of its investment in Scientific Data Systems. While on the board of Xerox, he got to see the advancements being made at PARC. PARC hired some of the oNLine System (NLS) team, who helped ship the Xerox Alto in 1973; a couple thousand were made. They followed that up with the Xerox Star in 1981, selling about 20,000. And PARC had been at it the whole time, inventing all kinds of goodness. 

And so always thinking of the next computer, Apple started the Lisa project in 1978, the year after the release of the Apple II, when profits were just starting to roll in. 

The story goes that Steve Jobs secured a visit to PARC and walked out with the idea for a windowing personal computer GUI, complete with a desktop metaphor. But not so fast: Apple had already begun the Lisa and Macintosh projects before Jobs visited Xerox. And the Alto had been shown off internally at Xerox in 1977, complete with Mother of All Demos-esque theatrics, on stage using remote computers. Xerox had the GUI, the mouse, and networking - while the computers released that year, the Apple II, the Commodore PET, and the TRS-80, were still doing what Dartmouth, the University of Illinois, and others had been doing since the 60s - just at home instead of on time-sharing computers. 

In other words, enough people in computing had seen the oNLine System out of SRI. The graphical interface was coming and wouldn't be stopped. The mouse had been written about in scholarly journals. But it was all pretty expensive. The visits to PARC, and hiring some of its engineers, helped the teams at Apple solve problems they didn't even know they had. They made things better and helped the team get there a little quicker. But by then the coming evolution in computing was inevitable. 

Still, the Xerox Star was considered a failure. But Apple said "hold my beer" and got to work on the project that would become the Lisa. It started off simply enough: some ideas from Apple executives like Steve Jobs, and then 10 people, led by Ken Rothmuller, to develop a system with windows and a mouse. Rothmuller was replaced by John Couch, Apple's 54th employee. Trip Hawkins got a great education in marketing on that team; he would later found Electronic Arts, one of the biggest video game publishers in the world.

Larry Tesler, from the Stanford AI Lab and then Xerox PARC, joined to run the system software team. He'd been on the ARPANET since writing PUB, an early markup language, and was instrumental in the Gypsy word processor, Smalltalk, and inventing copy and paste. Makes you feel small to think of some of this stuff. 

Bruce Daniels, one of the Zork creators from MIT, joined the team from HP as the software manager. 

Wayne Rosing, formerly of Digital and Data General, was brought in to design the hardware. He'd later lead the SPARC team and then become a VP of Engineering at Google. 

The team grew. They brought in Bill Dresselhaus as principal product designer for the look, the feel, and even the packaging. They started with a user interface and then created the hardware and applications. 

Eventually nearly 100 people were working on the Lisa project, and it would run over $150 million in R&D. After four years they were still facing delays, and while Jobs had become more and more involved, he was removed from the project. The personal accounts I've heard sound closer to other large, out-of-control projects I've seen at companies, though. 

The Apple II used that MOS 6502 chip. And life was good. The Lisa used the Motorola 68000 at 5 MHz. This was a new architecture to replace the 6800. It was time to go 32-bit. 

The Lisa was supposed to ship with between 1 and 2 megabytes of RAM. It had a built-in 12 inch screen that was 720 x 364. 

They got to work building applications, releasing LisaWrite, LisaCalc, LisaDraw, LisaGraph, LisaGuide, LisaList, LisaProject, and LisaTerminal. They translated it to British English, French, German, Italian, and Spanish. 

All the pieces were starting to fall into place. But the project kept growing, and so did the delays. Jobs got booted from the Lisa project amid concerns that it was bloated, behind schedule, and wasting company resources, and that Jobs' perfectionism would result in a product that could never ship. The cost of the machine was over $10,000. 

Thing is, as we'll get into later, every project went over budget and ran into delays for the next decade. Great ideas could then be capitalized on by others, even if a bit watered down. Some projects need to teach us how not to do projects, improving our institutional knowledge of project and product discipline. That didn't exactly happen with Lisa. 

We see times in the history of computing, and of technology for that matter, when a product is just too far ahead of its time. That was the Xerox Alto. As costs come down, we can then bring those ideas to a larger market. That should have been the Lisa. But it wasn't. Though it cost nearly half as much as a Xerox Star, it sold less than half as many units.

Following the release of the Lisa, we got other desktop metaphors and graphical interfaces: the Agat out of the Soviet Union, SGI, Visi On from VisiCorp (makers of VisiCalc), GEM from Digital Research, DeskMate from Tandy, Amiga Intuition, the Acorn Master Compact, Arthur for Acorn's ARM-based machines, and the initial releases of Microsoft Windows. By the late 1980s the graphical interface was ubiquitous, and computers were easier for the novice to use than they'd ever been before. 

But developers didn't flock to the system as they'd done with the Apple II. You needed a specialized development workstation, so why would they? People didn't understand the menuing system yet. And as someone who's written command line tools, I'll admit they're sometimes just easier than burying buttons in complicated graphical interfaces. 

“I’m not dead yet… just… badly burned. Or sick, as it were.” Apple released the Lisa 2 in 1984. It went for about half the price and was a little more stable. One reason was that the Twiggy disk drives Apple built for the Lisa were replaced with Sony microfloppy drives. This looked much more like what we’d get with the Mac, only with expansion slots. 

The end of the Lisa project was more of a fizzle. After the original Mac was released, the Lisa shipped as the Macintosh XL, for $4,000. MacWorks, emulation software that let the machine run the Macintosh environment, became the main application of the Macintosh XL; Sun Remarketing later extended it as MacWorks Plus. 

Sun Remarketing bought 5,000 of the Mac XLs and improved them somewhat. The last 2,700 unsold Lisa computers were buried in a landfill in Logan, Utah in 1989. As the whole project had been, they ended up being a write-off. Apple traded them out for a deep discount on the Macintosh Plus. By then, Steve Jobs was long gone, Apple was all about the Mac, and the next year General Magic would begin ushering in the era of mobile devices. 

The Lisa was a technical marvel at the time and a critical step in the evolution of the desktop metaphor, which was by then nearly twenty years old: it began at Stanford on NASA and ARPA grants, evolved further at PARC when members of the team went there, and continued on at Apple. The lessons learned in the Lisa project were immense and helped inform the evolution of the next project, the Mac. But might the product have actually gained traction in the market if Steve Jobs had not been telling people inside and outside Apple that the Mac was the next thing, while the Apple II line was still accounting for most of the company's revenue? There's really no way to tell. The Mac used a newer Motorola 68000 at nearly 8 megahertz, so it was faster; the OS was cleaner; the machine was prettier. It was smaller and boxier, like the newer Japanese cars at the time. It was just better. But it probably couldn't have been if not for the Lisa.

The Lisa was slower than it was supposed to be. The operating system tended to be fragile. There were recalls. Steve Jobs was never afraid to cannibalize a product to make the next awesome thing, and he did so with the Lisa. If we step back and look at the Lisa as an R&D project, it was a resounding success. But Apple was a public company, and the shareholders didn't see it that way at the time. 

So next time there's an R&D project running amok, think about this. The Lisa changed the world, ushering in the era of the graphical interface, all for the low cost of about $50 million once sales of the device are taken out. But Apple had to start anew with the Mac and bring in only the parts that worked. They had built up too much technical debt while developing the product to do anything else. While it can be painful, sometimes it's best to start with a fresh circuit board and a blank command line editor. Then we can truly step back and figure out how we want to change the world.


(OldComputerPods) ©Sean Haas, 2020