BASIC Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate the future! Today we’re going to look at the history of the BASIC programming language. We say BASIC but really BASIC is more than just a programming language. It’s a family of languages and stands for Beginner’s All-purpose Symbolic Instruction Code. As the name implies, it was written to help students who weren’t math nerds learn how to use computers. When I was selling a house one time, someone was roaming around in my back yard - apparently they’d been to an open house - and they asked if I was a computer scientist after they saw a dozen books I’d written on my bookshelf. I really didn’t know how to answer that question. We’ll start this story with Hungarian John George Kemeny. This guy was pretty smart. He was born in Budapest and moved to the US in 1940 when his family fled anti-Jewish sentiment and laws in Hungary. Some of his family would go on to die in the Holocaust, including his grandfather. But safely nestled in New York City, he would graduate high school at the top of his class and go on to Princeton. Check this out: he took a year off to head out to Los Alamos and work on the Manhattan Project under Nobel laureate Richard Feynman. That’s where he met fellow Hungarian immigrant John von Neumann - two of the group George Marx wrote about in his book on great Hungarian emigrant scientists and thinkers, The Martians. When he got back to Princeton he would get his doctorate and act as an assistant to Albert Einstein. Seriously, THE Einstein. Within a few years he was a full professor at Dartmouth and went on to publish great works in mathematics. But we’re not here to talk about those contributions to making the world an all-around awesome place. You see, by the 60s math was evolving to the point that you needed computers. 
And Kemeny and Thomas Kurtz would do something special. Now Kurtz was another Dartmouth professor who got his PhD from Princeton. He and Kemeny got thick as thieves and wrote the Dartmouth Time-Sharing System (keep in mind that time sharing was all the rage in the 60s, as it gave more and more budding computer scientists access to those computer-things that, prior to the advent of Unix and the PC revolution, had mostly been reserved for the high priests of places like IBM). So time sharing was cool, but the two of them would go on to do something far more important. In 1956, they would write DARSIMCO, or Dartmouth Simplified Code. As with Pascal, you can blame Algol. Wait, no one has ever heard of DARSIMCO? Oh… I guess they wrote that other language you’re here to hear the story of as well. So in 59 they got a half-million-dollar grant from the Alfred P. Sloan Foundation to build a new department building. That’s when Kurtz actually joined the department full time. Computers were just going from big batch-processed behemoths to interactive systems. They tried teaching with DARSIMCO, FORTRAN, and the Dartmouth Oversimplified Programming Experiment, a classic acronym for 1960s-era DOPE. But they didn’t love the command structure, nor the fact that the languages didn’t produce feedback immediately. What was it called? Oh, so in 1964, Kemeny wrote the first iteration of the BASIC programming language and Kurtz joined him very shortly thereafter. They did it to teach students how to use computers. It’s that simple. And as most software was free at the time, they released it to the public. We might think of this as open source-ish by today’s standards. I say ish as Dartmouth actually chose to copyright BASIC. Kurtz has said that the name BASIC was chosen because “We wanted a word that was simple but not simple-minded, and BASIC was that one.” The first program I wrote was in BASIC. BASIC used line numbers and read kinda like the English language. 
The first line of my program said: 10 PRINT “Charles was here” And the computer responded “Charles was here”. The second program I wrote just added a second line that said: 20 GOTO 10 Suddenly “Charles was here” took up the whole screen and I had to ask the teacher how to terminate the signal. She rolled her eyes and handed me a book. And that, my friend, was the end of me for months. That was on an Apple IIc. But a lot happened with BASIC between 1964 and then. As with many technologies, it took some time to float around and evolve. The syntax was kinda like a simplified FORTRAN, making my FORTRAN classes in college a breeze. That initial distribution evolved into Dartmouth BASIC, and they received a $300k grant and used student labor to write the initial BASIC compiler. Mary Kenneth Keller was one of those students and went on to finish her doctorate in 65 along with Irving Tang, the two of them becoming the first PhDs in computer science awarded in the US. After that she went off to Clarke College to found their computer science department. The language is pretty easy. I mean, like Pascal, it was made for teaching. It spread through universities like wildfire during the rise of minicomputers like the PDP from Digital Equipment and the resultant Data General Nova. This led to the first text-based games in BASIC, like Star Trek. And then came the Altair and one of the most pivotal moments in the history of computing: the porting of BASIC to the platform by Microsoft co-founders Bill Gates and Paul Allen. Tiny BASIC appeared soon after, and suddenly everyone needed “a BASIC.” You had Commodore BASIC, BBC BASIC, BASIC for the Trash-80 (the TRS-80), the Apple II, Sinclair, and more. Programmers from all over the country had learned BASIC in college on minicomputers, and when the PC revolution came, a huge part of that was the explosion of applications, most of which were written in… you got it, BASIC! 
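For anyone who never typed on one of those machines, here’s a hedged sketch, in modern Python, of what those two numbered BASIC lines did. The original GOTO loops forever; this sketch caps the loop so it terminates, and that cap is our addition, not part of the original program.

```python
# The classic first BASIC program:
#   10 PRINT "Charles was here"
#   20 GOTO 10
# Line 20 jumps back to line 10 forever. We cap the repetition here so the
# sketch terminates - the cap is illustrative, not part of the original.
def charles_was_here(times=5):
    return ["Charles was here"] * times

for line in charles_was_here():
    print(line)
```

In BASIC the loop only stopped when someone broke the program by hand, which is exactly the screen-filling effect described above.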
I typically think of the end of BASIC coming in 1991, when Microsoft bought Visual Basic off of Alan Cooper and object-oriented programming became the standard. But the things I could do with a simple IF/THEN/ELSE statement. Or a FOR/TO statement, or a WHILE or REPEAT or DO loop. Absolute values, exponential functions, cosines, tangents, even super-simple random number generation. And input and output were just INPUT and PRINT, or LIST for source. Of course, that procedural style of programming was always simpler and more approachable. So there, you now have Kemeny as a direct connection between Einstein and the modern era of computing. Two immigrants that helped change the world. One famous, the other with a slightly more nuanced but probably no less important impact in a lot of ways. Those early BASIC programs opened our eyes. Games, spreadsheets, word processors, accounting, human resources, databases. Kemeny would go on to chair the commission investigating Three Mile Island, a partial nuclear meltdown that was a turning point for nuclear power in the United States. I wonder what Kemeny thought when he read the following on the Statue of Liberty: Give me your tired, your poor, Your huddled masses yearning to breathe free, The wretched refuse of your teeming shore. Perhaps, like many before and after, he thought that he would breathe free and, with that breath, do something great: helping bring the world into the nuclear era and preparing thousands of programmers to write software that would change the world. When you wake up in the morning, you have crusty bits in your eyes and things seem blurry at first. You have to struggle just a bit to get out of bed and see the sunrise. BASIC got us to that point. And for that, we owe Kemeny and Kurtz our sincerest thanks. And thank you, dear listeners, for your contributions to the world in whatever way they may be. You’re beautiful. And of course thank you for giving me some meaning on this planet by tuning in. We’re so lucky to have you, have a great day!
Visual Basic Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to cover an important but often underappreciated step on the path to ubiquitous computing: Visual Basic. Visual Basic is a programming language for Windows. It’s in most every realistic top 10 of programming languages of all time. It certainly split into various functional areas over the last decade or so, but it was how you did a lot of different tasks in Windows automation and programming for two of the most important decades, through a foundational period of the PC movement. But where did it come from? Let’s go back to 1975. This was a great year. The Vietnam War ended, Sony gave us Betamax, JVC gave us VHS. Francisco Franco died. I don’t wish ill on many, but if I could go back in time and wish ill on him, I would. NASA launched a joint mission with the Soviet Union. The UK voted to stay in the European Economic Community. Jimmy Hoffa disappeared. And the Altair shipped. Altair BASIC is like that Lego starter set you buy your kid when you think they’re finally old enough to not swallow the smallest pieces. From there, you buy them more and more, until you end up stepping on those smallest pieces and cursing. Much as I used to find myself frequently cursing at Visual Basic. And such is life. Or at least, such is giving life to your software ideas. No matter the language, there’s often plenty of cursing. So let’s call the Altair a proto-PC. It was underpowered, cheap, and with this Microsoft BASIC programming language you could, OMG, feed it programs that would blink lights, or create early games. That was 1975. And it was based largely on the work of John Kemeny and Thomas Kurtz, the authors of the original BASIC, written in 1964 at Dartmouth College. 
As the PC revolution came, BASIC was popular on the Apple II and original PCs, with QuickBASIC coming in 1985, and an IDE, or Integrated Development Environment, for QuickBASIC shipping in 2.0. At the time Maestro was the biggest IDE in use - IDEs had been around since Softlab shipped Maestro I, often considered the first, back in the mid-1970s. Next, in 3.0, you could compile these programs into DOS executables, or .exe files, and 4.0 brought debugging into the IDE. Pretty sweet. You could run the interpreter without ever leaving the IDE! No offense to anyone, but Apple was running around the world pitching vendors to build software for the Mac, yet had created an almost contentious development environment. And it showed in the number of programs available for the Mac. Microsoft was obviously investing heavily in enabling developers to develop in a number of languages, and it showed: Microsoft had 4 times the software titles, many of which were in BASIC. But the last version of QuickBASIC, as it was known by then, came in 4.5, in 1988, the year the Red Army withdrew from Afghanistan - probably while watching Who Framed Roger Rabbit on pirated VHS tapes. By the late 80s, use began to plummet. Much as my daughter’s joy in her Legos began to plummet when she entered tweenhood. It had been a huge growth spurt for BASIC, but the era of object-oriented programming was emerging. And Microsoft was in an era of hyper growth. Windows 3.0 was coming - and what’s crazy is they were just entering the buying tornado. In 1988, the same year as the final release of QuickBASIC, Alan Cooper created a visual programming language he’d been calling Ruby. Now, there would be another Ruby later. This language was visual, and Apple had been early to the market on visual computing with the Mac, introduced in 1984. Microsoft had responded with Windows 1.0 in 1985. But the development environment just wasn’t very… visual. Most people at the time used Windows to open a window of icky text. 
Microsoft leadership knew they needed something new; they just couldn’t get it done. So they started looking for a more modern option. Cooper showed his Ruby environment to Bill Gates and Gates fell in love. Gates bought the product and it was renamed to Visual Basic. Sometimes you build, sometimes you partner, and sometimes you buy. And so in 1991, Visual Basic was released at Comdex in Atlanta, Georgia, and came around for DOS the next year. I can still remember writing a program for DOS. They faked a GUI using ASCII art. Gross. VB 2 came along in 1992, laying the foundations for class modules. VB 3 came in 93 and brought us the JET database engine. Not only could you instantiate an object, but you had somewhere to keep it. VB 4 came in 95, when we got a 32-bit option. That adds a year or 6 for every vendor. The innovations that Visual Basic brought to Windows can still be seen today. VBX and DLL are two of the most substantial. A DLL is a “dynamic link library” file that holds code and procedures that Windows programs can then consume. DLLs allow multiple programs to use that code, saving on memory and disk space. Shared libraries are the cornerstone of many an object-oriented language. VBX isn’t necessarily used any more, as VBXs have been replaced with OCXs, but they’re similar, and the VBX certainly spawned the innovation. These Visual Basic Extensions, or VBX for short, were C or C++ components that were assembled into an application. When you look at applications you can still see DLLs and OCXs. VB 4 was when we switched from VBX to OCX. VB 5 came in 97. This was probably the most prolific version, both for software you wanted on your computer and for malware. We got those crazy ActiveX controls in VB 5. VB 6 came along in 1998, extending the ability to create web apps. And we sat there for 10 years. Why? The languages really started to split with the explosion of web tools. VBScript was put into Active Server Pages. 
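To make the shared-library idea concrete, here’s a small hedged sketch in Python. It uses the standard ctypes module to call into a library the program didn’t compile itself - the same mechanism that loads a .dll on Windows or a .so on POSIX systems. The specific library and function here are our illustration, not anything from Visual Basic itself.

```python
import ctypes

# ctypes loads a shared library at runtime - the same idea as a Windows DLL.
# On Windows you'd write something like ctypes.WinDLL("user32"); on a POSIX
# system, passing None exposes symbols from the C library already loaded
# into the process. This is an illustrative sketch, not VB-era code.
libc = ctypes.CDLL(None)

# Call the C library's abs() - code our program consumes but never compiled.
result = libc.abs(-7)
print(result)
```

Many programs sharing one copy of that code, instead of each bundling its own, is exactly the memory and disk saving described above.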
We got the .NET framework for compiled web pages. We got Visual Basic for Applications, allowing Office to run VB scripts using VBA 7. Over the years the code evolved into what are now known as Universal Windows Platform apps, written in C++ using C++/WinRT or C++/CX. Those shared libraries are now surfaced in common APIs and sandboxed, given that security and privacy have become a much more substantial concern since the tidal wave of the Internet crashed into our Lego sets, smashing them back to single blocks. Yah, those blocks hurt when you step on them. So you look for ways not to step on them. And controlling access to API endpoints with entitlements is a pretty good way to walk lightly. Bill Gates awarded Cooper the first “Windows Pioneer Award” for his work on Visual Basic. Cooper continued to consult with companies, with this crazy idea of putting users first. He was an early proponent of user experience and putting users first when building interfaces. In fact, his first book was called “About Face: The Essentials of User Interface Design,” published in 1995. He still consults and trains on UX. Honestly, Alan Cooper only needs one line on his resume: “The Father of Visual Basic.” Today Eclipse and Visual Studio are the most used IDEs in the world. And there’s a rich ecosystem of specialized IDEs. The IDE gives code completion, smart code completion, code search, cross-platform compiling, debugging, multiple language support, syntax highlighting, version control, visual programming, and so much more. Much of this isn’t available on every platform or for every IDE, but those are the main features I look for - like the first time I cracked open IntelliJ. The IDE is almost optional for simpler procedural programming, but in an era of increasingly complex object-oriented programming, where classes are defined in hundreds or thousands of itty bitty files, a good, smart, feature-rich IDE is a must. And Visual Studio is one of the best you can use. 
With that era of simple, top-to-bottom procedural programming behind us, there’s no BASIC remaining in any of the languages you build modern software in. The explosion of object orientation created flaws in operating systems, but we’ve matured beyond that and now get to find all the new flaws. Fun, right? But it’s important to think: from Alan Kay’s introduction of Smalltalk in 1972, new concepts in programming had been emerging and evolving. The latest incarnation is the API-driven programming methodology. Gone are the days when we accessed memory directly. Gone are the days when the barrier to learning to program was understanding top-to-bottom procedural syntax. Gone are the days when those Legos were simple little sets. We’ve moved on to building Death Stars out of Legos with more than 3,500 pieces. Due to increasingly complex apps we’ve had to find new techniques to keep all those pieces together. And as we did, we learned that we needed to be much more careful. We’ve learned to write code that is easily tested. And we’ve learned to write code that protects people. Visual Basic was yet another stop on the evolution toward modern design principles. We’ve covered others and we’ll cover more in coming episodes. So until next time, think of the continuing evolution and what might be next. You don’t have to be in front of it, but it does help to have a nice big think on how it can impact projects you’re working on today. So thank you for tuning in to yet another episode of the History of Computing Podcast. We’re so lucky to have you. Have a great day!
Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we’re going to look at Pong. In the beginning there was Pong. And it was glorious! Just think of the bell bottoms at Andy Capp’s Tavern in Sunnyvale, California on November 29th, 1972. The first Pong they built was just a $75 black and white TV from a Walgreens and some cheap parts. The cabinet wasn’t even that fancy. And after that night, the gaming industry was born. People started showing up to play the game. They ended up waiting for the joint to open, not drinking, and just gaming the whole time. The bartender had never seen anything like it. I mean, just a dot being knocked around a screen. But it was social. You had to have two players. There was no machine learning to play the other side yet. Pretty much the same thing as real ping pong. And so Pong was released by Atari in 1972. It reminded me of air hockey the first time I saw it. You bounced a ball off the walls and tried to get it past the opponent using paddles. It never gets old. Ever. That’s probably why, of all the Atari games at the arcade, more quarters got put into it than any other. The machines were sold for three times the cost to produce them; unheard of at the time. The game got so popular that within a year, the company had sold 2,500 machines, a number they tripled in 1974. I wasn’t born yet. But I remember my dad telling me that they didn’t have a color TV yet in 72. They’d manufactured the games in an old skate rink. And they were cheap because, with the game needing so few resources, they pulled it off without a CPU. But what about the code? It was written by Al Alcorn as a training exercise that Nolan Bushnell gave him after he was hired at Atari. He was a pretty good hire. It was supposed to be so easy a kid could play it. I mean, it was so easy a kid could play it. 
Bushnell would go down as the co-creator of Pong. Although maybe Ralph Baer should have as well, given that Bushnell had tried Baer’s table tennis game at a trade show the same year he had Alcorn write Pong. Baer had gotten the idea of building video games while working on military systems at a few different electronics companies in the 50s, and even patented a device called the Brown Box - the patent was filed in 1971 and granted in 1973 - prior to licensing it to Magnavox to become the Odyssey. Tennis for Two had been made available in 1958. Spacewar! had popped up in 1962, thanks to MIT’s Steve “Slug” Russell being teased until he finished it. It was written on the PDP-1 and slowly made its way across the world as PDPs shipped. Alan Kotok had whipped up some sweet controllers, but it could be played with just the keyboard as well. No revolution seemed in sight yet, as it was really just shipping to academic institutions. And to very large companies. The video game revolution was itching to get out. People were obsessed with space at the time. Space was all over science fiction, there was a space race being won by the United States, and so Spacewar! gave way to Computer Space, the first arcade video game to ship, in 1971, modeled after Spacewar!. But as an early coin-operated video game it was a bit too complicated. As was Galaxy Game, whipped up in 1971 by Bill Pitts and Hugh Tuck at Stanford. Computer Space itself was the work of Bushnell and his cofounder Ted Dabney, who’d worked together at Ampex. They initially called their company Syzygy Engineering but, as can happen, there was a conflict on that trademark and they changed the name to Atari. They had programmed Computer Space, but it was built and distributed by Nutting Associates. It was complex and needed a fair number of instructions to get used to. Pong, on the other hand, needed no instructions. A dot bounced from you to a friend and you tried to get it past the other player. Air hockey. Ping pong. Ice hockey. Football. It just kinda made sense. 
You bounced the dot off a paddle. The center of each paddle returned your dot at a regular 90-degree angle, and the further out you hit it, the smaller that angle. The ball got faster the longer the game went on. I mean, you wanna make more quarters, right?!?! Actually that was a bug, but one you like. They added sound effects. They spent three months on it. It was glorious, and while Al Alcorn has done plenty of great stuff in his time in the industry, I doubt much of it was filled with the raw creativity he got to display during those months. It was a runaway success. There were clones of Pong. Coleco released the Telstar and Nintendo came out with the Color TV-Game 6. In fact, General Instrument just straight up cloned the chip. Something else happened in 1972. The Magnavox Odyssey shipped and was the first console with interchangeable game cards. After Pong, Atari had pumped out Gotcha, Rebound, and Space Race. They were finding success in the market. Then Sears called. They wanted to sell Pong in the home. Atari agreed. They actually outsold the Odyssey when they finally made the single-game console. Magnavox sued, claiming the concept had been stolen. They settled for $700k. Why would they settle? Well, Magnavox could actually prove that they’d built the game first and draw a connection to where Atari got the idea. The good, the bad, and the ugly of intellectual property is that the laws exist for a reason. Baer beat Atari to the punch, but he’d go on to co-develop the game Simon. All of his prototypes now live at the Smithsonian. But back to Pong. The home version of Pong was released in 1974 and started showing up in homes in 1975, especially after the Christmas buying season. It was a hit, well on its way to becoming iconic. Two years later, Atari released the iconic Atari 2600, which had initially been called the VCS. This thing was $200 and came with a couple of joysticks, a couple of paddles, and a game called Combat. 
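That paddle mechanic is simple enough to sketch. Here’s a hedged Python illustration of the behavior described above: a hit at the paddle’s center returns the dot at a 90-degree angle, and hits further from center return it at smaller angles. The paddle size and the 30-degree minimum are our assumptions for illustration; the real cabinet used discrete logic, not code.

```python
# Sketch of Pong's paddle deflection, as described above: a center hit
# returns the dot at 90 degrees (straight back); the further from center
# the hit, the smaller the return angle. Paddle size and the 30-degree
# minimum are illustrative assumptions - the original cabinet used
# discrete logic (no CPU), so this is a modern re-imagining.
def deflection_angle(hit_offset, half_paddle=4.0, min_angle=30.0):
    """Return angle in degrees between the dot's path and the paddle face."""
    frac = min(abs(hit_offset) / half_paddle, 1.0)  # 0 at center, 1 at edge
    return 90.0 - (90.0 - min_angle) * frac

print(deflection_angle(0.0))  # center hit: straight back
print(deflection_angle(4.0))  # edge hit: shallowest return
```

Splitting the paddle into zones like this is what let players aim, which is a big part of why such a simple game stayed interesting.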
Suddenly games were showing up in homes all over the world. They needed more money to make money, and Bushnell sold the company. Apple would become one of the fastest growing companies in US history with their release of the Apple II, making Steve Jobs a quarter of a billion dollars in 1970s money. But Atari ended up selling a staggering number of units and becoming THE fastest growing company in US history at the time. There were sequels to Pong, but by the time Breakout and other games came along, you really didn’t need them. I mean, Pin-Pong? Pong Doubles was fine, but Super Pong, Ultra Pong, and Quadrapong never should have happened. That’s cool though. Other games definitely needed to happen. Pac-Man became popular, and given it wasn’t just a dot but a dot with a slice taken out for a mouth, it ended up on the cover of Time in 1982. A lot of other companies were trying to build stuff, but Atari seemed to rule the world. These things have a pretty limited life-span. The video game crash of 1983 caused Atari to lose half a billion dollars. The stock price fell. At an important time in computers and gaming, they took too long to release the next model, the 5200. It was a disaster. Then the Nintendo arrived in some parts of the world in 1983 and took the US by storm in 1985. Atari went into a long decline, an almost unstoppable downward spiral. That was sad to watch. I’m sure it was sadder to be a part of, and sadder still when I studied the company’s corporate mergers in college. Nolan Bushnell and Ted Dabney, the founders of Atari, wanted a hit coin-operated game. They got it. But they got way more than they bargained for. They were able to parlay Pong into a short-lived empire. Here’s the thing. Pong wasn’t the best game ever made. It wasn’t an original Bushnell idea. It wasn’t even IP they could keep anyone else from cloning. 
But it was the first successful video game and helped fund the development of the VCS, or 2600, that would bring home video game consoles into the mainstream, including my house. And the video game industry would later eclipse the movie industry. But the most important thing Pong did was to show regular humans that microchips were for more than… computing. Ironically, the game didn’t even need real microchips. The developers would all go on to do fun things. Bushnell founded Chuck E. Cheese with some of his Atari cash. Once it was clear that the Atari consoles were done, you could get iterations of Pong for the Sega Genesis, the PlayStation, and even the Nintendo DS. It’s floated around the computer world in various forms for a long, long time. The game is simple. The game is loved. Every time I see it I can’t help but think about bell bottoms. It launched industries. And we’re lucky to have had it. Just like I’m lucky to have had you as a listener today. Thank you so much for choosing to spend some time with us. We’re so lucky to have you.