'binary' Episodes

Boolean Algebra

     2/8/2020

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we’re going to talk a little about math. Or logic. Computers are just a bunch of zeroes and ones, right? Binary. They make shirts about it. You know, there are 10 types of people in the world. But where did that come from?

After centuries of trying to build computing devices that could help with math using gears with lots of slots in them, armed first with tubes and then with transistors, we had to come up with a simpler form of logic. And why write your own complicated math when you can borrow it and have instant converts to your cause? Technical innovations are often comprised of building blocks from different fields of scientific or scholastic study. The 0s and 1s, which make up the flip-flop circuits computers are so famous for, are made possible by the concept that all logic can be broken down into either true or false.

And so the mathematical logic that we have built trillions of dollars of industry on began in 1847 with a book called The Mathematical Analysis of Logic, by George Boole. He would follow that up with An Investigation of the Laws of Thought in 1854. He was the father of what we would later call Boolean algebra, once the science of an entire mathematical language built on true and false matured enough that Charles Sanders Peirce, in his work The Simplest Mathematics, used the title "Boolian Algebra with One Constant." By 1913 there were many more works with the name, and it became Boolean algebra. This was right around the time that the electronics research community had first started experimenting with using vacuum tubes as flip-flop switches.

So there’s elementary algebra, where you can have any old number with any old operation. Those operators can be addition, subtraction, multiplication, division, and so on. But in Boolean algebra the only values available are 0 and 1. Later we would get abstract algebra as well, but for computing it was way simpler to just stick with those 0s and 1s, and in fact, ditching the gears from the old electromechanical computing paved the way for tubes to act as flip-flop switches, and then for transistors to replace those. And the evolutions came, both to the efficiency of flip-flop switches and to the increasingly complex uses for mechanical computing devices. But they hadn’t all been mashed up together yet. Set theory and statistics were evolving. And Huntington, Jevons, and Schröder basically perfected Boolean logic, paving the way for M. H. Stone to prove, by 1936, that every Boolean algebra is isomorphic to a field of sets.

And so it should come as no surprise that Boolean algebra would be key to the development of the basic mathematical functions used on the Atanasoff-Berry computer. Remember that back then, all computing was basically used for math. Claude Shannon would help apply Boolean algebra to switching circuits. This involved binary decision diagrams for synthesizing and verifying the design of logic circuits. And so we could analyze and design circuits using algebra to define logic gates. Those gates would get smaller and faster and be combined using combinational logic until we got LSI circuits and later, with the automation of chip design, VLSI. So to put it super-simple, let’s say you are trying to do some maths. First up, you convert values to bits, which are binary digits.
Those binary digits are represented as a 0 or a 1. There’s a substantial amount of information you can pack into those bits, with all the major characters easily fitting in a byte, which is 8 of those bits. So let’s say you also map your algebraic operators using those 0s and 1s, in another byte. Now you can add the numbers in those bytes. To do so, though, you basically need to translate the notation from classical propositional calculus into its expression in Boolean algebra, typically done in an assembler. Much, much more logic is required to apply quantifiers. The simple truth values are 0 and 1, and small truth tables define AND (also known as a conjunction), OR (also known as a disjunction), NOT (negation), and XOR (also known as an exclusive-or) - there’s a short code sketch of these gates at the end of this episode. This allows for an exponential increase in the amount of logic you can apply to a problem. The question of deciding whether a given Boolean formula can be satisfied at all is known as the Boolean satisfiability problem, or SAT. At this point, though, most problems really do seem solvable using some model of computation, given the amount of complex circuitry we now have.

So the computer interprets the inputs through these functions and sets the state of a switch based on the input. The computer then combines all those trues and falses into the necessary logic and outputs an answer. Because entering the 0s and 1s directly took too much time, input got moved to punch cards, and modern programming was born. These days we can also add Boolean logic into higher functions, such as using AND in Google searches.

So ultimately the point of this episode is to explore what exactly all those 0s and 1s are. They’re complex thoughts and formulas expressed as true and false, constructed using Boolean algebra. Now, there’s a chance that some day we’ll find something beyond the transistor. And then we can bring a much more complicated expression of thought, broken down into different forms of algebra. But there’s also the chance that Boolean algebra sitting on transistors, or on whatever the next evolution of Boolean gates or transistors turns out to be, is really, well, kinda’ it.

So from the Atanasoff-Berry computer comes Colossus and then ENIAC in 1945. It wasn’t obvious yet, but nearly 100 years after the development of Boolean algebra, it had been combined with several other technologies to usher in the computing revolution, setting up the evolution to microprocessors and the modern computer. These days, few programmers are constrained to programming in Boolean logic. Instead, we have many more options. Although I happen to believe that understanding this fundamental building block is one of the most important aspects of studying computer science and provides an important foundation for computing in general. So thank you for listening to this episode. I’m sure algebra got ya’ totally interested and that you’re super-into math. But thanks for listening anyways. I’m pretty lucky to have ya’. Have a great day!
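
A quick aside for the curious, and not from the episode itself: here is a minimal sketch in Python of the gates described above - AND, OR, NOT, and XOR as tiny truth-table functions - composed into a half adder, a full adder, and a byte-wide ripple-carry adder. That composition is the same trick real circuits use to turn Boolean logic into binary addition. The function names and the 8-bit width are just illustrative choices.

```python
# A minimal sketch: Boolean gates as Python functions, composed into
# a half adder, a full adder, and an 8-bit ripple-carry adder.

def AND(a, b): return a & b   # conjunction
def OR(a, b):  return a | b   # disjunction
def NOT(a):    return 1 - a   # negation (for single bits 0/1)
def XOR(a, b): return a ^ b   # exclusive-or

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Add two bits plus a carry: returns (sum, carry_out)."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def add_bytes(x, y):
    """Ripple-carry addition of two 8-bit values using only the gates above."""
    result, carry = 0, 0
    for i in range(8):
        bit_sum, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit_sum << i
    return result  # wraps at 8 bits, just like a byte-wide adder circuit

if __name__ == "__main__":
    # Print the truth tables the episode describes, then add two bytes.
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  AND={AND(a, b)} OR={OR(a, b)} XOR={XOR(a, b)}")
    print("42 + 23 =", add_bytes(42, 23))
```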


Happy Birthday ENIAC

     2/15/2020

Today we’re going to celebrate the birthday of the first real multi-purpose computer: the gargantuan ENIAC, which would have turned 74 years old today, on February 15th. That’s many generations ago in computing.

The year is 1946. World War II raged from 1939 to 1945. We’d cracked Enigma with computers, and scientists were thinking of more and more ways to use them. The press is now running articles about a “giant brain” built in Philadelphia. The Electronic Numerical Integrator and Computer was a mouthful, so they called it ENIAC. It was the first true electronic computer. Before that there were electromechanical monstrosities. Those had to physically move a part in order to process a mathematical formula. That took time. ENIAC used vacuum tubes instead. A lot of them. To put things in perspective: every hour of processing by the ENIAC was worth 2,400 hours of calculating formulas by hand. And it’s not like you could just split those 2,400 hours across people or run them back to back, of course. So it made the previously almost impossible, possible. Sure, you could figure out the settings to fire a shell where you wanted it to go in a minute rather than in about a full day of running calculations. But math itself, for the purposes of math, was about to get really, really cool.

The Bush Differential Analyzer, an earlier mechanical computer, had been built in the basement of the building that is now the ENIAC museum. The University of Pennsylvania ran a class on wartime electronics, based on their experience with the Differential Analyzer. John Mauchly and J. Presper Eckert met in 1941 while taking that class, which covered lots of shiny new or newish things like radar and cryptanalysis. The class was mostly on ballistics, a core focus at the Moore School of Electrical Engineering at the University of Pennsylvania. More accurate ballistics would be a huge contribution to the war effort. But Eckert and Mauchly wanted to go further, building a multi-purpose computer that could analyze weather and calculate ballistics. Mauchly got all fired up and wrote a memo about building a general purpose computer. But the University shot it down. And so ENIAC began life as Project PX, with Herman Goldstine acting as the main sponsor after seeing their proposal and digging it back up.

Mauchly would team up with Eckert to design the computer, and the effort was overseen and orchestrated by Major General Gladeon Barnes of the US Army Ordnance Corps. Thomas Sharpless was the master programmer. Arthur Burks built the multiplier. Robert Shaw designed the function tables. Harry Huskey designed the reader and the printer. Jeffrey Chu built the dividers. And Jack Davis built the accumulators. Ultimately it was just a really big calculator, not a computer that ran stored programs in the same way we do today, although ENIAC did get an early version of stored programming that used a function table as read-only memory.

The project was supposed to cost $61,700. The University of Pennsylvania’s Moore School actually spent half a million dollars worth of metal, tubes, and wires. And of course the scientists weren’t free. That’s around six and a half million dollars today. And of course it was paid for by the US Army, specifically the Ballistic Research Laboratory. It was designed to calculate firing tables to make blowing things up a little more accurate.
Herman Goldstine chose a team of programmers that included Betty Jennings, Betty Snyder, Kay McNulty, Fran Bilas, Marlyn Meltzer, and Ruth Lichterman. They were chosen from a pool of 200 and set about writing the necessary formulas for the machine to process the requirements provided by the people using time on the machine. In fact, Kay McNulty invented the concept of subroutines while working on the project. They would flip switches and plug in cables as a means of programming the computer. And programming took weeks of figuring out complex calculations on paper. Then it took days of fiddling with cables, switches, tubes, and panels to input the program. Debugging was done step by step, similar to how we use breakpoints today. They would feed ENIAC input using IBM punch cards and readers. The output was punch cards as well, and these punch cards acted as persistent storage.

The machine used standard octal-base radio tubes - 18,000 of them - and they ran at a lower voltage than they could have in order to minimize blowouts and heat. Each digit used in calculations took 36 of those vacuum tubes, and there were 20 accumulators that could run 5,000 operations per second. The accumulators used two of those tubes to form a flip-flop, and they got them from the Kentucky Electrical Lamp Company. Given the number that blew every day, they must have loved life once engineers got it down to blowing only a tube every couple of days. ENIAC was a modular computer and used different panels to perform different tasks, or functions. It used ring counters with 10 positions for a lot of operations, making it a decimal computer as opposed to the binary computational devices we have today. The pulses between the rings were used to count. There’s a small illustrative sketch of a ring counter at the end of this episode.

Suddenly computers were big money. A lot of research had happened in a short amount of time. Some had been government funded and some had been part of corporations, and it became impossible to untangle the two. This was pretty common with technical advances during World War II and the early Cold War years. John Atanasoff and Cliff Berry had ushered in the era of the digital computer in 1939 but hadn’t finished; Mauchly had seen their machine in 1941. ENIAC was used to run a number of calculations for the Manhattan Project, allowing us to blow more things up than ever. That project took over a million punch cards and took precedence over artillery tables. John von Neumann worked with a number of mathematicians and physicists, including Stanislaw Ulam, who developed the Monte Carlo method. That led to a massive reduction in programming time. Suddenly programming became more about I/O than anything else.

To promote the emerging computing industry, the Pentagon had the Moore School of Electrical Engineering at the University of Pennsylvania launch a series of lectures to further computing at large. These were called the Theory and Techniques for Design of Electronic Digital Computers, or just the Moore School Lectures for short. The lectures focused on the various types of circuits and the findings from Eckert and Mauchly on building and architecting computers. Goldstine would talk at length about math, and other developers would give talks, looking forward to the development of the EDVAC and back at how they got where they were with ENIAC. As the University began to realize the potential business impact and monetization, they decided to bring a focus to University-owned patents.
That drove the original designers out of the University of Pennsylvania, and they started the Eckert-Mauchly Computer Corporation in 1946. Eckert and Mauchly would then build the EDVAC, making use of the progress the industry had made since ENIAC’s construction had begun. EDVAC would effectively represent the wholesale move away from decimal and into binary computing, and while it weighed tons, it would become the precursor to the microchip. After ENIAC was finished, Mauchly filed for a patent in 1947. While a patent was granted, you could still count on your fingers the number of machines that had been built at about the same time, including the Atanasoff-Berry Computer, Colossus, the Harvard Mark I, and the Z3. So luckily the patent was eventually voided, in 1973, and digital computers are a part of the public domain. By then, the Eckert-Mauchly Computer Corporation had been acquired by Remington Rand, which merged with Sperry and is now called Unisys.

The next wave of computers would be mainframes built by GE, Honeywell, IBM, and a number of other vendors, and so the era of batch processing mainframes began. The EDVAC begat the UNIVAC, with Grace Hopper brought in to write an assembler for it. Computers would become the big mathematical number crunchers and slowly spread into being data processors from there. Following decades of batch processing mainframes we would get minicomputers and interactivity, then time sharing, and then the PC revolution. Distinct eras in computing.

Today, computers do far more than just the types of math the ENIAC did. In fact, the functionality of ENIAC was duplicated onto a 20 megahertz microchip in 1996. You know, ‘cause the University of Pennsylvania wanted to do something to celebrate the 50th birthday. And a birthday party seemed underwhelming at the time. And so the date of release for this episode is February 15th, now ENIAC Day in Philadelphia, dedicated as a way to thank the university, creators, and programmers. And we should all reiterate their thanks. They helped put computers front and center in the thoughts of the next generation of physicists, mathematicians, and engineers, who built the mainframe era. And I should thank you, for listening to this episode. I’m pretty lucky to have ya’. Have a great day!
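
One last aside, not part of the episode: to picture how ENIAC’s ten-position ring counters counted, here is a small illustrative sketch in Python. It is not ENIAC’s actual circuitry - a real decade ring counter was a loop of tube stages with exactly one conducting at a time - but it shows the counting idea: each incoming pulse advances a ring one position, and when a ring wraps past 9 it carries a pulse into the next ring, so a chain of rings holds a decimal number. The class names and the ten-digit width are just illustrative choices.

```python
# A small illustrative sketch (not ENIAC's actual circuitry): a chain of
# ten-position ring counters, one per decimal digit, counting incoming pulses.

class RingCounter:
    """A ten-state counter; one position is 'active' at a time."""
    def __init__(self):
        self.position = 0  # which of the 10 positions is currently active

    def pulse(self):
        """Advance one position; return True when wrapping past 9 (a carry)."""
        self.position = (self.position + 1) % 10
        return self.position == 0

class DecimalAccumulator:
    """Several rings chained by carries, like the digits of one accumulator."""
    def __init__(self, digits=10):
        self.rings = [RingCounter() for _ in range(digits)]  # rings[0] = ones

    def add_pulses(self, count):
        """Feed 'count' pulses into the ones ring, propagating carries."""
        for _ in range(count):
            ring_index = 0
            while self.rings[ring_index].pulse():  # wrapped: carry onward
                ring_index += 1

    def value(self):
        return sum(r.position * 10 ** i for i, r in enumerate(self.rings))

acc = DecimalAccumulator()
acc.add_pulses(1946)
print(acc.value())  # 1946 - each decimal digit is just a ring position
```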


(OldComputerPods) ©Sean Haas, 2020