Napier's Bones, 1617
There is a method of multiplication called Lattice Multiplication that some find simpler than traditional multiplication for large numbers. While no one knows who actually developed this form of multiplication, it was mentioned in Arab texts as early as the 13th century. When I was in elementary school, we only learned traditional multiplication. Today, several methods are taught, including lattice multiplication.
Traditional Multiplication:

  65
x 23
----
 195
1300
----
1495

Lattice Multiplication:

[Figure: the same product, 65 x 23, worked out in a lattice grid; the digit-by-digit products 12, 10, 18 and 15 fill the cells, and the diagonals sum to the answer 1495]
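For readers who want to see the arithmetic spelled out, here is a minimal sketch of the lattice method in Python (obviously a modern convenience; the function name and layout are my own). It fills the grid with the digit-by-digit products and then sums the diagonals from right to left, carrying as needed.

def lattice_multiply(x, y):
    xs = [int(d) for d in str(x)]          # digits of x, written across the top
    ys = [int(d) for d in str(y)]          # digits of y, written down the side
    diagonals = [0] * (len(xs) + len(ys))  # slot 0 is the ones place

    # Fill the lattice: each cell holds a one-digit by one-digit product,
    # split into a tens part and a ones part across its diagonal.
    for i, yd in enumerate(ys):
        for j, xd in enumerate(xs):
            tens, ones = divmod(xd * yd, 10)
            place = (len(xs) - 1 - j) + (len(ys) - 1 - i)
            diagonals[place] += ones
            diagonals[place + 1] += tens

    # Add up the diagonals from right to left, carrying into the next one.
    result, carry = 0, 0
    for place, total in enumerate(diagonals):
        carry, digit = divmod(total + carry, 10)
        result += digit * 10 ** place
    return result

print(lattice_multiply(65, 23))   # 1495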
A few hundred years later, John Napier, the same man who had recently invented logarithms, designed a more efficient way to do lattice multiplication. He marked strips of ivory with the multiples of the digits 0-9. By placing certain strips, or "bones," next to each other, one could do lattice multiplication with less writing. It was also possible to use Napier's Bones to divide and compute square roots.
Slide Rule, 1622
William Oughtred created the slide rule based on the recent invention of logarithms by John Napier. This device allowed its user to perform sophisticated mathematical calculations quickly. The slide rule was used for centuries until it was replaced by the scientific calculator in the 1970s.
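The principle behind the slide rule is worth a quick illustration: because log(a) + log(b) = log(a x b), sliding two logarithmic scales against each other adds lengths, and adding lengths multiplies numbers. Here is a minimal sketch of the same idea in Python (the variable names are mine):

import math

# Multiplying 65 by 23 the way a slide rule does: add the two logarithms,
# then read the answer back off the logarithmic scale.
a, b = 65, 23
product = 10 ** (math.log10(a) + math.log10(b))
print(round(product))   # 1495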
The Second Era – Gear-Driven Devices
More complicated calculations and tasks required more complicated devices. These devices used rotating gears. Since they did not use electricity, they required some form of manual cranking in order to function. One problem with devices that have moving parts is that they wear out and break.
Numerical Calculating Machine, 1642
Blaise Pascal, the same mathematician who is known for Pascal’s Triangle, built the Pascaline, the first numerical calculating machine. The inner workings of this device are similar to the tumbler odometers found in old cars. It could perform addition and subtraction. Multiplication could be performed with repeated additions.
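As a quick illustration of that last point, here is a minimal Python sketch (my own, not anything the Pascaline ran) of multiplication carried out with nothing but an adder:

def multiply_by_repeated_addition(a, b):
    total = 0
    for _ in range(b):      # add a to the running total, b times
        total += a
    return total

print(multiply_by_repeated_addition(65, 23))   # 1495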
Even in the early to mid-1970s, a plastic version of this device – called a Pocket Adder – was still being used. This was because a 4-function calculator could cost $150 or more back then. To put that into perspective, a plate lunch was 55 cents in 1975. By the end of the 1970s, the cost of a 4-function pocket calculator had dropped to below $10 and the Pocket Adder disappeared.
Jacquard's Loom, 1805
A loom is a device used to make cloth. Making plain cloth was relatively simple. Making cloth with intricate patterns was very complicated and took a great deal of time, because the loom itself had to be recalibrated continuously. Joseph Jacquard invented a loom that accepted flexible cards punched with information in such a way that it was possible to program how the cloth would be woven. This eliminated the continuous manual recalibration. It was even possible to make a duplicate cloth by feeding the loom the same cards again. This is one of the first examples of programming.
Analytical Engine, 1833
Charles Babbage designed a machine that could read instructions from a sequence of punched cards – similar to those used in Jacquard's Loom. While Jacquard's Loom was a device dedicated to one specific task, Charles Babbage's Analytical Engine was the first general purpose computing machine. Essentially, this was the first computer. For this reason, he is considered "The Father of Computers". In the 1990s many malls had a video game store called Babbage's, which was named after him. You do not see those stores today because Babbage's eventually became part of GameStop.
Programming, 1842
Ada Byron, the Countess of Lovelace, was Charles Babbage's assistant. She was the daughter of the poet Lord Byron, and she held the title "countess" through her marriage to the Earl of Lovelace. Nobility or not, she was a woman very much ahead of her time.
Imagine you had the greatest video game console ever made, but you had no video games for it. A video game console with no video games is essentially a very expensive paperweight. In the same way, a computer that has no software is practically useless.
Ada Byron understood this more than 170 years ago. She knew that Charles Babbage’s device required instructions – what we would today call programs or software. So, over 170 years ago, before modern computers even existed, she started designing computer programs. In so doing she developed certain programming ideas and techniques that are still used in programming languages today. She is known as “The Mother of Programming” and “The World’s First Programmer”. Today the programming language Ada is named after her.
The Third Era – Electro-Mechanical Devices
The term electro-mechanical device means the device uses electricity but still has moving parts. These devices were not yet "fully electronic" – that is the next era. Since they used electricity, manual cranking was no longer needed. Since they still had moving parts, they still broke down easily.
Tabulating Machine, 1889
The first US Census took place in 1790. Since then, every 10 years the United States Government counts all of the people in the country. This information is used for various things like determining how many representatives a state gets in the House of Representatives. It also helps determine how many schools need to be built and where.
In 1880, the US Government conducted the 10th decennial census, just as it had done every decade for the past 90 years; however, this census was different. 1880 marked the beginning of the “Great Wave of Immigration”. There were many more people in the country. As a result, it took much longer to count everyone. The 1880 census was not finished until 1888.
The government realized they had a major problem. The combination of normal population growth with the continued surge of immigration would mean even more people would need to be counted for the 1890 census. There was serious concern that they would not be able to finish the 1890 census by 1900.
In 1889, Herman Hollerith came to the rescue with his invention of the Tabulating Machine. This machine used punch cards similar to the flexible cards used in Jacquard’s Loom, but smaller. While today you might fill in the bubbles on a SCANTRON with a #2 pencil to answer a survey, back then you would poke a hole through the punch card. Holes in different positions would indicate things like gender, number of children in the household, etc.
These cards could be fed into the machine at a rate of about 1 per second. Once a card was inserted, metal wires would come down on top of it. Wherever the card had holes, the wires would pass through the card, touch the metal on the other side, and complete a circuit, and the information would be tabulated.
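In modern terms, the tabulation step is simply counting by category. The sketch below is only an analogy in Python; the hole positions and their meanings are invented for illustration and are not Hollerith's actual card layout.

from collections import Counter

# Invented hole positions and meanings, purely for illustration.
HOLE_MEANINGS = {0: "male", 1: "female", 2: "has children", 3: "foreign born"}

# Each "card" is one record, stored as the set of positions that were punched.
cards = [
    {0, 2},        # male, has children
    {1, 2, 3},     # female, has children, foreign born
    {1},           # female
]

tally = Counter()
for card in cards:
    for hole in card:                  # the machine's wires only register holes
        tally[HOLE_MEANINGS[hole]] += 1

print(dict(tally))   # counts per category, e.g. {'male': 1, 'female': 2, 'has children': 2, 'foreign born': 1}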
Herman Hollerith’s machine was a success. The 1890 census was completed in just one year. Building on his success, in 1896 Hollerith founded the Tabulating Machine Company. Hollerith’s machines were now being used all over the world.
In 1911, his firm merged with three other companies to form the Computing Tabulating Recording Company. In 1924, the company was renamed International Business Machines Corporation.
Differential Analyzer, 1931
Harold Locke Hazen and Vannevar Bush, a professor at MIT, built a large-scale computing machine capable of solving differential equations. If you have no idea what "differential equations" are, they are the type of math done in calculus, and they make things like landing a man on the moon possible.
In 1934 a model of the Differential Analyzer was made at Manchester University by Douglas Hartree and Arthur Porter. This model made extensive use of the parts from a Meccano building set. While you and your parents may have played with Lego when you were young, your grandparents and your great-grandparents may have played with Meccano or some other erector set. These were building sets that had strips of metal with pre-drilled holes. They also came with gears, screws, nuts and washers. By using Meccano, they were able to make a less expensive model of the Differential Analyzer that was "accurate enough for the solution of many scientific problems".
Z3, 1941
In Germany during World War II, Konrad Zuse built an electro-mechanical computer capable of automatic computations. It was the first functional, programmable, fully automatic digital computer. The Z3 was destroyed in 1943 during the Allied bombing of Berlin. Some people credit Konrad Zuse as the "inventor of the computer".
Mark I, 1944
The first IBM Automatic Sequence Controlled Calculator (ASCC) was dubbed the Mark I by Harvard University. Technically, the first version of many devices is called the "Mark I" by its creators, implying that there will be a new and improved "Mark II" at some point in the future. Today, when most people talk about "The Mark I" they are usually referring to Harvard's Mark-I.
This electro-mechanical calculator was 51 feet long and 8 feet tall. It was the first machine that could execute long computations automatically, and it could process numbers that were each up to 23 digits long.
Grace Hopper, then a Navy Lieutenant, was one of the first programmers of the Mark-I. She would make many contributions to the world of computer science, so many, in fact, that the United States Congress allowed her to stay in the Navy past the mandatory retirement age. She finally retired as a Rear Admiral in 1986 at the age of 79.
Mark II, 1947
As one might expect, Harvard University eventually replaced their Mark-I with a Mark-II. While this computer was faster than the Mark-I, that alone would not get it recognition in this chapter. The Mark-II is known for something that has nothing to do with any technological milestone.
On September 9, 1947 the Mark-II simply stopped working. A technician found the problem: there was a moth stuck in one of the relays. In other words, there was a bug in the computer. He then took a pair of tweezers and removed the moth. He debugged the computer. The actual moth, taped into the machine's logbook, is preserved at the Smithsonian's National Museum of American History.
The 4th Era – Fully Electronic Computers with Vacuum Tubes
This is often referred to as “The First Generation of Computers”. Fully electronic computers do not rely on moving parts. This makes them faster and more reliable. The vacuum tubes used at the time still had their share of drawbacks.
First, vacuum tubes are big and bulky, about the size of a normal light bulb. It took about 8 of these vacuum tubes for a computer to process a single character. In order to process anything of consequence, a computer would need thousands and thousands of vacuum tubes. This is why early computers were so massive. Imagine how big a machine would need to be if it had about 17,000 normal-sized light bulbs.
Not only are vacuum tubes about the same size as a light bulb, they also generate heat and burn out like a light bulb. If a vacuum tube burned out, the computer stopped working until the tube was replaced. The heat was a bigger issue. A single light bulb can get pretty hot after a while. Imagine the heat produced by 17,000 light bulbs. At the time, workers complained about unsafe working conditions due to the intense heat.
You may notice a little overlap in the dates between my third and fourth eras. This is because people did not stop making electro-mechanical devices the instant that fully electronic computers were invented.
ABC, 1940
The very first electronic digital computer was invented by John Atanasoff and Clifford Berry at Iowa State University. They called it the Atanasoff Berry Computer or ABC. This device was not a “general purpose computer”, nor was it programmable. It was specifically designed to solve systems of linear equations.
Colossus, 1943
This was the first electronic digital computer that was somewhat programmable. It was designed by an engineer named Tommy Flowers, based on the work of Max Newman, a mathematician and code breaker. Over the next couple of years a total of 10 Colossus computers were made. They were used by code breakers in England to help decrypt the secret coded messages of the Germans during World War II.
ENIAC, 1946
The ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general purpose computer. It was invented by John Mauchly and J. Presper Eckert. This computer was twice the size of the Mark-I, contained 17,468 vacuum tubes, and was programmed by rewiring the machine.
The ENIAC was capable of performing 385 multiplication operations per second. In 1949, John von Neumann and various colleagues used the ENIAC to calculate the first 2037 digits of pi (shown below). The process took 70 hours. This was actually the first time a computer had been used to calculate the value of pi!
3.14159265358979323846264338327950288419716939937510582097494459230781640628620899862803482534211706798214808651328230664709384460955058223172535940812848111745028410270193852110555964462294895493038196442881097566593344612847564823378678316527120190914564856692346034861045432664821339360726024914127372458700660631558817488152092096282925409171536436789259036001133053054882046652138414695194151160943305727036575959195309218611738193261179310511854807446237996274956735188575272489122793818301194912983367336244065664308602139494639522473719070217986094370277053921717629317675238467481846766940513200056812714526356082778577134275778960917363717872146844090122495343014654958537105079227968925892354201995611212902196086403441815981362977477130996051870721134999999837297804995105973173281609631859502445945534690830264252230825334468503526193118817101000313783875288658753320838142061717766914730359825349042875546873115956286388235378759375195778185778053217122680661300192787661119590921642019893809525720106548586327886593615338182796823030195203530185296899577362259941389124972177528347913151557485724245415069595082953311686172785588907509838175463746493931925506040092770167113900984882401285836160356370766010471018194295559619894676783744944825537977472684710404753464620804668425906949129331367702898915210475216205696602405803815019351125338243003558764024749647326391419927260426992279678235478163600934172164121992458631503028618297455570674983850549458858692699569092721079750930295532116534498720275596023648066549911988183479775356636980742654252786255181841757467289097777279380008164706001614524919217321721477235014144197356854816136115735255213347574184946843852332390739414333454776241686251898356948556209921922218427255025425688767179049460165346680498862723279178608578438382796797668145410095388378636095068006422512520511739298489608412848862694560424196528502221066118630674427862203919494504712371378696095636437191728746776465757396241389086583264599581339047802759009946576407895126946839835259570982582
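Out of curiosity, the same calculation is easy to reproduce today. The 1949 run is reported to have used a Machin-type arctangent formula, pi/4 = 4*arctan(1/5) - arctan(1/239). The sketch below is my own Python code, not ENIAC's program; it evaluates Machin's formula with plain integers and reproduces the first digits shown above in a fraction of a second.

def arctan_recip(x, scale):
    """arctan(1/x) * scale, summed with the Gregory series using only integers."""
    term = scale // x
    total, n, x_squared = term, 1, x * x
    while term:
        term //= x_squared
        n += 2
        total += -(term // n) if (n // 2) % 2 else (term // n)
    return total

def machin_pi(digits):
    scale = 10 ** (digits + 10)                  # ten guard digits
    pi_times_scale = 4 * (4 * arctan_recip(5, scale) - arctan_recip(239, scale))
    return pi_times_scale // 10 ** 10            # pi * 10**digits, truncated

print(machin_pi(50))
# 314159265358979323846264338327950288419716939937510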
The ENIAC cost $500,000. Adjusted for inflation, that would be almost $6,000,000 in 2013. Unlike earlier computers like the Z3 and the Colossus, which were military secrets, the public actually knew about the ENIAC. The press called it “The Giant Brain.”
EDVAC, 1949
The EDVAC (Electronic Discrete Variable Automatic Computer) was the successor to the ENIAC and was also invented by John Mauchly and J. Presper Eckert. The main improvement in the EDVAC was that it was a Stored Program Computer. This meant it could store a program in electronic memory. (Earlier computers stored programs on punched tape.) The EDVAC could store about 5½ kilobytes. Like the ENIAC, this computer also cost about half a million dollars.
UNIVAC I, 1951
The UNIVAC I (UNIVersal Automatic Computer) was the world’s first commercially available computer. While the Mark-I and the ENIAC were not “for sale”, any company with enough money could actually purchase a UNIVAC computer. This computer was mass-produced and commercially successful. The UNIVAC-I became famous when it correctly predicted the results of the 1952 presidential election.
The 5th Era – Computers with Transistors/Integrated Circuits
The invention of the transistor changed computers forever. This is often referred to as “The Second Generation of Computers”. The University of Manchester made the first transistor computer in 1953. Transistors have certain key advantages over vacuum tubes. First, they are much smaller. This allowed computers to become smaller and cheaper. Second, transistors do not get hot and do not burn out like vacuum tubes. This means we no longer have to deal with the issues of intense heat and replacing burned out vacuum tubes.
Integrated Circuit, 1958
Jack Kilby, of Texas Instruments in Richardson, Texas, developed the first integrated circuit. Integrated circuits have multiple transistors on a tiny, thin piece of semiconductor material, often called a chip. Jack Kilby used germanium. Six months later Robert Noyce came up with his own idea for an improved integrated circuit, which used silicon. He is now known as "The Mayor of Silicon Valley". Both gentlemen are credited as co-inventors of the integrated circuit. This began a period which is often referred to as "The Third Generation of Computers". As technology improved, we developed the ability to put thousands, then millions, and now billions of transistors on what we now call a microchip. Microchips led to microprocessors, which have an entire CPU on a single chip. The first two came out in 1971: the TMS 1000 and the Intel 4004, which was a 4-bit microprocessor.
Video Games, 1958/1962
Video games have been around for much longer than most people realize. Your parents probably played video games when they were kids. It is even possible that some of your grandparents played them as well. If you asked most people what the first video game was, they would say Pong, which was made by Atari in 1972. Pong actually was not the first video game; it was the first successful arcade game.
The first video game was called Tennis for Two. It was created in 1958 by William Higinbotham and played on an oscilloscope at Brookhaven National Laboratory. Since this game did not use an actual computer monitor, some give credit for the first video game to Spacewar!, written by Steve Russell at MIT in 1962.
What some people do not realize is that video games almost disappeared completely long before you were born. In 1982, one of the biggest blockbuster movies of all time, E.T. the Extra-Terrestrial, came out. Atari wanted a video game based on the movie to be released in time for Christmas. The programmer had just over 5 weeks to create the game. The game was a huge flop, causing Atari to lose millions of dollars. It was not only considered the worst video game ever made, it was cited as the reason for the Video Game Industry Crash of 1983. Soon after, some computer literacy textbooks stated that "the video game fad is dead." Luckily, Nintendo revived the market in 1985 with Super Mario Bros.
In 2012, G4 ranked Super Mario Bros. #1 on its Top 100 Video Games of All Time special for “almost single-handedly rescuing the video game industry”.
IBM System/360, 1964
In the early days of computers, what exactly constituted a computer was not clearly defined. The fact that different groups of people had different computer needs did not help. The two biggest groups were the mathematicians and the business people. The mathematicians wanted computers that were good at number crunching. Business people wanted computers that were good at record handling. Companies like IBM would actually make different devices to satisfy the needs of each group.
IBM changed that with the System/360, a series of compatible computers that covered a complete range of applications. They worked for both the mathematicians and the business community.
All of the computers in this series were compatible. This meant that a program created on one System/360 computer could be transported and used on another System/360 computer. The different computers in this series sold for different prices based on their speed.
System/360 essentially standardized computer hardware. It is also responsible for several other standards, including the 8-bit byte.
Apple II Personal Computer, 1977
In 1976 Steve Jobs and Steve Wozniak created a computer in Steve Jobs' parents' garage. This was the original Apple computer. Only 200 or so of these computers were made and sold as kits. With the profit they were able to form Apple Computer Inc. A year later, they released the much improved Apple-II computer. It became the first commercially successful personal computer. Several versions of the Apple-II were released over the years. There actually was an Apple-III computer, released in 1980, but it was not successful. The Apple-IIe (enhanced), Apple-IIc (compact) and the Apple-IIgs (graphics & sound) actually came out after the failed Apple-III. For a time, the slogan at Apple Computer Inc. was "Apple II Forever!"
[Photos: Apple-II (1977), Apple-III (1980), Apple-IIe (1983), Apple-IIc (1984), Apple-IIgs (1986)]
On January 9, 2007 Apple dropped the “Computer” from its name and simply became Apple Inc. This was due to the company’s diversification into the home entertainment and cell phone industries.
Computer Applications, 1979
The first two applications or "apps" available for personal computers (not including video games) were electronic spreadsheets and word processing. Dan Bricklin created VisiCalc, a spreadsheet program, which became the first widely sold piece of personal computer software. He initially lived in a hut somewhere in the mountains of Montana and received an average of $30,000 a week for his invention. He did not benefit from the tremendous boom in the spreadsheet market, because his software could not be patented.
Later that same year, MicroPro released WordStar, which became the most popular word processing program of the late seventies and eighties. This was before the age of WYSIWYG (What You See Is What You Get) word processors. Features like bold and italics would show up on the printer, but not on the screen. Since word processors did not yet use a mouse, everything was done by typing combinations of keys.
These applications mean you will never actually spread a large sheet of paper over a table to look at hundreds of numbers. You also will never know the horror of having to retype a 30-page report because of a simple mistake on the first page.
Note: the caret ( ^ ) symbol in WordStar's menus refers to the Ctrl key.
IBM PC, 1981
As far as the business world was concerned, these new personal computers were amusing toys. No one would even think of using a personal computer to do any serious business work. That changed when IBM introduced the IBM PC, a computer with a monochrome monitor and two floppy drives. Hard drives were not yet available for personal computers. IBM's entry into the personal computer market gave the personal computer an image as a serious business tool and not some electronic game-playing machine.
MS-DOS, 1981
IBM decided not to create its own operating system for the personal computer market; instead, it out-sourced the development of the operating system for what it considered a trivial little personal computer department. Many companies rejected IBM's proposal. Microsoft, then an unknown little company run by Bill Gates, agreed to produce the operating system for the IBM Personal Computer. Over the years, Microsoft grew into a company larger than IBM.
Portability and Compatibility, 1982
The Compaq Portable is known for two things. It was one of the first portable computers, although by today's standards it was nothing like a modern laptop. The 28-pound computer was the size of a small suitcase, and looked very much like one as well. The removable bottom was the keyboard; taking it off revealed a 9-inch monitor and a couple of floppy drives. The Compaq Portable was also the first computer to be 100% compatible with the IBM PC.
Macintosh, 1984
Apple started to sell the Apple Macintosh computer, which used a mouse with a GUI (Graphical User Interface) environment. The mouse and GUI technology had been developed earlier at Xerox, and Apple had actually introduced it with its Lisa computer in 1983. The Lisa computer cost $10,000 and was a commercial failure. The "Mac" was the first commercially successful computer with mouse/GUI technology. The computer was introduced to the public with the now famous 1984 commercial during Super Bowl XVIII. This started a trend of having the best commercials air during the Super Bowl. While the Lisa was named after Steve Jobs' daughter, the Macintosh was not named after anyone. A macintosh is a type of apple – and the favorite of Jef Raskin, the leader of the team that designed and built the Macintosh – hence its name.