Generation 3: Integrated Circuits (1963-1973)

Transistors were far superior to the vacuum tubes they replaced in many ways: they were smaller, cheaper to mass-produce, and more energy-efficient. However, early transistors still generated a great deal of heat, which could damage other components inside the computer. In 1958, Jack Kilby (1923-) at Texas Instruments developed a technique for manufacturing transistors as layers of conductive material on a silicon disc. This new form of transistor could be made much smaller than existing, freestanding transistors, and it also generated less heat. Thus, tens or even hundreds of transistors could be layered onto the same disc and connected with conductive layers to form simple circuits. Such a disc, packaged in metal or plastic with external pins for connecting with other components, is known as an Integrated Circuit (IC) or IC chip.

Packaging transistors and related circuitry on an IC that could be mass-produced made it possible to build computers that were smaller, faster, and cheaper. A complete computer system could be constructed by mounting a set of IC chips (and other support devices) on circuit boards and soldering them together. As manufacturing technology improved, the number of transistors that could fit on a single chip increased. In 1965, Gordon Moore (1929-) noticed that the number of transistors that could fit on a chip roughly doubled every 12 to 18 months. This trend became known as Moore's Law, and it has continued to be an accurate predictor of technological advances. (A short program at the end of this section works through the arithmetic of this doubling.) By the 1970's, the Large Scale Integration (LSI) of thousands of transistors on a single IC chip became possible.

In 1971, the Intel Corporation made the logical step of combining all of the control circuitry for a calculator into a single chip called a microprocessor. The Intel 4004 microprocessor contained more than 2,300 transistors. Its descendant, the 8080, which was released in 1974, contained 6,000 transistors and was the first general-purpose microprocessor, meaning that it could be programmed to serve various functions. The Intel 8080 and its successors, the 8086 and 8088 chips, served as central processing units for numerous personal computers in the 1970's and 1980's, including the IBM PC. Other semiconductor vendors such as Texas Instruments, National Semiconductor, Fairchild Semiconductor, and Motorola also produced microprocessors in the 1970's.

As was the case with previous technological advances, the development of integrated circuits led to faster and cheaper computers. Whereas computers in the early 1960's were affordable only for large corporations, IC technology rapidly brought costs down to the level where small businesses could afford them. As a result, more and more people needed to be able to interact with computers. A key to making computers easier for nontechnical users was the development of operating systems: master control programs that oversee the operation of the computer and manage peripheral devices. An operating system acts as an interface between the user and the physical components of the computer and, through a technique known as time-sharing, can even allow multiple users to share the same computer simultaneously (a simple simulation of time-sharing appears at the end of this section).

In addition to operating systems, specialized programming languages were developed to fill the needs of this broader base of users. In 1971, Niklaus Wirth (1934-) developed Pascal, a simple language designed primarily for teaching but one that retains dedicated users to this day.
In 1972, Dennis Ritchie (1941-) developed C, a systems language used in the development of the UNIX operating system. Partly because of the success of UNIX, C and its descendant C++ have become industry standards for systems programming and software engineering.
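To make Moore's Law concrete, the short C program below projects transistor counts forward from the Intel 4004's 2,300 transistors in 1971. It assumes one doubling every 18 months (the slower end of the 12-to-18-month range quoted above), so it is an illustrative sketch of the arithmetic rather than historical data.

    #include <stdio.h>

    int main(void)
    {
        /* Sketch of Moore's Law: start from the Intel 4004's 2,300
           transistors (1971) and double every 18 months, i.e., two
           doublings (a factor of 4) in each 3-year step. */
        double transistors = 2300.0;
        int year;

        for (year = 1971; year <= 1986; year += 3) {
            printf("%d: roughly %.0f transistors per chip\n", year, transistors);
            transistors *= 4.0;   /* 2 doublings per 3 years */
        }
        return 0;
    }

Under these assumptions, the projected count passes two million by 1986. The projection is of the same order as the actual growth noted above from the 4004's 2,300 transistors to the 8080's 6,000 three years later.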
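Time-sharing can likewise be illustrated with a toy round-robin simulation: the processor gives each user's job one brief time slice in turn, cycling through the jobs until all are finished. The C sketch below is a simplified model with invented user names and slice counts, not the mechanism of any particular operating system.

    #include <stdio.h>

    #define NUM_JOBS 3

    int main(void)
    {
        /* Toy round-robin time-sharing simulation. Each hypothetical
           job needs some number of time slices; the system cycles
           through the jobs, giving each active job one slice per turn,
           until every job has finished. */
        const char *user[NUM_JOBS] = { "alice", "bob", "carol" };
        int slices_left[NUM_JOBS]  = { 3, 5, 2 };
        int running = NUM_JOBS;

        while (running > 0) {
            for (int i = 0; i < NUM_JOBS; i++) {
                if (slices_left[i] > 0) {
                    printf("time slice -> %s's job\n", user[i]);
                    if (--slices_left[i] == 0) {
                        printf("%s's job finished\n", user[i]);
                        running--;
                    }
                }
            }
        }
        return 0;
    }

Because each slice on a real system lasts only a tiny fraction of a second, every user sees prompt responses even though the processor is serving only one job at any given instant.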