
Science and Invention

Computer Technology

When was the computer chip developed?

The computer chip, or integrated circuit, was developed in the late 1950s by two researchers working independently of each other: Jack Kilby (1923–) of Texas Instruments, who developed his chip in 1958, and Robert Noyce (1927–1990) of Fairchild Semiconductor, in 1959. The chip is an electronic device made of a very small piece of silicon wafer (usually less than one-quarter inch square) that today typically contains hundreds of thousands of interconnected miniature transistors and other circuit components. Since its development in the late 1950s, the number of tiny components a chip can hold has steadily risen, improving computer performance, since chips perform a computer’s control, logic, and memory functions. A computer’s microprocessor is a single chip that holds all of the computer’s logic and arithmetic circuitry. It is responsible for interpreting and executing the instructions given by a computer program (software). The microprocessor can be thought of as the brain of the computer.

Many other consumer electronic devices rely on the computer chip as well, including microwave ovens, VCRs, and calculators.
