Definition: The fundamental rate, in cycles per second, at which a computer performs its most basic operations, such as adding two numbers or transferring a value from one register to another. The clock rate of a computer is normally determined by the frequency of a crystal oscillator. The original IBM PC, circa 1981, had a clock rate of 4.77 MHz (almost five million cycles per second); as of 1995, Intel's Pentium chip runs at 100 MHz (100 million cycles per second).

Clock rate is only useful for comparing chips in the same processor family. An IBM PC with an Intel 486 CPU running at 50 MHz will be about twice as fast as one with the same CPU, memory and display running at 25 MHz. Different computers or different processor families, however, should be compared using a benchmark rather than clock rate, since many other factors affect overall speed.

Clock rate alone can be very misleading because the amount of work different chips do in one cycle varies. For example, RISC CPUs tend to have simpler instructions than CISC CPUs (but higher clock rates), and pipelined and superscalar processors can complete one or more instructions per cycle.
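The point can be made concrete with the usual performance relation, execution time = instruction count × cycles per instruction (CPI) / clock rate. The Python sketch below uses entirely hypothetical instruction counts and CPI figures (not measurements of any real chip) to show how a lower-clocked chip can still finish a program first.

```python
def execution_time(instruction_count, cycles_per_instruction, clock_rate_hz):
    """Time in seconds = instructions * CPI / clock rate."""
    return instruction_count * cycles_per_instruction / clock_rate_hz

# Hypothetical chip A: higher clock rate, but complex instructions (high CPI).
time_a = execution_time(instruction_count=50_000_000,
                        cycles_per_instruction=4.0,
                        clock_rate_hz=100e6)   # 100 MHz

# Hypothetical chip B: lower clock rate, but simpler instructions (low CPI),
# even though the same program compiles to more of them.
time_b = execution_time(instruction_count=80_000_000,
                        cycles_per_instruction=1.2,
                        clock_rate_hz=66e6)    # 66 MHz

print(f"Chip A (100 MHz): {time_a:.2f} s")   # 2.00 s
print(f"Chip B (66 MHz):  {time_b:.2f} s")   # 1.45 s
# Despite its lower clock rate, chip B finishes first, which is why
# benchmarks rather than clock rates are used to compare different families.
```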