Megahertz myth
The megahertz myth, or less commonly the gigahertz myth, is the misconception that clock rate alone can be used to compare the performance of different microprocessors. While clock rate is a valid way to compare different speed grades of the same model and type of processor, other factors such as pipeline depth and instruction set can greatly affect performance when different processors are compared. For example, one processor may take two clock cycles to add two numbers and a further clock cycle to multiply by a third number, three cycles in all, whereas another processor may complete the same calculation in two clock cycles. Comparisons between different types of processors are difficult because performance varies with the type of task. A benchmark is a more thorough way of measuring and comparing computer performance.
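To make the point concrete, the following minimal sketch uses hypothetical clock rates and cycle counts (they do not come from any benchmark cited in this article) to show how a processor with a lower clock rate can still finish a calculation sooner if it needs fewer cycles:

```python
# Illustrative sketch: execution time depends on both the clock rate and the
# number of cycles an operation needs, so a higher clock rate alone does not
# guarantee faster results.

def execution_time_seconds(cycles_needed: int, clock_rate_hz: float) -> float:
    """Time to finish a task that takes `cycles_needed` clock cycles."""
    return cycles_needed / clock_rate_hz

# Hypothetical processors mirroring the example above:
# Processor A needs 3 cycles (2 to add, 1 to multiply) but is clocked higher.
# Processor B needs only 2 cycles for the same calculation.
time_a = execution_time_seconds(cycles_needed=3, clock_rate_hz=2.5e9)  # 1.2 ns
time_b = execution_time_seconds(cycles_needed=2, clock_rate_hz=2.0e9)  # 1.0 ns

print(f"Processor A (2.5 GHz, 3 cycles): {time_a * 1e9:.2f} ns")
print(f"Processor B (2.0 GHz, 2 cycles): {time_b * 1e9:.2f} ns")
```

In this sketch the 2.0 GHz processor finishes first despite the 2.5 GHz processor's higher clock rate, which is the effect a benchmark captures and a clock-rate comparison hides.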
History
Background
The x86 CISC-based CPU architecture which Intel introduced in 1978 became the standard for the DOS-based IBM PC, and its descendants continue to dominate the Microsoft Windows market. An IBM RISC-based architecture was used for the PowerPC CPU, which was released in 1992. In 1994 Apple Computer introduced Macintosh computers using these PowerPC CPUs, but IBM's intention to produce its own desktop computers using the processors was thwarted by delays in Windows NT and a falling-out with Microsoft.[citation needed] Initially the architecture met expectations for performance, and different ranges of PowerPC CPUs were developed, often delivering different performance at the same clock rate. Similarly, at this time the Intel 80486 was selling alongside the Pentium, which delivered almost twice the performance of the 80486 at the same clock rate.[1]
Rise of the myth
The myth arose because clock rate was commonly taken as a simple measure of processor performance and was promoted in advertising and by enthusiasts without taking other factors into account. The term came into use in the context of comparing PowerPC-based Apple Macintosh computers with Intel-based PCs. Marketing based on the myth led to clock rate being given higher priority than actual performance, and to AMD introducing model numbers that gave a notional clock rate based on comparative performance, to overcome a perceived deficiency in its actual clock rates.[2]
Computer advertising emphasized processor megahertz, and by late 1997 rapidly increasing clock rates enabled the Pentium II to surpass the PowerPC in performance. Apple then introduced Macs using the PowerPC 750 (or G3), which it claimed outperformed Pentium IIs while consuming less power. Intel continued to promote its higher clock rates, and the Mac press frequently used the term "megahertz myth" to emphasize claims that Macs had the advantage in certain real-world uses, particularly in laptops.[citation needed]
Comparisons between PowerPC and Pentium had become a staple of Apple presentations. At the New York Macworld Expo keynote on July 18, 2001, Steve Jobs described an 867 MHz G4 as completing a task in 45 seconds while a 1.7 GHz Pentium 4 took 82 seconds for the same task, saying that "the name that we've given it is the megahertz myth".[3] He then introduced senior hardware VP Jon Rubinstein, who presented a tutorial describing how shorter pipelines could give better performance at half the clock rate.
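Under the assumption, not stated in the keynote, that the demonstrated task kept each processor fully busy, the quoted figures imply the G4 performed the same work in far fewer clock cycles. A rough sketch of that arithmetic:

```python
# Back-of-the-envelope sketch of the figures quoted in the keynote, assuming
# the task was compute-bound so that (clock rate x elapsed time) approximates
# the cycles each processor spent on it. Purely illustrative.

g4_cycles = 867e6 * 45   # 867 MHz G4, 45 seconds       -> ~3.9e10 cycles
p4_cycles = 1.7e9 * 82   # 1.7 GHz Pentium 4, 82 seconds -> ~1.4e11 cycles

print(f"G4 cycles:        {g4_cycles:.2e}")
print(f"Pentium 4 cycles: {p4_cycles:.2e}")
print(f"Ratio: {p4_cycles / g4_cycles:.1f}x more cycles for the Pentium 4")
```

On those numbers the Pentium 4 spent roughly 3.6 times as many cycles on the task, which is the gap that higher per-cycle throughput, such as a shorter pipeline suffering fewer stalls, would have to explain.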
References
- ^ "Analysis: x86 Vs PPC". http://www.osnews.com/story/3997. Retrieved 2008-09-18.
- ^ Tony Smith (February 28 2002). "Megahertz myth : Technology". The Guardian. http://www.guardian.co.uk/technology/2002/feb/28/onlinesupplement3. Retrieved 2008-09-18.
- ^ "A video of Megahertz Myth presentation". http://www.youtube.com/watch?v=I3WnXaWjQYE.
External links
- Analysis: x86 Vs PPC — OSNews.com
- Apple's explanation of the megahertz myth (keynote at Macworld 2001)
- Intel to 'ditch' Pentium 4 core after Prescott (2004 news article)
Categories: Clock signal | Microprocessors