CSC/ECE 506 Spring 2012/1b as
[[Image:TransCount59-75.png|right]]
Revision as of 17:01, 3 February 2013
What is Moore's Law?
Nearly 40 years ago, Intel co-founder Gordon Moore forecast the rapid pace of technology innovation. His prediction, popularly known as “Moore’s Law,” states that transistor density on integrated circuits doubles about every two years.<ref>http://www.computerhistory.org/semiconductor/timeline/1965-Moore.html</ref> The original prediction, presented in a 1965 paper in Electronics Magazine, observed an annual doubling in the number of chip elements called transistors. He refined his view in 1975, in an updated paper, to a two-year doubling cycle. Rather than giving an empirical formula for the rate of increase, Moore used prose, graphs, and images to convey his predictions and observations. This in some ways increased the staying power of Moore's Law, allowing the industry to use it as a benchmark and a measurable standard of success. Virtually all digital devices are in some way fundamentally linked to the growth set in place by Moore's Law.<ref>http://en.wikipedia.org/wiki/Moore's_law</ref>
Moore's law, past to present
Reviewing data from the inception of Moore's Law to the present shows that, consistent with Moore's prediction, the number of transistors on a chip has doubled approximately every two years. Several contributing developments, had they not occurred, could have slowed or plateaued this growth. One is the invention of dynamic random-access memory (DRAM), a type of random-access memory that stores each bit in a separate capacitor on an integrated circuit. The main advantage of DRAM over SRAM is that only one transistor and one capacitor are required per bit, compared with four or six transistors for SRAM. Another is the complementary metal-oxide-semiconductor (CMOS) process, which allowed a higher density of logic functions on a chip, with the added benefits of low power consumption and immunity to electrical noise. Last was the invention of the integrated circuit itself. Moore's Law is not only responsible for larger and faster chips, but for smaller, cheaper, and more efficient ones as well.
As visible in the examples below and to the right, Moore's Law has held at an overall rate of growth consistent with what Moore predicted. There is some year-to-year variation, which can be explained by the introduction of new technology, in manufacturing or otherwise, that helped kick Moore's Law back on track. This is most evident with the Dual-Core Itanium 2 processor, which was ahead of its competitors by a full four years. Looking at the example to the right, there is a visible dip in transistor count during 1995-2003, but this is balanced by an almost equal increase over the following eight years.
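The two-year doubling can be written as a simple exponential, N(t) = N(t₀) · 2^((t − t₀)/T) with T ≈ 2 years. A minimal sketch, taking the Intel 4004 (2,300 transistors in 1971, the first row of the table below) as the starting point:

```python
# Moore's law as an exponential: transistor count doubles every T years.
def predicted_transistors(year, base_year=1971, base_count=2300, doubling_period=2.0):
    """Predicted transistor count, starting from the Intel 4004 (2,300 in 1971)."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, round(predicted_transistors(year)))
```

For 2011 this predicts roughly 2.4 billion transistors, close to the largest chips in the table below.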
Processor | Transistor count | Date of introduction | Manufacturer | Process | Area |
---|---|---|---|---|---|
Intel 4004 | 2,300 | 1971 | Intel | 10 µm | 12 mm² |
Intel 8008 | 3,500 | 1972 | Intel | 10 µm | 14 mm² |
MOS Technology 6502 | 3,510 | 1975 | MOS Technology | | 21 mm² |
Motorola 6800 | 4,100 | 1974 | Motorola | | 16 mm² |
Intel 8080 | 4,500 | 1974 | Intel | 6 μm | 20 mm² |
RCA 1802 | 5,000 | 1974 | RCA | 5 μm | 27 mm² |
Intel 8085 | 6,500 | 1976 | Intel | 3 μm | 20 mm² |
Zilog Z80 | 8,500 | 1976 | Zilog | 4 μm | 18 mm² |
Motorola 6809 | 9,000 | 1978 | Motorola | 5 μm | 21 mm² |
Intel 8086 | 29,000 | 1978 | Intel | 3 μm | 33 mm² |
Intel 8088 | 29,000 | 1979 | Intel | 3 μm | 33 mm² |
Intel 80186 | 55,000 | 1982 | Intel | ||
Motorola 68000 | 68,000 | 1979 | Motorola | 4 μm | 44 mm² |
Intel 80286 | 134,000 | 1982 | Intel | 1.5 µm | 49 mm² |
Intel 80386 | 275,000 | 1985 | Intel | 1.5 µm | 104 mm² |
Intel 80486 | 1,180,000 | 1989 | Intel | 1 µm | 160 mm² |
Pentium | 3,100,000 | 1993 | Intel | 0.8 µm | 294 mm² |
AMD K5 | 4,300,000 | 1996 | AMD | 0.5 µm | |
Pentium II | 7,500,000 | 1997 | Intel | 0.35 µm | 195 mm² |
AMD K6 | 8,800,000 | 1997 | AMD | 0.35 µm | |
Pentium III | 9,500,000 | 1999 | Intel | 0.25 µm | |
AMD K6-III | 21,300,000 | 1999 | AMD | 0.25 µm | |
AMD K7 | 22,000,000 | 1999 | AMD | 0.25 µm | |
Pentium 4 | 42,000,000 | 2000 | Intel | 180 nm | |
Atom | 47,000,000 | 2008 | Intel | 45 nm | |
Barton | 54,300,000 | 2003 | AMD | 130 nm | |
AMD K8 | 105,900,000 | 2003 | AMD | 130 nm | |
Itanium 2 | 220,000,000 | 2003 | Intel | 130 nm | |
Cell | 241,000,000 | 2006 | Sony/IBM/Toshiba | 90 nm | |
Core 2 Duo | 291,000,000 | 2006 | Intel | 65 nm | |
AMD K10 | 463,000,000 | 2007 | AMD | 65 nm | |
AMD K10 | 758,000,000 | 2008 | AMD | 45 nm | |
Itanium 2 with 9MB cache | 592,000,000 | 2004 | Intel | 130 nm | |
Core i7 (Quad) | 731,000,000 | 2008 | Intel | 45 nm | 263 mm² |
Six-Core Xeon 7400 | 1,900,000,000 | 2008 | Intel | 45 nm | |
POWER6 | 789,000,000 | 2007 | IBM | 65 nm | 341 mm² |
Six-Core Opteron 2400 | 904,000,000 | 2009 | AMD | 45 nm | 346 mm² |
16-Core SPARC T3 | 1,000,000,000 | 2010 | Sun/Oracle | 40 nm | 377 mm² |
Core i7 (Gulftown) | 1,170,000,000 | 2010 | Intel | 32 nm | 240 mm² |
8-core POWER7 | 1,200,000,000 | 2010 | IBM | 45 nm | 567 mm² |
z196 | 1,400,000,000 | 2010 | IBM | 45 nm | 512 mm² |
Dual-Core Itanium 2 | 1,700,000,000 | 2006 | Intel | 90 nm | 596 mm² |
Tukwila | 2,000,000,000 | 2010 | Intel | 65 nm | 699 mm² |
Core i7 (Sandy Bridge-E) | 2,270,000,000 | 2011 | Intel | 32 nm | 434 mm² |
Nehalem-EX | 2,300,000,000 | 2010 | Intel | 45 nm | 684 mm² |
10-Core Xeon Westmere-EX | 2,600,000,000 | 2011 | Intel | 32 nm | 512 mm² |
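The table itself can be used to check the doubling period. A two-point estimate from the first and last Intel rows (Intel 4004, 1971 vs. 10-Core Xeon Westmere-EX, 2011) is only a rough check, not a fit over all rows:

```python
import math

# Two-point estimate of the doubling period from the table above:
# Intel 4004 (1971, 2,300 transistors) vs. 10-Core Xeon Westmere-EX
# (2011, 2,600,000,000 transistors).
doublings = math.log2(2_600_000_000 / 2_300)     # about 20.1 doublings
years = 2011 - 1971                              # over 40 years
print(f"doubling period = {years / doublings:.2f} years")  # prints 1.99
```

Forty years of data land almost exactly on Moore's revised two-year cycle.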
Why do we need Moore's law?
Begun as a simple observation, Moore’s Law has come to represent the amazing and seemingly inexhaustible capacity for exponential growth in electronics.<ref>P. K. Bondyopadhyay, “Moore’s Law governs the silicon revolution,” Proc. IEEE, vol. 86, no. 1, pp. 78–81, Jan. 1998.</ref> The historical regularity and predictability of Moore's Law produce organizing and coordinating effects throughout the semiconductor industry that not only set the pace of innovation but define the rules and very nature of competition. Since semiconductors make up an increasing portion of electronic components and systems, whether used directly by consumers or incorporated into end-use items, the impact of Moore's Law has led consumers to expect a continuous stream of faster, better, and cheaper high-technology products. As integrated circuit costs have decreased, they have made their way into modern products ranging from automobiles to greeting cards.<ref>http://download.intel.com/museum/Moores_Law/Printed_Materials/Moores_Law_Backgrounder.pdf</ref>
The drivers for technology development fall into two categories: push and pull. Push drivers are technology enablers, the things that make the technical improvements possible; Moore described the three push drivers as increasing chip area, decreasing feature size, and design cleverness. Pull drivers are the economic forces that create demand for those improvements, and for Moore’s Law they have been extraordinarily compelling. As its dimensions shrank, the transistor became smaller, lighter, and faster, consumed less power, and in most cases was more reliable, all of which made it more desirable for virtually every possible application. There is more: historically, the semiconductor industry has been able to manufacture silicon devices at an essentially constant cost per area of processed silicon. Thus, as devices shrank, they enjoyed a shrinking cost per transistor. Each step along the roadmap of Moore’s Law virtually guaranteed economic success.<ref>http://commonsenseatheism.com/wp-content/uploads/2011/12/Mack-Fifty-Years-of-Moores-Law.pdf</ref>
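The economics above can be made concrete: if cost per unit area of silicon stays roughly constant and the area of a transistor scales with the square of the feature size, then every halving of the feature size quarters the cost per transistor. A toy calculation, where both the cost figure and the area factor are arbitrary placeholders, not real fab data:

```python
# Constant cost per area + shrinking transistors => falling cost per transistor.
COST_PER_MM2 = 0.10  # hypothetical constant wafer-processing cost, $/mm^2

def cost_per_transistor(feature_size_um, transistor_area_factor=50.0):
    """Area per transistor modeled as a fixed multiple of feature size squared."""
    area_mm2 = transistor_area_factor * (feature_size_um / 1000.0) ** 2
    return COST_PER_MM2 * area_mm2

for f in (10.0, 1.0, 0.1):  # 10 um (Intel 4004 era) down to 100 nm
    print(f"{f} um: ${cost_per_transistor(f):.2e} per transistor")
```

Each 10x linear shrink yields a 100x drop in cost per transistor under these assumptions, which is the economic engine the paragraph describes.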
Beyond Moore's Law
There are a few new technologies that have the potential to change the underlying architecture of processors and extend performance gains past the theoretical limits of traditional transistors.
Do Transistor Counts Matter?
Moore's Law concerns only the doubling of transistors in the same die space every two years. While some of these new technologies deal directly with fitting more transistors into the same amount of space, others take a different approach to boosting overall computational performance. While not strictly following Moore's Law per se, these advanced designs will continue the increase in computational power that can be harnessed from hardware. They are included in the discussion to illustrate that performance is not necessarily dependent on the number of transistors that can be placed on a die. Novel approaches such as 3-D transistor manufacturing will allow for greater densities, while others, such as quantum computing, operate in a fundamentally different way than the traditional transistor to solve the same problems more efficiently.<ref>http://www.monolithic3d.com/2/post/2011/09/is-there-a-fundamental-limit-to-miniaturizing-cmos-transistors1.html</ref><ref>http://www.iue.tuwien.ac.at/phd/wittmann/node6.html</ref>
The Memristor
Currently being developed by Hewlett-Packard, the memristor is a new type of circuit element that relates electrical charge and magnetic flux. As current flows in one direction through the device, its resistance increases; reversing the flow decreases the resistance, and stopping the flow leaves the resistance in its current state. This structure allows for both data storage and data processing (logic gate construction). It is postulated that memristors could be layered in three dimensions on silicon, yielding data and transistor densities up to 1,000 times greater than currently available. HP has reported the ability to fit 100 GB in a square centimeter,<ref>http://www.eetimes.com/electronics-news/4076910/-Missing-link-memristor-created-Rewrite-the-textbooks</ref> and with the ability to layer memristors, this could lead to pocket devices with a capacity of over 1 petabyte.
Among the more advanced theoretical capabilities of memristors is the ability to store more than one state, which could enable analog computing. Memristor technology may also provide an excellent architecture for synaptic modeling and self-learning systems.
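The behavior described above, resistance that drifts with current direction and holds its value when current stops, can be sketched with a simple linear ion-drift model. The parameter values here are arbitrary illustrations, not HP's published device data:

```python
# Linear ion-drift memristor sketch: an internal state w in [0, 1] tracks the
# integrated charge, and resistance interpolates between R_on and R_off.
R_ON, R_OFF = 100.0, 16_000.0    # illustrative resistances, in ohms
K = 1e4                          # illustrative drift rate, 1/(A*s)

def step(w, current, dt):
    """Advance the internal state by one time step; clamp it to [0, 1]."""
    return min(1.0, max(0.0, w + K * current * dt))

def resistance(w):
    return w * R_OFF + (1.0 - w) * R_ON

w = 0.5
r0 = resistance(w)               # mid-state resistance
for _ in range(100):
    w = step(w, 1e-4, 1e-3)      # forward current raises the resistance...
print(r0, resistance(w))
w = step(w, 0.0, 1e-3)           # ...reversed current would lower it, and
                                 # zero current holds the state (non-volatile)
```

The non-volatile state is what makes the device usable for storage, while the continuous range of resistance values is what suggests analog and synaptic applications.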
Quantum Computing
Quantum computing works by allowing all available bits to enter superposition. Using this superposition, each quantum bit ("qubit") can be entangled with other qubits to represent multiple states at once. By using quantum logic gates, the qubits can be manipulated to find the desired state among the superposition of states. This has great potential for drastically shortening the time needed to solve several important problems, including integer factorization and the discrete logarithm problem, on which much current encryption is based. Quantum computing faces several technical issues, including decoherence, which make quantum computers difficult to construct and maintain.
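Superposition and entanglement can be illustrated with small state vectors; a qubit is a unit vector in C², and gates are unitary matrices. A minimal NumPy sketch, which of course simulates the math classically rather than running on quantum hardware:

```python
import numpy as np

# A qubit is a unit vector in C^2; |0> and |1> are the basis states.
ket0 = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: maps |0> to (|0> + |1>)/sqrt(2), an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
print(np.abs(state) ** 2)       # Born rule: 50/50 measurement probabilities

# Entanglement: Hadamard on one qubit, then CNOT, gives a Bell state
# (|00> + |11>)/sqrt(2) -- the two qubits are perfectly correlated.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(state, ket0)
print(np.abs(bell) ** 2)        # only |00> and |11> have nonzero probability
```

An n-qubit register is a vector of 2^n amplitudes, which is why classical simulation becomes intractable quickly and why quantum hardware is attractive.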
Ballistic Deflection Transistors
Another promising avenue is a redesign of the traditional transistor. In a ballistic deflection transistor, single electrons are passed through the device and deflected into one path or the other, thus delivering a 0 or a 1. The theoretical speed of these transistors is in the terahertz range.<ref>http://www.rochester.edu/news/show.php?id=2585</ref>
Other Technologies
The arena of research to produce an alternative to the traditional transistor includes many novel approaches. They include (but are not limited to):
- Optical Computing
- DNA Computing
- Molecular Electronics
- Spintronics
- Chemical Computing
- Artificial Neural Networks
- Unconventional Computing
Conclusions
The demise of Moore's Law has been predicted several times during the past 40 years, but transistor counts continue to double every two years on average. With the traditional transistor approach, physical limits will inevitably be reached around the 16 nm process node, due to quantum tunneling.<ref>http://news.cnet.com/2100-1008-5112061.html</ref> If this is true, the current pace of innovation would lead to hitting "Moore's Wall" around 2022, or in about 10 years. This "10-year horizon" for Moore's Law has existed since the early 1990s, with new designs, processes, and breakthroughs continually extending the timeline.<ref>http://arxiv.org/pdf/astro-ph/0404510v2.pdf</ref><ref>http://java.sys-con.com/node/557154</ref> New technologies that leverage three-dimensional chip architecture would allow for years of continued growth in transistor counts, and exotic designs could further increase the theoretical capacity of transistors in a given space. If the past is a predictor of future trends, it is safe to say that the end of Moore's Law "is about 10 years away".
On another note, if we relax the definition of Moore's Law to include computational performance gains, we open a whole new avenue by which to measure computing power. Most of the easy gains in performance related to transistor counts have been realized, but new designs for how basic computing is performed can theoretically yield large increases in performance without a doubling of transistor counts or extremely high power requirements.<ref>http://abcnews.go.com/Technology/story?id=4006166&page=1#.TzAbDcVA_H4</ref><ref>http://www.gotw.ca/publications/concurrency-ddj.htm</ref> The era of the traditional transistor is not quite over yet, but the relevance of transistor counts may be nearing its end.
References
<references/>