Moore’s Law is a term used in the semiconductor industry to guide long-term planning as well as research and development. It is named after Gordon E. Moore, the co-founder of Intel Corporation, who first made the observation in 1965. The law, a prediction or rather an observation, states that the number of transistors on an integrated circuit had doubled roughly every year since the integrated circuit was invented; Moore later revised the pace, in 1975, to a doubling every two years. In simple terms, this means the processing power and speed of computers double roughly every two years. This remarkable observation was preceded by a number of inventions and innovations by scientists and engineers that played a critical role in the advancement of integrated circuits. For example, the invention and advancement of complementary metal-oxide-semiconductor (CMOS) technology made possible the extremely dense, high-performance ICs in use today.
[Figure: Chip]
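The doubling described above is simple exponential growth, and it can be sketched in a few lines of Python. The starting point below is the Intel 4004 with roughly 2,300 transistors; the two-year doubling period follows Moore's revised prediction, and the projected figures are illustrative only, not actual historical transistor counts.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count, assuming a doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Ten years at a two-year doubling period means five doublings (a 32x increase).
for y in (1971, 1981, 1991, 2001):
    print(y, round(transistors(y)))
```

Even this toy model makes the striking point of Moore's Law clear: a fixed doubling period turns a few thousand transistors into tens of millions within three decades.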
With the possibility of putting millions, and now billions, of these tiny switches (transistors) on a single chip, computing power and speed have reached unprecedented levels. The invention of DRAM (Dynamic Random Access Memory) made it possible to fabricate single-transistor memory cells, and it is the invention of DRAM and flash memory that made feasible the low-cost, high-capacity memory we have today. As the number of transistors on a microprocessor chip has kept increasing since the Intel 4004 in 1971, the question arises: how long before we reach the limits of miniaturization of these tiny switches otherwise known as transistors? Or could there be something revolutionary in the pipeline?
