History of nanotechnology
Although nanotechnology is a relatively recent development in scientific research, its central concepts took shape over a longer period of time.

Overview
In 1965, Gordon Moore, one of the founders of Intel Corporation, made the astounding prediction that the number of transistors that could be fit in a given area would double every 18 months for the next ten years. It did, and the phenomenon became known as Moore's Law. This trend has continued far past the predicted ten years to this day, going from just over 2,000 transistors in the original 4004 processor of 1971 to over 40,000,000 transistors in the Pentium 4. There has, of course, been a corresponding decrease in the size of individual electronic elements, going from millimeters in the 1960s to hundreds of nanometers in modern circuitry.
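As a rough sanity check of the figures above, one can compute the doubling time implied by the quoted transistor counts. The sketch below is illustrative only; the counts of roughly 2,300 (Intel 4004, 1971) and 42,000,000 (Pentium 4, 2000) are commonly cited values assumed here, not figures taken from this article's sources.

```python
import math

# Commonly cited transistor counts (assumed values, not from this article's sources)
n_4004, year_4004 = 2_300, 1971        # Intel 4004
n_p4, year_p4 = 42_000_000, 2000       # Pentium 4 (Willamette)

doublings = math.log2(n_p4 / n_4004)               # about 14.2 doublings
months = 12 * (year_p4 - year_4004) / doublings    # about 25 months per doubling
print(f"{doublings:.1f} doublings, one every {months:.0f} months")
```

The result, about one doubling every two years, is close to Moore's own revised 1975 estimate and somewhat slower than the 18-month figure quoted above.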
At the same time, the chemistry, biochemistry and molecular genetics communities have been moving in the other direction. Over much the same period, it has become possible to direct the synthesis, either in the test tube or in modified living organisms, of ever larger and more complex molecules. Finally, the last quarter of a century has seen tremendous advances in our ability to control and manipulate light. We can generate light pulses as short as a few femtoseconds (1 fs = 10⁻¹⁵ s). Light, too, has a size, set by its wavelength, and for visible light this is also on the hundred-nanometer scale.
Thus, at the beginning of a new century, three powerful technologies have met on a common scale — the nanoscale — with the promise of revolutionizing both the world of electronics and that of biology. This new field, sometimes referred to as biomolecular nanotechnology, holds many possibilities, from fundamental research in molecular biology and biophysics to applications in biosensing, biocontrol, bioinformatics, genomics, medicine, computing, information storage and energy conversion.
Historical background
Humans have unwittingly employed nanotechnology for thousands of years, for example in making steel, in painting, and in vulcanizing rubber. [http://www.deccanherald.com/Content/Jan62008/national2008010645193.asp?section=updatenews Indian craftsmen, artisans used nanotech 2000 years ago] Each of these processes relies on the properties of stochastically formed atomic ensembles mere nanometers in size, and is distinguished from chemistry in that it does not depend on the properties of individual molecules. The development of the body of concepts now subsumed under the term nanotechnology, however, has been slower.

The first mention of some of the distinguishing concepts of nanotechnology (though predating use of that name) was in 1867 by James Clerk Maxwell, when he proposed as a thought experiment a tiny entity, known as Maxwell's demon, able to handle individual molecules.

The first observations and size measurements of nanoparticles were made during the first decade of the 20th century. They are mostly associated with Richard Adolf Zsigmondy, who made a detailed study of gold sols and other nanomaterials with sizes down to 10 nm and below, and who published a book on the subject in 1914. [Zsigmondy, R. "Colloids and the Ultramicroscope", J. Wiley and Sons, NY (1914)] He used an ultramicroscope, which employs a dark-field method, to see particles much smaller than the wavelength of light. Zsigmondy was also the first to use the nanometer explicitly for characterizing particle size, defining it as 1/1,000,000 of a millimeter, and he developed the first system of classification based on particle sizes in the nanometer range.

There were many significant developments during the 20th century in characterizing nanomaterials and related phenomena, belonging to the field of interface and colloid science. In the 1920s, Irving Langmuir and Katharine B. Blodgett introduced the concept of a monolayer, a layer of material one molecule thick; Langmuir won a Nobel Prize in Chemistry for his work. In the early 1950s, Derjaguin and Abrikosova conducted the first measurements of surface forces. [Derjaguin, B.V., "Discuss. Faraday Soc.", No. 18, 24-27, 182-187, 198, 211, 215-219 (1954)] There have also been many studies of periodic colloidal structures and of the principles of molecular self-assembly, which are reviewed by Efremov. [Efremov, I.F. "Periodic Colloidal Structures", in "Surface and Colloid Science", vol. 8, Wiley, NY (1975)] Many other discoveries that serve as the scientific basis for modern nanotechnology can be found in Lyklema's "Fundamentals of Interface and Colloid Science". [Lyklema, J. "Fundamentals of Interface and Colloid Science", vol. 1-5, Academic Press (1995-2000)]

Conceptual origins
The topic of nanotechnology was again touched upon by "There's Plenty of Room at the Bottom," a talk given by physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and van der Waals attraction would become more important. This basic idea appears feasible, and exponential assembly enhances it with parallelism to produce a useful quantity of end products. At the meeting, Feynman announced two challenges, offering a prize of $1,000 to the first individual to solve each one. The first challenge involved the construction of a tiny electric motor, which, to Feynman's surprise, was achieved by November 1960 by William McLellan. The second challenge involved scaling down letters far enough to fit the entire Encyclopedia Britannica on the head of a pin; this prize was claimed in 1985 by Tom Newman. [Gribbin, John. "Richard Feynman: A Life in Science", Dutton, 1997, p. 170]

In 1965 Gordon Moore observed that silicon transistors were undergoing a continual process of scaling downward, an observation which was later codified as Moore's law. Since his observation, transistor minimum feature sizes have decreased from 10 micrometers to the 45-65 nm range as of 2007; one minimum feature is thus roughly 180 silicon atoms long.
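The "roughly 180 atoms" figure can be reproduced with a back-of-the-envelope calculation. The sketch below is illustrative and assumes an atom-to-atom spacing in silicon of about 0.25 nm (the Si-Si bond length is roughly 0.235 nm); neither number is taken from this article's sources.

```python
# Back-of-the-envelope check of the "roughly 180 atoms" figure.
feature_nm = 45.0    # minimum feature size, small end of the 45-65 nm range
spacing_nm = 0.25    # assumed silicon interatomic spacing, nm (~Si-Si bond length)

print(f"~{feature_nm / spacing_nm:.0f} silicon atoms across one minimum feature")  # ~180
```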
The term "nanotechnology" was first defined by Norio Taniguchi of the Tokyo University of Science in a 1974 paper [Norio Taniguchi, "On the Basic Concept of 'Nano-Technology'," Proc. Intl. Conf. Prod. Eng. Tokyo, Part II, Japan Society of Precision Engineering, 1974] as follows: "'Nano-technology' mainly consists of the processing of separation, consolidation, and deformation of materials by one atom or one molecule." Since that time the definition of nanotechnology has generally been extended to include features as large as 100 nm. Additionally, the idea that nanotechnology embraces structures exhibiting quantum mechanical aspects, such as quantum dots, has further evolved its definition. Also in 1974, the process of [http://www.fmnt.fi/index.pl?id=2515 atomic layer deposition], for depositing uniform thin films one atomic layer at a time, was developed and patented by [http://www.fmnt.fi/index.pl?id=2913&isa=Category&op=show Dr. Tuomo Suntola and co-workers in Finland].
In the 1980s the idea of nanotechnology as deterministic, rather than stochastic, handling of individual atoms and molecules was conceptually explored in depth by Dr. K. Eric Drexler, who promoted the technological significance of nanoscale phenomena and devices through speeches and the books "Engines of Creation: The Coming Era of Nanotechnology" (1986) and "[http://www.e-drexler.com/d/06/00/Nanosystems/toc.html Nanosystems: Molecular Machinery, Manufacturing, and Computation]" (ISBN 0-471-57518-6). Drexler's vision of nanotechnology is often called "Molecular Nanotechnology" (MNT) or "molecular manufacturing," and Drexler at one point proposed the term "zettatech," which never became popular.

In 2004 [http://www.softmachines.org/wordpress/ Richard Jones] published "Soft Machines (Nanotechnology and Life)", a book about nanotechnology for the general reader, with Oxford University Press. In it he describes radical nanotechnology, the deterministic, mechanistic idea of nano-engineered machines associated with Drexler, as failing to take into account nanoscale challenges such as wetness, stickiness, Brownian motion and high viscosity. He also explains soft, or more appropriately biomimetic, nanotechnology, which he argues is the most promising way to design functional nanodevices that can cope with all the problems at the nanoscale. One can think of soft nanotechnology as the development of nanomachines that uses lessons learned from biology on how things work, chemistry to precisely engineer such devices, and stochastic physics to model the system and its natural processes in detail.

Experimental advances
Nanotechnology and nanoscience got a boost in the early 1980s with two major developments: the birth of cluster science and the invention of the scanning tunneling microscope (STM). These developments led to the discovery of fullerenes in 1985 and the structural assignment of carbon nanotubes a few years later. In another development, the synthesis and properties of semiconductor nanocrystals were studied, leading to a rapidly increasing number of studies of semiconductor nanoparticles, or quantum dots.

In the early 1990s Huffman, of the University of Arizona, and Krätschmer, of the Max Planck Institute, discovered how to synthesize and purify large quantities of fullerenes. This opened the door to their characterization and functionalization by hundreds of investigators in government and industrial laboratories. Shortly after, rubidium-doped C60 was found to be a mid-temperature (Tc = 32 K) superconductor. At a meeting of the Materials Research Society in 1992, Dr. T. Ebbesen (NEC) described to a spellbound audience his discovery and characterization of carbon nanotubes. This event sent those in attendance, and others downwind of his presentation, into their laboratories to reproduce and push those discoveries forward. Using the same or similar tools as those used by Huffman and Krätschmer, hundreds of researchers further developed the field of nanotube-based nanotechnology.
As of 2007, the practice of nanotechnology embraces both stochastic approaches (in which, for example, supramolecular chemistry creates waterproof pants) and deterministic approaches, wherein single molecules (created by stochastic chemistry) are manipulated on substrate surfaces (created by stochastic deposition methods) by deterministic methods such as nudging them with STM or AFM probes and causing simple binding or cleavage reactions to occur. The dream of a complex, deterministic molecular nanotechnology remains elusive. Since the mid-1990s, thousands of surface scientists and thin-film technologists have latched on to the nanotechnology bandwagon and redefined their disciplines as nanotechnology. This has caused much confusion in the field and has spawned thousands of "nano"-papers in the peer-reviewed literature. Most of these reports are extensions of more ordinary research done in the parent fields.

For the future, some means has to be found for MNT design evolution at the nanoscale which mimics the process of biological evolution at the molecular scale. Biological evolution proceeds by random variation in ensemble averages of organisms, combined with culling of the less-successful variants and reproduction of the more-successful variants; macroscale engineering design also proceeds by a process of design evolution from simplicity to complexity, as set forth somewhat satirically by John Gall: "A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a system that works." Such design evolution at the nanoscale would mimic the blind-watchmaker process of biological evolution [Richard Dawkins, "The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe Without Design", W. W. Norton, reissue edition (September 19, 1996)], comprising random molecular variation and deterministic reproduction/extinction.