An Overview of Nanotechnology
INTRODUCTION
Nanotechnology is an anticipated manufacturing
technology giving thorough, inexpensive control of the structure of
matter. The term has sometimes been used to refer to any technique
able to work at a submicron scale; here on sci.nanotech we are
interested in what is sometimes called molecular nanotechnology, which
means basically "A place for every atom and every atom in its place."
(Other terms, such as molecular engineering and molecular
manufacturing, are also often applied.)
Molecular manufacturing will enable the construction of
giga-ops computers smaller than a cubic micron; cell repair machines;
personal manufacturing and recycling appliances; and much more.
NANOTECHNOLOGY
Broadly speaking, the central thesis of nanotechnology
is that almost any chemically stable structure that can be specified
can in fact be built. This possibility was first advanced by Richard
Feynman in 1959 [4] when he said: "The principles of physics, as far
as I can see, do not speak against the possibility of maneuvering
things atom by atom." (Feynman won the 1965 Nobel prize in physics).
This concept is receiving increasing attention in the
research community. There have been two international conferences
directly on molecular nanotechnology[30,31] as well as a broad range
of conferences on related subjects. Science [23, page 26] said "The
ability to design and manufacture devices that are only tens or
hundreds of atoms across promises rich rewards in electronics,
catalysis, and materials. The scientific rewards should be just as
great, as researchers approach an ultimate level of control -
assembling matter one atom at a time." "Within the decade, [John]
Foster [at IBM Almaden] or some other scientist is likely to learn how
to piece together atoms and molecules one at a time using the STM
[Scanning Tunnelling Microscope]."
Eigler and Schweizer[25] at IBM reported on "...the use
of the STM at low temperatures (4 K) to position individual xenon
atoms on a single-crystal nickel surface with atomic precision. This
capacity has allowed us to fabricate rudimentary structures of our own
design, atom by atom. The processes we describe are in principle
applicable to molecules also. ..."
ASSEMBLERS
Drexler[1,8,11,19,32] has proposed the "assembler", a
device having a submicroscopic robotic arm under computer control. It
will be capable of holding and positioning reactive compounds in order
to control the precise location at which chemical reactions take
place. This general
approach should allow the construction of large atomically precise
objects by a sequence of precisely controlled chemical reactions,
building objects molecule by molecule. If designed to do so,
assemblers will be able to build copies of themselves, that is, to
replicate.
Because they will be able to copy themselves, assemblers
will be inexpensive. We can see this by recalling that many other
products of molecular machines--firewood, hay, potatoes--cost very
little. By working in large teams, assemblers and more specialized
nanomachines will be able to build objects cheaply. By ensuring that
each atom is properly placed, they will manufacture products of high
quality and
reliability. Left-over molecules would be subject to this strict
control as well, making the manufacturing process extremely clean.
Ribosomes
The plausibility of this approach can be illustrated by
the ribosome. Ribosomes manufacture all the proteins used in all
living things on this planet. A typical ribosome is relatively small
(a few thousand cubic nanometers) and is capable of building almost
any protein by stringing together amino acids (the building blocks of
proteins) in a
precise linear sequence. To do this, the ribosome has a means of
grasping a specific amino acid (more precisely, it has a means of
selectively grasping a specific transfer RNA, which in turn is
chemically bonded by a specific enzyme to a specific amino acid), of
grasping the growing polypeptide, and of causing the specific amino
acid to react with and be added to the end of the polypeptide[9].
The instructions that the ribosome follows in building a
protein are provided by mRNA (messenger RNA). This is a polymer formed
from the four bases adenine, cytosine, guanine, and uracil. A sequence
of several hundred to a few thousand such bases codes for a specific
protein. The ribosome "reads" this "control tape" sequentially, and
acts on the directions it provides.
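To make the "control tape" analogy concrete, here is a minimal
sketch in Python of sequential codon reading. The codon table is a
small excerpt of the standard genetic code; the sample mRNA string is
hypothetical.

    # Sketch: the ribosome "reads" mRNA three bases at a time and adds
    # the corresponding amino acid to the growing polypeptide.
    CODON_TABLE = {
        "AUG": "Met", "UUU": "Phe", "GGC": "Gly",
        "UGC": "Cys", "GAA": "Glu", "UAA": "STOP",
    }

    def translate(mrna):
        peptide = []
        for i in range(0, len(mrna) - 2, 3):  # sequential, codon by codon
            residue = CODON_TABLE[mrna[i:i + 3]]
            if residue == "STOP":             # stop codon ends the chain
                break
            peptide.append(residue)
        return "-".join(peptide)

    print(translate("AUGUUUGGCUGCGAAUAA"))    # Met-Phe-Gly-Cys-Glu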
Assemblers
In an analogous fashion, an assembler will build an
arbitrary molecular structure following a sequence of instructions.
The assembler, however, will provide three-dimensional positional and
full orientational control over the molecular component (analogous to
the individual amino acid) being added to a growing complex molecular
structure (analogous to the growing polypeptide). In addition, the
assembler will be able to form any one of several different kinds of
chemical bonds, not just the single kind (the peptide bond) that the
ribosome makes.
Calculations indicate that an assembler need not
inherently be very large. Enzymes "typically" weigh about 10^5 amu
(atomic mass units), while the ribosome itself is about 3 x 10^6
amu[9]. The smallest assembler might be a factor of ten or so larger
than a ribosome. Current design ideas for an assembler are somewhat
larger than this: cylindrical "arms" about 100 nanometers in length
and 30 nanometers in diameter, rotary joints to allow arbitrary
positioning of the tip of the arm, and a worst-case positional
accuracy at the tip of perhaps 0.1 to 0.2 nanometers, even in the
presence of thermal noise. Even a
solid block of diamond as large as such an arm weighs only about
1.5 x 10^8 amu, so we can safely conclude that a hollow arm of such
dimensions would weigh considerably less. Six such arms would weigh
less than 10^9 amu.
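These figures can be checked with a back-of-the-envelope calculation;
the sketch below uses the standard density of diamond (about 3.51
g/cm^3).

    import math

    # Mass of a solid diamond cylinder the size of one arm:
    # 100 nm long, 30 nm in diameter.
    length_nm, diameter_nm = 100.0, 30.0
    volume_nm3 = math.pi * (diameter_nm / 2) ** 2 * length_nm  # ~7.1e4 nm^3

    density_g_cm3 = 3.51       # density of diamond
    amu_per_gram = 6.022e23    # 1 gram = Avogadro's number of amu
    nm3_per_cm3 = 1e21

    mass_amu = volume_nm3 / nm3_per_cm3 * density_g_cm3 * amu_per_gram
    print(f"{mass_amu:.2g} amu")  # ~1.5e8 amu for the solid block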
Molecular Computers
The assembler requires a detailed sequence of control
signals, just as the ribosome requires mRNA to control its actions.
Such detailed control signals can be provided by a computer. A
feasible design for a molecular computer has been presented by
Drexler[2,11]. This design is mechanical in nature, and is based on
sliding rods that interact by blocking or unblocking each other at
"locks." This design has a size
of about 5 cubic nanometers per "lock" (roughly equivalent to a single
logic gate). Quadrupling this size to 20 cubic nanometers (to allow
for power, interfaces, and the like) and assuming that we require a
minimum of 10^4 "locks" to provide minimal control results in a volume
of 2 x 10^5 cubic nanometers (0.0002 cubic microns) for the
computational element. (This many gates is sufficient to build a
simple 4-bit or 8-bit general purpose computer, e.g. a 6502).
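The arithmetic, as a small sketch (numbers taken from the text above):

    # Volume budget for a minimal rod-logic control computer.
    lock_volume_nm3 = 5      # one "lock", roughly one logic gate
    overhead_factor = 4      # allowance for power, interfaces, etc.
    n_locks = 10**4          # minimal control

    total_nm3 = lock_volume_nm3 * overhead_factor * n_locks
    print(total_nm3, "nm^3 =", total_nm3 / 1e9, "cubic microns")
    # -> 200000 nm^3 = 0.0002 cubic microns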
An assembler might have a kilobyte of high-speed
(rod-logic based) RAM (similar to the amount of RAM used in a modern
one-chip computer) and 100 kilobytes of slower but more dense "tape"
storage - this tape storage would have a mass of 10^8 amu or less
(roughly 10 atoms per bit). Some additional mass will be
used for communications (sending and receiving signals from other
computers) and power. In addition, there will probably be a "toolkit"
of interchangeable tips
that can be placed at the ends of the assembler's arms. When
everything is added up a small assembler, with arms, computer,
"toolkit," etc. should weigh less than 10^9 amu.
Escherichia coli (a common bacterium) weighs about 10^12
amu[9, page 123]. Thus, an assembler should be much larger than a
ribosome, but much smaller than a bacterium.
Self-Replicating Systems
It is also interesting to compare Drexler's architecture
for an assembler with the Von Neumann architecture for a
self-replicating device. Von Neumann's "universal constructing
automaton"[21] had both a universal Turing machine to control its
functions and a "constructing arm" to build the "secondary automaton."
The constructing arm can be positioned in a two-dimensional plane, and
the "head" at the end of the constructing arm is used to build the
desired structure. While Von Neumann's construction was theoretical
(existing
in a two-dimensional cellular automaton world), it still embodied many
of the critical elements that now appear in the assembler.
Should we be concerned about runaway replicators? It
would be hard to build a machine with the wonderful adaptability of
living organisms. The replicators easiest to build will be inflexible
machines, like automobiles or industrial robots, and will require
special fuels and raw materials, the equivalents of hydraulic fluid
and gasoline. To
build a runaway replicator that could operate in the wild would be
like building a car that could go off-road and fuel itself from tree
sap. With enough work, this should be possible, but it will hardly
happen by accident. Without replication, accidents would be like those
of industry today: locally harmful, but not catastrophic to the
biosphere. Catastrophic problems seem more likely to arise through
deliberate misuse, such as the use of nanotechnology for military
aggression.
Positional Chemistry
Chemists have been remarkably successful at synthesizing
a wide range of compounds with atomic precision. The structures they
synthesize, however, are usually small (with the notable exception of
various polymers). Thus, we know that a wide range of atomically
precise structures of perhaps a few hundred atoms each are
quite feasible. Larger atomically precise structures with complex
three-dimensional shapes can be viewed as a connected sequence of
small atomically precise structures. While chemists have the ability
to precisely sculpt small collections of atoms there is currently no
ability to extend this capability in a general way to structures of
larger size. An obvious structure of considerable scientific and
economic interest is the computer. The ability to manufacture a
computer from atomically precise logic elements of molecular size, and
to position those logic elements into a three-dimensional volume with
a highly precise and intricate interconnection pattern would have
revolutionary consequences for the computer industry.
A large atomically precise structure, however, can be
viewed as simply a collection of small atomically precise objects
which are then linked together. To build a truly broad range of large
atomically precise objects requires the ability to create highly
specific positionally controlled bonds. A variety of highly flexible
synthetic techniques have been considered in [32]. We shall describe
two such methods here
to give the reader a feeling for the kinds of methods that will
eventually be feasible.
We assume that positional control is available and that
all reactions take place in a hard vacuum. The use of a hard vacuum
allows highly reactive intermediate structures to be used, e.g., a
variety of radicals with one or more dangling bonds. Because the
intermediates are in a vacuum, and because their position is
controlled (as opposed to solutions, where the position and
orientation of a molecule are largely random), such radicals will not
react with the wrong thing for the very simple reason that they will
not come into contact with the wrong thing.
Normal solution-based chemistry offers a smaller range
of controlled synthetic possibilities. For example, highly reactive
compounds in solution will promptly react with the solvent. In
addition, because positional control is not provided, compounds
randomly collide with other compounds. Any reactive compound will
collide randomly and react randomly with anything available.
Solution-based chemistry requires
extremely careful selection of compounds that are reactive enough to
participate in the desired reaction, but sufficiently non-reactive
that they do not accidentally participate in an undesired side
reaction. Synthesis under these conditions is somewhat like placing
the parts of a radio into a box, shaking, and pulling out an assembled
radio. The ability of chemists to synthesize what they want under
these conditions is amazing.
Much of current solution-based chemical synthesis is
devoted to preventing unwanted reactions. With assembler-based
synthesis, such prevention is a virtually free by-product of
positional control.
To illustrate positional synthesis in vacuum somewhat
more concretely, let us suppose we wish to bond two compounds, A and
B. As a first step, we could utilize positional control to selectively
abstract a specific hydrogen atom from compound A. To do this, we
would employ a radical that had two spatially distinct regions: one
region would have a high affinity for hydrogen while the other region
could be built
into a larger "tip" structure that would be subject to positional
control. A simple example would be the 1-propynyl radical, which
consists of three collinear carbon atoms, with three hydrogen atoms
bonded to the sp3 carbon at the "base" end. The carbon at the radical
end is triply bonded to the middle carbon, which in turn is
singly bonded to the base carbon. In a real abstraction tool, the base
carbon would be bonded to other carbon atoms in a larger diamondoid
structure which provides positional control, and the tip might be
further stabilized by a surrounding "collar" of unreactive atoms
attached near the base that would prevent lateral motions of the
reactive tip.
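Schematically, the reactive tip is

    H3C-C#C*

where "#" stands for the triple bond and "*" for the unpaired electron
on the radical carbon at the tip.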
The affinity of this structure for hydrogen is quite
high. Propyne (the same structure but with a hydrogen atom bonded to
the "radical" carbon) has a hydrogen-carbon bond dissociation energy
in the vicinity of 132 kilocalories per mole. As a consequence, a
hydrogen atom will prefer being bonded to the 1-propynyl hydrogen
abstraction tool rather than to almost any other structure. By
positioning the hydrogen abstraction tool over a specific hydrogen
atom on compound A, we can perform a site-specific hydrogen
abstraction reaction. This requires positional accuracy of roughly a
bond length (to prevent abstraction of an adjacent hydrogen). Quantum
chemical analysis of this reaction by Musgrave et al.[41] shows that
the activation energy for this reaction is low, and that for the
abstraction of hydrogen from the hydrogenated diamond (111) surface
(modeled by isobutane) the barrier is very likely zero.
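For comparison, the sketch below uses approximate C-H bond
dissociation energies (rounded literature values) to estimate how
exothermic abstraction by the tool is from typical substrates.

    # Approximate C-H bond dissociation energies, kcal/mol (rounded
    # literature values), versus the ~132 kcal/mol of the tool.
    TOOL_BDE = 132  # propyne's acetylenic C-H bond

    substrate_bde = {
        "methane":              105,
        "isobutane (tertiary)":  96,
        "benzene":              113,
    }
    for name, bde in substrate_bde.items():
        # energy released when the tool pulls a hydrogen off the substrate
        print(f"{name:22s} exothermic by ~{TOOL_BDE - bde} kcal/mol")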
Having once abstracted a specific hydrogen atom from
compound A, we can repeat the process for compound B. We can now join
compound A to compound B by positioning the two compounds so that the
two dangling bonds are adjacent to each other, and allowing them to
bond.
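In schematic form, writing Tool* for the tip radical:

    Tool* + H-A  -->  Tool-H + A*     (site-specific abstraction)
    Tool* + H-B  -->  Tool-H + B*
    A* + B*      -->  A-B             (radical coupling)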
This illustrates a reaction using a single radical. With
positional control, we could also use two radicals simultaneously to
achieve a specific objective. Suppose, for example, that two atoms A1
and A2, which are part of some larger molecule, are bonded to each
other. If we were to position the two radicals X1 and X2 adjacent to
A1 and A2, respectively, then a bonding structure of much lower free
energy would be one in which the A1-A2 bond was broken, and two new
bonds A1-X1 and
A2-X2 were formed. Because this reaction involves breaking one bond
and making two bonds (i.e., the reaction product is not a radical and
is chemically stable) the exact nature of the radicals is not
critical. Breaking one bond to form two bonds is a favored reaction
for a wide range of cases. Thus, the positional control of two
radicals can be used to break any of a wide range of bonds.
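In schematic form, with * marking the unpaired electrons:

    X1* + A1-A2 + *X2  -->  A1-X1 + A2-X2

Net: one bond broken, two bonds formed.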
A range of other reactions involving a variety of
reactive intermediate compounds (carbenes are among the more
interesting ones) is proposed in [32], along with the results of
semi-empirical and ab initio quantum calculations and the available
experimental evidence.
Another general principle that can be employed with
positional synthesis is the controlled use of force. Activation
energy, normally provided by thermal energy in conventional chemistry,
can also be provided by mechanical means. Pressures of 1.7 megabars
have been achieved experimentally in macroscopic systems[43]. At the
molecular
level such pressure corresponds to forces that are a large fraction of
the force required to break a chemical bond. A molecular vise made of
hard diamond-like material with a cavity designed with the same
precision as the reactive site of an enzyme can provide activation
energy by the extremely precise application of force, thus causing a
highly specific reaction between two compounds.
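A rough sketch of the magnitude involved; the bond-scale area of
(0.2 nm)^2 is an illustrative assumption.

    # Force implied by a 1.7 megabar pressure acting on a bond-sized area.
    pressure_pa = 1.7e6 * 1e5      # 1.7 megabars; 1 bar = 1e5 Pa
    area_m2 = (0.2e-9) ** 2        # roughly one bond length squared

    force_n = pressure_pa * area_m2
    print(f"{force_n * 1e9:.1f} nN")   # ~6.8 nN; covalent bonds rupture
                                       # at forces of a few nanonewtons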
To achieve the low activation energy needed in reactions
involving radicals requires little force, allowing a wider range of
reactions to be caused by simpler devices (e.g., devices that are able
to generate only small force). Further analysis is provided in [32].
Feynman said: "The problems of chemistry and biology can
be greatly helped if our ability to see what we are doing, and to do
things on an atomic level, is ultimately developed - a development
which I think cannot be avoided." Drexler has provided the substantive
analysis required before this objective can be turned into a reality.
We are nearing an era when we will be able to build virtually any
structure
that is specified in atomic detail and which is consistent with the
laws of chemistry and physics. This has substantial implications for
future medical technologies and capabilities.
Cost
One consequence of the existence of assemblers is that
they are cheap. Because an assembler can be programmed to build almost
any structure, it can in particular be programmed to build another
assembler. Thus, self-reproducing assemblers should be feasible, and in
consequence the manufacturing costs of assemblers would be primarily
the cost of the raw materials and energy required in their
construction. Eventually (after amortization of possibly quite high
development costs), the
price of assemblers (and of the objects they build) should be no
higher than the price of other complex structures made by
self-replicating systems. Potatoes - which have a staggering design
complexity involving tens of thousands of different genes and
different proteins directed by many megabits of genetic information -
cost well under a dollar per pound.
PATHWAYS TO NANOTECHNOLOGY
The three paths of protein design (biotechnology),
biomimetic chemistry, and atomic positioning are parts of a broad
bottom-up strategy: working at the molecular level to increase our
ability to control matter. Traditional miniaturization efforts based
on microelectronics technology have reached the submicron scale; these
can be characterized as the top-down strategy. The bottom-up strategy,
however, seems more promising.
INFORMATION
More information on nanotechnology can be found in these
books (all by K. Eric Drexler, some with co-authors):
Engines of Creation (Anchor, 1986) ISBN: 0-385-19972-2
This book defined the original charter of
sci.nanotech. Popularly written, it introduces assemblers, and
discusses the various social and technical implications nanotechnology
might have.
Unbounding the Future (Morrow, 1991) ISBN: 0-688-09124-5
Essentially an update of Engines, with a better
low-level description of how nanomachines might work, and less
speculation on space travel, cryonics, etc.
Nanosystems (Wiley, 1992) ISBN: 0-471-57518-6
This is the technical book that grew out of Drexler's
PhD thesis. It is a real tour de force that provides a substantial
theoretical background for nanotech ideas.
The Foresight Institute publishes on both technical and
nontechnical issues in nanotechnology. For example, students may write
for their free Briefing #1, "Studying Nanotechnology". The Foresight
Institute's
main publications are the Update newsletter and Background essay
series. The Update newsletter includes both policy discussions and a
technical column enabling readers to find material of interest in the
recent scientific literature. These publications can be found at
Foresight's web page.
email address: foresight@cup.portal.com
A set of papers and the archives of sci.nanotech can be
had by standard anonymous FTP to nanotech.rutgers.edu, in /nanotech.
Sci.nanotech is moderated and is intended to be of a
technical nature.
--JoSH (moderator)
REFERENCES
[Not all of these are referred to in the text, but they
are of interest nevertheless.]
1. "Engines of Creation" by K. Eric Drexler, Anchor
Press, 1986.
2. "Nanotechnology: wherein molecular computers control
tiny circulatory submarines", by A. K. Dewdney, Scientific American,
January 1988, pages 100 to 103.
3. "Foresight Update", a publication of the Foresight
Institute, Box 61058, Palo Alto, CA 94306.
4. "There's Plenty of Room at the Bottom" a talk by
Richard Feynman (awarded the Nobel Prize in Physics in 1965) at an
annual meeting of the American Physical Society given on December 29,
1959. Reprinted in "Miniaturization", edited by H. D. Gilbert
(Reinhold, New York, 1961) pages 282-296.
5. "Scanning Tunneling Microscopy and Atomic Force
Microscopy: Application to Biology and Technology" by P. K. Hansma, V.
B. Elings, O. Marti, and C. E. Bracker. Science, October 14 1988,
pages 209-216.
6. "Molecular manipulation using a tunnelling
microscope," by J. S. Foster, J. E. Frommer and P. C. Arnett. Nature,
Vol. 331 28 January 1988, pages 324-326.
7. "The fundamental physical limits of computation" by
Charles H. Bennett and Rolf Landauer, Scientific American Vol. 253,
July 1985, pages 48-56.
8. "Molecular Engineering: An Approach to the
Development of General Capabilities for Molecular Manipulation," by K.
Eric Drexler, Proceedings of the National Academy of Sciences (USA),
Vol. 78, pages 5275-5278, 1981.
9. "Molecular Biology of the Gene", fourth edition, by
James D. Watson, Nancy H. Hopkins, Jeffrey W. Roberts, Joan
Argetsinger Steitz, and Alan M. Weiner. Benjamin Cummings, 1987. It
can now be purchased as a single large volume.
10. "Tiny surgical robot being developed", San Jose
Mercury News, Feb. 18, 1989, page 26A.
11. "Rod Logic and Thermal Noise in the Mechanical
Nanocomputer", by K. Eric Drexler, Proceedings of the Third
International Symposium on Molecular Electronic Devices, F. Carter
ed., Elsevier 1988.
12. "Submarines small enough to cruise the bloodstream",
in Business Week, March 27 1989, page 64.
13. "Conservative Logic", by Edward Fredkin and Tommaso
Toffoli, International Journal of Theoretical Physics, Vol. 21 Nos.
3/4, 1982, pages 219-253.
14. "The Tomorrow Makers", Grant Fjermedal, MacMillan
1986.
15. "Dissipation and noise immunity in computation and
communication" by Rolf Landauer, Nature, Vol. 335, October 27 1988,
page 779.
16. "Notes on the History of Reversible Computation" by
Charles H. Bennett, IBM Journal of Research and Development, Vol. 32,
No. 1, January 1988.
17. "Classical and Quantum Limitations on Energy
Consumption in Computation" by K. K. Likharev, International Journal
of Theoretical Physics, Vol. 21, Nos. 3/4, 1982.
18. "Principles and Techniques of Electron Microscopy:
Biological Applications," Third edition, by M. A. Hayat, CRC Press,
1989.
19. "Machines of Inner Space" by K. Eric Drexler, 1990
Yearbook of Science and the Future, pages 160-177, published by
Encyclopedia Britannica, Chicago 1989.
20. "Reversible Conveyer Computation in Array of
Parametric Quantrons" by K. K. Likharev, S. V. Rylov, and V. K.
Semenov, IEEE Transactions on Magnetics, Vol. 21 No. 2, March 1985,
pages 947-950.
21. "Theory of Self Reproducing Automata" by John Von
Neumann, edited by Arthur W. Burks, University of Illinois Press,
1966.
22. "The Children of the STM" by Robert Pool, Science,
Feb. 9, 1990, pages 634-636.
23. "A Small Revolution Gets Under Way," by Robert Pool,
Science, Jan. 5 1990.
24. "Advanced Automation for Space Missions",
Proceedings of the 1980 NASA/ASEE Summer Study, edited by Robert A.
Freitas, Jr. and William P. Gilbreath. Available from NTIS, U.S.
Department of Commerce, National Technical Information Service,
Springfield, VA 22161; telephone 703-487-4650, order no. N83-15348.
25. "Positioning Single Atoms with a Scanning Tunnelling
Microscope," by D. M. Eigler and E. K. Schweizer, Nature Vol 344,
April 5 1990, pages 524-526.
26. "Mind Children" by Hans Moravec, Harvard University
Press, 1988.
27. "Microscopy of Chemical-Potential Variations on an
Atomic Scale" by C.C. Williams and H.K. Wickramasinghe, Nature, Vol
344, March 22 1990, pages 317-319.
28. "Time/Space Trade-Offs for Reversible Computation"
by Charles H. Bennett, SIAM J. Computing, Vol. 18, No. 4, pages
766-776, August 1989.
29. "Fixation for Electron Microscopy" by M. A. Hayat,
Academic Press, 1981.
30. "Nonexistent technology gets a hearing," by I.
Amato, Science News, Vol. 136, November 4, 1989, page 295.
31. "The Invisible Factory," The Economist, December 9,
1989, page 91.
32. "Nanosystems: Molecular Machinery, Manufacturing and
Computation," by K. Eric Drexler, John Wiley 1992.
33. "MITI heads for inner space" by David Swinbanks,
Nature, Vol 346, August 23 1990, pages 688-689.
34. "Fundamentals of Physics," Third Edition Extended,
by David Halliday and Robert Resnick, Wiley 1988.
35. "General Chemistry" Second Edition, by Donald A.
McQuarrie and Peter A. Rock, Freeman 1987.
36. "Charles Babbage On the Principles and Development
of the Calculator and Other Seminal Writings" by Charles Babbage and
others. Dover, New York, 1961.
37. "Molecular Mechanics" by U. Burkert and N. L.
Allinger, American Chemical Society Monograph 177 (1982).
38. "Breaking the Diffraction Barrier: Optical
Microscopy on a Nanometric Scale" by E. Betzig, J. K. Trautman, T.D.
Harris, J.S. Weiner, and R.L. Kostelak, Science Vol. 251, March 22
1991, page 1468.
39. "Two Types of Mechanical Reversible Logic," by Ralph
C. Merkle, submitted to Nanotechnology.
40. "Atom by Atom, Scientists build 'Invisible' Machines
of the Future," Andrew Pollack, The New York Times, Science section,
Tuesday November 26, 1991, page B7.
41. "Theoretical analysis of a site-specific hydrogen
abstraction tool," by Charles Musgrave, Jason Perry, Ralph C. Merkle
and William A. Goddard III, in Nanotechnology, April 1992.
42. "Near-Field Optics: Microscopy, Spectroscopy, and
Surface Modifications Beyond the Diffraction Limit" by Eric Betzig and
Jay K. Trautman, Science, Vol. 257, July 10 1992, pages 189-195.
43. "Guinness Book of World Records," Donald McFarlan
et al., Bantam 1989.
Adapted by J.Storrs Hall from papers by Ralph C. Merkle and K. Eric
Drexler