A General Overview of the Sensibilities and Ideas of Nanotechnology
"Nanotechnology", in the correct sense, is the manipulation of single atoms into not just compounds, but mechanisms, measuring as small as they could possibly be. As broad as this definition is, the fields of semiconductor manufacture, as well as the fields of synthetic biology, fall into this category. Although these two fields are dominated by different molecular assumptions, and different design philosophies, they are concerned with the same subject matter.

The burgeoning field was first popularized by Richard Feynman in his 1959 lecture, "There's Plenty of Room at the Bottom". In it, he describes the deposition of "letters" at drastically reduced scale as a primitive example of what he is talking about: writing out the entirety of the Encyclopaedia Britannica on the head of a pin, and, with atom-scale encoding, vastly more than that. The methods by which he suggests this could be achieved are very similar to photolithography, the process by which circuits are etched into silicon substrate... invented, in that form, just 7 years prior. He goes on to expound on the ability to miniaturize machines far beyond anything then possible, across medicine, manufacture, et al.
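
As a rough sanity check on the arithmetic behind that famous claim, here is a small back-of-the-envelope sketch in Python. The 1/16-inch pin head and the 25,000x linear reduction are Feynman's own figures from the lecture; the ~0.2 mm halftone printing dot and the ~2.5 angstrom atomic spacing are assumed round numbers, used here only to show the scale his conclusion lands at.

```python
# Rough back-of-the-envelope check of Feynman's pinhead arithmetic.
# Figures: a pin head ~1/16 inch across, and his proposed 25,000x
# linear reduction of ordinary printed text.

INCH = 25.4e-3            # metres
pin_diameter = INCH / 16  # ~1.6 mm
reduction = 25_000

# Magnify the pin head by that factor: its area becomes comparable to
# the combined area of all the Encyclopaedia Britannica's pages.
magnified_diameter = pin_diameter * reduction
print(f"Pin head magnified 25,000x: ~{magnified_diameter:.0f} m across")

# Going the other way: a halftone printing dot (~0.2 mm) shrunk 25,000x
# is only a few nanometres wide, i.e. a few dozen metal atoms side by side.
halftone_dot = 0.2e-3                      # metres, assumed typical printing dot
shrunk_dot = halftone_dot / reduction      # ~8 nm
atom_spacing = 0.25e-9                     # assumed ~2.5 angstroms per metal atom
print(f"Shrunk halftone dot: ~{shrunk_dot*1e9:.0f} nm, "
      f"about {shrunk_dot/atom_spacing:.0f} atoms across")
```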

It is this unique regime of the atomic scale, spoken about by Feynman and by others interested in nanotechnology, which is of greatest interest. As Feynman mentions, since one is working with atoms (a very tricky thing, to be sure), which are discrete and identical, one has the tolerances needed to ensure that each machine is exactly the same as the next. As mentioned in Peter Hoffmann's book, "Life's Ratchet", many of the forces which are wildly mismatched at the macro scale begin to balance at the nanoscale. Inertia and gravity fade in importance, whereas electrostatic forces, at best a nuisance for larger bodies, become gravely important once one is dealing with a body only a few atoms wide. Because these forces are so similar in magnitude, spontaneous and highly efficient transitions between different forms of energy become possible, and only at this scale. The rotary motor which spins the flagellum of an E. coli bacterium is, in effect, an electric motor built from a few dozen different types of protein; it can spin at tens of thousands of RPM, can reverse direction almost instantly, and its efficiency in converting electrochemical into mechanical energy is estimated to be nearly 100%.
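
To put rough numbers on that balance of forces, here is a toy comparison of the thermal energy kT against the gravitational energy needed to lift a nanoscale object by its own height. The 10 nm size and the protein-like density are assumed round figures, not measurements of any particular machine.

```python
# Toy comparison of energy scales for a ~10 nm object at room temperature,
# illustrating why gravity stops mattering and thermal agitation starts to
# dominate at the nanoscale. Sizes and densities are assumed round numbers.

k_B = 1.381e-23      # J/K, Boltzmann constant
T = 300              # K, room temperature
g = 9.81             # m/s^2

L = 10e-9            # 10 nm object
density = 1300       # kg/m^3, roughly protein-like
mass = density * L**3

thermal = k_B * T                 # ~4e-21 J, the famous "kT"
gravity = mass * g * L            # energy to lift the object by its own size

print(f"Thermal energy kT:        {thermal:.1e} J")
print(f"Gravitational energy mgL: {gravity:.1e} J")
print(f"Ratio (thermal/gravity):  {thermal/gravity:.1e}")
```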

Of course, there are ongoing arguments. What is the best technique for squeezing into this niche: "soft" robots, or "hard" ones? A soft robot makes use of the thermal cavalcade at those smaller sizes to assist in movement and operation; a machine of this type resists the thermal turmoil that would hinder its operation, and permits itself to be pushed around by the thermal activity that would assist it. Any protein motor is an excellent example of this, the prime example being the family of motors called "kinesins", which carry various payloads around cells. A kinesin has two "heads" which bind to a track (a microtubule); through a cycle of chemical actions driven by ATP, the kinesin, in taking a step, releases one head from the track, lets it swing forward, and binds it back to the track a step ahead, all without any outside direction; these robots can and must exist in tumultuous environments.
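
A cartoon of that rectified-thermal-motion idea fits in a few lines of code. To be clear, this is not the real kinesin mechanochemical cycle: the acceptance probabilities are made-up numbers standing in for the ATP-driven binding bias, and only the ~8 nm step size is a real figure.

```python
import random

# Toy "Brownian ratchet" stepper: thermal motion proposes steps in both
# directions at random; a chemically-driven bias (standing in for ATP
# hydrolysis and head binding) accepts forward steps far more often than
# backward ones. A cartoon of the idea, not the real kinesin cycle.

STEP_NM = 8            # kinesin's step along a microtubule is ~8 nm
FORWARD_ACCEPT = 0.95  # assumed acceptance probabilities, for illustration only
BACKWARD_ACCEPT = 0.05

def walk(n_attempts: int, rng: random.Random) -> float:
    """Return net distance travelled (nm) after n_attempts thermal kicks."""
    position = 0.0
    for _ in range(n_attempts):
        proposal = rng.choice((+1, -1))        # unbiased thermal kick
        accept = FORWARD_ACCEPT if proposal > 0 else BACKWARD_ACCEPT
        if rng.random() < accept:              # chemistry "locks in" the step
            position += proposal * STEP_NM
    return position

rng = random.Random(0)
print(f"Net travel after 1000 kicks: {walk(1000, rng):.0f} nm")
```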

A "hard" robot, on the other hand, is very nearly like photolithographical technology... sometimes more absurd. [Image: GBqMBMXXYAAPDSo?format=jpg&name=large]

It is exactly how it sounds: rather than the origami-style folded-protein machines of biology, this is the construction of levers, wheels, gears, and switches using single atoms as the building blocks. The image above is from @philipturnerar on Twitter (please take a look, even just for entertainment purposes), and it depicts a much stranger world than biology. Common elements like carbon are used, but each atom is placed exactly where it needs to be in order to generate the structures, forces, and outcomes desired from the machines being constructed.
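
To get a feel for what "placing every atom" entails, a rough count of the atoms in even a very small hard part is instructive. The 10 nm diamond cube below is an arbitrary example size of my own choosing; the density and molar mass are standard values for diamond and carbon.

```python
# Rough count of carbon atoms in a tiny "hard" machine part: a 10 nm cube
# of diamond. The part size is an assumed round number; the density and
# molar mass are standard values.

AVOGADRO = 6.022e23       # atoms per mole
DIAMOND_DENSITY = 3.51    # g/cm^3
CARBON_MOLAR_MASS = 12.0  # g/mol

side_cm = 10e-7           # 10 nm expressed in cm
volume = side_cm**3       # cm^3
mass = DIAMOND_DENSITY * volume
atoms = mass / CARBON_MOLAR_MASS * AVOGADRO

print(f"A 10 nm diamond cube holds roughly {atoms:.1e} carbon atoms")
```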

In light of both schools of thought, which have their merits and should likely be considered different "tools" rather than mutually exclusive disciplines, the Overwhelming Need for Two Technologies emerges: the Perfect Microscope, and the Perfect Placer.

Very good atomic-level microscopes already exist; however, they have their flaws. For the electron microscope, the specimen must be held under very high vacuum, and a great deal of time must be spent carefully focusing the instrument. The atomic force microscope can image soft, living molecules in something much closer to their native environment (there is a video on YouTube where one can see a representation of kinesin moving), but it is much like a blind man tapping on Heaven's domain with his cane; the picture is blurry, and not quite complete.

Very good atomic-level tweezers also exist; especially for low-force applications in soft robotics, certain kinds of laser traps ("optical tweezers") can be used to pull, pluck, and deposit exposed particles. But as of yet, there is no method by which a single atom can be placed at a precise spot in a lattice containing other atoms without many, many chemical steps in between. The assembly process thought up by Nature, where chains of amino acids are strung together and then fold themselves up under thermal motion, is fantastic, but it cannot produce what hard nanotechnology would require, which is shown rather adequately above in the mechanical logic gate created with atomically precise manufacture. Further, these plucking tools only work on "exposed" surfaces, and wouldn't suffice for atoms or molecules buried inside a structure.
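
The gap between what optical tweezers can exert and what covalent atom placement demands can be made concrete with a few order-of-magnitude force figures; all of the numbers below are rounded, typical-scale values rather than measurements from any specific experiment.

```python
# Order-of-magnitude force comparison: why optical tweezers are fine for
# soft, low-force work but hopeless for placing single atoms covalently.
# All numbers are rounded, typical-scale values, not measurements.

PICO = 1e-12
NANO = 1e-9

optical_trap_force = 100 * PICO    # optical traps top out around ~100 pN
kinesin_stall_force = 6 * PICO     # a single kinesin stalls at roughly ~6 pN
covalent_rupture_force = 2 * NANO  # rupturing a covalent bond takes ~nN forces

print(f"Optical trap (max):     ~{optical_trap_force/PICO:.0f} pN")
print(f"Kinesin stall force:    ~{kinesin_stall_force/PICO:.0f} pN")
print(f"Covalent bond rupture:  ~{covalent_rupture_force/PICO:.0f} pN")
print(f"Gap to cover:           ~{covalent_rupture_force/optical_trap_force:.0f}x")
```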


