[extropy-chat] Re: Nano-assembler feasibility
Chris Phoenix
cphoenix at CRNano.org
Mon Apr 5 22:34:08 UTC 2004
Eugen Leitl wrote:
> There is a continuum of approaches to self-rep molecular systems. We know
> self-assembly works, and has very large processivity due to intrinsic
> parallelism. This would work on all scales, beginning from folded linear
> biopolymers, engineered biopolymers, biopolymer analogues and completely
> synthetic analogues, as well as small cycles and cages, large complementary
> surfaces, and even macroscale assembly. It's a sufficiently powerful
> paradigm to reach full-closure self-replicating and autopoietic systems.
When you talk of macroscale assembly and full-closure self-replication,
are you arguing just from biology, or are you referring to some
engineering work that would tell us how to do this?
Biology uses active transport at many scales. The machine has to work
while it's still being built. I wouldn't want to try to engineer such a
thing. If someone told me to build a bunch of molecules that could
simply be mixed together and diffuse to their proper assembly slots, I
would not expect that to be useful for anything with heterogeneous
features much larger than a few microns.
> Machine-phase goes a long way to more control, but it clearly pays the
> price in energy and processivity.
Have you done the math?  There's no problem with processivity.  Even
with very crude designs you might lose an order of magnitude or so
relative to the fastest bacteria, but that still means the system takes
only a few hours to make its own mass, which is more than adequate for
human engineering.
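For a rough sense of the numbers (a back-of-the-envelope sketch; the
~20-minute bacterial doubling time and the factor of ten are my own
illustrative assumptions, not figures from the nanofactory paper):

    # Rough replication-time check. Assumed inputs: fast bacteria double
    # in about 20 minutes, and a crude machine-phase design is ten times
    # slower.
    bacterial_doubling_min = 20   # assumed doubling time of a fast bacterium
    slowdown_factor = 10          # "lose an order of magnitude or so"

    crude_replication_hours = bacterial_doubling_min * slowdown_factor / 60.0
    print(f"Time to make its own mass: ~{crude_replication_hours:.1f} hours")
    # prints ~3.3 hours, i.e. "a few hours", ample for human engineering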
Energy, likewise: a very primitive design may take 250 kWh/kg.  By
comparison, 1 kg of beef requires 7 kg of grain, or about 35,000 kcal,
or roughly 41 kWh.
These figures are calculated for a tabletop machine-phase manufacturing system:
http://www.jetpress.org/volume13/Nanofactory.htm#s8
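To make the unit conversion explicit (the ~5,000 kcal/kg energy density
for feed grain is my own assumption; the other figures are the ones above):

    # Unit check for the beef comparison. 1 kcal = 1.163 Wh.
    KCAL_TO_KWH = 1.163e-3

    grain_kg_per_kg_beef = 7
    grain_kcal_per_kg = 5000      # assumed energy density of feed grain
    beef_kcal = grain_kg_per_kg_beef * grain_kcal_per_kg   # 35,000 kcal
    beef_kwh = beef_kcal * KCAL_TO_KWH                     # ~41 kWh

    nanofactory_kwh_per_kg = 250  # very primitive design, figure cited above
    print(f"Beef: ~{beef_kwh:.0f} kWh/kg; primitive nanofactory: "
          f"~{nanofactory_kwh_per_kg / beef_kwh:.0f}x that, at "
          f"{nanofactory_kwh_per_kg} kWh/kg")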
> I personally think swapping discrete tool tips is a red herring.
> It's unnecessary, and it results in a massive increase in complexity and
> decrease in processivity. Continuous processes are better than discrete
> cycles.
Swapping tool tips is a concession to primitive design, not a red
herring.  Mill-type fabrication will be vastly more efficient in both
energy and time.  The nanofactory design referenced above does not
include mill-type fabrication.  As far as I can see, mill-type
machine-phase chemistry, supplemented by 6DOF manipulators capable of
doing mechanochemistry and assembling mill-built components, will be far
more efficient than biology.  And even if I'm missing an order of
magnitude or two, it'll still be good enough that we can pick which
technology to use according to the desired product rather than by
manufacturing efficiency.
> Hollow ducts and small-molecule and linear-strand monomers are good
> enough to do 3d nanolithoprinting of structural parts. Self-assembly is
> good enough for 3d crystalline computation. A bucky mill processing
> batches of stochastically synthesized substrate, sorting and covalently
> modifying, and assembling structures looks far better to me than
> building stuff by hammering reactive moieties down on HOPG or diamond
> in UHV.
How would this be programmable? Would you have to invent new chemistry
for each product? At some point, you're probably going to need
something programmable like pick-and-place or programmable masking if
you want to do CAD-driven production. I don't see where that would fit
in this scheme.
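To show the kind of thing I mean by "programmable" (a toy sketch only;
every name and data structure below is hypothetical, not a proposal for
a real control system): CAD-driven production implies some step that
compiles a part description into a generic stream of operations, which
is exactly what pick-and-place provides.

    # Toy sketch: a CAD voxel list compiles into a stream of pick-and-place
    # operations that one general-purpose mechanism executes, so no new
    # chemistry is needed per product. All names are hypothetical.
    from typing import NamedTuple

    class Placement(NamedTuple):
        block: str                # which prebuilt building block to fetch
        x: int
        y: int
        z: int

    def compile_cad(voxels):
        """Turn a CAD voxel list [(block, x, y, z), ...] into a placement program."""
        return [Placement(*v) for v in voxels]

    def run(program):
        for op in program:
            print(f"pick {op.block}, place at ({op.x}, {op.y}, {op.z})")

    run(compile_cad([("strut", 0, 0, 0), ("strut", 1, 0, 0), ("hinge", 1, 1, 0)]))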
> So we have a large space of approach candidates. Instead of sterile
> arguments about feasibility of XY, we should explore as many of these
> pathways as possible,
I agree!
> pumping as much R&D resources as we can syphon away from other areas
> of human enterprise.
There's no chance of that. And it's not necessary; funding at the
levels required to develop MNT will not require zero-sum thinking.
Chris
--
Chris Phoenix cphoenix at CRNano.org
Director of Research
Center for Responsible Nanotechnology http://CRNano.org