[extropy-chat] Name that system

Eugen Leitl eugen at leitl.org
Wed Dec 20 17:28:10 UTC 2006


On Wed, Dec 20, 2006 at 08:35:43AM -0800, Jef Allbright wrote:

> Thanks for the pointer to the work at Nanorex.  I'm pleased with the
> work they're doing with atomic level CAD/simulation software (but why in

It's a good package. I'm missing 6DOF device support (btw, if you thought
the SpaceTraveler was too expensive, look at http://www.3dconnexion.com/products/3a1d.php
-- yes, there will be an SDK soon, and Linux support is tentatively
promised). To see what this thing can be used for -- if you're not a CAD user --
check out http://www.gearthblog.com/blog/archives/2006/11/youtube_demo_of_spac.html
and http://www.ogleearth.com/2006/11/3dconnexions_sp.html

> the world did they state performance in terms of hours to process on a
> laptop running XP?), but I would like to see much more empirical work.

Yes, this thing doesn't really scale to very large models. It needs multicore
support at the least, and MPI for large clusters.
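
Just to make that concrete, here's a toy sketch (assuming mpi4py and a made-up
Lennard-Jones kernel -- nothing to do with NanoEngineer's actual internals) of
how the outer pair loop could be split across MPI ranks:

# Toy sketch: distribute a pairwise Lennard-Jones energy sum over MPI ranks.
# Placeholder kernel, not NanoEngineer code -- just the kind of decomposition
# a large-model simulator would need.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def lj_energy(coords, atom_indices, eps=1.0, sigma=1.0):
    """Sum LJ pair energy for each atom i in atom_indices against all j > i."""
    e = 0.0
    for i in atom_indices:
        d = coords[i + 1:] - coords[i]        # vectors to all later atoms
        r2 = np.sum(d * d, axis=1)
        sr6 = (sigma * sigma / r2) ** 3
        e += np.sum(4.0 * eps * (sr6 * sr6 - sr6))
    return e

n_atoms = 10000
coords = np.random.rand(n_atoms, 3) * 50.0 if rank == 0 else None
coords = comm.bcast(coords, root=0)           # everyone gets the coordinates

# Round-robin split of the outer loop: atom i pairs with n-i atoms,
# so interleaving the indices keeps the ranks roughly balanced.
my_atoms = range(rank, n_atoms, size)
local_e = lj_energy(coords, my_atoms)
total_e = comm.allreduce(local_e, op=MPI.SUM)

if rank == 0:
    print("total LJ energy:", total_e)

The round-robin split keeps per-rank work roughly balanced without any
communication beyond the final reduction, which is why this sort of thing
maps cleanly onto a cluster.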

> As you may know, until May of 2006 I was a technical manager with the
> world's leading manufacturer of Atomic Force Microscopes, and I'm well

It's too bad functionalized tips and multi-tip instruments with multiple
degrees of freedom are still a long while off.

> aware of the many serious challenges involved in probing nanoscale
> structures, but the challenge of picking and placing an assortment of
> atoms in an assortment of configurations far exceeds the measurement
> problem.  The pure designs are neat, but I'd like to know more about how

Machine-phase has severe bootstrap/catch-22 issues. Bootstrap will most
likely happen through a mix of self-assembly (bottom-up) and manipulative
proximal probes (top-down), which makes it extremely demanding. Because
of this there's an all-or-nothing capability threshold. Right now
the bottleneck for bootstrap is building a library of anabolic/catabolic
reaction steps computationally and validating it experimentally. Then
one has to construct a minimal sequence of events leading to an autocatalytic
machine-phase system, and then go out and get funding to address that
Grand Challenge.
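
For flavor, a toy sketch of what the "minimal sequence" part means
computationally: a breadth-first search over structure states connected by
library steps. The states and step names below are made-up placeholders, not
a real reaction library.

# Toy sketch: shortest sequence of reaction steps from a seed structure
# to a target. States and steps are placeholders for illustration only.
from collections import deque

# Hypothetical step library: name -> function mapping one structure state
# (here just a frozenset of placeholder site labels) to the next state.
REACTION_STEPS = {
    "H-abstraction": lambda s: s - {"H"},
    "C2-deposition": lambda s: s | {"C2"},
    "recharge-tip":  lambda s: s | {"H"},
}

def minimal_sequence(seed, target):
    """Breadth-first search: return the shortest list of step names
    turning `seed` into `target`, or None if unreachable."""
    queue = deque([(seed, [])])
    seen = {seed}
    while queue:
        state, path = queue.popleft()
        if state == target:
            return path
        for name, step in REACTION_STEPS.items():
            nxt = step(state)
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [name]))
    return None

print(minimal_sequence(frozenset({"H"}), frozenset({"C2"})))
# -> ['H-abstraction', 'C2-deposition'] (or an equivalent shortest path)

The real problem is of course the library itself, not the search.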

The Nanoengineer package teaches people that there's a space of structures
with very neat properties, especially if built in large (mole) quantities.
People tend to believe in things they can build and manipulate interactively 
on the screen. Seeing and doing is very much believing.

> they'll deal with the inevitable contamination from stray atoms and
> water molecules at various stages of the process.  I see some elegant

The machine-phase approach doesn't bake things out in UHV, which of course
would never get them clean enough anyway. It would instead tend to pull things
like HOPG apart, which gives the equivalent of an atomically clean surface --
only for an entire volume. You've got your deposition tool, which is fed by a
selective pump, so there's no contamination except whatever volatiles are
cleaved off during botched deposition/abstraction steps. You'd have to catch
these, of course, as otherwise your volume, and especially the surface, which
concentrates volatile moieties, will dirty up until eventually the error rate
becomes intolerable.

> mechanical designs, but I don't see the robustness that comes with the
> organic configurations of nature. Any pointers to relevant recent

That's deliberate. These people don't believe in self-assembly with failure tolerance.
Both approaches have advantages and disadvantages. The machine-phase has access to
a much wider structural space, and can achieve much higher functionality concentrations
than anything done with self-assembly, which needs to encode the instructions for assembly
along with the circuitry itself (yes, you can cleave off some auxiliary scaffolding,
but only up to a point). On the minus side, the only effective nanotechnology
today is synthetic biology (de novo engineering, blah-blah, whatever the name is these
days), and it doesn't help you much at all with machine-phase. So if you want to
go self-assembly, you've got some pretty powerful tools, up to mole-sized,
massively parallel fabbing capacities (fermenters and reactors), which you could only
match by scaling up to many machine-phase reactive sites -- and that is firmly in our future.

> thinking on this would be appreciated.
> 
> > What may not be clear is the jump from the ~2600 atom
> > fine motion controller design to the ~25,000 atom worm
> > gear design.   That is an order of magnitude jump
> > by ~4 people.  The jump to multi-million atom nanoassembly
> > arms is only two more orders of magnitude.  400 people or
> > much more clever design... you be the judge. 
> 
> Robert, on what basis do you think you can scale such a project in such
> simple terms?  I don't know what else to say here.

The machine-phase stuff is supposed to scale like conventional CAD,
so one assumes building up more complexity from modules with
standardized interfaces. This is different from evolutionary design,
which typically doesn't result in clean modules. Of course, if you want
to build an optimal structure, CAD and human design won't work. You'll
need statistical methods to sample the structure space in a massively
parallel fashion, which is prohibitively slow with human labor.
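
Something like this, say -- a toy sketch of independent Metropolis chains
sampling a made-up structure space in parallel (the energy model and the
site encoding are placeholders, not any real chemistry):

# Toy sketch: massively parallel statistical sampling of a structure
# space -- independent Metropolis chains over a made-up energy model.
import math, random
from multiprocessing import Pool

N_SITES = 64   # placeholder "structure": a vector of site types 0/1

def energy(struct):
    # Made-up objective: penalize adjacent identical sites.
    return sum(1.0 for a, b in zip(struct, struct[1:]) if a == b)

def metropolis_chain(args):
    seed, steps, temperature = args
    rng = random.Random(seed)
    struct = [rng.randint(0, 1) for _ in range(N_SITES)]
    e = energy(struct)
    best, best_e = list(struct), e
    for _ in range(steps):
        i = rng.randrange(N_SITES)
        struct[i] ^= 1                      # propose: flip one site
        e_new = energy(struct)
        if e_new <= e or rng.random() < math.exp((e - e_new) / temperature):
            e = e_new                       # accept the move
            if e < best_e:
                best, best_e = list(struct), e
        else:
            struct[i] ^= 1                  # reject: flip back
    return best_e, best

if __name__ == "__main__":
    with Pool() as pool:
        results = pool.map(metropolis_chain,
                           [(seed, 20000, 0.5) for seed in range(8)])
    print(min(results)[0])   # best energy found across all chains

Each chain is embarrassingly parallel, which is exactly why this kind of
search scales with hardware in a way human-in-the-loop design can't.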

-- 
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820            http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE