[ExI] future of proletariat, was: RE: future of slavery
atymes at gmail.com
Wed Apr 10 16:10:10 UTC 2013
On Wed, Apr 10, 2013 at 7:42 AM, spike <spike at rainier66.com> wrote:
> Some of you hardware gurus help me out here. Assuming Google glass or
> equivalent, along with Skype, let us use the example of the motorcycle
> repair. Is current hardware up to the task of letting me point to things
> with a cursor on the other person's vision? If I had the other prole
> looking at her bike so I can see what she is seeing, I want to use my
> cursor to point at something in her field of view and say "Ok, now remove this
> bolt, and don't lose the washer underneath it." I think I could coach a
> prole all the way through a complicated repair job that way.
This was, in fact, being demonstrated on a fully automated basis back in
1997, with Boeing's Maintenance Aid Computer. Or at least there was a
prototype for it; I suspect computer vision (among other things) wasn't
quite up to the task, but it was close enough to make a serious effort.
Of course, this was also for jet airplanes, which are designed far more
for maintainability than the average motorcycle (since that market is mostly
businesses that care about cost of operation).
That said, the manual sharing you describe is quite possible. You might
be able to do it with a two-person Google Hangout: share the Glass's
perspective with the chat, have the other person take that as their
screen, and then screenshare the other person's view - i.e., said video
plus the other person's mouse cursor - which the Glass's user then sees.
(I am not certain whether all the steps in that are possible with Glass
and Hangouts as they exist now, but certainly one could make software
that could do this.)