[ExI] The tyranny of context free grammars.
ryanobjc at gmail.com
Tue Dec 7 07:25:25 UTC 2010
The compiler writers I've known come down resoundingly
against LALR parsers for implementing CFGs.
Alternatives to CFGs might be nice, but to what end? The
advantage of the CFG is its simplicity, both in usage and in development
(on the compiler side). I'd be careful about pining for more languages
like Perl; you have obviously never maintained significant systems
written in Perl. They are to be avoided if at all possible.
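For what it's worth, the simplicity claim is easy to demonstrate: a hand-written recursive-descent parser for a small CFG fits in a couple dozen lines. The toy grammar below is my own illustration, not anything from the original post:

```python
# A minimal recursive-descent parser for a toy CFG (illustrative only):
#   expr   -> term (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'
import re

def tokenize(s):
    return re.findall(r"\d+|[-+*/()]", s)

def parse_expr(toks, i=0):
    val, i = parse_term(toks, i)
    while i < len(toks) and toks[i] in "+-":
        op, (rhs, i) = toks[i], parse_term(toks, i + 1)
        val = val + rhs if op == "+" else val - rhs
    return val, i

def parse_term(toks, i):
    val, i = parse_factor(toks, i)
    while i < len(toks) and toks[i] in "*/":
        op, (rhs, i) = toks[i], parse_factor(toks, i + 1)
        val = val * rhs if op == "*" else val // rhs
    return val, i

def parse_factor(toks, i):
    if toks[i] == "(":
        val, i = parse_expr(toks, i + 1)
        return val, i + 1  # consume the closing ')'
    return int(toks[i]), i + 1

print(parse_expr(tokenize("2*(3+4)-5"))[0])  # 9
```

Each grammar rule becomes one function; precedence falls out of the nesting. No parser generator, no LALR tables.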
I'd also like to leave you with this gem by the (in)famous Dijkstra:
"On the foolishness of 'natural language programming'."
On Mon, Dec 6, 2010 at 10:49 PM, Alan Grimes <agrimes at speakeasy.net> wrote:
> While I should be sleeping right now, I was thinking about my extensive
> library of books and was inspired to break character and write an
> inspirational piece about some transhumanist research that can be done
> by anyone with little or no equipment.
> Most of my books on computer science present context free grammars as
> the one and only way to program computers. It is true that languages
> such as Perl have some context sensitivity, but that doesn't disprove
> the point. The point is that all of computer programming and compiler
> design is nested in the ontology of the Chomsky hierarchy. The reason
> for this is that Chomsky grammars provably solve the problem of
> describing and classifying computations, so why not simply teach that
> and then move on to programming them?
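To make the hierarchy concrete with the standard textbook example pair (my illustration, not from the post above): a^n b^n is context-free, while a^n b^n c^n is context-sensitive and provably not context-free. Recognizers for both are trivial to sketch:

```python
# Adjacent Chomsky levels, illustrated with the classic textbook languages.
# a^n b^n is context-free; a^n b^n c^n is context-sensitive, not context-free.
import re

def is_anbn(s):
    m = re.fullmatch(r"(a*)(b*)", s)
    return bool(m) and len(m.group(1)) == len(m.group(2))

def is_anbncn(s):
    m = re.fullmatch(r"(a*)(b*)(c*)", s)
    return bool(m) and len(m.group(1)) == len(m.group(2)) == len(m.group(3))

print(is_anbn("aabb"), is_anbncn("aabbcc"))  # True True
print(is_anbn("aab"), is_anbncn("aabbc"))    # False False
```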
> Because it is a difficult problem, there seems to be only one widely
> known solution. While it is possible to prove that it solves the problem
> of describing computations, I seriously doubt that it is either a unique
> or optimal solution.
> In my 6-core Phenom machine, I have, let's say, four ALUs per core, so
> given an optimal workload, I can compute a maximum of
> 64 * 4 * 6 * 3.2 = 4.9 trillion bit operations per second, absolute
> maximum under optimal conditions. But the more interesting thing is that
> I only have 1536 bits of general-purpose computing in my late-model
> machine. (There are some other computational circuits devoted to
> managing the instruction stream and accessing memory that are not
> general purpose.)
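The arithmetic above does check out, assuming 64-bit ALUs as the figures imply:

```python
# Sanity check of the figures above: 64-bit ALUs assumed, as in the post.
bits_per_alu = 64
alus_per_core = 4
cores = 6
clock_ghz = 3.2

general_purpose_bits = bits_per_alu * alus_per_core * cores
bit_ops_per_sec = general_purpose_bits * clock_ghz * 1e9

print(general_purpose_bits)       # 1536
print(bit_ops_per_sec / 1e12)     # 4.9152 (~4.9 trillion bit ops/sec)
```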
> When contemplating (om) an optimal computational substrate in what we
> believe to be the 3D nature of our real world, we are led to an
> architecture that resembles the human brain in many respects. The gray
> matter would be a 3D cellular automata architecture. Then, in order to
> maximize bandwidth between distant nodes, a substantial portion of the
> available volume would consist of communication channels of some sort,
> so that distant parts of the network could communicate at low latency.
> Each cell would be comparable in size to 3 or 4 cache cells on a
> processor of a similar process technology, so my 9 megs of cache would
> equate to roughly two or three MILLION computational units (as opposed
> to the number just over 1,500 given above).
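Taking those figures at face value, and assuming (since the post doesn't say) that a "cache cell" here means roughly one byte of cache, the estimate is internally consistent:

```python
# Rough check of the cache comparison above. Assumption not in the post:
# one "cache cell" is taken to be roughly one byte of cache.
cache_bytes = 9 * 1024 * 1024    # 9 MiB of cache
cells_per_unit = 3.5             # "3 or 4 cache cells" per CA cell
units = cache_bytes / cells_per_unit
print(round(units / 1e6, 1))     # 2.7 -- i.e. "two or three million" units
```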
> What we lack, of course, is a good way to design and program these CAM
> cells so that they can be used for high-performance computing. Some
> basic work has been done with Conway's Game of Life. Reportedly, there
> have been implementations of Turing machines in the form of gliders and
> other structures in the CAM matrix.
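This part is well established: the Game of Life is Turing-complete. A complete Life update rule fits in a few lines; the sketch below (mine, for illustration) runs the canonical glider, which reappears one cell down and to the right every four generations:

```python
# One generation of Conway's Game of Life on an unbounded grid,
# representing the universe as a set of live (row, col) cells.
from collections import Counter

def life_step(cells):
    # Count live neighbors of every cell adjacent to a live cell.
    counts = Counter((r + dr, c + dc)
                     for (r, c) in cells
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in cells)}

# The canonical glider:
#   .O.
#   ..O
#   OOO
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
g = glider
for _ in range(4):
    g = life_step(g)
# After 4 generations the glider is translated one cell down and right.
assert g == {(r + 1, c + 1) for (r, c) in glider}
```

Gliders like this one are the signal carriers in the known Turing-machine constructions inside Life.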
> What is really needed is a serious re-evaluation of everything the
> textbooks say we know for certain about computer science to achieve the
> conceptual breakthrough required to really master this type of programming.
> I would consider this important basic research and the tools required to
> carry it out are already in front of you.
> extropy-chat mailing list
> extropy-chat at lists.extropy.org