To me, and I think to many other outsiders, putting a lot of emphasis on the grammar-language-automata equivalence looks like mathematical naivety. I don't say this to be rude, but because you (and Chomsky) claim to be able to interpret the implications of these mathematical results, and I don't think you are doing so correctly. Grammars look like a human (mathematical) invention rather than some deep mathematical structure, and these results appear shallow. In the broader context, lots of mechanisms are capable of Turing-complete computation.
This doesn't just apply to grammars. There is a huge array of formalisms (e.g. logics, type systems) out there and most just look like the result of someone saying "what if I did this?".
>> I don't say this to be rude but because you (and Chomsky) claim to be able to interpret the implications of these mathematical results, but I don't think you are doing so correctly.
It's alright- if I'm being naive, I'm being naive.
But- what am I missing? You're saying we're doing it wrong- how? For me the
intuition that infinite generative ability flows naturally from unbounded
recursion, like an egg from a hen's bottom, is kind of obvious. Is it naive? I
guess it's empirical, for me at least.
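To make that intuition concrete - here's a minimal sketch (my own illustration, not from either comment): a single recursive production is already enough to generate an infinite language, which is the sense in which unbounded recursion gives you infinite generative ability.

```python
# Toy grammar with one recursive rule and one terminating rule:
#   S -> "a" S
#   S -> "b"
# Each extra application of the recursive rule yields a new, distinct
# sentence, so the generated language {a^n b : n >= 0} is infinite.

def generate(depth):
    """Expand S by applying the recursive rule `depth` times,
    then terminate with S -> "b"."""
    return "a" * depth + "b"

# One new sentence per recursion depth: "b", "ab", "aab", "aaab", ...
sentences = [generate(n) for n in range(5)]
```

The point of the sketch is just that the infinitude comes from the self-reference in the rule, not from anything exotic in the formalism.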
Also, btw, I was introduced to the idea of language equivalence through
Hopcroft and Ullman, so from the point of view of computer science, where it's
been very useful, in practical terms. I guess if you're coming from a
mathematical or theoretical physics background it might sound a bit silly, but
it's allowed us to make a lot of progress, for instance to create a few
thousand different architectures and languages... but maybe I shouldn't be
bringing that up as progress...
Anyway, I don't know- how would you interpret the observation correctly? Where
are we going wrong?