
Wednesday, 13 March 2019

The X Files & Chomsky's Galilean Challenge

When I was a kid I was told about a horrible Professor named Skinner who denied free-will and believed human beings were like vicious rats- responsive only to their own pain or pleasure. Our behaviour- even our language- was wholly acquired and regulated by rewards and punishments. Thus, Skinner represented brute materialism or base utilitarianism, equally abhorrent whether expressed as Capitalist greed or the Communist Gulag.

Fortunately, a brilliant and handsome young Linguist, Noam Chomsky, showed that language was innate- people were born with it- and thus Man wasn't a rat in a business suit but had, if not a soul, then a Nous, a Cogito, of the sort Plato or Descartes might approve of.

Later on, when I was old enough to read Scripture, I was assured that Chomsky, from emulating Panini, had evolved to invoking something like Bhartrhari's Vakyapadiya theory. Language was, in essence, not communication or expression but something inward and eternal and unchanging which some peculiar working of Grace had endowed on the race of Manusvayambhu- Manu the self-born- in a modality so different to all other beings, for it was the mysterious oikonomia of giving and sacrifice, that divyadhvani, the divine revelation, though equally accessible to Gods, Demons and Animals, was soteriologically efficacious only for human beings. All beings had to take birth in the race of Manu to hear that Revelation which ended rebirth.

Indeed, in the Eighties I recall reading what little of Chomsky that was accessible to my ignorance in a manner wholly devout and reverential.

Since then, of course, there has been an I.T. revolution &, being a crapulous Tambram cretin, I have passed much water under such bridges as I vigilantly troll.

One such bridge is that between Skinner- who was a good little atheist same as Noam- and Chomsky's vacating of the field- with respect to anything exterior about language- to leave it securely in the possession of his supposed foe.

The truth is, a faculty's 'innateness' is just Evolution's economic way of doing 'operant conditioning'. The Mind and the Will are co-evolved processes and thus feature indeterminacy because of the vast amount of substantive complexity their relatively simple interactions simulate, which in turn impact on reproductive fitness- or, indeed, change its landscape.

Sixty years later, it is easy to see both Skinner & Chomsky- who weren't very different politically- simply as silly publicity mongers- responding perhaps to the shedloads of grant money involved in the Pentagon's proposed 'Manhattan Project for the Social Sciences'- who achieved acclaim by pretending they were contributing to 'Psychology' & 'Linguistics' respectively by writing pseudo-intellectual shite instead of useful stuff- preferably books with titles like 'How to stop jerking off so much.' or 'Learn Mandarin while masturbating'.

Skinner is dead and doesn't appear to be writing articles anymore, but Chomsky isn't dead and is shitting out articles like crazy. This suggests that the after-life must be Paradise compared to our miserable mortal existence. Alternatively, we might say that Free Will is what makes Minds excrete shite till Death supervenes.

At any rate, that is my conclusion, based on reading Chomsky's latest essay in the Inference Review.

In it, he quotes Galileo-
But surpassing all stupendous inventions, what sublimity of mind was his who dreamed of finding means to communicate his deepest thoughts to any other person, though distant by mighty intervals of place and time! Of talking with those who are in India; of speaking to those who are not yet born and will not be born for a thousand or ten thousand years; and with what facility, by the different arrangements of twenty characters upon a page!
There was never such a mind. Rather there was convergent co-evolution by reason of similarity in the fitness landscape. Methods of record-keeping and specifying the terms of contracts and so forth arose independently and then linked up over a long span of time.

Chomsky takes a different view. He thinks that, if not one particular man, then Man, suddenly made a stupendous discovery- viz. language- and the Galilean challenge is to explain how this happened.

He thinks that little could be said about this till very recently because the tools were lacking. This seems strange. Boole had the tools. Why did nothing happen till almost a century had passed? &
Alonzo Church, Kurt Gödel, Emil Post, and Alan Turing... established the general theory of computability?
Binary notation meant that anything could be written down as a string of ones and zeros. Undecidability in set theory and algebra would have been discovered anyway. Moreover, no general theory of computability has been established as yet. A limited type of algorithmic theory, without concurrency or fault tolerance issues, was initiated in the early decades of the Twentieth Century. It has been considerably refined for a specific sort of deterministic algorithm. However, we don't have a general theory which will apply to quantum computers or non-deterministic computation.
Their work demonstrated how a finite object like the brain could generate an infinite variety of expressions.
A finite object which is infinitely long lived might do so but only stochastically. An object with a finite life-time can't at all. In fairness, Chomsky may be thinking of 'the word problem for groups' and the conditions under which it is decidable. However, there is no reason to believe that 'word' or 'language' as used in a specific branch of Algebra has any relation at all to the ordinary meaning of 'word' or 'language'.
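
For anyone who hasn't met the algebraic usage, here is a minimal sketch, in Python, of the word problem in a free group on two generators- where it happens to be decidable by simple cancellation. The alphabet and helper names are toy assumptions of mine; nothing here connects with 'word' in the everyday sense, which is rather the point.

# Toy illustration of the word problem in a free group on {a, b}:
# two words name the same group element iff they cancel down to the
# same freely reduced word. Upper case stands for inverses: 'A' = a^-1.

def reduce_word(word: str) -> str:
    stack = []
    for ch in word:
        # cancel a letter against its own inverse sitting on top of the stack
        if stack and stack[-1] != ch and stack[-1].lower() == ch.lower():
            stack.pop()
        else:
            stack.append(ch)
    return "".join(stack)

def same_element(w1: str, w2: str) -> bool:
    # the 'word problem' here is decidable: just compare reduced forms
    return reduce_word(w1) == reduce_word(w2)

print(repr(reduce_word("abBA")))     # '' : the word collapses to the identity
print(same_element("ab", "ba"))      # False: the free group is non-abelian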

The variety of sensible expressions a brain can make tends to diminish after a certain point. Thus, Chomsky's brain, like mine, can only generate a very narrow range of tedious self-regarding shite which has some minimal semblance of coherence.

The work of Godel and Turing and Tarski was improved upon by Kripke who showed why even a super-brain or mega-computer using a language containing its own truth predicate wouldn't be able to say anything very interesting. On the other hand, we could certainly have a computer which writes Chomsky type shite precisely because it is now self-parody. However, its output would have even less epistemic value than Chomsky's senile oeuvre. This is because, for a system of expression to retain univalent foundations while its range expands, a rich protocol bound 'intensional' Type theory is required. This co-evolves with its subject matter. It can't 'generate an infinite variety of expression' because it is constrained or channelised by its own fitness landscape. It has to pay for itself or else it gets unplugged coz computation is costly and scarcity is ubiquitous.

Chomsky believes there is a Galilean challenge- viz. how to explain our ability to do something we can't. Why not a Spiderman challenge- to shoot webbing from the palms of our jizz covered hands?

Before proceeding any further, as a matter of full disclosure, I must admit that my own vaunted disproof of the Pythagorean theorem relies upon Chomsky's path-breaking discoveries regarding the relative unimportance of 'externalized language'. The fact is my internalized language features triangles with four sides. This is because I like playing hide and seek but everybody else finds it boring so they just stand around rolling their eyes and saying 'oh no, you found us', however they are equally unanimous in urging me to carry on searching coz there's a fourth side to the triangle which is really good at hiding. Also, it has lots of toffees which it is sure to share with me if I am clever enough to find it.

Back in the Sixties, a fucked up decade when a lot of people were off their head on drugs, it was reasonable to suppose an 'i-language' might exist. There could be an ideal speaker and an intensional way to capture correct usage.  A computer could be taught grammar and then it would be able to talk to us like HAL in Kubrick's 2001. That was why Linguistics was a sexy subject. With the micro-computer revolution, we began dreaming of having a little chip implanted in our brain which would give us the ability to speak any language perfectly.

Sadly, this project was a pipe dream for reasons that Godel and Tarski and so forth had already clarified- viz. recursion can be impredicative and thus Type Theory is needed for univalent foundations. Google Translate does not try to model an i-language. Instead it focuses on 'e-language'- i.e. extensional pragmatics- the collocations and idioms people actually use, studied from a statistical point of view.

Chomsky had been barking up the wrong tree- no shame in that, it is bound to happen if you are doing Science not bullshitting simply- but he couldn't bring himself to admit that he had been wrong not just in his 'Science' but also in the Moral or Political Philosophy he espoused on the basis of his faulty Epistemology.

In particular, his 'Principles & Parameters' approach meant that he completely sidestepped co-evolved processes or modularities which permit complexity to be tamed at an astonishing rate. However, this also means that neither principles nor parameters matter very much. An a priori, or substantivist, approach is a short cut to babbling nonsense. Everything is idiographic and represents either channelisation or the damming up of capacitance diversity on a specific fitness landscape. There is no place for nomothetic grandstanding because it cashes out as making grandiloquent claims on the basis of a complete lack of understanding of the work of genuine savants in useful fields. In this essay, Chomsky is claiming to be using 'tools' which weren't invented for the use of linguistics (not even his own posited 'i-language') and wants us to believe something amazing could be constructed on that basis. This is a bit like me, when I inherited my sister's slide-rule, claiming to have used it to turn the cat into a giraffe. As a matter of fact, since we lived in Nairobi, there really was a giraffe in the garden. But it soon wandered off while pussy, it turned out, had been quietly napping on an expensive kanjeevaram sari in Mum's closet.

I gave up trying to use my slide-rule to make out I had super-powers. What about Chomsky? Let us see-
With these intellectual tools available, it becomes possible to formulate what we may call the Basic Property of human language.
Language is not algorithmic. How could tools developed for algorithmic computation help formulate a 'Basic Property' for something non-deterministic?

Why not use the tools of linguistics to prove P is not equal to NP coz like there's an N in NP and there isn't an N in P?

What is the Basic Property of Maths? Whatever answer we give, there can be a useful type of Maths which violates that Basic Property.

If Math doesn't have a Basic Property, why pretend it has tools which can formulate the thing for Language? After all, Maths can be a subset of Language but the reverse is not the case- even if a Mathesis Universalis exists- unless it does not matter who says what, i.e. there is no mimetic desire and individuals are indiscernibly identical.
The language faculty of the human brain provides the means to construct a digitally infinite array of structured expressions; each is semantically interpreted as expressing a thought, and each can be externalized by some sensory modality, such as speech.
I have the language faculty in my admittedly sub-human brain. I don't have the means to 'construct a digitally infinite array of structured expressions'. Nobody can construct an infinite digital array because there just aren't enough atoms in the Universe to hold the information.
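
As a back-of-envelope check in Python- assuming the commonly cited figure of roughly 10^80 atoms in the observable universe- even quite short binary strings already outrun any physically possible storage:

# How long must a binary string be before the number of distinct strings
# of that length exceeds ~10^80, a commonly cited estimate of the number
# of atoms in the observable universe?
ATOMS = 10 ** 80

n = 1
while 2 ** n <= ATOMS:
    n += 1
print(n)   # 266: there are more 266-bit strings than atoms to write them on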

The same holds for semantic interpretation. The thing is costly. Anyway, it can't perform an infinite operation.

By contrast, co-evolved processes or modularities can find 'quick and dirty' means of simulating otherwise intractable computational problems. The Market is an example because something which is exponentially complex is computed using a simple rule in a disaggregated manner.
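
Here is a minimal sketch of that point- a toy 'tatonnement' in Python with made-up linear supply and demand curves. No agent computes the equilibrium; a dumb local price-adjustment rule stumbles onto it anyway:

# Toy price adjustment: price rises when demand exceeds supply, falls otherwise.
# The curves below are illustrative assumptions, not data.

def demand(p):
    return max(0.0, 100.0 - 2.0 * p)

def supply(p):
    return 3.0 * p

price = 1.0
for _ in range(1000):
    excess = demand(price) - supply(price)
    price += 0.01 * excess           # simple, local, disaggregated rule
print(round(price, 2))               # ~20.0, where demand equals supply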

But the same thing can be said of language. There are many fields which have only recently come into existence. We find that their terminology and pragmatics very soon become stable such that some of their initiators strike us as odd because they cling to an outdated phraseology. This can happen even in the absence of any recognized Governing body or Trade Publication.

At the same time, there are discoordination games, such that language which originated in a lowly milieu gains normativity, in defiance of Tarde's Law, in refined circles. Thus under the second Bush everybody was talking about 'reach-arounds' as in 'when America makes you its bitch, Democracy is the reach-around.'

Semantic interpretation has an infinite range, not an infinite domain. It may well be that, when the current chief Prison shot-caller of the Aryan Brotherhood succeeds Trump in the White House, the term reach-around will have no obnoxious meaning. Rather it will refer to such erotic reveries as comfort the new and recently freed Leader of the Free World.

Chomsky believes the reverse. Thoughts are few and, like Trump in his second term, too morbidly obese to go abroad, yet they fashion themselves an infinite wardrobe.
The infinite set of semantically interpreted objects constitutes what has sometimes been called a language of thought.7
Or the Mind of God, or all pervading Akasa, or something along those lines.
Why would an infinite system bother with 'linguistic expression'? Only religious people, who think the Creator is a Personal Lord God and Savior, believe that an infinite and omniscient being would want to talk to humans so as to save them from sin and thus have them populate Paradise after their earthly demise.

However, as morality evolves, even the most devout don't want God to be rapping to them about Dietary Commandments or the Laws of Chemistry. They just want the 'sphota' of loving concern and comforting wisdom and understanding and forgiveness. This 'sphota' is 'divyadhvani' and has no structure. Thus it is 'insha' or imperative simply- not a short-cut to getting magical powers.

Perhaps Chomsky believes that there is a perfect Adam within us who, if only the Media would stop 'Manufacturing Consent', would be perfectly rational and the true Feuerbachian Deity of our own eusebia. The Holy Grail here is what Chomsky calls 'the language of thought'-
It is a system that is then capable of linguistic expression, and that enters into reflection, inference, planning, and other mental processes.
So Chomsky thinks thought has a language and that language is capable of linguistic expression- in other words, it is a language.

Sadly, Chomsky's thought is gibberish. Why not simply say, 'Language is capable of linguistic expression and linguistic expressions can be expressed by linguistic expressions which in turn are capable of expression within Language'?
When externalized, it can be used for social interactions, although this is only a first approximation to what may properly be called thought. There is much more to say about this challenging topic.
But it would be obviously and equally shite.
There is good reason to suppose that the language faculty is common to the human species.
No shit Sherlock! However, there is no good reason to treat an individual who lacks the language faculty as alien to the human species.
There are no known group differences in language capacity, and individual variation is at the margins. Although speech is the usual form of sensorimotor externalization, we now know that signing, or even touching, does just as well. These are discoveries that require a slight reformulation of the Galilean challenge.
Was Galileo really ignorant of the existence of deaf mutes? Had he never seen foreigners communicating with natives using sign language? How stupid do you have to be to firstly think there is a Galilean challenge and then think it has to be reformulated coz Helen Keller existed?
A more fundamental qualification has to do with the way the challenge is formulated; traditional formulations were in terms of the production of expressions.
In which traditions were these formulations? Those of grammar and linguistics. But nobody has a high opinion of such pedants- except other pedants toiling for a miserable wage. As a Brahman myself, I might occasionally gas on about Panini and Patanjali. But, to pay the bills, I am forced to do something other people find useful.
In these terms, the challenge overlooked some basic issues. Like perception, production accesses an internal language, but cannot be identified with it.
Perception accesses an 'internal language' does it? So, the neighbor's cat, which is looking at me through the window, is saying something to itself as it does so. OMG, from the expression on its face, I can see it must be something really racist. I shall phone the police to complain about this wholly unprovoked feline internal 'hate speech'!

Cat has got up and is saying meow. This accesses an internal language but can't be identified with it. That's a good thing because I'd look a fool if I tell the police cat said 'meow'. I'd better stick to my story that, according to the great Noam Chomsky, Cat violated my human rights by saying 'Get a proper job you fat black bastard' in its internal language.
We must distinguish the internalized system of knowledge from the processes that access it.
Why must we? What good would it do? Why not talk about something useful or entertaining instead?
The theory of computability enables us to establish the distinction, which is an important one, familiar in other domains.
This is nonsense. People who have never heard of 'the theory of computability' easily distinguish between memory (which is an internalized system of knowledge) and remembering stuff. That's why we often say things like 'I know this. I remember this. Damn! I just can't bring it to mind at this moment. It will come back to me.'
In studying human arithmetical competence, we distinguish the internal system of knowledge from the actions that access it.
No we don't. If you get all the sums wrong, the examiner who is 'studying your arithmetical competence' gives you a big fat zero and suggests you attend Summer School. Your Mom may say to the Headmaster 'You must distinguish between my son's 'internal system of knowledge' which is actually very good, from his actions that access it, which are those of a moron. Please don't make him attend Summer School. We were planning to hire him out to a freak show.'
When multiplying numbers in our heads, we depend on many factors beyond our intrinsic knowledge of arithmetic.
Only if we are stupid or are trying to get the wrong answer. To get the right answer we must have arithmetical knowledge. However, it isn't intrinsic. It is something learned and cultivated.
Constraints of memory are the obvious example. The same is true of language. Production and perception access the internal language, but involve other factors as well, including short-term memory.
This 'internal language' does not exist save in a manner of speaking. Pretending otherwise causes us to talk nonsense. Stuff like-
When the Galilean challenge was addressed in the 1950s, these were matters that began to be studied with some care.
But the thing led nowhere. The Pentagon wanted a universal translator or a voice activated jet fighter or stuff like that and so they funded a type of research which, for a while, seemed exciting. There was also some MK ULTRA type shite about how one could subtly change Language so Communism would become unthinkable and so forth. But, that stuff was as fucked in the head as 'men staring at goats' and hoping the goats' heads would explode.
There has been considerable progress in understanding the nature of the internal language, but its free creative use remains a mystery.
The progress amounts to saying this is a waste of time. E-language is important. I-language is for the birds.
This should come as no surprise. In a recent review of far simpler cases of voluntary action, neuroscientists Emilio Bizzi and Robert Ajemian remark, in the case of something so simple as raising one’s arm, that the detail of this complicated process, which critically involves coordinate and variable transformations from spatial movement goals to muscle activations, needs to be elaborated further.
Why? Coz having a better Structural Causal Model will enable Doctors to cure paralyzed or nerve damaged patients.

By contrast there is no point having a better Structural Causal Model of something which is merely a figure of speech and which has no real existence.
Phrased more fancifully, we have some idea as to the intricate design of the puppet and the puppet strings, but we lack insight into the mind of the puppeteer.
This is the essence of the problem with the notion of an i-language. We would need insight into the mind of an infinite, omniscient being- i.e. that of God. Maybe this helps Yogis meditating in a cave in the Himalayas but, for the rest of us, it is wholly useless.
The normal creative use of language is an even more dramatic example. This is the unique human capacity that so impressed the founders of modern science.
The fundamental task of inquiry into language is to determine the nature of the Basic Property, and so the genetic endowment that underlies the faculty of language.
'Inquiry into language' is about useful stuff like creating Dictionaries and Grammar books and phonetics and hermeneutics and so forth. It isn't about determining something which may have a description in the theory of Computation but which probably does not.

By contrast, the fundamental task of my own 'Inquiry into Porn' is to determine the nature of the Basic Property of bukake. Thus, I'm not a dirty old man and shouldn't be fired because determining this Basic Property would also determine the genetic endowment that underlies the faculty of a bunch of guys coming on a charming P. Chidambaram lookalike's face.
To the extent that its properties are understood, we can investigate particular internal languages, each an instantiation of the Basic Property, much as each individual visual system is an instantiation of the human faculty of vision.
We already investigate a particular person's 'internal language' as part of multiple choice School Examinations. Generally the student is given a set of alternative grammatical constructions and has to pick the one she thinks is correct. This may give us information about her social class, reading level, or region of origin. It can tell us nothing about some supposed 'Basic Property' of language.
The Biolinguistic Program involves an investigation into how internal languages are acquired and used, and how they function in the human brain.
Maybe this is useful for speech therapy or some such thing. By all means construct a better Structural Causal Model if it can lead to better outcomes.
It is also committed to studying the evolution of the language faculty and its basis in human genetics.
Once again, this may involve improvements in gene therapy and so is welcome- provided it is done by genuine scientists not linguists who are bored with their own subject and want to appear super smart.
Universal Grammar is the theory of the genetically-based language faculty; Generative Grammar, the theory of each individual language.
Languages appear to be extremely complex, varying radically from one another. A standard belief among professional linguists sixty years ago was that languages can vary in arbitrary ways; each must be studied without preconceptions. Biologists held similar views about organisms. As recently as 1984, Gunther Stent, in a review of J. M. W. Slack’s From Egg to Embryo, remarked that, “…we cannot expect to discover a general theory of development; rather we are faced with a near infinitude of particulars, which have to be sorted out case by case.”
When understanding is thin, we expect to see extreme variety and complexity.
A great deal has been learned since then. It is now recognized that the variety of life is very limited, so much so that the hypothesis of a universal genome has been seriously advanced. My own feeling is that linguistics has undergone a similar development.
Gene therapy is a real thing. It is curing actual people. By contrast, linguistics has made no progress whatsoever by trying to shit higher than its arsehole and pretending it is a High I.Q enterprise like Computing.
The Basic Property takes language to be a computational system, which we therefore expect to respect general conditions on computational efficiency.
This is utterly foolish. Human Languages are a type of network protocol. It is merely a figure of speech to say that cognitive processes have an 'inner language' and thus linguists can muscle in on this territory. We may also say 'the Sun's fire weakens and grows dim'. This does not mean our neighborhood arsonist should be allowed to extort 'donations' in return for his expertise in getting the Sun to burn more fiercely.

Chomsky says the second part of the Galilean challenge has to do with the origin of words, which he considers to be the unanalyzable atoms of a combinatorial system.
'Any combinatorial system begins with atoms: things that are unanalysable from the point of view of the combinatorial system.'
Words have been analyzed using Language since at least the time of Panini.  Thus, neither words nor anything word like can be the 'atoms' Chomsky finds mysterious.

Chomsky says-  'They're completely unknown in animal systems.' which is nonsense coz cats say meow and dogs say woof-woof and little piggies go weee.
 'We have no idea how they evolved, when they evolved, where they came from.'
Rubbish! We all know where the words like 'quark' and 'boson' came from. We can trace the evolution of a word like 'chav' from Estuary English to Romani back to Sanskrit.

Words are the solution to a coordination problem. So is language. Neither are mathematical in any way.

Chomsky says-'They're common to all humans. They all have unique meanings.'

There are no words common to all humans. None have unique meanings.

It seems the second part of the 'Galilean challenge' was to talk even more worthless gibberish than the first time round.

How? By pretending Language is a branch of Math applicable only to a narrow class of, soon to be obsolete, devices-
A computational system consists of a set of atomic elements, and rules to construct more complex structures from them.
But, as Wittgenstein discovered, there are no 'atomic propositions'. Thus linguistics can't find an i-language and generate a Tractatus which would tell us what we can say and what we must remain silent about.
In the generation of the language of thought, the atomic elements are word-like, though not exactly words; for each language, the set of these elements is its lexicon.
'The language of thought' is merely a figure of speech- like the dance of ideas, or the music of mindfulness. It is not factorizable, by any currently conceivable means, into word-like atomic elements. If it were, we could predict the future of thought and what people would be saying, and how they would be saying it, hundreds of years into the future.
Lexical items are commonly regarded as cultural products, varying widely with experience, and linked to extra-mental entities. The latter assumption is expressed in the titles of standard works, such as W. V. O. Quine’s influential study, Word and Object. Closer examination of even the simplest words reveals a very different picture, one that poses many mysteries.
But has doing this helped anyone? Gene therapy has. Computational theory has. But this bullshit has been a waste of resources.
In the analysis of the Basic Property, we are bound to seek the simplest computational procedure consistent with the data of language.
That's what Google Translate is doing. But it takes an 'e-language' approach coz Chomskian i-languages don't exist, or else require more computational power than the Universe affords.
Simplicity is implicit in the basic goals of scientific inquiry. It has long been recognized that only simple theories can attain a rich explanatory depth. “Nature never doth that by many things, which may be done by a few,” Galileo remarked, and this maxim has guided the sciences since their modern origins.
What could be simpler than Voodoo? The reason we give up Magic and pay for Science is coz Magic does not work.
Chomsky's shite is useless. Back in the Sixties, or even the early Seventies, this was not obvious. But now it is.
 It is the task of the scientist to demonstrate this, from the motion of the planets, to an eagle’s flight, to the inner workings of a cell, to the growth of language in the mind of a child.
Yes. Scientists can help kids with abnormal language development. Gene therapy may greatly benefit many people with inherited conditions which affect their ability to communicate.
Linguistics seeks the simplest theory for an additional reason: it must face the problem of evolvability.
Linguistics focused on making languages easier to learn and was very successful. It is true, there are lazy and stupid people like me who don't take advantage of the marvelous new techniques they have discovered- however billions of people across the planet are the opposite of me.

Modern linguistics did look at how different languages evolved and made some startling predictions. But this was because linguists studied actual languages and did useful stuff- like write grammar books and compile dictionaries. They didn't pretend to be Biologists or Computer mavens.
Not a great deal is known about the evolution of modern humans. The few facts that are well established, and others that have recently been coming to light, are rather suggestive. They conform to the conclusion that the language faculty is very simple; it may, perhaps, even be computationally optimal, precisely what is suggested on methodological grounds.
Interchanges between things which are co-evolved express very high Kolmogorov complexity though they themselves may operate according to simple rules.
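
To illustrate, a minimal sketch in Python: elementary cellular automaton rule 30, with compressed size as a crude and admittedly imperfect stand-in for Kolmogorov complexity (which is uncomputable). A one-line update rule yields output that a generic compressor cannot shrink to anything like the size of the rule that produced it:

# Rule 30: a very simple local rule whose global output looks complex.
# Compressed size is only a rough, computable proxy for descriptive complexity.
import zlib

WIDTH, STEPS, RULE = 256, 256, 30
row = [0] * WIDTH
row[WIDTH // 2] = 1                          # single live cell in the middle

bits = []
for _ in range(STEPS):
    bits.extend(row)
    row = [(RULE >> (4 * row[(i - 1) % WIDTH]
                     + 2 * row[i]
                     + row[(i + 1) % WIDTH])) & 1
           for i in range(WIDTH)]

# pack 8 cells per byte so the compressor sees genuine bit-level content
packed = bytes(int("".join(map(str, bits[i:i + 8])), 2)
               for i in range(0, len(bits), 8))
print(len(packed), len(zlib.compress(packed, 9)))
# 8192 raw bytes; the compressed size typically remains in the thousands,
# nowhere near the handful of bytes needed to state the rule itself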

Computational optimality is not directly linked to robustness or anti-fragility. However, Evolutionary Computation- which must be robust- isn't wasteful, indeed, it may be substantively better in the long run because co-evolved processes can change the fitness landscape.

However, this does not mean that the language faculty is simple any more than it means that the faculty of vision is simple.

What is highly unlikely is that all of a sudden one guy gets a 'language gene' mutation and then his mutated gene spreads rapidly. This was Chomsky's theory and it has been disproved. We now believe Neanderthals and Denisovans had language. Chomsky defensively shifts the date for his miracle further back in time. However, he is still not focusing on what really matters- viz. the fitness landscape.
One fact appears to be well established. The faculty of language is a true species property, invariant among human groups, and unique to humans in its essential properties.
Nonsense! We know of no reason why an alien species might not be able to learn our language and communicate with us.
It follows that there has been little or no evolution of the faculty since human groups separated from one another.
Rubbish! The fitness landscape keeps changing so there must have been evolution featuring co-evolved processes of a complex type.
Recent genomic studies place this date not very long after the appearance of anatomically modern humans about two hundred thousand years ago. It was at this time that the San group in Africa separated from other groups. There is little evidence of anything like human language, or symbolic behavior altogether, before the emergence of modern humans.
There is no evidence going the other way. We can't rule out the existence of language a million years up our own genealogical tree. Indeed, even two million years has been suggested.
That leads us to expect that the faculty of language emerged along with modern humans or not long after, a very brief moment in evolutionary time.
This leads Chomsky to expect this only because he had already committed to this silly notion. He isn't really interested in evolution. He is just trying to save face and pretend he didn't advance a foolish view.
It follows, then, that the Basic Property should indeed be very simple.
Coz otherwise Chomsky would be an ultracrepidarian tosser who writes shite about stuff he doesn't understand coz he thinks he is super smart whereas the truth is he took the wrong turn many decades ago.
The conclusion conforms to what has been discovered in recent years about the nature of language, a welcome convergence.
Discoveries about the early separation of the San people are highly suggestive. Although Khoisan speakers appear to possess the general human language capacity, their languages are all and only those with phonetic clicks, with corresponding adaptations in the vocal tract. The most likely explanation for these facts, developed in detail by the Dutch linguist Riny Huijbregts, is that their possession of an internal language preceded their separation from other groups; this in turn preceded the externalization of their language.
Sheer nonsense. Xhosa and Zulu adopted clicks from the Khoi-San.  The vocal tract has some plasticity. I can't become a Tuvan throat singer but a White friend of mine who went and studied with them can perform quite creditably.

Why would the KhoiSan internal language have clicks? What about the first settlers on the Canary islands? Did their internal language feature whistling? My internal language features growling and cooing. I wasn't aware of this till baby came along and I found myself making these and various other more or less uncouth noises. But then we all end up sounding ridiculous when we hold a baby.

As we get older, we develop some curious verbal tics. On the one hand, we try to adopt the language of the young- in my case, a 'Valley girl' Buffy the Vampire Slayer style idiom as well as 'Mean Girls' 'vocal fry'.

On the other hand, we start sounding like our Dads- or, indeed, grandfathers who died before we were born. Thus, when speaking to a young man in a service center in Chennai, I start sounding like a ninety year old Pujari who has never set foot outside the agraharam.

Subramaniyam Swamy has said that he easily mastered Mandarin because of the phonetic template provided by his Mum's 'aiyayo'. The truth is, he was a bright kid and may have overheard Chinese visitors to his father's Institute back in the days of 'Hindi Chini bhai bhai'. A friend of mine picked up Mandarin very fast and was rather pleased about it till his Gran informed him that he could speak the language as a small child because he was very fond of the Chinese family who lived next door to them in Calcutta.

Subliminal memories of this sort, however, represent extensional not intensional or internal language.

Indeed, belief in the latter is utterly foolish. It leads to saying things like 'The San people were living with non San people and speaking to them without any clicks. But inside them an internal language with clicks was forming. Once they went their separate way they felt it safe to start externalizing the clicks.'

Come to think of it, this isn't such a crazy theory. I recall chatting to a fellow Tambram in Delhi airport a few decades back. He sounded just like me. But, unlike me, he'd got a scholarship to an American University. Thus, the moment we were outside Indian airspace the fellow turned into a typical Texan asking the hostess for a 'Bourbon & branch' in a good ol' boy accent. Clearly, he had an internalized language which was pukka American though, till that date, it had been nourished only on buttermilk and rasam and 'filter kaapi' .
 Other groups proceeded in somewhat different ways. Externalization seems to be associated with the first signs of symbolic behavior in the archaeological record.
Very true! The Neanderthals were completely mute till they got to Spain and started painting in caves. Then they started talking to each other about how they painted what they felt, not what they saw and like how all the critics ought to be fed to sabre tooth tigers.
We may be reaching a stage of understanding where the account of language evolution can be fleshed out in ways that were unimaginable until recently.
There has been plenty of imagining about how cavemen talked. We have reached a stage where we can rule out stupid imaginings- like that of Chomsky and his pal the Dutch generative grammar maven.
The genetic endowment for the computational system of language appears to be quite simple. It is a major challenge for research to show how the facts of language are explained in terms of a simple formulation of the Basic Property.
More especially because the thing is nonsense.
Whatever the formulation, it must appeal to the interaction of the Basic Property with specific experiences and language-independent principles. These will, no doubt, include principles of computational efficiency.
I'm watching the Brexit debate on TV at the moment. No principle of computational efficiency can explain this mishegos. 

When's the last time you attended a meeting or a cocktail party or tried to chat somebody up and came away saying 'that interchange was computationally efficient?'
In this regard, the child’s own experiences have only limited relevance, whether in acquiring the meaning of the simplest words, or the syntactic structures and the semantic properties of the language of thought.
This is good news if you want to study language acquisition in kids but are allergic to the presence of those cute, but germ ridden, little tykes.

More broadly, this sort of thinking enables you to be a great linguist without actually being able to speak any language, even your own, with any degree of coherence.

Many Universities already have Language Faculties composed entirely of babbling cretins. As Chomsky says-
Universal properties of the language faculty came to light as soon as serious efforts were undertaken to construct generative grammars.
That's right. Generative Grammar was what caused Linguists to stop studying languages and thus doing something useful.
These included simple properties that had never before been noticed, and that remain quite puzzling. One such property is structure-dependence. The rules that yield the language of thought appeal only to structural properties, ignoring properties of the externalized signal, even such simple properties as linear order. Take, say, the sentence The boy and the girl are here. With marginal exceptions, no one is tempted to say is here, even though the closest local relation is “girl + copula.”
I do say 'da boy 'n gel iz 'ere' in certain situations. Its coz I iz gangsta.
The bigram frequency, which measures the likelihood or frequency of word combinations, is far higher for phrases of the form girl is than girl are. Bigram frequency is a common measure used in computational cognitive science and Big Data analysis.
Right, but that type of research only pays for itself if it is extensional and doesn't sweat the small stuff.
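
For what it is worth, here is what a bigram count actually is- a minimal Python sketch over a made-up toy corpus; real systems count over web-scale text:

# Counting adjacent word pairs: the 'e-language' move is simply to count
# what people actually say, in bulk. The corpus below is a toy assumption.
from collections import Counter

corpus = ("the girl is here . the girl is tired . "
          "the girls are here . the boy and the girl are here .")
tokens = corpus.split()
bigrams = Counter(zip(tokens, tokens[1:]))

print(bigrams[("girl", "is")], bigrams[("girl", "are")])
# 2 1 in this toy corpus; at web scale the ratio is overwhelming, which is
# why purely linear statistics would wrongly predict 'is' in
# 'the boy and the girl are here'
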
Without instruction, we rely on structure not linearity, taking the phrase and not the local noun to determine agreement. Or take the sentence He saw the man with the telescope, which is ambiguous, depending on what we take to be the phrases, although the pronunciation and linear order do not change under either interpretation.
The ambiguity can be easily cleared up because it arises out of a lazy, colloquial, manner of speech. What we would say in a legal document, or sworn testimony, or other such context requiring higher diligence and accuracy is 'X used a telescope to see the man in question'.
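
By way of contrast with the bigram count above, a minimal sketch of the structural point about 'The boy and the girl are here'. The nested structures below are hand-built toy assumptions, not the output of any parser:

# Number agreement read off the subject constituent as a whole, not off
# the linearly nearest noun.

def number(phrase):
    # a coordination ('and') is plural whatever its parts are; otherwise a
    # toy NP like ('the', 'girl') inherits the number of its last word
    if isinstance(phrase, tuple) and phrase[0] == "and":
        return "plural"
    if isinstance(phrase, tuple):
        return number(phrase[-1])
    return "plural" if phrase.endswith("s") else "singular"

subject = ("and", ("the", "boy"), ("the", "girl"))
copula = {"singular": "is", "plural": "are"}[number(subject)]
print("the boy and the girl", copula, "here")
# -> '... are here', even though the nearest noun, 'girl', is singular
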
To take a subtler example, consider the ambiguous sentence Birds that fly instinctively swim. The adverb “instinctively” can be associated with the preceding verb (fly instinctively), or the following one (instinctively swim). Suppose now that we extract the adverb from the sentence, forming Instinctively, birds that fly swim. The ambiguity is now resolved.
But the information content has changed. The first sentence has the meaning Birds that instinctively fly, instinctively swim. We can refute this by catching a sparrow and watching it drown as its feathers get waterlogged. Indeed, even cormorants have to renew their water-proofing using fish oil.

By contrast, we can't refute the statement 'birds that fly swim' because the guy who said it can say 'I meant birds that I educate in feather waterproofing and who go on to do a PhD in Avian Aquatics under my esteemed colleague, Prof. Penguin.'
The adverb is interpreted only with the linearly more remote but structurally closer verb swim, not the linearly closer but structurally more remote verb fly.
No it isn't. It would be silly to do so. We think of birds as instinctive creatures rather than as students attending courses on Avian aquatics.
The only possible interpretation—birds swim—is unnatural.
Why? Birds must be able to float- at least flying birds must. Since they get wet in the rain, it would be natural to suppose that feathers are water resistant to some degree. I didn't know, till I googled it, that cormorants can drown. But, that's Nature for you. It's full of surprising stuff which is why David Attenborough hasn't yet run out of material.
That doesn’t matter. The rules apply rigidly, independent of meaning and fact, ignoring the simple computation of linear distance, and keeping to the far more complex computation of structural distance.
I've just shown that Chomsky's rules don't apply because my understanding of the sentences Chomsky quotes is better than his own despite the fact that I failed Mayo College's I.Q test. Mayo College! That's why my parents wanted me to take up finger-painting or Econometrics or puerile pastimes of that sort.
The property of structure-dependence holds for all constructions in all languages, and it is, indeed, puzzling.
This property is something imputed but is either meaningless or never holds at all. It is only puzzling in the sense that finding one's car keys in the microwave is mysterious to someone who does not acknowledge they have a drinking problem.
Furthermore, it is known without relevant evidence, as is clear in cases like the one I just gave, and innumerable others.
If it is known without relevant evidence it is a foolish dogma wholly unrelated to Aletheia.
Experiments show that quite without instruction children understand structure-dependence by about the age of three.
The same experiments show that those kids understand the turbulent interaction of fourteen dimensional membranes within the Post-Kristevan Chora.
We can be quite confident that structure-dependence follows from principles of universal grammar deeply rooted in the human language faculty.
Only because 'structure dependence' means exactly the same thing as 'turbulent interaction of fourteen dimensional membranes within the Post-Kristevan Chora'.
Structure-dependence is one of the few non-trivial properties of language that usage-based approaches to language have sought to accommodate. The attempts have been reviewed in detail elsewhere. All fail. Totally. Few of them even ask the right question. Why does this property hold for all languages, and all constructions? Other cases fare no better.
Structure-dependence is a feature not of language but one possible mode of its reception. Encountering an idiom for the first time- e.g. raining cats and dogs- we may seek to parse it to tease out its meaning. But, we get nowhere fast. Once its significance is revealed, structure is irrelevant. We may say 'Cats & dogs it's raining yet!' or 'Woof Meow eh? eh? The rain- cats & fucking dogs or what?' Structure is irrelevant where there is 'sphota'- an explosion of expressive force.

Chomsky admits that 'structure-dependence' can be found nowhere on earth, though he thinks it may be found deep within us- but not in our soul, which is not of the Earth, coz Chomsky is a good little Atheist. But, if he is wholly of this world, then surely the right question to ask is, why bother with structure-dependence? The thing is useless. There is no way of looking at the structure of 'it is raining cats & dogs' which will get you to its meaning.

Chomsky won't admit that a generative grammar approach fails when confronted with 'sphota'. Why? Because it would mean admitting he has wasted his life. Thus he is obliged to go outside his own discipline to seek for moral support.
Other sources support the conclusion that structure-dependence is a true linguistic universal, deeply rooted in language design. Research conducted in Milan a decade ago, initiated by Andrea Moro, showed that invented languages keeping to the principle of structure-dependence elicit normal activation in the language areas of the brain.
So would invented languages, of similar complexity, which didn't.
Simpler systems using linear order in violation of these principles yield diffuse activation, implying that experimental subjects are treating them as a puzzle, not a language.
Because they sense the thing does not have enough complexity to be a language.
 Neil Smith and Ianthi-Maria Tsimpli had found similar results in their investigation of a cognitively deficient but linguistically gifted subject.
This is the famous polyglot savant, Christopher. It appears his exceptional memory allows him to use simple 'e-language' heuristics to quickly acquire basic proficiency in a language. However, grammatical complexity defeats him just as it defeats Google Translate. The brain has sufficient plasticity for its 'language areas' to be idiosyncratic.

Chomsky has chosen a bad example for two reasons. The first is it shows only e-language matters, as does 'theory of mind', which is not linguistic at all. Second, complexity can be cracked by co-evolved processes which are themselves simple. Christopher's unfortunate condition is a barrier but his passion and commitment have made him a great inspiration to many people around the world.
They also made the interesting observation that normal subjects can solve the problem if it is presented to them as a puzzle, but not if it is presented as a language. If presented as a language, the language faculty, although presumably activated, would be unable to make sense of the data.
Puzzles can be solved. Language has to be understood. If we think 'this is a puzzle' we feel it is something rewarding and prestigious to figure out. If we think it is some foreign jibber-jabber we lose interest unless the thing is a formula for making your dick bigger or something of that sort.
Structure-dependence must be an innate property of the language faculty.
No. Any property that a language has must be something co-evolved, not innate. Otherwise, King James would have found that a kid shut up by itself and denied the opportunity to learn a language, was genuinely speaking the Adamic language. In reality, feral kids don't learn how to speak. This is because language is not innate.
Why this should be so? There is only one known answer. It is the answer we seek for general reasons. The computational operations of language are the simplest possible.
If you connect a couple of computers and they have to solve a coordination problem, they may soon hit on a 'language' which features cheap talk and costly signals and so forth. However, it wouldn't be the simplest possible. It would be hysteresis afflicted. It may be that under intense enough selective pressure with a big enough population you may get something pretty close to optimality. But, unless P=NP, it won't be the 'simplest possible'.
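
A minimal sketch of the sort of thing I mean- a toy Lewis signaling game in Python with naive reinforcement. Which convention the two machines land on depends on their random early history, which is the hysteresis point; nothing guarantees it is the 'simplest possible':

# Sender and receiver coordinate on a signal-to-state convention purely by
# reinforcing whatever happened to work. The setup is a toy assumption.
import random

random.seed(1)
STATES = SIGNALS = ACTS = [0, 1]
sender = {s: {m: 1.0 for m in SIGNALS} for s in STATES}      # urn weights
receiver = {m: {a: 1.0 for a in ACTS} for m in SIGNALS}

def draw(weights):
    return random.choices(list(weights), weights=list(weights.values()))[0]

for _ in range(5000):
    state = random.choice(STATES)
    signal = draw(sender[state])
    act = draw(receiver[signal])
    if act == state:                  # success reinforces both choices made
        sender[state][signal] += 1.0
        receiver[signal][act] += 1.0

# the emergent 'dictionary': which signal each state has come to favour
print({s: max(sender[s], key=sender[s].get) for s in STATES})
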
This is the outcome that we hope to reach on methodological grounds, and that is to be expected in the light of evidence about the evolution of language.
Still, why do it? If Biologists hit on a better Structural Causal Model, Doctors can find new cures. What good would come out of what Chomsky is proposing? The answer is he'd be able to say 'I told you I was right. Something magical happened and suddenly Human beings became the only species which has language. This means Linguistics is really important.' But, if this were true, Linguistics would not be important. Magic would.

Religious scholars once believed that the language in which their Revealed Scripture was written had some supernatural properties. They no longer do so. Instead, they concentrate on making themselves better people more useful to the needy within their community.
To see why this is the case, consider the simplest recursive operation, embedded in one or another way in all others. This operation takes two objects already constructed, say X and Y, and forms a new object Z, without modifying either X or Y, or adding any further structure to them. Z can be taken to be just the set {X, Y}.
No it can't. Let X be 'Y iff Z is Z' and Y be 'Y iff Z is not Z'. What is Z? Something if it isn't and nothing if it is. Only if there is no impredicativity or there is a Theory of Types can there be 'univalent foundations' of anything within the realm of Computation Theory. Computer proof checking is a similar program to that of specifying i-language. We believe that co-evolved complexity is so great that we chase a mirage. Still, useful stuff can get done by this otherwise futile quest.
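
Setting the impredicative cases aside, here, for the curious, is all that Merge-as-set-formation amounts to computationally- a minimal Python sketch with a toy lexicon of my own choosing:

# Merge as binary set formation: frozensets are unordered, so what gets
# built is hierarchical but carries no linear order at all.

def merge(x, y):
    return frozenset({x, y})

# build {{the, boy}, {read, {the, book}}} - hierarchy without word order
np = merge("the", "book")
vp = merge("read", np)
s = merge(merge("the", "boy"), vp)

print(merge("the", "book") == merge("book", "the"))    # True: order-free
print(s)
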
Chomsky however believes that all this sort of thing proves there is a Grail and it must be the one he believes worked the magic of endowing Humans and Humans alone with Language.
Why? Simply so as to save face and continue to believe he's a smart guy.
In current work, the operation is called Merge. Since Merge imposes no order, the objects constructed, however complex, will be hierarchically structured, but unordered;
& therefore useless because they have no univalent foundations.
operations on them will necessarily keep to structural distance, ignoring linear distance.
Not necessarily. We can always specify operations which delete useless shite and give a well-ordering for useful stuff. Suppose I won the lottery. I could then pay a smart guy to go through my blog deleting all the worthless shite and presenting what remains in a coherent fashion. I know, the result would probably be a rather short sentence. But at least it won't be as foolish as this-
The linguistic operations yielding the language of thought must be structure-dependent, as indeed is the case.
If a thought has a language, then linguistic operations of the type performed by the smart guy I hire to prune shite out of my blog, would not be 'structure dependent' at all. They would completely destroy any existing structure, delete what isn't useful, and only present what is useful, that too in a coherent, action guiding, fashion.
An appeal to simplicity appears to answer the question why all human languages must exhibit structure-dependence.
An appeal to simplicity also tells us that we live in an Occasionalist universe- or maybe Reality is a hologram- or Life is but a dream.

Structure-dependence is not simple- in effect it means having an intensional theory of Types. This can be very useful- e.g. for computer checking of mathematical proofs- but, for the moment, it isn't for Linguistics. As the saying (attributed to Einstein) goes- Everything should be as simple as possible, but not simpler. In particular, we have to reject a simpler theory if there is even one datum which contradicts it. This is what Chomsky fails to do.

The externalization of language maps internal structures into some sensorimotor modality, usually speech. The sensorimotor system requires linear order; we cannot speak in parallel.
Nonsense! Socrates invokes the palinode to show how speech can have 'dhvani'- i.e. echoes from the path it does not follow.
But the language of thought keeps to structural relations.
In which case, the above is not a thought. Either the language of thought has an intensional definition of 'structural relations'- in which case the thing is a proper subset- or it can't- in which case it has no means to keep within the bounds set by 'structural relations'.  In the former case, 'the language of thought' is merely the ergodics of 'structural relations'. But that means it can't be a language. It is a road, not a road-sign. In the latter case, it is not a language but a babble from which useless stuff is culled by the fitness landscape. 'Structural relations' then simply means the vicissitudes inflicted upon us by the world till we learn to 'carve up Reality along its joints'.
Externalization seems to be a peripheral aspect of language.
Which is why Linguists needn't bother learning languages. Instead they should pretend to know from Set Theory or the theory of Computation or of Evolution or anything else under the Sun except the stuff they are paid to study.
It does not enter into the core function of providing a language of thought. This is at odds with the traditional formulation of the Galilean challenge itself.
So, Chomsky admits that the 'Galilean challenge' was silly. We don't have infinite expressive capacity, just because we know the letters of the alphabet and can combine them as we like, any more than we can fly by flapping our arms in the air.
Thoughts exist. Why provide them a language? Let the buggers invent their own if they are so smart.
Perception yields further evidence in support of this conclusion.
Very true, I saw some dirty thoughts of mine lining up outside a College which teaches English to beautiful young au pairs.  Am I too old to hire an au pair? Maybe I could hire one to teach my dirty little thoughts Swedish.
The auditory systems of apes are quite similar to those of humans, even attuned to the phonetic features that are used in language. But the shared auditory-perceptual systems leave apes without anything remotely like the human faculty of language. Apes can, of course, gesture, but even with arduous training cannot use their gestural systems with even the most elementary properties of language, though as is now known, sign languages are developed spontaneously by humans even without any linguistic input. Similarly, recent research with dogs has found that they are attuned to the phonological and intonational features of human language. They may even have similar hemispheric specialization, but, of course, that provides them with no step at all towards the acquisition of language. Many such results support the conclusion that our internal language is independent of externalization, and that it evolved quite independently.
Our 'theory of mind' maybe. Perhaps it co-evolved with language in the way that dogs co-evolved certain useful and companionable traits alongside us.

'Internal language' however would be wholly useless until and unless the fitness landscape militated for the evolution of external language and then having internal language improved one's reproductive fitness.
It is language design that provides the most powerful evidence for this thesis.
Language design is a useful fiction like 'the social contract' or a mythological story about how a particular ethnicity came into existence. Sir Edward Coke used the method of 'legal fiction' or 'artificial reason' to claim that the Common Law was designed by Greek speaking Druids.
The linguistic universal of structure-dependence follows from the null hypothesis that the computational system is optimal.
but only because ex falso quodlibet. From nonsense any nonsense follows.
It is for this reason indifferent to linear order, which is, of course, the most elementary feature of externalization.
Linear order does not matter. Context does.
Not long ago it would have seemed absurd to propose that the operations of human language could be reduced to Merge, along with language-independent principles of computational efficiency.
No. Back in the Sixties, it wasn't absurd because many people hadn't read the later Wittgenstein and thus thought atomic propositions could exist. Indeed, there were Maoists who spoke of how the Chinese had invented a new language which, 'carved up Reality along its joints' and thus they were able to solve all coordination problems much more efficiently than the rest of us. Indeed, I recall hearing a story about how Chairman Mao would retaliate against any nuclear attack by getting all the Chinese people to jump into the air at just the right moment. Falling back to earth, they would temporarily knock the earth out of its orbit such that the US would get fried by Cosmic Rays.

Perhaps Chomsky keeps up with research of that sort- which is why he says
Work of the past few years has shown that quite intricate properties of language follow directly from such assumptions.
Sometime in the past, there was a miraculous mutation which spread by some miracle such that everybody developed an internal language and then independently made up external languages. If their internal language had clicks, then they set off for South Africa and started speaking with clicks. If their internal language liked whistling, then the whistlers went off to the Canary Islands where they could whistle at each other to their heart's content.

Work of the past few years- performed by maniacs- has proved all this. Fuck is wrong with you? Don't you understand that the fact that you don't have an internal language which says 'Chomsky is a genius' PROVES you are a fucking cretin and will vote for Trump and start whacking off to Fox News?
Displacement is a ubiquitous and puzzling property of language. Phrases are heard in one position but interpreted in two, both in their original position and in some other position that is silent, but grammatically possible. The sentence, “Which book will you read?” means roughly, “For which book x, you will read the book x,” with the nominal phrase book heard in one position but interpreted in two.
This is nonsense. The context determines the meaning of the sentence. When I go into the local library, the librarian naturally assumes I'm going to sit in a corner jerking off. So she says 'Which book will you read?' because I'm clearly illiterate though I will no doubt jizz on the panels of She-Hulk.

On the other hand, if this question is asked of me by a sympathetic Bengali, the meaning is 'If you weren't a Tambram thicko, which book would you read?' and I'd reply 'That big book about y'know how the dinosaurs were y'know sexually harassed by Donald Trump and went extinct'. The Bengali will then gas on about that Uncle of his who was Professor of Paleontology at Harvard till his craving for sandesh overpowered him and he returned to Kolkata and met a Tambram who was the missing link between Humans and Chimpanzees. Anyway, the Iyengars bribed him with a lifetime supply of sandesh to keep the thing quiet.
Displacement is never built into artificial symbolic systems for metamathematics, programming, or other purposes.
It isn't built into legal documents either. In contexts where a high duty of care is demanded, ambiguous language is avoided and all possible contexts are provided for in advance.
It had long been assumed to be a peculiar and puzzling imperfection of natural language.
The thing is purely economic. Precision is costly. There is no puzzle here.
Merge automatically yields displacement with copies—in this case, two copies of which book.
That's not what happened when Chomsky interpreted the sentence. He has two copies not of 'which book' but of 'book x'.
The correct semantic interpretation follows directly.
But not from 'Merge' because Chomsky didn't say 'Let x be the book you will read. What is the name of x?'
Anyway, as I showed, his semantic interpretation was faulty because the meaning is context-dependent. In my case, the question is either dismissive and humiliating or an invitation to indulge in the fantasy of being a smart guy.
Far from being an imperfection, displacement with copies is a predicted property of the simplest system. Displacement is, in some respects, even simpler than Merge, since it calls on far more limited computational resources.
If it is simpler, then get rid of Merge and settle for Displacement because your big shtick is 'simplicity rules, Ok!'
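For what it's worth, here is a minimal Python sketch- mine, not anything in Chomsky's papers- of Merge as binary combination and of 'internal Merge' yielding the two copies he keeps talking about; the tuple representation and the sample derivation for 'which book will you read' are arbitrary choices made for illustration:

```python
# A toy sketch of 'Merge' as binary combination, and 'internal Merge' as
# re-merging a constituent already inside the object, which is what yields
# 'displacement with copies'. My own illustration, not Chomsky's formalism;
# the tuple representation is an arbitrary choice.

def merge(x, y):
    """External Merge: combine two syntactic objects into one."""
    return (x, y)

def internal_merge(obj, part):
    """Internal Merge: re-merge a sub-part of obj with obj itself.
    The original occurrence is left in place, so 'part' now occurs twice."""
    return (part, obj)

wh = ("which", "book")
vp = merge("read", wh)                # read [which book]
tp = merge("you", merge("will", vp))  # you will read [which book]
cp = internal_merge(tp, wh)           # [which book] you will read [which book]

print(cp)
# (('which', 'book'), ('you', ('will', ('read', ('which', 'book')))))
# Two occurrences ('copies') of ('which', 'book'): the higher one is
# pronounced, the lower one is interpreted in its original position.
```

Note that only the bookkeeping is automatic here. Nothing in this sketch tells you which copy gets pronounced, let alone how context fixes what the question means.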
The same processes provide intricate semantic interpretations for such properties as referential dependence and quantifier-variable interaction. They also have further implications about the nature of language. Consider the sentence “the boys expect to see each other,” where “each other” refers to the boys, thus obeying an obvious locality condition of referential dependency. Consider now the sentence, “Which girls do the boys expect to see each other?” The phrase “each other” does not refer back to the closest antecedent, “the boys,” as such phrases universally do; rather, it refers back to the more remote antecedent, “which girls.” The sentence means “For which girls the boys expected those girls to see each other?”
That's not true. The expecting is ongoing. We may in the past have expected Jennifer to see Charmaine but currently we expect only Charmaine to see Jennifer coz Jennifer will have Charmaine's boyfriend's jizz in her eyes and thus be temporarily blinded and unable to fend off Charmaine's furious bitch-slapping.
That is what reaches the mind under Merge-based computation with automatic copies, although not what reaches the ear.
We form a mental image of certain girls who are currently expected by a certain group of boys to see each other. This image can be quite vivid depending on the context. In a sit-com, like Friends, we feel the tension rise as the boys realize which of the girls are going to see each other and start gossiping and thus discover that if Chandler was coming out of Monica's bedroom in a tousled state at the same time that Monica was helping Joey make a pizza then...OMG Richard is banging Chandler! No wonder his mustache fell off!

This has nothing to do with computation and everything to do with what we always suspected.
What reaches the ear violates the locality condition of referential dependency.
Like Chandler getting violated by Richard.
Deletion of the copy in externalization causes processing problems.
Monica is going to have problems processing this. So is Joey. Phoebe will be cool with it though. So much to talk about at the water-cooler tomorrow!
Such filler-gap problems, as they are called, can become quite severe, and are among the major problems of automatic parsing and perception. In the sentence, for example, “Who do you think ____ left the show early?” the gap marks the place from which the interrogative has been moved, creating a long-distance dependency between the interrogative and the finite verb. If the interrogative copy were not deleted, the problem would be much reduced. Why is it deleted?
To build dramatic tension- d'uh! Guess who left the show just when Phoebe got up on stage and started singing 'Smelly Cat'? Yup! It was Chandler, overcome by shame at his malodorous man-pussy.
The principles of efficient computation restrict what is computed to the minimum. At least one copy must appear or there is no evidence that displacement took place at all. In English and languages like English, that copy must be structurally the most prominent one. The result is to leave gaps that must be filled by the hearer. This is a matter that can become quite intricate.
Coz intricacy builds tension or has some other pleasurable psychological effect.
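Since Chomsky makes much of the parsing problem, here is a toy Python sketch- mine, with no real grammar behind it- of why deleting the lower copy creates a search problem at all, using his own 'Who do you think ____ left the show early?' example:

```python
# A toy sketch of the 'filler-gap' problem: once the displaced copy is
# deleted, a hearer (or parser) must consider where the gap might be;
# if the copy were pronounced, the position could simply be read off.
# Purely illustrative- no real grammar is consulted here.

def candidate_gaps(words):
    """With the copy deleted, every position between words is, a priori,
    a possible gap site that a parser has to entertain and rule out."""
    return [words[:i] + ["__"] + words[i:] for i in range(len(words) + 1)]

def read_off_gap(words, filler):
    """With the copy pronounced, the gap site is just where the lower
    copy sits- no search required."""
    return words.index(filler, 1)  # skip the fronted occurrence

heard = ["who", "do", "you", "think", "left", "the", "show", "early"]
for hypothesis in candidate_gaps(heard[1:]):
    print(" ".join(hypothesis))   # the search space the hearer faces

with_copy = ["who", "do", "you", "think", "who", "left", "the", "show", "early"]
print("gap at position", read_off_gap(with_copy, "who"))  # -> 4
```

So yes, deletion makes the hearer work. Whether that work is a design flaw, an economy, or a way of building tension is exactly the sort of question a context-free story can't answer.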
These examples illustrate a significant general phenomenon. Language design appears to maximize computational efficiency, but disregards communicative efficiency.
Language is rewarding when it creates dramatic mental pictures.
Thus- 'Guess who left the show while Phoebe was halfway through her set?
            Hint: she was singing 'Smelly Cat'.
            That's right! It was Chandler. Could I have a smellier man-pussy!'
is highly efficient though computationally intractable.
In every known case in which computational and communicative efficiency conflict, communicative efficiency is ignored.
Computation is multiply realisable. Very high Kolmogorov complexity can be factorized using 'co-evolved' modules which are simple. However, the thing is idiographic, not nomothetic. To get to 'univalent' foundations you are going to need a heck of a lot of intensional Types. But that doesn't matter so long as the thing is useful. Voevodsky's work is useful. Chomsky's isn't.
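To spell out 'multiply realisable' with a toy example- mine, and about arithmetic rather than language- two procedures whose internal descriptions differ wildly can be extensionally identical:

```python
# A minimal sketch of 'multiple realisability': two intensionally very
# different procedures that are extensionally the same function. The
# example (summing 0..n) is arbitrary and my own, chosen only because
# the equivalence is easy to check.

def sum_by_loop(n):
    """One realisation: accumulate term by term."""
    total = 0
    for k in range(n + 1):
        total += k
    return total

def sum_by_formula(n):
    """Another realisation: Gauss's closed form. Different 'internal
    language', same external behaviour."""
    return n * (n + 1) // 2

# Extensionally indistinguishable on any input you care to test:
assert all(sum_by_loop(n) == sum_by_formula(n) for n in range(1000))
print(sum_by_loop(10), sum_by_formula(10))  # 55 55
```

An observer who only sees inputs and outputs can't tell which 'internal' procedure is being run. That is the trouble with grand pronouncements about an internal language nobody can observe.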
These facts run counter to the common belief, often virtual dogma, that communication is the basic function of language.
This is not a common belief because it serves no good purpose. True, from time to time, we say 'fuck is wrong with you? Why can't you speak plainly?' but then we immediately realize that the other guy is either a cretin or a swindler or a swindling cretin whom everybody else already sees through. Still, why provoke the fellow? Just nod and smile and find someone else to talk to.
They also further undermine the assumption that human language evolved continuously from animal communication.
Humans are animals. We evolved continuously from other animals. It is not plausible that there was a sudden miracle such that humans and humans alone got an 'internal language' nor that, if the internal language featured clicks, those with that variety of the trait went off to South Africa where after some time they started speaking with clicking sounds.
And they provide further evidence that externalization, which is necessary for communication, is a peripheral aspect of language.
Communicational efficiency is about having the maximal impact using the simplest (i.e. most visceral) conceptual means. It is wholly context-dependent. Wit requires brevity. Humor may be prolix. In the Drama, terseness builds tension. But not always. In a Shakespearean soliloquy, you want baroque, sufflaminandus erat, excess because it builds tension past the point where it is bearable and provokes a cathartic purgation. Context or Custom rules over all.

Failure to see this would lead a linguist to think words don't matter. Instead of doing his own job, he then pretends to be a biologist or a computer maven, or to be doing some other sort of useful and prestigious research.
There are methodological and evolutionary reasons to expect that the basic design of language will be quite simple, perhaps even close to optimal.
There is no reason to believe that ultracrepidarian cretins will ever discover even the simplest and most elementary fact about anything.
With regard to externalization, the same methodological arguments hold, as they always do, but the evolutionary arguments do not apply.
Quite mad! Evolution certainly determined how our vocal cords came to be as they are. Externalization is under selective pressure- which is why guys with a good line in chat get the girls while yours truly ends up in the kitchen talking to other losers from the Engineering Dept.

By contrast, Chomsky's 'internal language' could not have been under selective pressure because he says it came into being before there was any external speech. It is reasonable to say this about 'theory of mind', but such a theory is not a language.
The externalization of language may not, in fact, involve evolution at all. The sensorimotor systems were in place long before the appearance of language.
Sez you. Where is the proof?
Mapping the language of thought to some sensorimotor system is a hard cognitive problem, one that involves coordinating a computationally efficient internal system and an unrelated sensory modality.
It is not a cognitive problem precisely because it is solved by an actual, not simulated, evolutionary process. Guys with the gift of the gab pull girls and have babies. Boring sods like me stagger home alone and unloved.
The variety, complexity, and easy mutability of observed languages might lie primarily in externalization. It seems increasingly clear that this is the case—something that should be expected. Children know the principles of the internal language without evidence; as, indeed, they know a great deal more about language, including almost all semantic and most syntactic properties.
Children also know the principles of nuclear fusion coz they are touched by the Sun's rays.

If you use a word like knowledge to mean whatever you want it to mean, then your drivel has no semantic or syntactic properties. It is shite.
This is a matter of contention, but solidly established, I think.
Coz it suits your amour propre.
There is by now some reason to hope that the emerging science of neurolinguistics might identify the brain circuits that underlie the computational system. The neuroscientist Angela Friederici reviews a great deal of promising work in a forthcoming book. Publication is scheduled to coincide with the fiftieth anniversary of Eric Lenneberg’s classic study, Biological Foundations of Language. Friederici’s own work leads to some bold and challenging proposals. She provides evidence that a crucial element in linguistic computation is a white matter dorsal fiber tract that connects a specific region in Broca’s area, part of Brodmann area 44, to the posterior temporal cortex. She suggests that this pathway might be “the missing link which has to evolve in order to make the full language capacity possible.” Evidence indicates that this dorsal pathway is very weak in macaques and chimpanzees, and weak and poorly myelinated in newborns, but strong in adult humans with language mastery. The increasing strength of this pathway, Friederici remarks, “correlates directly with the increasing ability to process complex syntactic structures.” A variety of experimental results suggest that “[t]his fiber tract may thus be one of the reasons for the difference in the language ability in human adults compared to the prelinguistic infant and the monkey.” These structures, Friederici suggests, appear to “have evolved to subserve the human capacity to process syntax, which is at the core of the human language faculty.” Quite intriguing discoveries might be forthcoming in these domains.
Cool! This is useful stuff which can help a lot of good people with inherited abnormalities or who have suffered a brain injury or a stroke. Scientists doing this sort of work are welcome to speculate in a grandiose manner because their work more than pays for itself. Chomsky is doing nothing similar. He made a mistake many decades ago and only takes an interest in what genuine scientists are doing coz he clings to the tattered hope that, by some miracle, they will prove he was right.

Similarly, I like reading articles about penguins coz my sister told me I was born a penguin in a Zoo in Germany and my parents adopted me and raised me as a human being. I did write to Prime Minister Indira Gandhi mentioning this because I wanted her to claim Antarctica as sovereign Indian territory. She never wrote back coz she'd been dead for two decades. Anyway, that's why I gave up Indian citizenship.
Let us return to the second component of a computational system, its atomic elements. In the case of language, these will be its lexical items. The conventional view is that these are cultural products, and that the basic ones are associated with extra-mental entities. This representationalist doctrine has been almost universally adopted in the modern period. The doctrine appears to hold for animal communication. Monkey calls are associated with specific physical events. The doctrine is radically false for human language, something recognized in classical Greece.
Coz we have theory of mind and need to solve both coordination and discoordination games coz we are strategically deceptive, right?

Not right, says Chomsky coz he doesn't get that efficiency is what drives Schelling focality.
How can we cross the same river twice, asked Heraclitus? Why are two appearances understood to be two stages of the same river? When we look into the question, puzzles abound.
No they don't. Calling the Thames the Thames solves a coordination problem.
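Since I keep banging on about coordination problems, here is a minimal sketch- the names and payoffs are invented by me- of the pure coordination game that a shared name for the river solves:

```python
# A toy pure-coordination game: two speakers get a payoff only if they
# use the same name for the river. The names and payoffs are made up;
# the point is just that a shared convention ('the Thames') is an
# equilibrium, and a focal one at that.

import itertools

names = ["Thames", "Tamesis", "that wet thing"]

def payoff(a, b):
    """1 if the two speakers coordinate on the same name, else 0."""
    return 1 if a == b else 0

# Every profile where both pick the same name is a Nash equilibrium:
equilibria = [(a, b) for a, b in itertools.product(names, repeat=2)
              if payoff(a, b) >= max(payoff(alt, b) for alt in names)
              and payoff(a, b) >= max(payoff(a, alt) for alt in names)]
print(equilibria)
# [('Thames', 'Thames'), ('Tamesis', 'Tamesis'), ('that wet thing', 'that wet thing')]
```

Nothing in the game picks out 'Thames' except Schelling focality- salience, history, custom- which is precisely what an 'internal language' story has nothing to say about.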
Suppose that the flow of the river has been reversed. It is still the same river. Suppose that what is flowing becomes ninety-five percent arsenic because of discharges from an upstream plant. It is still the same river. The same is true of other quite radical changes.
A radical change in the Society which needs a name for the river may lead to a change in its name.
On the other hand, with very slight changes it will no longer be a river at all. If its sides are lined with fixed barriers and it is used for oil tankers, it is a canal, not a river.
No. It is still a river. A canal is man-made from the get-go.
If its surface undergoes a slight phase change and is hardened, a line is painted down the middle, and it is used to commute to town, then it is a highway, no longer a river.
That's not a 'slight phase change' at all.
 Exploring the matter further, we discover that what counts as a river depends on mental acts and constructions.
No shit, Sherlock!
The same is true of even the most elementary concepts: tree, water, house, person, London, or, in fact, any of the basic words of human language. Human language and thought systematically violate the representationalist doctrine.
This is because doctrines with fancy names invented by stupid pedants are worthless shite.
Our intricate knowledge of what even the simplest words mean is acquired virtually without experience. At peak periods of language acquisition, children acquire about a word an hour, often on one presentation. The rich meaning of even the most elementary words must be substantially innate.
So God must exist coz 'God' is an elementary word.
The evolutionary origin of such concepts is a complete mystery.
In which case, Chomsky is wrong to be an Atheist.
The Galilean challenge must be reformulated to distinguish language from speech, and to distinguish production from internal knowledge.
I have frequently explained, in my blog posts, that the Nicaraguan horcrux of my neighbor's cat is responsible for everything that happens. I don't, however, reformulate the Galilean challenge to distinguish language from speech, because the Nicaraguan horcrux of my neighbor's cat already distinguished production from internal knowledge. Thus everybody knows- internally- that I'm a great genius even if they are constantly complaining that I'm an incontinent cretin.

Chomsky is fixated on internal knowledge coz  he thinks there is a possible world where some magical event happened such that everybody knows internally that he is right even though he is as wrong as it is possible to be. Of course, this would not apply to alien species. That is why he says knowledge is uniquely human. We were magically endowed with the language faculty only so as to be able to say- internally, at least- Chomsky has been right all along.
Our internal computational system yields a language of thought, a system that might be remarkably simple, conforming to what the evolutionary record suggests.
It is so simple that it produces only one result- viz. our internally knowing Chomsky is a genius even though he is a silly man.
Secondary processes map the structures of language to one or another sensorimotor system for externalization. These processes appear to be the locus of the complexity and variety of linguistic behavior, and its mutability over time.
But, if that is true, then externalization isn't 'peripheral' at all. It is the royal road to that internal language Chomsky believes in. Thus, a linguistics that pays for itself by doing useful stuff would also teach us about the 'language of thought'.

However, the linguistics which pays for itself is extensional, not intensional. So Chomsky has been barking up the wrong tree- at least so far as we know.
The origins of computational atoms remain a complete mystery.
They can't exist save by retrofitting.
So does the Cartesian question of how language can be used in its normal creative way, in a manner appropriate to situations, but not caused by them, incited and inclined, but not compelled. The mystery holds for even the simplest forms of voluntary motion.
Mystery, for the Christian West, is linked to the notion of oikonomia. Economics- more particularly, paradigms from evolutionary game theory- is what renders such mysteries not sinister or occult, but sources of comfort and illumination and a wisdom deeper than tears.
A great deal has been learned about language since the Biolinguistic Program was initiated.
But not by Chomsky.
It is fair to say, I think, that more has been learned about the nature of language, and about a very wide variety of typologically different languages, than in the entire two-thousand-five-hundred-year prior history of inquiry into language. New questions have arisen, some quite puzzling. Some surprising answers lead us to revise what has long been believed about language, and mental processes generally. The more we learn, the more we discover what we do not know.
And the more puzzling it often seems.
But only 'seems'. The good news, according to Chomsky, is that you all have an internal language inside you which keeps saying- Chomsky, despite all evidence to the contrary, was, is, and will always be right. At least for human beings- because of some special mutation we have. Is it a coincidence that Aliens started showing up and anally probing random American dudes just when Chomsky started pushing this theory? Clearly, those aliens were looking for the asshole who dissed their language. The Pentagon knows all about it, but have hushed the thing up coz of a selfish desire to get a good probing themselves. Still, sooner or later, the Aliens will find Chomsky and have their way with him. Then all these UFO sightings will stop. Shit. Just checked. Gillian Anderson has said no to a proposed X Files re-reboot so we're gonna miss the Series finale where Chomsky gets reamed. Shame on you, Gillian! Not that you're not smokin' in Sex Education, but the Truth is out there- or would be if you came back to the Files and Chomsky got his comeuppance.
