Language as a phenomenon is obviously neither purely functional nor purely statistical. Purity is an abstract nonsense, an abstract category of abstractions.<p>It is obvious from serious psychological studies of language acquisition that the process is similar to training a neural network - some knowledge representation grows in the brain, but the training/learning is possible only because the brain has the appropriate machinery for it.<p>It seems that we have more than two a priori notions - besides time and space we perhaps have, at the very least, a priori notions of a thing (noun), a process (verb), an attribute (adjective), and even a predicate, as reflections of our perception of the physical universe around us, via the sensory-processing machinery we happened to evolve.<p>It is a mutually recursive process - we evolved our "inner representation" of reality constrained by the senses, and nature selects, in some cases, those with more correct representations.<p>How these a priori notions map to sounds - the details of phonology and morphology - is rather irrelevant; we evolved machinery for that. This is why there are no fundamental, principal differences between human languages. The difference is in degree, not in kind.<p>It also seems that we learn not rules (schools are a very recent innovation) but "weights", by being exposed to the medium of the local spoken language. Children do it on their own, at least in remote areas, like among the nomads of the Himalayas, no worse than Americans do. This, by the way, is proof that we have everything we need to be a Buddha or an Einstein.<p>How exactly this training occurs is entirely unknown, but it has nothing to do with probabilities. Nature knows nothing about probabilities, but it obviously "knows" rates - how often something happens.
Animals "know" how often something happen.<p>Probabilities is an invention of the mind, which leads to so many errors in cases where not all possible outcomes and its caused are know, which is almost always the case. Nature could not rely on such faulty tool.<p>So, like every naturally complex system, it has both "procedures" and "weighted" data. Language capacity is hardwired, but grammar "grows" according to exposure.<p>To speak about hows, and especially how-exactlys in terms of either pure procedures or pure statistics is misleading. It is both.<p>And Mr.Chimsky is right - mere data, leave alone probabilistic models, describe nothing about principles behind what is going on. They does not even describe what's going on correctly, only some approximation to an overview of something unknown being partially observed.<p>The more or less correct model, as a philosophy, must be grounded in reality, especially in that part of it which we call the mind. It has been pointed out, that mind itself is possible because of hardwired apriory notions (grounded in physical universe) of succession and distance, so models should be augmented with these notions too. Pure statistics is nothing.