Linguistics of Noam Chomsky

The basis of Chomsky's linguistic theory lies in biolinguistics, the linguistic school that holds that the principles underpinning the structure of language are biologically preset in the human mind and hence genetically inherited.[2] He argues that all humans share the same underlying linguistic structure, irrespective of sociocultural differences.[3] In adopting this position, Chomsky rejects the radical behaviorist psychology of B. F. Skinner, who viewed speech, thought, and all behavior as a completely learned product of the interactions between organisms and their environments. Accordingly, Chomsky argues that language is a unique evolutionary development of the human species, distinct from the modes of communication used by any other animal species.[4][5] Chomsky's nativist, internalist view of language is consistent with the philosophical school of "rationalism" and contrasts with the anti-nativist, externalist view of language consistent with the philosophical school of "empiricism",[6] which contends that all knowledge, including language, comes from external stimuli.[1]

What started as purely linguistic research ... has led, through involvement in political causes and an identification with an older philosophic tradition, to no less than an attempt to formulate an overall theory of man. The roots of this are manifest in the linguistic theory ... The discovery of cognitive structures common to the human race but only to humans (species specific), leads quite easily to thinking of unalienable human attributes.

Edward Marcotte on the significance of Chomsky's linguistic theory[1]

Universal grammar

Since the 1960s, Chomsky has maintained that syntactic knowledge is at least partially inborn, implying that children need only learn certain language-specific features of their native languages. He bases his argument on observations about human language acquisition and describes a "poverty of the stimulus": an enormous gap between the linguistic stimuli to which children are exposed and the rich linguistic competence they attain. For example, although children are exposed to only a very small and finite subset of the allowable syntactic variants within their first language, they somehow acquire the highly organized and systematic ability to understand and produce an infinite number of sentences, including ones that have never before been uttered, in that language.[7] To explain this, Chomsky reasoned that the primary linguistic data must be supplemented by an innate linguistic capacity. Furthermore, while a human baby and a kitten are both capable of inductive reasoning, if they are exposed to exactly the same linguistic data, the human will always acquire the ability to understand and produce language, while the kitten will never acquire either ability. Chomsky attributed this difference in capacity to an innate language acquisition device, and suggested that linguists needed to determine both what that device is and what constraints it imposes on the range of possible human languages. The universal features that result from these constraints would constitute "universal grammar".[8][9][10]

Multiple scholars have challenged universal grammar on the grounds of the evolutionary infeasibility of its genetic basis for language,[11] the lack of universal characteristics between languages,[12] and the unproven link between innate/universal structures and the structures of specific languages.[13] Michael Tomasello has challenged Chomsky's theory of innate syntactic knowledge as being based on theoretical argument rather than on behavioral observation.[14]

Although it was influential from the 1960s through the 1990s, Chomsky's nativist theory was ultimately rejected by the mainstream child language acquisition research community owing to its inconsistency with research evidence.[15][16] Linguists including Robert Freidin, Geoffrey Sampson, Geoffrey K. Pullum, and Barbara Scholz have also argued that Chomsky's linguistic evidence for it was false.[17]

Transformational-generative grammar

Transformational-generative grammar is a broad theory used to model, encode, and deduce a native speaker's linguistic capabilities.[18] These models, or "formal grammars", show the abstract structures of a specific language as they may relate to structures in other languages.[19] Chomsky developed transformational grammar in the mid-1950s, whereupon it became the dominant syntactic theory in linguistics for two decades.[18] "Transformations" refers to syntactic relationships within language, e.g., being able to infer that two different sentences share the same subject.[20] Chomsky's theory posits that language consists of both deep structures and surface structures: outward-facing surface structures relate to sound through phonetic rules, while inward-facing deep structures relate words to conceptual meaning. Transformational-generative grammar uses mathematical notation to express the rules that govern the connection between meaning and sound (deep and surface structures, respectively). By this theory, linguistic principles can mathematically generate potential sentence structures in a language.[1]
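
The core idea that a finite set of explicit rewrite rules can generate an unbounded range of sentence structures can be illustrated with a toy phrase-structure grammar. The following Python sketch is purely illustrative: its rules, symbols, and vocabulary are invented for this example and are not drawn from Chomsky's own formalism or from the sources cited here. It repeatedly rewrites a start symbol S until only words remain.

  import random

  # Toy phrase-structure grammar: each nonterminal maps to its possible expansions.
  # The rules and vocabulary are invented for illustration only.
  RULES = {
      "S":   [["NP", "VP"]],
      "NP":  [["Det", "N"], ["Det", "N", "PP"]],
      "PP":  [["P", "NP"]],
      "VP":  [["V", "NP"], ["V"]],
      "Det": [["the"], ["a"]],
      "N":   [["child"], ["sentence"], ["idea"]],
      "P":   [["about"], ["near"]],
      "V":   [["hears"], ["produces"]],
  }

  def generate(symbol="S"):
      """Expand a symbol by recursively applying the rewrite rules."""
      if symbol not in RULES:           # terminal: an actual word
          return [symbol]
      words = []
      for part in random.choice(RULES[symbol]):
          words.extend(generate(part))
      return words

  for _ in range(3):
      print(" ".join(generate()))       # e.g. "the child hears a sentence about the idea"

Because an NP can be rewritten so that it contains another NP (via the PP rule), even this tiny rule set defines an unbounded number of distinct structures, echoing the observation above that speakers can produce sentences never uttered before.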

Set inclusions described by the Chomsky hierarchy: four nested grammar classes, each contained in the next, from regular through context-free and context-sensitive to recursively enumerable.

Chomsky is commonly credited with inventing transformational-generative grammar, but his original contribution was considered modest when he first published his theory. In his 1955 dissertation and his 1957 textbook Syntactic Structures, he presented recent developments in the analysis formulated by Zellig Harris, who was Chomsky's PhD supervisor, and by Charles F. Hockett. Their method derives from the work of the Danish structural linguist Louis Hjelmslev, who introduced algorithmic grammar to general linguistics. Building on this rule-based notation of grammars, Chomsky grouped logically possible phrase-structure grammar types into four nested, increasingly complex subsets, together known as the Chomsky hierarchy. This classification remains relevant to formal language theory[21] and theoretical computer science, especially programming language theory,[22] compiler construction, and automata theory.[23]
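
The hierarchy can be made concrete with small example languages. The Python sketch below is illustrative only (the example languages and function names are chosen for this illustration, not taken from the article's sources): strings of a's followed by b's in any quantities form a regular (type-3) language recognizable by a finite automaton, whereas strings of n a's followed by exactly n b's form a context-free (type-2) language that requires the unbounded memory of a pushdown automaton.

  def accepts_regular(s):
      """Recognize a*b* with a two-state finite automaton (regular, type 3)."""
      state = "A"                       # still reading a's
      for ch in s:
          if state == "A" and ch == "a":
              continue
          elif ch == "b":
              state = "B"               # once b's begin, only b's may follow
          else:
              return False
      return True

  def accepts_context_free(s):
      """Recognize a^n b^n with a counter standing in for a pushdown stack
      (context-free, type 2); no finite automaton can keep this count."""
      count, seen_b = 0, False
      for ch in s:
          if ch == "a" and not seen_b:
              count += 1                # push an 'a'
          elif ch == "b" and count > 0:
              seen_b = True
              count -= 1                # pop a matching 'a'
          else:
              return False
      return count == 0

  print(accepts_regular("aaabb"), accepts_context_free("aaabb"))  # True False
  print(accepts_regular("aabb"),  accepts_context_free("aabb"))   # True True

Context-sensitive languages (such as strings with equal numbers of a's, b's, and c's in that order) and recursively enumerable languages require progressively more powerful machinery, up to an unrestricted Turing machine.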

Transformational grammar was the dominant research paradigm through the mid-1970s. Its derivative,[18] government and binding theory, replaced it and remained influential through the early 1990s,[18] when linguists turned to a "minimalist" approach to grammar. This research focused on the principles and parameters framework, which explained children's ability to learn any language by the filling in of open parameters within a set of universal grammar principles as the child encounters linguistic data.[24] The minimalist program, initiated by Chomsky,[25] asks which minimal principles and parameters theory fits the facts of language most elegantly, naturally, and simply.[24] In an attempt to simplify language into a system that relates meaning and sound using the minimum possible faculties, Chomsky dispenses with concepts such as "deep structure" and "surface structure" and instead emphasizes the plasticity of the brain's neural circuits, with which come an infinite number of concepts, or "logical forms".[5] When exposed to linguistic data, a hearer-speaker's brain associates sound and meaning, and the rules of grammar we observe are in fact only the consequences, or side effects, of the way language works. Thus, while much of Chomsky's prior research focused on the rules of language, he now focuses on the mechanisms the brain uses to generate these rules and regulate speech.[5][26]
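
As a rough illustration of the principles-and-parameters idea, one can imagine a learner fixing a single open switch from the word order it observes. The Python sketch below is a toy model invented for this illustration: the "head-direction" parameter, the sample data, and the function name are assumptions made here, not a description of how Chomsky or acquisition researchers actually model parameter setting.

  # Toy model: fix a single invented "head-direction" parameter from observed
  # clauses, each given as the positions of the verb and its object.
  def set_head_direction(observed_clauses):
      """Return 'head-initial' or 'head-final' by majority vote over the data."""
      votes = {"head-initial": 0, "head-final": 0}
      for verb_pos, object_pos in observed_clauses:
          if verb_pos < object_pos:
              votes["head-initial"] += 1    # verb precedes object, as in English
          else:
              votes["head-final"] += 1      # object precedes verb, as in Japanese
      return max(votes, key=votes.get)

  english_like  = [(1, 2), (0, 2), (1, 3)]   # invented positional data
  japanese_like = [(2, 1), (3, 0), (2, 0)]
  print(set_head_direction(english_like))    # head-initial
  print(set_head_direction(japanese_like))   # head-final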

References

  1. Baughman et al. 2006.
  2. Lyons 1978, p. 4; McGilvray 2014, pp. 2–3.
  3. Lyons 1978, p. 7.
  4. Lyons 1978, p. 6; McGilvray 2014, pp. 2–3.
  5. Brain From Top To Bottom.
  6. McGilvray 2014, p. 11.
  7. Dovey 2015.
  8. Chomsky.
  9. Thornbury 2006, p. 234.
  10. O'Grady 2015.
  11. Christiansen & Chater 2010, p. 489; Ruiter & Levinson 2010, p. 518.
  12. Evans & Levinson 2009, p. 429; Tomasello 2009, p. 470.
  13. Tomasello 2003, p. 284.
  14. Tomasello 1995, p. 131.
  15. Fernald & Marchman 2006, pp. 1027–1071.
  16. de Bot 2015, pp. 57–61.
  17. Pullum & Scholz 2002, pp. 9–50.
  18. Harlow 2010, p. 752.
  19. Harlow 2010, pp. 752–753.
  20. Harlow 2010, p. 753.
  21. Butterfield, Ngondi & Kerr 2016.
  22. Knuth 2002.
  23. Davis, Weyuker & Sigal 1994, p. 327.
  24. Hornstein 2003.
  25. Szabó 2010.
  26. Fox 1998.