
Wikipedia
Generative grammar

Generative grammar is a linguistic theory that considers grammar to be a system of rules that is intended to generate exactly those combinations of words which form grammatical sentences in a given language. The term was originally used in relation to the theories of grammar developed by Noam Chomsky, beginning in the late 1950s. Linguists who follow the generative approach, originated by Chomsky, have been called generativists. The generative school has focused on the study of syntax, but has also addressed other aspects of a language's structure, including morphology and phonology.
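To make the "system of rules" idea concrete, here is a minimal sketch (not from the article) of a generative grammar as rewrite rules: the rule set and vocabulary are invented for illustration, and the program enumerates exactly the word strings the rules can derive.

```python
import itertools

# A toy generative grammar: rewrite rules that derive exactly the
# "grammatical" strings of a tiny, invented fragment of English.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["linguist"], ["sentence"]],
    "V":   [["generates"], ["analyzes"]],
}

def expand(symbol):
    """Yield every terminal word string derivable from `symbol`."""
    if symbol not in RULES:            # a terminal word: yield it as-is
        yield [symbol]
        return
    for rhs in RULES[symbol]:          # try each rewrite rule for this symbol
        # expand every symbol on the right-hand side and combine the results
        for parts in itertools.product(*(expand(s) for s in rhs)):
            yield [word for part in parts for word in part]

if __name__ == "__main__":
    for words in expand("S"):
        print(" ".join(words))         # e.g. "the linguist generates a sentence"
```

The article's point about deriving an unbounded number of sentences from finite rules would require a recursive rule (for example, letting a noun phrase contain a clause); the toy grammar above is deliberately non-recursive so that the enumeration terminates.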

Early versions of Chomsky's theory were called transformational grammar, and this is still used as a general term that includes his subsequent theories. The most recent of these is the Minimalist Program, within which Chomsky and other generativists have argued that many of the properties of a generative grammar arise from a universal grammar which is innate to the human brain, rather than being learned from the environment (see the poverty of the stimulus argument).

There are a number of competing versions of generative grammar currently practiced within linguistics. A contrasting approach is that of constraint-based grammars. Where a generative grammar attempts to list all the rules that result in all well-formed sentences, constraint-based grammars allow anything that is not otherwise constrained. Constraint-based grammars that have been proposed include certain versions of dependency grammar, head-driven phrase structure grammar, lexical functional grammar, categorial grammar, relational grammar, link grammar, and tree-adjoining grammar. In stochastic grammar, grammatical correctness is taken as a probabilistic variable, rather than a discrete (yes vs. no) property.
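As a rough illustration of the stochastic idea, here is a sketch of a toy probabilistic grammar (an assumption for illustration, not any particular framework's implementation): each rewrite rule carries a probability, so a derivation receives a score rather than a simple yes/no judgement.

```python
import random

# A toy stochastic grammar: each rewrite rule carries a probability, so a
# derived sentence gets a graded score instead of a discrete grammaticality
# judgement. Rule probabilities and vocabulary are illustrative only.
PCFG = {
    "S":   [(["NP", "VP"], 1.0)],
    "NP":  [(["Det", "N"], 0.7), (["N"], 0.3)],
    "VP":  [(["V", "NP"], 0.6), (["V"], 0.4)],
    "Det": [(["the"], 0.6), (["a"], 0.4)],
    "N":   [(["linguist"], 0.5), (["sentence"], 0.5)],
    "V":   [(["generates"], 0.5), (["analyzes"], 0.5)],
}

def sample(symbol):
    """Randomly derive a word string from `symbol`; return (words, probability)."""
    if symbol not in PCFG:                           # terminal word
        return [symbol], 1.0
    rules, weights = zip(*PCFG[symbol])              # candidate rewrites and their weights
    rhs = random.choices(rules, weights=weights)[0]  # pick one rewrite at random
    words, prob = [], weights[rules.index(rhs)]
    for s in rhs:                                    # expand each symbol on the right-hand side
        w, p = sample(s)
        words += w
        prob *= p
    return words, prob

if __name__ == "__main__":
    words, prob = sample("S")
    print(" ".join(words), f"(derivation probability {prob:.4f})")
```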

Usage examples of "generative grammar".

When the Bene Gesserit analyzed syntax, they used a system called generative grammar, a system already ancient when the empire was founded. It was adapted by the B.G. as much for its philosophy as for its usefulness in explaining linguistic phenomena. Its basic tenet was that the most ordinary speaker of any human language was a storehouse of creativity, capable of deriving an infinite number of unique sentences from a limited number of words and grammatical structures. These words and structures were combined and changed according to a finite number of rules, some of which were called transformations. The …