Transformational model of sentence analysis. Types of transformation




Structural models of sentence analysis. Distributional model and types of distribution. IC-model.

The methods of structural linguistics are based on the notions of position, co-occurrence and substitution (substitutability).

The total set of environments of a certain element is its distribution; the term denotes the occurrence of an element relative to other elements. Elements may stand in one of three types of distribution (see the sketch after the list):

1) non-contrastive distribution (the same position, no difference in meaning; variants of the same element): hoofs - hooves;

2) contrastive distribution (the same position, different meanings): She is charming. She is charmed.

3) complementary distribution (mutual exclusiveness of pairs of forms in a certain environment; the same meaning, different positions; variants of the same element): cows - oxen.
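To make the three-way distinction concrete, here is a minimal Python sketch; the function name and its boolean inputs are invented purely for illustration. The classification rests only on whether two forms occur in the same position and whether they differ in meaning.

# A toy classifier for the three types of distribution described above.
def distribution_type(same_position: bool, same_meaning: bool) -> str:
    if same_position and same_meaning:
        return "non-contrastive"   # hoofs / hooves: free variants
    if same_position and not same_meaning:
        return "contrastive"       # charming / charmed: different meanings
    if same_meaning:
        return "complementary"     # plural -s / -en: cow-s vs. ox-en
    return "no shared environment"

print(distribution_type(True, True))    # non-contrastive
print(distribution_type(True, False))   # contrastive
print(distribution_type(False, True))   # complementary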

The distributional model (Ch. Fries) shows the linear order of sentence constituents: the syntactic structure of the sentence is presented as a sequence of positional classes of words. Showing only the linear order of classes of words, the model does not show the syntactic relations of sentence constituents, nor does it reveal sentence ambiguity.
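A minimal sketch of this positional view, assuming a simplified Fries-style tag set (Class 1 noun-like, Class 2 verb-like, Class 3 adjective-like, Class 4 adverb-like, F for function words); the tiny lexicon is invented. The last example shows why a flat sequence of classes cannot represent ambiguity.

# Each word is replaced by its positional class; the result is a flat
# sequence of class labels with no grouping and no syntactic relations.
LEXICON = {
    "the": "F", "dog": "Class 1", "barked": "Class 2", "loudly": "Class 4",
    "old": "Class 3", "men": "Class 1", "and": "F", "women": "Class 1",
}

def positional_classes(sentence: str) -> list[str]:
    return [LEXICON[word] for word in sentence.lower().split()]

print(positional_classes("The dog barked loudly"))
# ['F', 'Class 1', 'Class 2', 'Class 4']

# Both readings of "old men and women" ("old [men and women]" vs.
# "[old men] and women") collapse into one and the same flat sequence:
print(positional_classes("old men and women"))
# ['Class 3', 'Class 1', 'F', 'Class 1']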

This drawback is overcome by the IC-model. A sentence is a structured string of words grouped into phrases, so sentence constituents are words and word-groups. The basic principle for grouping words into phrases (endo- or exocentric) is cohesion, i.e. the possibility of substituting one word for the whole group without destroying the sentence structure. The sentence is built of 2 immediate constituents, NP + VP, each of which may have constituents of its own. Constituents which cannot be further divided are called ultimate constituents (UCs). The IC-model exists in 2 main versions: the analytical model and the derivation tree. The analytical model divides the sentence into ICs and UCs; the derivation tree shows the syntactic dependence of sentence constituents.

So the IC-model shows both the syntactic relations and the linear order of elements.
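A sketch of the derivation-tree version of the IC-model, with the tree written as nested (label, constituents …) tuples; the sentence and the labels are chosen only for illustration. Dividing each node into its immediate constituents and flattening the result yields the ultimate constituents.

# S splits into its two ICs (NP + VP); each IC splits further until the
# ultimate constituents (single words) are reached.
S = ("S",
     ("NP", ("Det", "the"), ("N", "student")),
     ("VP", ("V", "reads"), ("NP", ("Det", "a"), ("N", "book"))))

def ultimate_constituents(node):
    """Collect the leaves of the tree, i.e. the ultimate constituents (UCs)."""
    if isinstance(node, str):
        return [node]
    label, *children = node
    leaves = []
    for child in children:
        leaves.extend(ultimate_constituents(child))
    return leaves

print(ultimate_constituents(S))
# ['the', 'student', 'reads', 'a', 'book']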


Different sentence types are structurally and semantically related, so the syntactic structure of a given sentence may be described by making these relations explicit. Sentences in which all constituents are obligatory are called basic structures (= elementary sentences = kernel sentences). Linguists single out from 2 to 7 kernel sentence patterns: 1) NV; 2) NVN; 3) NVPrepN; 4) N is N; 5) N is A; 6) N is Adv; 7) N is PrepN. The structure of all other sentences is the result of certain transformations of kernel structures. This kind of analysis, which shows the derivational relations of sentences, is called transformational (N. Chomsky). The transformational model (TM) is based on the IC-model and goes further, showing the semantic and syntactic relations of different sentence types. TM describes the paradigmatic relations of basic and derived structures, i.e. the relations of syntactic derivation. Kernel sentences, which serve as the base for deriving other structures, are called deep (= underlying) structures, as opposed to the surface structures of derived sentence types (= transforms). So both the deep and the surface structure belong to the syntactic level of analysis.

Transformations may be subdivided into single-base (= intramodel, operating on one kernel structure) and two-base (combining 2 structures).

Single-base transformations:

1) modifying the kernel structure: She is working hard. → She is not working hard.

2) changing the kernel structure: She is working hard. → Her working hard. → Her hard work.

Some basic types of transformations (see the sketch after this list):

1) substitution, deletion: Have you seen him? → Seen him?;

2) permutation or movement: He is here. → Is he here?;

3) nominalization: He arrived → His arrival;

4) two-base transformations:

- embedding: know that he has come;

- word-sharing: saw him cross the street.
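A toy sketch of some of the transformations listed above, with the kernel structure kept as a plain list of tokens; the helper names and the one-entry nominalization lexicon are invented for illustration.

KERNEL = ["she", "is", "working", "hard"]      # kernel: N + V (+ Adv)
NOMINALS = {"arrived": "arrival"}              # toy nominalization lexicon

def negate(kernel):
    """Modifying the kernel structure: insert 'not' after the auxiliary."""
    return kernel[:2] + ["not"] + kernel[2:]

def yes_no_question(kernel):
    """Permutation (movement): swap the subject and the auxiliary."""
    return [kernel[1], kernel[0]] + kernel[2:]

def nominalize(subject, verb):
    """Nominalization: 'he arrived' -> 'his arrival' (possessives hard-coded)."""
    possessive = {"he": "his", "she": "her"}[subject]
    return [possessive, NOMINALS[verb]]

print(" ".join(negate(KERNEL)))               # she is not working hard
print(" ".join(yes_no_question(KERNEL)))      # is she working hard
print(" ".join(nominalize("he", "arrived")))  # his arrival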

TM shows that sentences with different surface structures are paraphrases of one another because they are derived from the same deep structure: He arrived → his arrival → for him to arrive → his arriving.

TM also shows that some sentences are ambiguous because they derive from distinct deep structures: Flying planes can be dangerous. → 1. Planes are dangerous. 2. Flying is dangerous. So TM is an effective method of resolving grammatical ambiguity.
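A sketch of how the two readings are traced back to two distinct deep structures that flatten into one and the same surface string; the bracketing and the labels are illustrative.

# Reading 1: "planes that fly are dangerous" - 'flying' modifies 'planes'.
reading_1 = ("S", ("NP", ("A", "flying"), ("N", "planes")),
                  ("VP", "can", "be", "dangerous"))

# Reading 2: "to fly planes is dangerous" - 'flying' is a verb, 'planes' its object.
reading_2 = ("S", ("NP", ("V", "flying"), ("NP", ("N", "planes"))),
                  ("VP", "can", "be", "dangerous"))

def surface(node):
    """Flatten a deep structure into its surface word string."""
    label, *children = node
    return " ".join(child if isinstance(child, str) else surface(child)
                    for child in children)

print(surface(reading_1))                        # flying planes can be dangerous
print(surface(reading_2))                        # flying planes can be dangerous
print(surface(reading_1) == surface(reading_2))  # True: one surface, two deep structures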

A grammar which operates using TM is a transformational grammar (TG). In TG the IC-analysis is supplemented with rules for transforming one sentence into another. TG became an extremely influential type of generative grammatical theory, also called generative grammar.


25. Semantic structure of the sentence (Ch. Fillmore).

Generative semantics. Case Grammar.

In Case Grammar the deep (underlying) structure is semantic and the surface structure is syntactic. The deep structure has 2 main constituents:

1) modality (features of mood, tense, aspect, negation, relating to the sentence as a whole);

2) proposition (a tenseless set of relationships): “S → M + Pr”.

The proposition is constituted by the semantic predicate (the central element) and some nominal elements called arguments, or participants: “P → V + N1 + N2 + N3 …”. The proposition is a reflection of situations and events of the outside world. The semantic predicate determines the number of arguments, i.e. opens up places for arguments. Accordingly we may distinguish

- one-place predicates (She sang),

- two-place predicates (She broke the dish) and so on.

Arguments stand in different semantic relations to the predicate. These relations are called semantic roles, or deep cases (P → V + C1 + C2 + C3 …). The choice of semantic roles depends on the nature of the predicate.

W. Chafe divides predicates into

1) states

2) non-states (events):

- actions

- processes:

1. The wood is dry. - state

2. She sang. (What did she do?) - action

3. The wood dried. (What happened?) - process.

Semantic roles (deep cases) are judgements about the events.

The most general roles are the agent (the doer of the action) and the patient (the entity affected by the action or state). Actions are accompanied by agents; states and processes, by patients; predicates denoting both an action and a process take both an agent and a patient: She broke the dish.
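A sketch of the S → M + Pr view in code: modality features attach to the sentence as a whole, while the proposition pairs a semantic predicate with deep-case-labelled arguments. The dataclass names and the particular role labels are illustrative.

from dataclasses import dataclass, field

@dataclass
class Modality:                     # features relating to the sentence as a whole
    tense: str = "past"
    mood: str = "indicative"
    negation: bool = False

@dataclass
class Proposition:                  # the tenseless set of relationships
    predicate: str                  # the central element; opens argument places
    arguments: dict[str, str] = field(default_factory=dict)  # deep case -> filler

@dataclass
class Sentence:                     # S -> M + Pr
    modality: Modality
    proposition: Proposition

# one-place predicate (action): "She sang."
sang = Sentence(Modality(), Proposition("sing", {"Agent": "she"}))

# two-place predicate (action + process): "She broke the dish."
broke = Sentence(Modality(), Proposition("break", {"Agent": "she", "Patient": "the dish"}))

print(len(sang.proposition.arguments), len(broke.proposition.arguments))  # 1 2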

The original set of deep cases (Ch. Fillmore) includes 6 cases: agentive, objective, beneficiary, instrumental, locative, factitive. E.g.: 1. He dug the ground. (Objective). 2. He dug a hole. (Factitive). Sentences (1) and (2) have the same surface structure but different deep structures.

On the other hand, different surface structures may correspond to the same deep structure, as the sketch after the following examples illustrates:

1. John opened the door with the key.

2. The door was opened by John.

3. John used the key to open the door.

4. The key opened the door.
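A sketch of the idea that one deep structure underlies all four sentences: a single proposition (predicate open; Agent John, Object the door, Instrument the key) is rendered by a few toy realization rules. The templates are invented and heavily simplified.

DEEP = {"predicate": "open",
        "Agent": "John", "Object": "the door", "Instrument": "the key"}

def active(d):                      # sentence 1
    return f"{d['Agent']} opened {d['Object']} with {d['Instrument']}."

def passive(d):                     # sentence 2
    return f"{d['Object'].capitalize()} was opened by {d['Agent']}."

def instrument_as_subject(d):       # sentence 4
    return f"{d['Instrument'].capitalize()} opened {d['Object']}."

for realize in (active, passive, instrument_as_subject):
    print(realize(DEEP))
# John opened the door with the key.
# The door was opened by John.
# The key opened the door.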




