CHAPTER 1. SEMANTICS IN LINGUISTICS
Semantics is the study of meaning communicated through language.
One of the insights of modern linguistics is that speakers of a language have different
types of linguistic knowledge, including how to pronounce words, how to construct
sentences, and what individual words and sentences mean. To reflect this,
linguistic description has different levels of analysis. So phonology is the study of what
sounds a language has and how these sounds combine to form words; syntax is the study
of how words can be combined into sentences; and semantics is the study of the meanings
of words and sentences.
We can take entailment to be a relationship between sentences such that if a sentence A
entails a sentence B, then if we know A we automatically know B; alternatively, it
should be impossible to assert A and at the same time deny B.
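One way to make this definition concrete is to model the meaning of a sentence as the set of situations (possible worlds) in which it is true; entailment then reduces to set inclusion. The sketch below assumes that treatment, and the sentences and situations are invented purely for illustration:

```python
# Entailment modelled set-theoretically: the meaning of a sentence is taken to be
# the set of situations in which it is true. A entails B when every situation
# that makes A true also makes B true, so asserting A while denying B is impossible.

def entails(a_situations: set, b_situations: set) -> bool:
    """A entails B iff the situations where A is true are a subset of those where B is true."""
    return a_situations <= b_situations

# Hypothetical situations s1..s3:
# A = "The cat is on the mat", B = "There is a cat".
cat_on_mat = {"s1", "s2"}          # situations where A is true
cat_exists = {"s1", "s2", "s3"}    # situations where B is true

print(entails(cat_on_mat, cat_exists))  # True: knowing A, we automatically know B
print(entails(cat_exists, cat_on_mat))  # False: B does not entail A
```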
The process of creating and interpreting symbols, sometimes called signification, is far
wider than language. Scholars like Ferdinand de Saussure (1974) have stressed that the
study of linguistic meaning is a part of this general study of the use of sign systems, and
this general study is called semiotics. Semioticians investigate the types of relationship
that may hold between a sign and the object it represents, or in Saussure’s terminology
between a signifier and its signified. One basic distinction, due to C. S. Peirce, is between
icon, index, and symbol. An icon is where there is a similarity between a sign and what
it represents. An index is where the sign is closely associated with its signified, often in
a causal relationship; thus smoke is an index of fire. Finally, a symbol is where there is
only a conventional link between the sign and its signified.
Definitions theory. As soon as we begin our task of attaching definitions to words, we
will be faced with a number of challenges. Three in particular prove very tricky for our
theory: circularity; the distinction between linguistic knowledge (about the meaning of
words) and encyclopedic knowledge (about the way the world is); and what particular
utterances mean in context.
To cope with the problem of circularity, one solution is to design a semantic
metalanguage with which to describe the semantic units and rules of all languages. It’s
a tool of description.
The knowledge a speaker has of the meaning of words is often compared to a mental
lexicon or dictionary.
In tackling the third problem, of context, one traditional solution has been to assume a
split in an expression’s meaning between the local contextual effects and a context-free
element of meaning, which we might call conventional or literal meaning.
One central issue is the relationship between word meaning and sentence meaning.
Knowing a language, especially one’s native language, involves knowing thousands of
words. Some linguists call the mental store of these words a lexicon, making an overt
parallel with the lists of words and meanings published as dictionaries. In this view, the
mental lexicon is a large but finite body of knowledge, part of which must be semantic.
This lexicon is not completely static because we are continually learning and forgetting
words. It is clear though that at any one time we hold a large amount of semantic
knowledge in our memory.
Phrases and sentences also have meaning of course, but an important difference between
word meaning on the one hand, and phrase and sentence meaning on the other, concerns
productivity. It is always possible to create new words, but this is a relatively infrequent
occurrence. On the other hand, speakers regularly create sentences that they have never
used or heard before, confident that their audience will understand them. Noam Chomsky
in particular has commented on the creativity of sentence formation (e.g. Chomsky 1965:
7–9). It is one of generative grammar’s most important insights that a relatively small
number of combinatory rules may allow speakers to use a finite set of words to create a
very large, perhaps infinite, number of sentences. To allow this, the rules for sentence
formation must be recursive, allowing repetitive embedding or coordination of syntactic
categories.
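As a rough illustration of this point, the sketch below uses a tiny invented lexicon and a single recursive embedding rule ("X thinks that S") to generate ever longer sentences; the lexicon and the rule are assumptions made up for the example, not a claim about any real grammar:

```python
import random

NAMES = ["Kim", "Lee", "Pat"]      # a finite (toy) lexicon
VERBS = ["laughs", "sleeps"]

def sentence(depth: int) -> str:
    """Generate a sentence; a positive depth triggers recursive embedding."""
    if depth <= 0:
        return f"{random.choice(NAMES)} {random.choice(VERBS)}"
    # Recursive step: embed a whole sentence under "thinks that".
    return f"{random.choice(NAMES)} thinks that {sentence(depth - 1)}"

for d in range(3):
    print(sentence(d))
# e.g. "Pat laughs"
#      "Kim thinks that Lee sleeps"
#      "Lee thinks that Kim thinks that Pat laughs"
```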
Semanticists often describe this by saying that sentence meaning is compositional. This
term means that the meaning of an expression is determined by the meaning of its
component parts and the way in which they are combined.
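A toy model can make this concrete. In the sketch below, word meanings are sets of individuals and the rule for combining an adjective with a noun is set intersection, so the meaning of the phrase is determined entirely by its parts and the combination rule; the mini-lexicon is invented for illustration:

```python
# Compositionality sketch: word meanings are sets of individuals,
# and [Adjective Noun] is interpreted as the intersection of the two sets.

lexicon = {
    "dog":   {"fido", "rex", "bella"},
    "cat":   {"tom", "felix"},
    "brown": {"fido", "tom"},
}

def adj_noun(adjective: str, noun: str) -> set:
    """Meaning of [Adj Noun] = meaning of Adj intersected with meaning of Noun."""
    return lexicon[adjective] & lexicon[noun]

print(adj_noun("brown", "dog"))  # {'fido'}: determined by the parts and the rule
```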
An utterance is created by speaking (or writing) a piece of language. If I say Ontogeny
recapitulates phylogeny, this is one utterance. If another person in the same room also
says Ontogeny recapitulates phylogeny, then we would be dealing with two utterances.
Sentences, on the other hand, are abstract grammatical elements obtained from
utterances. Sentences are abstract because if a third and fourth person in the room also
say Ontogeny recapitulates phylogeny with the same intonation, we will want to say that
we have met four utterances of the same sentence.
One further step of abstraction is possible for special purposes: to identify propositions.
In trying to establish rules of valid deduction, logicians discovered that certain elements
of grammatical information in sentences were irrelevant; for example, the difference
between active and passive sentences.
To sum up: utterances are real pieces of speech. By filtering out certain types of
(especially phonetic) information we can get to abstract grammatical elements,
sentences. By going on to filter out certain types of grammatical information, we can get
to propositions, which are descriptions of states of affairs and which some writers see as
a basic element of sentence meaning.
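These three levels can be pictured as successive normalization steps. In the toy code below, an utterance records who produced it, a sentence keeps only the wording, and a proposition abstracts away the active/passive difference; the example sentences and the hard-coded normalization are assumptions made for illustration, not a real parser:

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str   # who produced this concrete piece of speech or writing
    text: str

def to_sentence(utterance: Utterance) -> str:
    """Filter out who spoke: keep only the abstract grammatical element."""
    return utterance.text

def to_proposition(sentence: str) -> tuple:
    """Filter out grammatical information such as active vs. passive."""
    if sentence in ("Caesar invaded Gaul", "Gaul was invaded by Caesar"):
        return ("INVADE", "Caesar", "Gaul")   # same state of affairs described
    raise ValueError("sentence not covered by this toy sketch")

u1 = Utterance("speaker A", "Caesar invaded Gaul")
u2 = Utterance("speaker B", "Gaul was invaded by Caesar")
# Two utterances, two different sentences, but one and the same proposition:
print(to_proposition(to_sentence(u1)) == to_proposition(to_sentence(u2)))  # True
```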
Non-literal uses of language are traditionally called figurative and are described by a host
of rhetorical terms including metaphor, irony, metonymy, synecdoche, hyperbole, and
litotes. According to what we can call the literal language theory, metaphors and other
non-literal uses of language require a different processing strategy than literal language does.
One way of talking about this is to distinguish between sentence meaning and speaker
meaning. This suggests that words and sentences have a meaning independently of any
particular use, which meaning is then incorporated by a speaker into the particular
meaning she wants to convey at any one time. In this view semantics is concerned with
sentence meaning and pragmatics with speaker meaning.
CHAPTER 2. MEANING, THOUGHT AND REALITY
All languages allow speakers to describe, or as we might say model, aspects of what they
perceive. In semantics this action of picking out or identifying with words is often called
referring or denoting.
Firstly, there are linguistic expressions which can never be used to refer, for example the
words so, very, maybe, if, not, all. The second use of the distinction referring/non-referring
concerns potentially referring elements like nouns: it distinguishes between
instances when speakers use them to refer and instances when they do not.
Expressions like the Pacific Ocean are sometimes described as having constant
reference, while expressions like I, you, she, etc. are said to have variable reference.
We use the term referent of an expression for the thing picked out by uttering the
expression in a particular context; so the referent of the capital of Nigeria would be, since
1991, the city of Abuja. Similarly, the referent of a toad in I’ve just stepped on a toad
would be the unfortunate animal on the bottom of my shoe.
The term extension of an expression is the set of things which could possibly be the
referent of that expression. So the extension of the word toad is the set of all toads. As
mentioned earlier, in the terminology of Lyons (1977), the relationship between an
expression and its extension is called denotation.
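The contrast between extension and referent can be sketched with sets: the extension collects everything the expression could pick out, while the referent is whichever member of that set a particular context of utterance fixes. The entities and the context below are invented for illustration:

```python
# Extension: the set of things that could possibly be the referent of "toad".
extension = {"toad": {"toad_1", "toad_2", "toad_3"}}

def referent(expression: str, context: dict) -> str:
    """The referent is the member of the extension picked out in this context."""
    candidate = context[expression]
    assert candidate in extension[expression], "a referent must lie in the extension"
    return candidate

# In the context of uttering "I've just stepped on a toad", one particular toad is picked out.
context_of_utterance = {"toad": "toad_2"}
print(referent("toad", context_of_utterance))  # toad_2
```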
If we adopt the hypothesis that the meaning of, say, a noun is a combination of its
denotation and a conceptual element, then from the point of view of a linguist, two basic
questions about the conceptual element are:
1. What form can we assign to concepts?
2. How do children acquire them, along with their linguistic labels?
We can look at some answers to these questions. In our discussion we will concentrate
on concepts that correspond to a single word, that is, they are lexicalized. Of course not
all concepts are like this: some concepts are described by phrases rather than single
words, as in 2.37 below:
2.37 On the shopping channel, I saw a tool for compacting dead leaves into garden
statuary.
We can speculate that the reason why some concepts are lexicalized and others are not is
utility: if we refer to something often enough, it will become lexicalized. Possibly somebody
once said something like 2.38 below:
2.38 We're designing a device for cooking food by microwaves.
Once the concept became familiar enough, the cumbersome description was shortened and
lexicalized as microwave oven, and later simply microwave.
When we talk of children acquiring concepts we have to recognize that their concepts
may differ from the concepts of adults. Work in developmental psychology has shown
that children may operate with concepts that are quite different: students of child language
describe children both underextending concepts, as when for a child dog can only be used
for their pet, not the one next door; and overextending concepts, where a child uses daddy
for every male adult, or cat for cats, rabbits, and other pets.
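Underextension and overextension can be pictured as mismatches between the child's and the adult's extensions for a word, as in the sketch below; the example sets are invented:

```python
# Underextension: the child's use covers a proper subset of the adult extension.
adult_dog = {"our_pet_dog", "next_doors_dog", "stray_dog"}
child_dog = {"our_pet_dog"}

# Overextension: the child's use covers a proper superset of the adult extension.
adult_daddy = {"childs_father"}
child_daddy = {"childs_father", "postman", "uncle"}

print(child_dog < adult_dog)      # True: dog applied to fewer things than adults allow
print(child_daddy > adult_daddy)  # True: daddy applied to more things than adults allow
```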
One traditional approach to describing concepts is to define them by using sets of
necessary and sufficient conditions. Because of problems with necessary and sufficient
conditions, or definitions, several more sophisticated theories of concepts have been
proposed. One influential proposal is due to Eleanor Rosch and her co-workers (e.g.
Rosch 1973, 1975, Rosch and Mervis 1975, Rosch et al. 1976, Mervis and Rosch 1981)
who have suggested the notion of prototypes. This is a model of concepts which views
them as structured so that there are central or typical members of a category, such as
BIRD or FURNITURE, but then a shading off into less typical or peripheral members.
So chair is a more central member of the category FURNITURE than lamp, for example,
and sparrow is a more typical member of the category BIRD than penguin.
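The idea of graded, centre-to-periphery category structure can be sketched with typicality scores; the numbers below are made up purely for illustration and are not Rosch's experimental ratings:

```python
# Prototype sketch: category membership is graded rather than all-or-nothing.
typicality = {
    "BIRD":      {"sparrow": 0.95, "robin": 0.92, "penguin": 0.35, "ostrich": 0.30},
    "FURNITURE": {"chair": 0.96, "table": 0.90, "lamp": 0.45, "rug": 0.40},
}

def more_central(category: str, a: str, b: str) -> str:
    """Return whichever of a or b is the more typical member of the category."""
    scores = typicality[category]
    return a if scores[a] >= scores[b] else b

print(more_central("FURNITURE", "chair", "lamp"))  # chair
print(more_central("BIRD", "sparrow", "penguin"))  # sparrow
```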
CHAPTER 3. WORD MEANING
Word meaning, or lexical semantics. The traditional descriptive aims of lexical semantics
have been: (a) to represent the meaning of each word in the language; and (b) to show
how the meanings of words in a language are interrelated.
Different categories of words must be given different semantic descriptions. To take a
few examples: names, common nouns, pronouns, and what we might call logical words.
a. names e.g. Fred Flintstone
b. common nouns e.g. dog, banana, tarantula
c. pronouns e.g. I, you, we, them
d. logical words e.g. not, and, or, all, any
Looking at these types of words, we can say that they operate in different ways: some
types may be used to refer (e.g. names), others may not (e.g. logical words); some can
only be interpreted in particular contexts (e.g. pronouns), others are very consistent in
meaning across a whole range of contexts (e.g. logical words); and so on. It seems too
that semantic links will tend to hold between members of the same group rather than
across groups. Thus semantic relations between common nouns like man, woman,
animal, and so on, are clearer than those between any noun and words like and, or, not.
We will follow general linguistic tradition and assume that we must have a list of all the
words in a language, together with idiosyncratic information about them, and call this
body of information a dictionary or lexicon. Our interest in semantics is with lexemes
or semantic words. But first we should examine this unit, the word. Words can be identified
at the level of writing, where we are familiar with them being separated by white space,
in which case we can call them orthographic words. They can also be identified at the levels of
phonology, where they are strings of sounds that may show internal structuring which
does not occur outside the word, and syntax, where the same semantic word can be
represented by several grammatically distinct variants. Thus walks, walking, walked in
3.6 below are three different grammatical words:
a. He walks like a duck.
b. He’s walking like a duck.
c. He walked like a duck.
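A minimal sketch of the lexeme/grammatical-word distinction is to map each word form onto the lexeme it realizes, so that all three sentences in 3.6 contain the same semantic word WALK. The small form-to-lexeme table below is an assumption made for the example; real lemmatization would require morphological analysis:

```python
# Grammatically distinct variants grouped under one lexeme (semantic word).
FORM_TO_LEXEME = {
    "walks": "WALK", "walking": "WALK", "walked": "WALK",
    "he": "HE", "he's": "HE",   # ignoring the contracted auxiliary for simplicity
    "like": "LIKE", "a": "A", "duck": "DUCK",
}

def lexemes(sentence: str) -> list:
    """Map each orthographic word to its lexeme, where the toy table knows it."""
    words = sentence.lower().rstrip(".").split()
    return [FORM_TO_LEXEME.get(word, word.upper()) for word in words]

for s in ["He walks like a duck.", "He's walking like a duck.", "He walked like a duck."]:
    print(lexemes(s))
# All three sentences contain the lexeme WALK, though the grammatical words differ.
```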