7  Meaning and function

7.1 Introduction

The Saussurean sign is composed of a form and a meaning, implying that linguistics should concern itself with both. However, much of linguistics prior to the 1970s focused on the structure of language and avoided the study of meaning, either because meaning was held to be unstructured and not amenable to methodical study (e.g., Chomsky, 1957), or because it was not considered central to the object of linguistic study (e.g., Chomsky, 2012). Instead, it was philosophers who first took an interest in meaning, truth, and language, and the linguistic turn around the end of the nineteenth century resulted in the formalisation of concepts related to meaning. Key philosophers in this field include Frege, who suggested that truth is fundamental to the meaning of a sentence via reference (Frege, 1879); Wittgenstein, who formalised truth-conditional semantics (the idea that the meaning of a sentence is exactly, or reducible to, its truth conditions; Wittgenstein, 1921); and Carnap, who clarified the distinction between extensions (the truth-value of a sentence or the referent of a name) and intensions (functions from indices, or context markers, to extensions; Carnap, 1942). Nonetheless, these philosophers’ works were largely couched in artificial languages, resulting in a lack of interest among linguists (Chomsky, 1955).

7.2 Formal semantics

A critical development, however, came from the work of Montague, which can arguably be construed as the origin of formal semantics proper. Building on the idea of a homomorphism (i.e., structure-preserving mapping) between syntax and semantics, Montague suggested that each syntactic rule (governing constituent combination) had an associated semantic rule, which specified how the meaning of the whole was formed from the meanings of the parts (Montague, 1973). He formalised this using typed intensional logic, in which categories of constituents are expressed in terms of combinations of entity expressions (i.e., expressions for an individual; denoted e) and truth-value expressions (i.e., declarative sentences; denoted t). For example, sentences are type t and nouns are type e, so intransitive verbs (which take a single noun as their sole argument) must be type <e,t>—objects that, when combined with an object of type e, return an object of type t. This system, later dubbed Montague grammar by Partee (1975), could explain a variety of semantic phenomena in English, including transitive verbs, verbs of propositional attitude, and modal expressions. This supported Montague’s thesis that natural and formal languages could be treated similarly and could undergo similar analyses.
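The type-driven composition described above can be illustrated with a toy computational sketch (the world, the one-word lexicon, and the `combine` function are illustrative assumptions, not Montague’s own formalism): an intransitive verb of type <e,t> is simply a function from entities to truth values, and the semantic rule paired with the syntactic rule combining subject and predicate is function application.

```python
from typing import Callable

# Toy model of Montague-style types: type e = entities (here, strings);
# type t = truth values (bool); an intransitive verb is type <e,t>,
# a function from entities to truth values.
Entity = str
TruthValue = bool

# Hypothetical model: the set of individuals who sleep in our tiny world.
sleepers = {"Ann"}

def sleeps(x: Entity) -> TruthValue:
    """Denotation of the intransitive verb 'sleeps': type <e,t>."""
    return x in sleepers

def combine(subject: Entity,
            predicate: Callable[[Entity], TruthValue]) -> TruthValue:
    """Semantic rule for [S -> NP VP]: apply the <e,t> predicate
    to the type-e subject, yielding a type-t truth value."""
    return predicate(subject)

print(combine("Ann", sleeps))  # True
print(combine("Bob", sleeps))  # False
```

The point of the sketch is that the meaning of the sentence is computed entirely from the meanings of its parts, mirroring the homomorphism between syntactic and semantic rules.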

Following Montague, a robust cooperation between philosophy and linguistics began, developing the field of formal semantics. This area of research includes further work on truth-conditional semantics: for example, Lewis demonstrated that possible-worlds semantics (in which the intension of a sentence is a function from possible worlds to truth-values) was a viable framework for understanding natural language (Lewis, 1970). Another overlapping area of interest is reference, which concerns the relationships between signs and their referents; common topics include proper names, indexicals, and definite descriptions. Several models of reference have thus been proposed, including the descriptivist view, which holds that reference occurs through the specific descriptive content associated with the sign, with proponents such as Russell (1905). In contrast, Kripke introduced a theory of how linguistic forms come to be associated with their referents—linked to an original act of naming via a causal chain (Kripke, 1980). These concepts provided a theoretical framework to ground abstract linguistic meaning in objects, and brought clarity to the study of meaning by introducing logic-based techniques and vocabulary.
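The possible-worlds idea can be given a rough computational gloss (the worlds and the sentence below are invented for exposition): an intension is modelled as a function from worlds to truth values, and the extension of the sentence at a world is simply the value of that function there.

```python
# Illustrative possible worlds, modelled as dicts of facts.
w1 = {"raining": True, "cold": False}
w2 = {"raining": False, "cold": True}

def it_is_raining(world):
    """Intension of 'It is raining': a function from worlds
    to truth values; its value at a world is the extension there."""
    return world["raining"]

print(it_is_raining(w1))  # True
print(it_is_raining(w2))  # False

def necessarily(intension, worlds):
    """A toy necessity operator: true iff the intension holds
    at every world under consideration."""
    return all(intension(w) for w in worlds)

print(necessarily(it_is_raining, [w1, w2]))  # False
```

The sketch also hints at why intensions matter: operators such as necessity cannot be defined on extensions (single truth values) alone, but are naturally defined on functions from worlds to truth values.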

Another field of investigation relates to the nature and structure of the conceptual space. This includes lexical semantics, which studies the meaning of lexical units, including their classification and decomposition. For example, Jackendoff proposed a componential approach to semantics, which decomposes the concept of a lexical entry into abstract conceptual primitives (e.g., wife > [+female] [+adult] [+human] [+married]; Jackendoff, 1976). Other topics include lexical relations (semantic relationships between different lexical units; Fellbaum, 2015) and semantic field theory (the organisation of concepts into domains; e.g., Lyons, 1977). Work in this area thus demonstrated that meaning can be conceptualised as structured and analysable, rather than amorphous and unintelligible.
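The componential approach can be sketched computationally (the feature sets below are illustrative, loosely following the decomposition of wife above, and the subset criterion is a deliberate simplification): if lexical entries are modelled as sets of features, lexical relations such as hyponymy fall out as simple set comparisons.

```python
# Toy componential lexicon: each entry is a set of conceptual primitives.
lexicon = {
    "human": {"+human"},
    "adult": {"+human", "+adult"},
    "woman": {"+human", "+adult", "+female"},
    "wife":  {"+human", "+adult", "+female", "+married"},
}

def is_hyponym(a: str, b: str) -> bool:
    """Toy criterion: a is a hyponym of b if a's feature set
    includes every feature of b."""
    return lexicon[b] <= lexicon[a]

print(is_hyponym("wife", "woman"))  # True: every wife is a woman
print(is_hyponym("woman", "wife"))  # False
```

The sketch makes concrete the claim in the paragraph above: once meanings are decomposed into structured components, semantic relations become analysable rather than amorphous.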

The development of a systematic, compositional, and robust semantics also affected its sister field of syntax. Part of the impetus for transformational grammar was the observation that certain correspondences between structures are essentially meaning-preserving (e.g., passivisation), suggesting that there may be some common “deep” or “base” form from which both are obtainable as variant surface forms. However, Partee has observed that “with a real semantics, we don’t need sameness at any syntactic level … to capture sameness of meaning”; more generally, she noted that “once we have a semantics that can do some real work, then syntax doesn’t have to try to solve problems that may be semantic in nature” (Partee, 2014). Such a “division of labour” has thus given rise to a number of other grammatical frameworks that do not posit transformations, including Lexical-Functional Grammar (Dalrymple et al., 2019; Kaplan & Bresnan, 1982) and Head-driven Phrase Structure Grammar (Müller et al., 2024; Pollard & Sag, 1987), in which semantics is handled by separate representations or formalisations (e.g., Glue semantics; Dalrymple et al., 1993).

7.3 Pragmatics and discourse analysis: Language in use

Despite the advances made by formal semanticists, some scholars remained unconvinced of the field’s applicability, given the (ostensible) inaccuracies in the mapping between the compositional semantics of a sentence and its understood ‘meaning’ (in the pre-theoretic sense) by language users. In this regard, Grice was particularly instrumental in bridging the gap between formalists (who advocated the use of formal language for its unambiguity and precision) and informalists (who were concerned about actual uses of natural language); he suggested instead that “the common assumption of the contestants that the divergences do in fact exist is (broadly speaking) a common mistake, and that the mistake arises from an inadequate attention to the nature and importance of the conditions governing conversation” (Grice, 1975). He thus introduced the notion of implicature, which refers to the act of implying something without it being either explicitly expressed or directly inferable from the content of the utterance. Implicature related to features of discourse (i.e., conversational implicature) arises from the tacit assumption that participants in a conversation are cooperative, and thus apparent deviations from cooperativity can be exploited to convey additional implicated meanings (in the absence of uncooperative behaviour such as lying, or explicit opting out of some aspect of cooperativity). A more recent extension of the cooperativity principle is the Rational Speech Act model (Frank & Goodman, 2012), which explicitly formalised the speaker’s mental simulation of their interlocutor’s interpretation of possible utterances, such that speakers choose the utterance that best communicates their intended meaning. This approach has allowed for more quantitative modelling of language use to account for a range of phenomena including metaphor (Kao et al., 2014) and politeness (Yoon et al., 2016).
This analysis of meaning in embodied utterances became a foundation of the field of pragmatics, the study of the communicative function of language.
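The recursive reasoning of the Rational Speech Act model can be sketched in a few lines (the reference-game scenario below, with three objects and four one-word utterances, is an illustrative toy in the spirit of Frank and Goodman’s framework, not their experimental setup): a literal listener conditions a uniform prior on literal truth, a speaker chooses utterances in proportion to their informativity for that listener, and a pragmatic listener inverts the speaker model.

```python
import math

objects = ["blue_square", "blue_circle", "green_square"]
utterances = ["blue", "green", "square", "circle"]

def literal(utterance, obj):
    """Truth-conditional lexicon: does the utterance literally apply?"""
    colour, shape = obj.split("_")
    return utterance in (colour, shape)

def normalise(d):
    total = sum(d.values())
    return {k: v / total for k, v in d.items()}

def literal_listener(utterance):
    """L0: a uniform prior over objects, conditioned on literal truth."""
    return normalise({o: 1.0 if literal(utterance, o) else 0.0
                      for o in objects})

def speaker(obj, alpha=1.0):
    """S1: chooses utterances softmax-optimally (rationality alpha)
    to make L0 pick out obj."""
    scores = {}
    for u in utterances:
        p = literal_listener(u)[obj]
        scores[u] = math.exp(alpha * math.log(p)) if p > 0 else 0.0
    return normalise(scores)

def pragmatic_listener(utterance):
    """L1: infers the intended object by reasoning about the speaker."""
    return normalise({o: speaker(o)[utterance] for o in objects})

# Hearing "blue", L1 favours the blue square: a speaker intending the
# blue circle would have preferred the unambiguous "circle", so "blue"
# implicates the square -- an implicature-like inference.
print(pragmatic_listener("blue"))
```

With these toy numbers, the pragmatic listener assigns probability 0.6 to the blue square and 0.4 to the blue circle on hearing “blue”, illustrating how quantitative predictions about implicature fall out of the recursive speaker–listener reasoning.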

The other foundational component of pragmatics is speech-act theory, expounded by Austin (1962) and Searle (1969). This theory suggests that performative utterances can be analysed at three levels: the locutionary act (the actual utterance and its content), the illocutionary act (the intended result of making an utterance, e.g., promising, threatening, requesting), and the perlocutionary act (the actual effect of making an utterance, e.g., someone else performing an action). The fact that such acts are not truth-evaluable means that they do not lie within the domain of truth-conditional semantics, but they are nonetheless central to the function of language as communication. Research continues into the conditions and interpretations of performative utterances in linguistics and philosophy (e.g., Condoravdi & Lauer, 2012; Green, 2021; Kamp, 1978), as well as in the realm of legal interpretation (e.g., Cao, 2007).

Another facet of such a functional approach is the acknowledgement that language use does not employ isolated sentences, but strings of sentences, non-sentence strings, discourse markers, and other elements that together form a conversation. This observation led to the subfield known as discourse analysis, which is concerned with the contexts, structures, and processes related to linguistic units larger than individual sentences. An important early scholar in this field was Hymes, who proposed the notion of communicative competence (Hymes, 1962), referring to the knowledge a speaker must possess to use language appropriately in social contexts (i.e., relating to conventions about language use). Communicative competence involves aspects of communication such as turn-taking, participant relationships, and discourse genres, all of which affect the form and nature of the linguistic interaction. Such factors also affect the meaning of component utterances within the discourse; Lewis thus suggested that conversation can be considered a sort of language game, such that “[s]entences depend for their truth value, or for their acceptability in other respects, on the components of conversational score at the stage of conversation when they are uttered” (Lewis, 1979).

As such, deriving the actual meaning of a situated utterance depends on pragmatic and discourse factors, demonstrating the need for analyses of such factors to understand actual language use. In fact, pragmatic and discourse functions also affect the syntactic structure of sentences via information structure (e.g., topic and focus; Roberts, 2012). These phenomena highlight the fact that there may not be a hard distinction between semantics proper and either pragmatics or discourse (e.g., Bach, 2003), and indeed some extreme functionalists hold that all meaning is context (e.g., Hopper, 1987). More generally, these fields focus on the communicative function of language, and thereby fall under the umbrella of functional linguistics, in contrast with structure-emphasising formal linguistics (Van Valin, 2017).

7.4 Cognitive linguistics: The linguistics wars and beyond

As interest in the study of meaning grew, a group of Chomsky’s colleagues and students began work in generative semantics, which aimed to understand meaning from a generativist point of view. They began with the Katz–Postal hypothesis (Katz & Postal, 1964), which holds that transformations are meaning-preserving; taken to its logical conclusion, this implied that language has a core component in which syntax and semantics are closely interrelated. This proposition contrasted with Chomsky’s own position, which supported the autonomy of syntax, holding instead that syntax is the sole generative component in language, and that semantics is only applied once syntactic structures have been formed (see Jackendoff, 1972). The disagreement between these two factions became known as the linguistics wars, a protracted dispute that was not only conceptual but also rhetorical and academic in nature (Harris, 1995).

Eventually, the logical consequences of generative semantics meant that it became distinctly not generativist in flavour—for example, Lakoff and Ross questioned the need for a deep structure altogether (Lakoff & Ross, 1976). The group thus disbanded, with its members pursuing different—albeit all non-Chomskyan—frameworks for understanding language. One important school that emerged is cognitive linguistics, which (contra Chomsky) suggested that language relies on general cognitive processes and principles, rather than being an autonomous and independently organised unit in the brain (Croft & Cruse, 2004). The form of language, then, follows from its function (rather than being independent of it).

Cognitive linguistics has thus been influenced by cognitive psychological ideas regarding conceptual organisation, including polysemy, categorisation, prototype theory, and metaphor. This last notion refers not just to linguistic metaphors, but conceptual metaphors—the use of one domain to represent another typically more abstract domain as a mode of thought (Lakoff & Johnson, 1980). Lakoff and Johnson proposed that this principle is supported by image schemata, which are cognitive structures that enable abstract reasoning by identifying recurring patterns (e.g., the schema of “containment,” which is employed in spatial senses of the English word out; Johnson, 1987). Metaphors can thus exploit such schemata for other concepts (e.g., non-spatial senses of out, as in leave out).

Such conceptual structures are also relevant to linguistic forms, which are symbolically linked to these meanings (Langacker, 1991). Under the cognitivist notion that form reflects function, semantics is linked not only to lexical items, but also to grammatical arrangements (recalling earlier work by Bloomfield and Pike). This proposal is the premise of construction grammar, which suggests that language comprises constructions (i.e., pairings of form and meaning; see e.g., Fried & Nikiforidou, 2025; Hoffmann & Trousdale, 2013; Lakoff, 1977). Constructions arise whenever some aspect of their form or meaning is non-compositional, encompassing not only morphemes and words, but also multi-word expressions (e.g., jog X’s memory), idioms, and abstract grammatical “rules” such as the passive voice. For example, a key early paper explored the interaction between idiomaticity and productivity in the English phrase let alone (Fillmore et al., 1988). As a result, under this framework, there is no strict distinction between syntax and lexicon, and semantics is inextricably linked with morphology and syntax. Such a cognitive approach thus provides a paradigm that is radically different from Chomskyan generativism, and continues to evolve in tandem with developments in cognitive psychology.

7.5 Conclusion

Investigations into meaning from the late twentieth century onwards have demonstrated that meaning is systematic and structured, and an integral part of language. The relationship between meaning and the communicative function of language has also led to the development of functional linguistics, providing an alternative to more formal approaches. These efforts have also highlighted the importance of interdisciplinary scholarship, as insights from philosophy and psychology have resulted in entirely novel paradigms in linguistics. In summary, these developments have helped to bridge the abstract structures of language and its real-world use, permitting a richer conception of linguistic signs.