Definition

Given a law that holds in some object area, a particular state of affairs that does not conform to the law constitutes an exception to it.

Formulated in logical terms, this means: Given a proposition – call it the law-statement – of the following form:

(∀x) P(x)

and another proposition – call it the exception-statement – of the following form:

(∃x) ¬P(x)

then the combination (logically: the conjunction by ‘∧’) of these two propositions leads to a contradiction. In such a situation, the exception-statement formulates an exception to the law-statement.
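For illustration, the inconsistency can be made palpable with a small Python sketch; the domain and the property P are invented for the purpose, and any finite domain would serve equally well:

# Illustration only: a finite domain of x and an arbitrary property P.
domain = [2, 4, 6, 7]

def P(x):
    return x % 2 == 0

law_statement = all(P(x) for x in domain)             # (∀x) P(x)
exception_statement = any(not P(x) for x in domain)   # (∃x) ¬P(x)

# Over one and the same domain, the two can never both be true:
# their conjunction is a contradiction.
assert not (law_statement and exception_statement)
print(law_statement, exception_statement)             # False True: the law is falsified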

As the formalization shows, a theory cannot contain such combinations of propositions, i.e. it cannot accept exceptions as such. They must be analyzed with the goal of restoring consistency to the theory. The following are common methods to “get rid of” exceptions:

Statistical hedging

By the usual statistical procedures (taking a sample of the x in question and analyzing the distribution of their properties), one determines the percentage of x for which P holds and the complementary percentage for which P does not hold. Then one reformulates the law-statement as a statistical statement, e.g.

(95% x) P(x)

or equivalently (where p is ‘probability’, used here as a pseudo-quantifier):

(p=0.95 x) P(x)

This solution to the problem involves no further analysis. It weakens the theory, which now accounts for only 95% instead of 100% of the x. In relation to the theory, the only purpose and achievement of statistical hedging is to save the theory from falsification, which is not a goal to be pursued in science (see elsewhere). In relation to the application of the theory, the practitioner is told that when dealing with elements x, he cannot take for granted that they will be P, but can expect it only with a certain probability.
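The procedure can be sketched in a few lines of Python; the sample and the property P are, again, invented for the illustration:

# Illustration only: a hypothetical sample of x and an arbitrary property P.
sample = [2, 4, 6, 8, 10, 12, 14, 16, 18, 7]

def P(x):
    return x % 2 == 0

p = sum(P(x) for x in sample) / len(sample)   # proportion of x for which P(x) holds
print(f"(p={p:.2f} x) P(x)")                  # prints: (p=0.90 x) P(x)

# The hedged law-statement no longer contradicts the exception-statement,
# but it also predicts less: P(x) is to be expected only with probability p.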

There is an even easier way out: instead of venturing the above law-statement and instead of making the statistical analysis, the scientist formulates something like ‘with a couple of exceptions / with much greater than chance frequency, (∀x) P(x)’. This is even worse than the statistical solution, since it is not even made explicit how many exceptions “a couple” amounts to or how great “much greater” is. It is, nevertheless, one of the ways out most frequently taken in the literature. It is usually admitted in situations where ‘a couple’ is taken to mean something like ‘at most 5%’ and nothing in the argument hinges on the difference between 95% and 100%.

Delimiting the law

By analyzing the properties – other than P – of those x for which P(x) and of those x for which ¬P(x), one finds out that ¬P(x) holds only if Q(x) also holds. Then one restricts the law-statement as follows:

(∀x) ¬Q(x) → P(x)

The law-statement now no longer holds for the entire set of x, but only for that subset of x for which ¬Q(x). While this, too, weakens the initial law-statement, it may be closer to the truth and, at the same time, may provide insight into the nature of x and into some intrinsic connection between P and Q.

This is the established method in grammatical analysis. For instance, let the law-statement be the following proposition as part of a grammar of German: ‘in subordinate clauses, the finite verb occupies the last position’. An exception to this statement is constituted by sentences of the kind hast du was, bist du was ‘if you have something, you are something’. By further analysis, one finds out that in this kind of exception, the subordinate clause is not introduced by a conjunction. Then one modifies the law as follows: ‘in subordinate clauses that are introduced by a conjunction, the finite verb occupies the last position’.1 Further analysis may then show that subordinate clauses not introduced by a conjunction are subject to laws, too; and in the most fortunate case, these laws form a coherent set with the afore-mentioned law.
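The delimitation can be sketched with toy data modelled on the German example; the clause records are invented, and Q(x) is taken to be ‘x is not introduced by a conjunction’:

# Illustration only: toy records of German subordinate clauses.
clauses = [
    {"clause": "dass du was hast", "conjunction": True,  "verb_final": True},
    {"clause": "weil du was bist", "conjunction": True,  "verb_final": True},
    {"clause": "hast du was",      "conjunction": False, "verb_final": False},
]

# Original law-statement: (∀x) P(x); falsified by the third clause.
original_law = all(c["verb_final"] for c in clauses)

# Delimited law-statement: (∀x) ¬Q(x) → P(x);
# the law is now asserted only of clauses introduced by a conjunction.
delimited_law = all(c["verb_final"] for c in clauses if c["conjunction"])

print(original_law, delimited_law)   # False True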

Embedding the law into an implicational hierarchy

An implicational hierarchy gives a statement a place in a set of similar statements, but of different range. Take the following statement as part of a theory of linguistic borrowing: ‘inflectional affixes are never borrowed’. This statement meets with a couple of exceptions. However, there is a theoretical background to the law-statement which is, roughly: ‘lexical items are easily borrowed; and quite in general, free items are easily borrowed, while bound items are not normally borrowed’. If that is the idea, then the solution to the problem posed by the exceptions is to embed the law-statement into an implicational hierarchy of the following form:

Borrowability hierarchy
lexical items > free grammatical morphemes > inflectional affixes > ...

whose interpretation is: if a language borrows items at one point of the hierarchy, then it also borrows items at all points to the left of it in the hierarchy. In the optimal case, this hierarchy is exceptionless. Moreover, it formalizes the more general intuition that free material is more easily borrowed than bound material and that lexical material is more easily borrowed than grammatical material.
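This interpretation, too, can be sketched in Python; the borrowing records are invented, and only the three points of the hierarchy shown above are used:

# Illustration only: the hierarchy and hypothetical borrowing records.
HIERARCHY = ["lexical items", "free grammatical morphemes", "inflectional affixes"]

def conforms(borrowed):
    """True if the borrowed categories form a left-closed stretch of the hierarchy."""
    flags = [category in borrowed for category in HIERARCHY]
    # If a point is borrowed, every point to its left must be borrowed as well.
    return all(all(flags[:i]) for i, borrowed_here in enumerate(flags) if borrowed_here)

print(conforms({"lexical items"}))                                # True
print(conforms({"lexical items", "free grammatical morphemes"}))  # True
print(conforms({"inflectional affixes"}))                         # False: an exception to the hierarchy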

The implicational hierarchy shares with the delimitation of the law-statement the formal property that the law-statement is part of an implication. In contradistinction to the former case, however, the law-statement is here not the implicatum, but the implicans. The implicational hierarchy expresses the fact that phenomena near its right end are rare or even non-existent because they are relatively complex in comparison with much simpler phenomena in the same functional domain, which are usually sufficient to attain the relevant goals.

Reconceiving the entire theory

In the development of a science, it happens not so rarely that some law-statement assumes the role of a dogma despite the recognition of “a couple” of exceptions, just because it is an essential constituent of some doctrine that corresponds to a paradigm widely shared in the scientific community. It may then take a fresh look at the facts and an independent conception of their nature to see the extent to which the law-statement does not hold, i.e. to see that one is not actually dealing with a couple of exceptions, but with an entirely different state of affairs.

Take the arbitrariness of the language sign as an example. After it was formulated in Saussure 1916, it assumed the role of a dogma in linguistics. It acquired a central theoretical and even methodological position not only in structural, but also in historical-comparative linguistics. This happened despite the fact that even Saussure had recognized a set of exceptions, viz. in onomatopoeia. It was not until the publication of Jakobson 1966 that the dogma was overthrown by the recognition that the motivation of the language sign is just as real and just as important as its arbitrariness and that it is an empirical task of linguistics to find out where and under what conditions each of them prevails. Now motivated signs no longer have the status of exceptions. Instead, the theory states as an essential feature of human language that different kinds of linguistic signs are motivated (or arbitrary) to different extents under different conditions; the details remain to be investigated. That is, the methods of delimiting the law-statement and of embedding it in an implicational hierarchy remain to be applied.

It is obvious that this is a costly solution because much scientific work done theretofore loses its theoretical – and in this case, methodological – foundation. It may amount to a ‘paradigm change’ (Kuhn 1996), something that does not happen so often in the history of a science.

Renouncing consistency of the theory

The statistical solution of §2 is, in principle, always available. However, it tends to imply that if one seriously analyzed the distribution of the properties in the object area, one would be able to replace the statistical reformulation by a delimitation of the law as in §3. In the humanities, however, the solution of delimiting the law is often not available in principle, because the exceptions are erratic. What one wants to express is that no law in the world is sufficient to entirely predict human “behavior” and that no law in the world can prevent a human being from acting in ways not foreseen by science. That is, one does not accept determinism and therefore renounces consistency of (non-probabilistic) theories that are meant to account for human behavior and activity.

Degrammaticalization is a relevant example. As explained elsewhere, grammaticalization is a directed process whose logical inverse is almost never observed to occur. Unidirectionality is therefore an important law in the theory of grammaticalization. There are, however, a “couple of exceptions”. A relatively undisputed example is the detachment of the numeral suffix -anta in Italian (as in sentences like lei è già negli anta ‘she is already over forty’). Examples of degrammaticalization have so far not been found to share any relevant properties besides those that are definitional. That is, the solution of delimiting the law-statement is not available. The most one can say is that language is an activity of a free being and that sometimes people are more creative than scientists. An adequate theory of human language has to treat it as a free human activity; and if that is so, the theory will not be consistent.2

As may be seen, this is the costliest solution insofar as it may damage the status of the discipline as a science. It then depends on the scientific community and on society at large what significance and prestige they assign to consistent theories.


1 For the purpose of this illustration, it does not matter that the statement suffers from other kinds of exceptions, too.

2 “If one is interested in generalization rather than arbitrary facts, one must put aside the exceptions, because unless they can be subsumed under some further generalization, they cannot be explained. [ ... ] Exceptions cannot be understood by definition; they are the residue that resists explanation.” (Haspelmath 2004: 23)


References

Haspelmath, Martin 2004, "On directionality in language change with particular reference to grammaticalization." Fischer, Olga & Norde, Muriel & Perridon, Harry (eds.), Up and down the cline. The nature of grammaticalization. Amsterdam & Philadelphia: J. Benjamins (Typological Studies in Language, 59); 17-44.

Jakobson, Roman 1966, "Quest for the essence of language." Diogenes 51: 21-37.

Saussure, Ferdinand de 1916, Cours de linguistique générale, publié par Charles Bally et Albert Séchehaye avec la collaboration de Albert Riedlinger. Paris: Payot.