The Razor Ockham *Should* Have Proposed

Ockham’s Razor is the heuristic sometimes known as the lex parsimoniae: the Law of Parsimony. As he actually proposed it:

Numquam ponenda est pluralitas sine necessitate: Never posit plurality without necessity.

Ockham’s Razor as it is usually rendered:

Entia non sunt multiplicanda praeter necessitatem: Do not multiply entities beyond necessity.

The entities of a theory are its terms. They are not actual entities, but formal only. So the Razor is often rendered:

Do not multiply terms beyond necessity.

This makes it easy to compare theories and see which one is more parsimonious – especially if they are mathematically formalized. F = ma, for example, clearly invokes three terms, which terminate on three sorts of properties of things. The basic idea of course is that as between two theories that adequately explain some phenomenon, the simpler is more likely to be more accurate. But why?

In Finality Revived: Powers & Intentionality (Synthese, March 2016), David Oderberg suggests in passing that Ockham should instead have proposed:

Do not multiply mysteries beyond necessity.

There are two ways to multiply mysteries: to increase terms beyond adequacy, or to decrease them beneath adequacy. Oderberg’s version of the Razor covers both.

If you decrease terms too much, your explanation is inadequate. One common result is that you end up failing to explain all sorts of related things. For example, modernism, in rejecting final and formal causation, cannot account for mind, freedom, complexity, or order; whereas under the Aristotelico-Thomistic Grand Synthesis that it displaced, these had posed no particular problems.

If your explanation produces mysteries that your former explanation understood, then you are on the wrong track.

If on the other hand you increase terms too much, you introduce new mysteries; for, when it is first noticed, every new mystery is denoted by a new term; and vice versa. So, to introduce a term just is to introduce a new mystery, or rather to notice it. Every new term notices and relevates a new explanandum, each of which calls out for a new explanans, and bedevils us until we get it. This is a formalization of the fact that each new thing we learn increases the surface area of our ignorance; increases the precision of our understanding of how much we have still to learn.

Viz.: F = ma raises the question: what are force, mass, and acceleration? These are defined in terms of other entities, and each such term must itself be explicated in terms of yet others. That’s not always difficult to do, but it must be done, if the explanation is to amount to more than hand-waving.
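To see the regress concretely, consider the usual chain of textbook definitions behind Newton’s second law – a sketch only, using nothing beyond the standard kinematics:

F = ma; a = dv/dt; v = dx/dt

Force is explicated via mass and acceleration; acceleration via velocity and time; velocity via position and time; and mass, position, and time must then be explicated in their turn.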

So: don’t add a new mystery by adding a new term if in so doing you are not clearing up at least one old mystery.

***

God is the only term that does not introduce another explanandum – although he does of course introduce another mystery. For, while he needn’t be explained, nor can he be explained. Nevertheless no explanation that does not ultimately terminate and rest upon him can be complete, or therefore satisfactory; for it cannot otherwise rest, or therefore explain. Notions alienated from the Absolute cannot ultimately work; they break down; they fail; they stultify; they negate; they kill.

In the perfect and complete explanation, only one mystery remains: God.

15 thoughts on “The Razor Ockham *Should* Have Proposed”


  1. Brilliant.

    Explains… well… everything.

    The “decreasing” of terms beneath adequacy explains the failures of Marxism fully, while the “increasing” of terms beyond adequacy explains the phenomenologists, and their “magic thinking,” completely.

    So in a post of approximately 300 words, the Cartesian “enlightenment” is debunked while another Thomistic proof for the existence of God is provided.

    Numquam ponenda est pluralitas sine necessitate!

    • Gosh; thanks. And here I thought I was just nibbling away at a fairly recondite point in the history & philosophy of science.

      You are of course correct that the Marxian reduction of human life to mere conflict over material resources is radically inadequate. You are right furthermore that in its earnest attempt to be true to the data of inner life much of phenomenology is rather redolent of the epicycles upon epicycles of latter day Ptolemaic astronomy. And all we need do to see how supererogatory terms redound to practical detriment is consider the rococo self-referential ramifications of public policy or modern “art,” where each new thing demands yet another, because none of them quite work.

      The “error term” of so many recent theories testifies to their terminological misprision: it is needed either because they specify too much, or too little. The error term is a fudge. It is a way of saying, “here be dragons.”
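      To fix ideas with the standard textbook form: in a simple regression, y = β₀ + β₁x + ε, the ε is the error term, and it absorbs whatever variation in y the named terms fail to capture. A patterned ε – one that is not mere noise – is the model confessing that its terms are mis-specified: too many, too few, or the wrong ones altogether.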

      • …. yes, yes, yes and….

        …going from the theoretical down to the “rococo self-referential” practical determinants as to why this is the case, I have a theory…

        (Question: are you familiar with r/K reproduction selection theory?)

        I think it all has to do with the material conditions of resource acquisition and the corollary reproductive strategy chosen. There is something called r/K selection theory, whereby humans subconsciously select a reproduction strategy dependent on their material situation. (This is a process that begins in the womb, so it is in large part biologically determined.) Quickly: if there are unlimited resources (unlimited grass for rabbits, let’s say – or access to easy money for humans), a large part of the population will choose an r strategy. On the other hand, when there are limited resources (a small number of rabbits supporting a large wolf population – or tight credit conditions, such as in a barter economy or under a gold standard), the population will choose a K selection strategy.

        Presently, with the rise of debt financing since the end of WWII, access to resources (money/debt) is easy, so r selection is being chosen by a larger portion of the population than under “normal/natural” (naturally sustainable) conditions. This r selection then germinates into a state of consciousness in which everything is free (think rabbit and grass). This then turns into “magic thinking” in the subconscious of the r population, since they think everything is free; and voilà, we have what we have today.

        The snowflake society!

        And when a guy like Trump comes along and wants to limit the access of the r population to “free resources,” all hell breaks loose.
        Kind of like what we are seeing today.

        Here is some background (excuse the self-promotion): Lex Armaticus, r/K Theory And The Rabbit Papacy

      • I am indeed familiar with r/K theory. It seems like common sense. TANSTAAFL is K; ¬ TANSTAAFL is r. But TANSTAAFL is true. So, reality is K.

        My quibble with the theory is with the notion that the historical outcomes are biologically determined. That strikes me as verging on the same sort of reductive error to which any sort of materialist analysis is prone, insofar as it is unbaptized (and, so, ignorant of the spiritual aspect of all matter). The historical outcomes are rather, I think, dependent upon the mind’s apprehension of the nature of the relevant solution space. The mind gauges environmental plenty, and works its way out to a moral and reproductive strategy that is more or less apt to its proximal circumstances. The wealthy prodigal son is not crazy to opt for prodigality, until he is. So, his judgements at every point in his career are not determined, but rather rational, and free.

        The key thing is to remember that the r strategy is not irrational, but rather only, merely, errant in its apprehensions of reality, so wicked, and wrong.

      • I wholeheartedly agree.

        Since God created man in His image, man possesses free will. He therefore can choose, so no biologically determined outcome is absolute.



  2. “[T]he modern rejection of final and formal causation…”

    Not only does modern science dispense with the notion of final causes but, at its most successful, with the very notion of causality.

    Newton did not, as is commonly supposed, ask himself what caused the apple to fall; he asked how fast it fell. This is something that is not only observable, but measurable. That measurement he was able to correlate with others: measurements of mass, distance, time. By using them as variables in differential equations, the constant relationship between them can be expressed and predictions can be made.

    This relationship is functional, not causal. When he speaks of “force,” this is not, in this context, a causal term; “force” is defined as the product of the mass and the acceleration and it is simply the name given to one of the variables used in the equations.

    Thus, instead of “causes,” we have “laws.” These laws can be generalised; as Bl John Henry Newman noted, “phenomena, which seem very different from each other, admit of being grouped together as modes of the operation of one hypothetical law, acting under varied circumstances. For instance, the motion of a stone falling freely, of a projectile, and of a planet…”
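    In symbols – the standard Newtonian results, offered only by way of illustration of Newman’s point: the one law of universal gravitation, F = GMm/r², acting under varied circumstances, yields all three of his examples. Near the earth’s surface it gives the freely falling stone its constant acceleration g = GM/R² ≈ 9.8 m/s²; that same constant acceleration, compounded with an initial horizontal velocity, gives the projectile its parabola; and the full inverse-square dependence on r gives the planet its ellipse.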

    “Thou hast ordered all things in measure, and number, and weight” (Wis XI:21)

    • “Not only does modern science dispense with the notion of final causes but, at its most successful, with the very notion of causality.”

      It tries, and thinks it succeeds; but it fails.

      The laws smuggle formal and final causation back into the metaphysical mix under cover. So doing they save efficient and material causation (which logically reduce to formal and final causation, which are then mutually reducible). So they rescue causation per se under different terms. They are a different way of characterizing causal relations among events. Instead of saying that x causes y, they say that x & y are related in certain regular – ‘lawful’ – ways. But either their lawful relation is necessary and eternal, or it is contingent; and to the extent that it is contingent, it is causal. In that event, there is true affection between events, and so true eventuation. Only thus might anything unnecessary – such as the thoughts of Isaac Newton – actually happen.

      Note that I am not disagreeing with you here.

    • Has modern science been as clever as you claim? In its reductionist tendency, it has relegated what it is unable to measure or explain into the domain of the mind. Since it is unable to explain the mind, many of its exponents deny its existence and, in doing so, deny themselves.

      • Modern ontology – not the science or the scientists, properly speaking, but the modernist metaphysicians who interpret science – is too clever by half. It thinks it has eliminated the superfluous terms of formal and final causation, when it has not. It deludes itself. It takes a lot of brains to pull that off. It takes a lot of brainpower to believe in something as crazy as eliminative materialism – which is to say, materialism, period full stop – when you must contravene it, and thus disprove it experimentally, in the very process of avowing it.

        Psychopathy is clustered at the right end of the intelligence distribution.

