This is part 3 of a three part series on Scholasticism and some topics in the philosophy of science, loosely organized around a review of Edward Feser’s new book Aristotle’s Revenge: The Metaphysical Foundations of Physical and Biological Science.
Probably no development in physics has deeper metaphysical implications than quantum mechanics, although it is not yet clear what those implications are. Classically, we are used to the claim that a subject has a given property being either definitely true or definitely false, not a linear combination, not relative to an observer or a “framework”. The subject/predicate relation has been at the heart of logic since Aristotle, and indications that our understanding of it needs to be broadened are exciting to scientists and philosophers alike.
Heisenberg himself pointed out that indeterminate properties in his theory are reminiscent of Aristotelian potencies. In his book, Feser reports speculations by scholastic philosophers to the effect that matter at small scales is somehow closer to prime matter than macroscopic substances are, and that it is therefore expected to be less determinate, less actual and more potential. Their arguments to this effect are rather unclear, but perhaps we can do better. I would not say that elementary particles have more potential in the sense of freedom of possibility. The Hilbert space of a single particle (assuming, for the moment, that the particle is in a pure state, so that its state is a vector in this space) is small compared to that of a many-particle system. Nor is the indeterminacy of an elementary particle necessarily larger in an absolute sense–the uncertainty principles apply to all objects–but it is generally larger in a relative sense.
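The Hilbert-space comparison can be made concrete with a toy count (the choice of spin-1/2 subsystems here is my illustration, not the text's): a single two-level particle has a 2-dimensional state space, while n such particles share a tensor-product space of dimension 2^n.

```python
def hilbert_dim(n_particles, single_dim=2):
    """Dimension of the composite pure-state space of n subsystems,
    each with a single_dim-dimensional Hilbert space (tensor product)."""
    return single_dim ** n_particles

print(hilbert_dim(1))    # 2: a single spin-1/2 particle
print(hilbert_dim(10))   # 1024
print(hilbert_dim(100))  # astronomically large for a modest many-body system
```

The exponential growth is the point: whatever "more potential" means, a single particle's space of possible pure states is vanishingly small next to that of any macroscopic aggregate.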
Let’s try to make this more explicit. Inspired by Leibniz’ attempt to identify the act of Scholasticism with kinetic energy, let us try associating it with something Lorentz invariant: the action $S_0$ along the classical path. Other paths will suffer destructive interference if their actions differ from this extremal value by order $\hbar$. So we might guess $\hbar/S_0$ as our measure of relative indefiniteness. This won’t work for massless particles or fields, but for a particle of mass $m$ and lifetime $\tau$, we get $\hbar/(mc^2\tau)$, which seems reasonable enough. [One could, of course, also get this directly from the uncertainty principle as $\Delta E/E \sim \hbar/(mc^2\tau)$.]
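As a numerical gloss on this estimate (the particle choices and constants are my illustration, assuming the measure $\hbar/(mc^2\tau)$ above), compare a long-lived particle with a short-lived resonance:

```python
# Relative indefiniteness hbar / (m c^2 * tau) for two particles.
HBAR = 1.054571817e-34      # J*s
MEV_TO_J = 1.602176634e-13  # joules per MeV

def relative_indefiniteness(rest_energy_mev, lifetime_s):
    """Dimensionless measure hbar / (m c^2 * tau) sketched in the text."""
    return HBAR / (rest_energy_mev * MEV_TO_J * lifetime_s)

# Muon: rest energy ~105.66 MeV, lifetime ~2.197 microseconds.
muon = relative_indefiniteness(105.66, 2.197e-6)
# Delta(1232) baryon: a resonance with width ~117 MeV, so tau ~ hbar/Gamma.
delta = relative_indefiniteness(1232.0, 5.63e-24)

print(f"muon:  {muon:.2e}")   # ~1e-18: essentially definite by this measure
print(f"Delta: {delta:.2e}")  # ~0.1: strongly indefinite
```

The muon comes out essentially definite on this measure, while a hadronic resonance, whose lifetime is comparable to $\hbar$ divided by its rest energy, is strongly indefinite, which matches the intuition that "particles" near the resonance regime barely merit the name.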
On the other hand, if one works from some objective collapse interpretation of quantum mechanics, the task is much simpler. Definite things are those that have whatever it is that triggers wavefunction collapse / state vector reduction.
The most interesting aspect of these speculations is that they invite us to reconsider Leibniz’s task of connecting Aristotelian categories directly to mathematical structures in physics.
Parts and wholes: is water H₂O?
It is a very deep belief of the modern mind that parts are ontologically prior to–“more real than”–their wholes. And yet, this is not a result of any scientific discovery but just a largely unquestioned metaphysical prejudice. Thomists claim rather that wholes, when they constitute substances, are ontologically prior. They frame this priority in terms of the potency/act duality: the composite substance exists in act, its components merely virtually or in potency.
Thus, in water (Feser’s example), hydrogen and oxygen exist but only in this attenuated sense. David Oderberg argues for this understanding by noting that water lacks the distinctive properties that accompany hydrogen and oxygen, implying that they are not present in act. As an argument against atomism, this doesn’t work. Atomists do not claim that water is made of hydrogen (which they regard as the gas that is an aggregate of H₂ molecules) and oxygen (the gas that is an aggregate of O₂ molecules) but that water is an aggregate of H₂O molecules, and it is not clear that the hydrogen and oxygen atoms do not retain sufficient essential properties to be identifiable within individual molecules. However, Oderberg’s point is also intended, for those who accept such categories, as a criterion that substantial union has been achieved. More generally, substances are said to have powers and properties that are irreducible to those of their components. Of course, atomists who endorse the phenomenon of emergence also accept irreducible emergent properties, so this observation is really not as controversial as the metaphysics in which it is embedded.
It sounds strange to hear that the components of my body (molecules? cells? macroscopic organs?) exist in only a “potential” or “virtual” sense. No stranger, though, than the mainstream view that I myself exist only nominally, that in reality there are only excitations of quantum fields. Why look for ontological priority at all? Why not say that my atoms exist, that I exist, that all existences are on a level, and have that be that? For the Thomists, the motivation sometimes given is that the unity of substances must be properly acknowledged. Both Aristotelians and atomists seem to be worried that putting all existences on a level would mean double counting, multiplying beings unnecessarily in the mind.
Teleology in the living world
That eyes are for seeing and hearts are for pumping blood would seem to be uncontroversial scientific facts. But if one is committed to the view that teleology doesn’t really exist in nature, one will be driven to find some way of explaining away the apparent functionality of biological organs. The most popular strategy today is to appeal to natural selection–the function of an organ is whatever caused it to be selected for. Feser shows that this criterion is subject to indeterminacy problems; in some cases, a number of non-equivalent descriptions would all have an equal claim to be the function.
However, I think his first, simpler argument is more decisive. The function of a thing cannot depend on its history. If the first human being had suddenly popped into existence fully formed five minutes ago, that would surely not change the function of the human eyes and heart. Clearly, causation actually works the other way, e.g. eyes were selected for because they are for seeing, which is an adaptive skill.
As Feser has argued many times, pushing non-mechanistic features into the mind is not a viable long-term materialist strategy, because it just makes the mystery of the mind completely intractable. To use the delightful analogy from his blog, the strategy of sweeping dirt under the rug is guaranteed to fail when the time comes to clean under the rug.
Conclusion: giving the world its due
I have argued for Aristotelianism-Scholasticism as materialism done right, but this might now seem to have been a bait-and-switch. We have made do without Platonic Forms and Cartesian egos, but we have ended up acknowledging final causes in nature. And there is more. By making the laws of physics immanent in nature, we have captured well the universe as it appears through the scientific method–as an ordered and intelligible but radically contingent being, a combination most naturally explained by invoking a transcendent creator Deity.
Could we by a similar move attribute the features of God instead to nature, say to man, as Feuerbach insisted? We could not. Denying the contingency of the world goes against the premisses of the scientific method, because if all beings are necessary, observation and experimentation are unnecessary. It goes against the formalisms of mathematical physics, all of which involve an act/potency, state/state space split. To deny contingency would be to make the connection between mathematical structure and the subject in which it is instantiated necessary, which would tend to reduce the latter to the former, to a large extent removing the materiality of matter. By offloading pure actuality onto God, we let material beings be material beings.