How do explanations within physics relate to explanations in other sciences, and what different levels of explanation can be distinguished within physics itself? To help answer these questions, on Monday 3rd April FraMEPhys hosted a workshop at the University of Birmingham on Levels of Explanation, with talks from Karen Crowther, Alex Franklin, Lina Jansson, Eleanor Knox, Christian List and David Yates.
On 14 March 2019 Al Wilson was at the Department of Philosophy at Stockholm University to give a talk titled ‘Emergent Contingency’ – on the general prospects of naturalistic metaphysics, on how to bring science to bear on modality, and on how Everettian quantum theory can underwrite a naturalistic theory of contingency. Fun was had (we think) by all. Abstract and slides are below!
Abstract: I develop and defend a reductive account of objective contingency in nature, drawing on resources from Everettian (many-worlds) quantum mechanics. I distinguish four degrees of naturalistic involvement in the theory of modality; the proposed quantum modal realism is naturalistic in all four senses. I also sketch some consequences of the account for the methodology of metaphysics.
Our next visitor in the FraMEPhys Seminar series was Dr Laura Felline (Roma Tre) who spoke on Monday 11th March. Laura’s title was ‘The Measurement Problem in Quantum Information Theory’ and her abstract was as follows:
“In this talk I criticize the idea, widespread among advocates of Quantum Information Theory, that in order to explain away the measurement problem it is sufficient to reject the assumption that the quantum state represents physical objects. To do so, I will analyse three notable information-theoretic approaches to QT: Bub’s new information-theoretic interpretation (as a representative of a psi-ontic approach), Pitowsky’s Bayesian interpretation (as a representative of an objective psi-epistemic approach) and QBism (as a representative of a subjective psi-epistemic approach). I will argue that the measurement problem still affects the first two interpretations, while QBism leads to an unacceptable clash between the alleged content of quantum theory and scientific practice.”
Before the talk, there was a reading group with the speaker in ERI G54 from 1.30-2.30pm. The paper discussed was “An Introduction to QBism with an Application to the Locality of Quantum Mechanics” by Fuchs, Mermin and Schack, available here: https://arxiv.org/pdf/1311.5253.pdf
The second talk in our spring FraMEPhys Seminar series was given by Dr Antonio Vassallo (University of Barcelona) on Monday 25th February 2019. Antonio’s title was “Dependence Relations in General Relativity”, and his abstract was as follows:
“I will discuss the nature of the dependence relations underpinning the talk of mutual action between material and spatiotemporal structures in general relativity. In particular, I will present a case study involving frame-dragging effects. Frame-dragging relates local inertial frames to distant distributions of matter in a time-independent way, thus establishing some sort of non-local link between the two. For this reason, a plain causal interpretation of frame-dragging faces huge challenges. By using a generalized structural equation model analysis I will argue that frame-dragging is best understood in terms of a novel type of dependence relation that is half-way between causation and grounding.”
FraMEPhys postdoc Dr Katie Robertson gave a talk to the Oxford Philosophy of Physics seminar on Thursday 21 February. Here are the details:
Katie’s title was ‘Reducing the second law of thermodynamics: the demons and difficulties’, and her abstract was as follows:

“In this talk I consider how to reduce the second law of thermodynamics. I first discuss what I mean by ‘reduction’, and emphasize how functionalism can be helpful in securing reductions. Then I articulate the second law, and discuss what the ramifications of Maxwell’s demon are for the status of the second law. Should we take Maxwell’s means-relative approach? I argue no: the second law is not a relic of our inability to manipulate individual molecules in the manner of the nimble-fingered demon. When articulating the second law, I take care to distinguish it from the minus first law (Brown and Uffink 2001); the latter concerns the spontaneous approach to equilibrium whereas the former concerns the thermodynamic entropy change between equilibrium states, especially in quasi-static processes. Distinguishing these laws alters the reductive project (Luczak 2018): locating what Callender (1999) calls the Holy Grail – a non-decreasing statistical mechanical quantity to call entropy – is neither necessary nor sufficient. Instead, we must find a quantity that plays the right role, viz. to be constant in adiabatic quasi-static processes and increasing in non-quasi-static processes, and I argue that the Gibbs entropy plays this role.”
Inspired by a combination of Immanuel Kant’s philosophy of mathematics and Al Wilson’s notion of grounding as metaphysical causation, Sloman draws attention to the extraordinary metaphysical creativity of biological evolution (the most creative mechanism known to us), which repeatedly “discovers” and instantiates new metaphysical types of ever-increasing complexity and generative power, building on (still unidentified) generative features of fundamental physics that made everything else possible, including increasingly complex and varied forms and uses of information (mostly via chemistry).
He suggests that key features of evolution constitute a process in which pre-existing parametrisable mathematical structures of ever-increasing complexity and generative power are systematically “discovered”, combined, and used to create new parametrised instances of newly discovered metaphysical types, including not only new physical structures and processes but also increasingly complex and powerful new types of information and information-processing mechanisms. This creative, productive grounding can be construed as exemplifying Wilson’s characterisation of Grounding as Metaphysical Causation [G=MC].
The details of this process, and of its products, pose deep challenges for both neuroscience and current AI, neither of which explains the ability of animal brains to discover and use powerful mathematical theories, e.g. concerning topology and geometry. Sloman also links this to Alan Turing’s suggestion (1938) that digital computers cannot replicate human mathematical intuition, only mathematical ingenuity.
Emily’s title was ‘A Tale of Two Anachronisms’, and her abstract was:
“Scientific reasoning is constrained not only by the outcomes of experiments, but also by the history of human thought and our own place in it. As a result, even our best theoretical models often incorporate features which are present more as the result of historical accident than as the endpoint of a process of evidence-based deliberation, and it is sometimes possible to make considerable progress by identifying and eliminating such features. In this talk, I will identify two features of current thought about quantum physics which may be anachronisms of this kind. I will briefly discuss their history and then raise some arguments against them. Both of these features have previously been recognized as problematic by parts of the physics community, but I argue that this recognition is not sufficiently widespread and that both features are actively limiting progress in the field of quantum foundations.”