-
In operational quantum mechanics two measurements are called operationally equivalent if they yield the same distribution of outcomes in every quantum state and hence are represented by the same operator. In this paper, I will show that the ontological models for quantum mechanics and, more generally, for any operational theory sensitively depend on which measurement we choose from the class of operationally equivalent measurements, or more precisely, on which of the chosen measurements can be performed simultaneously. To this end, I will first take three examples—a classical theory, the EPR-Bell scenario, and the Popescu-Rohrlich box; then realize each example by two operationally equivalent but different operational theories—one with a trivial and another with a non-trivial compatibility structure; and finally show that the ontological models for the different theories will differ with respect to their causal structure, contextuality, and fine-tuning.
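The Popescu-Rohrlich box has a standard operational description that can be spelled out numerically. The following sketch is my own illustration, not taken from the paper: it encodes the box's defining conditional distribution P(a, b | x, y) and computes its CHSH value, which reaches the algebraic maximum of 4.

```python
# Illustrative sketch (not from the paper): the Popescu-Rohrlich box as a
# conditional distribution with uniform marginals and perfect correlation
# a XOR b = x AND y. Its CHSH value is 4, above the quantum (Tsirelson)
# bound of 2*sqrt(2) ~ 2.83 and the classical bound of 2.
import itertools

def pr_box(a, b, x, y):
    """P(a, b | x, y): 1/2 if a XOR b == x AND y, else 0."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(x, y):
    """E(x, y) = sum over a, b of (-1)^(a+b) * P(a, b | x, y)."""
    return sum((-1) ** (a + b) * pr_box(a, b, x, y)
               for a, b in itertools.product([0, 1], repeat=2))

# CHSH combination E(0,0) + E(0,1) + E(1,0) - E(1,1)
chsh = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
print(chsh)  # 4.0
```

The three correlators with (x, y) != (1, 1) equal +1 and the last equals -1, which is why no assignment of pre-existing outcomes (and no quantum state) can reproduce the box.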
-
QBism explicitly takes the subjective view: probabilities of events are defined solely by past experiences, i.e. the record of observations. As its authors put it (Fuchs et al., 2013), this “... removes the paradoxes, conundra, and pseudo-problems that have plagued quantum foundations for the past nine decades”. It is criticised for its lack of ontology and its anthropocentric nature. However, if Everett's (1957) formulation is taken at face value, exactly the features of QBism result, and the ontology is inherent. The anthropocentric nature of the solution is simply an indication that the quantum state is relative, as is central to Everett. Problems of measurement and locality do not arise.
-
In Part 1 the properties of QBism are shown to be natural consequences of taking quantum mechanics at face value, as does Everett in his Relative State Formulation (1957). In Part 2 supporting evidence is presented. Parmenides' (Palmer, 2012) notion that the physical world is static and unchanging is vividly confirmed in the new physics. This means the time evolution of the physical world perceived by observers only occurs at the level of appearances as noted by Davies (2002). In order to generate this appearance of time evolution, a moving frame of reference is required: this is the only possible explanation of the enactment of the dynamics of physics in a static universe.
-
Despite the simplicity of Weyl's solution to the paradox of the passage of time in the static block universe, virtually no interest is shown in this approach although as shown in Part 2, the problem of the Now could be taken as evidence for his solution being correct. A moving frame of reference is required to explain the experience of the enactment of any of the dynamics of physics, and the experiencing consciousness supervenes on this phenomenon. Given the logic involved is straightforward, it seems that the reasons all this has been ignored may be less so. Here it is suggested, based on Davies' (2006) research, that this might well involve a horror of even the possibility of deity and mysticism being dignified by discussion, let alone endorsement. The objective here is to demonstrate that this approach does validate certain archetypal myths of the great spiritual traditions, but at the same time fully supports and reinforces the objective basis of the science of physics. The myths are exploded to reveal simply scientific principles, and a complete absence of gods or mystical phenomena, indeed such things are categorically ruled out. The scientific principles illustrated by the third logical type which have languished unexamined turn out to be powerful knowledge which serves only to reinforce and emphasise how deeply flawed were the key principles of the religious preoccupations which our culture had to relinquish in order to move forward.
-
The localization problem in relativistic quantum theory has persisted for more than seven decades, yet it is largely unknown and continues to perplex even those well-versed in the subject. At the heart of this problem lies a fundamental conflict between localizability and relativistic causality, which can also be construed as part of the broader dichotomy between measurement and unitary dynamics. This article provides a historical review of the localization problem in one-particle relativistic quantum mechanics, clarifying some persistent misconceptions in the literature, and underscoring the antinomy between causal dynamics and localized observables.
-
— While emergentism enjoys some good fortune in contemporary philosophy, attempts at elucidating the history of this view are rare. Among such attempts, by far the most influential is McLaughlin’s landmark paper “The Rise and Fall of British Emergentism” (1992). While McLaughlin’s analysis of the recent history of emergentism is insightful and instructive in its own ways, in the present paper we offer reasons to be suspicious of some of its central claims. In particular, we advance evidence that rebuts McLaughlin’s contention that British Emergentism fell in the 1920s–1930s not because of philosophical criticism but because of an alleged empirical inconsistency with fledgling quantum mechanics.
-
Comparative philosophy of religion is a subfield of both philosophy of
religion and comparative philosophy. Philosophy of religion engages
with philosophical questions related to religious belief and practice,
including questions concerning the concept of religion itself. Comparative philosophy compares concepts, theories, and arguments from
diverse philosophical traditions. The term “comparative
philosophy of religion” can refer to the comparative
philosophical study of different religions or of different
philosophies of religion. It can thus be either a first-order
philosophical discipline—investigating matters to do with
religion—or a second-order philosophical discipline,
investigating matters to do with philosophical inquiry into religion.
-
High speed store required: 947 words
No. of bits in a word: 64
Is the program overlaid? No
No. of magnetic tapes required: None
What other peripherals are used? Card Reader; Line Printer
No. of cards in combined program and test deck: 112
Card punching code: EBCDIC
Keywords: Atomic, Molecular, Nuclear, Rotation Matrix, Rotation Group, Representation, Euler Angle, Symmetry, Helicity, Correlation.
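The catalogue entry above concerns a program dealing with rotation-group representations parameterized by Euler angles. As a rough illustration of that parameterization (a Python sketch of mine, not the catalogued card-deck program itself), the z-y-z Euler-angle construction of an ordinary 3×3 rotation matrix, with an orthogonality check, looks like this:

```python
# Sketch: R(alpha, beta, gamma) = Rz(alpha) * Ry(beta) * Rz(gamma),
# the z-y-z Euler-angle convention common in rotation-group work.
import math

def rz(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def ry(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_zyz(alpha, beta, gamma):
    return matmul(rz(alpha), matmul(ry(beta), rz(gamma)))

R = euler_zyz(0.3, 1.1, -0.7)
# Rotation matrices are orthogonal: R * R^T should be the identity.
identity = matmul(R, [list(row) for row in zip(*R)])
```

The angle values here are arbitrary; any triple (alpha, beta, gamma) yields a proper rotation.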
-
The most common argument that mathematical truth is not provability uses Tarski’s indefinability of truth theorem or Gödel’s first incompleteness theorem. But while this is a powerful argument, it won’t convince an intuitionist who rejects the law of excluded middle. …
-
I gave a talk on March 8 at an AI, Systems, and Society Conference at the Emory Center for Ethics. The organizer, Alex Tolbert (who had been a student at Virginia Tech), suggested I speak about controversies in statistics, especially P-hacking in statistical significance testing. …
-
To promise something, I need to communicate something to you. What is that thing that I need to communicate to you? To a first approximation, what I need to communicate to you is that I am promising. But that’s circular: it says that promising is communicating that I am promising. …
-
A neglected but challenging argument developed by Peter Geach, John Haldane, and Stephen Rothman purports to show that reproduction cannot be explained by natural selection and is irreducibly teleological. Meanwhile, the most plausible definitions of life include reproduction as a constitutive feature. The implication of combining these ideas is that life cannot be explained by natural selection and is irreducibly teleological. This does not entail that life cannot be explained in evolutionary terms of some kind, but it does lend support to the controversial view of Jerry Fodor and Thomas Nagel that evolutionists need to look beyond the constraints of Neo-Darwinism.
-
Where does the Born Rule come from? We ask: “What is the simplest extension of probability theory where the Born rule appears”? This is answered by introducing “superposition events” in addition to the usual discrete events. Two-dimensional matrices (e.g., incidence matrices and density matrices) are needed to mathematically represent the differences between the two types of events. Then it is shown that those incidence and density matrices for superposition events are the (outer) products of a vector and its transpose whose components foreshadow the “amplitudes” of quantum mechanics. The squares of the components of those “amplitude” vectors yield the probabilities of the outcomes. That is how probability amplitudes and the Born Rule arise in the minimal extension of probability theory to include superposition events. This naturally extends to the full Born Rule in the Hilbert spaces over the complex numbers of quantum mechanics. It would perhaps be satisfying if probability amplitudes and the Born Rule only arose as the result of deep results in quantum mechanics (e.g., Gleason’s Theorem). But both arise in a simple extension of probability theory to include “superposition events”–which should not be too surprising since superposition is the key non-classical concept in quantum mechanics.
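The construction the abstract describes can be illustrated numerically. The sketch below is my own (the variable names are not the paper's): a "superposition event" is represented by a real amplitude vector, its density matrix is the outer product of that vector with its transpose, and the squared components of the vector recover the outcome probabilities, i.e. the Born rule in its minimal real-valued form.

```python
# Minimal numerical sketch of amplitudes and the Born rule in the real case.
# Amplitude vector for a superposition of two outcomes, normalized so that
# the squared components sum to 1.
amps = [3 / 5, 4 / 5]
assert abs(sum(a * a for a in amps) - 1.0) < 1e-12

# Density matrix as the outer product amps * amps^T.
rho = [[ai * aj for aj in amps] for ai in amps]

# Born rule: outcome probabilities are the diagonal entries of rho,
# i.e. the squares of the amplitudes.
probs = [rho[i][i] for i in range(len(amps))]
print(probs)  # approximately [0.36, 0.64]
```

The off-diagonal entries of rho are what distinguish a superposition event from a classical mixture with the same diagonal; the full complex Hilbert-space Born rule extends this pattern.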
-
In some games like Mafia, uttering falsehoods is a part of the game mechanic. These falsehoods are no more lies than falsehoods uttered by an actor in a performance are lies. Now consider a variant of poker where a player is permitted to utter falsehoods when and only when they have a Joker in hand. …
-
Edith Landmann-Kalischer (1877–1951) is the author of several
significant studies on topics in the philosophy of art, aesthetics,
value, mind, and knowledge in the first half of the twentieth century. Influenced by Franz Brentano, Georg Simmel, Carl Stumpf, and Stefan
George, her studies were initiated at a time when the academic, often
tendentious borders between psychology and philosophy, like those
between aesthetics and art history, were still being drawn. While
clearly also influenced by Edmund Husserl, she takes his phenomenology
to task for its idealism and, in her view, its unfounded isolation
from the sciences, especially psychology.
-
I have for a long time inclined towards ifthenism in mathematics: the idea that mathematics discovers truths of the form “If these axioms are true, then this thing is true as well.”
Two things have weakened my inclination to ifthenism. …
-
I've been exploring in this newsletter recently how people's growing inability to understand and control the institutions that shape their lives affects their political views (see here or here for instance). …
-
The critic can seem an impotent spectator, tossing in smiles and frowns from his seat while in the arena the Artist strains and strides, creating something new. But some critics, with their pen and their patronage, succeed in steering art’s development, making and breaking careers along the way. …
-
Brian Leftow’s 2022 book, Anselm’s Argument: Divine Necessity is an impressively thorough discussion of Anselmian modal metaphysics, centred around what he takes to be Anselm’s strongest “argument from perfection” (Leftow’s preferred term for an Ontological Argument). This is not the famous argument from Proslogion 2, nor even the modal argument that some have claimed to find in Proslogion 3, but rather, an argument from Anselm’s Reply to Gaunilo, expressed in the following quotation: “If … something than which no greater can be thought … existed, neither actually nor in the mind could it not exist. Otherwise it would not be something than which no greater can be thought. But whatever can be thought to exist and does not exist, if it existed, would be able actually or in the mind not to exist. For this reason, if it can be thought, it cannot not exist.” (p. 66) Before turning to this argument, Leftow offers an extended and closely-argued case for understanding Anselm’s modality in terms of absolute necessity and possibility, with a metaphysical foundation on powers as argued for at length (575 pages) in his 2012 book God and Necessity. After presenting this interpretation in Chapter 1, Leftow’s second chapter discusses various theological applications (such as the fixity of the past, God’s veracity, and immortality), addressing them in a way that both expounds and defends what he takes to be Anselm’s approach. Then in Chapter 3 Leftow addresses certain problems, for both his philosophical and interpretative claims, while Chapter 4 spells out the key Anselmian argument, together with Leftow’s suggested improvements. Chapter 5 explains how the argument depends on Brouwer’s system of modal logic, and defends this while also endorsing the more standard and comprehensive system S5.
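For readers unfamiliar with the systems named in Chapter 5, the standard textbook axiomatizations (my gloss, not a quotation from Leftow) are as follows: the Brouwer system B extends the basic system K (axiom K plus necessitation) with axioms T and B, while S5 extends K with T and 5; every theorem of B is a theorem of S5, but not conversely.

```latex
\begin{align*}
\text{(K)} &\quad \Box(p \rightarrow q) \rightarrow (\Box p \rightarrow \Box q)\\
\text{(T)} &\quad \Box p \rightarrow p\\
\text{(B)} &\quad p \rightarrow \Box\Diamond p\\
\text{(5)} &\quad \Diamond p \rightarrow \Box\Diamond p
\end{align*}
```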
-
In a recent TLS, I wrote about the spoils of pessimism—whether we should be quietists, retreating from the world, or activists who fight for it—but my real subject was despair. I did not get to write about the best book on despair I’ve read: Christian Wiman’s prose-poetic Zero at the Bone. …
-
One way of defining life is via a real definition, which gives the essence of life. Another approach is an operational definition, which shows how living things can be tested or measured in a way that is distinctive of the biological. Although I give a real definition elsewhere, in this paper I provide an operational definition, echoing Canguilhem’s dictum that life is what is capable of making mistakes. Biological mistakes are central to the behaviour of organisms, their parts and sub-systems, and the collections to which they belong. I provide an informal definition of a biological mistake. I contrast mistakes with mere failures and malfunctions. Although closely related phenomena, each is distinct. After giving some brief examples of mistake-making and how it can be tested, I reply to some objections to the very idea of a biological mistake.
-
At the start of the pandemic, Peter Singer and I argued that our top priority should be to learn more, fast. I feel similarly about AI, today. I’m far from an expert on the topic, so the main things I want to do in this post are to (i) share some resources that I’ve found helpful as a novice starting to learn more about the topic over the past couple months, and (ii) invite others to do likewise! …
-
Let us say that a being is omnisubjective if it has a perfect first-person grasp of all subjective states (including belief states). The question of whether God is omnisubjective raises a nest of thorny issues in the philosophy of language, philosophy of mind, and metaphysics, at least if there are irreducibly subjective states. There are notorious difficulties analyzing the core traditional divine attributes—omniscience, omnipotence, and omnibenevolence—but those difficulties are notorious partly because we seem to have a decent pre-theoretic grasp of what it means for something to be all-knowing, all-powerful, and all-good, and so it is surprising, frustrating, and perplexing that it is so difficult to provide a satisfactory analysis of those notions.
-
What is it for an argument to be successful? Some take success to be mind-independent, holding that successful arguments are those that meet some objective criterion such as soundness. Others take success to be dialectical, holding that successful arguments are those that would convince anyone meeting certain (perhaps idealized) conditions, or perhaps some targeted audience meeting those conditions. I defend a set of desiderata for theories of success, and argue that no objective or dialectical theory meets those desiderata. Instead, I argue, success is individualistic: arguments can only (plausibly) be evaluated as successes (qua argument) relative to individuals. In particular, I defend The Knowledge Account, according to which an argument A is successful for individual i iff i knows A is sound and non-fallacious. This conception of success is a significant departure from orthodoxy and has interesting and unexplored philosophical and methodological implications for the evaluation of arguments.
-
The theoretical developments that led to supersymmetry – first global and then local – over a period of about six years (1970/71-1976) emerged from a confluence of physical insights and mathematical methods drawn from diverse, and sometimes independent, research directions. Despite these varied origins, a common thread united them all: the pursuit of a unity in physics, grounded in the central role of symmetry, where “symmetry” is understood in terms of group theory, representation theory, algebra, and differential geometry.
-
Scientific fields frequently need to exchange data to advance their own inquiries. Data unification is the process of stabilizing these forms of interfield data exchange. I present an account of the epistemic structure of data unification, drawing on case studies from model-based cognitive neuroscience (MBCN). MBCN is distinctive because it shows that modeling practices play an essential role in mediating these data exchanges. Models often serve as interfield evidential integrators, and models built for this purpose have their own representational and inferential functions. This form of data unification should be seen as autonomous from other forms, particularly explanatory unification.
-
According to classical utilitarianism, well-being consists in pleasure or happiness, the good consists in the sum of well-being, and moral rightness consists in maximizing the good. Leibniz was perhaps the first to formulate this doctrine. Bentham made it widely known. For a long time, however, the second, summing part lacked any clear foundation. John Stuart Mill, Henry Sidgwick, and Richard Hare all gave arguments for utilitarianism, but they took this summing part for granted. It was John Harsanyi who finally presented compelling arguments for this controversial part of the utilitarian doctrine.
-
Scientists do not merely choose to accept fully formed theories, they also have to decide which models to work on before they are fully developed and tested. Since decisive empirical evidence in favour of a model will not yet have been gathered, other criteria must play determining roles. I examine the case of modern high-energy physics where the experimental context that once favoured the pursuit of beautiful, simple, and general theories now favours the pursuit of models that are ad hoc, narrow in scope, and complex; in short, ugly models. The lack of new discoveries since the Higgs boson, together with the unlikeliness of a new higher energy collider, has left searches for new physics conceptually and empirically wide open. Physicists must make use of the experiment at hand while also creatively exploring as-yet-untried alternatives. This encourages the pursuit of models that have at least one of two key features: i) they take radically novel approaches, or ii) they are easily testable. I present three models, neutralino dark matter, the relaxion, and repulsive gravity, and show that even if they do not exhibit traditional epistemic virtues, they are nonetheless pursuitworthy. I argue that experimental context strongly determines pursuitworthiness and I lay out the conditions under which experiment encourages the pursuit of ugly models.
-
[Editor’s Note: The following new entry by Juliana
Bidadanure and David Axelsen replaces the
former entry
on this topic by the previous author.]
Egalitarianism is a school of thought in contemporary political
philosophy that treats equality as the chief value of a just political
system. Simply put, egalitarians argue for equality. They
have a presumption in favor of social arrangements that advance
equality, and they treat deviations from equality as prima
facie suspect. They recommend a far greater degree of equality
than we currently have, and they do so for distinctly egalitarian
reasons.
-
Probabilities play an essential role in the prediction and explanation of events and thus feature prominently in well-confirmed scientific theories. However, such probabilities are frequently described as subjective, epistemic, or both. This prompts a well-known puzzle: how could scientific posits that predict and explain human-independent events essentially involve agents or knowers? I argue that the puzzle can be resolved by acknowledging that although such probabilities are non-fundamental, they may still be ontic and objective. To this end I describe dynamical mechanisms that are responsible for the convergence of probability distributions for chaotic systems, and apply an account of emergence developed elsewhere. I suggest that this analysis will generalise and claim that, consequently, a great many of the probabilities in science should be characterised in the same terms. Along the way I’ll defend a particular definition of chaos that suits the emergence analysis.
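The convergence the abstract appeals to can be illustrated with a toy model (my construction, not one of the paper's case studies): under the chaotic logistic map x → 4x(1 − x), an ensemble of initial conditions concentrated in a small interval rapidly forgets its starting distribution and settles onto the map's invariant (arcsine) density, whose mean is 1/2.

```python
# Toy illustration of distributional convergence for a chaotic system.
import random

random.seed(0)
# Start far from the invariant density: all points bunched in [0.2, 0.3].
points = [random.uniform(0.2, 0.3) for _ in range(20000)]

# Iterate the chaotic logistic map; mixing spreads the ensemble out.
for _ in range(50):
    points = [4.0 * x * (1.0 - x) for x in points]

# The ensemble mean settles near 0.5, the mean of the invariant
# arcsine density 1 / (pi * sqrt(x * (1 - x))) on (0, 1).
mean = sum(points) / len(points)
```

The point of the sketch is that the limiting distribution is fixed by the dynamics, not by the (epistemic) choice of initial ensemble, which is the sense in which such probabilities can be ontic yet non-fundamental.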