-
785093.517818
Scientific realism is the philosophical stance that science tracks truth, in particular in its depiction of the world’s ontology. Ontologically, this involves a commitment to the existence of entities posited by our best scientific theories; metaontologically, it includes the claim that the theoretical framework itself is true. In this article, we examine wave function realism as a case study within this broader methodological debate. Wave function realism holds that the wave function, as described by quantum mechanics, corresponds to a real physical entity. We focus on a recent formulation of this view that commits to the ontology of the wave function while deliberately avoiding the metaontological question of the framework’s truth. Instead, the view is defended on pragmatic, non-truth-conducive grounds. This, we argue, raises tensions for the purported realism of wave function realism and its compatibility with scientific realism more broadly.
-
785129.517921
There is a tendency in the philosophy of science to present the scientist as a ghostly being that just has degrees of belief in various descriptive statements, which are adjusted according to some rules of rational thinking (e.g. Bayes’ theorem) . . . We need a more serious understanding of scientists as agents, not as passive receivers of information or algorithmic processors of propositions . . .
-
800085.51794
Did you know that Lawvere did classified work on arms control in the 1960s, back when he was writing his thesis? Did you know that the French government offered him a job in military intelligence? The following paper should be interesting to applied category theorists—for a couple of different reasons:
• Bill Lawvere, The category of probabilistic mappings with applications to stochastic processes, statistics, and pattern recognition, Spring 1962, featuring Lawvere’s abstract and author commentary from 2020, reformatted for Lawvere Archive Posthumous Publications by Tobias Fritz, July 14, 2025. …
-
800086.517954
As you may recall, Matthew Adelstein uses r-K selection theory to argue that the average bug’s life is not worth living. Quick version: Humans have a few offspring, who typically receive immense parental investment. …
-
820209.517966
Picture a playground on a sunny day, bustling with excited children. One falls and scratches her knee. Cries of distress draw the concern of a new friend. A few breaths later, she’s back on her feet with a big grin, ready for the next adventure. …
-
837306.517978
Social authorities claim that we are obliged to obey their commands and they also claim the right to enforce them should we refuse. Many liberals (amongst others) insist that these claims hold water only when those subject to such an authority have agreed to obey it. Thus, according to classical liberals, people are subject to the authority of the state only if they have (in some sense) consented to its rule. Grounds for scepticism about a consent-based theory of political authority are no less familiar. Though ‘consent’ can mean different things, it is often observed that there is no form of consent which could both (a) validate political authority and (b) plausibly be attributed to most of the population of either past or present states.
-
839224.518018
PEA Soup is pleased to introduce the July Ethics article discussion on “Gender, Gender Expression, and the Dilemma of the Body” by Katie Zhou (MIT). The précis is from Cressida Heyes (University of Alberta). …
-
877067.518037
Aristotle had a famous argument that time had no beginning or end. In the case of beginnings, this argument caused immense philosophical suffering in the Middle Ages, since combined with the idea that time requires change it implies that the universe was eternal, contrary to the Jewish, Muslim and Christian view that God created the universe a finite amount of time ago. …
-
893015.51805
There are two parts of Aristotle’s theory that are hard to fit together. First, we have Aristotle’s view of future contingents, on which
- It is neither true nor false that tomorrow there will be a sea battle
but, of course:
- It is true that tomorrow there will be a sea battle or no sea battle. …
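One standard way to see how both claims might hold together is supervaluationism; the toy model below is my own illustrative sketch (not part of the excerpt), on which a sentence counts as true only if it holds in every possible future and false only if it holds in none.

```python
# Toy supervaluationist model of future contingents (illustrative sketch only).
possible_futures = [{"sea_battle": True}, {"sea_battle": False}]

def supervaluate(sentence):
    """'true' if the sentence holds in every possible future,
    'false' if it holds in none, 'neither' otherwise."""
    values = [sentence(future) for future in possible_futures]
    if all(values):
        return "true"
    if not any(values):
        return "false"
    return "neither"

battle = lambda f: f["sea_battle"]                                 # "Tomorrow there will be a sea battle"
battle_or_not = lambda f: f["sea_battle"] or not f["sea_battle"]   # the disjunction in the second claim

print(supervaluate(battle))         # 'neither': not true and not false
print(supervaluate(battle_or_not))  # 'true': it holds however the future goes
```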
-
898546.518062
The paper proposes and studies new classical, type-free theories of truth and determinateness with unprecedented features. The theories are fully compositional, strongly classical (namely, their internal and external logics are both classical), and feature a defined determinateness predicate satisfying desirable and widely agreed principles. The theories capture a conception of truth and determinateness according to which the generalizing power associated with the classicality and full compositionality of truth is combined with the identification of a natural class of sentences – the determinate ones – for which clear-cut semantic rules are available. Our theories can also be seen as the classical closures of Kripke-Feferman truth: their ω-models, which we precisely pin down, result from including in the extension of the truth predicate the sentences that are satisfied by a Kripkean closed-off fixed point model. We compare our theories to recent theories proposed by Fujimoto and Halbach, which feature a primitive determinateness predicate. In the paper we show that our theories entail all principles of Fujimoto and Halbach’s theories, and are proof-theoretically equivalent to Fujimoto and Halbach’s CD. We also establish some negative results on Fujimoto and Halbach’s theories: such results show that, unlike what happens in our theories, the primitive determinateness predicate prevents one from establishing clear and unrestricted semantic rules for the language with type-free truth.
-
900438.518073
In contemporary philosophy of physics, there has recently been a renewed interest in the theory of geometric objects—a programme developed originally by geometers such as Schouten, Veblen, and others in the 1920s and 30s. However, as yet, there has been little to no systematic investigation into the history of the geometric object concept. I discuss the early development of the geometric object concept, and show that geometers working on the programme in the 1920s and early 1930s had a more expansive conception of geometric objects than that found in later presentations—one which, unlike the modern conception, included embedded submanifolds such as points, curves, and hypersurfaces. I reconstruct and critically evaluate their arguments for this more expansive geometric object concept, and also locate and assess the transition to the more restrictive modern geometric object concept.
-
900465.518084
Chronogeometry is often conceived as a necessary condition for spatiotemporality, yet many theories of quantum gravity (QG) seem to challenge it. Applications of noncommutative geometry (NCG) to QG propose that spacetime exhibits noncommutative features at or beyond the Planck scale, thereby replacing relativistic symmetries with their deformations, known as quantum groups. This leads to an algebraic formulation of noncommutative structure that postulates a minimal length scale and deforms relativistic (commutative) physics, raising questions about whether noncommutative theories preserve spatiotemporal content, and specifically, chronogeometry. I argue that noncommutative approaches can satisfy an appropriate definition of chronogeometry, thus attaining physical significance within QG. In particular, I contend that noncommutativity is compatible with chronogeometricity, using κ-Minkowski spacetime as a case study in NCG. In this algebraic setting, physical interpretation hinges on two crucial elements: a representation of the noncommutative algebra and a corresponding set of observers. I show how this framework enables the algebra to encode localisation procedures for events in noncommutative spacetime, relative to a noncommutative reference frame, with frame transformations governed by the quantum group structure. By enriching the theory with noncommutative reference frames, NCG can satisfy the necessary representational principles to support chronogeometric content.
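For readers unfamiliar with the case study, a standard presentation of the κ-Minkowski coordinate algebra (my own gloss, with conventions that may differ from the paper’s) deforms only the time–space commutators:
$$[\hat{x}^{0}, \hat{x}^{j}] = \frac{i}{\kappa}\,\hat{x}^{j}, \qquad [\hat{x}^{j}, \hat{x}^{k}] = 0, \qquad j,k = 1,\dots,d,$$
where κ is a deformation parameter with dimensions of inverse length (often associated with the Planck scale); in the commutative limit κ → ∞ one recovers ordinary Minkowski coordinates, and the relativistic symmetries are correspondingly deformed into the κ-Poincaré quantum group.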
-
934384.518095
Many of the theories found in contemporary high-energy physics are gauge theories. The theory of the electromagnetic force is a gauge theory, as are the theories of the weak and strong nuclear forces. Philosophers disagree about which other theories are gauge theories, but they generally agree that gauge theories present distinctive puzzles concerning mathematical representation. Philosophical discussion of gauge theories has focused on these puzzles alongside the metaphysical and epistemological consequences of the fact that gauge theories feature centrally in theories of the fundamental physical forces.
-
934402.518106
The concept of preference spans numerous research fields, resulting in diverse perspectives on the topic. Preference logic specifically focuses on reasoning about preferences when comparing objects, situations, actions, and more, by examining their formal properties. This entry surveys major developments in preference logic to date. Section 2 provides a historical overview, beginning with foundational work by Halldén and von Wright, who emphasized the syntactic aspects of preference. In Section 3, early semantic contributions by Rescher and Van Dalen are introduced. The consideration of preference relations over possible worlds naturally gives rise to modal preference logic, where preference lifting enables comparisons across sets of possible worlds.
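As a hedged illustration of the preference lifting mentioned above (a toy sketch of my own, not drawn from the entry), one common ∀∃ lifting counts a proposition ψ as weakly preferred to φ when every φ-world is matched by at least as good a ψ-world:

```python
# Toy model of lifting a preference order on worlds to propositions (sets of worlds).
value = {"w1": 3, "w2": 2, "w3": 1}   # betterness order on worlds: higher is better

def better_or_equal(w, v):
    """World w is at least as good as world v."""
    return value[w] >= value[v]

def lifted_pref(psi, phi):
    """Forall-exists lifting: psi is weakly preferred to phi iff every phi-world
    is matched by some psi-world that is at least as good."""
    return all(any(better_or_equal(w, v) for w in psi) for v in phi)

phi = {"w2", "w3"}   # proposition true at w2 and w3
psi = {"w1", "w3"}   # proposition true at w1 and w3

print(lifted_pref(psi, phi))  # True: every phi-world has an at-least-as-good psi-world
print(lifted_pref(phi, psi))  # False: no phi-world is at least as good as w1
```

Other liftings (∀∀, ∃∃, ∃∀) yield different modal preference operators; which one is appropriate depends on the application.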
-
943944.518117
I’ve explained a cool way to treat bound states of the hydrogen atom as wavefunctions on a sphere in 4-dimensional space. But so far I’ve been neglecting the electron’s spin. Now let’s throw that in too! …
-
952649.518129
It is uncontroversial that humanistic thought and scientific inquiry have been entangled throughout a very long arc of intellectual history. Beyond this, however, significant challenges await anyone hoping to understand, let alone articulate, the nature of these entanglements. Since ‘science’ and ‘humanism’ are labels that are commonly applied to traditions of theorizing and practice that predate the 18th and 19th century introduction and use of these terms in their modern senses, respectively, and since both of these traditions have evolved and speciated a great deal from antiquity to the present, any attempt to untangle the many complex relationships between them amounts to a formidable task.
-
958108.518141
We ask how and why mathematical physics may be seen as a rigorous discipline. Starting with Newton but drawing on a philosophical tradition ranging from Aristotle to (late) Wittgenstein, we argue that, as in mathematics, rigour ultimately comes from rules. These include logical rules of inference as well as definitions that give a precise meaning to physical concepts such as space and time by providing rules governing their use in models of the theories in which they are defined. In particular, so-called implicit definitions characterize “indefinables” whose traditionally assumed familiarity through “intuition” or “acquaintance” from Aristotle down to Russell blasts any hope of both rigour and innovation. Given the basic physical concepts, one may subsequently define derived concepts (like black holes or determinism). Definitions are seen as a priori meaning-constitutive conventions that are neither necessary à la Kant nor arbitrary à la Carnap, as they originate in empirical science as well as in the autonomous development of mathematics and physics. As such, definitions are best seen as hypothetical.
-
958179.518152
According to the stochastic-quantum correspondence, a quantum system can be understood as a stochastic process unfolding in an old-fashioned configuration space based on ordinary notions of probability and ‘indivisible’ stochastic laws, which are a non-Markovian generalization of the laws that describe a textbook stochastic process. The Hilbert spaces of quantum theory and their ingredients, including wave functions, can then be relegated to secondary roles as convenient mathematical appurtenances. In addition to providing an arguably more transparent way to understand and modify quantum theory, this indivisible-stochastic formulation may lead to new possible applications of the theory. This paper initiates a deeper investigation into the conceptual foundations and structure of the stochastic-quantum correspondence, with a particular focus on novel forms of gauge invariance, dynamical symmetries, and Hilbert-space dilations.
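A minimal numerical sketch of the flavour of the correspondence (my own illustration with a hypothetical two-level Hamiltonian, not taken from the paper): transition probabilities Γ_ij(t) = |U_ij(t)|² form a valid stochastic matrix from the initial time to any time t, but the family is generally not divisible into Markovian steps.

```python
# 'Indivisible' stochastic transition matrices obtained from a unitary (illustrative sketch).
import numpy as np
from scipy.linalg import expm

H = np.array([[1.0, 0.7],
              [0.7, -0.3]])           # hypothetical 2-level Hamiltonian

def gamma(t):
    """Gamma_ij(t) = |U_ij(t)|^2: a column-stochastic matrix taking probabilities
    at time 0 directly to probabilities at time t."""
    U = expm(-1j * H * t)
    return np.abs(U) ** 2

t, s = 1.0, 0.4
print(np.allclose(gamma(t).sum(axis=0), 1.0))          # True: each column sums to 1
print(np.allclose(gamma(t), gamma(t - s) @ gamma(s)))  # typically False: no Markovian division
```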
-
958203.518165
Critics of ambivalence see it as something of inherent disvalue: a sign of poorly functioning agency. This chapter challenges that assumption, outlining the potential benefits of ambivalence for well-functioning agency, using criteria of rationality, agential effectiveness, autonomy, and authenticity. Furthermore, by exploring the interplay between philosophical debates on ambivalence and psychological research on suicide, the chapter shows how insights from each field can inform the other. For example, fostering ambivalence, rather than eliminating it, can sometimes support more effective suicide interventions, while ambivalence alone should not be seen as a marker of deficient agency and thus as justification for paternalistic measures.
-
970790.51818
On all-false open future (AFOF), future contingent claims are all false. The standard way to define “Will p” is to say that p is true in all possible futures. But defining a possible future is difficult. …
-
1015879.518192
The literature on values in science contains countless claims to the effect that a particular type of scientific choice is or is not value-laden. This chapter exposes an ambiguity in the notion of a value-laden choice. In the first half, I distinguish four ways a choice can be said to be value-laden. In the second half, I illustrate the usefulness of this taxonomy by assessing arguments about whether the value-ladenness of science is inevitable. I focus on the “randomizer reply,” which claims that, in principle, scientists could always avoid value-laden choices by flipping a coin.
-
1069990.518203
Accuracy plays an important role in the deployment of machine learning algorithms. But accuracy is not the only epistemic property that matters. For instance, it is well-known that algorithms may perform accurately during their training phase but experience a significant drop in performance when deployed in real-world conditions. To address this gap, people have turned to the concept of algorithmic robustness. Roughly, robustness refers to an algorithm’s ability to maintain its performance across a range of real-world and hypothetical conditions. In this paper, we develop a rigorous account of algorithmic robustness grounded in Robert Nozick’s counterfactual sensitivity and adherence conditions for knowledge. By bridging insights from epistemology and machine learning, we offer a novel conceptualization of robustness that captures key instances of algorithmic brittleness while advancing discussions on reliable AI deployment. We also show how a sensitivity-based account of robustness provides notable advantages over related approaches to algorithmic brittleness, including causal and safety-based ones.
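To make the adherence idea concrete, here is a toy sketch of my own (not the paper’s formal account): an algorithm counts as robust, in this rough sense, only if its accuracy stays above a threshold across a neighbourhood of perturbed deployment conditions, not just on the nominal data. All names and thresholds are illustrative.

```python
# Toy adherence-style robustness check for a classifier (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def accuracy(predict, X, y):
    return float(np.mean(predict(X) == y))

def robust_under_perturbations(predict, X, y, noise_levels, threshold=0.9, trials=20):
    """Does accuracy stay above `threshold` in nearby (perturbed) conditions?"""
    for sigma in noise_levels:
        accs = [accuracy(predict, X + rng.normal(0, sigma, X.shape), y) for _ in range(trials)]
        if min(accs) < threshold:
            return False   # a nearby condition where performance collapses: brittle
    return True

# Hypothetical example: a simple threshold classifier on 1-D data.
X = rng.normal(0, 1, size=(500, 1))
y = (X[:, 0] > 0).astype(int)
predict = lambda X: (X[:, 0] > 0).astype(int)

print(accuracy(predict, X, y))                                      # 1.0 on the nominal data
print(robust_under_perturbations(predict, X, y, [0.05, 0.2, 0.5]))  # typically False: brittle at the largest noise level
```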
-
1131150.518214
Why are quantum correlations so puzzling? A standard answer is that they seem to require either nonlocal influences or conspiratorial coincidences. This suggests that by embracing nonlocal influences we can avoid conspiratorial fine-tuning. But that’s not entirely true. Recent work, leveraging the framework of graphical causal models, shows that even with nonlocal influences, a kind of fine-tuning is needed to recover quantum correlations. This fine-tuning arises because the world has to be just so as to disable the use of nonlocal influences to signal, as required by the no-signaling theorem. This places an extra burden on theories that posit nonlocal influences, such as Bohmian mechanics, of explaining why such influences are inaccessible to causal control. I argue that Everettian Quantum Mechanics suffers no such burden. Not only does it not posit nonlocal influences, it operates outside the causal models framework that was presupposed in raising the fine-tuning worry. Specifically, it represents subsystems with density matrices instead of random variables. This allows it to sidestep all the results (including EPR and Bell) that put quantum correlations in tension with causal models. However, this doesn’t mean one must abandon causal reasoning altogether in a quantum world. After all, quantum systems can clearly stand in causal relations. When decoherence is rampant and there’s no controlled entanglement, Everettian Quantum Mechanics licenses our continued use of standard causal models. When controlled entanglement is present—such as in Bell-type experiments—we can employ recently proposed quantum causal models that are consistent with Everettian Quantum Mechanics. We never need invoke any kind of nonlocal influence or any kind of fine-tuning.
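As a hedged illustration of the constraint driving the fine-tuning worry (my own sketch, not from the paper): the singlet-state correlations violate the CHSH bound, yet each party’s marginal statistics are exactly independent of the distant setting, which is the no-signaling property that a nonlocal causal story must somehow enforce.

```python
# Singlet correlations: CHSH violation together with exact no-signaling marginals.
import numpy as np

def joint_prob(a, b, x, y):
    """P(a, b | x, y) for the singlet state with analyzer angles x (Alice) and y (Bob);
    outcomes a, b in {+1, -1}. Standard textbook formula."""
    return 0.25 * (1 - a * b * np.cos(x - y))

def correlator(x, y):
    return sum(a * b * joint_prob(a, b, x, y) for a in (1, -1) for b in (1, -1))

# CHSH with the standard optimal angles: |S| = 2*sqrt(2) > 2.
x0, x1, y0, y1 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = correlator(x0, y0) + correlator(x0, y1) + correlator(x1, y0) - correlator(x1, y1)
print(abs(S))  # ~2.828

# No-signaling: Alice's marginal is 1/2 whatever Bob's setting is.
marginal_a = lambda a, x, y: sum(joint_prob(a, b, x, y) for b in (1, -1))
print(marginal_a(+1, x0, y0), marginal_a(+1, x0, y1))  # 0.5 0.5
```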
-
1131177.518226
Feynman diagrams are used to calculate scattering amplitudes in quantum field theory, where they simplify the derivation of individual terms in the corresponding perturbation series. Because the diagrams are considered mathematical tools with an approximative character, the received view in the philosophy of physics denies that individual diagrams can represent physical processes. A different story, however, can be observed in physics practice. From education to high-profile research publications, Feynman diagrams are used in connection with particle phenomena without any reference to perturbative calculations. In the first part of the paper, I argue that this illuminates an additional use of Feynman diagrams that is not calculatory but representational. It is not a possible translation into mathematical terms that prompts this practice but rather the epistemic insights into the target phenomenon that the diagrams provide. Based on this practical use, I intend to push back against the received view. In the second part of the paper, I conceptualize the representational use of Feynman diagrams as models that provide modal understanding of their associated target phenomena. The set of Feynman diagrams corresponding to an interaction is taken as a possibility space whose dependency relations can be analysed, allowing an agent to grasp possible target behaviour, leading to understanding. By clearly separating the diagrams from perturbative calculations in their use as a model, the concerns that hinder a representational reading can be resolved.
-
1131203.51824
We take a fresh look at Daniel Dennett’s naturalist legacy in philosophy, focusing on his rethinking of philosophical methods. Critics sometimes mistake Dennett for promoting a crude naturalism or dismissing philosophical tools like first-person intuition. We present his approach as more methodologically radical, blending science and philosophy in a way that treats inquiry as an evolving process. Concepts and intuitions are tested and adjusted in light of empirical findings and broader epistemic aims. For Dennett, science isn’t a limitation on philosophy, but a tool that sharpens it, with empirical data helping to refine our understanding both of concepts and philosophical phenomena alike. By exploring Dennett’s methodological contributions, we underscore the ongoing importance of his naturalist perspective in today’s philosophical landscape.
-
1131272.518268
In this paper, we argue that a perceiver’s contributions to perception can substantially affect what objects are represented in perceptual experience. To capture the scalar nature of these perceiver-contingent contributions, we introduce three grades of subject-dependency in object perception. The first grade, “weak subject-dependency,” concerns attentional changes to perceptual content, as when a perceiver turns their head, plugs their ears, or primes their attention to a particular cue. The second grade, “moderate subject-dependency,” concerns changes in the contingent features of perceptual objects due to action-orientation, location, and agential interest. For instance, being to the right or left of an object will cause the object to have a corresponding locative feature, but that feature is non-essential to the object in question. Finally, the third grade, “strong subject-dependency,” concerns generating perceptual objects whose existence depends upon their perceivers’ sensory contributions to perception. For this final grade of subject-dependency, the adaptive perceptual system shapes diverse representations of sensory information by contributing necessary features to perceptual objects. To exemplify this nonstandard form of object perception we offer evidence from the future-directed anticipation of perceptual experts, and from the feature binding of synesthetes. We conclude that strongly subject-dependent perceptual objects are more than mere material objects, but are rather a necessary combination of material objects with the contributions of a perceiving subject.
-
1182465.518281
Pedants complain that the word “literally” is more often misused than used correctly. “This post will literally blow your mind! Your brain will literally explode!” “Literally?” they exclaim. “Then I had better stop reading.”
But the pedants are not pedantic enough. …
-
1205191.518292
In Part 4 we saw that the classical Kepler problem—the problem of a single classical particle in an inverse square force—has symmetry under the group of rotations of 4-dimensional space, $\mathrm{SO}(4)$. Since the Lie algebra of this group is $\mathfrak{so}(4) \cong \mathfrak{so}(3) \oplus \mathfrak{so}(3)$, we must have conserved quantities $\vec{A}$ and $\vec{B}$ corresponding to these two copies of $\mathfrak{so}(3)$. The physical meaning of these quantities is a bit obscure until we form linear combinations
$$\vec{L} = \vec{A} + \vec{B}, \qquad \vec{M} = \vec{A} - \vec{B}.$$
Then $\vec{L}$ is the angular momentum of the particle, while $\vec{M}$ is a subtler conserved quantity: it’s the eccentricity vector of the particle divided by $\sqrt{-2H}$, where the energy $H$ is negative for bound states (that is, elliptical orbits).
The advantage of working with $\vec{A}$ and $\vec{B}$ is that these quantities have very nice Poisson brackets:
$$\{A_i, A_j\} = \epsilon_{ijk} A_k, \qquad \{B_i, B_j\} = \epsilon_{ijk} B_k, \qquad \{A_i, B_j\} = 0.$$
This says they generate two commuting symmetries. …
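A quick numerical sanity check of these conservation claims (my own sketch, with units chosen so that the mass and force constant are 1, not code from the post): integrate a bound orbit and confirm that the angular momentum and the eccentricity vector stay constant.

```python
# Kepler problem with m = k = 1: check conservation of L = q x p and
# the eccentricity vector e = p x L - q/|q| along a bound orbit.
import numpy as np
from scipy.integrate import solve_ivp

def kepler_rhs(t, state):
    q, p = state[:3], state[3:]
    return np.concatenate([p, -q / np.linalg.norm(q) ** 3])   # dq/dt = p, dp/dt = -q/|q|^3

def conserved(state):
    q, p = state[:3], state[3:]
    L = np.cross(q, p)                            # angular momentum
    e = np.cross(p, L) - q / np.linalg.norm(q)    # eccentricity (Laplace-Runge-Lenz) vector
    return L, e

state0 = np.array([1.0, 0.0, 0.0, 0.0, 0.8, 0.0])   # energy 0.32 - 1 < 0: a bound, elliptical orbit
sol = solve_ivp(kepler_rhs, (0.0, 20.0), state0, rtol=1e-10, atol=1e-12)

L0, e0 = conserved(sol.y[:, 0])
L1, e1 = conserved(sol.y[:, -1])
print(np.max(np.abs(L1 - L0)), np.max(np.abs(e1 - e0)))   # both tiny: conserved up to integration error
```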
-
1239215.518304
1. Should the state get out of the marriage business? Would it be better if “personal relationships are regulated, the vulnerable are protected, and justice is furthered, all without the state recognition of marriage or any similar alternative”? …
-
1246602.518314
We revisit Einstein’s 1927 thought experiment on electron diffraction, using a single-electron source and an opaque hemispheric detector array, now achievable with modern sensors (~0.1 ns). In this fully enclosed system, where no signals escape the hemisphere, we provide a direct empirical comparison of the Many-Worlds Interpretation (MWI) and the Branched Hilbert Subspace Interpretation (BHSI). Both maintain unitarity without invoking the wavefunction collapse of the Copenhagen Interpretation (CI), but they differ ontologically: MWI proposes irreversible global branching into parallel worlds, while BHSI describes local, potentially reversible branching into decohered subspaces. In this setup, all quantum events (branching, engagement, disengagement, and relocation) occur entirely within the local system, and the Born rule, naturally emerging through branch weights, can be observed in detector statistics. To explore branching dynamics more thoroughly, we suggest an enhanced dual-layer experimental setup with an inner transparent detector. Because the electron’s transit time between layers (~0.12 ns) is shorter than the average response times of the inner sensors (~1 ns), this setup allows a crucial test of measurement timing and potential anomalies (“delayed” or “uncommitted” choice?). Our analysis challenges the notion that unitarity necessitates parallel worlds, instead advocating for a simpler view: local, unitary branching without collapse or global splitting.
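As a rough plausibility check on the quoted timing (a back-of-the-envelope sketch of my own; the centimetre-scale layer gap and the 20 keV electron energy are hypothetical values not given in the abstract), the inter-layer transit time does come out near a tenth of a nanosecond:

```python
# Back-of-the-envelope transit-time estimate (hypothetical parameters, not from the abstract).
import math

M_E_C2_KEV = 511.0   # electron rest energy in keV
C = 2.998e8          # speed of light in m/s

def electron_speed(kinetic_keV):
    """Relativistic speed of an electron with the given kinetic energy."""
    gamma = 1.0 + kinetic_keV / M_E_C2_KEV
    return C * math.sqrt(1.0 - 1.0 / gamma**2)

gap_m = 0.01          # assumed 1 cm separation between the detector layers
kinetic_keV = 20.0    # assumed electron kinetic energy

t_transit = gap_m / electron_speed(kinetic_keV)
print(f"{t_transit * 1e9:.2f} ns")   # ~0.12 ns, well below ~1 ns sensor response times
```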