-
Existing characterizations of ‘trace’ in the philosophy of the historical sciences agree that traces need to be downstream of the long-past event under investigation. I argue that this misses an important type of trace used in historical reconstructions. Existing characterizations of traces focus on what I propose to call direct traces. What I call circumstantial traces (i) share a common cause with a past event and (ii) allow an inference to said event via an intermediate step. I illustrate the significance of checking the alignment between direct and circumstantial traces in historical reconstructions through a case study from (micro-)palaeontology.
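The contrast can be drawn with a simple causal schema; the sketch below is my own rendering of the two structures, not notation from the paper.

    % Direct trace: the trace T_d lies causally downstream of the past event E.
    % Circumstantial trace: the trace T_c and the event E share a common cause C,
    % so inference to E proceeds via the intermediate step of reconstructing C.
    \[
      E \longrightarrow T_d \quad (\text{direct}), \qquad
      T_c \longleftarrow C \longrightarrow E \quad (\text{circumstantial})
    \]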
-
We propose a framework for the analysis of choice behaviour in which choices are made explicitly in chronological order. We relate this framework to the traditional choice-theoretic setting, from which the chronological aspect is absent, and compare it to other frameworks that extend this traditional setting. Then, we use this framework to analyse various models of preference discovery. We characterise, via simple revealed preference tests, several models that differ in terms of (1) the priors that the decision-maker holds about alternatives and (2) whether the decision-maker chooses period by period or uses her knowledge about future menus to inform her present choices. These results provide novel testable implications for the preference discovery process of myopic and forward-looking agents.
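As a rough illustration of the kind of condition a "simple revealed preference test" on chronological data can check, here is a minimal Python sketch; the no-reversal condition below is my own stand-in, not one of the paper's characterizing axioms.

    def consistent_discovery(observations):
        """Check a simple no-reversal condition on chronological choice data.

        observations: list of (menu, choice) pairs in temporal order, where
        menu is a set of alternatives and choice is the element picked from it.
        Illustrative condition: once x has been chosen from a menu containing
        y, the decision-maker never later picks y from a menu containing x.
        """
        revealed = set()  # (x, y) means "x was revealed preferred to y"
        for menu, choice in observations:
            for y in menu:
                if y != choice and (y, choice) in revealed:
                    return False  # reverses an earlier revealed preference
            revealed.update((choice, y) for y in menu if y != choice)
        return True

    # The agent first picks a over b, but later picks b from a menu containing a:
    data = [({"a", "b"}, "a"), ({"a", "b", "c"}, "b")]
    print(consistent_discovery(data))  # -> False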
-
Consultant Statistician
Edinburgh
Relevant significance? Be careful what you wish for
Despised and Rejected
Scarcely a good word can be had for statistical significance these days. We are admonished (as if we did not know) that just because a null hypothesis has been ‘rejected’ by some statistical test, it does not mean it is not true and thus it does not follow that significance implies a genuine effect of treatment. …
-
We critically examine the revised general relativity (GR) framework proposed by Capozziello, De Bianchi, and Battista [1, 2]. The authors introduce a Lorentzian-Euclidean Schwarzschild metric with a signature change at the event horizon (r = 2M), claiming that radial geodesics halt at r = 2M with infinite proper time, avoiding the singularity at r = 0. We argue that their framework lacks physical justification, producing unphysical dynamics in the Lorentzian region (r > 2M), where the metric is identical to Schwarzschild. Their revisions violate fundamental GR principles—including the equivalence principle, energy conservation, geodesic well-definedness, and consistency with the metric’s geometry—without empirical or theoretical grounding. Notably, their modified energy definition and geodesic equation yield an infinite proper time, contradicting GR’s finite result. We address the potential defense that these violations are expected in a revised GR, demonstrating that their framework’s deviations are ad hoc and undermine its validity as a physically meaningful extension.
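For reference, the textbook result the penultimate sentence appeals to: in standard Schwarzschild spacetime (G = c = 1), radial infall from rest at r_0 reaches the singularity in finite proper time, via the familiar cycloid solution:

    \[
      r = \frac{r_0}{2}\,(1+\cos\eta), \qquad
      \tau = \sqrt{\frac{r_0^{3}}{8M}}\,(\eta + \sin\eta),
    \]
    % so the total proper time from r_0 to the singularity (eta = pi) is
    \[
      \tau_{\text{total}} = \pi \sqrt{\frac{r_0^{3}}{8M}} < \infty,
    \]
    % finite, and nothing singular happens to the infalling clock at r = 2M.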
-
Jody Azzouni (2012b; 2010; 2009; 2004a; 2004b) defends a “deflationary nominalism”: deflationary in that mathematical sentences are true in a non-correspondence sense, and nominalist because mathematical terms—appearing in sentences of scientific theory or otherwise—refer to nothing at all. In this paper, I focus on Azzouni’s positive account of what should be said to exist. Azzouni (2012b: 956) calls his quaternary “sufficient condition” (Azzouni 2004b: 384) for posit existence “thick epistemic access” (hereafter TEA), and in this paper I argue that TEA surreptitiously reifies some mathematical entities. The mathematical entity that I argue TEA reifies is the Fourier harmonic, an infinite-duration sinusoid applied throughout contemporary engineering and physics. The Fourier harmonic exists for the deflationary nominalist, I claim, because the harmonic plays what Azzouni calls an “epistemic role” (see section 2) in the commonplace observation of macroscopic entities, for example in viewing a vase with the human eye. Thus, I present the Fourier harmonic as a counterexample to Azzouni’s nominalism.

More precisely, Azzouni’s deflationism interprets truth as nothing above and beyond the “generalization” expressed by the Tarski biconditional (e.g.): “Snow is white” is true iff snow is white (Azzouni 2010: 19). Hence what redeems that biconditional, in Azzouni’s account, is neither strictly correspondence, nor coherence, nor indispensability of the truth idiom to language. On the other hand, Azzouni rejects truth pluralism (see Azzouni 2010: §§4.7–4.8). The best articulation of Azzouni’s deflationary account of truth in science, mathematics, and applied mathematics may be Azzouni (2009), but see also Azzouni (2010: Chap. 4). The details will not concern me in this paper.
-
Reductive doxastic moral relativism is the view that an action type’s being morally wrong is nothing but an individual’s or a society’s belief that the action type is morally wrong. But this is viciously circular, since we reduce wrongness to a belief about wrongness. …
-
This chapter begins by explaining why it is important to attend to duties when theorizing human rights. It then assesses four constraints on the duties associated with human rights: the constraints of correlativity, ability, agency, and demandingness. Finally, it compares two approaches to the duties associated with human rights: practice-based approaches and naturalistic approaches. It concludes that both approaches successfully produce duties, though neither abides by all four constraints.
-
One way to interpret the difference between presentism and eternalism is perspectivally. On this view, from a perspective outside of time we should adopt eternalism, while from a perspective embedded within time we should adopt presentism. I will use the perspectival view to make two central claims about the probabilities in statistical mechanics. First, the perspectival view can help us respond to the challenge that these probabilities are merely epistemic, subjective, or anthropocentric. Second, we should treat the future as metaphysically open, due both to the probabilities in statistical mechanics and to the localised nature of the present.
-
There are two main styles of interpreting the quantum state: one focuses on the fundamentality of the quantum state (a wavefunction or state realist view), the other on how projection operators represent observable properties (an observable-first approach). I argue that, rather than being incompatible, these correspond to taking a third-person and a first-person perspective, respectively. I further contend that the first-person perspective, and the observable-first approach that goes with it, is better suited to explain measurement, based on the way that the metrology literature, as well as the work of Bohr, characterises measurement through the properties of a system. Finally, I show how the first-person, observable-first approach can emerge in the world through the process of decoherence, hence showing the compatibility of the two approaches and dissolving the need to choose absolutely between them.
-
Classical liberalism tends to respond to criticism of any voluntary market contract by promoting a wider choice of options and increased information and bargaining power, so that no one would seem to be ‘forced’ or ‘tricked’ into an ‘unconscionable’ contract. Hence, at first glance, the strict logic of the classical liberal freedom-of-contract philosophy would seem to argue against ever abolishing any mutually voluntary contract between knowledgeable and consenting adults. Yet modern liberal democratic societies have abolished (i.e., treated as invalid) at least three types of historical contracts: the voluntary slavery or perpetual servitude contract, the coverture marriage contract, and the undemocratic constitution establishing an autocratic government. The rights associated with those contracts are thus considered inalienable. This paper analyzes these three contracts and shows that there is indeed a deeper democratic or Enlightenment classical liberal tradition of jurisprudence that rules out those contracts. The ‘problem’ is that the same principles imply the abolition of the employment contract, the contract for renting human beings, which is the foundation for the economic system that is often (but superficially) identified with classical liberalism itself. Frank Knight is taken throughout as the exemplary advocate of the economics of conventional classical liberalism.
-
Despite decades of research in philosophy and cognitive science, the nature of concepts and the mechanisms underlying their change remain unresolved. Competing frameworks—externalist, inferentialist, embodied, and geometric—offer important insights but lack a unified account of how different types of concepts form, stabilize, and evolve. We propose a Systematic and Dynamic (SD) approach that examines mental content before and after concept formation, leading to the identification of inferential connections as ontologically distinct elements of conceptual architecture. Building on this foundation, we introduce the Inferential-Connection Mediated (ICM) model, which reconceptualizes concepts as dynamically structured entities composed of referential anchors—core subsets of inferential connections that fix reference—and broader networks that support reasoning, explanation, and communication. We distinguish among three types of inferential connections (observational, intentional, indirect) and classify four major concept types (theoretical, observational, subjective, and utilitarian), showing how differences in internal structure predict divergent developmental and evolutionary trajectories. The ICM model resolves long-standing theoretical tensions—such as inferentialism vs. externalism, atomism vs. empiricism, and relativism vs. realism—by offering a unified, structurally grounded account of conceptual stability and change. We invite interdisciplinary commentary on the model’s implications for concept acquisition, reference, revision, and conceptual engineering across philosophy, cognitive science, linguistics, and artificial intelligence.
-
This article introduces the concept of authoritarian recursion to describe how artificial intelligence (AI) systems increasingly mediate control across education, warfare, and digital discourse. Drawing on critical discourse analysis and sociotechnical theory, the study reveals how AI-driven platforms delegate judgment to algorithmic processes, normalize opacity, and recursively reinforce behavioral norms under the guise of neutrality and optimization. Case studies include generative AI models in classroom surveillance, autonomous targeting in military AI systems, and content curation logics in platform governance. Rather than treating these domains as disparate, the paper maps their structural convergence within recursive architectures of abstraction, surveillance, and classification. These feedback systems do not simply automate tasks—they encode modes of epistemic authority that disperse accountability while intensifying political asymmetries. Through cultural and policy analysis, the article argues that authoritarian recursion operates as a hybrid logic, fusing technical abstraction with state and market imperatives. The paper concludes by outlining implications for democratic legitimacy, human oversight, and the political design of AI governance frameworks.
-
This paper reviews a 1906 paper on statistical mechanics by J. Henri Poincaré, set against the background of his earlier work and with notable connections to J. Willard Gibbs. Poincaré’s paper presents important ideas that are still relevant for understanding the need for probability in statistical mechanics. Poincaré understands the foundations of statistical mechanics as a many-body problem in analytical mechanics (reflecting his 1890 monograph on The Three-Body Problem and the Equations of Dynamics), possibly influenced by Gibbs’s independent development published in his 1902 book, Elementary Principles in Statistical Mechanics. This dynamical-systems approach of Poincaré and Gibbs provides great flexibility, including applications to many systems besides gases, and it benefits from close connections to Poincaré’s earlier work. Notably, Poincaré had shown (e.g. in his study of nonlinear oscillators) that Hamiltonian dynamical systems display sensitivity to initial conditions, separating stable and unstable trajectories. In the celestial-mechanics context this precludes proving the stability of orbits in the solar system; here it compels the use of ensembles of systems, for which the probability is ontic and frequentist and does not have an a priori value. Poincaré’s key concepts relating to uncertain initial conditions, and to fine- and coarse-grained entropy, are presented for the readers’ consideration. Poincaré and Gibbs clearly both wanted to say something about irreversibility, but came up short.
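A minimal numerical illustration of that sensitivity (my example, not Poincaré's): two trajectories of Chirikov's standard map, an area-preserving Hamiltonian map, started 10^-10 apart.

    import math

    def standard_map(theta, p, K=1.5):
        """One iteration of Chirikov's standard map (area-preserving, Hamiltonian)."""
        p_new = (p + K * math.sin(theta)) % (2 * math.pi)
        theta_new = (theta + p_new) % (2 * math.pi)
        return theta_new, p_new

    # Two trajectories starting 1e-10 apart in the angle variable.
    a, b = (1.0, 0.5), (1.0 + 1e-10, 0.5)
    for n in range(1, 61):
        a, b = standard_map(*a), standard_map(*b)
        if n % 10 == 0:
            print(n, abs(a[0] - b[0]))
    # In the chaotic regime the separation grows roughly exponentially until it
    # saturates near the system size; past that point, single trajectories carry
    # no usable information and only an ensemble description remains informative.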
-
Of all philosophers of the twentieth century, Karl Popper stands out as the one who did most to build bridges between diverse academic disciplines. His first major work, Logik der Forschung (1934), concerns scientific method. Popper’s ideas were formed in an intellectual climate dominated by the logical positivism of the Wiener Kreis; despite a great diversity in academic interests, the members of the Vienna Circle wanted to reaffirm the scientific ethos of the Enlightenment ideal. Excited by the revolutionary ideas of Einstein (whom they engaged in both scientific and philosophical discussions), they believed that philosophy must play an active role in this new era by drawing as close to science as possible. Although Popper shared these general ideals, he strictly rejected all the main pillars of positivist philosophy of science: an inductivist logic of discovery, the verifiability principle, and the concern with meaning. In single-handed opposition to this influential philosophical movement, Popper offered new solutions: a hypothetico-deductive view of science, based on falsifiability as the demarcation criterion, and a denial of the claim that scientific theories could be verified. It is fair to say that the radicalism of Popper’s proposals caused an upheaval among philosophers of science, especially after the publication of his work in English in 1959.
-
How do biologists pursue generalizations given the heterogeneity of biological systems? This paper addresses this question by examining an aspect of scientific generalization that has received little philosophical attention: how scientists express generalizations. Although it is commonly assumed that a scientific generalization takes the form of a representation referring to a property that is shared across a range of things, scientists sometimes express their ideas about generality by displaying multiple representations in certain configurations. Such configurations highlight commonalities between different target systems without eliminating system-specific differences. I analyze visual representations in review articles about collective cell migration as a case study. This case illustrates that different types of visualizations, including single diagrams and configurations of multiple representations, function in a complementary way to promote understanding of, and reasoning about, the generality, specificity, and diversity of biological mechanisms. I also discuss the roles of generalizations in scientific investigations more broadly. I argue that an important role of generalizations in scientific research is to mediate and facilitate cross-fertilization among studies of different target systems. Multiple generalizations in research on collective cell migration together provide perspectives from which different biological systems are characterized and compared. They also provide heuristic hypotheses for studying less-explored systems as well as a basis for comparing developmental, pathological, and regenerative processes. This study sheds new light on how scientists pursue generalizations while embracing system-specific details. It also suggests that philosophical discussions should attend not only to what representations scientists construct, but also to how they present them.
-
On a mathematically foundational level, our most successful physical theories (gauge field theories and general-relativistic theories) are formulated in a framework based on the differential geometry of connections on principal bundles. After reviewing the essentials of this framework, we articulate the generalized hole and point-coincidence arguments, examining how they bear on a substantivalist position towards bundle spaces. This question, then, is considered in light of the Dressing Field Method, which allows a manifestly invariant reformulation of gauge field theories and general-relativistic theories, making their conceptual structure more transparent: it formally implements the point-coincidence argument and thus allows one to define (dressed) fields and (dressed) bundle spaces immune to hole-type arguments.
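In rough outline, and following standard presentations of the Dressing Field Method (conventions may differ from this paper's): given a dressing field u, valued in the structure group and transforming as u ↦ γ⁻¹u under a gauge transformation γ, one forms composites that mimic the form of a gauge transformation yet are gauge-invariant:

    \[
      A^{u} = u^{-1} A\, u + u^{-1}\mathrm{d}u, \qquad
      F^{u} = u^{-1} F\, u,
    \]
    % gauge-invariant because the transformations of u and of A, F cancel;
    % the "dressed" fields are thus physical variables immune to hole-type
    % redescription arguments.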
-
We offer a category-theoretic representation of the process theory of causality. The new formalism allows process theorists to (i) explicate their explanatory strategies (etiological and constitutive explanations) using the compositional features of string diagrams; (ii) probabilistically evaluate causal effects through the categorical notion of functor; (iii) address the problem of explanatory irrelevance via diagram surgery; and (iv) provide a theoretical explanation for the difference between conjunctive and interactive forks. We also claim that the fundamental building blocks of the process theory—namely processes, interactions, and events—can be modeled using three types of morphisms. Overall, categorical modeling demonstrates that the philosophical theory of process causality possesses scientific rigor and expressive power comparable to those of its event-based counterparts, such as causal Bayes nets.
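To make point (ii) concrete in a deliberately toy setting (my illustration, not the authors' construction): processes can be modeled as morphisms in the category of finite stochastic matrices, where sequential composition of processes is matrix multiplication.

    import numpy as np

    # Morphisms: column-stochastic matrices (finite Markov kernels).
    # Composing two processes in sequence is matrix multiplication,
    # a toy instance of evaluating a process diagram probabilistically.

    smoke_given_fire = np.array([[0.9, 0.2],   # rows: smoke / no smoke
                                 [0.1, 0.8]])  # cols: fire / no fire
    alarm_given_smoke = np.array([[0.95, 0.05],  # rows: alarm / no alarm
                                  [0.05, 0.95]]) # cols: smoke / no smoke

    # Composite process fire -> smoke -> alarm:
    alarm_given_fire = alarm_given_smoke @ smoke_given_fire
    print(alarm_given_fire[0])  # P(alarm | fire), P(alarm | no fire)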
-
T.M. Scanlon, following John Rawls, sought to change the landscape of moral theory by establishing an alternative to both intuitionism and consequentialism: contractualism. One of Scanlon’s most prominent arguments for contractualism is that it alone captures the value of mutual recognition and the role of norms of recognition in enacting this ideal moral relationship. Moreover, Scanlon argues that this ideal moral relationship explains the distinctive authority and force of morality. We concur. Nevertheless, we wish to offer an alternative to Scanlon’s account of mutual recognition and to the moral theory that emerges from it. Instead of construing mutual recognition in terms of justifiability to others, as Scanlon does, we propose to construe such relations as relations of caring solidarity with others as human. This alternative retains the overall benefits of the moral recognition approach, while offering quite different structural features, including a different account of the scope of morality. This essay is programmatic. The primary goal is to disentangle the infrastructure of moral recognition from the specific idea of justifiability, thereby to open up a range of striking new questions for moral theory.
-
Very short summary: In this essay, I explore a potential tension in Chandran Kukathas’s account of the liberal archipelago, between the idea of morality conceived as a commons and the politics of indifference of the liberal state. …
-
Time was, philosophers were skeptics, looking down on the poor benighted masses, who think their opinions are knowledge when they really aren’t. Maybe Bloggs thinks there’s a tree in the courtyard, but ah, a brain in a vat that was fed experiences just like those he’s having would think the same. …
-
The paper offers a new analysis of the German particle wohl as akin to the Italian futuro. They are both, we argue, necessity modals, but without bias. They are therefore more flexible than MUST and usable in situations with less reliable evidence or heightened uncertainty, such as in reflective questions, where they create Socratic inquisitiveness: a self-directed state of inquisitiveness whose goal is to introspect rather than to seek information.
-
Three guys claim that any heavy chunk of matter emits Hawking radiation, even if it’s not a black hole:
• Michael F. Wondrak, Walter D. van Suijlekom and Heino Falcke, Gravitational pair production and black hole evaporation, Phys. …
-
Errorstatistics.com has been extremely fortunate to have contributions by the leading medical statistician Stephen Senn over many years. Recently, he provided me with a new post that I’m about to put up, but as it builds on an earlier post, I’ll reblog that one first. …
-
We introduce a projection-based semantic interpretation of differentiation within the Universal Theory of Differentiation (UTD), reframing acts of distinction as structured projections of relational patterns. Building on UTD’s categorical and topos-theoretic foundations, we extend the formalism with a recursive theory of differentiational convergence. We define Stable Differentiational Identities (SDIs) as the terminal forms of recursive differentiation, prove their uniqueness and hierarchical organization, and derive a transparency theorem showing that systems capable of stable recursion can reflect upon their own structure. These results support an ontological model in which complexity, identity, and semantic expressibility emerge from structured difference. Applications span logic, semantics, quantum mechanics, and machine learning, with experiments validating the structural and computational power of the framework.
-
Few contributions, if any, have had a more significant impact on philosophy of language than Kripke’s (1980) ‘Naming and Necessity’ lectures. As a result of Kripke’s work, Millianism, viz. the view that names are singular terms akin to individual constants in first-order logic, became orthodoxy. In this paper, we want to explore the idea that there is an alternative to Millianism that is not only compatible with Kripke’s seminal arguments in ‘Naming and Necessity’, but in fact strongly supported by those arguments. This alternative view is now typically referred to as Variabilism. Variabilism maintains, like Millianism, that proper names are singular terms, but rather than individual constants, the Variabilist argues that names are in fact individual variables. Over the years, a number of Variabilist views have been proposed. These views share the assumption that names should be treated as variables, but they differ significantly in how these variables behave, what kinds of restrictions are imposed, and what syntactic environments they can occur in. These details are obviously essential, especially with respect to how similar the view is to the standard Millian view.
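The core formal contrast, in a minimal sketch of my own that abstracts from the differences between particular Variabilist proposals:

    % Millianism: a name is interpreted like an individual constant,
    % independently of the variable assignment g:
    \[
      [\![\,\mathrm{Hesperus}\,]\!]^{\mathcal{M},g} = I(\mathrm{Hesperus})
    \]
    % Variabilism: a name is an individual variable, interpreted relative to g
    % (proposals differ over the constraints placed on admissible assignments):
    \[
      [\![\,\mathrm{Hesperus}\,]\!]^{\mathcal{M},g} = g(\mathrm{Hesperus})
    \]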
-
In this paper I propose a concept to describe the circular developmental trajectory of the psychometrics of intelligence in the twentieth century, and I argue that this circularity explains the degenerate character of the field. Defining, measuring, and explaining intelligence formed a closed circuit of reciprocal refinement activities. I call this circular, internally guided, and non-progressive refinement process degenerate bootstrapping. Bootstrapping, especially in the initial stages of a science, is inevitable and may end up producing better measuring instruments and a better theoretical foundation. In the psychometric intelligence case, the absence of truly test-independent benchmarks, over-reliance on test score correlations, and the absence of genuine theorizing prevented the field from making significant conceptual progress. The circularity is specific to psychometric intelligence research, and the diagnosis of degenerate bootstrapping does not apply to neighboring fields and approaches. To describe the bootstrapping process, I will offer a conceptual history, starting with Alfred Binet and focusing on the work of American founders, namely Lewis M. Terman. …
-
The interpretation of quantum measurements remains contested between collapse-based frameworks like the Copenhagen Interpretation and no-collapse approaches like the Many-Worlds Interpretation (MWI). We propose the Branched Hilbert Subspace Interpretation (BHSI) as a minimalist alternative that preserves unitarity while avoiding both wavefunction collapse and ontological excess. BHSI models measurement as the unitary branching of a system's local Hilbert space into decoherent subspaces, formalized through causal state updates using branching and disengaging operators. Unlike MWI, BHSI avoids parallel worlds by maintaining a single-world ontology in which branching is confined to observable subspaces; unlike Copenhagen, it eliminates collapse while recovering the Born rule through branch weights. Through physically meaningful subspace records, the framework resolves quantum paradoxes such as wave-particle duality, the Wigner's friend scenario, and black hole radiation. It remains consistent with interference patterns, entanglement correlations, and information preservation. By comparing BHSI with QBism, Relational Quantum Mechanics, and modal interpretations, as well as analyzing quantum teleportation (where locally decoherent Hilbert subspaces are observed), we demonstrate its advantages as a causally sound and empirically grounded approach that reconciles unitary evolution with definite measurement outcomes without metaphysical proliferation.
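For orientation, the generic decoherence-style branching such an account builds on (the decomposition below is standard; BHSI's branching and disengaging operators refine it in ways the abstract only names):

    % A measured system S and an apparatus/record R, evolving unitarily:
    \[
      \Big(\sum_i c_i\,|s_i\rangle\Big)\otimes|r_0\rangle
      \;\xrightarrow{\text{unitary}}\;
      \sum_i c_i\,|s_i\rangle\otimes|r_i\rangle,
      \qquad \langle r_i|r_j\rangle \approx \delta_{ij},
    \]
    % each decoherent branch carries weight |c_i|^2: the Born rule read as
    % branch weight rather than as a collapse probability.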
-
Many instances of scientific progress feature the development of theories that are not fully true, but merely approximately true to various extents. Since only fully true propositions can be known, this seems to rule out the view that scientific progress consists in the accumulation of knowledge. According to Bird’s Cumulative Knowledge Account of progress, however, what becomes known in such instances is a (fully true) proposition expressing that the theory in question is approximately true to some extent. We present a general challenge for this idea, the Epistemic Mismatch Problem, and consider various strategies by which proponents of the Cumulative Knowledge Account might respond to it. We suggest, however, that the only plausible such strategies involve giving up on aspects of the Cumulative Knowledge Account that are central to why it has seemed plausible to begin with.
-
This paper examines Free Creations of the Human Mind: The Worlds of Albert Einstein by Diana Kormos Buchwald and Michael D. Gordin. The authors seek to dispel the long-standing myths of Einstein as the “lone genius” of Bern and the “stubborn sage” of Princeton, drawing on newly uncovered archival materials to illuminate his intellectual networks and collaborative engagements. By exploring the authors’ reasoning, this paper engages with their interpretations, highlighting the strengths of their archival revelations and areas where alternative perspectives may enrich the understanding of Einstein’s intellectual development.