-
Disruptive technologies are a key theme in economics, the philosophy of technology, and situated cognition - yet these debates remain largely disconnected. This paper addresses four core questions that cut across them: (i) What, precisely, are disruptive technologies “disrupting” across the different contexts in which the literature situates them? (ii) Why do technological disruptions play such prominent roles, in multiple domains, concerning the development of our species, cultures, and personal lives? (iii) Are technological disruptions inherently beneficial or harmful, and how are potential benefits and harms brought about? (iv) What strategies are available for adaptation to disruptive technologies, and how accessible are they for different groups and individuals? To unify current debates and provide a conceptual and normative foundation for future research, we draw on niche construction theory. We argue that disruptive technologies are technological niche disruptions (TENDs) that occur at various spatiotemporal scales. TENDs pressure social groups and individuals to adapt. As the abilities and resources that adaptation requires are often unevenly distributed, so are the harms and benefits TENDs produce. TENDs, therefore, both reflect and sustain existing inequalities.
-
This chapter compares Andreas and Günther's (forthcoming) epochetic analysis of actual causation to the currently popular counterfactual accounts. The primary focus will be on the shortcomings of the counterfactual approach to causation. But we will also explain the motivation behind counterfactual accounts and how the counterfactual approach has successively moved away from its core idea in response to recalcitrant counterexamples. The upshot is that our epochetic analysis tallies better with our causal judgments than the counterfactual accounts. A comparison to counterfactual accounts at manageable length must be selective. For reasons of systematicity, we have chosen Lewis's (1973a) analysis of causation in terms of chains of difference-making, Yablo's (2002) account in terms of de facto dependence, and the causal model accounts of Hitchcock (2001), Halpern and Pearl (2005), Halpern (2015), Halpern (2016), and Gallow (2021). The latter may be seen as the current culmination of the counterfactual approach and the strongest competitor to our epochetic analysis. This is why we devote a rather long section to Gallow's theory towards the end of this chapter.
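To see the kind of recalcitrant counterexample at issue, consider the classic late-preemption case rendered as a toy structural-equation model. The sketch below is an editorial illustration, not the chapter's own formalism, and the variable names are invented:

```python
# Late preemption: Suzy and Billy both throw rocks at a bottle; Suzy's rock
# arrives first and shatters it. Intuitively Suzy's throw is the actual cause,
# yet simple but-for (counterfactual) dependence fails, since Billy's rock
# would have shattered the bottle anyway.

def bottle_shatters(suzy_throws: bool, billy_throws: bool) -> bool:
    suzy_hits = suzy_throws                       # Suzy's rock gets there first
    billy_hits = billy_throws and not suzy_hits   # Billy's hits only if Suzy's didn't
    return suzy_hits or billy_hits

actual = bottle_shatters(suzy_throws=True, billy_throws=True)         # True
without_suzy = bottle_shatters(suzy_throws=False, billy_throws=True)  # still True

# No counterfactual dependence of the effect on its actual cause -- the kind
# of case that pushed counterfactual accounts toward chains of dependence,
# de facto dependence, and causal models.
print(actual, without_suzy)
```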
-
Inertia has long been treated as the paradigm of natural motion. This paper challenges this identification through the lens of General Relativity. By refining Norton's (2012) distinction between idealisation and approximation and drawing on key insights from Tamir (2012) regarding the theorems and proofs of Einstein and Grommer (1927), Geroch and Jang (1975), Geroch and Traschen (1987), and Ehlers and Geroch (2004), I argue that geodesic motion—commonly taken as the relativistic counterpart of inertial motion—qualifies as neither an approximation nor an idealisation. Rather, geodesic motion is best understood as a useful construct—a formal artefact of the theory's geometric structure, lacking both real and fictitious instantiation, and ultimately excluded by the dynamical structure of General Relativity. In place of inertial motion, I develop a layered account of natural motion, which is not encoded in a single ‘master equation of motion’. Extended, structured, and backreacting bodies require dynamical formalisms of increasing refinement that systematically depart from geodesic motion. This pluralist framework displaces inertial motion as the privileged expression of pure gravitational motion, replacing it with a dynamically grounded hierarchy of approximations fully consistent with the Einstein field equations.
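For orientation, one standard contrast behind such a hierarchy (textbook equations, not the paper's own formalism): a structureless test body is said to follow a geodesic, while already the lowest-order treatment of a spinning body departs from it.

```latex
% Geodesic motion for a structureless test body with four-velocity u^a:
\[
  u^b \nabla_b u^a = 0 .
\]
% The Mathisson--Papapetrou equation for a spinning body couples its spin
% tensor S^{cd} to curvature, forcing it off the geodesic:
\[
  u^b \nabla_b p^a = -\tfrac{1}{2}\, R^a{}_{bcd}\, u^b S^{cd} .
\]
```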
-
In Why Nuclear Power Has Been a Flop, Jack Devanney explains that pessimistic estimates of the danger of nuclear power rest on the assumption of linear harm: double the radiation, double the harm. This linear assumption is blatantly wrong at high doses, because humans exposed to sufficiently high doses die with certainty, and you can’t double-die. …
-
We introduce a challenge designed to evaluate the capability of Large Language Models (LLMs) in performing mathematical induction proofs, with a particular focus on nested induction. Our task requires models to construct direct induction proofs in both formal and informal settings, without relying on any preexisting lemmas. Experimental results indicate that current models struggle with generating direct induction proofs, suggesting that there remains significant room for improvement.
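For concreteness, here is a minimal Lean 4 sketch of what a direct proof by nested induction looks like; the theorem and proof are a toy illustration, not an item from the paper's benchmark. The inner inductions are inlined rather than factored into auxiliary lemmas:

```lean
-- Commutativity of addition on Nat, proved by an outer induction on n with
-- inner inductions on m inlined (no appeal to preexisting lemmas such as
-- Nat.add_comm or Nat.succ_add).
theorem add_comm' (m n : Nat) : m + n = n + m := by
  induction n with
  | zero =>
    -- goal: m + 0 = 0 + m; the left side reduces definitionally
    induction m with
    | zero => rfl
    | succ m ih => exact congrArg Nat.succ ih
  | succ n ih =>
    -- m + succ n is definitionally succ (m + n)
    show Nat.succ (m + n) = Nat.succ n + m
    rw [ih]
    clear ih
    -- remaining: succ (n + m) = succ n + m, by an inner induction on m
    induction m with
    | zero => rfl
    | succ m ih' => exact congrArg Nat.succ ih'
```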
-
On its surface, a sentence like If Laura becomes a zombie, she wants you to shoot her looks like a plain conditional with the attitude want in its consequent. However, the most salient reading of this sentence is not about the desires of a hypothetical zombie-Laura. Rather, it asserts that the actual, non-zombie Laura has a certain restricted attitude: her present desires, when considering only possible states of affairs in which she becomes a zombie, are such that you shoot her. This can be contrasted with the shifted reading about zombie-desires that arises with conditional morphosyntax, e.g., If Laura became a zombie, she would want you to shoot her. Furthermore, as Blumberg and Holguín (J Semant 36(3):377–406, 2019) note, restricted attitude readings can also arise in disjunctive environments, as in Either a lot of people are on the deck outside, or I regret that I didn’t bring more friends. We provide a novel analysis of restricted and shifted readings in conditional and disjunctive environments, with a few crucial features. First, both restricted and shifted attitude conditionals are in fact “regular” conditionals with attitudes in their consequents, which accords with their surface-level appearance and contrasts with Pasternak’s (The mereology of attitudes, Ph.D. thesis, Stony Brook University, Stony Brook, NY, 2018) Kratzerian approach, in which the if-clause restricts the attitude directly. Second, whether the attitude is or is not shifted—i.e., zombie versus actual desires—is dependent on the presence or absence of conditional morphosyntax. And third, the restriction of the attitude is effected by means of aboutness, a concept for which we provide two potential analyses. (Kai von Fintel and Robert Pasternak are listed alphabetically and share joint lead authorship of this work.)
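Schematically, the contrast between the two readings can be displayed as follows. This is a rough rendering for orientation, not the authors' own formalism; DES_w(x) stands for x's desire-worlds at w, and ZOMBIE and SHOOT for the relevant propositions:

```latex
% Restricted reading: Laura's actual desires, considered only over
% zombie-worlds, are shoot-worlds.
\[
  \mathrm{DES}_{w_0}(\text{Laura}) \cap \textsc{zombie} \subseteq \textsc{shoot}
\]
% Shifted reading: in the contextually relevant zombie-worlds w',
% zombie-Laura's desires at w' are shoot-worlds.
\[
  \forall w' \in \textsc{zombie}:\ \mathrm{DES}_{w'}(\text{Laura}) \subseteq \textsc{shoot}
\]
```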
-
Scott Aaronson’s Brief Foreword:
Harvey Lederman is a distinguished analytic philosopher who moved from Princeton to UT Austin a few years ago. Since his arrival, he’s become one of my best friends among the UT professoriate. …
-
In this chapter, I reflect on my career to date and my current thinking on the core topics of this volume: values, pluralism, and pragmatism in science and philosophy of science. Since I am, I hope, merely in the middle rather than the end of my career, this is not a retrospective but a mediospective, if you will, though I will begin with retrospective reflections and end with prospective ones. Happily, there are many opportunities to reference and engage with the excellent contributions to this volume in the course of things, to discuss what I have learned from my generous colleagues as well as indicate the very few places where our views differ.
-
I recall being baffled, as an undergraduate student, by the debates over epistemic vs practical reasons for belief: “If the fate of the world depended on it, who could deny that having the world-saving belief was more important than having an epistemically rational belief?” I only later realized that of course nobody does deny that. …
-
In this article, I develop the idea of theoretical complexes to characterize large-scale theoretical movements in the cognitive sciences, such as classical computational cognitivism, connectionism, embodied cognition, and predictive processing. It is argued that these theoretical movements should be construed as groups of closely connected individual theories and models of cognitive processes that share similar general hypotheses about the nature of cognition. General hypotheses form the conceptual cores of complexes of cognitive theories, giving them their structure and functional properties. The latter are said to consist primarily in helping practitioners of theoretical complexes further develop their individual accounts of cognitive phenomena. It is claimed that the theoretical diversity fostered in this way has already benefited the cognitive sciences in a number of important ways and has the potential to further advance the field.
-
It is a widespread consensus among metaphysicians that the bundle and substratum theories are substantially different metaphysical theories of individuality. On a realist stance towards metaphysics, they cannot both track the truth when describing fundamental reality; they are thus rival metaphysical theories. Against that consensus, Jiri Benovsky has advanced the metametaphysical thesis that they are in fact metaphysically equivalent. This paper challenges Benovsky’s equivalence thesis with two counter-arguments based on the metaphysics of quantum mechanics: quantum metaphysical indeterminacy and wavefunction realism. As we shall argue, while both substratum and bundle theories arguably fail in standard quantum mechanics, they fail in different ways. Hence, given Benovsky’s own notion of metaphysical equivalence, they are not equivalent.
-
Today I want to make a little digression into the quaternions. We won’t need this for anything later—it’s just for fun. But it’s quite beautiful. We saw in Part 8 that if we take the spin of the electron into account, we can think of bound states of the hydrogen atom as spinor-valued functions on the 3-sphere. …
-
Learning not to be so embarrassed by my ignorance and failures. Reminder: everyone is welcome here, but paid subscriptions are what enable me to devote the necessary time to researching and writing this newsletter, including pieces like this one on Katie Johnson, the woman who alleged Trump sexually assaulted her at the age of thirteen at a party of Jeffrey Epstein’s. …
-
The famous Catholic pilgrimage site at Lourdes, France, until fairly recently displayed hundreds of discarded crutches as testament to miraculous cures. It has, though, never displayed a wooden leg. Hence the Wooden Leg Problem (WLP) for believers in miracles: if God can cure paralysis, why does He seem never to have given an amputee back their lost limb? The WLP is a severe challenge for believers in miracles and must be confronted head-on. Yet there does not appear to be any systematic analysis of the problem, at least as formulated here, in the literature on miracles or philosophy of religion generally. I discuss ten possible solutions to the WLP on behalf of the believer in miracles. Although some are stronger than others, all but the final one seem too weak to solve the problem. It is the final one – the ‘how do you know?’ solution – that I endorse and examine in some depth. This solution, I argue, shows that the WLP does not move the epistemological dial when it comes to belief or disbelief in miracles.
-
Subsumption theodicies aim to subsume apparent cases of natural evil under the category of moral evil, claiming that apparently natural evils result from the actions or omissions of free creatures. Subsumption theodicies include Fall theodicies, according to which nature was corrupted by the sins of the first humans (Aquinas 1993, Dembski 2009), demonic-action theodicies, according to which apparently natural evils are caused by the actions of fallen angels (Lewis 1944, Plantinga 1974, Johnston 2023), and simulation theodicies, according to which our universe is a computer simulation, with its apparent natural evils caused by the free actions of simulators in the next universe up (Dainton 2020, Crummett 2021).
-
This paper introduces the physics and philosophy of strange metals, which are characterized by unusual electrical and thermal properties that deviate from conventional metallic behaviour. The anomalous strange-metal behaviour discussed here appears in the normal state of a copper-oxide high-temperature superconductor, and it cannot be described using standard condensed-matter physics. Currently, it can only be described through a holographic dual, viz. a four-dimensional black hole in anti-de Sitter spacetime. This paper first … We dedicate this paper to the memory of Umut Gürsoy, who tragically passed away on 24 April 2025, during the last stages of the paper’s completion.
-
We employ a pragmatist model of inquiry to explain how measurement in physics can solve the problem of usefulness. Although a variety of resources, including theory, simulation, heuristics, rules of thumb, and practical considerations, contribute to the context of a specific measurement inquiry, the measurement inquiry process partially decontextualizes its results, making them useful for other inquiries. This process involves a transformation of data we call “entheorization,” which happens in conjunction with the evaluation of the uncertainty of measurement results. These uncertainty estimates then serve to define the sensitivity of the result to the aims of subsequent inquiries. On this approach, the epistemology of measurement requires treating measurement procedure, uncertainty estimation, and sensitivity to targets of inquiry as equally fundamental to understanding how measurement yields knowledge. To help understand how the abstract elements of our epistemological model of experimental inquiries apply to concrete episodes of measurement, we use the example of the W-boson mass measurement at the Large Hadron Collider to illustrate our arguments.
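As a toy numerical illustration of the sensitivity point (an editorial example with invented numbers, not the paper's W-boson case study):

```python
# Two hypothetical measurements of the same quantity, combined by the standard
# inverse-variance weighting; the resulting uncertainty is what tells a
# subsequent inquiry whether the result is sensitive enough for its aims.
import math

x1, u1 = 10.30, 0.20   # value, 1-sigma uncertainty (invented numbers)
x2, u2 = 10.45, 0.10

w1, w2 = 1 / u1**2, 1 / u2**2
x = (w1 * x1 + w2 * x2) / (w1 + w2)
u = math.sqrt(1 / (w1 + w2))

print(f"combined result: {x:.3f} +/- {u:.3f}")   # 10.420 +/- 0.089

# An inquiry probing a 0.5-sized effect can use this result; one probing a
# 0.05-sized effect cannot. The uncertainty estimate partially
# decontextualizes the result by fixing its range of useful application.
```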
-
This article uses Sartre’s existential philosophy in particular (while also drawing on Scheler, Husserl, and Descartes) to investigate pathogenetic issues in psychopathology from a first-person perspective. Psychosis is a “total experience” that points to orientating changes in subjectivity, supported by evidence regarding self-disorders in the schizophrenia spectrum. This article proposes that schizophrenia is essentially characterized (and distinguished) by specific structural alterations of (inter)subjectivity around the relationship between self and Other, to which all its seemingly disparate signs and symptoms eventually point. Two reciprocal distortions are present in psychotic schizophrenia patients: (A) an encroaching and substantialized Other, and (B) a self transformed into being-for-the-Other. Under the altered conditions of (A & B), delusional mood is the presence but inaccessibility of the Other; a delusional perception is an eruption or surfacing of the objectification of self by Other; a delusion is an experience of the Other, which fulfills certainty, incorrigibility, and potentially falsehood.
-
[This continues an earlier essay, Book Review: Paradise Lost by John Milton.] 1. Adam asks Raphael, and you would too, admit it, about the sex lives of angels:
Love not the heavenly spirits, and how their love
Express they, by looks only, or do they mix
Irradiance, virtual or immediate touch? …
-
PEA Soup is pleased to announce the forthcoming discussion from Free & Equal, on Elise Sugarman’s “Supposed Corpses and Correspondence” with a précis from Gabriel Mendlow. The discussion will take place from August 6th to 8th. …
-
Sorry for the long blog-hiatus! I was completely occupied for weeks, teaching an intensive course on theoretical computer science to 11-year-olds (!), at a math camp in St. Louis that was also attended by my 8-year-old son. …
-
Now comes the really new stuff. I want to explain how the hydrogen atom is in a certain sense equivalent to a massless spin-½ particle in the ‘Einstein universe’. This is the universe Einstein believed in before Hubble said the universe was expanding! …
-
For predictive processing, perception is tied to the upshot of probabilistic inference, which makes perception internal, affording only indirect access to the world external to the perceiver. The metaphysical implications of predictive processing however remain unresolved, which is a significant gap given the major influence of this framework across philosophy and other fields of research. Here, I present what I believe is a consistent metaphysical package of commitments for predictive processing. My starting point is a suitable challenge to predictive processing presented by Tobias Schlicht, who argues that the framework is committed to Kantian transcendental idealism, and marshals several lines of argument that this commitment undermines predictive processing’s aspirations to completeness, realism, and naturalism. I first trace Hermann von Helmholtz’s nuanced reaction to Kant, which sets out the preconditions for perception in a manner prescient of the notion of self-evidencing central to contemporary predictive processing. This position enables a fundamental structural realism, rather than idealism, which blocks Schlicht’s line of argument, allowing plausible versions of completeness, realism and naturalism. Schlicht’s challenge is nevertheless valuable because addressing it, in the specific context of Helmholtz’s response to Kant, helps bring to light the compelling structural realism at the heart of self-evidencing.
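As a minimal gloss on "perception as probabilistic inference" (an editorial toy with invented numbers, not Schlicht's argument or the framework's own models):

```python
# Perception as inference: the perceiver only ever "touches" its sensory
# evidence e, and world-states h are accessed indirectly via P(h | e).

priors = {"cat": 0.3, "dog": 0.7}              # hypothetical hidden causes
likelihood = {"cat": 0.9, "dog": 0.05}          # P(e = "meow" | h), assumed numbers

evidence_prob = sum(priors[h] * likelihood[h] for h in priors)
posterior = {h: priors[h] * likelihood[h] / evidence_prob for h in priors}

# The hypothesis that best predicts the input "self-evidences".
print(posterior)   # {'cat': ~0.89, 'dog': ~0.11}
```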
-
philosophical logic may also interest themselves in the logical appendices, one of which presents modal logic as a subsystem of the logic of counterfactuals. Last but not least, the work also includes an afterword that is both a severe reprimand to the analytic community for a certain sloppiness and an exhortation to all colleagues to apply more rigor and patience in addressing metaphysical issues. People familiar with Williamson’s work will not be surprised by the careful and detailed (sometimes a bit technical) argumentation, which demands close attention from the reader. As expected, this is a most relevant contribution to an increasingly popular topic by one of today’s leading analytic philosophers.
-
Cognitive neuroscientists typically posit representations which relate to various aspects of the world, which philosophers call representational content. Anti-realists about representational content argue that contents play no role in neuroscientific explanations of cognitive capacities. In this paper, I defend realism against an anti-realist argument due to Frances Egan, who argues that for content to be explanatory it must be both essential and naturalistic. I introduce a case study from cognitive neuroscience in which content is both essential and naturalistic, meeting Egan’s challenge. I then spell out some general principles for identifying studies in which content plays an explanatory role.
-
Representations appear to play a central role in cognitive science. Capacities such as face recognition are thought to be enabled by internal states or structures representing external items. However, despite the ubiquity of representational terminology in cognitive science, there is no explicit scientific theory outlining what makes an internal state a representation of an external item. Nonetheless, many philosophers hope to uncover an implicit theory in the scientific literature. This is the project of the current thesis. However, all such projects face an obstacle in the form of Frances Egan’s argument that content plays no role in scientific theorising. I respond that, in some limited regions of cognitive science, content is crucial for explanation. The unifying idea is that closer attention to the application of information theory in those regions of cognitive neuroscience enables us to uncover an implicit theory of content. I examine the conditions which must be met for the cognitive system to be modelled using information theory, presenting some constraints on how we apply the mathematical framework. For example, information theory requires identifying probability distributions over measurable outcomes, which leads us to focus specifically on neural representation. I then argue that functions are required to make tractable measures of information, since they serve to narrow the range of possible contents to those potentially explanatory of a cognitive capacity. However, unlike many other teleosemanticists, I argue that we need to use a non-etiological form of function. I consider whether non-etiological functions allow for misrepresentation, and conclude that they do. Finally, I introduce what I argue is the implicit theory of content in cognitive neuroscience: maxMI. The content of a representation is that item in the environment with which the representation shares maximal mutual information.
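A schematic rendering of the maxMI idea (an editorial sketch; the distributions and candidate environmental items are invented for illustration):

```python
# maxMI, schematically: the content of a neural response R is the
# environmental item sharing maximal mutual information with it.
import math

def mutual_information(joint):
    """I(R; X) in bits from a joint distribution {(r, x): p}."""
    pr, px = {}, {}
    for (r, x), p in joint.items():
        pr[r] = pr.get(r, 0.0) + p
        px[x] = px.get(x, 0.0) + p
    return sum(
        p * math.log2(p / (pr[r] * px[x]))
        for (r, x), p in joint.items() if p > 0
    )

# joint distributions of a binary response R with two candidate worldly items
candidates = {
    "edge_orientation": {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45},
    "ambient_light":    {(0, 0): 0.30, (0, 1): 0.20, (1, 0): 0.20, (1, 1): 0.30},
}

scores = {item: mutual_information(j) for item, j in candidates.items()}
content = max(scores, key=scores.get)
print(scores, "->", content)   # edge_orientation carries more information
```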
-
The term algorithmic fairness is used to assess whether machine learning algorithms operate fairly. To get a sense of when algorithmic fairness is at issue, imagine a data scientist is provided with data about past instances of some phenomenon: successful employees, inmates who when released from prison go on to reoffend, loan recipients who repay their loans, people who click on an advertisement, etc., and is tasked with developing an algorithm that will predict other instances of these phenomena. While an algorithm can be successful or unsuccessful at its task to varying degrees, it is unclear what makes such an algorithm fair or unfair.
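Two of the standard candidate answers can be stated as simple checks on a trained predictor. The sketch below is an editorial illustration with invented data and group labels, not the article's own proposal:

```python
# Two common fairness checks on a binary predictor: demographic parity
# (equal positive-prediction rates across groups) and equal false-positive
# rates. Divergence on either flags one precisification of unfairness.

records = [
    # (group, prediction, actual outcome)
    ("A", 1, 1), ("A", 0, 0), ("A", 1, 0), ("A", 0, 1),
    ("B", 1, 1), ("B", 0, 0), ("B", 0, 0), ("B", 0, 1),
]

def positive_rate(group):
    preds = [p for g, p, _ in records if g == group]
    return sum(preds) / len(preds)

def false_positive_rate(group):
    preds_on_negatives = [p for g, p, y in records if g == group and y == 0]
    return sum(preds_on_negatives) / len(preds_on_negatives)

for g in ("A", "B"):
    print(g, positive_rate(g), false_positive_rate(g))
# A: 0.5 positive rate, 0.5 FPR; B: 0.25 positive rate, 0.0 FPR
```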
-
would be like if the theory were true. This concerns (i) what possibilities it represents, (ii) the internal structure of those possibilities and their interrelations, and, to some extent, (iii) how those possibilities differ from what’s come before. By providing an interpretive foil that one can amplify or amend, it aspires to shape the research agenda in the foundations of general relativity for established philosophers of physics, graduate students searching for work in these topics, and other interested academics. This title is also available as Open Access on Cambridge Core.
-
Hilbert-space techniques are widely used not only for quantum theory, but also for classical physics. Two important examples are the Koopman-von Neumann (KvN) formulation and the method of “classical” wave functions. As this paper explains, these two approaches are conceptually distinct. In particular, the method of classical wave functions was not due to Bernard Koopman and John von Neumann, but was developed independently by a number of later researchers, perhaps first by Mario Schönberg, with key contributions from Angelo Loinger, Giacomo Della Riccia, Norbert Wiener, and E. C. George Sudarshan. The primary goals of this paper are to explain these two approaches, describe the relevant history in detail, and give credit where credit is due.
-
Why does quantum theory need the complex numbers? With a view toward answering this question, this paper argues that the usual Hilbert-space formalism is a special case of the general method of Markovian embeddings. This paper then describes the ‘indivisible interpretation’ of quantum theory, according to which a quantum system can be regarded as an ‘indivisible’ stochastic process unfolding in an old-fashioned configuration space, with wave functions and other exotic Hilbert-space ingredients demoted from ontological status. The complex numbers end up being necessary to ensure that the Hilbert-space formalism is indeed a Markovian embedding.
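A small numerical check conveys the sense of "indivisible" at work here. This is an editorial illustration in the spirit of the framework, not the paper's own derivation, and the particular unitary family is arbitrary:

```python
# A qubit's unitary dynamics defines Born-rule transition probabilities
# |U(t)[j, i]|**2 that are valid from the initial time but are NOT divisible
# through intermediate times: Chapman-Kolmogorov composition fails.
import numpy as np

def U(theta):
    """A one-parameter family of unitaries (real rotations), chosen arbitrarily."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def transition_matrix(theta):
    """Born-rule transition probabilities G[j, i] = |U[j, i]|**2 (column-stochastic)."""
    return np.abs(U(theta)) ** 2

t1, t2 = 0.4, 1.0
direct = transition_matrix(t2)                                   # 0 -> t2 in one step
composed = transition_matrix(t2 - t1) @ transition_matrix(t1)    # 0 -> t1 -> t2

print(np.allclose(direct, composed))   # False: the process cannot be divided at t1
```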