-
114301.691399
In the last NYRB, Kwame Anthony Appiah reviewed two recent books about translation. One is by Damion Searls, whose Tractatus I criticized in this space, provoked in part by his complaint about philosophers as translators of philosophy. …
-
115664.692207
In philosophy of science, the pseudosciences (cryptozoology, homeopathy, Flat-Earth theory, anti-vaccination activism, etc.) have been treated mainly negatively. They are viewed not simply as false but as dangerous, since they mimic our best scientific theories and thereby gain respect and trust from the public without the appropriate credentials. As a result, philosophers have traditionally put considerable effort into demarcating genuine sciences and scientific theories from pseudoscience. Since these general attempts at demarcation have repeatedly been shown to break down, the present paper takes a different and somewhat more positive approach to the study of pseudoscience. My main point is not that we should embrace and accept the pseudosciences as they are, but rather that there are valuable and important lessons inherent in the study of pseudoscience; the different sections of the paper list at least six of them. By showing, through numerous examples, how (the study of) pseudoscience can teach us something about science, ourselves, and society, the paper makes the case that, as philosophers, we should devote more time and energy to engaging with such beliefs and theories in order to help remedy their harmful effects.
-
202001.69225
The belief that beauty leads to truth is prevalent among contemporary physicists. Far from being a private faith, it operates as a methodological guiding principle, especially when physicists have to develop theories without new empirical data.
-
202054.692268
Scenarios and pathways, as defined and used in the “SSP-RCP scenario framework”, have been central to the last decade’s climate change research and to the latest report of the Intergovernmental Panel on Climate Change (IPCC). In this framework, Shared Socioeconomic Pathways (SSPs) consist of a limited set of alternative socioeconomic futures, each represented both in short qualitative narratives and in quantitative projections of key drivers. One important use of the computationally derived SSP-scenarios is to perform mitigation analysis and present a “manageable” set of options to decision-makers. However, all SSPs, and derivatively all SSP-scenarios in this framework, assume a globally growing economy through to 2100. In practice, this amounts to a value-laden restriction of the space of solutions presented to decision-makers, falling short of the IPCC’s general mandate of being “policy-relevant and yet policy-neutral, never policy-prescriptive”. Yet the Global Economic Growth Assumption (GEGA) could be challenged, and in practice is challenged, by post-growth scholars.
-
363048.692283
Baumann, Peter. 2025. “Transcendental Arguments in Reid? A Reply to McCraw.” Social Epistemology Review and Reply Collective 14 (7): 1–6. https://wp.me/p1Bfg0-9Ze.

Benjamin W. McCraw’s article “A Reidian Transcendental Argument Against Skepticism” (2025) constitutes an original and thought-provoking contribution both to Reid scholarship and to the discussion of epistemic skepticism. In the following I will make a few remarks about it, focusing on the discussion of skepticism. I start with a brief historical remark on Reid and Kant (§ 1) before I explain the anti-skeptical argument in some detail (§ 2). A discussion of the premises of the argument follows (§ 3). I add some remarks about the social aspect of McCraw’s anti-skeptical stance (§ 4). I finish with another set of historical remarks (§ 5), this time about Reid and Wittgenstein, and a brief conclusion (§ 6).
-
372590.692319
In a system with identity, quotation, and an axiom predicate, a classical extension of the system yields a falsity. The result illustrates a novel form of instability in classical logic. Notably, the phenomenon arises without vocabulary such as ‘true’ or ‘provable’. Conservative extensions are safe expansions: they add expressive resources while proving the same theorems (or at most, terminological variants thereof). Conservative extensions are foundational for major developments, including the Löwenheim–Skolem theorems, precise comparisons of proof-theoretic strength (Simpson 2009), and the understanding of reflection principles in arithmetic and set theory (Feferman 1962). The purpose here is not to question these developments, but rather to advise caution for the future. Some extensions that appear quite conservative end up not being so. In a system with identity, quotation, and a metalinguistic singular term, a purely syntactic predicate for axioms can create instability under an innocent-looking extension.
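To convey the flavour of how such instability can arise (a minimal sketch of my own, not necessarily the construction in the paper), consider a diagonal sentence that denies its own axiomhood, with Ax the syntactic axiom predicate and ⌜σ⌝ a quotation name of σ:

$$\sigma \;\leftrightarrow\; \neg\,\mathrm{Ax}(\ulcorner \sigma \urcorner).$$

If an innocent-looking extension adopts σ as an axiom while Ax faithfully certifies the axioms, the extension proves Ax(⌜σ⌝) and also proves σ, i.e. ¬Ax(⌜σ⌝): a falsity obtained without any ‘true’ or ‘provable’ vocabulary.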
-
374903.692334
Our fine arts were developed, their types and uses were established, in times very different from the present...But the amazing growth of our techniques, the adaptability and precision they have attained, the ideas and habits they are creating, make it a certainty that profound changes are impending in the ancient craft of the Beautiful. In all the arts there is a physical component which can no longer be considered or treated as it used to be, which cannot remain unaffected by our modern knowledge and power…We must expect great innovations to transform the entire technique of the arts, thereby affecting artistic invention itself and perhaps even bringing about an amazing change in our very notion of art. (Valéry 1964 [1928], 225)

The passage describes a moment in the history of art in the West in the 20th century characterized by the introduction of new artistic technologies of production and reproduction such as photography. The passage serves as the epigraph to Walter Benjamin’s “The Work of Art in the Age of Mechanical Reproduction”, in which he contends that the analyses necessitated by the condition described by Paul Valéry compel us to “brush aside a number of outmoded concepts, such as creativity and genius, eternal value and mystery…” (Benjamin 1969 [1936]).
-
538886.692353
A minimal realist thinks we are justified in believing in unobservable entities as explanatory posits, but holds that we should be cautious about allowing non-empirically justified entities into our ontology. In this paper I argue that a minimalist would find my proposal for an ontology of fundamental entities without fundamental properties the best balance between empirical adequacy, explanatory power, and physical justification.
-
547467.692368
In this paper we will try to provide a solid form of intrinsic set theoretical optimism. In other words, we will try to vindicate Gödel’s views on phenomenology as a method for arriving at new axioms of ZFC in order to decide independent statements such as CH. Since we have previously written on this very same subject [41, 43, 44], it is necessary to provide a justification for addressing it once again.
-
547563.692385
This work explores the connection between logical independence and the algebraic structure of quantum mechanics. Building on results by Brukner et al., it introduces the notion of onto-epistemic ignorance: situations in which the truth of a proposition is not deducible due to an objective breakdown in the phenomenal chain that transmits information from a system A to a system B, rather than to any subjective lack of knowledge. It is shown that, under such conditions, the probabilities accessible to a real observer are necessarily conditioned by decidability and obey a non-commutative algebra, formally equivalent to the fundamental postulates of quantum mechanics.
-
547587.692399
In the 1960s and 1970s a series of observations and theoretical developments highlighted several anomalies which could, in principle, be explained by postulating one of two working hypotheses: (i) the existence of dark matter, or (ii) the modification of standard gravitational dynamics at low accelerations. In the years that followed, the dark matter hypothesis attracted far more attention as an explanation of this phenomenology than the hypothesis of modified gravity, and the latter is largely regarded today as a non-viable alternative. The present article takes an integrated history and philosophy of science approach in order to identify the reasons why the scientific community mainly pursued the dark matter hypothesis rather than modified gravity. A plausible answer is given in terms of three epistemic criteria for the pursuitworthiness of a hypothesis: (a) its problem-solving potential, (b) its compatibility with established theories and the feasibility of incorporation, and (c) its independent testability. A further comparison between the problem of dark matter and the problem of dark energy is also presented, explaining why in the latter case the situation is different and modified gravity is still considered a viable possibility.
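For concreteness, hypothesis (ii) is standardly associated with Milgrom’s MOND; as a schematic sketch (conventions vary across formulations), Newtonian dynamics is modified below a characteristic acceleration scale $a_0$:

$$\mu\!\left(\tfrac{a}{a_0}\right) a = a_N, \qquad \mu(x)\to 1 \ \text{for } x\gg 1, \qquad \mu(x)\to x \ \text{for } x\ll 1, \qquad a_0 \approx 1.2\times 10^{-10}\ \mathrm{m\,s^{-2}},$$

so Newtonian behaviour is recovered at high accelerations, while at low accelerations $a \approx \sqrt{a_N a_0}$, which yields flat galactic rotation curves without invoking dark matter.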
-
609815.692413
The puzzle of aphantasia concerns how individuals reporting no visual imagery perform more-or-less normally on tasks presumed to depend on it [1]. In his splendid recent review in TiCS, Zeman [2] canvasses four ‘cognitive explanations’: (i) differences in description; (ii) ‘faulty introspection’; (iii) “unconscious or ‘sub-personal’ imagery”; and (iv) total lack of imagery. Difficulties beset all four. To make progress, we must recognize that imagery is a complex and multidimensional capacity and that aphantasia commonly reflects partial imagery loss with selective sparing. Specifically, I propose that aphantasia often involves a lack of visual-object imagery (explaining subjective reports and objective correlates) but selectively spared spatial imagery (explaining preserved performance on tasks presumed to depend on imagery).

Some researchers have suggested that aphantasics may have failed to follow instructions or engage imagery [7]. This is unconvincing. In studies of galvanic skin responses, trials were excluded in which subjects failed to demonstrate ‘proper reading and comprehension’ of the frightening stories. Thus, it remains a mystery why spontaneous imagery did not emerge [6]. Similarly, in studies of pupillary light responses, aphantasics showed a characteristic in-task correlation between pupil and stimulus set size, indicating that they were not “‘refusing’ to actively participate…due to…a belief that they are unable to imagine” [5]. Aphantasics also voluntarily form images in other tasks despite a lack of incentives [8].
-
633920.692428
Scientists decide to perform an experiment based on the expectation that their efforts will bear fruit. While assessing such expectations belongs to the everyday work of practicing scientists, we have a limited understanding of the epistemological principles underlying such assessments. Here I argue that we should delineate a “context of pursuit” for experiments. The rational pursuit of experiments, like the pursuit of theories, is governed by distinct epistemic and pragmatic considerations that concern epistemic gain, likelihood of success, and feasibility. A key question that arises is: what exactly is being evaluated when we assess experimental pursuits? I argue that, beyond the research questions an experiment aims to address, we must also assess the concrete experimental facilities and activities involved, because (1) there are often multiple ways to address a research question, (2) pursuitworthy experiments typically address a combination of research questions, and (3) experimental pursuitworthiness can be boosted by past experimental successes. My claims are supported by a look into ongoing debates about future particle colliders.
-
707336.692446
Tarot is widely disdained as a way of finding things out. Critics claim it is bunk or—worse—a wretched scam. This disdain misunderstands both tarot and the activity of finding things out. I argue that tarot is an excellent tool for inquiry. It initiates and structures percipient conversation and contemplation about important, challenging, and deep topics. It galvanises creative attention, especially towards inward-looking, introspective inquiry and open-minded, collaborative inquiry with others. Tarot can cultivate virtues like epistemic playfulness and cognitive dexterity.
-
718928.692468
Very short summary: This is a two-part essay on the crisis of contemporary liberalism. I argue that this crisis reflects the growing influence of a conception of the political as a praxis that is beyond human rationality and reason. …
-
720127.692483
People are often interested in physics due to its purported objectivity. It aims truly to be a study of nature (φύσις) in itself. On the other hand, physics is a human construct, a language we use to describe the world as we experience it. In our quest for absolute reality, then, it seems that we must rid our description of the world of all subjectivity. This lecture concerns part of the story of one such attempt: the quest for absolute measurement. We will consider physical and philosophical aspects of the attempts of Maxwell, Peirce, and Planck to rid our language of physical measurement of undue subjectivity. This will shed some light on the possibility of knowing absolute reality—and the possibility of communication with aliens.
-
893122.692532
Achilles and the tortoise compete in a race where the beginning (the start) is at point O and the end (the finish) is at point P. At all times the tortoise can run at a speed that is at most a fraction ε of Achilles' speed (with ε a positive real number lower than 1, 0 < ε < 1), and both start the race at t = 0 at O. If the trajectory joining O with P is a straight line, Achilles will obviously win every time. It is easy to prove that there is a trajectory joining O and P along which the tortoise has a strategy to win every time, reaching the finish before Achilles.
-
893144.692551
In recent years, there has been heightened interest in (at least) two threads regarding geometrical aspects of spacetime theories. On the one hand, physicists have explored a richer space of relativistic spacetime structures than that of general relativity, in which the conditions both of torsion-freeness and of metric compatibility are relaxed—this has led to the study of so-called ‘metric-affine theories’ of gravitation, on which see e.g. Hehl et al. (1995) for a masterly review. On the other hand, physicists have been increasingly interested in securing a rigorous and fully general understanding of the non-relativistic limit of general relativity—this has led to a novel version of Newtonian physics, potentially with spacetime torsion (‘Type II’ Newton–Cartan theory—see Hansen et al. (2022) for a systematic overview).
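For reference, the two conditions being relaxed can be stated in the standard way for an affine connection with coefficients $\Gamma^{\lambda}{}_{\mu\nu}$ and metric $g_{\mu\nu}$:

$$T^{\lambda}{}_{\mu\nu} \equiv \Gamma^{\lambda}{}_{\mu\nu} - \Gamma^{\lambda}{}_{\nu\mu} = 0 \quad \text{(torsion-freeness)}, \qquad \nabla_{\lambda}\, g_{\mu\nu} = 0 \quad \text{(metric compatibility)},$$

and metric-affine theories drop one or both of these, whereas general relativity retains both (jointly they single out the Levi-Civita connection).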
-
946314.692566
Since the early days of its professionalization, in the aftermath of the Second World War, the history of science has been seen as a bridge between the natural sciences and the humanities. However, only one aspect of this triadic nexus, the relations between the history of science and the natural sciences, has been extensively discussed. The other aspect, the relations between the history of science and the humanities, has been less commented upon. With this paper I hope to make a small step towards redressing this imbalance, by discussing the relationships between the history of science and two other humanistic disciplines that have been historically and institutionally associated with it: the philosophy of science and general history. I argue that both of these relationships are marked by the characteristics of an unrequited friendship: on the one hand, historians of science have ignored, for the most part, calls for collaboration from their philosopher colleagues; and, on the other hand, historians specializing in other branches of history have been rather indifferent, again for the most part, to the efforts of historians of science to understand science as a historical phenomenon.
-
1077162.692581
This article defends the compatibility of evolutionary theory and religious belief against the objection that God could not have intentionally brought humans into existence given that the evolutionary process by which humans came into existence crucially involves random genetic mutation. The thought behind the objection is that a process cannot be both random and intended by God to unfold as it does.
-
1093302.692599
Even with everything happening in the Middle East right now, even with (relatedly) everything happening in my own family (my wife and son sheltering in Tel Aviv as Iranian missiles rained down), even with all the rather ill-timed travel I’ve found myself doing as these events unfolded (Ecuador and the Galapagos and now STOC’2025 in Prague) … there’s been another thing, a huge one, weighing on my soul. …
-
1143763.692613
A theory of quantum gravity consists of a gravitational framework which, unlike general relativity, takes into account the quantum character of matter. In spite of impressive advances, no fully satisfactory, self-consistent and empirically viable theory with those characteristics has ever been constructed. A successful semiclassical gravity model, in which the classical Einstein tensor couples to the expectation value of the energy-momentum tensor of quantum matter fields, would, at the very least, constitute a useful stepping stone towards quantum gravity. However, not only has no empirically viable semiclassical theory ever been proposed, but the self-consistency of semiclassical gravity itself has been called into question repeatedly over the years. Here, we put forward a fully self-consistent, empirically viable semiclassical gravity framework, in which the expectation value of the energy-momentum tensor of a quantum field, evolving via a relativistic objective-collapse dynamics, couples to a fully classical Einstein tensor. We present the general framework, a concrete example, and briefly explore possible empirical consequences of our model.
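Schematically, the coupling described here is the semiclassical Einstein equation (written in units with $c = 1$, with the expectation value taken in the relevant quantum state):

$$G_{\mu\nu} = 8\pi G \,\langle \hat{T}_{\mu\nu} \rangle,$$

the novelty of the present proposal lying in having the quantum field evolve under a relativistic objective-collapse dynamics rather than a purely unitary one.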
-
1189448.692627
Synthetic media generators, such as DALL-E, and synthetic media artifacts, such as deepfakes, undermine our fundamental epistemic standards and practices. Yet, the nature of their epistemic threat remains elusive. After all, fictional or distorted representations of reality are as old as photography. We argue that the novel epistemic threat of synthetic media is that, for the first time, synthetic media tools afford ordinary computer users the practicable possibility to cheaply and effortlessly create and widely share fictional worlds indistinguishable from the real world or credible representations of it. We further argue that a synthetic media artifact is epistemically malignant in a given media context for a person acquainted with the context when the person is misled to confuse the version of the world depicted in it with the real world in an epistemically or morally significant way.
-
1201423.692644
This paper advocates for a pragmatist view on quantum theory, offering a response to David Wallace’s recent criticisms of Richard Healey’s quantum pragmatism. In particular, I challenge Wallace’s general claim that quantum pragmatists—and anti-representationalists more broadly— lack the resources to make sense of the novel ‘quantum’ language used throughout modern physics in applications of quantum theory. I then show how a novel way of viewing our current best physics and the relation between quantum and classical theories follows from the pragmatist view advanced in this paper.
-
1201446.692658
The pragmatist philosophy of language has undergone a significant revival in recent decades, emerging as a compelling alternative to the traditional representationalist view of language and its relation to thought and reality. Richard Rorty was instrumental in this resurgence, advancing his ‘neo-pragmatism’ as a radical, global anti-representationalism. Building on Rorty’s work, Robert Brandom and Huw Price have each developed distinct neo-pragmatist frameworks, refining and adapting his ideas in their own analytic vocabularies and presenting them in a less confrontational, more conciliatory tone. This chapter aims to advance this conciliatory tradition by offering a new vision of neo-pragmatism as an irenic—common-ground-seeking—approach to the philosophy of language, which I term irenic pragmatism.
-
1315674.692672
Political meritocracy is the idea that political institutions should aim to empower those people who are particularly well-suited to rule. This article surveys recent literature in democratic theory that argues on behalf of institutional arrangements that aim to realize the ideal of political meritocracy. We detail two prominent families of meritocratic proposals: nondemocratic meritocracy and weighted voting. We then describe and briefly evaluate five potentially important criticisms of political meritocracy related to the coherence of merit as an ideal, the demographic objection, rent-seeking, political inequality, and social peace. We also consider the key ways in which existing electoral democracies create spaces for institutionally meritocratic forms. Finally, we highlight the importance of exploring institutional innovations that allow democracies to effectively incorporate expertise without, at the same time, becoming vulnerable to the criticisms of political meritocracy that we discuss.
-
1489638.692689
Teleparallel Gravity (TPG) is an alternative, but empirically equivalent, spacetime theory to General Relativity. Rather than as a manifestation of spacetime curvature, TPG conceptualises gravitational degrees of freedom as a manifestation of spacetime torsion. In its modern formulation (as presented e.g. in the book-length study by Aldrovandi and Pereira (2013)), TPG also expressly purports to be both a gauge theory of translations (G) and locally Lorentz-invariant (L). However, the reasoning which these authors invoke in order to implement (L) and (G) is often involved; indeed, its mathematical coherence seems on occasion questionable. As such, clarification of the reasoning upon which TPG proponents rely in constructing the theory is sorely needed. The present paper addresses this need. More broadly, we aim to achieve three interrelated tasks: (i) to shed light on TPG’s aspirations to maintain (G) and (L) at the same time, (ii) to illuminate TPG’s conceptual and interpretative structure, and (iii) to offer a succinct methodological assessment of TPG as a theory per se.
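A compact way to express the empirical equivalence (schematically, and modulo sign and index conventions, which vary across the literature) is that the torsion scalar $T$ of TPG and the Ricci scalar of the Levi-Civita connection differ only by a total-divergence boundary term $B$:

$$\mathring{R} = -\,T + B,$$

so the teleparallel action yields the same field equations as the Einstein–Hilbert action.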
-
1547402.692703
The concept of infinity has long occupied a central place at the intersection of mathematics and philosophy. This paper explores the multifaceted concept of infinity, beginning with its mathematical foundations, distinguishing between potential and actual infinity and outlining the revolutionary insights of Cantorian set theory. The paper then turns to paradoxes such as Hilbert’s Hotel, the St. Petersburg Paradox, and Thomson’s Lamp, each of which reveals tensions between mathematical formalism and basic human intuition. Adopting a philosophical approach, the paper analyzes how five major frameworks—Platonism, formalism, constructivism, structuralism, and intuitionism—each grapple with the metaphysical and epistemological implications of infinity. While each framework provides unique insights, none fully resolves the many paradoxes inherent in infinite mathematical objects. Ultimately, this paper argues that infinity serves not as a problem to be conclusively solved, but as a generative lens through which to ask deeper questions about the nature of mathematics, knowledge, and reality itself.
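As a worked illustration of the tension the paper describes, the St. Petersburg Paradox rests on a simple divergent sum: in the standard game a fair coin is tossed until the first heads, the payoff doubling with each toss, so the expected payoff is

$$E \;=\; \sum_{n=1}^{\infty} \frac{1}{2^{n}} \cdot 2^{n} \;=\; \sum_{n=1}^{\infty} 1 \;=\; \infty,$$

an expectation that no finite stake seems to match, which is exactly where formal calculation and basic intuition pull apart.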
-
1720361.692736
Even when one continued to speak of the fundamental concepts of theoretical physics as symbols, in order to avoid from the first any danger of ontological interpretation, there was a necessity of attributing to these very symbols themselves a theoretical meaning and therewith an “objective” content. Far from being merely arbitrary additions to what was given by direct observations they became essential factors with which alone an organization of the given, the fusion of the isolated details into the system of experience, was possible. The first great physicist actually to complete this turn of affairs and at the same time to grasp the full measure of its philosophical implications, was Heinrich Hertz, with whom began a new phase in the theory of physical methods.
-
1720407.692767
There is broad consensus across clinical research that feelings of worthlessness (FOW) are one of the highest risk factors for a patient’s depression becoming suicidal. In this paper, I attempt to make sense of this empirical relationship from a phenomenological perspective. I propose that there are purely reactive and pervasive forms of FOW. Subsequently, I present a phenomenological demonstration of how and why it is pervasive FOW that pose a direct suicidal threat. I then outline criteria, contingent upon empirical verification, by which clinicians can more confidently identify when a patient’s FOW place them at high risk of suicide.