-
This paper introduces the concept of regulatory kinds — socially constructed classifications that come to function epistemically like natural kinds through recursive uptake across institutional domains. These kinds do not reflect causal unity or semantic precision, but they acquire stability, portability, and predictive utility by being embedded in the inferential routines of medicine, law, policy, and science. I develop the notion of simulated kindhood to explain how such classifications support explanation and coordination despite lacking metaphysical integrity. Race serves as the central case: a contested and heterogeneous category that nonetheless endures as a diagnostic tool, a policy metric, and a risk factor. By treating race as a regulatory kind, the paper reframes classificatory persistence as an institutional phenomenon rather than as a cognitive or conceptual error. The account challenges traditional views of kindhood, highlights the epistemic logic of infrastructural classification, and raises ethical concerns about the reification of simulated categories.
-
We develop a classification of general Carrollian structures, permitting affine connections with both torsion and non-metricity. We compare with a recent classification of general Galilean structures in order to present a unified perspective on both. Moreover, we demonstrate how both sets of structures emerge from the most general possible Lorentzian structures in their respective limits, and we highlight the role of global hyperbolicity in constraining both structures. We then leverage this work in order to construct for the first time an ultra-relativistic geometric trinity of gravitational theories, and consider connections which are simultaneously compatible with Galilean and Carrollian structures. We close by outlining a number of open questions and future prospects.
-
The study of molecular structure has played a central role in the debate around chemistry’s reduction to quantum physics. So far, this case has been invoked to support the non-reducibility of chemistry. However, recent papers claim that there might not be any structure to be assigned to isolated molecules, thus prompting a deeper investigation of the nature of molecular structure. To this end, this paper explores two alternative accounts of structure: the relational and dispositional accounts. Each metaphysical account has interesting implications for the reduction debate and opens new ways of arguing for (but also against) the reducibility of chemistry. The aim is to show that the debate around chemistry’s reduction needs to be radically reframed so as to include a rigorous metaphysical analysis of the nature of molecular structure.
-
Two forms of chemical reaction statements are standardly found in the chemical corpus. First, individual reaction statements describe reactions that occur between specific chemical substances, leading to the production of specific substances. Second, general reaction statements describe chemical transformations between groups of substances. Both forms of statement track regularities in nature and thus warrant being viewed as representing causal relations. However, a convincing analysis in terms of causation also requires spelling out the metaphysical relation between individual and general reactions. This is because their relation prompts concerns regarding causal priority and causal overdetermination. I present these concerns and address them by arguing that we should view individual and general reactions in the context of the determinate/determinable distinction.
-
The two times problem, where time as experienced seems to have distinctive features different from those found in fundamental physics, appears to be more intractable than necessary, I argue, because the two times are marked out from the positions furthest apart: neuroscience and physics. I offer causation as exactly the kind of bridge between these two times that authors like Buonomano and Rovelli (forthcoming) are seeking. It is a historical contingency of philosophical discussions around phenomenology, and a methodological artefact of neuroscience, that most studies of temporal features of experience require subjects to be sufficiently still that their engagement with affordances in the environment can be at best tested in artificial and highly constrained ways. Physics does not offer an account of causation, but accounts of causation are tied to or grounded in physics in ways that can be clearly delineated. Causation then serves as a bridge that coordinates time as experienced, via interaction with affordances in the environment, with time in physics as it constrains causal relationships. I conclude by showing how an information-theoretic account of causation fits neatly into and extends the information gathering and utilizing system (IGUS) of Gruber et al. (2022).
-
This article offers a hybrid account of regulatory kinds and subjective fit to explain why the oft-invoked analogy between gender transition and so-called race transition fails both conceptually and normatively. The argument—recently circulated in popular commentary and endorsed by figures such as Richard Dawkins—suggests that if gender transition is legitimate on the basis of social construction, then racial transition should be equally so. Yet since racial transition is generally regarded as illegitimate, the analogy concludes that gender transition must be suspect. I argue that this inference rests on a category error: it conflates social construction with norm-governed intelligibility.
-
In a reliabilist epistemology of algorithms, a high frequency of accurate output representations is indicative of the algorithm’s reliability. Recently, Humphreys challenged this assumption, arguing that reliability depends not only on frequency but also on the quality of outputs. Specifically, he contends that radical and egregious misrepresentations have a distinct epistemic impact on our assessment of an algorithm’s reliability, regardless of the frequency of their occurrence. He terms these statistically insignificant but serious errors (SIS-Errors) and maintains that their occurrence warrants revoking our epistemic attitude towards the algorithm’s reliability. This article seeks to defend reliabilist epistemologies of algorithms against the challenge posed by SIS-Errors. To this end, I draw upon computational reliabilism as a foundational framework and articulate epistemological conditions designed to prevent SIS-Errors and thus preserve algorithmic reliability.
-
In opposition to traditional approaches in metaphysics of science, Entity Realism proposes to extract ontological commitments from experimental practice instead of abstract theories, using an inference from manipulability to existence that would be continuous with everyday inferences regarding ordinary objects. A problem is that most accounts of ordinary artefacts make them mind-dependent or language-dependent, and so not real by philosophical standards. Furthermore, the functional kinds of biology and chemistry are not necessarily compatible with mind-independence either. It follows that Entity Realism is better understood within a pragmatist or deflationary alternative to standard metaphysics. The approach is beneficial for responding to sceptical arguments.
-
This article analyzes some of the methodological tensions that can be observed in the regulation of science and technology, and that often manifest themselves as controversies. We offer a three-way classification of such tensions. The latter can arise from: 1) external (non-cognitive) factors that are specific to a particular regulation; 2) external (non-cognitive) factors of wider societal importance that are not related to any particular regulatory process; and 3) internal factors (cognitive as well as non-cognitive) related to the cognitive and practical limitations of a particular scientific methodology in the context of regulatory decision making. We analyze case studies of the regulation of, among others, pharmaceuticals, chemical products, health claims on foods, and genetically modified organisms. The analysis shows that most often such methodological tensions are driven, directly or indirectly, by different stances with respect to non-cognitive factors that underlie the fundamental choices of methods and standards, and therefore the data that underpin regulatory decisions. Our paper makes clear an important feature of regulatory science: cognitive factors (such as improved scientific data or accepted best practices) that facilitate the resolution of debates in academic science do not suffice to achieve closure with respect to such tensions in regulatory science. Any attempt at closure has to deal primarily with the relevant non-cognitive factors.
-
Advances in animal sentience research, neural organoids, and artificial intelligence reinforce the need to justify attributions of consciousness to non-standard systems. Clarifying the argumentative structure behind these attributions is important for evaluating their validity. This paper addresses this issue, concluding that analogical abduction – a form of reasoning combining analogical and abductive elements – is the strongest method for extrapolating consciousness from humans to non-standard systems. We argue that the argument from analogy and inference to the best explanation, taken individually, do not meet the criteria for successful extrapolations, while analogical abduction offers a promising approach despite limitations in current consciousness science.
-
Suppose that the right way to combine epistemic utilities or scores across individuals is averaging, and I am an epistemic act expected-utility utilitarian—I act for the sake of expected overall epistemic utility. …
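As a hedged gloss (my formalization, not the abstract’s own notation), assuming $n$ individuals with epistemic utility functions $u_1,\dots,u_n$ over worlds $w$ and a credence function $P$, the setup reads:
\[
U(w) = \frac{1}{n}\sum_{i=1}^{n} u_i(w), \qquad \text{choose an act } a \in \arg\max_{a}\, \mathbb{E}_P[\,U \mid a\,],
\]
that is, overall epistemic utility is the average of the individual scores, and the epistemic act utilitarian maximizes its expectation.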
-
Consider two scenarios: (1) an infinitely long life of repetition of a session of meaningful pleasure followed by a memory wipe; (2) a closed time loop involving one session of the meaningful pleasure followed by a memory wipe. Scenario (1) involves infinitely many sessions of the meaningful pleasure. …
-
Patterns and pattern ontologies are a powerful way for pragmatists to address metaphysical issues by rejecting a false dichotomy between pluralism and realism. However, there is a common misconception about patterns that I call the philosophically perverse patterns (PPP) problem: critics of patterns invent perverse examples that nonetheless meet the metaphysical criteria to count as patterns. I defuse this concern by showing how PPP misunderstands what the pragmatist metaphysics of patterns is supposed to accomplish: the bare definition should not rule substantive examples of patterns in or out; that work instead falls to methodological considerations. I use this response to the PPP problem to show how the metaphysical definition of 'pattern' allows the pragmatist to capture the rich intricacies of ontologies in the sciences and yields two illustrative norms by which methodology can be guided in developing or refining ontologies: cohesion and coherence.
-
Recent work in quantum gravity (QG) suggests that spacetime is not fundamental. Rather, spacetime emerges from an underlying non-spatiotemporal reality. Spacetime functionalism has been proposed as one way to make sense of the emergence of spacetime. However, spacetime functionalism faces a ‘collapse’ problem. The functionalist analysis seems to force spacetime into the (more) fundamental ontology of QG, thereby conflicting with—rather than elucidating—spacetime emergence. In this paper, I show how to resolve the collapse problem. The solution is to differentiate between physical and metaphysical notions of (relative) fundamentality. With this distinction in hand, we can see that spacetime functionalism does not after all force spacetime into the (more) fundamental ontology of QG in any troubling sense. A side benefit of the paper is that it provides a sharpened characterisation of various notions of (relative) fundamentality.
-
This paper proposes that relational ontology, which defines existence through relations, serves as a bridge between scientific realism and empiricism by offering a structural criterion for scientific explanation. Through case studies in quantum mechanics and thermodynamics, we illustrate how relationality grounds scientific theories in empirical interactions while supporting realist commitments to unobservable structures. Engaging with philosophy of science debates—realism, reductionism, and demarcation—and drawing on thinkers such as Lakatos, Kuhn, Cartwright, van Fraassen, and contemporary authors like Ladyman and Chakravartty, this work examines the explanatory limits of relational ontology in addressing consciousness and contrasts scientific explanations with non-scientific accounts. Its original contribution lies in demonstrating how relational ontology unifies these perspectives through a rigorous structural criterion, advancing our understanding of scientific explanation within the philosophy of science.
-
Ecosystems are increasingly being represented as marketplaces that produce goods for humanity, and because of this, economic metaphors for increasing efficiency have been introduced into conservation. A powerful model for economic growth is the globalised free market, and some are implicitly deploying it to suggest changes in conservation practice. Ecological globalisation is the position that we should not control the free movement of species and that re-wilding occurs most efficiently through non-intervention. When species can move and interact with new ecological systems, they create novel ecosystems. On this view, these novel arrangements create experimental markets in nature's economy, providing opportunities for the efficient production of goods for humans, also known as ecosystem services. When invasive species supersede local populations, this is taken to indicate that the previous biotic systems were inefficient, which is why they were replaced, and therefore that it is wrong to protect indigenous ‘losers’ from extinction. Those who defend indigenous species are accused of being xenophobic against recent biotic migrants. This position is flawed both empirically and morally, as there is a disconnect between these economic and political arguments when applied to human economies and nature's economy.
-
The dialogical stance on meaning in the Lorenzen-Lorenz tradition is dynamic, as it is based on interaction between players, and contextual, as meaning depends on the set of rules adopted for the dialogical justification of claims, including those implicit in linguistic practice. Grasping the meaning of an expression or an action amounts to identifying the rationale behind our verbal and behavioural practices. This knowledge is informed by the collective intelligence embodied within public criticism. Different aspects of meaning are made explicit within the game rules: particle rules for the meaning of logical constants, the Socratic rule for non-logical constants, and structural rules that set contextual meaning by shaping the development of a play. The level of plays is governed by these meaning-determining rules, and validity (or proof) is built from the plays. The result is a framework that grounds language and logic in the dynamics of dialogical meaning, and which has proven fruitful for studying frameworks for the logical analysis of language, modern and ancient.
-
Our adversarial system of international relations poses substantial risks of violent catastrophe and impedes morally urgent initiatives and reform collaborations. The domestic politics of more evolved societies provide guidance toward a better world governed by just rules, which ensure that basic human needs are met, inequalities constrained, and weapons and wealth marginalized as tools for influencing political and judicial outcomes. Impartial administration, adjudication, and enforcement of just rules require a strong normative expectation on officials and citizens to fully subordinate their personal and national loyalties to their shared commitment to the just and fair functioning of the global order. As we have fought nepotism within states, we must fight nepotism on behalf of states to overcome humanity’s great common challenges. To moralize international relations, states can plausibly begin with reforming the world economy toward ending severe poverty, thereby building the trust and respect needed for more difficult reforms.
-
I argue that moral dialogue concerning an agent’s standing to blame facilitates moral understanding about the purported wrongdoing that her blame targets. Challenges to a blamer’s standing serve a communicative function: they initiate dialogue or reflection meant to align the moral understanding of the blamer and challenger. On standard accounts of standing to blame, challenges to standing facilitate shared moral understanding about the blamer herself: it matters per se whether the blamer has a stake in the purported wrongdoing at issue, is blaming hypocritically, or is complicit in the wrongdoing at issue. In contrast, I argue that three widely recognized conditions on standing to blame—the business, non-hypocrisy, and non-complicity conditions—serve as epistemically tractable proxies through which we evaluate the accuracy and proportionality of blame. Standing matters because, and to the extent that, it indirectly informs our understanding of the purported wrongdoing that an act of blaming targets.
-
I present an argument that undermines the standardly held view that chemical substances are natural kinds. This argument is based on examining the properties required to pick out members of these purported kinds. In particular, for a sample to be identified as, say, a member of the kind water, it has to be stable in the chemical sense of stability. However, the property of stability is artificially determined within chemical practice. This undermines the kindhood of substances, as they fail to satisfy one of two key requirements: namely, that they are picked out by (some) natural properties and that they are categorically distinct. This is a problem specifically for the natural realist interpretation of kinds. I discuss whether there are other ways to conceive of kinds in order to overcome it.
-
In his 1997 paper “Technology and Complexity”, Dasgupta draws a distinction between systematic and epistemic complexity. Entities are called systematically complex when they are composed of a large number of parts that interact in complicated ways. This means that even if one knows the properties of the parts, one may not be able to infer the behaviour of the system as a whole. In contrast, epistemic complexity refers to the knowledge that is used in, or generated by, the making of an artefact and is embodied in it. Interestingly, a high level of systematic complexity does not entail a high level of epistemic complexity, and vice versa.
-
What distinguishes genuine intelligence from sophisticated simulation? This paper argues that the answer lies in symbolic coherence—the structural capacity to interpret information, revise commitments, and maintain continuity of reasoning across contradiction. Current AI systems generate fluent outputs while lacking mechanisms to track their own symbolic commitments or resolve contradictions through norm-guided revision. This theory proposes F(S), a structural identity condition requiring interpretive embedding, reflexive situatedness, and internal normativity. This condition is substrate-neutral and applies to both biological and artificial systems. Unlike behavioral benchmarks, F(S) offers criteria for participation in symbolic reasoning rather than surface-level imitation. To demonstrate implementability, the paper presents a justification graph architecture that supports recursive coherence and transparent revision. A diagnostic scalar, symbolic density, tracks alignment over symbolic time. By uniting philosophical insights with concrete system design, this framework outlines foundations for machines that may one day understand rather than simulate understanding.
-
Several prominent scientists, philosophers, and scientific institutions have argued that science cannot test supernatural worldviews on the grounds that (1) science presupposes a naturalistic worldview (Naturalism) or that (2) claims involving supernatural phenomena are inherently beyond the scope of scientific investigation. The present paper argues that these assumptions are questionable and that indeed science can test supernatural claims. While scientific evidence may ultimately support a naturalistic worldview, science does not presuppose Naturalism as an a priori commitment, and supernatural claims are amenable to scientific evaluation. This conclusion challenges the rationale behind a recent judicial ruling in the United States concerning the teaching of “Intelligent Design” in public schools as an alternative to evolution and the official statements of two major scientific institutions that exert a substantial influence on science educational policies in the United States. Given that science does have implications concerning the probable truth of supernatural worldviews, claims should not be excluded a priori from science education simply because they might be characterized as supernatural, paranormal, or religious. Rather, claims should be excluded from science education when the evidence does not support them, regardless of whether they are designated as ‘natural’ or ‘supernatural’.
-
It has long been known that brain damage has negative effects on one’s mental states and alters (or even eliminates) one’s ability to have certain conscious experiences. Even centuries ago, a person would much prefer to suffer trauma to one’s leg, for example, than to one’s head. It thus stands to reason that when all of one’s brain activity ceases upon death, consciousness is no longer possible and so neither is an afterlife. It seems clear from all the empirical evidence that human consciousness is dependent upon the functioning of individual brains, which we might call the “dependence thesis.” Having a functioning brain is, at minimum, necessary for having conscious experience, and thus conscious experience must end when the brain ceases to function.
-
It has long been considered a truism that we can learn more from a variety of sources than from highly correlated sources. This truism is captured by the Variety of Evidence Thesis. To the surprise of many, this thesis turned out to fail in a number of Bayesian settings. In other words, replication can trump variation. Translating the thesis into the framework of imprecise probabilities (IP), we obtain two distinct, a priori plausible formulations in terms of ‘increased confirmation’ and ‘uncertainty reduction’, respectively. We investigate both formulations; each fails, for different parameters and for reasons that cannot be predicted prior to formal analysis. The emergence of two distinct formulations distinguishing confirmation increase from uncertainty reduction, which are conflated in the Bayesian picture, highlights fundamental differences between IP and Bayesian reasoning.
-
Suppose infinitely many blindfolded people, including yourself, are uniformly randomly arranged on positions one meter apart numbered 1, 2, 3, 4, …. Intuition: The probability that you’re on an even-numbered position is 1/2 and that you’re on a position divisible by four is 1/4. …
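One standard way to cash out these figures (a gloss under the assumption that the intuition appeals to natural density, since no countably additive uniform probability measure exists on $\{1,2,3,\dots\}$):
\[
d(k\mathbb{N}) = \lim_{n\to\infty} \frac{|\{\,m \le n : k \mid m\,\}|}{n} = \frac{1}{k},
\]
giving $1/2$ for the even-numbered positions ($k=2$) and $1/4$ for positions divisible by four ($k=4$).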
-
A general challenge in life is how to avoid being duped or exploited by clever-sounding but ultimately facile reasoning. One thing’s for sure: you don’t want to internalize the following norm:
(Easy Dupe): Whenever you hear an argument for doing X, and you can’t immediately refute it, you are thereby rationally committed to doing X. …
-
Many physicalists nowadays, and Bigelow for one, stand ready to carry metaphysical baggage when they find it worth the weight. This physicalist’s philosophy of mathematics is premised on selective, a posteriori realism about immanent universals. Bigelow’s universals, like D. M. Armstrong’s, are recurrent elements of the physical world; and mathematical objects are universals. The result is a thoroughgoing threefold realism: mathematical realism, scientific realism, and the realism that stands opposed to nominalism.
-
John Kay and Mervyn King (2020) propose a definition of radical uncertainty as a form of ontological uncertainty, rather than of epistemic uncertainty: the radical form is considered not to be resolvable. Their notion of radical uncertainty can be likened to the notion of the 'unknown unknowns', which refers to the aspects of uncertainty that are not readily apparent or quantifiable. Thus, instead of seeking solutions through probabilistic methods, Kay and King invite us to embrace this form of uncertainty and develop forms of reasoning within a framework where we do not know what we do not know, drawing inspiration from modal approaches to open futures (and even pasts).
-
Edward Craig’s function-first methodology says we can illuminate the concept of knowledge by asking what functions the concept evolved to fulfil. To do this, Craig imagines a fictional state of nature in which humans lacked the concept. Hilary Kornblith rejects every part of Craig’s methodology. He instead develops a naturalistic epistemology, according to which we should study knowledge—not its concept—through the scientific study of animal cognition.