1.
    “What's Really Wrong with ‘Luxury Beliefs’”: a critique of Rob Henderson, with a callback to Charles Murray. I’ve yet to meet Rob Henderson, but I’ve watched some of his cultural criticism, and he seems like a great guy. …
    Found 5 hours, 7 minutes ago on Bet On It
  2.
    Many philosophers hold that generics (i.e., unquantified generalizations) are pervasive in communication and that when they are about social groups, this may offend and polarize people because generics gloss over variations between individuals. Generics about social groups might be particularly common on Twitter (X). This remains unexplored, however. Using machine learning (ML) techniques, we therefore developed an automatic classifier for social generics, applied it to 1.1 million tweets about people, and analyzed the tweets. While it is often suggested that generics are ubiquitous in everyday communication, we found that most tweets (78%) about people contained no generics. However, tweets with generics received more “likes” and retweets. Furthermore, while recent psychological research may lead to the prediction that tweets with generics about political groups are more common than tweets with generics about ethnic groups, we found the opposite. However, consistent with recent claims that political animosity is less constrained by social norms than animosity against gender and ethnic groups, negative tweets with generics about political groups were significantly more prevalent and retweeted than negative tweets about ethnic groups. Our study provides the first ML-based insights into the use and impact of social generics on Twitter.
    Found 1 day, 1 hour ago on PhilSci Archive
  3.
    This paper argues that the extended mind approach to cognition can be distinguished from its alternatives, such as embedded cognition and distributed cognition, not only in terms of metaphysics, but also in terms of epistemology. In other words, it cannot be understood as a mere verbal redefinition of cognitive processing. This is because the extended mind approach differs in its theoretical virtues from competing approaches to cognition. The extended mind approach is thus evaluated in terms of its theoretical virtues, both those essential to empirical adequacy and those that are ideal desiderata for scientific theories. While the extended mind approach may be comparable to other approaches in internal consistency and empirical adequacy, it may be more problematic in terms of its generality, simplicity, and unificatory properties, due to the cognitive bloat and motley crew objections.
    Found 1 day, 2 hours ago on PhilSci Archive
  4.
    This article distinguishes between two different kinds of biological normativity. One is the ‘objective’ biological normativity of biological units discussed in anglophone philosophy of biology on the naturalization of such notions as function and pathology. The other is a ‘subjective’ biological normativity of the biological subject discussed in the continental tradition of Canguilhem and Goldstein. The existence of these two distinct kinds of biological normativity calls for a closer philosophical examination of their relationship. The aim of this paper is to address this omission in the literature and to initiate the construction of conceptual bridges that span the gaps between continental, analytic, and naturalist philosophy on biological normativity.
    Found 1 day, 2 hours ago on PhilSci Archive
  5.
    I continue my selective 5-year review of some of the posts revolving around the statistical significance test controversy from 2019. This post was first published on the blog on November 14, 2019. I feared then that many of the howlers of statistical significance tests would be further etched in granite after the ASA’s P-value project, and in many quarters this is, unfortunately, true. …
    Found 1 day, 10 hours ago on D. G. Mayo's blog
  6.
    A standard line of iambic pentameter is five “feet,” each of which is an “iamb”—an unstressed, then a stressed syllable. Or so says the classical theory of English meter. Similarly, trochaic tetrameter is four trochees (stressed-then-unstressed). …
    Found 1 day, 15 hours ago on Mostly Aesthetics
  7.
    I.J. Good’s “On the Principle of Total Evidence" (1967) looms large in decision theory and Bayesian epistemology. Good proves that in Savage’s (1954) decision theory, a coherent agent always prefers to collect, rather than ignore, free evidence. It is now well known that Good’s result was prefigured in an unpublished note by Frank Ramsey (Skyrms 2006). The present paper highlights another early forerunner to Good’s argument, appearing in Janina Hosiasson’s “Why do We Prefer Probabilities Relative to Many Data?" (1931), that has been neglected in the literature. Section 1 reviews Good’s argument and the problem it was meant to resolve; call this the value of evidence problem. Section 2 offers a brief history of the value of evidence problem and provides biographical background to contextualize Hosiasson’s contribution. Section 3 explicates the central argument of Hosiasson’s paper and considers its relationship to Good’s (1967).
    Found 1 day, 18 hours ago on PhilSci Archive
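Good's result—that a coherent agent always prefers free evidence to none—can be seen in a toy decision problem. The states, acts, payoffs, and signal reliability below are my own illustration, not taken from the paper:

```python
from fractions import Fraction as F

# Two equiprobable states and two acts; act a_i pays 1 in state s_i, else 0.
prior = {"s1": F(1, 2), "s2": F(1, 2)}
payoff = {("a1", "s1"): 1, ("a1", "s2"): 0,
          ("a2", "s1"): 0, ("a2", "s2"): 1}

def best_eu(belief):
    """Expected utility of the best act, given a credence over states."""
    return max(sum(belief[s] * payoff[(a, s)] for s in belief)
               for a in ("a1", "a2"))

# A free, 80%-reliable signal: P(e_i | s_i) = 4/5.
likelihood = {("e1", "s1"): F(4, 5), ("e1", "s2"): F(1, 5),
              ("e2", "s1"): F(1, 5), ("e2", "s2"): F(4, 5)}

def posterior(e):
    """Bayesian update of the prior on observing signal e."""
    joint = {s: prior[s] * likelihood[(e, s)] for s in prior}
    total = sum(joint.values())
    return {s: p / total for s, p in joint.items()}

p_e = {e: sum(prior[s] * likelihood[(e, s)] for s in prior)
       for e in ("e1", "e2")}

eu_ignore = best_eu(prior)                                   # decide now
eu_learn = sum(p_e[e] * best_eu(posterior(e)) for e in p_e)  # look first

print(eu_ignore, eu_learn)  # 1/2 4/5 -- looking first is never worse
```

With exact fractions the comparison is 4/5 against 1/2: expected utility after free evidence weakly dominates, which is the pattern Good proved holds in general for coherent agents.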
  8.
    We’ve been hard at work here in Edinburgh. Kris Brown has created Julia code to implement the ‘stochastic C-set rewriting systems’ I described last time. I want to start explaining this code, along with examples of how we use it. …
    Found 2 days, 4 hours ago on Azimuth
  9.
    Judgment-aggregation theory has always focused on the attainment of rational collective judgments. But so far, rationality has been understood in static terms: as coherence of judgments at a given time, defined as consistency, completeness, and/or deductive closure. This paper asks whether collective judgments can be dynamically rational, so that they change rationally in response to new information. Formally, a judgment aggregation rule is dynamically rational with respect to a given revision operator if, whenever all individuals revise their judgments in light of some information (a learnt proposition), then the new aggregate judgments are the old ones revised in light of this information, i.e., aggregation and revision commute. We prove an impossibility theorem: if the propositions on the agenda are non-trivially connected, no judgment aggregation rule with standard properties is dynamically rational with respect to any revision operator satisfying some basic conditions. Our theorem is the dynamic-rationality counterpart of some well-known impossibility theorems for static rationality. We also explore how dynamic rationality might be achieved by relaxing some of the conditions on the aggregation rule and/or the revision operator. Notably, premise-based aggregation rules are dynamically rational with respect to so-called premise-based revision operators.
    Found 2 days, 10 hours ago on PhilSci Archive
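The abstract's positive result—premise-based aggregation is dynamically rational with respect to premise-based revision—can be checked by brute force on a toy agenda. The agenda {p, q, p∧q}, the three-voter profiles, and the revision operator below are my own minimal sketch, not the paper's formal framework:

```python
from itertools import product

# Toy agenda: premises p, q and derived conclusion r = p AND q.
# A judgment set is a pair (p, q) in {0,1}^2; the conclusion is computed.

def conclusion(j):
    p, q = j
    return p and q

def premise_majority(profile):
    """Premise-based aggregation: majority vote on each premise,
    then derive the conclusion (odd-sized profiles, so no ties)."""
    n = len(profile)
    return tuple(int(sum(j[i] for j in profile) > n / 2) for i in range(2))

def revise(j, learned):
    """Premise-based revision: set the learned premise to true,
    leave the other premise alone, and rederive the conclusion."""
    return tuple(1 if i == learned else j[i] for i in range(2))

# Commutation check: aggregate-then-revise == revise-then-aggregate,
# for every 3-voter profile and each learnable premise.
for profile in product([(0, 0), (0, 1), (1, 0), (1, 1)], repeat=3):
    for learned in (0, 1):
        a = revise(premise_majority(profile), learned)
        b = premise_majority(tuple(revise(j, learned) for j in profile))
        assert a == b and conclusion(a) == conclusion(b)

print("premise-based aggregation and revision commute on all 3-voter profiles")
```

The check passes because setting a premise to true for every individual forces its majority to true, while the other premise's majority is untouched—exactly the order-independence that the impossibility theorem rules out for richer aggregation rules.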
  10.
    We propose an approach to the evolution of joint agency and cooperative behavior that contrasts with views that take joint agency to be a uniquely human trait. We argue that there is huge variation in cooperative behavior and that while much human cooperative behavior may be explained by invoking cognitively rich capacities, there is cooperative behavior that does not require such explanation. On both comparative and theoretical grounds, complex cognition is not necessary for forms of joint agency, or the evolution of cooperation. As a result, promising evolutionary approaches to cooperative behavior should explain how it arises across many contexts.
    Found 2 days, 10 hours ago on PhilSci Archive
  11.
    In quantum mechanics, we appeal to decoherence as a process that explains the emergence of a quasi-classical order. Decoherence has no classical counterpart. Moreover, it is an apparently irreversible process [1–7]. In this paper, we investigate the nature and origin of its irreversibility. Decoherence and quantum entanglement are two physical phenomena that tend to go together. The former relies on the latter, but the reverse is not true. One can imagine a simple bipartite system in which two microscopic subsystems are initially unentangled and become entangled at the end of the interaction. Decoherence does not occur, since neither system is macroscopic. Nevertheless, we will still need to quantify entanglement in order to describe the arrow of time associated with decoherence, because it occurs when microscopic systems become increasingly entangled with the degrees of freedom in their macroscopic environments. To do this we need to define entanglement entropy in terms of the sum of the von Neumann entropies of the subsystems.
    Found 2 days, 10 hours ago on PhilSci Archive
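The closing definition—entanglement entropy as the sum of the subsystems' von Neumann entropies—can be made concrete for two qubits. The states chosen (a Bell state versus a product state) are my own illustration of the definition, using NumPy:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 * ln 0 = 0 by convention
    return float(-np.sum(evals * np.log(evals)))

def reduced_states(psi):
    """Reduced density matrices of a two-qubit pure state |psi>."""
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)  # indices a,b,a',b'
    rho_A = np.einsum('abcb->ac', rho)    # trace out qubit B
    rho_B = np.einsum('abad->bd', rho)    # trace out qubit A
    return rho_A, rho_B

# Maximally entangled Bell state (|00> + |11>)/sqrt(2):
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_A, rho_B = reduced_states(bell)
S_entangled = von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B)

# Unentangled product state |00>: both subsystem entropies vanish.
prod_state = np.array([1, 0, 0, 0], dtype=float)
rho_A0, rho_B0 = reduced_states(prod_state)
S_product = von_neumann_entropy(rho_A0) + von_neumann_entropy(rho_B0)

print(S_entangled, S_product)  # 2 ln 2 for the Bell state, 0 for |00>
```

The Bell state's reduced states are maximally mixed, so each contributes ln 2 and the sum is 2 ln 2; for the product state both reduced states are pure and the sum is zero, matching the abstract's point that the measure tracks how entangled a system is with its environment.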
  12.
    A striking feature of our world is that we only seem to have records of the past. To explain this ‘record asymmetry’, Albert and Loewer claim that the Past Hypothesis induces a narrow probability density over the world’s possible past macrohistories, but not its future macrohistories. Because we’re indirectly acquainted with this low-entropy initial macrostate, our observations of records allow us to exploit the associated narrow density to infer the past. I will argue that Albert and Loewer cannot make sense of why this probabilistic structure exists without falling back on the very records they wish to explain. To avoid this circularity, I offer an alternative account: the ‘fork asymmetry’ explains the record asymmetry, and this in turn explains the narrow density, not vice versa.
    Found 2 days, 10 hours ago on PhilSci Archive
  13.
    Duality in the Exact Sciences: The Application to Quantum Mechanics.
    Found 2 days, 10 hours ago on PhilSci Archive
  14.
    It has recently been remarked that the argument for physicalism from the causal closure of the physical is incomplete. It is only effective against mental causation manifested in the action of putative mental forces that lead to acceleration of particles in the nervous system. Based on consideration of anomalous, physically unaccounted-for correlations of neural events, I argue that irreducible mental causation whose nature is at least prima facie probabilistic is conceivable. The manifestation of such causation should be accompanied by a local violation of the Second Law of thermodynamics. I claim that mental causation can be viewed as the disposition of mental states to alter the state probability distribution within the nervous system, with no violation of the conservation laws. If confirmed by neurophysical research, it would indicate a kind of causal homogeneity of the world. Causation would manifest probabilistically in both quantum mechanical and psychophysical systems, and the dynamics of both would be determined by the temporal evolution of the corresponding system state function. Finally, I contend that a probabilistic account of mental causation can consistently explain the character of the selectional states that ensure uniformity of causal patterns, as well as the fact that different physical realizers of a mental property cause the same physical effects in different contexts.
    Found 2 days, 10 hours ago on PhilSci Archive
  15.
    Pregnancy and birth can be approached from many philosophical angles, including philosophy of law, philosophy of biology, and mereology. Some authors have focused on ethical issues surrounding abortion and assisted reproduction, others have discussed pregnancy in phenomenological terms, and others have used pregnancy and/or birth as a springboard for more theoretical reflections on the nature of selfhood, care, embodiment, and personal identity (see entries on feminist perspectives on reproduction and the family, parenthood and procreation, and the grounds of moral status for discussions of these and related issues).
    Found 2 days, 11 hours ago on Stanford Encyclopedia of Philosophy
  16.
    TLDR: The vibes are bad, even though—on most ways we can measure—things are (comparatively) good. The last post showed how disproportionately-negative sharing can emerge from trying to solve problems. …
    Found 2 days, 13 hours ago on Stranger Apologies
  17.
    Anyone who has lived abroad knows the frustration of being held liable for the misdeeds of your country. Israelis get grilled about Palestine, Chinese people receive disbelief over Xinjiang, Britons are berated for colonialism. ‘It’s not my fault!’ some are tempted to reply. ‘I attend protests; or I am politically repressed; or I wasn’t even born yet!’ Sometimes, the effects of our states’ wrongdoings hit us materially. When states pay compensation to the victims of their wrongdoings, these payments almost always detract from what would otherwise be enjoyed by those living in the state. Is this effect justified?
    Found 2 days, 21 hours ago on Stephanie Collins's site
  18.
    Scientific reasoning involves complex argumentation patterns that eventually lead to scientific discoveries. Social epistemology of science provides a perspective on the scientific community as a whole and on its collective knowledge acquisition. Different techniques have been employed with the goal of maximizing scientific knowledge at the group level. These techniques include formal models and computer simulations of scientific reasoning and interaction. Still, these models have mainly tested abstract hypothetical scenarios. The present thesis instead presents data-driven approaches in social epistemology of science. A data-driven approach requires collecting and curating data for further use, which can include creating empirically calibrated models and simulations of scientific inquiry, performing statistical analyses, or employing data-mining techniques and other procedures.
    Found 2 days, 22 hours ago on Vlasta Sikimić's site
  19.
    Suppose that a fair coin has been flipped in my absence. If it’s heads, there is an independent 50% chance that I will be irresistibly brainwashed tonight after I go to bed in a way that permanently forces my credence in heads to zero. …
    Found 3 days, 1 hour ago on Alexander Pruss's Blog
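One quantity derivable from the stated setup (my own computation, not necessarily the post's conclusion, since the excerpt is truncated): waking up un-brainwashed is itself evidence about the coin, because brainwashing only happens on heads.

```python
from fractions import Fraction as F

# Setup from the post: fair coin; if heads, an independent 1/2 chance
# of being brainwashed tonight (credence in heads forced to zero).
p_heads = F(1, 2)
p_brainwash_given_heads = F(1, 2)

# Joint probabilities over (coin outcome, brainwashed or not):
p_heads_and_brainwashed = p_heads * p_brainwash_given_heads   # 1/4
p_heads_and_free = p_heads * (1 - p_brainwash_given_heads)    # 1/4
p_tails_and_free = 1 - p_heads                                # 1/2, no brainwashing on tails

# Conditioning on waking up un-brainwashed:
p_free = p_heads_and_free + p_tails_and_free                  # 3/4
p_heads_given_free = p_heads_and_free / p_free

print(p_heads_given_free)  # 1/3: surviving the night un-brainwashed favors tails
```

So a Bayesian who knows the setup and finds their credence intact should drop their credence in heads from 1/2 to 1/3—the kind of tension between conditionalization and forced credences that the puzzle trades on.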
  20.
    This can be pronounced as the claim that necessarily everything necessarily exists, if it is kept in mind that ‘exists’ is understood in terms of quantification and identity. The position is called first-order necessitism because the quantification in question is first-order, i.e. quantification into the syntactic position of singular terms. First-order contingentism is the view that it is a contingent matter what individuals there are. Thus, first-order contingentists assert the negation of (Nec), which amounts to the claim that possibly something could have failed to exist. Of course, most first-order contingentists also believe the stronger claim that in fact (not merely possibly) many (not just some) individuals could have failed to exist.
    Found 3 days, 2 hours ago on Lukas Skiba's site
  21.
    Alice is confused about the nature of practical rationality and asks the wrong philosopher about it. She is given this advice: - For each of your options, consider all the potential pleasures and pains for you that could result from the option. …
    Found 3 days, 3 hours ago on Alexander Pruss's Blog
  22.
    Commentary from Tina Röck on today’s post from Mazviita Chirimuuta on The Brain Abstracted (MIT Press). One way to read this book is to consider it a discussion of the limitations in our ability to understand hyper-complex, dynamic objects like the brain. …
    Found 3 days, 4 hours ago on The Brains Blog
  23.
    Post 5 of 5 from Mazviita Chirimuuta on The Brain Abstracted (Open Access: MIT Press). The last of this series of posts summarises the conclusions regarding philosophy of science more generally that emerge from this study of simplification in neuroscience. …
    Found 3 days, 8 hours ago on The Brains Blog
  24.
    In this chapter, I discuss time in nonrelativistic quantum theories. Within an instrumentalist theory like von Neumann’s axiomatic quantum mechanics, I focus on the meaning of time as an observable quantity, on the idea of time quantization, and whether the wavefunction collapse suggests that there is a preferred temporal direction. I explore this last issue within realist quantum theories as well, focusing on time reversal symmetry, and I analyze whether some theories are more hospitable for time travel than others.
    Found 3 days, 14 hours ago on Valia Allori's site
  25.
    This is a brief review of the history and development of quantum theories. Starting from the experimental findings and theoretical results which marked the crisis of the classical framework, I overview the rise of axiomatic quantum mechanics through matrix and wave mechanics. I discuss conceptual problems such as the measurement problem that led scientific realists to explore other, more satisfactory, quantum theories, as well as Bell’s theorem and quantum nonlocality, concluding with a short review of relativistic theories.
    Found 3 days, 14 hours ago on Valia Allori's site
  26.
    Ross Pain (University of Bristol, Philosophy). Keywords: interventionism, transitions in human evolution, cultural complexity, causation, single-factor explanations. Transitions in human evolution (e.g., the appearance of a novel technological industry) are typically complex events involving change at both spatial and temporal scales. As such, we expect them to have multiple causes. Yet it is commonplace for theorists to prioritise a single causal factor (e.g., cognitive change) in explaining these events. One rationale for this is pragmatic: theorists are specialised in a particular area—say, lithics or cognitive psychology—and so focus on one particular cause, holding all others equal. But could single-factor explanations ever be justified on objective grounds? In this article, we explore this latter idea using a highly influential theory of causation from the philosophy of science literature; namely, interventionism. This theory defines causation in a minimal way, and then draws a range of distinctions among causes, producing a range of different causal concepts. We outline some of these distinctions and show how they can be used to articulate when privileging one cause among many is objectively justified—and, by extension, when it is not. We suggest the interventionist theory of causation is thus a useful tool for theorists developing causal explanations for human behavioural evolution.
    Found 3 days, 18 hours ago on PhilSci Archive
  27.
    We propose a pluralist account of content for predictive processing systems. Our pluralism combines Millikan’s teleosemantics with existing structural resemblance accounts. The paper has two goals. First, we outline how a teleosemantic treatment of signal passing in predictive processing systems would work, and how it integrates with structural resemblance accounts. We show that the core explanatory motivations and conceptual machinery of teleosemantics and predictive processing mesh together well. Second, we argue this pluralist approach expands the range of empirical cases to which the predictive processing framework might be successfully applied. This is because our pluralism is practice-oriented. A range of different notions of content are used in the cognitive sciences to explain behaviour, and some of these cases look to employ teleosemantic notions. As a result, our pluralism gives predictive processing the scope to cover these cases.
    Found 3 days, 18 hours ago on PhilSci Archive
  28.
    Drew Leder’s new book, The Healing Body, provides rich descriptions and analyses of ways to live well when faced with bodily afflictions, such as pain, illness, impairment, and aging. “Healing,” in Leder’s sense, goes beyond medical “treatment” of bodily dysfunctions as it aspires to regain existential wholeness “with reintegration of various dimensions of life that have been torn asunder by bodily breakdown” (2024, 27). The book belongs to a broader stream of phenomenological accounts of embodiment and illness that has thrived over the last few decades. In contrast to most contributions in this field that focus on bodily breakdowns, sometimes providing alternative understandings to biomedical accounts, Leder’s new book looks to the possibilities of existential recovery, especially where medical treatment has nothing more to offer. To anyone acquainted with this field, Leder should already be well known. In fact, the present book completes the trilogy that he has been working on for more than thirty years. Leder’s most famous book, The Absent Body (1990), is the first book in the trilogy and is a phenomenological account of the absence and presence of our lived bodies. Twenty-six years later, the second book appeared, The Distressed Body (2016), which circles in on chronic pain, illness, and incarceration. Turning to healing in his third book, Leder ends his trilogy in a hopeful key.
    Found 3 days, 18 hours ago on PhilSci Archive
  29.
    Robert Chapman’s Empire of Normality: Neurodiversity and Capitalism (2023) charts thinking about normality and pathology, showing how both have links to the historical conditions set by capitalism. The book also outlines a Marxist notion of neurodiversity, which rejects liberal capitalism. In this review, I outline Chapman’s argument and highlight its strengths. I also explore a potential consequence of an issue Chapman does not address in the book, one significant for notions of neurodiversity: if we reject liberal capitalism, we might also need to reject a key assumption of liberal capitalism; namely, that people have good self-understanding.
    Found 3 days, 18 hours ago on PhilSci Archive
  30.
    Advocates of philosophy in science and biomedicine argue that philosophers can embed their ideas into scientific research in order to help solve scientific problems (Pradeu et al. 2021). One successful example of this is the philosopher Thomas Pradeu’s essay, with Sébastien Jaeger and Eric Vivier, titled “The Speed of Change: Towards a Discontinuity Theory of Immunity?” published in Nature Reviews Immunology (2013). For my PhD in philosophy of science on Alzheimer’s disease embedded in a neurology environment, I was interested in the relationship between theory and practice, with a particular focus on the dominant “amyloid cascade hypothesis” of Alzheimer’s disease that has existed since the turn of the 1990s (Hardy and Higgins 1992; Hardy 2006; Herrup 2015; Kepp et al. 2023). According to this hypothesis, one of the brain proteins that defines Alzheimer’s disease—beta-amyloid—also causes it when it accumulates (Hardy and Higgins 1992). Thus, according to the hypothesis’s proponents, removing amyloid from the brain should be the priority for developing therapeutics. However, given the absence of effective treatments for Alzheimer’s disease based on this strategy, I was interested in whether this hypothesis represented a premature convergence of consensus around an untrue idea of what causes disease.
    Found 3 days, 18 hours ago on PhilSci Archive