1.
    When is it explanatorily better to adopt a conjunction of explanatory hypotheses as opposed to committing to only some of them? Although conjunctive explanations are inevitably less probable than less committed alternatives, we argue that the answer is not ‘never’. This paper provides an account of the conditions under which explanatory considerations warrant a preference for less probable, conjunctive explanations. After setting out four formal conditions that must be met by such an account, we consider the shortcomings of several approaches. We develop an account that avoids these shortcomings and then defend it by applying it to a well-known example of explanatory reasoning in contemporary science.
    Found 1 week, 1 day ago on PhilSci Archive
  2.
    Existing metaphysical accounts of mechanisms commit to the existence of objects or entities posited in scientific theories, and thus fall within the category of maximal metaphysics. In this paper, I demonstrate the incompatibility of object-based metaphysics of mechanisms with the prevailing trend in the philosophy of physics by discussing the so-called bottoming-out problem. In response, I propose and flesh out a structuralist metaphysics of mechanisms based on Ontic Structural Realism (OSR), which is a kind of minimal metaphysics. I argue that the metaphysical underpinnings of mechanisms are structures, whose metaphysical nature is elaborated through comparison with existing metaphysical theories of mechanisms. After that, I address the concern of whether objects in mechanisms can be accommodated in my account by invoking structuralists’ existing metaphysical treatments of objects in the special sciences, such as Ladyman and Ross’s (2007) real-pattern account, and by suggesting a potential alignment between OSR and a processual ontology. Finally, I demonstrate how my view can naturally serve as the metaphysics for Mechanism 2.0 and be applied to systems biology.
    Found 1 week, 1 day ago on PhilSci Archive
  3.
    Analyzing the doctrine of methodological individualism and its opposition to methodological holism, I start by briefly reviewing three historical periods in which the discussion around it was especially lively (i.e., the turn of the century around 1900, the 1950s, and the 1980s-90s), explicating the variety of characterizations of methodological individualism. To highlight the connection of these philosophical discussions to social-scientific practice and to intradisciplinary as well as interdisciplinary dynamics, I then look into the debates around microfoundations (in the 1980s) and so-called economics imperialism in the social sciences (and its impact within political science in the 1990s). While both the microfoundations project and economics imperialism are driven by methodological individualists, here too it seems important to pay attention to the variety of understandings of methodological individualism as well as to the exact ambition of their undertakings (evaluated along different dimensions, i.e., ontological, epistemological, axiological, and institutional). Overall, I conclude that when looking into the history of the social sciences, the best philosophers can do when discussing methodological individualism and holism is to carefully distinguish the broad variety of positions that can be evaluated along multiple dimensions, thereby demonstrating the benefits of methodological pluralism as well as exposing the downsides of methodological monocultures.
    Found 1 week, 1 day ago on PhilSci Archive
  4.
    In this work we argue against the interpretation that underlies the “Standard” account of Quantum Mechanics (SQM) that was established during the 1930s by Niels Bohr and Paul Dirac. Ever since, following this orthodox narrative, physicists have dogmatically proclaimed, quite regardless of the deep contradictions and problems, that the theory of quanta describes a microscopic realm composed of elementary particles (such as electrons, protons and neutrons) which underlie our macroscopic world composed of tables, chairs and dogs. After critically addressing this atomist dogma, still present today in contemporary (quantum) physics and philosophy, we present a new understanding of quantum individuals, defined as the minimum set of relations within a specific degree of complexity capable of accounting for all relations within that same degree. In this case, quantum individuality is not conceived in absolute terms but, instead, as an objectively relative concept which, even though it depends on the choice of bases and factorizations, remains nonetheless part of the same invariant representation.
    Found 1 week, 1 day ago on PhilSci Archive
  5.
    A common assumption in discussions of abilities is that phobias restrict an agent's abilities. Arachnophobics, for example, can't pick up spiders. I wonder if this is true, if we're talking about the pure 'can' of ability. …
    Found 1 week, 1 day ago on wo's weblog
  6.
    One more observation on Integrated Information Theory (IIT), in Aaronson’s simplified formulation. Let R_{M,N} be a wide rectangular grid of points (x, y) with x and y integers such that 0 ≤ x < M and 0 ≤ y < N. Suppose M ≫ 4N and M is divisible by four. …
    Found 1 week, 1 day ago on Alexander Pruss's Blog
  7.
    Discourse involving predicates of personal taste (PPT) such as ‘delicious,’ ‘disgusting,’ ‘fun,’ and ‘cool’ has been a focal point in a large, interdisciplinary body of research spanning the past 20 years. This research has shown that PPT are connected to numerous topics, including disagreement, meaning, context-sensitivity, subjectivity and objectivity, truth, aesthetic and gustatory taste, evaluation, speech acts, and so on. Researchers involved in the PPT debates have developed many subtle and inventive analyses of PPT, so that anyone interested in their behaviour must traverse a complex theoretical landscape. Despite the massive amount of work on the topic, there is a crucial methodological question about PPT that remains underexplored: what sorts of evidence should be called upon to evaluate an analysis of PPT? So far, most researchers have operated from the armchair, using their own intuitions about various linguistic phenomena to evaluate analyses of PPT. In recent years, however, certain philosophers and linguists have found this method wanting, noting that hypotheses about PPT are empirical, and thus need to be evaluated empirically.
    Found 1 week, 1 day ago on Jeremy Wyatt's site
  8.
    Standard textbooks on quantum mechanics present the theory in terms of Hilbert spaces over the field of complex numbers and complex linear operator algebras acting on these spaces. What would be lost (or gained) if a different scalar field, e.g. the real numbers or the quaternions, were used? This issue arose with the birthing of the new quantum theory, and over the decades it has been raised over and over again, drawing a variety of different opinions. Here I attempt to identify and to clarify some of the key points of contention, focusing especially on procedures for complexifying real Hilbert spaces and real algebras of observables.
    Found 1 week, 1 day ago on PhilSci Archive
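    To make the complexification procedures mentioned above concrete, here is one standard textbook construction, offered as a sketch rather than as the specific procedure the paper examines: a real Hilbert space H is extended to H_C = H ⊕ H, with the pair (x, y) read as x + iy, and with complex scalar multiplication and inner product defined by
```latex
% One standard complexification of a real Hilbert space H (sketch):
% take H_C = H \oplus H and read the pair (x, y) as x + i y.
\[
  (a + i b)(x, y) = (a x - b y,\; b x + a y), \qquad a, b \in \mathbb{R},
\]
\[
  \big\langle (x_1, y_1), (x_2, y_2) \big\rangle_{\mathbb{C}}
    = \langle x_1, x_2 \rangle + \langle y_1, y_2 \rangle
    + i \big( \langle x_1, y_2 \rangle - \langle y_1, x_2 \rangle \big).
\]
% This form is conjugate-linear in the first slot, complex-linear in the second,
% and restricts to the original real inner product on vectors of the form (x, 0).
```
    A real algebra of observables can be complexified in the same componentwise fashion; procedures of this sort are presumably among those the paper scrutinizes.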
  9.
    Teleparallel gravity shares many qualitative features with general relativity, but differs from it in the following way: whereas in general relativity, gravitation is a manifestation of space-time curvature, in teleparallel gravity, spacetime is (always) flat. Gravitational effects in this theory arise due to spacetime torsion. It is often claimed that teleparallel gravity is an equivalent reformulation of general relativity. In this paper we question that view. We argue that the theories are not equivalent, by the criterion of categorical equivalence and any stronger criterion, and that teleparallel gravity posits strictly more structure than general relativity.
    Found 1 week, 1 day ago on PhilSci Archive
  10.
    Kuhn’s analysis of the structure and function of the scientific community has recently been re-interpreted as a seminal contribution to the so-called social epistemology of science. Kuhn’s social epistemology should be considered as part of a normative-descriptive philosophical framework in which epistemological, historical, sociological, and psychological elements are interconnected. In this chapter, I will compare Kuhn’s seminal insights with two contemporary approaches to the social epistemology of science, namely: the development of idealised formal models of the scientific community and the use of qualitative studies for philosophical purposes. On the one hand, these contemporary approaches to social epistemology may be regarded as developing some of Kuhn’s views in new and exciting ways. On the other hand, however, it is still not entirely clear which kind of general philosophical ‘image of science’ they are contributing to. This chapter, therefore, aims to illuminate how analysing some of the contemporary debates in social epistemology through the lens of Kuhn’s philosophy may cast new light on the value of studying the social dimension of scientific research for general philosophy of science.
    Found 1 week, 1 day ago on PhilSci Archive
  11.
    This paper analyses the phenomenology and epistemology of chatbots such as ChatGPT and Bard. The computational architecture underpinning these chatbots is that of large language models (LLMs): generative artificial intelligence (AI) systems trained on a massive dataset of text extracted from the Web. We conceptualise these LLMs as multifunctional computational cognitive artifacts, used for various cognitive tasks such as translating, summarizing, answering questions, information-seeking, and much more. Phenomenologically, LLMs can be experienced as a “quasi-other”; when that happens, users anthropomorphise them. For most users, current LLMs are black boxes, i.e., for the most part, they lack data transparency and algorithmic transparency. They can, however, be phenomenologically and informationally transparent, in which case there is an interactional flow. Anthropomorphising and interactional flow can, in some users, create an attitude of (unwarranted) trust towards the output LLMs generate. We conclude this paper by drawing on the epistemology of trust and testimony to examine the epistemic implications of these dimensions. Whilst LLMs generally generate accurate responses, we observe two epistemic pitfalls. Ideally, users should be able to match the level of trust that they place in LLMs to the degree that LLMs are trustworthy. However, both their data and algorithmic opacity and their phenomenological and informational transparency can make it difficult for users to calibrate their trust correctly. The effects of these limitations are twofold: users may adopt unwarranted attitudes of trust towards the outputs of LLMs (which is particularly problematic when LLMs hallucinate), and the trustworthiness of LLMs may be undermined.
    Found 1 week, 2 days ago on Matteo Colombo's site
  12.
    What should morally conscientious agents do if they must choose among options that are somewhat right and somewhat wrong? Should you select an option that is right to the highest degree, or would it perhaps be more rational to choose randomly among all somewhat right options? And how should lawmakers and courts address behavior that is neither entirely right nor entirely wrong? In this first book-length discussion of the “gray area” in ethics, Martin Peterson challenges the assumption that rightness and wrongness are binary properties. He argues that some acts are neither entirely right nor entirely wrong, but rather a bit of both. Including discussions of white lies and the permissibility of abortion, Peterson’s book presents a gradualist theory of right and wrong designed to answer pressing practical questions about the gray area in ethics.
    Found 1 week, 2 days ago on Martin Peterson's site
  13.
    Metric verse sticks in the mind. If “gather ye rosebuds while ye may” isn’t lodged in your memory, something else iambic is: “Friends, Romans, countrymen,” or “to be or not to be.” But there are other means to this end. …
    Found 1 week, 2 days ago on Mostly Aesthetics
  14.
    Suppose that an informant (test, expert, device, perceptual system, etc.) is unlikely to err when pronouncing on a particular subject matter. When this is so, it might be tempting to defer to that informant when forming beliefs about that subject matter. How is such an inferential process expected to fare in terms of truth (leading to true beliefs) and evidential fit (leading to beliefs that fit one’s total evidence)? Using a medical diagnostic test as an example, we set out a formal framework to investigate this question. We establish seven results and make one conjecture. The first four results show that when the test’s error probabilities are low, the process of deferring to the test can score well in terms of (i) both truth and evidential fit, (ii) truth but not evidential fit, (iii) evidential fit but not truth, or (iv) neither truth nor evidential fit. Anything is possible. The remaining results and conjecture generalize these results in certain ways. These results are interesting in themselves—especially given that the diagnostic test is not sensitive to the target disease’s base rate—but also have broader implications for the more general process of deferring to an informant. Additionally, our framework and diagnostic example can be used to create test cases for various reliabilist theories of inferential justification. We show, for example, that they can be used to motivate evidentialist process reliabilism over process reliabilism.
    Found 1 week, 3 days ago on Michael Roche's site
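    The tension between truth and evidential fit described in the abstract above can be made vivid with a toy Bayes calculation (a minimal sketch with hypothetical numbers, not the paper’s framework or results): a test with low error probabilities yields mostly true beliefs if one simply defers to it, yet a positive result may still fail to make the disease probable on one’s total evidence when the base rate is very low.
```python
# Toy diagnostic-test example (hypothetical numbers, for illustration only).
sensitivity = 0.99   # P(positive | disease)
specificity = 0.99   # P(negative | no disease)
base_rate   = 0.001  # P(disease) in the tested population

# "Deferral" policy: believe whatever the test says.
# Probability that the resulting belief is true:
p_true_belief = base_rate * sensitivity + (1 - base_rate) * specificity
print(f"P(belief is true under deferral) = {p_true_belief:.4f}")   # 0.9900

# Evidential fit: posterior probability of disease given a positive result.
p_positive = base_rate * sensitivity + (1 - base_rate) * (1 - specificity)
posterior = base_rate * sensitivity / p_positive
print(f"P(disease | positive) = {posterior:.4f}")                  # ~0.0902
```
    With these numbers, deferring to the test is truth-conducive overall, while believing ‘disease’ on a positive result fits the total evidence poorly (the posterior is roughly 9%), which is the flavour of a truth-without-evidential-fit case.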
  15.
    In The Yale Review, Lydia Davis writes about seeing the dark: Absolute, unbroken darkness feels like one massive, enveloping substance, though it is not a substance and is not palpable. It feels close to the face, right up against the face. …
    Found 1 week, 3 days ago on Under the Net
  16.
    Ordinal, interval, and ratio scales are discussed, and arguments for the thesis that “better than” comparisons reside on interval or ratio scales are laid out. It is argued that linguistic arguments are not conclusive since alternative rank-based definitions can be given, and that in general “better than” comparisons do not have a common scale type. Some comparison dimensions reside on ratio scales, whereas others show no evidence of lying on a scale stronger than an ordinal scale.
    Found 1 week, 3 days ago on Erich Rast's site
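    For readers who want the measurement-theoretic background the abstract presupposes, here is a small sketch of the standard facts about admissible transformations (illustrative numbers of my own, not from the paper): ordinal scales are preserved by any strictly increasing transformation, interval scales by positive affine transformations, and ratio scales by positive scalings, which is why ratios of values are meaningful only on ratio scales and ratios of differences only on interval or ratio scales.
```python
import math

# Hypothetical "betterness" values assigned to three options.
values = {"a": 2.0, "b": 4.0, "c": 8.0}

# Ordinal scale: any strictly increasing transformation is admissible.
ordinal = {k: math.log(v) for k, v in values.items()}      # order preserved
print(sorted(values, key=values.get) == sorted(ordinal, key=ordinal.get))  # True

# Interval scale: positive affine transformations (v -> 3v + 5) are admissible;
# they preserve ratios of differences but not ratios of values.
interval = {k: 3 * v + 5 for k, v in values.items()}
print((values["c"] - values["b"]) / (values["b"] - values["a"]))          # 2.0
print((interval["c"] - interval["b"]) / (interval["b"] - interval["a"]))  # 2.0
print(values["c"] / values["a"], interval["c"] / interval["a"])           # 4.0 vs ~2.64

# Ratio scale: only positive scalings (v -> 3v) are admissible; value ratios survive.
ratio = {k: 3 * v for k, v in values.items()}
print(values["c"] / values["a"], ratio["c"] / ratio["a"])                 # 4.0 and 4.0
```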
  17.
    In this paper we establish an Epistemological Thesis based on Bohrian thought, constituted of three different claims: the continuity claim, the classicality claim, and the limiting claim. The thesis is founded on a notion of physicality as spatiotemporality, which is used to show the necessity of applying classical concepts in physical descriptions within physical theories. Further, various views on the metaphysics of the wavefunction are analysed in view of this notion of physicality, along with the implied necessity of the classical conceptual framework. These approaches to the metaphysics of the wavefunction, necessitated by non-locality, are seen as the basis of the limit of classical physical description, and therefore of the description of quantum phenomena. In view of the established thesis, two more complete alternatives to Bohrian thought, Bohmian Mechanics and GRW theory, are analysed, and the persistence of elements of Bohrian thought, along with a vindication of the doctrine of classical concepts, is shown within both of these alternative theories.
    Found 1 week, 3 days ago on PhilSci Archive
  18.
    Recent advances in stem cell-derived human brain organoids and microelectrode array (MEA) technology raise profound questions about the potential for these systems to give rise to sentience. Brain organoids are 3D tissue constructs that recapitulate key aspects of brain development and function, while MEAs enable bidirectional communication with neuronal cultures. As brain organoids become more sophisticated and integrated with MEAs, the question arises: Could such a system support not only intelligent computation, but subjective experience? This paper explores the philosophical implications of this thought experiment, considering scenarios in which brain organoids exhibit signs of sensory awareness, distress, preference, and other hallmarks of sentience. It examines the ethical quandaries that would arise if compelling evidence of sentience were found in brain organoids, such as the moral status of these entities and the permissibility of different types of research. The paper also explores how the phenomenon of organoid sentience might shed light on the nature of consciousness and the plausibility of artificial sentience. While acknowledging the speculative nature of these reflections, the paper argues that the possibility of sentient brain organoids deserves serious consideration given the rapid pace of advances in this field. Grappling with these questions proactively could help set important ethical boundaries for future research and highlight critical avenues of scientific and philosophical inquiry. The thought experiment of sentient brain organoids thus serves as a valuable lens for examining deep issues at the intersection of neuroscience, ethics, and the philosophy of mind.
    Found 1 week, 3 days ago on PhilSci Archive
  19.
    This paper attempts to revive the epistemological discussion of scientific articles. What are their epistemic aims, and how are they achieved? We argue that scientific experimental articles are best understood as a particular kind of narrative: i.e., modernist narratives (think: Woolf, Joyce), at least in the sense that they employ many of the same techniques, including colligation and the juxtaposition of multiple perspectives. We suggest that this way of writing is necessary given the nature of modern science, but it also has specific epistemic benefits: it provides readers with an effective way to grasp the content of scientific articles, which increases their understanding. On the other hand, modernist writing is vulnerable to certain kinds of epistemic abuses, which can be found instantiated in modern scientific writing as well.
    Found 1 week, 3 days ago on PhilSci Archive
  20.
    I should really be done with Integrated Information Theory (IIT), in Aaronson’s simplified formulation, but I noticed a rather interesting difficulty. In my previous post on the subject, I noticed that a double grid system, where there are two grids stacked on top of one another, with the bottom grid consisting of inputs and the upper grid of outputs, and each upper value being the logical OR of the (up to) five neighboring input values, will be conscious according to IIT if all the values are zero and the grid is large enough. …
    Found 1 week, 3 days ago on Alexander Pruss's Blog
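    For concreteness, here is a toy encoding of the double-grid construction as described above (my own sketch; the ‘up to five’ neighbors are assumed to be the input cell directly beneath an output cell plus its four orthogonal neighbors, and the code is not Pruss’s or Aaronson’s): each output is the OR of those inputs, so an all-zero input grid gives an all-zero output grid.
```python
# Toy double grid: bottom layer of inputs, top layer of outputs, where each
# output is the logical OR of the input directly beneath it and that input's
# (up to) four orthogonal neighbors. Assumed reading of the construction.
M, N = 16, 4
inputs = [[0] * N for _ in range(M)]   # all-zero input grid

def output(x, y):
    neighborhood = [(x, y), (x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return int(any(inputs[i][j]
                   for i, j in neighborhood
                   if 0 <= i < M and 0 <= j < N))

outputs = [[output(x, y) for y in range(N)] for x in range(M)]
print(all(v == 0 for row in outputs for v in row))   # True: outputs stay all zero
```
    The sketch only fixes the wiring under discussion; the IIT-relevant question is what integrated information such a system has in the all-zero state.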
  21.
    In this paper I return to Hubert Dreyfus’ old but influential critique of artificial intelligence, redirecting it towards contemporary predictive processing models of the mind (PP). I focus on Dreyfus’ arguments about the “frame problem” for artificial cognitive systems, and his contrasting account of embodied human skills and expertise. The frame problem presents as a prima facie problem for practical work in AI and robotics, but also for computational views of the mind in general, including for PP. Indeed, some of the issues it presents seem more acute for PP, insofar as it seeks to unify all cognition and intelligence, and aims to do so without admitting any cognitive processes or mechanisms outside of the scope of the theory. I contend, however, that there is an unresolved problem for PP concerning whether it can both explain all cognition and intelligent behavior as minimizing prediction error with just the core formal elements of the PP toolbox, and also adequately comprehend (or explain away) some of the apparent cognitive differences between biological and prediction-based artificial intelligence, notably in regard to establishing relevance and flexible context-switching, precisely the features of interest to Dreyfus’ work on embodied indexicality, habits/skills, and abductive inference. I address several influential philosophical versions of PP, including the work of Jakob Hohwy and Andy Clark, as well as more enactive-oriented interpretations of active inference coming from a broadly Fristonian perspective.
    Found 1 week, 4 days ago on PhilSci Archive
  22.
    Scientists have the epistemic responsibility of producing knowledge. They also have the social responsibility of aligning their research with the needs and values of various societal stakeholders. Individual scientists may be left with no guidance on how to prioritise and carry these different responsibilities. As I will argue, however, the responsibilities of science can be harmonised at the collective level. Drawing from debates in moral philosophy, I will propose a theory of the collective responsibilities of science that accounts for the internal diversity of research groups and for their different responsibilities.
    Found 1 week, 4 days ago on PhilSci Archive
  23.
    How should we interpret physical theories, and especially quantum theory, if we drop the assumption that we should treat it as an exact description of the whole Universe? I expound and develop the claim that physics is about the study of autonomous, but not necessarily isolated, dynamical systems, and that when applied to quantum mechanics this entails that in general we should take quantum systems as having mixed states and non-unitary dynamics. I argue that nonetheless unitary dynamics continues to have a special place in physics, via the empirically well-supported reductionist principles that non-unitarity is to be explained by restriction to a subsystem of a larger unitary system, and that microscopic physics is governed by unitary and largely known dynamics. I contrast this position with the ‘Open Systems View’ advocated recently by Michael Cuffaro and Stephan Hartmann.
    Found 1 week, 4 days ago on PhilSci Archive
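    The reductionist principle mentioned in the abstract above, that a subsystem’s non-unitary dynamics arises by restriction from a larger unitary evolution, can be illustrated with a generic two-qubit computation (a textbook sketch, not taken from the paper): a global unitary entangles the qubits, and the reduced state of one qubit, obtained by a partial trace, comes out mixed.
```python
import numpy as np

# Two qubits A and B. A global unitary (CNOT) acts on a pure product state;
# the reduced state of A alone becomes mixed, so A's own dynamics is not unitary.
zero = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
psi = np.kron(plus, zero)                      # |+>_A (x) |0>_B

CNOT = np.array([[1, 0, 0, 0],                 # control A, target B
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
psi_out = CNOT @ psi                           # (|00> + |11>)/sqrt(2)

rho_AB = np.outer(psi_out, psi_out.conj())
rho_A = rho_AB.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)   # partial trace over B

print(np.round(rho_A, 3))      # [[0.5 0.] [0. 0.5]] -- maximally mixed
print(np.trace(rho_A @ rho_A)) # purity 0.5 < 1: no unitary on A alone produces this
```
    Whether facts like this are best read as making mixed states and non-unitary dynamics the general case, or as leaving unitarity explanatorily fundamental, is precisely what is at issue between the paper and the ‘Open Systems View’.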
  24.
    Hawking radiation is a special case of the physical conditions that the fundamentally probabilistic, propensity version of quantum theory predicts will constitute probabilistic transitions. This result has implications for the potential viability of propensiton quantum theory.
    Found 1 week, 4 days ago on PhilSci Archive
  25.
    Among biologists and philosophers, there is an ongoing debate over the Modern Synthesis and the Extended Evolutionary Synthesis. Some argue that our current evolutionary biology is in need of (at least) some substantial revision or nontrivial extension, while others maintain that the Modern Synthesis remains the foundational framework for evolutionary biology. It has been widely debated whether the Extended Evolutionary Synthesis provides a more promising framework than the Modern Synthesis, and the nature and methodological implications of the Extended Evolutionary Synthesis have also been examined. This paper offers an integrated historical and philosophical examination of the debate over the Extended Evolutionary Synthesis. It reviews the development of evolutionary biology in the twentieth century. It argues that there are substantial conceptual and theoretical differences between the Modern Synthesis and the Extended Evolutionary Synthesis, but that they are not incommensurable paradigms in the Kuhnian sense. It also argues for a functional approach to the debate over these two frameworks of evolutionary theory.
    Found 1 week, 4 days ago on PhilSci Archive
  26.
    Franklin and Seifert (2021) argue that solving the measurement problem of quantum mechanics (QM) also answers a question central to the philosophy of chemistry: that of how to reconcile QM with the existence of definite molecular structures. This conclusion may appear premature, however, because interactions play a crucial role in shaping molecules, but we generally lack detailed models of how this is accomplished. Given this explanatory gap, simply choosing an interpretation of QM is insufficient, unless the interpretation also has relevant conceptual resources that address how spatially organized molecules are composed. This article seeks to close the gap, using the interpretation provided by relational quantum mechanics (RQM), along with a posited causal ontology. This framework, which entails the co-existence of multiple perspectives on systems within a single world, offers a path toward reconciling the quantum mechanical view of molecules with another conception more congenial to chemistry: that of molecules shaped by patterns of localizing interactions.
    Found 1 week, 4 days ago on PhilSci Archive
  27.
    We have good empirical ways of determining the presence of a significant amount of gold and we also have good empirical ways of determining the absence of a significant amount of gold. Not so with consciousness. …
    Found 1 week, 5 days ago on Alexander Pruss's Blog
  28.
    Found 1 week, 5 days ago on Martin Peterson's site
  29.
    Australia II became the first foreign yacht to win the America’s Cup in 1983. The boat had a revolutionary wing keel and a better underwater hull form. In official documents, Ben Lexcen is credited with the design. He is also listed as the sole inventor of the wing keel in a patent application submitted on February 5, 1982.
    Found 1 week, 5 days ago on Martin Peterson's site
  30.
    If p is a discrete probability measure, then the Shannon entropy of p is H(p) = −∑_x p({x}) log p({x}). I never had any intuitive feeling for Shannon entropy until I noticed the well-known fact that H(p) is the expected value of the logarithmic inaccuracy score of p by the lights of p. Since I’ve spent a long time thinking about inaccuracy scores, I now get some intuitions about entropy for free. …
    Found 1 week, 6 days ago on Alexander Pruss's Blog
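    The identity noted above is easy to check numerically, and the standard fact behind the intuition is that the logarithmic score is proper: by p’s own lights, no forecast q scores better in expectation than p itself, so H(p) is the best achievable expected score. A minimal sketch with an arbitrary distribution of my own choosing:
```python
import math

# A discrete distribution p on {a, b, c}.
p = {"a": 0.5, "b": 0.3, "c": 0.2}

def expected_log_score(q, p):
    """p-expected logarithmic inaccuracy of forecast q: sum_x p(x) * (-log q(x))."""
    return sum(p[x] * -math.log(q[x]) for x in p)

# Shannon entropy = expected score of p by p's own lights.
H = -sum(px * math.log(px) for px in p.values())
print(math.isclose(H, expected_log_score(p, p)))            # True

# Propriety of the log score (Gibbs' inequality): any other forecast q
# does worse in p-expectation, so H is the best achievable expected score.
q = {"a": 1 / 3, "b": 1 / 3, "c": 1 / 3}
print(expected_log_score(q, p) >= H)                        # True
```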