Ladyman and Ross (LR) argue that quantum objects are not individuals (or are at most weakly discernible individuals) and use this idea to ground their metaphysical view, ontic structural realism, according to which relational structures are prior to things. LR acknowledge that there is a version of quantum theory, namely the Bohm theory (BT), according to which particles do have definite trajectories at all times. However, LR interpret the research by Brown et al. as implying that “raw stuff” or haecceities are needed for the individuality of particles in BT, and LR dismiss this as idle metaphysics. In this paper we note that Brown et al.’s research does not imply that haecceities are needed. Thus BT remains a genuine option for those who seek to understand quantum particles as individuals. However, we go on to discuss some problems with BT which led Bohm and Hiley to modify it.
This paper explores the theme “quantum approaches to consciousness” by considering the work of one of the pioneers in the field. The physicist David Bohm (1917-1992) not only made important contributions to quantum physics, but also had a long-term interest in interpreting the results of quantum physics and relativity in order to develop a general world view. He further proposed that living and mental processes could be understood in a new, scientifically and philosophically more coherent way in the context of such a new world view. This paper gives a brief overview of different (and sometimes contradictory) aspects of Bohm’s research programme, and evaluates how they can be used to give an account of topics of interest in contemporary consciousness studies, such as analogies between thought and quantum processes, the problem of mental causation, the mind-body problem and the problem of time consciousness.
Some recent accounts of constitutive relevance have identified mechanism components with entities that are causal intermediaries between the input and output of a mechanism. I argue that on such accounts there is no distinctive inter-level form of mechanistic explanation and that this highlights an absence in the literature of a compelling argument that there are such explanations. Nevertheless, the entities that these accounts call ‘components’ do play an explanatory role. Studying causal intermediaries linking variables X and Y provides knowledge of the counterfactual conditions under which X will continue to bring about Y. This explanatory role does not depend on whether intermediate variables count as components. The question of whether there are distinctively mechanistic explanations remains open.
Information theory presupposes the notion of an epistemic agent, such as a scientist or an idealized human. Despite that, information theory is increasingly invoked by physicists concerned with fundamental physics, physics at very high energies, or generally with the physics of situations in which even idealized epistemic agents cannot exist. In this paper, I shall try to determine the extent to which the application of information theory in those contexts is legitimate. I will illustrate my considerations using the case of black hole thermodynamics and Bekenstein’s celebrated argument for his formula for the entropy of black holes. This example is particularly pertinent to the theme of the present collection because it is widely accepted as ‘empirical data’ in notoriously empirically deprived quantum gravity, even though the laws of black hole thermodynamics have so far evaded direct empirical confirmation.
This paper considers the importance of unification in the context of developing scientific theories. I argue that unifying hypotheses are not valuable simply because they are supported by multiple lines of evidence. Instead, they can be valuable because they guide experimental research in different domains in such a way that the results from those experiments inform the scope of the theory being developed. I support this characterization by appealing to the early development of quantum theory. I then draw some comparisons with discussions of robustness reasoning.
We hear the term bandied about all the time. A man cheats on his wife. We are told that this is simply part of his ‘nature’ - that men have evolved to be philanderers. Two young men fight on the streets, taunting and goading each other on. …
Received: 10 February 2017 / Accepted: 26 June 2017 / Published online: 3 August 2017. © The Author(s) 2017. This article is an open access publication. This paper develops a fourth model of public engagement with science, grounded in the principle of nurturing scientific agency through participatory bioethics. It argues that social media is an effective device through which to enable such engagement, as it has the capacity to empower users and to transform audiences into co-producers of knowledge, rather than consumers of content. Social media also fosters greater engagement with the political and legal implications of science, thus promoting the value of scientific citizenship. This argument is explored by considering the case of nanoscience and nanotechnology, as an exemplar for how emerging technologies may be handled by the scientific community and science policymakers.
For more than twenty-five years, Fine has been challenging the traditional interpretation of the violations of Bell inequalities (BI) by experiment. A natural interpretation of Fine’s theorem is that it provides us with an alternative set of assumptions on which to put the blame for the failure of the BI, and a new interpretation of the violation of the BI by experiment should follow. This is not, however, how Fine interprets his theorem. Indeed, Fine claims that his result undermines other interpretations, including the traditional interpretation in terms of local realism. The aim of this paper is to understand and to assess Fine’s claims. We distinguish three different strategies that Fine uses in order to support his interpretation of his result. We show that none of these strategies is successful. Fine fails to prove that local realism is not at stake in the violation of the BI by quantum phenomena.
In Dasgupta (2013) I defended a relationalist view of mass. On this view mass is fundamentally relational, so that the state of a physical system vis-a-vis mass consists at bottom just in facts about mass-relationships, such as that one body is more massive than another. This is in contrast to the absolutist view that in addition to the mass-relations there are further facts about which “intrinsic” mass each body has. In my paper I discussed a number of virtues of this relationalist view. …
The counterfactual tradition to defining actual causation has come a long way since Lewis started it off. However, there are still important open problems that need to be solved. One of them is the (in)transitivity of causation. Endorsing transitivity was a major source of trouble for the approach taken by Lewis, which is why currently most approaches reject it. But transitivity has never lost its appeal, and there is a large literature devoted to understanding why this is so. Starting from a survey of this work, we will develop a formal analysis of transitivity and the problems it poses for causation. This analysis provides us with a sufficient condition for causation to be transitive, a sufficient condition for dependence to be necessary for causation, and several characterisations of the transitivity of dependence. Finally, we show how this analysis leads naturally to several conditions a definition of causation should satisfy, and use those to suggest a new definition of causation.
A new “voucher” program aims to shrink the US waiting list for kidney transplants (Veale, 2016). The waiting list is long, hovering in 2017 at around 95,000 (United Network for Organ Sharing, 2017). During 2016, approximately 19,000 kidney transplants took place, meeting only approximately one fifth of the demand. For patients with end stage renal disease (ESRD), transplantation has greater health benefits than dialysis, in terms of both length and quality of life (Tonelli et al., 2011). Transplantation from living donors is optimal: it tops both dialysis and transplantation from deceased donors in terms of health outcomes and cost-effectiveness (LaPointe Rudow et al., 2015, 914). The new voucher program involves live donation.
Computer simulations of an epistemic landscape model, modified to include an explicit representation of a centralised funding body, show that the method of funding allocation has significant effects on the communal trade-off between exploration and exploitation, with consequences for the community’s ability to generate significant truths. The results show that this effect is contextual and depends on the size of the landscape being explored, with funding that includes explicit random allocation performing significantly better than peer review on large landscapes. The paper proposes a way of incorporating external institutional factors into formal social epistemology, and offers a way of bringing such investigations to bear on current research policy questions.
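The kind of comparison the abstract describes can be gestured at with a toy one-dimensional landscape simulation. This is a minimal illustrative sketch, not the paper’s actual model: the landscape shape, agent movement rule, and funding fraction are all invented assumptions, and no claim is made here about which allocation scheme wins.

```python
import random

def make_landscape(size, peaks, rng):
    """An epistemic landscape: a significance value per patch, with a few peaks."""
    land = [0.0] * size
    for _ in range(peaks):
        c = rng.randrange(size)                 # peak location
        h = rng.uniform(0.5, 1.0)               # peak height
        w = rng.randint(2, size // 10)          # peak width
        for i in range(size):
            land[i] = max(land[i], h * max(0.0, 1 - abs(i - c) / w))
    return land

def run(allocation, size=200, agents=20, rounds=100, seed=1):
    """One funding cycle per round: funded agents get to move; others stay put.
    allocation='peer_review' funds the agents on the most significant patches
    (exploitation); allocation='random' funds a uniform random subset
    (exploration). Returns total significance of all patches ever visited."""
    rng = random.Random(seed)
    land = make_landscape(size, peaks=5, rng=rng)
    pos = [rng.randrange(size) for _ in range(agents)]
    discovered = set(pos)
    for _ in range(rounds):
        if allocation == "peer_review":
            funded = sorted(range(agents), key=lambda a: land[pos[a]],
                            reverse=True)[: agents // 2]
        else:
            funded = rng.sample(range(agents), agents // 2)
        for a in funded:
            step = rng.choice([-3, -2, -1, 1, 2, 3])
            cand = max(0, min(size - 1, pos[a] + step))
            # hill-climbing: move only if significance does not drop
            if land[cand] >= land[pos[a]]:
                pos[a] = cand
        discovered.update(pos)
    return sum(land[i] for i in discovered)
```

Running `run("random")` and `run("peer_review")` across a range of `size` values is the shape of experiment the abstract has in mind: the interesting question is how the gap between the two schemes varies with landscape size.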
Continuing with my Egon Pearson posts in honor of his birthday, I reblog a post by Aris Spanos: “Egon Pearson’s Neglected Contributions to Statistics“. Egon Pearson (11 August 1895 – 12 June 1980) is widely known today for his contribution in recasting Fisher’s significance testing into the Neyman-Pearson (1933) theory of hypothesis testing. …
It’s been a long time since I’ve blogged about the Complex Adaptive System Composition and Design Environment or CASCADE project run by John Paschkewitz. For a reminder, read these:
• Complex adaptive system design (part 1), Azimuth, 2 October 2016. …
The Holodeck - Star Trek
There is an apple in front of me. I can see it, but I can’t touch it. The reason is that the apple is actually a 3-D rendered model of an apple. It looks like an apple, but exists only within a virtual environment — one that is projected onto the computer screen in front of me. …
The topic of unity in the sciences can be explored through the following questions: Is there one privileged, most basic or fundamental concept or kind of thing, and if not, how are the different concepts or kinds of things in the universe related? Can the various natural sciences (e.g., physics, astronomy, chemistry, biology) be unified into a single overarching theory, and can theories within a single science (e.g., general relativity and quantum theory in physics, or models of evolution and development in biology) be unified? Are theories or models the relevant connected units? What other connected or connecting units are there?
As Harvey Brown emphasizes in his book Physical Relativity, inertial motion in general relativity is best understood as a theorem, and not a postulate. Here I discuss the status of the “conservation condition”, which states that the energy-momentum tensor associated with non-interacting matter is covariantly divergence-free, in connection with such theorems.
In this paper I discuss the delayed choice quantum eraser experiment by giving a straightforward account in standard quantum mechanics. At first glance, the experiment suggests that measurements on one part of an entangled photon pair (the idler) can be employed to control whether the measurement outcome of the other part of the photon pair (the signal) produces interference fringes at a screen after being sent through a double slit. Significantly, the choice whether there is interference or not can be made long after the signal photon encounters the screen. The results of the experiment have been alleged to invoke some sort of ‘backwards in time influences’. I argue that in the standard collapse interpretation the issue can be eliminated by taking into account the collapse of the overall entangled state due to the signal photon. Likewise, in the de Broglie-Bohm picture the particle’s trajectories can be given a well-defined description at any instant of time during the experiment. Thus, there is no need to resort to any kind of ‘backwards in time influence’. As a matter of fact, the delayed choice quantum eraser experiment turns out to resemble a Bell-type measurement, and so there really is no mystery.
E.S. Pearson (11 Aug, 1895-12 June, 1980)
This is a belated birthday post for E.S. Pearson (11 August 1895-12 June, 1980). It’s basically a post from 2012 which concerns an issue of interpretation (long-run performance vs probativeness) that’s badly confused these days. …
There’s a new paper on the arXiv that claims to solve a hard problem:
• Norbert Blum, A solution of the P versus NP problem.

Most papers that claim to solve hard math problems are wrong: that’s why these problems are considered hard. …
Schupbach and Sprenger (2011) introduce a novel probabilistic approach to measuring the explanatory power that a given explanans exerts over a corresponding explanandum. Though we are sympathetic to their general approach, we argue that it does not (without revision) adequately capture the way in which the causal explanatory power that c exerts on e varies with background knowledge. We then amend their approach so that it does capture this variance. Though our account of explanatory power is less ambitious than Schupbach and Sprenger’s in the sense that it is limited to causal explanatory power, it is also more ambitious because we do not limit its domain to cases where c genuinely explains e. Instead, we claim that c causally explains e if and only if our account says that c explains e with some positive amount of causal explanatory power.
In this chapter, I will discuss what it takes for a dynamical collapse theory to provide a reasonable description of the actual world. I will start with discussions of what is required, in general, of the ontology of a physical theory, and then apply it to the quantum case. One issue of interest is whether a collapse theory can be a quantum state monist theory, adding nothing to the quantum state and changing only its dynamics. Although this was one of the motivations for advancing such theories, its viability has been questioned, and it has been argued that, in order to provide an account of the world, a collapse theory must supplement the quantum state with additional ontology, making such theories more like hidden-variables theories than would first appear. I will make a case for quantum state monism as an adequate ontology, and, indeed, the only sensible ontology for collapse theories. This will involve taking dynamical variables to possess, not sharp values, as in classical physics, but distributions of values.
I discuss a game-theoretic model in which scientists compete to finish the intermediate stages of some research project. Banerjee et al. (2014) have previously shown that if the credit awarded for intermediate results is proportional to their difficulty, then the strategy profile in which scientists share each intermediate stage as soon as they complete it is a Nash equilibrium. I show that the equilibrium is both unique and strict. Thus rational credit-maximizing scientists have an incentive to share their intermediate results, as long as this is sufficiently rewarded.
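The uniqueness-and-strictness claim can be illustrated with a generic strict-Nash checker applied to a stylized two-player credit game. The payoff numbers below are invented for illustration and are not the Banerjee et al. model; they merely encode the idea that sharing each intermediate stage earns proportional credit, while withholding risks losing intermediate credit to a rival who publishes first.

```python
from itertools import product

def strict_nash(payoffs, strategies):
    """Return all strategy profiles at which every unilateral deviation
    strictly lowers the deviator's payoff. payoffs[(s1, s2)] = (u1, u2)."""
    equilibria = []
    for s1, s2 in product(strategies, repeat=2):
        u1, u2 = payoffs[(s1, s2)]
        ok1 = all(payoffs[(d, s2)][0] < u1 for d in strategies if d != s1)
        ok2 = all(payoffs[(s1, d)][1] < u2 for d in strategies if d != s2)
        if ok1 and ok2:
            equilibria.append((s1, s2))
    return equilibria

# Stylized symmetric credit game (illustrative numbers only): sharing while
# the other shares earns full proportional credit; withholding against a
# sharer means being scooped on intermediate stages; mutual withholding
# wastes effort on duplicated work.
strategies = ["share", "withhold"]
payoffs = {
    ("share", "share"): (3.0, 3.0),
    ("share", "withhold"): (2.5, 2.0),
    ("withhold", "share"): (2.0, 2.5),
    ("withhold", "withhold"): (1.0, 1.0),
}
```

With these payoffs, `strict_nash(payoffs, strategies)` finds exactly one profile, `("share", "share")`, mirroring the structure of the result: mutual sharing is the unique strict equilibrium when intermediate credit is rewarded enough.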
Illustration by Slate
Last week a team of 72 scientists released the preprint of an article attempting to address one aspect of the reproducibility crisis, the crisis of conscience in which scientists are increasingly skeptical about the rigor of our current methods of conducting scientific research. …
A core question of contemporary social morality concerns how we ought to handle racial categorization. By this we mean, for instance, classifying or thinking of a person as Black, Korean, Latino, White, etc.² While it is widely agreed that racial categorization played a crucial role in past racial oppression, there remains disagreement among philosophers and social theorists about the ideal role for racial categorization in future endeavors. At one extreme of this disagreement are short-term eliminativists who want to do away with racial categorization relatively quickly (e.g. Appiah, 1995; D’Souza, 1996; Muir, 1993; Wasserstrom, 2001/1980; Webster, 1992; Zack, 1993, 2002), typically because they view it as mistaken and oppressive. At the opposite end of the spectrum, long-term conservationists hold that racial identities and communities are beneficial, and that racial categorization, suitably reformed, is essential to fostering them (e.g. Outlaw, 1990, 1995, 1996). While extreme forms of conservationism have fewer proponents in academia than the most radical eliminativist positions, many theorists advocate more moderate positions. In between the two poles, there are many who believe that racial categorization is valuable (and perhaps necessary) given the continued existence of racial inequality and the lingering effects of past racism (e.g. Haslanger, 2000; Mills, 1998; Root, 2000; Shelby, 2002, 2005; Sundstrom, 2002; Taylor, 2004; Young, 1989). Such authors agree on the short-term need for racial categorization in at least some domains, but they often differ with regard to its long-term value.
Suppose that I am throwing a perfectly sharp dart uniformly randomly at a continuous target. The chance that I will hit the center is zero. What if I throw an infinite number of independent darts at the target? …
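The measure-zero point can be illustrated numerically. As a minimal sketch (taking the target to be the unit disk, an assumption not in the excerpt): for a uniform throw, the chance of landing within distance ε of the center is ε², which vanishes as ε → 0, and no finite sample of darts ever hits the exact center.

```python
import math
import random

def hit_frequency(eps, n, seed=0):
    """Throw n darts uniformly at the unit disk and return the fraction
    landing within distance eps of the center. For a uniform throw the
    radial distance r has CDF P(r < t) = t**2, so r = sqrt(U) with U
    uniform on [0, 1), and P(r < eps) = eps**2."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if math.sqrt(rng.random()) < eps)
    return hits / n
```

For example, `hit_frequency(0.1, 100_000)` comes out near the theoretical 0.01, while `hit_frequency(0.0, 100_000)` is exactly 0: the event “hit the exact center” has chance zero, even though each individual dart lands *somewhere*, and every landing point was itself a chance-zero outcome.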
It is suggested that the apparently disparate cosmological phenomena attributed to so-called ‘dark matter’ and ‘dark energy’ arise from the same fundamental physical process: the emergence, from the quantum level, of spacetime itself. This creation of spacetime results in metric expansion around mass points in addition to the usual curvature due to stress-energy sources of the gravitational field. A recent modification of Einstein’s theory of general relativity by Chadwick, Hodgkinson, and McDonald incorporating spacetime expansion around mass points, which accounts well for the observed galactic rotation curves, is adduced in support of the proposal. Recent observational evidence corroborates a prediction of the model that the apparent amount of ‘dark matter’ increases with the age of the universe. In addition, the proposal leads to the same result for the small but nonvanishing cosmological constant, related to ‘dark energy,’ as that of the causet model of Sorkin et al.
Psychophysical supervenience requires that the mental properties of a system cannot change without a change in its physical properties. For a system with many minds, the principle requires that the mental properties of each mind of the system cannot change without a change in the physical properties of the system. In this paper, I argue that Everett’s theory seems to violate this principle of psychophysical supervenience. The violation results from three key assumptions of the theory: (1) the completeness of the physical description by the wave function, (2) the linearity of the dynamics for the wave function, and (3) multiplicity. For a post-measurement state with two decoherent result branches, multiplicity means that each result branch corresponds to a mindful observer, whose mental properties supervene on the branch and, in particular, whose mental content contains a definite record corresponding to the result branch. Under a certain unitary evolution that swaps the two result branches, the post-measurement state does not change, and the completeness of the physical description by the wave function then means that the physical state of the composite system does not change. However, the linearity of the dynamics for the wave function requires that each result branch change, and correspondingly that the mental properties of the observer which supervene on the branch also change. Thus the principle of psychophysical supervenience as defined above is violated by Everett’s theory.
Questions about the value of the humanities and the relationship between the sciences and humanities have been very much in the news recently. Even a brief review of the public press shows scientists and humanists weighing in and responding to one another. Public opinion is shifting in favor of science and technological education. There are two related challenges that have been leveled about the value of the humanities.
The essay begins with a taxonomy of the major contexts in which the notion of ‘style’ in mathematics has been appealed to since the early twentieth century. These include the use of the notion of style in comparative cultural histories of mathematics, in characterizing national styles, and in describing mathematical practice. These developments are then related to the more familiar treatment of style in history and philosophy of the natural sciences, where one distinguishes ‘local’ and ‘methodological’ styles. It is argued that the natural locus of ‘style’ in mathematics falls between the ‘local’ and the ‘methodological’ styles described by historians and philosophers of science.