I find very persuasive arguments like this:
If theory T is true, then whether I exist now depends on some future events. Facts about what exists now do not depend on future events. So, theory T is not true. …
Can there be ‘peaceful coexistence’ between quantum theory and special relativity? Thirty years ago, Shimony hoped that isolating the culprit (i.e. the false assumption) in proofs of Bell inequalities as Outcome Independence would secure such peaceful coexistence: or, if not secure it, at least show a way—maybe the best or only way—to secure it. In this paper, I begin by being sceptical of Shimony’s approach, urging that we need a relativistic solution to the quantum measurement problem (Section 2). Then I analyse Outcome Independence in Kent’s realist one-world Lorentz-invariant interpretation of quantum theory (Sections 3 and 4). Then I consider Shimony’s other condition, Parameter Independence, both in Kent’s proposal and more generally, in the light of recent remarkable theorems by Colbeck, Renner and Leegwater (Section 5). For both Outcome Independence and Parameter Independence, there is a striking analogy with the situation in pilot-wave theory. Finally, I suggest that these recent theorems make some kind of peaceful coexistence mandatory for someone who, like Shimony, endorses Parameter Independence.
Is replication in the cultural domain ubiquitous, rare, or non-existent? And how does this relate to that paradigmatic case of replication, the copying of DNA in living cells? Theorists of cultural evolution are divided on these issues. The most important objection to the replication model has been leveled by Dan Sperber and his colleagues. Cultural transmission, they argue, is almost always reconstructive and transformative, while ‘replication’ can be seen as a rare limiting case at most. Though Sperber’s critique is valuable, I argue that a purely informational approach to replication can clear up some confusion. By means of some thought experiments, I make a distinction between evocation and extraction of cultural information, and apply these concepts at different levels of resolution. I conclude that, even after having taken Sperber’s important points on board, we can still talk about replication in the cultural domain, which is an important issue for certain explanatory projects (i.e. understanding cumulative culture and cultural adaptation).
Not the critic who counts
The problem with Uber
I just spent a wonderful and exhausting five days in the Bay Area: meeting friends, holding the first-ever combined SlateStarCodex/Shtetl-Optimized meetup, touring quantum computing startups PsiCorp and Rigetti Computing, meeting with Silicon Valley folks about quantum computing, and giving a public lecture for the Simons Institute in Berkeley. …
A world beyond p-values? I was asked to write something explaining the background of my slides (posted here) in relation to the recent ASA “A World Beyond P-values” conference. I took advantage of some long flight delays on my return to jot down some thoughts:
The contrast between the closing session of the conference “A World Beyond P-values,” and the gist of the conference itself, shines a light on a pervasive tension within the “Beyond P-Values” movement. …
It’s currently fashionable to take Putnamian model theoretic worries seriously for mathematics, but not for discussions of ordinary physical objects and the sciences. In this paper, I will attack this combination of views in two ways. First, I’ll (quickly) suggest there’s an analogy between the challenge of understanding realist reference to physical possibility and that of understanding reference to the kind of logical/combinatorial possibility invoked when we say that second order quantifiers range over ‘all possible subsets’ or it would be ‘logically impossible’ for a property to apply to 0 and the successor of any number it applies to but not all the numbers. Second, I will argue that (under certain mild assumptions about the physical possibility of infinite stochastic physical systems) merely securing determinate reference to physical possibility suffices to rule out nonstandard models of our talk about number theory. So anyone who accepts realist reference to physical possibility faces pressure to also accept such reference to (at least) the standard model of the natural numbers.
Many bricks, when configured appropriately, constitute one house. How is it possible for plurality to yield unity? This is the metaphysical problem of unity. Introducing another thing, say, the configuration of the bricks, into the picture would not solve it, for the bricks plus the configuration are still a plurality. This is Bradley’s famous regress, as applied to the problem of unity. Something must unify the bricks, but it cannot be any additional thing on pain of Bradley’s regress. Therefore, Graham Priest (2014) infers, the metaphysical glue – called ‘gluon’ by Priest – that unifies the bricks must be one of the bricks. I would like to offer a modest critique of Priest’s gluon theory.
This paper analyses the anti-reductionist argument from renormalisation group explanations of universality, and shows how it can be rebutted if one assumes that the explanation in question is captured by the counterfactual dependence account of explanation.
The following line of thought is commonly found in analytic philosophy of mind: the reason calculators, for instance, are not minds is that the symbols they manipulate in order to solve mathematical problems do not mean anything to them (the calculators). …
Although Moritz Schlick (1882–1936) made a lasting mark in the
philosophical memory by his role as the nominal leader of the Vienna
Circle of Logical Positivists, his lasting contributions include
a broad range of philosophical achievements. Indeed, Schlick’s
reputation was established well before the Circle went public. In
1917, he published Space and Time in Contemporary Physics, a
philosophical introduction to the new physics of Relativity which was
highly acclaimed by Einstein himself as well as many others. The
following year, the first edition of his influential General
Theory of Knowledge appeared and, in 1922, he was appointed to
the prestigious chair of Naturphilosophie at the University of Vienna.
Attempts to ‘naturalize’ phenomenology challenge both traditional phenomenology and traditional approaches to cognitive science. They challenge Edmund Husserl’s rejection of naturalism and his attempt to establish phenomenology as a foundational transcendental discipline, and they challenge efforts to explain cognition through mainstream science. While appearing to be a retreat from the bold claims made for phenomenology, it is really its triumph. Naturalized phenomenology is spearheading a successful challenge to the heritage of Cartesian dualism. This converges with the reaction against Cartesian thought within science itself. Descartes divided the universe between res cogitans, thinking substances, and res extensa, the mechanical world. The latter won with Newton and we have, in most of objective science since, literally lost our mind, hence our humanity. Despite Darwin, biologists remain children of Newton, and dream of a grand theory that is epistemologically complete and would allow lawful entailment of the evolution of the biosphere. This dream is no longer tenable. We now have to recognize that science and scientists are within and part of the world we are striving to comprehend, as proponents of endophysics have argued, and that physics, biology and mathematics have to be reconceived accordingly. Interpreting quantum mechanics from this perspective is shown to both illuminate conscious experience and reveal new paths for its further development. In biology we must now justify the use of the word “function”. As we shall see, we cannot prestate the ever new biological functions that arise and constitute the very phase space of evolution. Hence, we cannot mathematize the detailed becoming of the biosphere, nor write differential equations for functional variables we do not know ahead of time, nor integrate those equations, so no laws “entail” evolution. The dream of a grand theory fails. 
In place of entailing laws, a post-entailing-law explanatory framework is proposed in which Actuals arise in evolution that constitute new boundary conditions that are enabling constraints that create new, typically unprestatable, Adjacent Possible opportunities for further evolution, in which new Actuals arise, in a persistent becoming. Evolution flows into a typically unprestatable succession of Adjacent Possibles. Given the concept of function, the concept of the functional closure of an organism making a living in its world becomes central. Implications for patterns in evolution include historical reconstruction, and statistical laws such as the distribution of extinction events, or species per genus, and the use of formal-cause, not efficient-cause, laws.
My topic is a certain view about mental images: namely, the ‘Multiple Use Thesis’. On this view, at least some mental image-types, individuated in terms of the sum total of their representational content, are potentially multifunctional: a given mental image-type, individuated as indicated, can serve in a variety of imaginative-event-types. As such, the presence of an image is insufficient to individuate the content of those imagination-events in which it may feature. This picture is argued for, or (more usually) just assumed to be true, by Christopher Peacocke, Michael Martin, Paul Noordhof, Bernard Williams, Alan White, and Tyler Burge. It is also presupposed by more recent authors on imagination such as Amy Kind, Peter Kung and Neil Van Leeuwen. I reject various arguments for the Multiple Use Thesis, and conclude that instead we should endorse SINGLE: a single image-type, individuated in terms of the sum total of its intrinsic representational content, can serve in only one imagination event-type, whose content coincides exactly with its own, and is wholly determined by it. Plausibility aside, the interest of this thesis is also in its iconoclasm, as well as the challenge it poses for the diverse theories that rest on the truth of the Multiple Use Thesis.
Recent work in cognitive and computational neuroscience depicts human brains as devices that minimize prediction error signals: signals that encode the difference between actual and expected sensory stimulations. This raises a series of puzzles whose common theme concerns a potential misfit between this bedrock information-theoretic vision and familiar facts about the attractions of the unexpected. We humans often seem to actively seek out surprising events, deliberately harvesting novel and exciting streams of sensory stimulation. Conversely, we often experience some well-expected sensations as unpleasant and to-be-avoided. In this paper, I explore several core and variant forms of this puzzle, using them to display multiple interacting elements that together deliver a satisfying solution. That solution requires us to go beyond the discussion of simple information-theoretic imperatives (such as 'minimize long-term prediction error') and to recognize the essential role of species-specific prestructuring, epistemic foraging, and cultural practices in shaping the restless, curious, novelty-seeking human mind.
We argue that, to be trustworthy, Computational Intelligence (CI) has to do what it is entrusted to do for permissible reasons and to be able to give rationalizing explanations of its behavior which are accurate and graspable. We support this claim by drawing parallels with trustworthy human persons, and we show what difference this makes in a hypothetical CI hiring system. Finally, we point out two challenges for trustworthy CI and sketch a mechanism which could be used to generate sufficiently accurate as well as graspable rationalizing explanations for CI behavior.
Can purely predictive models be useful in investigating causal systems? I argue “yes”. Moreover, in many cases not only are they useful, they are essential. The alternative is to stick to models or mechanisms drawn from well-understood theory. But a necessary condition for explanation is empirical success, and in many cases in social and field sciences such success can only be achieved by purely predictive models, not by ones drawn from theory. Alas, the attempt to use theory to achieve explanation or insight without empirical success therefore fails, leaving us with the worst of both worlds – neither prediction nor explanation. Best go with empirical success by any means necessary. I support these methodological claims via case studies of two impressive feats of predictive modelling: opinion polling of political elections, and weather forecasting.
On 7 July 1688 the Irish scientist and politician William Molyneux
(1656–1698) sent a letter to John Locke in which he put forward
a problem which was to awaken great interest among philosophers and
other scientists throughout the Enlightenment and up until the present
day. In brief, the question Molyneux asked was whether a man who has
been born blind and who has learnt to distinguish and name a globe and
a cube by touch, would be able to distinguish and name these objects
simply by sight, once he had been enabled to see.
Henri Poincaré was a mathematician, theoretical physicist and a
philosopher of science famous for discoveries in several fields and
referred to as the last polymath, one who could make significant
contributions in multiple areas of mathematics and the physical
sciences. This survey will focus on Poincaré’s
philosophy. Concerning Poincaré’s scientific legacy, see
Browder (1983) and Charpentier, Ghys, Lesne (2010). Poincaré’s philosophy is primarily that of a scientist
originating in his own daily practice of science and in the scientific
debates of his time. As such, it is strongly influenced by the
reflections of Ernst Mach, James Clerk Maxwell and Hermann von Helmholtz.
Probability has played an important role in the foundations of QM from the beginning and continues to play an important role today. The choice of an interpretation of probability affects the interpretation of QM. Recent developments in quantum information theory have led to new ways of looking at the foundations of QM, including a greater emphasis on the possible role of subjective probability in QM. Several works claim that QM can be viewed as a theory of information. According to these works, the description of physical systems in terms of information and information processing is the only way to describe physical systems. For instance, in Bub’s words (Bub, 2008): "I argue that QM is fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles. The notion of quantum information is to be understood as a new physical primitive." The author gives information an ontic status; in this context it becomes possible, for instance, to deduce physical laws and matter from information. We note other extreme positions on this topic, for instance that of Zeilinger (Zeilinger, 2005), who claims: "The discovery that individual events are irreducibly random is probably one of the most significant findings of the twentieth century. Even for single particles, it is not always possible to assign definite measurement outcomes independently of and prior to the selection of specific measurement apparatus in the specific experiment. For this reason, the distinction between reality and our knowledge of reality, between reality and information, cannot be made." A similar position is expressed in the following statement by von Baeyer (von Baeyer, 2005): "Information as physical reality: in 1905 Einstein proposed that the world is not what it seems."
We introduce ‘model migration’ as a species of cross-disciplinary knowledge transfer whereby the representational function of a model is radically changed to allow application to a new disciplinary context. Controversies and confusions that often derive from this phenomenon will be illustrated in the context of econophysics and phylogeographic linguistics. Migration can be usefully contrasted with the concept of ‘imperialism’, which has been influentially discussed in the context of geographical economics. In particular, imperialism, unlike migration, relies upon extension of the original model via an expansion of the domain of phenomena it is taken to adequately describe. The success of imperialism thus requires expansion of the justificatory sanctioning of the original idealising assumptions to a new disciplinary context. Contrastingly, successful migration involves the radical representational re-interpretation of the original model, rather than its extension. Migration thus requires ‘re-sanctioning’ of new ‘counterpart idealisations’ to allow application to an entirely different class of phenomena. Whereas legitimate scientific imperialism should be based on the pursuit of some form of ontological unification, no such requirement is needed to legitimate the practice of model migration. The distinction between migration and imperialism will thus be shown to have significant normative as well as descriptive value.
The Einsteinian research programme can be summarized in the following way: Physical theories are attempts at saying how things are. The world is comprehensible. This statement is a very general one; indeed, it seems insufficient to characterize Einstein’s programme uniquely. In fact, that statement is also perfectly adaptable to the Galilean, Cartesian, Newtonian, Leibnizian, Maxwellian and several other scientific programmes. According to Einstein, quantum objects are concrete entities existing in a space-time where causality holds. In the following statement Einstein’s thought is made more precise: Physical theories (including QM) are attempts at saying how things are (including quantum objects). The objective world is comprehensible. By the simultaneous help of space-time and causal conceptual categories we can study this comprehensible world.
I present in detail the case for regarding black hole thermodynamics as having a statistical-mechanical explanation in exact parallel with the statistical-mechanical explanation believed to underlie the thermodynamics of other systems. (Here I presume that black holes are indeed thermodynamic systems in the fullest sense; I review the evidence for that conclusion in the prequel to this paper.) I focus on three lines of argument: (i) zero-loop and one-loop calculations in quantum general relativity understood as a quantum field theory, using the path-integral formalism; (ii) calculations in string theory of the leading-order terms, higher-derivative corrections, and quantum corrections, in the black hole entropy formula for extremal and near-extremal black holes; (iii) recovery of the qualitative and (in some cases) quantitative structure of black hole statistical mechanics via the AdS/CFT correspondence. In each case I briefly review the content of, and arguments for, the form of quantum gravity being used (effective field theory; string theory; AdS/CFT) at a (relatively) introductory level: the paper is aimed at students and non-specialists and does not presume advanced knowledge of quantum gravity. My conclusion is that the evidence for black hole statistical mechanics is as solid as we could reasonably expect it to be in the absence of a directly-empirically-verified theory of quantum gravity.
I distinguish between two versions of the black hole information-loss paradox. The first arises from apparent failure of unitarity on the space-time of a completely evaporating black hole, which appears to be non-globally-hyperbolic; this is the most commonly discussed version of the paradox in the foundational and semipopular literature, and the case for calling it ‘paradoxical’ is less than compelling. But the second arises from a clash between a fully-statistical-mechanical interpretation of black hole evaporation and the quantum-field-theoretic description used in derivations of the Hawking effect. This version of the paradox arises long before a black hole completely evaporates, seems to be the version that has played a central role in quantum gravity, and is genuinely paradoxical. After explicating the paradox, I discuss the implications of more recent work on AdS/CFT duality and on the ‘Firewall paradox’, and conclude that the paradox is if anything now sharper. The article is written at a (relatively) introductory level and does not assume advanced knowledge of quantum gravity.
It is known that quantum mechanics is problematic in the sense that it is incomplete: it needs the notion of a classical device measuring quantum observables as an important ingredient of the theory. One therefore accepts that there exist two worlds, the classical one and the quantum one. In the classical world, measurements of classical observables are produced by classical devices. In the framework of the standard theory, measurements of quantum observables in the quantum world are produced by classical devices, too. For this reason, the theory of quantum measurements is considered as something very specifically different from classical measurements.
Despite these important advances, it was still only a handful of physicists who were deeply interested in entanglement. Philosophers of physics recognized the importance of entanglement and Bell’s work, but many continued to think of entanglement as an "all or nothing" phenomenon and described entanglement as simply a spooky action-at-a-distance or mysterious holism. In the last two decades new discoveries, many of which are associated with the investigation of quantum information, have shown that much philosophical and foundational work remains to be done to deepen our understanding of entanglement and non-locality.
Some theoreticians argue that nonlocality has a role in interpreting quantum phenomena. Others suggest that quantum nonlocality may be interpreted as a holistic, nonseparable relational issue.
It is argued that quantum theory is best understood as requiring an ontological duality of res extensa and res potentia, where the latter is understood per Heisenberg’s original proposal, and the former is roughly equivalent to Descartes’ ‘extended substance.’ However, this is not a dualism of mutually exclusive substances in the classical Cartesian sense, and therefore does not inherit the infamous ‘mind-body’ problem. Rather, res potentia and res extensa are proposed as mutually implicative ontological extants that serve to explain the key conceptual challenges of quantum theory; in particular, nonlocality, entanglement, null measurements, and wave function collapse. It is shown that a natural account of these quantum perplexities emerges, along with a need to reassess our usual ontological commitments involving the nature of space and time.
In this brief paper, we discuss the conceptual relationship between the role of the observer in quantum mechanics and the von Neumann Chain.
QM is a mathematical model of the physical world that describes the behavior of quantum systems (QS). A physical model is characterized by how it represents physical states, observables, measurements, and dynamics of the system under consideration. A quantum description of a physical model is based on the following concepts: A state is a complete description of a physical system. QM associates a ray in Hilbert space to the physical state of a system.
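The ray convention mentioned above can be sketched as follows (the notation is supplied here for illustration, not taken from the original):

```latex
% A physical state corresponds to a ray in Hilbert space H:
% a normalized vector, identified up to an overall phase factor.
\[
  \lvert \psi \rangle \in \mathcal{H},
  \qquad
  \langle \psi \vert \psi \rangle = 1 .
\]
% Two vectors differing only by a phase represent the same physical state:
\[
  \lvert \psi' \rangle = e^{i\theta} \lvert \psi \rangle
  \quad (\theta \in \mathbb{R})
  \;\;\Longrightarrow\;\;
  \lvert \psi' \rangle \text{ and } \lvert \psi \rangle
  \text{ describe the same physical state.}
\]
```

This is why measurement statistics, which depend only on moduli of inner products such as \(\lvert \langle \phi \vert \psi \rangle \rvert^{2}\), are unaffected by the choice of representative within a ray.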
The work done in the philosophy of modeling by Vaihinger (1876), Craik (1943), Rosenblueth and Wiener (1945), Apostel (1960), Minsky (1965), Klaus (1966) and Stachowiak (1973) is still almost completely neglected in the mainstream literature. However, this work seems to contain original ideas worth discussing. For example, the idea that the diverse functions of models can be better structured as follows: in fact, models perform only a single function – they replace their target systems, but for different purposes. Another example: the idea that all of cognition is cognition in models or by means of models. Even perception, reflexes and instincts (animal and human) can best be analyzed as modeling. The paper presents an analysis of the above-mentioned work.
In this paper I investigate, within the framework of realistic interpretations of the wave function in nonrelativistic quantum mechanics, the mathematical and physical nature of the wave function. I argue against the view that mathematically the wave function is a two-component scalar field on configuration space. First, I review how this view makes quantum mechanics non-Galilei-invariant and yields the wrong classical limit. Moreover, I argue that if the wave function is instead interpreted as a ray, in agreement with many physicists, Galilei invariance is preserved. In addition, I discuss how the wave function behaves more similarly to a gauge potential than to a field. Finally, I show how this favors a nomological rather than an ontological view of the wave function.