The purpose of this paper is to present a paraconsistent formal system and a corresponding intended interpretation according to which true contradictions are not tolerated. Contradictions are, instead, epistemically understood as conflicting evidence, where evidence for a proposition A is understood as reasons for believing that A is true. The paper defines a paraconsistent and paracomplete natural deduction system, called the Basic Logic of Evidence (BLE), and extends it to the Logic of Evidence and Truth (LETJ). The latter is a logic of formal inconsistency and undeterminedness that is able to express not only preservation of evidence but also preservation of truth. LETJ is anti-dialetheist in the sense that, according to the intuitive interpretation proposed here, its consequence relation is trivial in the presence of any true contradiction. Adequate semantics and a decision method are presented for both BLE and LETJ, as well as some technical results that fit the intended interpretation.
Network analysis needs tools to infer distributions over graphs of arbitrary size from a single graph. Assuming the distribution is generated by a continuous latent space model which obeys certain natural symmetry and smoothness properties, we establish three levels of consistency for non-parametric maximum likelihood inference as the number of nodes grows: (i) the estimated locations of all nodes converge in probability on their true locations; (ii) the distribution over locations in the latent space converges on the true distribution; and (iii) the distribution over graphs of arbitrary size converges.
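The kind of continuous latent space model the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's own construction: the function names, the uniform latent distribution, and the exponential link kernel are all assumptions chosen to exhibit the required symmetry and smoothness.

```python
import random
import math

def sample_graph(n, link_prob, sample_location, seed=None):
    """Sample an n-node graph from a latent space model:
    draw i.i.d. latent locations, then connect each unordered pair
    of nodes independently with probability link_prob(x_i, x_j)."""
    rng = random.Random(seed)
    locations = [sample_location(rng) for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < link_prob(locations[i], locations[j]):
                edges.add((i, j))
    return locations, edges

# Illustrative choices: latent locations uniform on [0, 1], and a
# symmetric, smooth link kernel that decays with latent distance.
locs, edges = sample_graph(
    n=50,
    link_prob=lambda x, y: math.exp(-4 * abs(x - y)),
    sample_location=lambda rng: rng.random(),
    seed=0,
)
```

Because the kernel depends only on the two locations and is symmetric, the resulting distribution over graphs is exchangeable in the node labels, which is the sort of symmetry property the consistency results presuppose.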
Molecular biologists exploit information conveyed by mechanistic models for experimental purposes. In this contribution, I make sense of this aspect of biological practice by developing Keller’s idea of the distinction between ‘models of’ and ‘models for’. ‘Models of (phenomena)’ should be understood as models representing phenomena and they are valuable if they explain phenomena. ‘Models for (manipulating phenomena)’ suggest new types of material manipulations and they are important not because of their explanatory force, but because of the interventionist strategies they afford. This is a distinction between aspects of the same model; in molecular biology, models may be treated either as ‘models of’ or as ‘models for’. By analyzing the discovery and characterization of restriction-modification systems and their exploitation for DNA cloning and mapping, I identify the differences between treating a model as a ‘model of’ or as a ‘model for’. These lie in a cognitive disposition of the modeler towards the model. A modeler will look at a model as a ‘model of’ if he/she is interested in its explanatory force, or as a ‘model for’ if the interest is in the material manipulations it can possibly afford.
The problem of the direction of the electromagnetic arrow of time is perhaps the most perplexing of the major unsolved problems of contemporary physics, because the usual tools of theoretical physics cannot be used to investigate it. Even the clues provided by the CP violation of the K₂ meson, which have led to a profound insight into the dominance of matter over antimatter in the universe, have not shed any light on the problem of the origin of the electromagnetic arrow of time.
One response to the problem of logical omniscience in standard possible worlds models of belief is to extend the space of worlds so as to include impossible worlds. It is natural to think that essentially the same strategy can be applied to probabilistic models of partial belief, for which parallel problems also arise. In this paper, I note a difficulty with the inclusion of impossible worlds into probabilistic models. Under weak assumptions about the space of worlds, most of the propositions which can be constructed from possible and impossible worlds are in an important sense inexpressible, leaving the probabilistic model committed to saying that agents in general have at least as many attitudes towards inexpressible propositions as they do towards expressible propositions. If it is reasonable to think that our attitudes are generally expressible, then a model with such commitments looks problematic.
It is clear that members entering a community are formed, via the process of enculturation, by a "cultural framework". It is also clear that such a "cultural framework", in turn, is produced by the members of the community. The nature of this dialectical movement – producing the framework while being produced by it – has been investigated for a long time; however, it is only recently that some scholars have been coming to realize the central importance of rules and norms for the adequate description of the process. In this paper I argue that to understand this dialectical movement we must give pride of place to norms quite radically – we must realize how deeply normative creatures we humans are. I argue that even the most promising accounts of this movement, such as those based on the concept of "mindshaping" or on the idea of "social niche construction", must be seen as essentially normative enterprises.
Bruno Bauer (6 September 1809–13 April 1882), philosopher,
historian, and theologian. His career falls into two main phases,
divided by the Revolutions of 1848. In the 1840s, the period known as
the Vormärz or the prelude to the German revolutions of March
1848, Bauer was a leader of the Left-Hegelian movement, developing a
republican interpretation of Hegel, which combined ethical and
aesthetic motifs. His theory of infinite self-consciousness, derived
from Hegel’s account of subjective spirit, stressed rational autonomy
and historical progress. Investigating the textual sources of
Christianity, Bauer described religion as a form of alienation, which,
because of the deficiencies of earthly life, projected irrational,
transcendent powers over the self, while sanctioning particularistic
sectarian and material interests.
There are various equivalent formulations of the Church-Turing thesis. A common one is that every effective computation can be carried out by
a Turing machine. The Church-Turing thesis is often misunderstood,
particularly in recent writing in the philosophy of mind.
Anthropic reasoning based on the apparent fine-tuning of physical parameters – the fact that the parameters of scientific theory take values in an apparently tiny range that allows life – has been reinvigorated with the realization that string theory, far from determining the parameter values at issue, has models of great diversity. From this fact, many are convinced that the fine-tuning evidence is best explained by the observed universe’s selection as life-permitting (from many real sub-universes most of which do not allow the creation of life). Others see the universe as purposeful and perhaps designed. However, all fine-tuning arguments presuppose a governing conception of laws of nature. This paper argues that a David Lewis-style best-system account of scientific law disarms the anthropic argument. Indeed, in light of the fact that even rejected scientific theories are in many cases fine-tuned, anthropic reasoning may point toward a deflationary metaphysics rather than the extravagant designer-or-multiverse alternative.
Ammonius (ca. 435/445–517/526) taught philosophy at Alexandria,
where his father Hermeias had taught earlier. Known primarily for his
commentaries on Aristotle, which were said to be of greater benefit
than anyone else’s, he was also distinguished in geometry and
astronomy. Himself a pupil of Proclus at Athens, at Alexandria
Ammonius taught most of the important Platonists of the late
5th and early 6th centuries: Asclepius,
Damascius and Simplicius, Eutocius, and Olympiodorus; Elias and David
are considered indirect pupils of his. Damascius, who went on to head
the school at Athens, heard Ammonius lecture, but attached himself
rather to the mentorship of Isidore and followed him to Athens.
After a brief presentation of Feynman diagrams, we criticize the idea that Feynman diagrams can be considered to be pictures or depictions of actual physical processes. We then show that the best interpretation of the role they play in quantum field theory and quantum electrodynamics is captured by Hughes' Denotation, Demonstration and Interpretation theory of models (DDI), where “models” are to be interpreted as inferential, non-representational devices constructed in given social contexts by the community of physicists.
The project of naturalistic metaphysics appears straightforward. Start with one’s best scientific theories and infer one’s metaphysical commitments from what these theories say exists and the sort of ideological frameworks they employ. Yet, as many have noted, naturalism poses challenges for metaphysics as it is typically practiced. In particular, once scientific theories themselves offer verdicts about the sort of things that exist, the properties they have, and the spatiotemporal structures they occupy, what more is there for metaphysicians to contribute than simply repeating what is already known? Even if the work is straightforward, in becoming naturalistic, metaphysics seems to promote its own obsolescence. The goal of this paper is to evaluate one influential response to this concern, one that has been appealing to many contemporary metaphysicians who are naturalists. This is to argue that although it might appear that metaphysics and science are aimed at a common set of questions about the sorts of entities the world contains and what they are like, this appearance is misleading. Metaphysicians rather address a distinctive subject matter, a subject matter more fundamental than that of science.
[Editor's Note: The following new entry by David Vander Laan replaces the
former entry on this topic by the previous authors.] In the philosophy of religion, creation is the action by
which God brings an object into existence, while conservation
is the action by which God maintains the existence of an object over
time. The major monotheisms unambiguously affirm that God both created
the world and conserves it. It is less clear, however, whether
creation and conservation are to be conceived as distinct kinds of
actions. The question has its roots in medieval and early modern
characterizations of divine action, and it has received renewed
attention in recent decades.
It is tempting to make the final “proportionality” condition of the Principle of Double Effect say that the overall consequences of the action are good or neutral, perhaps after screening off any consequences that come through evil (cf. …
By perfectly fine I mean: not at all morally blameworthy. By aiming I mean: being ready to calibrate ourselves up or down to hit the target. I would contrast aiming with settling, which does not necessarily involve calibrating down if one is above target. …
Joseph Halpern and Judea Pearl () draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation.
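The structural-equation framework behind such analyses can be illustrated with a minimal sketch. This is not Halpern and Pearl's own formalism or code; the `evaluate` function, the forest-fire variables, and the disjunctive equation are hypothetical choices used to show how interventions (do-operations) enter counterfactual evaluation.

```python
def evaluate(equations, exogenous, intervention=None):
    """Evaluate a deterministic structural equation model.
    equations: dict mapping each endogenous variable to a function of
    the current assignment, evaluated in insertion order.
    intervention: dict of variables forced to fixed values, do(X=x);
    an intervened variable's equation is cut and its value held fixed."""
    intervention = intervention or {}
    # Interventions override exogenous values and structural equations.
    state = {**exogenous, **intervention}
    for var, f in equations.items():
        if var not in intervention:
            state[var] = f(state)
    return state

# Hypothetical forest-fire example: fire occurs if lightning strikes
# or a match is dropped (a disjunctive pair of causes).
eqs = {"fire": lambda s: s["lightning"] or s["match"]}
actual = evaluate(eqs, {"lightning": True, "match": True})
# The counterfactual test behind 'actual cause': hold the match fixed
# at False (a contingency) and intervene to remove the lightning.
counterfactual = evaluate(
    eqs, {"lightning": True, "match": False},
    intervention={"lightning": False},
)
```

In the actual setting the fire occurs; under the contingency with the match absent, intervening on the lightning makes the fire counterfactually depend on it, which is the shape of reasoning the deterministic analysis formalizes.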
Here’s a simple “construction”, going back to Blackwell and Diaconis but simplified not to use ultrafilters, of a non-measurable set out of coin-toss sequences, i.e., an event that doesn’t have a well-defined probability. …
For a long time I have wondered, with an uneasy feeling that there was something I couldn't see, about the relationship between two-dimensional semantics and my approach to analysing subjunctive necessity de dicto. …
September’s general elections have brought Germany its own Brexit/Trump moment. For the first time since 1945 a far-right nationalist party is part of the German national parliament. The Alternative for Germany, AfD, gained 12.6% of German votes. …
This paper is about the putative theoretical virtue of strength, as it might be used in abductive arguments to the correct logic in the epistemology of logic. It argues for three theses. The first is that the well-defined property of logical strength is neither a virtue nor a vice, so that logically weaker theories are not—all other things being equal—worse or better theories than logically stronger ones. The second thesis is that logical strength does not entail the looser characteristic of scientific strength, and the third is that many modern logics are on a par—or can be made to be on a par—with respect to scientific strength.
As Feynman (1982) observed, “we always have had a great deal of difficulty in understanding the world view that quantum mechanics represents” (471). Among the perplexing aspects of quantum mechanics is its seeming, on a wide variety of presently live realist interpretations (including but not limited to the so-called ‘orthodox’ interpretation), to violate the classical supposition of ‘value definiteness’, according to which the properties—a.k.a. ‘observables’—of a given particle or system have precise values at all times. Indeed, value indefiniteness lies at the heart of what is supposed to be distinctive about quantum phenomena, as per the following classic cases:
Facts, philosophers like to say, are opposed to theories and to values
(cf. Rundle 1993) and are to be distinguished from things, in
particular from complex objects, complexes and wholes, and from
relations. They are the objects of certain mental states and acts,
they make truth-bearers true and correspond to truths, they are part
of the furniture of the world. We present and discuss some
philosophical and formal accounts of facts.
“Intuitionistic logic” is a term that unfortunately gains
ever greater currency; it conveys a wholly false view on
intuitionistic mathematics. —Freudenthal 1937
Intuitionistic logic is an offshoot of L.E.J. Brouwer’s
intuitionistic mathematics. A widespread misconception has it that
intuitionistic logic is the logic underlying Brouwer’s
intuitionism; instead, the intuitionism underlies the logic, which is
construed as an application of intuitionistic mathematics to language. Intuitionistic mathematics consists in the act of effecting mental
constructions of a certain kind. These are themselves not linguistic
in nature, but when acts of construction and their results are
described in a language, the descriptions may come to exhibit
In this paper we compare two different notions of ‘power’, both of which attempt to provide a realist understanding of quantum mechanics grounded on the potential mode of existence. For this purpose we will begin by introducing two different notions of potentiality already present within Aristotelian metaphysics, namely, irrational potentiality and rational potentiality. After discussing the role played by potentiality within classical and quantum mechanics, we will address the notion of causal power, which is directly related to irrational potentiality and has been adopted by many interpretations of QM. We will then present the notion of immanent power, which relates to rational potentiality, and argue that this new concept presents important advantages regarding the possibilities it provides for understanding the theory of quanta in a novel manner. We end our paper with a comparison between both notions of ‘power’, stressing some radical differences between them.
The method of explication has been something of a hot topic in the last ten years. Despite the multifaceted research that has been directed at the issue, one may perceive a lack of step-by-step procedural or structural accounts of explication. This paper aims at providing a structural account of the method of explication in continuation of the works of Geo Siegwart. It is enhanced with a detailed terminology for the assessment and comparison of explications. The aim is to provide means to talk about explications, including their criticisms and their interrelations. There is hope that this treatment will be able to serve as a foundation for a step-by-step guide to be established for explicators. At least it should help to frame and mediate explicative disputes. In closing, the enterprise itself will be considered as an explication of ‘explication’, though consecutive explications improving on this one are undoubtedly conceivable.
This paper introduces and examines the prospects of the recent research in a holographic relation between entanglement and spacetime pioneered by Mark van Raamsdonk and collaborators. Their thesis is that entanglement in a holographic quantum state is crucial for connectivity in its spacetime dual. Utilizing this relation, the paper develops a thought experiment that promises to probe the nature of spacetime by monitoring the behavior of a spacetime when all entanglement is removed between local degrees of freedom in its dual quantum state. The thought experiment suggests a picture of spacetime as consisting of robust nodes that are connected by non-robust bulk spacetime that is sensitive to changes in entanglement in the dual quantum state. However, rather than pursuing the thought experiment in further detail, the credibility of the relation between spacetime and entanglement in this zero entanglement limit is questioned. The energy of a quantum system generally increases when all entanglement is removed between subsystems, and so does the energy of its spacetime dual. If a system is subdivided into an infinite number of subsystems and all entanglement between them is removed, then the energy of the quantum system and the energy of its spacetime dual are at risk of diverging. While this is a prima facie worry for the thought experiment, it does not constitute a conclusive refutation.
This paper argues that scale-dependence of physical and biological processes offers resistance to reductionism and has implications that support a specific kind of downward causation. I demonstrate how insights from multiscale modeling can provide a concrete mathematical interpretation of downward causation as boundary conditions for models used to represent processes at lower scales. The autonomy and role of macroscale parameters and higher-level constraints are illustrated through examples of multiscale modeling in physics, developmental biology, and systems biology. Drawing on these examples, I defend the explanatory importance of constraining relations for understanding the behavior of biological systems.
The previous chapter examined the inductive logic applicable to an infinite lottery machine. Such a machine generates a countably infinite set of outcomes, that is, there are as many outcomes as natural numbers, 1, 2, 3, … We found there that, if the lottery machine is to operate without favoring any particular outcome, the inductive logic native to the system is not probabilistic. A countably infinite set is the smallest in the hierarchy of infinities. The next size routinely considered is a continuum-sized set, such as given by the set of all real numbers or even just by the set of all real numbers in some interval, from, say, 0 to 1.
Aristotle (b. 384 – d. 322 BCE) was a Greek philosopher,
logician, and scientist. Along with his teacher Plato, Aristotle is
generally regarded as one of the most influential ancient thinkers in
a number of philosophical fields, including political theory. Aristotle was born in Stagira in northern Greece, and his father was a
court physician to the king of Macedon. As a young man he studied in
Plato's Academy in Athens. After Plato's death he left Athens to
conduct philosophical and biological research in Asia Minor and
Lesbos, and he was then invited by King Philip II of Macedon to tutor
his young son, Alexander the Great.
The notion of transworld identity—‘identity across
possible worlds’—is the notion that the same object exists
in more than one possible world (with the actual world treated as one
of the possible worlds). It therefore has its home in a
‘possible-worlds’ framework for analysing, or at least
paraphrasing, statements about what is possible or necessary. The subject of transworld identity has been highly contentious, even
among philosophers who accept the legitimacy of talk of possible
worlds. Opinions range from the view that the notion of an identity
that holds between objects in distinct possible worlds is so
problematic as to be unacceptable, to the view that the notion is
utterly innocuous, and no more problematic than the uncontroversial
claim that individuals could have existed with somewhat different properties.