-
How to explain the Aharonov-Bohm (AB) effect remains deeply controversial, particularly regarding the tension between locality and gauge invariance. Recently Wallace argued that the AB effect can be explained in a local and gauge-invariant way by using the unitary gauge. In this paper, I present a critical analysis of Wallace’s intriguing argument. First, I show that the unitary gauge transforms the Schrödinger equation into the Madelung equations, which are expressed entirely in terms of local and gauge-invariant quantities. Next, I point out that an additional quantization condition must be imposed for the Madelung equations to be equivalent to the Schrödinger equation, and that this quantization condition is inherently nonlocal. Finally, I argue that the Madelung equations with the quantization condition can hardly explain the AB effect, even in a nonlocal way. This analysis suggests that the unitary gauge does not resolve the tension between locality and gauge invariance in explaining the AB effect, but rather highlights once more the profound conceptual challenges in reconciling the AB effect with a local and gauge-invariant framework.
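For reference, the structures at issue can be written down in standard notation (a sketch; the symbols are conventional, not drawn from the paper). Writing the wave function in polar form ψ = √ρ e^{iS/ℏ}, the Schrödinger equation for a particle of mass m and charge q splits into the Madelung equations
\[ \partial_t \rho + \nabla \cdot (\rho \mathbf{v}) = 0, \qquad \mathbf{v} = \frac{\nabla S - q\mathbf{A}}{m}, \]
\[ m\,(\partial_t + \mathbf{v} \cdot \nabla)\,\mathbf{v} = q\,(\mathbf{E} + \mathbf{v} \times \mathbf{B}) - \nabla Q, \qquad Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}}, \]
which involve only the local, gauge-invariant quantities ρ, v, E, B. Recovering the Schrödinger equation requires, in addition, that the phase be single-valued, i.e. that for every closed loop γ
\[ \oint_\gamma (m\mathbf{v} + q\mathbf{A}) \cdot d\mathbf{l} = nh, \qquad n \in \mathbb{Z}, \]
equivalently ∮_γ m v · dl = nh − qΦ_B(γ) with Φ_B the enclosed magnetic flux: a constraint on extended loops rather than on quantities at points, which is the sense in which the condition is nonlocal.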
-
The geometry of the universe is today widely believed to be flat based on combined data obtained during the 2000s. Prior to this, the geometry of the universe was essentially unknown. However, within the relevant literature one finds claims indicating a strong preference for a (nearly) closed universe, based on philosophical and other “non-experimental” reasons. The main aim of this article is to identify these reasons and assess the extent to which philosophical reasoning influenced the establishment of the dark matter hypothesis and the development of models for a closed universe. Building on groundwork laid by de Swart (2020), this study expands the discussion by (a) arguing that opinions on the geometry of the universe during the 1970s and 1980s were more divided than often assumed, (b) uncovering a lesser-known Machian argument for flat geometry proposed by Dennis Sciama, and (c) presenting a fine-tuning argument stemming from the ‘coincidence problem’ articulated by Robert Dicke. The study provides a nuanced perspective on how philosophical considerations contributed to shaping early views on cosmology and dark matter and highlights the significant role philosophical reasoning can play in guiding scientific inquiry in physics.
-
We introduce what we call the paradox of self consultation: the question of how a priori inquirers, like philosophers, mathematicians, and linguists, are able to (successfully) investigate matters of which they are initially ignorant by systematically questioning themselves. A related phenomenon is multiple grades of access: we find it extremely hard to think up analyses of our concepts that do not suffer from counterexamples; moderately hard to think up counterexamples to proposed analyses; and trivial to verify that a provided counterexample is genuine. We consider a range of potential explanations, including two-system approaches, and show why they are unsatisfactory, despite being on the right track. We then proceed to give a naturalistic solution to the paradox and to multiple grades of access. In doing so, we present a novel theory of epistemic work, which we connect to formal learning theory.
-
This paper is about a problem which arose in mathematics but is now widely considered by mathematicians to be a matter “merely” for philosophy. I want to show what philosophy can contribute to solving the problem by returning it to mathematics, and I will do that by elucidating what it is to be a solution to a mathematical problem at all.
-
The article summarizes the present state of research into the conceptual foundations of the periodic table. We give a brief historical account of the development of the periodic table and periodic system, including the impact of modern physics due to the discoveries of Moseley and Bohr and the advent of modern quantum mechanics. The role of the periodic table in the debate over the reduction of chemistry is discussed, including attempts to derive the Madelung rule from first principles. Other current debates concern the concept of an “element” and its dual role as simple substance and basic substance, and the question of whether elements and groups of elements constitute natural kinds. The second of these issues bears on further debates concerning the placement of certain elements, such as H, He, La and Ac, in the periodic table.
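For reference, a standard statement of the rule mentioned above (not specific to this article): the Madelung rule orders orbital filling by increasing n + ℓ, with ties broken in favour of smaller n, yielding the familiar sequence 1s, 2s, 2p, 3s, 3p, 4s, 3d, 4p, …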
-
Discussions of the compositionality of inferential roles concentrate on extralogical vocabulary. However, there are nontrivial problems concerning the compositionality of sentences formed by the standard constants of propositional logic. For example, is the inferential role of A ∧ B uniquely determined by those of A and B? And how is it determined? This paper investigates such questions. We also show that these issues raise matters of more significance than may prima facie appear.
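To make the question concrete (a minimal illustration, on the assumption that the connective at issue is conjunction): in a natural-deduction setting the basic inferential role of A ∧ B is given by its introduction and elimination rules,
\[ \frac{A \qquad B}{A \land B}\ (\land\text{I}) \qquad \frac{A \land B}{A}\ (\land\text{E}_1) \qquad \frac{A \land B}{B}\ (\land\text{E}_2), \]
and the compositionality question is whether the total inferential role of A ∧ B in a system (everything it entails and is entailed by) is thereby uniquely determined by the inferential roles of A and B alone.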
-
laying down a program for this study. It is written for everyone who is curious about the world of symbols that surrounds us, in particular researchers and students in philosophy, history, cognitive science, and mathematics education. The main characteristics of mathematical notations are introduced and discussed in relation to the intended subject matter, the language in which the notations are verbalized, the cognitive resources needed for learning and understanding them, the tasks that they are used for, their material basis, and the historical context in which they are situated. Specific criteria for the design and assessment of notations are discussed, as well as ontological, epistemological, and methodological questions that arise from the study of mathematical notations and of their use in mathematical practice.
-
While the traditional conception of inductive logic is Carnapian, I develop a Peircean alternative and use it to unify formal learning theory, statistics, and a significant part of machine learning: supervised learning. Some crucial standards for evaluating non-deductive inferences have been assumed separately in those areas, but can actually be justified by a unifying principle.
-
Incurvati and Schlöder (Journal of Philosophical Logic, 51(6), 1549–1582, 2022) have recently proposed to define supervaluationist logic in a multilateral framework, and claimed that this defuses well-known objections concerning supervaluationism’s apparent departures from classical logic. However, we note that the unconventional multilateral syntax prevents a straightforward comparison of inference rules of different levels across multi- and unilateral languages. This leaves it unclear how the supervaluationist multilateral logics actually relate to classical logic, and raises questions about Incurvati and Schlöder’s response to the objections. We overcome this obstacle by developing a general method for comparisons of strength between multi- and unilateral logics. We apply it to establish precisely on which inferential levels the supervaluationist multilateral logics defined by Incurvati and Schlöder are classical. Furthermore, we prove general limits on how classical a multilateral logic can be while remaining supervaluationistically acceptable. Multilateral supervaluationism leads to sentential logic being classical on the levels of theorems and regular inferences, but necessarily strictly weaker on meta- and higher levels, while in a first-order language with identity, even some classical theorems and inferences must be forfeited. Moreover, the results allow us to fill in the gaps in Incurvati and Schlöder’s strategy for defusing the relevant objections.
-
We all perform experiments very often. When I hear a noise and deliberately turn my head, I perform an experiment to find out what I will see if I turn my head. If I ask a question not knowing what answer I will hear, I am engaging in (human!) …
-
There is no doubt that a theory that is unified has a certain appeal. Scientific practice in fundamental physics relies heavily on it. But is a unified theory more likely to be empirically adequate than a non-unified theory? Myrvold has pointed out that, on a Bayesian account, only a specific form of unification, which he calls mutual information unification, can have confirmatory value. In this paper, we argue that Myrvold’s analysis suffers from an overly narrow understanding of what counts as evidence. If one frames evidence in a way that includes observations beyond the theory’s intended domain, one finds a much richer and more interesting perspective on the connection between unification and theory confirmation. By adopting this strategy, we give a Bayesian account of unification that (i) goes beyond mutual information unification to include other cases of unification, and (ii) gives a crucial role to the element of surprise in the discovery of a unified theory. We illustrate the explanatory strength of this account with some cases from fundamental physics and other disciplines.
-
Although the electron density can be calculated with the formal resources of quantum mechanics, in physics it does not play the leading role that the quantum state does. In contrast, the concept of electron density is central to quantum chemistry in any of its different approaches: the Hartree-Fock method, Density Functional Theory, and the Quantum Theory of Atoms in Molecules.
-
Bell’s conclusion from his famous inequality was that any hidden variable theory that satisfies Local Causality is incompatible with the predictions of Quantum Mechanics (QM) for Bell’s Experiment. However, Local Causality does not appear in the derivation of Bell’s inequality. Instead, two other assumptions are used, namely Factorizability and Settings Independence. Therefore, in order to establish Bell’s conclusion, we need to relate these two assumptions to Local Causality. The prospects for doing so turn out to depend on the assumed location of the hidden states that appear in Bell’s inequality. In this paper, I consider the following two views on such states: (1) that they are states of the two-particle system at the moment of preparation, and (2) that they are states of thick slices of the past light cones of measurements. I argue that straightforward attempts to establish Bell’s conclusion fail in both approaches. Then, I consider three refined attempts, which I also criticise, and I propose a new way of establishing Bell’s conclusion that combines intuitions underlying several previous approaches.
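For orientation, the two assumptions in their standard probabilistic form (conventional notation, not the paper’s own: λ is the hidden state, a and b the setting choices, A and B the outcomes on the two wings):
\[ P(A, B \mid a, b, \lambda) = P(A \mid a, \lambda)\,P(B \mid b, \lambda) \qquad \text{(Factorizability)} \]
\[ \rho(\lambda \mid a, b) = \rho(\lambda) \qquad \text{(Settings Independence)} \]
Bell’s inequality follows from these two conditions; Local Causality enters only through the further argument that any locally causal theory must satisfy them, and it is this step whose cogency depends on where the hidden states λ are taken to be located.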
-
The article examines the question of priority and simultaneous discovery in the context of the discovery of the periodic system. It is argued that rather than being anomalous, simultaneous discovery is the rule. Moreover, I argue that the discovery of the periodic system by at least six authors over a period of seven years represents one of the best examples of a multiple discovery. This notion is supported by a new view of the evolutionary development of science through a mechanism that is dubbed Sci-Gaia by analogy with Lovelock’s Gaia hypothesis.
-
In this paper, I contrast two broad decompositional approaches to verb semantics. One, especially associated with David Dowty, involves translating verbs using a set of precisely interpreted primitive predicates such as cause and become, in order to facilitate semantic generalizations such as patterns of entailment between sentences. Another, with multiple origins in both temporal semantics and theories of the syntax/semantics interface (including, notably, work by Pustejovsky and Piñón), involves developing a theory of the internal part structure of the eventualities that verbs and other expressions describe; I refer to this approach, following Pianesi and Varzi, as mereotopological. These two approaches to decomposition are not, strictly speaking, incompatible, and they have sometimes been combined; however, perhaps surprisingly, comparison of them has been unsystematic. I address this gap by describing more systematically how the approaches differ from each other, illustrating with differences in the insights they offer into specific aspects of the semantics of simple change of state verbs and unselected object resultatives. I especially aim to promote interest in the development of more sophisticated, cross-linguistically applicable theories of so-called event structure through appeal to a wider range of notions from mereotopology.
-
There has long been an impression that reliabilism implies externalism and that frequentist statistics, due to its reliabilist nature, is inherently externalist. I argue, however, that frequentist statistics can plausibly be understood as a form of internalist reliabilism—internalist in the conventional sense, yet reliabilist in certain unconventional and intriguing ways. Crucially, in developing the thesis that reliabilism does not imply externalism, my aim is not to stretch the meaning of ‘reliabilism’ merely to sever the implication. Instead, it is to gain a deeper understanding of frequentist statistics, which stands as one of the most sustained attempts by scientists to develop an epistemology for their own use.
-
I once received a simple test for whether I am a frequentist or Bayesian. A coin has just been tossed, but the outcome is hidden. What is the probability that it landed heads just now? According to the test, you are a Bayesian if your answer is ‘50%, because I am 50% sure that it landed heads, and equally sure that it didn’t.’ And you are a frequentist if your answer is ‘the probability is unknown but equals either 1 or 0, depending on whether the coin actually landed heads or tails, because probabilities are frequencies of events.’ Unfortunately, this test is too simplistic to reveal the complexity underlying the seemingly binary question: ‘To be a frequentist or Bayesian?’ There is actually a spectrum of potential answers, extending from radical frequentism to radical Bayesianism, with nuanced positions in between. Let me build up the spectrum one step at a time.
-
The debate between scientific realism and anti-realism remains at a stalemate, making reconciliation seem hopeless. Yet, important work remains: exploring a common ground, even if only to uncover deeper points of disagreement and, ideally, to benefit both sides of the debate. I propose such a common ground. Specifically, many anti-realists, such as instrumentalists, have yet to seriously engage with Sober’s call to justify their preferred version of Ockham’s razor through a positive account. Meanwhile, realists face a similar challenge: providing a non-circular explanation of how their version of Ockham’s razor connects to truth. The common ground I propose addresses these challenges for both sides; the key is to leverage the idea that everyone values some truths and to draw on insights from scientific fields that study scientific inference—namely, statistics and machine learning. This common ground also isolates a distinctively epistemic root of the irreconcilability in the realism debate. Keywords: Scientific Realism, Instrumentalism, Ockham’s Razor, Statistics, Machine Learning, Convergence to the Truth.
-
The epistemology of scientific inference has a rich history. According to the explanationist tradition, theory choice should be guided by a theory’s overall balance of explanatory virtues, such as simplicity, fit with data, and/or unification (Russell 1912). The instrumentalist tradition urges, instead, that scientific inference should be driven by the goal of obtaining useful models, rather than true theories or even approximately true ones (Duhem 1906). A third tradition is Bayesianism, which features a shift of focus from all-or-nothing beliefs to degrees of belief (Bayes 1763). It may be fair to say that these traditions are the big three in contemporary epistemology of scientific inference.
-
I would like to begin this review by stating that this is an absolutely wonderful book that is full of gems about the elements and the periodic table. In my own 2007 book on the periodic table I concluded that we should perhaps think of the variety of tables that have appeared as spanning a spectrum running from the most abstract and ‘perfect’ tables, such as Janet’s left-step table representation, to the unruly tables that emphasize the uniqueness of elements. To illustrate the latter category, I featured an image of Rayner-Canham’s table, which is also the table shown on the front cover of his new book now under review. Rayner-Canham’s book is all about the individuality of elements and how so many of the commonly held trends in the periodic table are far more complicated than we normally acknowledge.
-
In this paper, we introduce a concept of non-dependence of variables in formulas. A formula in first-order logic is non-dependent of a variable if the truth value of the formula does not depend on the value of that variable. This variable non-dependence can be subject to constraints on the values of some variables appearing in the formula; these constraints are expressed by another first-order formula. After investigating its basic properties, we apply this concept to simplify convoluted formulas by bringing out and discarding redundant nested quantifiers. Such convoluted formulas typically arise when one uses a translation function to interpret one theory in another.
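A toy illustration (my example, not the authors’): over the integers, the formula φ(x, y) := (x · y = 0) is non-dependent of y under the constraint x = 0, since it is then true whatever value y takes; under the constraint x ≠ 0 its truth value does depend on y. Non-dependence licenses discarding redundant quantifiers: if θ is non-dependent of y under constraints guaranteed by the surrounding context, then both ∃y θ and ∀y θ can be simplified to θ, which is how nested quantifiers introduced by a translation function can be peeled away.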
-
Our second stop in 2025 on the leisurely tour of SIST is Excursion 4 Tour II, which you can read here. This criticism of statistical significance tests continues to be controversial, but it shouldn’t be. …
-
When you’re investigating reality as a scientist (and often as an ordinary person) you perform experiments. Epistemologists and philosophers of science have spent a lot of time thinking about how to evaluate what you should do with the results of the experiments—how they should affect your beliefs or credences—but relatively little on the important question of which experiments you should perform epistemologically speaking. …
-
A speculative exploration of the distinction between a relational formal ontology and a classical formal ontology for modelling phenomena in nature that exhibit relationally-mediated wholism, such as phenomena from quantum physics and biosemiotics. Whereas a classical formal ontology is based on mathematical objects and classes, a relational formal ontology is based on mathematical signs and categories. A relational formal ontology involves nodal networks (systems of constrained iterative processes) that are dynamically sustained through signalling. The nodal networks are hierarchically ordered and exhibit characteristics of deep learning. Clarifying the distinction between classical and relational formal ontologies may help to clarify the role of interpretative context in physics (e.g., the role of the observer in quantum theory).
-
The anthropic principle suggests that the universe’s fundamental constants are precisely fine-tuned to allow for life. However, incorporating a dynamic physical perspective on nature, such as the multiscale thermodynamic principle known as Principium Luxuriæ, suggests that the fundamental constants and forces of the universe may evolve over time in a non-Euclidean universe. If the universe has this geometry, the implications, which are discussed in this paper, are profound: for example, the conditions conducive to life would be not static and finely tuned but transient, undermining the need for a fine-tuned universe. Given that multiscale thermodynamics requires external forces, it is plausible that the universe’s expansion could be linked to other phenomena, such as other universes acting as external forces, each with its own evolving laws of physics. This suggests that life might be a transient and coincidental occurrence across multiple universes, if they exist. Additionally, ever-evolving physical laws limit our ability to fully comprehend the universe at any given time. Since we inevitably overlook certain aspects of reality, physical systems cannot be fully explained by the sum of their parts. Consequently, emergent phenomena like consciousness could not be studied from a self-referential perspective, as there will always be elements beyond our understanding.
-
The paper studies in detail a precise formal construction of spacetime from matter suggested by the logician John Burgess. We presuppose a continuous and perdurantistic matter ontology. The result is a systematic method to translate claims about the geometry of a flat relativistic, or classical, spacetime into claims about geometrical relations between matter points. The approach is extended to electric and magnetic fields by treating them as multifields defined on matter, rather than as fields in the vacuum. A few tentative suggestions are made to adapt the method to general relativity and to quantum theories.
-
Competency questions (CQs) are broadly used within a variety of domains, such as education, to design curricula using Bloom’s Taxonomy; performance evaluation, to assess an employee’s performance; or assessing the fitness of interrogation in a trial [19,27]. Within ontology development, CQs are used throughout the process to guide development, such as in the NeOn methodology [23] and test-driven development [15], including scoping the ontology [25], aligning ontologies [24], validating content coverage [4,5,15], and assisting in interrogating the ontological nature of an entity when aligning a domain entity to an entity in a foundational ontology [3].
-
This paper introduces a digital method for analyzing propositional logical equivalences. It transforms theorem proving from a complex statement-derivation method into a simple number-comparison method. By applying the digital calculation method and the expression-number lookup table, we can quickly and directly discover and prove logical equivalences based on identical numbers; no additional operations are needed. This approach demonstrates significant advantages over conventional methods in terms of simplicity and efficiency.
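A minimal sketch of the general idea in Python (my reconstruction of a truth-table-as-number encoding; the paper’s own expression-number lookup table may differ): encode each formula by the integer whose bits form its truth-table column, so that two formulas are logically equivalent exactly when their numbers are identical.

from itertools import product

def tt_number(formula, variables):
    # The i-th bit of the result is the formula's truth value on the
    # i-th row of the truth table over `variables`.
    n = 0
    for i, values in enumerate(product([False, True], repeat=len(variables))):
        env = dict(zip(variables, values))
        if formula(env):
            n |= 1 << i
    return n

# Example: p -> q and ~(p & ~q) receive the same number, so they are equivalent.
vs = ["p", "q"]
implies    = lambda e: (not e["p"]) or e["q"]       # p -> q
equiv_form = lambda e: not (e["p"] and not e["q"])  # ~(p & ~q)
assert tt_number(implies, vs) == tt_number(equiv_form, vs)  # both 0b1011 = 11

Comparing the two integers replaces a derivation in a proof system with a single equality check.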
-
Report on the Conference on Probabilistic Reasoning in the Sciences, which took place at the Marche Polytechnic University in Ancona, Italy, 29-31 August 2024. Keywords: Probabilistic reasoning; Science; Methodology.
-
It has been a long day and you are making your way through a paper related to your work. You suddenly come across the following remark: “. . . since u and v are eigenvectors of T with distinct eigenvalues, they are linearly independent.” Wait—how does the proof go? You should really know this. Here u and v are nonzero elements of a vector space V and T : V → V is a linear map. You force yourself to pick up a pen and write down the following argument: Let T(u) = λu and T(v) = μv with λ ≠ μ. Suppose au + bv = 0. Applying T and using linearity, we have aλu + bμv = 0. Multiplying the original equation by μ, we have aμu + bμv = 0. Subtracting the two yields (λ − μ)au = 0, and since λ − μ and u are nonzero, we have a = 0. The corresponding argument with the roles of u and v swapped yields b = 0, so the only linear combination of u and v that yields 0 is the trivial one.