Not the Measurement Problem’s Problem: Black Hole Information Loss with Schrödinger’s Cat
Dr. Saaskhi Dulan, Johns Hopkins University
Several philosophers and physicists have recently taken note of the hegemony of unitarity in the black hole information loss discourse and are challenging its legitimacy in the face of the measurement problem. They proclaim that embracing non-unitarity solves two paradoxes for the price of one. Though I share their distaste for the philosophical bias, I disagree with their strategy of still privileging certain interpretations of quantum theory. I argue that information-restoring solutions can be interpretation-neutral because the manifestation of non-unitarity in Hawking’s original derivation is unrelated to what is found in collapse theories or generalized stochastic approaches, thereby decoupling the two puzzles.
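To fix the formal point at stake (these are standard textbook expressions, displayed only to make vivid the sense of non-unitarity I have in mind): unitary evolution $\rho \mapsto U\rho U^{\dagger}$ preserves the purity $\mathrm{Tr}(\rho^{2})$ of a quantum state, whereas Hawking’s semiclassical derivation takes an initially pure state to a thermal mixed state,
$$|\psi\rangle\langle\psi| \;\longmapsto\; \rho_{\mathrm{thermal}}, \qquad \mathrm{Tr}(\rho_{\mathrm{thermal}}^{2}) < 1,$$
a transition no unitary map can effect. It is this failure of unitarity, I will argue, that is orthogonal to the stochastic modification of the Schrödinger dynamics posited by collapse theories.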
Information and Dynamics: Competing Metaphors for Biological Intelligence
Prof. Dr. Benjamin Jantzen, Virginia Tech, USA
It is fashionable to consider organism behavior – the variable response of a living thing in a dynamic environment – in terms of information. Organisms are said to harvest or take in information from the environment that is then processed to generate a behavioral response. This informational perspective in turn supports a number of theses concerning behavior, such as the idea that large brains are needed for "complex" behavioral repertoires, or that adaptive behavior involves the storage of states reflecting bits of information about the world external to the organism. There are reasons to be suspicious of these and other consequences of informational thinking, and thus of the underlying metaphor. Here, I will focus just on the connection between internal and external states in producing adaptive behavior.
There is a long tradition of describing organisms as information processors, and their internal states as engaged in the storage of information and its transformation by computation. Broadly speaking, this perspective divides behavior along the boundary of the organism. Information enters by way of sensation, is transformed by computation, and exits in the form of behaviors, whether physiological or mechanical. Such a view demands that enough information be passed in from sensory systems at a high enough rate, and be transformed quickly enough via computation, to produce responsive, adaptive behavior. It thus puts constraints on the relation between the world outside and the world inside an organism. Rate-distortion theory, for instance, tells us, given some measure of performance as a function of internal and external states, how many bits of mutual information between internal states and the environment are needed to optimize performance. It is presumed that this mutual information between external and internal states is like matter or energy that the organism must harvest to survive.
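To make the rate-distortion claim concrete, in the standard notation of information theory rather than any notation specific to this talk: treating external states as a source $X$, internal states as a reconstruction $\hat{X}$, and a distortion measure $d(x,\hat{x})$ as the (inverse) performance measure, the rate-distortion function
$$R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\;\mathbb{E}[d(X,\hat{X})]\,\le\,D} I(X;\hat{X})$$
gives the minimum number of bits of mutual information between internal and external states compatible with an expected distortion of at most $D$.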
However, I argue that two classes of systems for realizing adaptive behavior – reservoir computers and systems exhibiting “strong anticipation” – put pressure on this view, or at least on the utility of information talk as anything more than metaphor. In both cases, it seems that a straightforward application of information-theoretic approaches yields contradictory judgments about the amount of information the system must harvest from its environment, and that these cases are better viewed in dynamical rather than information-theoretic terms. This in turn suggests ways of improving our quantitative study and application of such systems by selecting more appropriate metrics of their behavioral capacities.
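A minimal sketch of the first class of systems may help fix ideas. The following illustrates an echo state network, the canonical reservoir computer: a fixed random recurrent network in which only a linear readout is ever trained. The prediction task, parameter values, and names below are illustrative assumptions, not details drawn from the talk.

    import numpy as np

    rng = np.random.default_rng(0)
    N_RES = 200  # reservoir size (illustrative choice)

    # Fixed random input and recurrent weights; only W_out is ever trained.
    W_in = rng.uniform(-0.5, 0.5, N_RES)
    W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius < 1

    def run_reservoir(u):
        """Drive the reservoir with scalar input sequence u; return all states."""
        x = np.zeros(N_RES)
        states = np.empty((len(u), N_RES))
        for t, u_t in enumerate(u):
            x = np.tanh(W @ x + W_in * u_t)
            states[t] = x
        return states

    # Hypothetical task: one-step prediction of a noisy sine wave.
    u = np.sin(np.linspace(0, 60, 3000)) + 0.05 * rng.standard_normal(3000)
    X, y = run_reservoir(u[:-1]), u[1:]

    # Ridge-regression readout; the reservoir itself is never adapted to the task.
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N_RES), X.T @ y)
    print("MSE:", np.mean((X @ W_out - y) ** 2))

The salient feature is that the reservoir stores no task-specific code: its task-relevant “memory” lives in transient dynamics, which is one way the information-harvesting picture comes under strain.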
Disentangling the uses of ‘entropy’ and ‘information’ in classical thermal physics
Prof. Dr. Javier Anta, Universidad de Sevilla
My aim in this talk is fourfold. Firstly, I will systematically analyze how the notions of ‘entropy’ and ‘information’ have been used in classical thermal physics from the 1930s to the present day, describing the cluster of meanings involved in those conceptual usages and tracing their historical evolution. Secondly, I will assess the extent to which this all-pervasive entangled use of entropy and information notions can be defective in classical physics in various ways (e.g., semantically, epistemically [Anta 2021], mathematically, and so on), such as being poorly defined, being simply meaningless, or even fostering confusion among scientists (see Anta 2023). Thirdly, I will evaluate the two main strategies on offer for improving these defective conceptual practices: (i) substituting non-loaded terms for the highly connoted ‘entropy’ and ‘information’, as first argued by Yehoshua Bar-Hillel in the 1950s; or (ii) developing a prescription for how these terms should be employed in order to be technically meaningful, as pioneered by Rudolf Carnap (1977) in Two Essays on Entropy. Finally, I will argue that, for any such ameliorative strategy to succeed, it is necessary to develop a well-defined implementational strategy (Anta 2025) by which ameliorative conceptual prescriptions about how to use ‘entropy’ and ‘information’ can have a significant impact on the scientific community.
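The formal coincidence driving this entangled usage is worth displaying. In standard notation, the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory,
$$S = -k_{B}\sum_{i} p_{i}\ln p_{i}, \qquad H = -\sum_{i} p_{i}\log_{2} p_{i},$$
agree up to units and the base of the logarithm, which makes it all too easy to slide between thermodynamic and informational readings of the same formula; it is precisely such slides that the ameliorative strategies above are meant to discipline.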
Is (the concept of) Information Fundamental in Physics?
Prof. Dr. Chris Tompson, University of Oxford
There has been considerable excitement around the idea that the concept of information provides a new unifying idea in the foundations of physics, with some even going so far as to suggest that information might be physically fundamental. But what does it even mean to claim that something is physically fundamental? I shall suggest that there are a number of different relevant notions of ‘physically fundamental’, and that when one is clear on these distinctions one can get a much better sense of what work the concept of information is doing for us. One immediate lesson is that there is no reason at all to believe that information is ontologically fundamental in physics.