The Social Bubble of Physics

Sabine Hossenfelder is on a tear this week, with two excellent and highly provocative pieces about research practice in theoretical physics, a topic on which she has become the field’s most perceptive critic.

The first is in this month’s Nature Physics, entitled “Science needs reason to be trusted”. I’ll quote fairly extensively so that you get the gist of her argument:

But we have a crisis of an entirely different sort: we produce a huge amount of new theories and yet none of them is ever empirically confirmed. Let’s call it the overproduction crisis. We use the approved methods of our field, see they don’t work, but don’t draw consequences. Like a fly hitting the window pane, we repeat ourselves over and over again, expecting different results.

Some of my colleagues will disagree we have a crisis. They’ll tell you that we have made great progress in the past few decades (despite nothing coming out of it), and that it’s normal for progress to slow down as a field matures — this isn’t the eighteenth century, and finding fundamentally new physics today isn’t as simple as it used to be. Fair enough. But my issue isn’t the snail’s pace of progress per se, it’s that the current practices in theory development signal a failure of the scientific method…

If scientists are selectively exposed to information from likeminded peers, if they are punished for not attracting enough attention, if they face hurdles to leave a research area when its promise declines, they can’t be counted on to be objective. That’s the situation we’re in today — and we have accepted it.

To me, our inability — or maybe even unwillingness — to limit the influence of social and cognitive biases in scientific communities is a serious systemic failure. We don’t protect the values of our discipline. The only response I see are attempts to blame others: funding agencies, higher education administrators or policy makers. But none of these parties is interested in wasting money on useless research. They rely on us, the scientists, to tell them how science works.

I offered examples for the missing self-correction from my own discipline. It seems reasonable that social dynamics is more influential in areas starved of data, so the foundations of physics are probably an extreme case. But at its root, the problem affects all scientific communities. Last year, the Brexit campaign and the US presidential campaign showed us what post-factual politics looks like — a development that must be utterly disturbing for anyone with a background in science. Ignoring facts is futile. But we too are ignoring the facts: there’s no evidence that intelligence provides immunity against social and cognitive biases, so their presence must be our default assumption…

Scientific communities have changed dramatically in the past few decades. There are more of us, we collaborate more, and we share more information than ever before. All this amplifies social feedback, and it’s naive to believe that when our communities change we don’t have to update our methods too.

How can we blame the public for being misinformed because they live in social bubbles if we’re guilty of it too?

There’s a lot of food for thought in the whole article, and it raises the important question of why the now long-standing dysfunctional situation in the field is not being widely acknowledged or addressed.

For some commentary by Chad Orzel on one aspect of the article, see here.

On top of this, yesterday’s blog entry at Backreaction was a good explanation of the black hole information paradox, coupled with an excellent sociological discussion of why this has become a topic occupying a large number of researchers. That a large number of people have been working on something while showing no signs of finding anything that looks interesting has seemed to me a good reason not to pay much attention, which is why I’m not that well-informed about exactly what has been going on in this subject. When I have thought about it, it seemed to me that there was no way to make the problem well-defined as long as one lacks a good theory of quantized space-time degrees of freedom that would tell one what was going on at the singularity and at the end-point of black hole evaporation.

Hossenfelder describes the idea that what happens at the singularity is the answer to the “paradox” as the “obvious solution”. Her take on why it’s not conventional wisdom is provocative:

What happened, to make a long story short, is that Lenny Susskind wrote a dismissive paper about the idea that information is kept in black holes until late. This dismissal gave everybody else the opportunity to claim that the obvious solution doesn’t work and to henceforth produce endless amounts of papers on other speculations.

Excuse the cynicism, but that’s my take on the situation. I’ll even admit having contributed to the paper pile because that’s how academia works. I too have to make a living somehow.

So that’s the other reason why physicists worry so much about the black hole information loss problem: Because it’s speculation unconstrained by data, it’s easy to write papers about it, and there are so many people working on it that citations aren’t hard to come by either.

I hope this second piece too will generate some interesting debate within the field.

Note: It took about 5 minutes for this posting to attract people who want to argue about Brexit or the political situation in the US. Please don’t do this, any attempts to turn the discussion to those topics will be ruthlessly deleted.


71 Responses to The Social Bubble of Physics

  1. Bernardo says:

    If this type of social dynamics is happening in Physics, where any given direction of research is at least driven by the goal of saying something about reality and is expected to be tested in experiments, imagine how bad the situation may be in Mathematics. It is possible that personalities and social dynamics influence the evolution of math to such an extent that, under a small perturbation, the whole field would be unrecognizable. Also, what if all the math connected to this field of theoretical physics is a waste of time and, in the long run, will become completely irrelevant?

  2. Peter Woit says:

    Bernardo,
    Mathematics has never been driven by experiment, so it has evolved its own standards and practices, which have allowed it to make (and continue to make, unlike physics) great progress. This posting is about the situation in physics, but an interesting question I’ve often raised is whether physics, as it finds itself starved of the new experimental results that in the past kept it moving forward, might learn something from mathematics about how to cope. Unfortunately, this is one topic that Hossenfelder doesn’t address.

  3. Bernardo says:

    Much if not most of mathematics is in some sense driven by applications, if not by experiments in the sense of discovering something new about nature. (One example is the work on wavelets for which the latest Abel Prize was awarded, but of course there are many other examples.) I guess what I meant was that the situation in theoretical physics described in your post is the field running into a hard wall and still allowing social dynamics to determine its development, while in mathematics there is no such hard wall. I grant you that mathematics keeps making great progress, but these parallels with physics make me wonder how different mathematics would look under different circumstances. But I wonder what you think physics can learn from mathematics?

  4. Peter Woit says:

    Bernardo,
    I’ve often written about this, with one of the most important things being, I think, mathematicians’ insistence on being clear about precisely what you understand and what you don’t. If you don’t have experiment to keep you honest, the field needs strong internal norms to keep clear the difference between what works and what doesn’t (and strong internal norms to keep people honest in their evaluations of what is working and what isn’t). How to change the culture of physics to move in this direction is a difficult question, especially since my impression is that most physicists at this point do not see this as a problem.

  5. G. S. says:

    I don’t know if it’s fair to generalize the problems of theoretical physics to being problems of all scientific disciplines.

    Sabine’s article mentions a problem common to all fields of science (reproducibility – which she admits is being addressed by the scientific community), but then pivots to the problems particular to theoretical physics. What other fields have the same problems as theoretical physics?

    She mentions that “at its root, the problem affects all scientific communities”, but then chooses to give an example from politics rather than from another scientific discipline.

    Certainly the soft sciences (e.g., economics, sociology, etc.) have examples of theories that will not die despite a lack of testable predictions. Heck, economics is rife with theories that survive despite clear contradictory evidence (e.g., strong forms of the efficient market hypothesis).

    But what about the other hard sciences? Are there similar problems in biology, chemistry, geology, meteorology, botany, engineering, computer science, material science, etc.? If so, what are they? What are the niches of those fields in which social pressure is resulting in deviation from the scientific method?

    They may exist. But until the examples are given, the article would more accurately be titled “Theoretical Physicists need reason to be trusted”. The generalization to all sciences is unfair.

  6. Bernardo says:

    Peter,
    In my opinion, if physicists try to be more like mathematicians, something precious will be lost. You sometimes must have the attitude of solving the problem by any means necessary to get to the answer, even if it takes a few more decades to make sense of it.

  7. Peter Woit says:

    Bernardo,
    Don’t worry, based on my experience, there is no danger that theoretical physicists will take an interest anytime soon in behaving more like mathematicians and paying serious attention to the difference between what they understand and what they don’t.

  8. CIP says:

    Particle physics today has some parallels with atomic physics in ancient Greek thought, or medicine in the Middle Ages. There just isn’t good new data. I’m not sure there is a cure in our immediate future.

  9. Klavs Hansen says:

    It is somewhat disturbing to see physics identified with the relatively narrow field of high energy physics theory. There are a number of other disciplines that continuously produce vast amounts of results, and to identify a crisis in physics as a whole, the symptoms would have to show up in at least a couple more of these fields. I don’t think they do. Although, of course, a lot of the low-hanging fruit of previous centuries has been picked, and everybody finds it more difficult to discover fundamental laws of nature.

  10. neil says:

    Bee describes the results but not the reasons. The problem is extreme in HEP because the cost of discovering new experimental results has risen exponentially relative to the cost of producing new theoretical results. In other sciences, and even other fields of physics, this has not happened to the same extent. Theory is relatively cheap (paper, pencils and, hopefully, wastebaskets), so we get a lot of it, often unrelated to our ability to confirm or refute it.

  11. Wyman says:

    I strongly agree with Klavs Hansen here. Theoretical high-energy/particle/fundamental physics only represents a small fraction of theoretical physics as a whole. I would say that in condensed matter, there are far more open problems than there are theorists. In the past couple of decades we’ve seen the emergence of both highly computational condensed matter physics as well as a new focus on topological properties that has by now reached every corner of study. It’s a genuinely exciting time to be a theorist working on condensed matter systems — we have so many questions and so many new tools!

  12. Anonymous says:

    Producing papers is part of a physicist’s job. It is what they get paid for. Whether or not their theory can be tested is not important.

  13. Bee says:

    Thanks for the mention 🙂

  14. Bee says:

    Btw, the title of the Nature Physics comment was chosen by the editor.

    @Wyman

    Indeed, I have even collected some numbers here

    http://backreaction.blogspot.com/2013/05/what-do-most-physicists-work-on.html

    As I point out in my article, the overproduction problem is specific to the foundations of physics. But it’s a symptom of a much wider problem, which is that scientists are blind to the problems in the academic system.

    @neil

    I mentioned this in my article, and I also explain why this argument is as tiresome as it is wrong.

  15. Bunsen Burner says:

    How in the world did the problems of physicists representing less than 1% of all physics become a crisis for the whole discipline? I see no crisis in quantum computation, condensed matter, stellar interiors, computational fluid dynamics, high-precision astronomy, and thousands of other topics, all more worthwhile than quantum gravity.

    If the last 20 years represent a crisis in physics, then my answer is: Please God, let the crisis continue.

  16. Peter Morgan says:

    The foundational problems that Physics has are not limited to HEP. One of the consequences of the lack of foundational clarity in QM and of mathematical clarity in QFT is that condensed matter and other users of QM/QFT have become rather a hodgepodge of approaches. There are some patterns, which people have become used to thinking are good enough, but things could be better organized. One can read Bee’s lament as about HEP becoming more hodgepodge-like, like condensed matter, etc.

  17. M says:

    Wyman and Burner, can you please give examples of the most interesting discoveries recently made in condensed matter and quantum computation? It seems to me that these fields have their own problems: the only future of condensed matter seems to be doing more complicated versions of what was understood decades ago. And quantum computation looks like a bubble, like string theory 20 years ago.

  18. Low Math, Meekly Interacting says:

    To be fair, the life sciences have revealed a kind of evil twin of this situation: when you are literally awash in data, when doing experiments is relatively easy because of the nearly endless array of possibilities, the result can be a vacuum almost as profound. I say “almost” because, while someone can always produce an unimportant piece of crap and boost their citation count without penalty, most truly important crap invites rapid and widespread experimental challenge. Stimulus-triggered pluripotency is a good case study of how that situation plays out. Barring those corrective reflexes, the field would be a cesspool of trivial fraud. There is simply no substitute for the ability to subject scientific hypotheses to reproducible observational challenge. Never was, never will be, ever. Humanity has proven this time and again, and the intelligence and integrity of individuals is clearly no defense against societal pressures. To the extent we surrender to the supremacy of objective reality, we make progress. To the extent we’re disconnected from it, we inevitably lose our way.

  19. Bunsen Burner says:

    M. Seriously? Condensed matter is stagnant and quantum computation is a bubble? Perhaps rather than pollute Peter’s blog with the back and forth of examples that I’m sure will never convince you, why not try asking your question on a blog by an expert in one of these fields – such as Scott Aaronson.

  20. Nick M. says:

    Peter Woit wrote:

    April 7, 2017 at 5:54 pm

    “… This posting is about the situation in physics, but an interesting question I’ve often raised is whether physics, as it finds itself starved of the new experimental results that in the past kept it moving forward, might learn something from mathematics about how to cope. Unfortunately, this is one topic that Hossenfelder doesn’t address.”

    Hi Peter! I believe that Sabine Hossenfelder may have touched upon this very topic in a blog article (dated December 8, 2016) from her “Backreaction” website titled

    No, physicists have no fear of math. But they should have more respect.

    Although the front portion of this article deals with an entirely different issue, the “But they [physicists] should have more respect [for math]” portion addresses the topic that you expressed concern about.

    If I may quote the last few relevant paragraphs from Sabine’s blog entry:

    So, I don’t think physicists are afraid of math. Indeed, it sometimes worries me how much and how uncritically they love math.

    Math can do a lot of things for you, but in the end it’s merely a device to derive consequences from assumptions. Physics isn’t math, however, and physics papers don’t work by theorems and proofs. Theoretical physicists pride themselves on their intuition and frequently take the freedom to shortcut mathematical proofs by drawing on experience. This, however, amounts to making additional assumptions, for example that a certain relation holds or an expansion is well-defined.

    That works well as long as these assumptions are used to arrive at testable predictions. In that case it matters only if the theory works, and the mathematical rigor can well be left to mathematical physicists for clean-up, which is how things went historically.

    But today in the foundations of physics, theory-development proceeds largely without experimental feedback. In such cases, keeping track of assumptions is crucial – otherwise it becomes impossible to tell what really follows from what. Or, I should say, it would be crucial because theoretical physicists are bad at this.

    The result is that some research areas can amass loosely connected arguments that follow from a set of assumptions that aren’t written down anywhere. This might result in an entirely self-consistent construction and yet not have anything to do with reality. If the underlying assumptions aren’t written down anywhere, the result is conceptual mud in which case we can’t tell philosophy from mathematics.

    One such unwritten assumption that is widely used, for example, is the absence of finetuning or that a physical theory be “natural.” This assumption isn’t supported by evidence and it can’t be mathematically derived. Hence, it should be treated as a hypothesis – but that isn’t happening because the assumption itself isn’t recognized for what it is.

    Another unwritten assumption is that more fundamental theories should somehow be simpler. This is reflected for example in the belief that the gauge couplings of the standard model should meet in one point. That’s an assumption; it isn’t supported by evidence. And yet it’s not treated as a hypothesis but as a guide to theory-development.

    And all presently existing research on the quantization of gravity rests on the assumption that quantum theory itself remains unmodified at short distance scales. This is another assumption that isn’t written down anywhere. Should that turn out to be not true, decades of research will have been useless.

    In lack of experimental guidance, what we need in the foundations of physics is conceptual clarity. We need rigorous math, not claims to experience, intuition, and aesthetic appeal. Don’t be afraid, but we need more math.

    As usual, very nicely said Sabine!

  21. Forbes just published this article:
    https://www.forbes.com/sites/chadorzel/2017/04/06/why-are-there-too-many-papers-in-theoretical-physics/#68ee330d37ee
    Gist:
    1. There are too many theoretical particle physicists fearful of publishing anything that might jeopardize their careers;
    2. There is a dearth of high sigma experimental results that could lead to “new” physics;
    3. Therefore, anytime there is a glitch in some experimental result, there is a stampede of publications attempting to explain it;
    4. When the glitch is fixed (shown to be statistically insignificant), no one retracts or feels embarrassed by their publication(s), for everyone does it. It is socially acceptable.

    I could go on and on, but I already have:

    https://www.amazon.com/Fire-Night-Functioning-Maverick-Theoretical/dp/154305644X

    Pay particular attention to everything after the chapter “Imagine you’re a dolphin”.

  22. Another Anon says:

    Wyman, Bunsen Burner, and all the rest of you condensed matter physicists, and fluid dynamicists, and all of you getting annoyed because you don’t see a problem – seriously, this isn’t about you. It’s a problem with fundamental physics.

    It really bugs me when people say there’s no problem in physics because they’ve just invented a new transistor. We are talking *fundamental physics*. Please!

  23. AnonNth says:

    Another Anon, the problem isn’t that condensed matter physicists do not see a problem with fundamental physics, but rather that high energy physicists are becoming obsolete because of the great success that condensed matter physicists are having solving fundamental AND high energy physics problems with the far more experimentally accessible mathematical and theoretical machinery of condensed matter physics.

    String theorists in particular are having a hard time admitting that they are just plain wrong, and seem unable to cope with condensed matter theory stealing their glory.

  24. Peter Woit says:

    All,
    Will delete any more arguments between fields of physics. Yes, the problem at issue is one of certain specific fields, not all of physics, and not all of science.

    Interestingly, no one seems to deny that there is a problem…

  25. cthulhu says:

    G.S. asks about other fields including engineering. I’m an engineer specializing in aircraft flight control, an area which has been one of the two primary drivers of control theory over the last 50 years (the other being industrial process control). As we have pushed our aircraft designs to deliver more capabilities in numerous areas, we have demanded much more of the flight control systems, and there has developed a huge interest in design tools that give us guarantees about stability and robustness to uncertainties.

    I won’t go into details but will state without offering proof that these techniques have turned out to be difficult to apply in the real world, mask much of our physical understanding of what the so-called bare airframe (the aircraft without any active controls) is doing, and have over the years led to several expensive redesigns on multiple programs, affecting both investors’ and taxpayers’ pocketbooks in multiple countries; the redesigns have usually been done primarily with techniques that, while perhaps less mathematically elegant, have the distinct virtue that they have been shown to work on countless prior programs. The “modern” techniques have found a role in the analysis side of things, helping us understand the limits of our designs, and this is a valuable thing to have. Mistakes in the flight control design are probably the easiest way to cause a catastrophic accident short of pilot error.

    So, my answer to the question is that yes, at least some branches of engineering have been afflicted with paths that over-promise and under-deliver, but at least for aircraft flight control we still have the ultimate Diogenes to keep us honest: does it actually fly or not?

  26. Ted says:

    Perhaps the best way to change the sociology/mechanics of academia is to pave a new route for the people entering the field. As a student who has been searching for a PhD research direction with promise in fundamental physics, it’s a depressing situation. The obscene amount of research, coupled with no significant results for decades, is clearly the symptom of a system that needs to start supporting academics (especially new ones) to explore new directions – and to question why our current ones aren’t working. I strongly agree with what was said in the quote in the comment above – “In lack of experimental guidance, what we need in the foundations of physics is conceptual clarity”.

  27. Thomas says:

    Peter, Bee,

    The lack of new and interesting data and the overproduction of speculations, both in high-energy physics and quantum gravity, could be due to a simple reason:

    Maybe there is nothing to be discovered any more in these two fields?

    We all know that such a statement has been made repeatedly in the past, and it has always been wrong. But could it be correct this time? This is the touchy issue swept under the rug in all these discussions. This is the touchy issue avoided like the plague by everybody. This is the touchy issue that led to string theory, loop quantum gravity and other speculations not related to experiment but with a large number of followers.

    We need an objective debate of this touchy fundamental issue, not a discussion of its side effects. Why is there no such debate? Just look at the answers to such a proposal. You will see ridicule, contempt or aggression – but no objective debate.

    The world is full of smart physicists who have been trained and imbued with the conviction that there is still a lot to discover. Maybe this conviction is wrong?

  28. GoletaBeach says:

    Burt Richter’s closing talk at the 1999 Lepton Photon conference discussed these issues before… https://arxiv.org/abs/hep-ex/0001012 . His talk itself used to be online… now 404. In person he was substantially more caustic toward the theory community.

    “There are more of us, we collaborate more, and we share more information than ever before.”

    I’m not at all sure that there are more experimentalists since the 1980s… hard to tell because so many leave after PhD and postdoc… lately for high-paying data science jobs, in the late 1990s for all sorts of telecom jobs, and in a constant move to Wall Street.

    Einstein did synthesize GR from only foundational notions… current theorists keep trying to pull a similar rabbit out of a hat. Somehow I don’t find it surprising that all the social techniques familiar for millennia for defining the “in crowd”, “the outsiders”, Cassandras, rebels, sycophants, etc. are in play in theoretical particle physics. It is a social activity.

    With luck, though, out of all the dung a really delicious mushroom might sprout.

  29. Peter Woit says:

    All,

    I hope people will actually address the issues discussed by Sabine Hossenfelder, since I think they’re interesting, and a serious discussion of what might be done to address them is important and desperately needed.

  30. Jeffrey says:

    Six hundred papers interpreting what turned out to be a statistical fluke? From what I gather it is irrelevant. They are getting direct deposits in their bank accounts anyways, why change? Who cares if it was a fluke? Seriously, they could have known it was a fluke anyways, and said, “hey, we know this result is nonsense, let’s publish something anyways because we need to get paid.”

    It would be win-win. They write up some interpretations, they get paid, the results come out as being a fluke, then they can blame the fluke on the machine. Easy. As a matter of fact it would be best for the machine to make mistakes and to have issues, and be so big that parts randomly break at intervals. Job security forever!

    Sabine does not mention money in the article. Sabine can sit here and talk all day, and write up all the interesting social bubble stuff, but unless the cash stops flowing, nothing is going to change.

  31. Tom Andersen says:

    Here is a situation happening today that is new: instant worldwide intercommunication. It’s of course great to have, but in theoretical physics it is also causing groupthink. No longer can a bright team work without interruption on a maverick idea for a year or three. It seems instead that papers are produced based on perceived consensus.

    Without the magic of the internet and $500 long distance travel, the world of physics was different and more open to strange ideas a century ago.

  32. neil says:

    Why all the hand-wringing about the theorizing on the diphoton anomaly? Yes, it was low sigma but it was observed by both ATLAS and CMS. Given the importance of priority, which science has always respected, who wants to wait around for more data before trying to explain it? I don’t see what harm was done.

  33. GoletaBeach says:

    Well, Peter, the amplification of social feedback in the theory community has widened the gulf between theory and experiment. Experimentalists just roll their eyes, and when among themselves they are so derogatory that they make you look like a string-theory booster.

    Perhaps every particle theory PhD should include a mandatory 1 year doing hardware shifts at the LHC, or on any big experimental effort.

  34. Scott says:

    Hi Peter,

    As someone who’s often trying to get the straight dope about the topics she writes about, I appreciate Sabine’s writing immensely—her bluntness, humor, and obvious relish puncturing hype strike a chord with me as they apparently do with you.

    But regarding the information paradox, I think there’s an actual technical point at issue, which creates a fork in the road before one enters into any questions of faddishness, sociology, etc. Namely: should the black hole entropy, which we presumably agree is governed by Bekenstein’s bound, count the log of the number of orthogonal microstates? I think yes—I don’t understand what entropy even MEANS if it’s not counting microstates—while Sabine apparently thinks no. But crucially, if the answer is yes, then what Sabine advocates as the “obvious, inexplicably ignored” solution to the paradox—i.e., that all the information stays in the black hole, even as the latter gets microscopically tiny—is indeed ruled out. So then, unless we want to give up on unitary QM, or have the information escape through the singularity into a baby universe, as far as I can see we’re indeed forced to consider the proposal that the infalling information is holographically encoded on or near the horizon, from where it can enter the radiation. So in that case, we don’t need to offer a sociological explanation for why people remain so interested in the information paradox 40+ years after Hawking raised it: a scientific explanation suffices!

  35. Another Anon says:

    Peter will almost certainly delete my comment for being off-topic, but I agree with Scott in saying that – if the information does not escape as the black hole evaporates – how on earth can all the information that has ever fallen into a black hole be encoded into a progressively smaller, effectively infinitely small, region of space? Bee thinks the problem is solved if the singularity is removed. Surely not.

  36. spacetime says:

    A bit strange to read

    “But we have a crisis of an entirely different sort: we produce a huge amount of new theories and yet none of them is ever empirically confirmed. Let’s call it the overproduction crisis.”

    and

    “I’ll even admit having contributed to the paper pile because that’s how academia works. I too have to make a living somehow.”

  37. Anonyrat says:

    I think the “over-promise, under-deliver” problem is a bit different from the one Bee is talking about. In the information technology world, this “over-promise, under-deliver” is represented, e.g., by the Gartner Hype Cycle, the 2016 version of which can be viewed here ( http://www.gartner.com/newsroom/id/3412017 ) (I am not affiliated with Gartner.)

    Per Gartner’s analysts, there is a natural progress curve, on top of which is added inflated expectations due to hype. Hype is driven by marketing, confirmation bias, novelty preference, social contagion, competitive pressure, irrational exuberance and (positively) overcoming inertia and imagination. The dangers for corporations of not understanding the hype cycle are that they adopt technology too early, give up too soon on a technology, adopt too late, or hang on too long to old technology. Understanding the hype cycle also opens up opportunities, which Gartner lists in some of their publications and which I won’t repeat here.

    The problem Bee describes has some factors in common with the technology hype cycle, such as social contagion and competitive pressure (though it is academic competition rather than commercial competition). Novelty preference may also play a role in the fad-driven research cycle of fundamental physics.

  38. Peter Woit says:

    Scott,
    I see Sabine has responded to this on her blog, pointing to
    http://backreaction.blogspot.de/2012/07/bekenstein-hawking-entropy-strong-and.html

    First I should make clear that I haven’t thought much about this, partly because of a personal allergy to spending a lot of time on a topic that many smart people have been active in for a long time, producing lots of papers but no obvious progress. One aspect of this is that as this kind of thing goes on for decades, the barrier to entry for someone like me, who likes to first try and understand what others have done, becomes higher and higher.

    As a graduate student nearly 40 years ago I spent a fair amount of time learning about Hawking radiation and was aware of the paradox, but it seemed that the framework of what was understood was not rigid enough to address the issue (in part because it was silent about “what is going on at the singularity?”). Over the years I’ve made periodic attempts to read up on the latest on this, but my impression has always been that this situation hasn’t really changed. Yes, you can make various assumptions about what entropy is in this situation, about the Bekenstein bound, or about what “holography” is and what it implies here, and you can derive various consequences from various assumptions. But no compelling picture seems to emerge from this activity. Given this, there’s a real sociological question about why this is such a popular thing to be doing. I guess the best argument for it is not that anyone is going to resolve the issue, but that thinking about this problem may lead people to something more fruitful.

    Another Anon,

    I just don’t see how arguments like “how on earth can all the information that has ever fallen into a black hole be encoded into a progressively smaller, effectively infinitely small, region of space?” can resolve anything. At the crudest level, the singularity is a place that we don’t understand, where everything has gone. Why shouldn’t the answer to the paradox be there rather than at the event horizon? But my question is just as rhetorical and unscientific as yours. You need a well-defined theory to answer such questions, and that doesn’t seem to exist.

  39. Lee Smolin says:

    Dear Scott and Peter,

    Sabine and I discussed in detail the issue of the meaning of the BH entropy in our arXiv:0901.3156. We argue there that the original and standard argument from Bekenstein only supports a weak form of the BH entropy, which is that the surface area is a measure of the capacity of the black hole horizon as a channel for the transmission of information. The strong form that you name, that the BH entropy is the log of the number of microstates of the interior of the horizon, is an unjustified extrapolation.

    Regarding the size of the interior of the horizon, it has been understood since the early days of the subject that the volume of the interior of the horizon can be very large compared to the area; this is the so-called “bag of gold.” We discuss in detail other objections and issues such as those having to do with remnants. In addition, note that there are worked-out examples where the singularity is removed, leading to a resolution of the paradox: Ashtekar et al.’s study of CGHS black holes (arXiv:0801.1811), and Rovelli et al.’s Planck stars.

    The weak and strong forms of the BH entropy then imply weak and strong forms of the holographic principle, as discussed in hep-th/0003056.

    Let me put the issue simply: The “conservative solution” that quantum gravity effects eliminate the singularity does indeed resolve the information paradox, as well as related puzzles such as the firewall paradox. Further, this requires no modification of standard physics at the horizon, no modification of the principle of unitarity, and no radical non-locality, but only modifies physics at singularities, where we already know we need new physics. Given this, I would think we need an extraordinary motivation to ignore this conservative and natural solution in favour of speculations that impose radical new physics on the horizon, where the curvature may be very weak.

    Thanks,

    Lee

  40. TVR says:

    If you assume that a singularity forms in gravitational collapse, then you have to trace out the modes that presumably end up in the singularity and keep only the outgoing modes. That is the origin of the information loss paradox. You remove part of the original state, and then obviously something is missing, i.e. the density matrix does not describe a pure state.
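
    As a minimal sketch of the textbook mechanism (not tied to any particular proposal): if the full state of interior and exterior modes is pure, $|\psi\rangle = \sum_{ij} c_{ij}\,|\mathrm{in}_i\rangle \otimes |\mathrm{out}_j\rangle$, then tracing out the interior modes leaves $\rho_{\mathrm{out}} = \mathrm{Tr}_{\mathrm{in}}\,|\psi\rangle\langle\psi|$, which is generically mixed ($\mathrm{Tr}\,\rho_{\mathrm{out}}^2 < 1$) even though the global evolution is unitary.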

    If there is no singularity, then you do not trace out any modes. Then this paper, Phys. Rev. Lett. 114 (2015) no. 11, 111301, practically answers the question. Most of the information is not in the Hawking quanta, but actually in subtle correlations between them. These correlations are described by the off-diagonal elements of the density matrix. They start off very small at the beginning, but grow as the Hawking evaporation progresses. So “information” is not released at the very end, where you would need Planckian physics. These correlations are able to restore unitarity. The whole process is unitary all the time, but an observer collecting only the Hawking quanta might have the impression that the information is lost.

    Note that the singularity and the event horizon are closely related notions. If there is no singularity, the horizon cannot be a global event horizon. If the singularity is replaced by a normal region of strong gravitational fields, then once enough mass is lost to evaporation, anything that was trapped inside must be released. An apparent horizon might exist for a while, but it will disappear at the end. No remnants with huge amounts of “information” in them are needed.

  41. neil says:

    Does anyone know whether the Event Horizon Telescope will be able to exclude the “information released early” (firewall) possibility?

  42. reader says:

    Lee,

    This proposal has features that make it seem not so conservative after all. As is well known, if the thermality of the radiation only breaks down once the black hole becomes of Planckian mass, then the remaining object must decay over a very long timescale for its radiation to purify the full state. Indeed this timescale can be made arbitrarily long by increasing the mass of the original black hole. So this scenario requires very long-lived remnants. More seriously perhaps, it seems impossible to make this work for a black hole formed in AdS. Putting aside anything to do with AdS/CFT, AdS acts like a finite-sized box, so a state of radiation of total energy M_Pl has a small maximum entropy. In this case there is no way for a Planckian remnant to decay into a gas of (arbitrarily) high entropy quanta whose state would purify that of the Hawking radiation emitted earlier. So while it is logically possible for black holes to release their information in Hawking radiation in asymptotically flat space (while only modifying Planck scale physics), this seems impossible in AdS. I find that highly problematic.

  43. Giotis says:

    “When I have thought about it, it seemed to me that there was no way to make the problem well-defined as long as one lacks a good theory of quantized space-time degrees of freedom that would tell one what was going on at the singularity and at the end-point of black hole evaporation.”

    Actually, it’s the other way around.

    Your attitude with respect to the information loss paradox suggests the kind of theorist you are.

    If you still believe that information is lost, or in remnants, then almost certainly you are a GRist; if on the other hand you believe that the information escapes via the Hawking radiation and in complementarity, then you are almost certainly a particle/string theorist.

    If you believe in Mathur’s fuzzballs then you are Mathur; LOL, that was a joke. Actually fuzzballs have many supporters, far, far more than those believing in remnants, who are indeed a small minority nowadays.

  44. Peter Woit says:

    Giotis,
    My attitude with respect to the information loss paradox is that I don’t know the answer. Don’t know what kind of physicist that makes me; maybe it makes me a mathematician…

    Your schema breaking up physicists into different tribes depending on their ideological beliefs about black holes is a scary vision of the post-factual future (or present?).

  45. Zack Yezek says:

    It appears that this malaise is really limited to ‘fundamental’ or ‘high energy’ physics, especially ‘grand unified theories’ or ‘quantum gravity’. And there actually IS healthy progress there, if you consider things like neutrino physics to be a part of it.

    After all, the null results of the past 40 years ARE significant. Arguably the non-detections of WIMPs and SUSY are our era’s version of the Michelson-Morley experiment. The problem isn’t with theorists who knock off a new model that can explain tentative new experimental results, like the CERN 750 GeV one. No, the problem is that much of this “theorizing” doesn’t produce useful theories. Or at least it hasn’t after 40 years of research, if you define a real theory as one that makes quantitative, falsifiable predictions. SUSY, String Theory, etc. don’t do that. They come in so many distinct flavors, with so many free parameters, that they basically make no hard predictions and can evade any negative experimental result.

    The real solution is to draw a hard line between mathematical frameworks, hypotheses, speculations, and genuine theories. This will let people work on speculative ideas or big frameworks like SUSY, but also make it clear that there is a qualitative difference between those and genuine theories of physics like General Relativity or QCD that make quantitative, testable predictions. The expectation then should be that an idea must meet certain “meta” criteria to graduate from speculation to hypothesis, and then from hypothesis to theory. This system would also make it clear that comparing things like SUSY and GR is a category error, and allow them to be judged on different criteria. That way you can really enforce a necessary, fair rule like “If your hypothesis hasn’t evolved into a testable theory within 40 years, we’re no longer giving it research priority”.

  46. Pingback: Sunlight bursts the social bubble of physics and math | RNA-Mediated

  47. Giotis says:

    Right or wrong, the dichotomy is there; I didn’t invent it.

    Check around min 1:23:10 of this video and you will understand it within the context of information loss paradox resolution proposals and remnants in particular.

    https://youtu.be/3EOpHHjv5g8?t=4991

  48. twistorial says:

    The notion that HEP should only be about finding *the* new model is an oversimplification, just as is the notion that the standard model is completely understood. There are several healthy communities working on understanding just standard model physics. This may even be the best bet of finding new physics, although it almost certainly would not be of the romantic smoking gun variety.

    I agree that those areas of HEP that are in danger of forming tadpoles or vacuum bubbles in theory space should take a good hard look at themselves though. Proposed yardstick: articles starting with “recent progress…”.

  49. Bernhard says:

    “The only response I see are attempts to blame others: funding agencies, ..”

    Well, mea culpa, as I’m one of those blaming “others” then. Funding agencies still determine everything. They determine which ideas are to be pursued or not, and it is not as simple as saying that the reviewers are just scientists/ourselves. These agencies are more than the simple sum of their reviewers – there are clear guidelines, written and not uncommonly unwritten, that determine which persons and groups should get the funding, which ideas should be rewarded and which should be punished. Sorry to be simplistic here, but the issue 99% of the time boils down to money. The rewarded idea pre-Higgs was the Higgs and SUSY; now things are clearly shifting to dark matter. What if you’re not interested in dark matter, the Higgs or anything mainstream but still want to do conservative, sober physics? Sorry, but unless your CV is stellar, and sometimes even so, you’re not getting the money, and the following year you will obey. We all have mortgages to pay. The social problem with physics is that we cannot get rid of this model. I believe many of us would try crazier, harder things that would take longer to publish, if risk-taking were rewarded. As for the claim that “they don’t want to waste money on useless research” – define waste. Waste for them is something that attracts no attention; if something does attract attention, then no matter how big the hype, it was all worth it. And what power do we actually have here (the cute idea that “they rely on us, the scientists, to tell them how science works”)? We have zero power. This is a closed loop where people who were rewarded by the system end up being its future judges, doing all they can to keep the status quo. This kind of thing happens not only to theorists; even experimentalists will be accused of having a “too narrow” program if what they’re doing is not somehow connected to the Higgs, SUSY or dark matter. Funding agencies taking a big chunk of the blame is the conservative answer to this problem that she’s failing to see.

  50. Peter Woit says:

    Bernhard,

    It’s true that changes in how decisions about funding are made could have large effects. I find it remarkable that, at least in the US, I’m unaware of any significant discussion at DOE or NSF about whether the problems of theoretical HEP could be addressed by changing in some way how the system of grants works.

    I also haven’t heard any news since December about any response to the “DOE Theory Letter” from a large fraction of US HEP theorists asking for an explanation of why grants to theorists have been cut. Unfortunately it may be that, absent any discussion of how to improve the situation, someone at the DOE has decided the thing to do is keep the system the same, just slowly defund it.
