Eclipses and Decibels
I took a trip to Casper, Wyoming to look at the total solar eclipse. It was as awesome as people promised, but one thing about the experience stood out to me: it cooled off a lot sooner than it got darker. For almost the entire 1.5-hour period leading up to "totality," it was pretty bright outside, and then suddenly it wasn't. In contrast, the sun went from annoyingly hot, to kinda chilly, very gradually.
The sensation of heat on your skin is related linearly to the power (wattage) being delivered, in this case by the sun. We know that, in general, heating is roughly linear with power: if you use a 1000 watt heater instead of a 100 watt heater, the temperature rises 10x as fast.
The amount of power delivered by the sun should be proportional to the visible area, which changes pretty steadily over the course of the eclipse. I simulated the process of two circles (the moon about 3% larger) slowly overlapping. You can see the power drops fairly linearly until totality, at which point it stays at zero for a while, then rises again. (It's not quite a straight line because we're intersecting circles, not squares.) The red bit represents totality.
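The simulation above can be sketched with the standard circle-circle intersection formula; this is a minimal version under the same assumptions (idealized circular disks, moon 3% larger), with function names of my own invention:

```python
import math

def hidden_area(d, r_sun, r_moon):
    """Area of the sun's disk covered by the moon's disk,
    given the distance d between their centers."""
    if d >= r_sun + r_moon:
        return 0.0                   # disks don't touch
    if d <= r_moon - r_sun:
        return math.pi * r_sun ** 2  # sun fully covered (moon is larger)
    # standard formula for the lens-shaped intersection of two circles
    a = r_sun ** 2 * math.acos((d*d + r_sun**2 - r_moon**2) / (2 * d * r_sun))
    b = r_moon ** 2 * math.acos((d*d + r_moon**2 - r_sun**2) / (2 * d * r_moon))
    c = 0.5 * math.sqrt((-d + r_sun + r_moon) * (d + r_sun - r_moon)
                        * (d - r_sun + r_moon) * (d + r_sun + r_moon))
    return a + b - c

def visible_fraction(d, r_sun=1.0, r_moon=1.03):
    """Fraction of the sun's area (and hence its power) still visible."""
    return 1.0 - hidden_area(d, r_sun, r_moon) / (math.pi * r_sun ** 2)

# Sweep the moon across the sun: center distance shrinks toward zero.
for step in range(11):
    d = 2.2 * (1 - step / 10)
    print(f"d={d:.2f}  visible power={visible_fraction(d):.1%}")
```

Printing the sweep shows the near-linear decline (with the slight curvature that comes from intersecting circles rather than squares), and the flat zero once the center distance drops below the 3% radius difference.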
In contrast, visual (and incidentally, audio) perception is logarithmic (although the way our biology achieves this is pretty complicated). For logarithmic scales, we typically measure in decibels of attenuation rather than percentage, where 100% power = 0 dB, 10% power = -10 dB, 1% power = -20 dB, and so on. Here's the same plot in decibels:
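The conversion is just 10·log10 of the power fraction; a quick sketch:

```python
import math

def power_to_db(fraction):
    """Convert a power fraction (1.0 = full sun) to decibels of attenuation."""
    return 10 * math.log10(fraction)

for f in (1.0, 0.1, 0.01, 1e-9):
    print(f"{f:8.0e} power -> {power_to_db(f):6.1f} dB")
```

This reproduces the anchors in the text: 100% is 0 dB, 10% is -10 dB, 1% is -20 dB, and a billionth (10^-9) is -90 dB.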
Amazingly, it seems your eyes can reliably measure a range of at least 90 dB, or 1 billion times. In other words, you can detect incremental differences in brightness from 100% all the way down to 0.0000001%. (This is a lot like how wifi hardware can detect very small signals.)
When the sun is completely blocked by the moon (which appeared about 3% larger than the sun during this particular eclipse), the total power delivered becomes 0% for a while, which is -infinity dB, which looks a lot less bright than 10% or 1%. My numerical simulation can't do -infinity, but even the simulation clearly shows a very steep dropoff in perceived brightness. (Anyway, there is never really "zero" radiant energy to detect. In reality there's always some kind of background noise, maybe at -90 dB or so.)
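A numerical simulation usually dodges the -infinity by clamping to a noise floor before taking the log; here's one way, with the ~-90 dB floor assumed above (the constant and its value are my own illustration):

```python
import math

NOISE_FLOOR = 1e-9  # assumed background level, roughly -90 dB

def perceived_db(power_fraction):
    """dB relative to full sun, clamped so totality doesn't go to -infinity."""
    return 10 * math.log10(max(power_fraction, NOISE_FLOOR))

print(perceived_db(0.01))  # partial eclipse: around -20 dB
print(perceived_db(0.0))   # totality: clamped at the -90 dB floor
```

The clamp is why the plotted curve shows a very steep (but finite) dropoff at totality instead of diverging.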
All this is also why if you're not in the "path of totality" (where the sun really does get down to -60 dB or less, not just, say, 1%), the eclipse is boring. You feel a little chillier, but you might not even see the difference, because your eyes adjust to it (and you shouldn't stare at the sun). Even down to 1%, without eclipse glasses, the sun doesn't look much different.
All this means solar eclipses are a truly great source of superstition. Imagine: you don't know what an eclipse even is. You're working out in the field all afternoon, and even though there's not a cloud in the sky, and there's no wind, and the sun is shining brightly, you notice a sudden chill in the air. That's creepy enough as it is.
After an hour or so, it suddenly goes completely dark - for maybe a couple of minutes - and looks totally freaky, with the visible corona and whatnot. Then it comes back just as suddenly, and over the next hour or so, starts to warm up again.
At that point you might think you're crazy, but what about this: you experienced totality. But back in your village, maybe just a few km away (if you were on the edge of the totality region), they didn't. Maybe for them it got down to only 1% instead of 0%, so it didn't look like anything. They might have noticed a chill in the air, but that's easy to write off when it doesn't build up to something. Maybe only you were out of town that day and saw the crazy 0% totality effect, and nobody else did.
It would be pretty easy to decide you had some kind of spiritual experience.