LogScales

Stars differ widely in brightness. Some of that is due to variation in distance: light spreads out, and geometry makes intensity fall off as the inverse square of distance, so that a source ten times farther away appears only one hundredth as bright. Stars also have different intrinsic luminosities: supergiants shine far more intensely than dwarfs (and have correspondingly shorter lifetimes).
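
As a quick illustration of that arithmetic, here is a tiny Python sketch (the name relative_brightness is just illustrative, not a standard term):

    # Inverse-square law: apparent brightness falls as 1 / distance^2
    def relative_brightness(distance_factor):
        """Relative brightness of a source moved distance_factor times farther away."""
        return 1.0 / distance_factor ** 2

    print(relative_brightness(10))   # 0.01 -- ten times farther, one hundredth as bright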

To measure and compare light levels, astronomers define a magnitude scale. It's both simple and useful, once one learns a few landmarks. Every five magnitudes is a factor of 100 in brightness. The most prominent stars in the Earth's sky are about magnitude 0. The dimmest stars visible to the naked eye, about magnitude +5 or +6, are one percent as bright as those; the brightest planets, Mars, Jupiter, and Venus, are roughly -2 to -4. A full Moon is about -13 and the Sun itself is -27. The faintest stars that can be detected via long exposures on large telescopes are almost as far in the opposite direction, +25 magnitude or so.
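
Those landmarks all follow from the one rule above: a difference of delta_m magnitudes corresponds to a brightness ratio of 100 raised to the power delta_m/5. A small Python sketch (brightness_ratio is my own illustrative name):

    # Every 5 magnitudes is a factor of 100 in brightness
    def brightness_ratio(delta_m):
        """Brightness ratio corresponding to a magnitude difference delta_m."""
        return 100 ** (delta_m / 5.0)

    print(brightness_ratio(5))    # 100.0 -- a 0-magnitude star vs. the +5 naked-eye limit
    print(brightness_ratio(14))   # ~400,000 -- the Sun (-27) vs. the full Moon (-13)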

Those are apparent magnitudes: how bright things seem. Put the stars at a standard distance instead (chosen to be 10 parsecs, about 33 light-years, for historical reasons) and you get absolute magnitudes. On that basis the Sun drops to about +5, and several ordinary-looking first-magnitude stars like Betelgeuse and Rigel jump to -5 or brighter. (They are many hundreds of light-years away.)
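
The conversion between the two is the standard distance modulus, M = m - 5*log10(d / 10 parsecs). A short Python sketch, using approximate figures for the Sun, recovers the +5 quoted above:

    import math

    # Absolute magnitude M from apparent magnitude m and distance in parsecs
    def absolute_magnitude(m, distance_pc):
        return m - 5 * math.log10(distance_pc / 10.0)

    # The Sun: apparent magnitude about -26.7, distance 1 AU = 1/206265 parsec
    print(absolute_magnitude(-26.7, 1 / 206265.0))   # about +4.9, i.e. roughly +5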

The stellar magnitude scale is logarithmic: each step doesn't add, it multiplies. One magnitude is a factor of about 2.512, the fifth root of 100; five steps multiply together, 2.512*2.512*2.512*2.512*2.512, which comes to the ratio of 100 mentioned earlier. Ratio systems are extraordinarily good for measuring things that vary over a huge range, like sound (decibels), earthquakes (the Richter scale), or skill at chess (Elo ratings).
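
A quick check of that multiplication in Python (just a sketch of the arithmetic):

    step = 100 ** (1 / 5.0)   # one magnitude step: the fifth root of 100
    print(step)               # about 2.512
    print(step ** 5)          # essentially 100 -- five steps recover the full factor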

Wednesday, February 23, 2000 at 20:06:27 (EST) = 2000-02-23

TopicScience


(correlates: CaveThought, StrobingTailLights, RainpostsAndGodrays, ...)