One can calculate the intensity of a light source, as seen by some particular observer, using this form of the inverse square law:

                    (Luminosity of source)
    Intensity  =  ------------------------
                                       2
                     4 * pi * distance
For example, a 100-Watt light bulb, when seen from a distance of 10 meters, would have an intensity of

                         100 W
    Intensity  =  ------------------  =  0.08 W/m^2
                                   2
                   4 * pi * (10 m)
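This calculation can be sketched as a short Python function (the name `intensity` is my own choice for illustration):

```python
import math

def intensity(luminosity_watts, distance_m):
    """Inverse square law: the source's power spread over a sphere
    of radius distance_m, giving watts per square meter."""
    return luminosity_watts / (4.0 * math.pi * distance_m ** 2)

# 100-Watt bulb seen from 10 meters:
print(round(intensity(100.0, 10.0), 2))   # 0.08 W/m^2
```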
Two stars of very different luminosity can appear to us to have the same intensity if their distances are just right:
     (Luminosity of A)       (Luminosity of B)
    -------------------  =  -------------------
                     2                       2
     4 * pi * (dist A)       4 * pi * (dist B)
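Solving that equality for the distance of star B gives dist B = dist A * sqrt(Luminosity B / Luminosity A). A minimal sketch (the function name is hypothetical):

```python
import math

def matching_distance(lum_a, dist_a, lum_b):
    """Distance at which star B has the same intensity as
    star A seen from dist_a (any consistent units)."""
    return dist_a * math.sqrt(lum_b / lum_a)

# A star 100 times more luminous appears equally bright
# from 10 times the distance:
print(matching_distance(1.0, 10.0, 100.0))   # 100.0
```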
Magnitudes are somewhat strange for two reasons: first, they run backwards -- bright stars have small magnitudes. Second, they are defined logarithmically, not linearly; that means that a star of magnitude 2 is not twice as faint as a star of magnitude 1. Instead, their relative brightnesses are given by a formula like so:
     Intensity(A)         -0.4 * (magA - magB)
    --------------  =  10
     Intensity(B)
So, for example, if the star 61 Cygni is magnitude 6, then it is fainter than the standard star Vega by
     Intensity(61 Cyg)         -0.4 * (6 - 0)
    -------------------  =  10                 =  0.0040
     Intensity(Vega)
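The magnitude-to-intensity-ratio formula can be checked with a few lines of Python (the function name is my own):

```python
def intensity_ratio(mag_a, mag_b):
    """Ratio Intensity(A)/Intensity(B) given two magnitudes.
    Remember: larger magnitude means fainter star."""
    return 10.0 ** (-0.4 * (mag_a - mag_b))

# 61 Cygni (mag 6) compared to Vega (mag 0):
print(round(intensity_ratio(6.0, 0.0), 4))   # 0.004
```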
Or, in other words, the star 61 Cygni has 0.0040 times the brightness of Vega.
To convert the other way, from a given ratio of intensities into a difference of magnitudes, one must calculate
                                  ( Intensity(A) )
    m(A) - m(B)  =  -2.5 * log   ( ------------- )
                                  ( Intensity(B) )

where the "log" means the logarithm base 10 function. Note the negative sign!
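The conversion in this direction can be sketched like so (again, the function name is an illustrative choice):

```python
import math

def magnitude_difference(intensity_a, intensity_b):
    """m(A) - m(B) from a ratio of intensities.
    The minus sign makes fainter stars come out with larger magnitudes."""
    return -2.5 * math.log10(intensity_a / intensity_b)

# A star 100 times fainter than the reference is 5 magnitudes larger:
print(magnitude_difference(1.0, 100.0))   # 5.0
```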
The regular magnitudes we use, apparent magnitudes, are of course due in part to a star's luminosity, but also to its distance. If we could place all stars at the same common distance (10 parsecs), then we could remove the dependence on distance. All that would remain would be the actual luminosity -- the power -- of the stars.
To convert from apparent magnitude to absolute magnitude, you need to know the distance to a star in parsecs. Then use this formula:
    (abs mag M)  =  (app mag m)  +  5  -  5 * (log d)

where d is the distance to the star, in parsecs.
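As a quick sketch in Python (the function name is hypothetical):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """M = m + 5 - 5 * log10(d), with d in parsecs."""
    return apparent_mag + 5.0 - 5.0 * math.log10(distance_pc)

# A star of apparent magnitude 0 at 100 parsecs:
print(absolute_magnitude(0.0, 100.0))   # -5.0
```

Note that a star exactly 10 parsecs away has equal apparent and absolute magnitudes, since log 10 = 1.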
Or, to go backwards: given the apparent and absolute magnitudes, you can calculate the distance in parsecs like so:
           (m - M + 5)/5
    d  = 10
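This inverse formula can be sketched the same way (function name again my own):

```python
def distance_parsecs(apparent_mag, absolute_mag):
    """d = 10 ** ((m - M + 5) / 5), giving the distance in parsecs."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# A star with m = 0 and M = -5 lies at 100 parsecs:
print(distance_parsecs(0.0, -5.0))   # 100.0
```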
Astronomers use absolute magnitudes as one way to describe the power output of a star. Now, there is a way to convert back and forth from absolute magnitude to luminosity, so you might wonder "Why don't astronomers just stick with luminosity, and get rid of absolute magnitudes?" That's a good question. As you will see later, since astronomers make measurements in (apparent) magnitudes, it's sometimes very simple to convert to absolute magnitudes, much simpler than it would be to calculate luminosity; and this convenience is the reason astronomers sometimes use absolute magnitudes.
Copyright © Michael Richmond. This work is licensed under a Creative Commons License.