# Glossary of terms relevant to stellar luminosity calculations

brightness
A measure of the amount of light which reaches some particular observer. It depends on several factors: the power of the light source, its distance from the observer, extinction by intervening material, etc. "Brightness" is a rather subjective term; when measured quantitatively, it is called intensity (see below).

intensity
A quantitative measure of the amount of energy striking some area in a given time. The units are Watts per square meter (W/m^2). For example, on a clear, sunny day, the intensity of sunlight reaching the ground is about 1000 W/m^2; that means that if you had a panel of solar cells, one meter by one meter, you could in theory generate 1000 Watts (but the best current solar cells are only about 20% efficient, so you'd only get about 200 W).

One can calculate the intensity of a light source as seen by some particular observer by using one particular form of the inverse square law:

                  (Luminosity of source)
    Intensity  =  ----------------------
                   4 * pi * distance^2

For example, a 100-Watt light bulb when seen from a distance of 10 meters would have an intensity of

                        100 W
    Intensity  =  -----------------  =  0.08  W/m^2
                  4 * pi * (10 m)^2

Two stars of very different luminosity can appear to us to have the same intensity if their distances are just right:

     (Luminosity of A)        (Luminosity of B)
    -------------------  =  -------------------
    4 * pi * (dist A)^2     4 * pi * (dist B)^2
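Rearranging that equality gives the distance at which the two intensities match: dist B = dist A * sqrt(Luminosity B / Luminosity A). A quick sketch (the factor-of-100 example is mine, not from the text):

```python
import math

def equal_intensity_distance(dist_A, lum_A, lum_B):
    """Distance at which star B appears as bright as star A at dist_A."""
    return dist_A * math.sqrt(lum_B / lum_A)

# A star 100 times more luminous looks equally bright from 10 times farther away:
print(equal_intensity_distance(1.0, 1.0, 100.0))   # 10.0
```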

luminosity
The power of a light source; the amount of energy it emits each second. In ordinary life, we measure luminosity in watts (W): a light bulb may emit 100 Watts. The Sun emits about 4 x 10^(26) Watts. In order to avoid writing big numbers, astronomers often describe the luminosity of stars in terms of "solar luminosities": a star with 2 solar luminosities emits about 8 x 10^(26) Watts, a star with 0.1 solar luminosities emits 0.4 x 10^(26) Watts, and so forth.
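Converting between solar luminosities and Watts is a single multiplication; a small sketch, using the rounded 4 x 10^26 W figure from the text:

```python
L_SUN_W = 4e26   # rounded solar luminosity from the text, in Watts

def solar_to_watts(l_solar):
    """Convert a luminosity in solar luminosities to Watts."""
    return l_solar * L_SUN_W

print(solar_to_watts(2))     # 8 x 10^26 W
print(solar_to_watts(0.1))   # 0.4 x 10^26 W
```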

(apparent) magnitude
The way astronomers describe the brightness of celestial sources. Magnitudes are defined in a relative sense: how much brighter or fainter is this star than that one? For our purposes, we can use the bright star Vega as the standard object. The magnitudes you see listed for stars in tables and on web sites describe "how bright is this star compared to Vega?"

Magnitudes are somewhat strange for two reasons: first, they run backwards -- bright stars have small magnitudes. Second, they are defined logarithmically, not linearly; that means that a star of magnitude 2 is not twice as faint as a star of magnitude 1. Instead, their relative brightnesses are given by a formula like so:

    Intensity(A)
    ------------  =  10^( -0.4 * (magA - magB) )
    Intensity(B)

So, for example, if the star 61 Cygni is magnitude 6, then it is fainter than the standard star Vega by

    Intensity(61 Cyg)
    -----------------  =  10^( -0.4 * (6 - 0) )  =  0.0040
    Intensity(Vega)

Or, in other words, the star 61 Cygni has 0.0040 times the brightness of Vega.
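The 61 Cygni example above can be checked in one line of Python:

```python
def intensity_ratio(mag_A, mag_B):
    """Ratio Intensity(A)/Intensity(B) from the two magnitudes."""
    return 10 ** (-0.4 * (mag_A - mag_B))

# 61 Cygni (magnitude 6) compared to Vega (magnitude 0):
print(round(intensity_ratio(6, 0), 4))   # 0.004
```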

To convert the other way, from a given ratio of intensities into a difference of magnitudes, one must calculate

                                  ( Intensity(A) )
    m(A) - m(B)  =  -2.5 * log    ( ------------ )
                                  ( Intensity(B) )

where the "log" means the logarithm base 10 function. Note the negative sign!
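This inverse conversion is a sketch-worthy one-liner as well; running the 61 Cygni numbers backwards recovers the magnitude difference of 6:

```python
import math

def magnitude_difference(int_A, int_B):
    """m(A) - m(B) from a ratio of intensities. Note the negative sign."""
    return -2.5 * math.log10(int_A / int_B)

# A star with 0.0040 times Vega's intensity is about 6 magnitudes fainter:
print(magnitude_difference(0.0040, 1.0))   # about 6.0
```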

absolute magnitude
The absolute magnitude of a star is not the magnitude we see in the real sky; instead, it is a hypothetical value: the magnitude we would see if the star were moved to a standard distance of 10 parsecs from the Earth.

The regular magnitudes we use, apparent magnitudes, are of course due in part to a star's luminosity, but also to its distance. If we could place all stars at the same common distance (10 parsecs), then we could remove the dependence on distance. All that would remain would be the actual luminosity -- the power -- of the stars.

To convert from apparent magnitude to absolute magnitude, you need to know the distance to a star in parsecs. Then use this formula:

    (abs mag M)  =  (app mag m)  +  5  -  5 * (log d)

where
• absolute magnitude is the capital M
• apparent magnitude is the lower-case m
• d is the distance to the star in parsecs
• "log" is the logarithm base 10 function
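The formula above translates directly into code. A minimal sketch (the magnitude-6 star at 40 parsecs is a made-up example, not from the text):

```python
import math

def absolute_magnitude(m, d_parsecs):
    """M = m + 5 - 5*log10(d), with d in parsecs."""
    return m + 5 - 5 * math.log10(d_parsecs)

# A star at exactly 10 parsecs has M equal to its apparent magnitude:
print(absolute_magnitude(6.0, 10.0))   # 6.0
# A hypothetical magnitude-6 star at 40 parsecs:
print(round(absolute_magnitude(6.0, 40.0), 2))   # 2.99
```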

Or, to go backwards, with the apparent and absolute magnitudes, you can calculate the distance in parsecs like so:

    d  =  10^( (m - M + 5) / 5 )
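Inverting the distance modulus is equally short. A sketch, with a sanity check built in: a star whose apparent and absolute magnitudes are equal must sit at the standard distance of 10 parsecs:

```python
def distance_parsecs(m, M):
    """Invert the distance modulus: d = 10**((m - M + 5)/5), in parsecs."""
    return 10 ** ((m - M + 5) / 5)

# Equal apparent and absolute magnitudes imply the standard distance:
print(distance_parsecs(6.0, 6.0))   # 10.0
```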

Astronomers use absolute magnitudes as one way to describe the power output of a star. Since there is a way to convert back and forth between absolute magnitude and luminosity, you might wonder, "Why don't astronomers just stick with luminosity and get rid of absolute magnitudes?" That's a good question. As you will see later, because astronomers make their measurements in (apparent) magnitudes, it is often much simpler to convert those into absolute magnitudes than to calculate a luminosity; that convenience is why astronomers sometimes use absolute magnitudes.