Basic Astrophysics

The Brightness of Stars

Apparent Magnitude

The most apparent property of a star in the sky is its brightness. The ancient Greek astronomer Hipparchus originally classified stars into six brightness categories, with the first-magnitude stars being the brightest and the sixth-magnitude the dimmest. Subsequent astronomers making more precise measurements needed a less subjective classification system; in the 1850s astronomer N. R. Pogson suggested standardizing the apparent magnitudes of stars using a scale based on the relationship:

m = constant - 2.5 * log10(F)

Here F stands for the flux, or quantity of starlight per unit area, seen on Earth. The constant defines the zero point of the magnitude system (the star Vega has an apparent visual magnitude very close to 0). This relationship gives visible stars magnitudes very similar to those assigned by Hipparchus, but allows astronomers to make precise measurements of magnitudes from instrument readings. Note that increasing flux by a factor of 100 decreases magnitude by 5.
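
As a rough illustration, Pogson's relation can be evaluated directly. The following is a minimal Python sketch; the flux units and the Vega-like zero point are illustrative assumptions for the example, not calibrated values:

    import math

    def apparent_magnitude(flux, zero_point_flux=1.0):
        # Pogson's relation: m = -2.5 * log10(F / F0), where F0 fixes the
        # zero point of the scale (roughly the flux received from Vega).
        return -2.5 * math.log10(flux / zero_point_flux)

    # A star delivering 1/100 the flux is exactly 5 magnitudes fainter:
    print(apparent_magnitude(1.0))    # 0.0
    print(apparent_magnitude(0.01))   # 5.0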

Absolute Magnitude

The apparent magnitude is the brightness of a star as seen on Earth. However, two stars of the same apparent magnitude can have very different luminosities if they lie at different distances.

The absolute magnitude of a star is conventionally defined as the visual magnitude it would have if it were at a distance of 10 parsecs (32.6 light-years) from Earth. Of course, few stars are at exactly that distance, and it's certainly not possible to move a star there just to get a magnitude measurement, so in nearly all cases the absolute magnitude is an estimate derived from other measured properties of the star, most directly its apparent magnitude and its distance. Where an approximate absolute magnitude can be determined, however, it is related to the logarithm of the star's luminosity, and is therefore useful for characterizing the "real" brightness of the star independent of its distance.
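
When a star's distance is known (from parallax measurements, for example), the conversion from apparent to absolute magnitude follows from the distance modulus, M = m - 5 * log10(d / 10 pc). Here is a minimal Python sketch; the figures for the Sun are approximate illustrative values:

    import math

    def absolute_magnitude(apparent_mag, distance_parsecs):
        # Distance modulus: M = m - 5 * log10(d / 10 pc)
        return apparent_mag - 5.0 * math.log10(distance_parsecs / 10.0)

    # The Sun: apparent magnitude about -26.7 as seen from 1 AU
    # (about 4.848e-6 parsecs), giving an absolute magnitude near +4.8.
    print(absolute_magnitude(-26.7, 4.848e-6))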

The Colors of Stars

The other, less apparent property of most stars is their color. Most stars don't have very noticeable colors and look whitish or bluish because they are so dim; our eyes are only capable of seeing the colors of sufficiently bright light, and dim light is seen in shades of grey. A few brighter stars look red, orange, or yellow, but most still look whitish or bluish. This is because whitish or bluish stars tend to be intrinsically brighter; the dim stars we see of those colors are often just very far away, and few red or yellow stars are close enough or bright enough to be as easily visible.

The physical property that most determines a star's color is its surface temperature. Typical stars have surface temperatures that are nearly uniform and that change very little over time. In physical terms, a star is very nearly in a state of thermodynamic equilibrium, and hence stars are well-approximated by what physicists call "black bodies". Here "black" refers to an object that absorbs all radiation falling on it, so that everything seen coming from it is emitted thermally rather than reflected from its surroundings; at low temperatures a black body really does look black. Ideal black bodies have a radiation spectrum first characterized by Planck:

B(w, T) = 2 * h * c^2 / (w^5 * (exp(h * c / (k * w * T)) - 1))

This rather formidable expression gives the amount of radiation emitted at a particular wavelength w by a black body with temperature T. h is Planck's constant (6.63e-34 J*s), c is the speed of light (3.00e8 m/s), and k is Boltzmann's constant (1.38e-23 J/K). The actual units of B are J/(m^2*s*sr)/m, or in longer terms joules per square meter of black body per second of observation per steradian per unit change in wavelength. Integrating this expression over all wavelengths also shows that the total radiative output of a black body is proportional to T^4 (the Stefan-Boltzmann law), so hot objects increase dramatically in brightness with increasing temperature.
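
To make the formula concrete, here is a minimal Python sketch that evaluates B(w, T) with the constants above and scans for the wavelength of peak emission at a solar-like temperature; the temperature and the scan range are illustrative choices:

    import math

    H = 6.63e-34   # Planck's constant, J*s
    K = 1.38e-23   # Boltzmann's constant, J/K
    C = 3.00e8     # speed of light, m/s

    def planck(wavelength, temperature):
        # Spectral radiance B(w, T) in J/(m^2*s*sr)/m for an ideal black body
        return (2.0 * H * C**2 / wavelength**5
                / (math.exp(H * C / (K * wavelength * temperature)) - 1.0))

    # Scan 100-2000 nm for a 5800 K surface; the peak falls near 500 nm,
    # in agreement with Wien's displacement law.
    temperature = 5800.0
    wavelengths = [n * 1e-9 for n in range(100, 2001)]
    peak = max(wavelengths, key=lambda w: planck(w, temperature))
    print("peak emission near %.0f nm" % (peak * 1e9))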

For those of you whose eyes just glazed over, a less technical explanation is that black bodies emit radiation that peaks at a characteristic wavelength and falls off in strength for wavelengths away from that peak. Hotter objects emit radiation at shorter wavelengths. If something is hot enough, it will emit a visible glow, like a light bulb or a heater element. The Sun has a surface temperature of about 5800 degrees Kelvin, and its peak emission wavelength is about 500 nm, which we have evolved to perceive as right in the middle of the range of visible light.

Typical star temperatures range from about 2000 degrees Kelvin to as much as 28,000 degrees Kelvin. The coolest stars actually radiate mostly in the infrared, and we see them as reddish because the tail of the radiation curve that falls into the visible range is brightest in the red; the hottest stars radiate mostly in the ultraviolet, and we see them as bluish-white because the part of the radiation curve that falls into the visible range is brightest in the blue and doesn't fall off as rapidly.

Because the luminosity of a star of a given size is approximately proportional to its temperature to the fourth power, doubling the temperature of a star would increase its luminosity by a factor of 16. The red stars we see in the sky are actually supergiant stars that make up for the low output of their relatively cool surfaces by being huge; the largest known red supergiants would fill our Solar System out to the orbit of Jupiter. A red star the size of our Sun would be too dim to be seen at any great distance. Blue stars, on the other hand, are visible for hundreds or thousands of light-years.
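
A quick back-of-the-envelope comparison makes this scaling concrete. The Python sketch below uses the Stefan-Boltzmann law for a spherical black body, L = 4 * pi * R^2 * sigma * T^4; the supergiant radius and temperature are illustrative round numbers, not measurements of any particular star:

    import math

    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2*K^4)
    R_SUN = 6.96e8    # solar radius, m
    T_SUN = 5800.0    # solar surface temperature, K

    def luminosity(radius, temperature):
        # Stefan-Boltzmann law for a spherical black body
        return 4.0 * math.pi * radius**2 * SIGMA * temperature**4

    l_sun = luminosity(R_SUN, T_SUN)

    # Doubling the temperature at fixed radius raises the output 16-fold:
    print(luminosity(R_SUN, 2.0 * T_SUN) / l_sun)      # ~16

    # An illustrative red supergiant: 3500 K surface, 800 solar radii;
    # the cool surface is overwhelmed by the enormous area (~85,000 Suns).
    print(luminosity(800.0 * R_SUN, 3500.0) / l_sun)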

Source material in this section is taken primarily from Astrophysics and Stellar Astronomy by Thomas L. Swihart (John Wiley & Sons, 1968).


Steve VanDevender
Last modified: Wed Feb 23 15:02:50 PST 2000