Magnitude Scale
In order to quantify the night sky, astronomers have for ages relied on the brightness and position of stars. By measuring the intensity of light a star delivers across the electromagnetic spectrum, a magnitude can be assigned to that star. The first astronomer to catalog the "magnitude" of stars was the Greek thinker Hipparchus, who used a scale from 1 to 6 to record the brightness of stars along with their locations: 1 labeled the brightest stars and 6 the faintest. As technology improved and statistical methods were refined, it became clear that the scale between "apparent" magnitudes is not linear but logarithmic. With careful photometric measurements, astronomers found that a difference of 5 magnitudes corresponds to a 100-fold change in brightness, giving the now-standard factor of \(100^{1/5} \approx 2.512\) per magnitude step. For example, a difference of two magnitudes means one star is \(2.512^2 \approx 6.31\) times brighter than the other.
Now astronomers can measure magnitudes to a precision of about 0.01. They have also extended Hipparchus' scale (1 to 6) to a much larger range, from -26.83 (the Sun) down to roughly +30 (the faintest detectable stars). That span of nearly 57 magnitudes gives a ratio of \(100^{56.83/5} \approx 10^{23}\) between the brightness of our Sun and the faintest stars.
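To make the arithmetic concrete, here is a minimal Python sketch (the function name brightness_ratio is ours, purely for illustration) that turns a magnitude difference into a flux ratio and reproduces the numbers above:

```python
def brightness_ratio(m1: float, m2: float) -> float:
    """Flux ratio F2/F1 implied by apparent magnitudes m1 and m2."""
    return 100 ** ((m1 - m2) / 5)

print(brightness_ratio(2.0, 1.0))      # one step:  ~2.512x brighter
print(brightness_ratio(3.0, 1.0))      # two steps: ~6.31x brighter
print(brightness_ratio(30.0, -26.83))  # Sun vs. faintest stars: ~5e22, i.e. ~10**23
```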
To quantify the magnitude scale a bit more, we describe the "brightness" of a star in terms of luminosity and radiant flux. Radiant flux is defined as the amount of energy (over all wavelengths) that a star gives off per second, as received on each square meter of a detector. This flux depends on the luminosity of the star, which is simply the energy it emits per second, spread over a sphere by the inverse square law. Thus the relationship between flux and luminosity is

\[ F = \frac{L}{4\pi r^2} \]

where r is the distance from the observed star to the detector, L is the luminosity of the star, and F is the radiant flux.
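As a sanity check on the inverse square law, a short sketch (assuming the IAU nominal solar luminosity of about \(3.828 \times 10^{26}\) W) should reproduce the solar constant measured at Earth, roughly 1361 W per square meter:

```python
import math

def radiant_flux(luminosity_watts: float, distance_meters: float) -> float:
    """Inverse square law: flux in W/m^2 at distance r from a star of luminosity L."""
    return luminosity_watts / (4 * math.pi * distance_meters ** 2)

# Sun's luminosity ~3.828e26 W observed from 1 AU ~ 1.496e11 m:
print(radiant_flux(3.828e26, 1.496e11))  # ~1361 W/m^2, the solar constant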
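This relationship can be tied back to the apparent magnitude of a star through the following relationship between the radiant fluxes of two stars being compared:

\[ \frac{F_2}{F_1} = 100^{(m_1 - m_2)/5} \]

where \(F_1, F_2\) are the fluxes of the two stars and \(m_1, m_2\) are their apparent magnitudes. Taking the logarithm gives the usual working form, \(m_1 - m_2 = -2.5 \log_{10}(F_1/F_2)\); a quick sketch (function name ours) confirms that a 100-fold flux difference corresponds to 5 magnitudes:

```python
import math

def magnitude_difference(flux1: float, flux2: float) -> float:
    """m1 - m2 given measured fluxes F1 and F2 (brighter means *lower* magnitude)."""
    return -2.5 * math.log10(flux1 / flux2)

print(magnitude_difference(100.0, 1.0))  # -5.0: 100x the flux is 5 magnitudes brighter
```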
Once the relationship between apparent magnitude and flux is determined, astronomers use what is called "absolute" magnitude M, the apparent magnitude a star would have if placed exactly 10 parsecs from the observer. Since a magnitude change of 5 corresponds to a flux ratio of 100, and flux falls off with the square of distance, we can expand this equation out to distance and get the final relationship (the distance modulus) between apparent magnitude m, absolute magnitude M, and the distance d to a star in parsecs:

\[ m - M = 5 \log_{10}\!\left(\frac{d}{10\ \mathrm{pc}}\right) = 5 \log_{10} d - 5 \]
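Inverting the distance modulus gives \(d = 10^{(m - M + 5)/5}\) parsecs. A small sketch, using textbook values for Sirius (apparent magnitude about -1.46, absolute magnitude about +1.43), recovers its parallax distance of roughly 2.6 parsecs:

```python
def distance_parsecs(m: float, M: float) -> float:
    """Invert the distance modulus: d = 10**((m - M + 5) / 5) parsecs."""
    return 10 ** ((m - M + 5) / 5)

print(distance_parsecs(-1.46, 1.43))  # ~2.64 pc for Sirius
```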
With these quantities in mind, astronomers can calculate the distance to a star based solely on the energy they receive from it across all wavelengths. Conversely, if the distance to a star is already known from geometry (for example, by parallax), its apparent and absolute brightnesses can be compared and the life cycle of the star can be approximated.