Stellar Magnitude: Understanding the Brightness of Stars and Planets

Astronomers use a numerical measure called “magnitude” to describe the brightness of stars, planets, and other objects in the night sky. Here’s how it works.

By convention, brighter objects have a smaller numerical value of magnitude than fainter objects. So a star with magnitude 4 is brighter than a star with magnitude 5, for example. It’s a little like a ranking system, where brighter stars are assigned a smaller number. Again by convention, stellar magnitudes are defined so that an object with magnitude 1.0 is 100 times brighter than an object with magnitude 6.0. Each step of 1.0 in magnitude therefore corresponds to the fifth root of 100, a factor of about 2.512. That means a star of magnitude 3.0 is 2.512 times as bright as a star of magnitude 4.0, which is 2.512 times as bright as a star of magnitude 5.0, and so on. Try it yourself, if you have a calculator handy.
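If you'd rather let a computer do the arithmetic, the ratio works out in a few lines of Python (the function name here is just for illustration):

```python
def brightness_ratio(fainter_mag, brighter_mag):
    """How many times brighter the second object is than the first.

    Each step of 1.0 in magnitude is a factor of 100**(1/5), about 2.512.
    """
    return 100 ** ((fainter_mag - brighter_mag) / 5)

# A magnitude-1.0 star vs. a magnitude-6.0 star: 100 times brighter.
print(brightness_ratio(6.0, 1.0))               # 100.0
# A single step of 1.0 magnitude: the fifth root of 100.
print(round(brightness_ratio(4.0, 3.0), 3))     # 2.512
```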

With your naked eye, you can see objects down to 6th magnitude; with a pair of 7×50 binoculars you can see down to magnitude 10.5 or so; and with an 8” telescope, perhaps magnitude 13.5. Using sophisticated cameras and software, the Hubble Space Telescope can detect objects to about 30th magnitude, about 4 billion times fainter than you can see with your eye.
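That “4 billion” figure follows directly from the scale: 24 magnitudes separate the naked-eye limit (6) from Hubble’s limit (about 30). A quick check:

```python
# 24 magnitudes between the naked-eye limit (6) and Hubble's limit (~30)
gain = 100 ** ((30 - 6) / 5)
print(f"{gain:.2e}")   # 3.98e+09, roughly 4 billion
```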

For numerical convenience, an object brighter than 0th magnitude has a negative magnitude: the brightest star, Sirius, is magnitude -1.4; the full moon is magnitude -13; and the Sun is a blazing -26th magnitude.

The “apparent” magnitude measures how bright a star appears in the sky, regardless of how bright it truly is.

“Absolute” magnitude is a measure of the true, intrinsic brightness of a star. It’s defined as the apparent magnitude an object would have if it were 32.616 light-years (10 parsecs) away. While the Sun has an apparent magnitude of -26, it has a modest absolute magnitude of 4.7. Deneb, the brightest star in Cygnus, has an absolute magnitude of -8.73, more than 200,000 times as bright as our Sun. But its apparent magnitude is only 1.25 because it’s so far away, roughly 3,200 light-years from Earth.
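The link between apparent and absolute magnitude is the standard distance-modulus relation, M = m − 5·log₁₀(d / 10 pc). Here’s a small Python sketch using Deneb’s figures from above (the function name and constant are illustrative, and the result depends on the assumed 3,200 light-year distance):

```python
import math

LY_PER_PARSEC = 3.2616  # light-years in one parsec

def absolute_magnitude(apparent_mag, distance_ly):
    """Distance-modulus formula: M = m - 5*log10(d_pc / 10)."""
    d_pc = distance_ly / LY_PER_PARSEC
    return apparent_mag - 5 * math.log10(d_pc / 10)

# Deneb: apparent magnitude 1.25 at roughly 3,200 light-years
print(round(absolute_magnitude(1.25, 3200), 2))   # -8.71, close to the -8.73 quoted above
```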

The ancient Greek astronomer Hipparchus developed the system of magnitude we use today back around 120 B.C. He used his system to catalog the brightness and position of 1,080 stars. In 1997, a European satellite named after him, Hipparcos, produced the most accurate catalog to date. It lists the precise positions of nearly 120,000 stars, and is available online to anyone who wants to use it.