Magnitudes: How astronomers measure brightness and use it to measure distances
Absolute magnitude is a concept that was invented after apparent magnitude, when astronomers needed a way to compare the intrinsic, or absolute, brightness of celestial objects.
The apparent magnitude of an object only tells us how bright the object appears from Earth. It does not tell us how bright the object is compared to other objects in the universe. For example, from Earth the planet Venus appears brighter than any star in the sky. However, Venus is intrinsically much fainter than the stars; it only appears bright because it is very close to us. Conversely, an object that appears very faint from Earth may actually be very luminous but very far away.
Absolute magnitude is defined as the apparent magnitude an object would have if it were located at a distance of 10 parsecs. For example, the Sun has an apparent magnitude of -26.7 and is the brightest celestial object we can see from Earth. However, if the Sun were 10 parsecs away, its apparent magnitude would be only about +4.7, roughly as bright as Ganymede appears to us on Earth.
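The two magnitudes are connected by the distance modulus, m - M = 5 log10(d / 10 pc), which is what lets astronomers turn brightness measurements into distances. As a minimal sketch of this relation (not part of the original text; the function names and the example star are illustrative), the Python snippet below recovers the Sun's absolute magnitude from its apparent magnitude and its distance of 1 AU, and then inverts the formula to get a distance:

```python
import math

AU_IN_PARSECS = 1.0 / 206265.0  # 1 parsec = 206,265 AU, so 1 AU ~ 4.85e-6 pc

def absolute_magnitude(apparent_mag, distance_pc):
    """Distance modulus rearranged: M = m - 5*log10(d / 10 pc)."""
    return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

def distance_from_magnitudes(apparent_mag, absolute_mag):
    """Invert the distance modulus to recover distance in parsecs."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# The Sun: apparent magnitude -26.7 as seen from 1 AU.
M_sun = absolute_magnitude(-26.7, AU_IN_PARSECS)
print(f"Absolute magnitude of the Sun: {M_sun:+.1f}")
# Prints ~ +4.9; close to the +4.7 quoted above, which uses slightly
# different input values for the Sun's brightness.

# Going the other way: a hypothetical Sun-like star (M = +4.7) that
# appears at m = +9.7 must lie at 10^((9.7 - 4.7 + 5)/5) = 100 parsecs.
print(f"Distance: {distance_from_magnitudes(9.7, 4.7):.0f} pc")
```

This inversion is exactly how "standard candle" distance measurements work: if an object's absolute magnitude is known from its type, measuring its apparent magnitude immediately gives its distance.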
Image credit: Alice Hopkinson, LCO