Suppose that a significant fraction of white dwarfs have absolute magnitudes as faint as M(G) = 23. Would we see them on this diagram?

The difference between the apparent magnitude m and the absolute magnitude M is the distance modulus, (m - M). It depends on the distance d in pc as follows:

(m - M) = 5 * log(d) - 5

So, at the maximum distance of this sample, d = 10 pc, we expect

(m - M) = 5 * log(10) - 5 = 0  -->  m = M

Since our sample is cut off at an apparent magnitude m = 18, that means that at the maximum distance of d = 10 pc, we can only detect stars with absolute magnitude M ≤ 18. Any stars with fainter absolute magnitudes, M > 18, would have to be closer to the Sun for Gaia to detect them at m ≤ 18. In particular, a star of absolute magnitude M = 23 would need (m - M) = 18 - 23 = -5, which corresponds to a distance of just d = 1 pc. So, no: such faint white dwarfs would appear in this sample only if they happened to lie within about 1 pc of the Sun.

Suppose that a significant fraction of stars have absolute magnitudes as bright as M(G) = -3. Would we see them on this diagram?

Very luminous stars should be detected whether they lie at d = 1 pc, 5 pc, or 10 pc, or even at much larger distances. For example, a star with M = -3 would reach the critical apparent magnitude m = 18 at a distance given by

(m - M) = (18 - (-3)) = 21 = 5 * log(d) - 5

Solving for the distance d at which this star would just barely make it into the Gaia sample,

d = 10^(0.2 * (21 + 5)) pc ≈ 160,000 pc

So, yes, we ought to see such stars on this diagram.

Would your answers to either question change if we included stars out to a distance larger than 10 pc?

If we look at larger distances, we should NOT find any more very low-luminosity stars, because they would be too faint to be detected; in other words, their apparent magnitudes would be fainter than m = 18. On the other hand, even at distances much larger than d = 10 pc, very luminous stars should still have apparent magnitudes brighter than m = 18, and so they should be detected.
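
To make the arithmetic easy to check, here is a minimal Python sketch that inverts the distance-modulus relation to find the largest distance at which a star of a given absolute magnitude still appears brighter than the survey cut. The function name limiting_distance_pc and the m_limit parameter are purely illustrative (not part of any Gaia software); the formula is just the relation (m - M) = 5 * log(d) - 5 solved for d.

```python
def limiting_distance_pc(M_abs, m_limit=18.0):
    """Maximum distance (pc) at which a star of absolute magnitude M_abs
    still appears brighter than the survey limit m_limit, from
    m - M = 5*log10(d) - 5  -->  d = 10**(0.2*(m - M + 5))."""
    return 10 ** (0.2 * (m_limit - M_abs + 5))

# Faint white dwarf, M(G) = 23: must lie within about 1 pc to reach m = 18
print(limiting_distance_pc(23))   # 1.0 pc

# Luminous star, M(G) = -3: detectable out to roughly 160,000 pc
print(limiting_distance_pc(-3))   # ~1.6e5 pc
```

Running the two calls reproduces the numbers quoted above: about 1 pc for the M = 23 white dwarf and about 160,000 pc for the M = -3 star.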