This entry is in specific response to the “Where Have All the Remnants Gone?” article from Creation Science Evangelism, though it is espoused in other creation literature.
There is a young-Earth creationist argument that goes as follows: Stars that are much more massive than the Sun end their lives by blasting their outer layers into space in an explosion called a “supernova.” These outer layers of stellar debris are heated and lit up by the energy from the supernova event. The claim then goes that these explosions occur at a certain expected rate (this particular article claims 1 every 25 years in our Milky Way galaxy). Then, if you take the number of observed remnants (around 200) and multiply it by that claimed interval between supernovae, you get an age for our galaxy of around 5,000 years.
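The claim's arithmetic can be sketched in a few lines, using the numbers from the article being rebutted (one supernova every 25 years, roughly 200 known remnants):

```python
# The creationist calculation, as stated: remnant count times claimed interval.
claimed_interval_yr = 25   # article's claimed years between Galactic supernovae
observed_remnants = 200    # rough count of catalogued remnants
claimed_age_yr = observed_remnants * claimed_interval_yr
print(claimed_age_yr)  # 5000
```

As we'll see, nearly every input to this calculation is wrong or incomplete.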
Seems pretty bad for a 13.7-billion-year-old Universe, right? Well, sure it does when you’re fed half-truths.
The real story is a little more complicated, though I’m going to work a little backwards through this problem. First, almost no astronomer says that a supernova should occur in our galaxy once every 25 years. Rather, the canonical number is about 1 every 100 years (in fact, that figure was featured in the Star Trek: Voyager episode “The Q and the Gray”). Revisions over the past few years have pinned it closer to once every 50 years.
So now, if we do straight multiplication, we have about 50 * 200 = 10,000 years. Isn’t that exactly what creationists say (more or less) the age of the Universe should be? Yep, but there’s more.
Next, we cannot observe supernova remnants across our entire galaxy. The supernova events themselves are bright enough to see across the visible universe, but the remnants, which are basically nebulae, are much fainter because the debris is diffuse. Intervening gas and dust also block our view of much of our own galaxy. The farthest we can reliably survey for remnants is perhaps 10,000 light-years, while the galaxy is about 100,000 light-years across (a radius of 50,000 light-years). Since the area of a disk scales as the square of its radius, the full galactic disk covers (50,000/10,000)² = 25 times the area we can actually survey for supernova remnants.
So now, we need to multiply our 10,000 years by 25, giving us 250,000 years for the age of the galaxy.
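Putting the corrected supernova interval and the area correction together gives the revised figure. A rough sketch, using only the numbers quoted above:

```python
# Area-corrected version of the calculation (all figures from the text above).
survey_radius_ly = 10_000    # how far we can survey for faint remnants
galaxy_radius_ly = 50_000    # radius of the Galactic disk
area_ratio = (galaxy_radius_ly / survey_radius_ly) ** 2  # disk area goes as r^2

interval_yr = 50             # roughly one Galactic supernova per 50 years
observed_remnants = 200
corrected_age_yr = observed_remnants * interval_yr * area_ratio
print(area_ratio, corrected_age_yr)  # 25.0 250000.0
```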
The next part is that supernova remnants don’t just form out of nothing; they form from the explosions of dying stars. Even the stars that live and die the fastest still take about 10,000,000 years before they “go supernova” and release the cloud of debris that we later observe as a remnant. That’s pretty much the minimum time a star can “live” during the current epoch of the Universe, and only after that lifetime has elapsed can the first remnants appear.
So, add that to our remnant-based estimate and we have 10,250,000 years, or 10.25 million years, for the age of the galaxy. You should note at this point that I’ve been saying “age of the galaxy.” That’s because this method could only be used to date our galaxy, not the Universe as a whole. So you need to add in the time it took for the galaxy to form in the first place … which is still a number that’s hotly debated, but no respected astronomer will say it happens instantaneously.
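The running total, as a one-line check:

```python
# Minimum stellar lifetime plus the area-corrected remnant figure.
min_stellar_lifetime_yr = 10_000_000  # fastest-dying massive stars
remnant_count_age_yr = 250_000        # area-corrected figure from above
print(min_stellar_lifetime_yr + remnant_count_age_yr)  # 10250000
```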
BUT, there’s another complication to this situation which shows why this apparent “method” for dating our galaxy isn’t valid at all: Supernova remnants fade! They are only visible for a few tens of thousands of years. What does this mean for our estimate of 10.25 million years for the age of our galaxy? Well, by the time the “oldest” remnant is fading from view, remnant number 200 or so is just appearing. The visible population reaches a steady state: we should only ever expect to see in the neighborhood of a few hundred supernova remnants in our vicinity, regardless of how old our galaxy actually is.
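The steady-state point can be sketched numerically. The 10,000-year visibility span below is an assumed, illustrative value within the “few tens of thousands of years” range given above, not a precise measurement:

```python
# Steady state: once remnants fade, the visible population stops growing.
# The count levels off at (visible lifetime) / (interval between supernovae),
# no matter how old the galaxy is.
visible_lifetime_yr = 10_000  # assumed remnant visibility span (illustrative)
interval_yr = 50              # roughly one Galactic supernova per 50 years
steady_state_count = visible_lifetime_yr // interval_yr
print(steady_state_count)  # 200
```

Notice that this simple estimate lands right around the number of remnants actually observed, which is exactly what we would expect for a galaxy far older than its visible remnant population.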