Richard C. Hoagland. Yes, another post about some of his claims (not him). Things had been quiet from Mr. Hoagland for several months, apparently because of his latest work, spending 4 months attempting to show that the Chinese lunar mission, Chang’e 3 (嫦娥三号), and its rover Yutu (Jade Rabbit, 玉兔), had found evidence of the same thing he thinks he sees in Apollo photographs (that was likely dirt on his scanner): Ancient glass towers on the horizon.
Where to start? What to address? His “paper” on the subject, which you can find at his “Enterprise Mission dot com” website (sorry, I’m not going to link), is massive. It has 136 markup errors and 23 warnings according to the W3C markup validation service. It is nearly 14,000 words long, Richard says it has over 100 photographs, and if I were to print it from Safari, it would come out at 86 pages.
That is a long way of saying there’s no way I’m going to come close to addressing it all, or even try to. There’s so much I could write even about the specific topic I want to talk about – image noise and sensor non-uniformity – that I’m only going to be able to cover it in broad brushstrokes, but hopefully it will be understandable.
What I’m Not Talking About
Richard has been making the circuit of the late-night paranormal shows, podcasts, etc. Tonight he was on Jimmy Church’s show, the one I was on two weeks ago. I think that, given that I heard his Coast to Coast interview, his “The Unexplained” interview, and “Fade to Black” interview, I have a reasonable idea of his argument (keep in mind, that’s about 5.5 non-commercial-hours of listening to Richard talk about this, so forgive me for not reading another 13,700 words).
I’m not going to talk about his numerology:
- 19.5° … anything.
- Landed at 19.5° … LONGITUDE, not latitude.
- Landed at 44°N which was a message to the 44th President … Obama.
I’m not going to talk about his conspiracy and symbolism:
- Obama made some mention of carrot seeds in a gift to the Pope, which was a hidden message about the Jade Rabbit lunar rover.
- All his previous NASA conspiracy stuff coming in.
- Disclosure is going to happen within a few months (I seem to recall him saying 2010 was the Year of Disclosure and then in 2011 when being called on it (a rare instance of being called out), he said it was, we just hadn’t noticed it).
- Brookings Report
I’m not going to discuss his pareidolia, since that doesn’t really play much of a role in this set of claims (to the extent it does with grids, that will be discussed).
And so, of the four things that comprise the vast majority of Richard Hoagland’s claims (numerology, conspiracy, pareidolia, shoddy image analysis), it will be the image analysis that I will delve into.
Why not this other stuff? Why isn’t that as important? Because none of it is actual objective evidence for anything. It is supposition, ancillary to the claimed photographic evidence that I’ll be discussing in the rest of this post. Since the photography is the only (or the most) objective part, that’s what’s important to examine.
The Images (One of Them)
Here is the hallmark image that Richard has been sending to radio hosts. And I have included the original caption.
Ah, noise. Most of us are familiar with audio noise. Turn speakers on, when they’re not connected to anything else, and put the gain up all the way. You will hear static. That’s random electrons being picked up by the circuitry and being amplified as, literally, noise.
The same thing happens with digital cameras. They work by converting photons (little packets of light) into electrical energy, and recording how much energy was collected at each pixel. Some pixels are more sensitive than others. Some pixels are always on, some are always off. Usually, because of the manufacturing process, entire rows and/or columns of pixels will be slightly more sensitive than others. And there are the statistical fluctuations that come with counting photons.
When cameras get warm, the molecules have more energy (definition of heat), and are more likely to randomly emit an electron that will be recorded … as in, noise. That is why professional – and even enthusiast – astronomy CCDs are cooled, sometimes with liquid nitrogen. It reduces the noise. If your sensor is unevenly heated, that can cause uneven noise across it (more noise where it’s warmer). Just a degree temperature difference will do it.
All of those mean that ANY digital detector will have noise – I don’t care how good it is, how much you paid for it, how many pixels it has, if it’s color or B&W … whatever about it – it will have noise. The fact that it has a temperature above absolute zero means it will have noise.
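If you want to play with this yourself, here’s a minimal sketch of how that fixed-pattern (banding) noise works. All the numbers here are made up for illustration – the point is that per-column sensitivity offsets turn pure randomness into perfectly straight lines:

```python
import random

random.seed(42)

ROWS, COLS = 8, 8
# Hypothetical per-column gain offsets: manufacturing variation means
# entire columns read out slightly hotter or cooler than their neighbors.
col_offset = [random.gauss(0, 1.5) for _ in range(COLS)]

def read_dark_frame():
    """Image a perfectly black scene: the 'true' signal is 0 everywhere,
    but each pixel records thermal noise plus its column's fixed offset."""
    frame = []
    for r in range(ROWS):
        row = []
        for c in range(COLS):
            thermal = random.gauss(0, 0.5)             # random shot/read noise
            value = max(0.0, thermal + col_offset[c])  # counts can't go below 0
            row.append(value)
        frame.append(row)
    return frame

frame = read_dark_frame()
# Average each column: the random part tends to wash out, but the fixed
# offsets survive averaging, so the "noise" lines up in straight columns.
col_means = [sum(frame[r][c] for r in range(ROWS)) / ROWS for c in range(COLS)]
print([round(m, 2) for m in col_means])
```

Even though the scene was black, some columns come out consistently brighter than others – geometric-looking structure from nothing but the sensor itself.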
Here is an excellent tutorial on image noise. If at this point you don’t know what I’m really talking about, please read it, or at least look at the images. There is a small link to a part 2 at the bottom. Going forward, I’m going to assume that you have a reasonable grasp of noise. This is already a long post.
What Is “Equalization”?
“I just brightened up the images a little bit.” –RCH
This is a hallmark of much of Richard Hoagland’s types of claims. Brightening the image, increasing contrast, increasing saturation, etc.
Equalization itself can have innumerable types of algorithms, but the basic idea is this: Many photographs of a typical scene have a little bit of dark, a little bit of bright, and a lot in the middle. That’s not how you have to shoot a photo, but it’s typical (go to this post and look for “Histograms”). What Equalize does is put the same number of pixels at every brightness level.
So, in that example, it will move some of the slightly darker middle colors to be darker, and it will move the slightly brighter middle colors to be lighter. That way, if your image has 256 pixels, and you’re in 8-bit mode so there are 256 levels of brightness, one pixel will have a brightness of 0, one will have a brightness of 1, one will have a brightness of 2, and so on.
Inevitably, this has the effect of stretching at least some brightness levels in the image. More on that in a bit.
This can be good! You take a wedding photo and Equalize can help bring out detail in both the bride’s white dress and the groom’s black tux (if we’re talking about a Western-style heterosexual marriage). That’s because the image, as-shot, would have a lot of dark pixels and a lot of bright pixels, so Equalize will bring them more to the middle.
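To make that concrete, here’s a minimal sketch of the standard CDF-based equalization algorithm (Photoshop’s exact implementation may differ in details, and the example image here is made up):

```python
def equalize(pixels):
    """Map 8-bit pixel brightnesses (0-255) so every level is used
    about equally, via the cumulative distribution function (CDF)."""
    n = len(pixels)
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution: fraction of pixels at or below each level.
    cdf, total = [0.0] * 256, 0
    for level in range(256):
        total += hist[level]
        cdf[level] = total / n
    # Each pixel moves to the brightness proportional to its CDF rank.
    return [round(cdf[p] * 255) for p in pixels]

# A "mostly middle" image: 100 pixels clustered near mid-grey.
image = [100] * 25 + [128] * 50 + [160] * 25
print(sorted(set(equalize(image))))  # → [64, 191, 255]
```

Note what happened: the darker middle value (100) got pushed down to 64 and the brighter middle values got pushed up, exactly the stretching described above.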
But this can also be bad or silly, as I show in the next section.
Below is an image I took of the moon last year.
The top image shows my nice, well-exposed photograph of the moon.
Then I saved the image as a JPG. The middle row shows what happened when I applied Photoshop’s “Equalize” option: the left column is from the original image, before I saved it; the right column is from the JPG-saved version. The bottom row is what happened after I converted both to greyscale, just for completeness.
So, what is this showing? Noise! (The pixel noise I talked about before and the JPG compression artifacts, though I’m not going to talk about those JPG artifacts in this post.) As I talked about above, different rows and columns of pixels are very slightly more or less sensitive than others. It doesn’t matter how good your sensor is, it will still have imperfections.
Since this particular sensor is three-color (RGB pixels), then different rows and columns of colors have different sensitivities, hence the red/pink checkerboard feature. The green pixels in this sensor apparently had better noise properties than the red and blue.
Notice also that it’s brighter around the moon. As if it’s surrounded by tall glass structures! After all, all of this stuff is showing perfect rectilinear geometry.
But why did I say that using equalization on this is silly? It’s because it is. The moon was surrounded by black sky. If you go to the original image on my computer (please don’t hack me), the pixel values – the number of photons recorded – scaled between 0 and 255, are 0-2 in that dark area. That, dear reader, is noise.
What about close to the moon? It rises to 8-20. (The Moon itself is 150-230.) The 8-20 pixel brightnesses are both noise AND, more importantly in this case, scattered light. This gets to another thing about optics: I don’t care how good your optics are, what kind of special coatings they have … unless you are in a clean room with ZERO dust, and perhaps using the clearest of crystals as your optics, your optics are not perfect and they will scatter light. Meaning the light won’t just pass through as it should; a few photons will be deflected and go somewhere else.
What that means for this case is that the moon is a bright circle on this image. A few of those photons are going to scatter within the optics of my camera and the probably 15 different pieces of glass that form the lenses. Probably, they won’t scatter far. That’s why right next to the moon, it’s 8-20 photons. But just 10% of the image away, we’re back at the background level of 0-2.
This all gets back to Richard’s images. I can’t figure out exactly which image Richard used as his main one, but another he uses comes from here, the bottom one with the caption 嫦娥三号着陆器地形地貌相机拍摄的“玉兔”号月球车照片。 (“Photograph of the ‘Yutu’ [Jade Rabbit] rover taken by the Chang’e 3 lander’s topography camera.”)
Here’s how Richard presents it:
When I look at the image, the pixel values in the sky on the left half are 0-5. The right half is 5-10. The lunar features are around 100-250. It’s the same on a higher-resolution image that “Dee” found and posted over on Expat’s Dork Mission blog. Again, noise. And, I think, some scattered light.
The moon is dusty. And even if it weren’t, the scattered light in the right half makes sense, getting back-scatter from the sun coming from the right, scattered in all directions but a bit more back to the right, and into the camera lens on that side. Another explanation is that the right half of the sensor was very slightly warmer than the left half. That will also give you noise of this exact type.
And, if these are glass towers, one must also ask why they stop just above the horizon!? On Richard’s “Enhancement” via Equalization, he shows the lunar surface, and just above it is a black line, and then are his glass towers.
But back to Equalize: what happened? Well, about half the image is really dark, with pixel brightness values between about 0 and 10. The other half is middle to bright, with brightness values between about 100 and 255. Because Equalize demands that the same number of pixels be at every level of brightness, it has to make a lot of those dark pixels brighter. Since half-way between 0 and 255 is about 127, it barely has to do anything to the lunar surface part – it’ll make some of those pixels a little darker and some a little brighter. The most drastic change is to the sky area: because that’s half the image, that half must now be mapped from 0-10 brightness to about 0-127 brightness. (Since it’s a little less than half, it only gets mapped up to about 90, but you get the idea.)
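Here’s a toy numerical version of that, again using standard CDF-style equalization (a sketch: the pixel counts and brightness ranges are invented to mimic the Chang’e 3 image, and Photoshop’s algorithm may differ in details):

```python
import random

random.seed(1)

# Slightly less than half the pixels are near-black "sky" (0-10);
# the rest are bright "lunar surface" (100-255).
sky = [random.randint(0, 10) for _ in range(450)]
surface = [random.randint(100, 255) for _ in range(550)]
pixels = sky + surface

# CDF-based equalization, inline for self-containment.
n = len(pixels)
hist = [0] * 256
for p in pixels:
    hist[p] += 1
cdf, total = [0.0] * 256, 0
for level in range(256):
    total += hist[level]
    cdf[level] = total / n
out = [round(cdf[p] * 255) for p in pixels]

sky_out = out[:len(sky)]
surf_out = out[len(sky):]
# The 0-10 sky noise is stretched up across roughly the bottom 45% of
# the brightness range; the surface stays pinned near the top.
print("sky:", min(sky_out), "-", max(sky_out))
print("surface:", min(surf_out), "-", max(surf_out))
```

Run it and the sky’s 0-10 noise ends up spanning brightnesses up to around 115, while the surface still occupies the upper part of the range – pure noise, made to look like glowing structure.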
Richard Says it Could Be Noise, But It’s Not Because It’s Geometric and Not Below Horizon
Richard said words to that effect at about 9:04 PM on Jimmy’s radio program (just over 2 hrs into it). He said it could either be noise or glass structures. He said it’s not noise because it’s geometric and because it doesn’t show below the horizon (the surface of the moon).
I reject both of those as evidence that it’s not noise. I reject the geometry argument because of my example above with my image of the moon; see also that tutorial’s section on banding noise. If he thinks that image noise is not geometric (both the noise from the sensor and the noise from JPG compression can be), he is either more ignorant or delusional than I thought or, well, not telling the truth. Sorry, it’s hard to listen to him for 3 hrs and write 2800 words and not get in a small ad hominem.
I reject the part about it not showing below the horizon as evidence it’s not noise because of … the numbers. Even Richard often says, “It’s about the numbers!” In this case, you are talking about pixel values of about 5-10 brightness. Let’s say that’s noise. Just give that to me for a moment. Now look at the actual lit part of the moon. Pixel values 100-250. Noise of 5-10 photons on top of 0 is HUGE. Noise of 5-10 photons on top of 100-250 is minuscule. In other words, I say that the noise is still there in the part below the horizon, you just don’t see it.
Again, I’ll refer to the tutorial I linked to, specifically the first two images. The top one shows the same noise level, but a large signal (like the lunar surface). The second one shows the same noise level, but a very small signal (like the sky, though I’d say there’s no signal, it’s all noise).
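The arithmetic here is simple enough to sketch out (the 7-count noise figure and 175-count surface brightness are just representative picks from the ranges above):

```python
# The same absolute noise is glaring against a black sky and invisible
# against a bright surface.
noise = 7             # a typical fluctuation from the 5-10 count range

sky_signal = 0        # black sky: the recorded counts are ALL noise
surface_signal = 175  # lit lunar surface, mid-range brightness

sky_reading = sky_signal + noise          # 7 counts, 100% noise
surface_reading = surface_signal + noise  # 182 counts

surface_error = noise / surface_reading   # fraction of the reading that's noise
print(f"sky: {sky_reading} counts, all of it noise")
print(f"surface: {surface_reading} counts, noise is {surface_error:.1%} of it")
```

Against the sky, the noise is 100% of what you see; against the surface, it’s a few percent, far too small to notice by eye.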
Other Relevant, Miscellaneous Statements
“The data are replicable” therefore the fact that he sees this in Apollo and the Chang’e 3 images means it’s real, it’s “stunning confirmation.”
Yes, it certainly does mean that image noise is real and banding noise is also a real type.
“The Chinese have gone to the Moon and sent back the message — ‘Hoagland was right.'”
This one in particular struck Expat from the Dork Mission blog. I kinda agree. I find it typical of a decent number of claims by conspiracists in general and (personally) I find it somewhat arrogant to think that THEY are the only ones who can decode these secret messages, and even that the encoders are speaking directly to them!
Other “Enterprise Mission scientists” agree with him.
That is not peer review, that is an echo chamber. There’s a reason that Richard Hoagland is making the circuit on the paranormal shows and not anything else with this stuff.
Since the images are still up on the various Chinese websites, even after Richard Hoagland’s disclosure last week, that means that they are admitting that this stuff is real.
No, it means that most of us – that is, the very small community of scientists who actually follow these kinds of topics – laugh this off as Richard not understanding anything about photographic noise.
Where Are These “Miles-High, Miles-Across” Features in Meter-Scale Orbital Photographs?
As with the lunar ziggurat saga, and even as Richard stated (though his example doesn’t qualify), science demands repetition of objective data. Richard has claimed that these features are massive – miles across, miles high, and miles wide. The Lunar Reconnaissance Orbiter’s Narrow Angle Camera records images with pixel scales of around 0.5 meters. Its Wide Angle Camera is ~60 meters per pixel. The Japanese Kaguya craft had a terrain camera that recorded images at ~10 meters per pixel. And that’s just in the last few years.
But, none of these craft show these features. Even Richard hasn’t pointed to any of these images that would allegedly show them.
And yet, Richard claims that they are shimmering, glittering, multi-colored, glassy … and miles across. Where are they in other imagery? Yes, you’re looking down from above, but glass refracts light (index of refraction is 1.5-2.0-ish) so we should see weird distortions. And, he says it glitters, so we should see specular reflections, especially off the parts that are “damaged” (as he put it) and haven’t been repaired yet by robots (which he said are there in the C2C interview).
So either they should be there in the other images, or they don’t exist. Or, massive conspiracy. *cough*
I wasn’t going to talk about this stuff. When I first listened to it on C2C last week, my almost knee-jerk response was, “This falls into the category of Not Even Wrong.” In other words, there is simply so little correct, so little grounding in reality, that (a) there’s no way to even start to address it all, and (b) there’s usually not any reason to.
As RationalWiki put it, a Not Even Wrong statement is not of the form “2 + 2 = 6,” but rather, “2 + zebra ÷ glockenspiel = homeopathy works!”
Then I saw how much press he was getting on these various shows. On “The Unexplained,” he made a comment to the effect that he can reach 2% of the population (US population? world population?) “without breaking a sweat.” And he’s right. Or at least close to it. If the audience numbers that I’ve heard reported for these various shows are correct, he’s probably easily reached over a million listeners at this point, or 1/3 of a percent of the US population.
I don’t believe for a Planck time that all those people believe what Richard says. I also, honestly, don’t think it’s incredibly important whether they do or not. But the credulity fostered by even entertaining this kind of “analysis” transfers into other fields. Like medicine. Or believing in a soon-to-be apocalypse. And there, your choices, your belief in various “alternative” views, can kill you. Or, in cases of psychics, astrologers, and others, they can bankrupt you.