Exposing PseudoAstronomy

March 14, 2015

Podcast Episode 128: The Saga of Comet Hale-Bopp and its Fugacious Companion, Part 2


Great Comet Hale-Bopp,
Part 2: On remote viewing
The comet’s partner.

Second in the three-part series: The saga of the great and powerful Comet Hale-Bopp and the conspiracy, mystery, intrigue, lies, schemes, hoaxes, and suicides that accompanied it. The idea came when I started listening to a new Art Bell set of interviews that I had obtained, and I realized early in the episode (November 14, 1996) that I was listening to THE interview that started the whole thing. I found another dozen or so interviews and decided to make an episode out of it that has blossomed into three episodes.

The three episodes are meant to be stand-alone in that they don’t need the others to be understandable. But, put them together and they tell the story in a lot more depth. This second part is about one of the primary drivers behind the Hale-Bopp companion, Courtney Brown, and his remote viewing claims. While he provided the hoaxed photographs to Art Bell and Whitley Strieber (per Part 1), he claimed that all of his evidence for the companion was “good data” and based on remote viewing.

Part 3 will be on the Heaven’s Gate cult and aftermath and continued conspiracy, including a brief entry by Richard Hoagland.

I have decided that, while I may do my interview with Dave Draper on potentially pseudoscientific conference abstracts before Parts 2 or 3 are finished, I will wait to put it out, such that Parts 1-3 will be back-to-back-to-back.

While there was one logical fallacy in the episode (argument from authority), I instead used the segment to discuss part of the skeptical toolkit: The BS Meter, what should trigger it, and what you should do about it. The bottom line is that you should question any claim that sets off your BS meter, and even when something seems innocuous and small and not even part of what could have led to the anomalous result, you should still check it.

And, finally, I plan to do a small tribute to Leonard Nimoy, no earlier than April 1. The tribute will be from you: If he or any of his characters affected you (especially as perhaps related to an interest in science or astronomy or critical thinking), please send in a few sentences. Or, record no more than 30-60 seconds and send the file to me. I will read/play them either on episode 130 or 131.

Finally, this episode is coming out a bit early because I’m leaving for a week for a planetary science conference and won’t be able to do much of anything else while I’m there.

March 2, 2015

Podcast Episode 127: The Saga of Comet Hale-Bopp and its Fugacious Companion, Part 1


Great Comet Hale-Bopp,
Part 1: On the claimed photos
Of your companion.

I’ve been working on this episode for a while: The saga of the great and powerful Comet Hale-Bopp and the conspiracy, mystery, intrigue, lies, schemes, hoaxes, and suicides that accompanied it. The idea came when I started listening to a new Art Bell set of interviews that I had obtained, and I realized early in the episode (November 14, 1996) that I was listening to THE interview that started the whole thing. I found another dozen or so interviews and decided to make an episode out of it. About three months and over 10,000 words of notes and transcripts later, this is the release of Part 1 of what will be a three-part series on Hale-Bopp.

The three episodes are meant to be stand-alone in that they don’t need the others to be understandable. But, put them together and they tell the story in a lot more depth. This first part is about the image – the “hard science” – claims about the companion. Next one will be on the remote viewing claims and aftermath, and the third will be on the Heaven’s Gate cult and aftermath and continued conspiracy, including a brief entry by Richard Hoagland.

I have decided that, while I may do my interview with Dave Draper on potentially pseudoscientific conference abstracts before Parts 2 or 3 are finished, I will wait to put it out, such that Parts 1-3 will be back-to-back-to-back.

There were two logical fallacies pointed out in this episode: Argument against authority, and correlation ≠ causation (cum hoc ergo propter hoc).

And, finally, I plan to do a small tribute to Leonard Nimoy, no earlier than April 1. The tribute will be from you: If he or any of his characters affected you (especially as perhaps related to an interest in science or astronomy or critical thinking), please send in a few sentences. Or, record no more than 30-60 seconds and send the file to me. I will read/play them either on episode 129, 130, or 131.

February 4, 2015

New Horizons’ First Images of Pluto from Its Approach Phase – What’s Going On?


Introduction

New Horizons is a spacecraft headed to Pluto. It launched nearly a decade ago, but it will arrive in July of this year and fly through the system, doing lots of amazing science.

One of the instruments is LORRI, a long-focal-length camera that will be the prime imager for much of the mission because it will be able to take the highest spatial resolution images. It has also been used, and will continue to be used, for optical navigation: making sure we’re headed in the right direction.

Just a few minutes ago, on the anniversary of the birth of Clyde Tombaugh (the guy who discovered Pluto), NASA released the first image from LORRI of Pluto and its main satellite, Charon, taken during the Approach Phase. There’s a lot going on here – one point in particular that I just know is prone to misunderstanding later on – so I want to talk a bit about this image.

Disclaimer

I am involved with the New Horizons mission. I am not a NASA employee. This is my personal blog, and everything on it is my opinion, in my words, and is done completely independently (time-wise, resource-wise, person-wise) from my work on New Horizons. In fact, it is on record that this blog is legally distinct from my professional work. Nothing I say here should be taken as an official statement by NASA or the New Horizons team.

Resolution / Pixel Scale

That out of the way, let’s get to the meat of this post. Also, I’m going to use “resolution” and “pixel scale” a bit loosely here, so pedants need to forgive me right away.

LORRI is an amazing camera. It is a 1024×1024 pixel detector, and each pixel has an effective angular size of 4.95 µrad (micro radians, or about 1.02 arcsec). 1 arcsec is about the width of a human hair from 10 meters (33ft) away. (source)

At the moment, New Horizons is around 200,000,000 km away from Pluto. That’s okay, it still has 5.5 months to get there. Pluto is approximately 1180 km in radius. That means, from some simple trigonometry (remember SOHCAHTOA?), Pluto is about 1.2 arcsec in radius, or 2.4 arcsec in diameter. Charon is very roughly half Pluto’s diameter, so it’s around 1.2 arcsec in diameter. Charon and Pluto orbit on opposite sides of their center of mass, which means they are around 8.6 Plutos away from each other, or around 20 arcsec apart.

Okay, lots of numbers there. Basically, that means that right now, if we had perfect optics, Pluto is about 2 pixels across, Charon 1, and they’d be separated by up to around 20 pixels (less at the moment, since their orientation on the plane of the sky is not perpendicular to the spacecraft right now).
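The arithmetic above can be sketched in a few lines of Python (a back-of-the-envelope check using the same rounded numbers as in the text, so treat the results as approximate):

```python
import math

ARCSEC_PER_RAD = 3600 * 180 / math.pi          # ≈ 206,265 arcsec per radian
LORRI_PIXEL_ARCSEC = 4.95e-6 * ARCSEC_PER_RAD  # 4.95 µrad ≈ 1.02 arcsec/pixel

def angular_size_arcsec(size_km, distance_km):
    # Small-angle approximation: angle (radians) ≈ size / distance
    return (size_km / distance_km) * ARCSEC_PER_RAD

distance_km = 200e6            # New Horizons-to-Pluto distance right now
pluto_diameter_km = 2 * 1180   # Pluto's radius is ~1180 km

pluto_arcsec = angular_size_arcsec(pluto_diameter_km, distance_km)  # ≈ 2.4
charon_arcsec = pluto_arcsec / 2                                    # ≈ 1.2

pluto_pixels = pluto_arcsec / LORRI_PIXEL_ARCSEC    # ≈ 2.4 LORRI pixels
charon_pixels = charon_arcsec / LORRI_PIXEL_ARCSEC  # ≈ 1.2 LORRI pixels
```

With perfect optics, that’s the “2 pixels across, Charon 1” from the paragraph above.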

(No) Perfect Optics

No such thing exists. Even with the best, most perfect optics ever, you can never get infinitely fine details. This is because light behaves as a wave, giving rise to Airy disks and patterns, meaning that the light will spread out as it travels through the optics (unless you had an infinitely wide optical system).

When you factor everything together about the optics and system and detector and other things, from a point source of light, you get a point-spread function (PSF). This is the practical, measured spreading out of the light. In astronomy, we often measure the PSF by fitting a Gaussian distribution to a star, since a star is effectively a point source and, with perfect optics, would cover just one, single pixel.

With a telescope aperture of 208mm for LORRI, and a passband of light centered around 0.6 µm (red light), the Airy disk diameter (twice the first-null radius of 1.22·λ/D) should be around 2.44*0.0006/208 ≈ 7 µrad. That’s around 1.4 LORRI pixels. Amazing coincidence!

Actually, not. When designing an instrument, you typically want to just slightly over-sample the Airy disk. You don’t want to under-sample because then you’re losing valuable resolution and information. You don’t want to grossly over-sample because then you’re just wasting money on a detector that is too “good” for your optics, plus other issues that come about when you have small pixels. So, designing a system that’s around 1-3 pixels per Airy disk is good.

When you go to a practical PSF, it’s going to be a bit bigger just because no system is perfect.
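Plugging in the numbers from above (a quick sketch using the full first-null diameter of the Airy pattern, 2.44·λ/D):

```python
wavelength_m = 0.6e-6   # center of LORRI's passband (red light)
aperture_m = 0.208      # LORRI's 208 mm aperture
pixel_urad = 4.95       # LORRI pixel scale, in microradians

# First-null *diameter* of the Airy pattern; the radius is 1.22 * lambda / D
airy_diameter_rad = 2.44 * wavelength_m / aperture_m
airy_diameter_urad = airy_diameter_rad * 1e6   # ≈ 7 µrad

airy_in_pixels = airy_diameter_urad / pixel_urad  # ≈ 1.4 LORRI pixels
```

So the diffraction limit lands at roughly 1.4 pixels, right in that 1-3 pixel sweet spot, and a real PSF will be slightly wider still.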

What’s the Point?

Oh yeah, back to Pluto.

First New Horizons Image of Pluto and Charon from Approach Phase (©NASA/APL/SwRI)

Let’s put these parts together: Right now, Pluto should be around 2 pixels across, Charon 1, and a separation of up to around 20 pixels. But, add in the PSFs due to the laws of optics. That means that the light should now be spread out a bit more.

And that is why this image looks like it does. It’s also been enlarged by 4x, such that each original LORRI pixel has been resampled into a 4×4 block. So, if you look at the image NASA released, and you blow it up a lot, Pluto looks like it’s around ten pixels across, and Charon around five.

To repeat: The released image shows Pluto to be around 10 pixels wide, and Charon around 5, despite the theoretical values right now (2 pixels and 1 pixel, respectively). That’s because (1) the PSF spreads the light out because we live in a world with real and not ideal optics, and (2) the released image was enlarged by a factor of 4.
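Those two effects can be demonstrated with a toy one-dimensional model (a sketch, not the actual LORRI processing pipeline: I’m assuming a Gaussian stand-in for the PSF with a FWHM of ~1.5 pixels, and a simple 30%-of-peak cutoff for “how wide it looks”):

```python
import numpy as np

SCALE = 4               # the released image was enlarged 4x
dx = 1.0 / SCALE        # sampling step, in original LORRI pixels

x = np.arange(-8.0, 8.0, dx)               # position across the disk
disk = (np.abs(x) <= 1.0).astype(float)    # ideal 2-pixel-wide "Pluto"

fwhm = 1.5                                 # assumed PSF width, in pixels
sigma = fwhm / 2.3548                      # convert FWHM to Gaussian sigma
psf = np.exp(-0.5 * (x / sigma) ** 2)
psf /= psf.sum()                           # normalize: PSF conserves light

blurred = np.convolve(disk, psf, mode="same")

# Apparent width: count enlarged pixels brighter than ~30% of the peak.
# It comes out around 10-12 enlarged pixels, versus 8 (2 x 4) for the
# ideal, un-blurred disk -- matching the ~10 pixels seen in the release.
apparent_px = int(np.sum(blurred > 0.3 * blurred.max()))
```

The exact count depends on the assumed PSF width and brightness cutoff, but the point stands: blur plus 4x enlargement turns a 2-pixel disk into something ten-ish pixels wide.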

Moving Forward

New Horizons is zipping quickly along. In May, it will surpass all previous images taken and we will truly be in new territory and a new era of discovery (so far as imaging the Pluto system — note that the other instruments have already taken a lot of data and are learning new things). The best image of Pluto that exists so far shows it to be approximately 8 pixels across.

And that’s why I started this post out by stating, “one point in particular that I just know is prone to misunderstanding later on.” So, today, NASA released an image that shows Pluto with about as many pixels across as the raw images LORRI will take in late May.

See why I wanted to bring this up now? I can just hear the pseudoscientists claiming that NASA is lying about the power of the New Horizons telescopes, that they’re deliberately down-sizing images later (based on the images released now), and various other things. While they’ll still almost certainly say that, at least you know now why that’s not the case, and what’s really going on now versus then.

There are only 2 (well, about 4, since it’s 2×2) “real” pixels in the Pluto disk right now; the others are interpolated based on expanding the size to make the image look nice for this release, celebrating the image and Clyde Tombaugh’s birthday. In four months, we’ll have all these pixels, but they won’t be based on a computer algorithm; they’ll be “real” pixels across Pluto taken by LORRI, convolved (“smeared”) with a PSF that’s about 1.5-2 pixels across.

December 30, 2014

My First Infographic: What Have Our Planetary Space Probes Photographed Since 1970?



Introduction

This has been over two months in the making: I’m finally releasing my first infographic. It’s entitled, “Planets and Major Moons: Distribution of Non-Lander Spacecraft Photos Since 1970.” (Suitable for printing on A-size paper with a bit of top and bottom margin to spare.) The purpose is to show the number of images taken by different space probes of the planets (and major satellites), the percentage of the total images that were for each body, and for each body, the percentage taken by each different spacecraft.

PDF Version of Spacecraft Imagery Infographic (3.5 MB)
PNG Version of Spacecraft Imagery Infographic (4.7 MB)

Number of Images of Planets Taken by Spacecraft Infographic

Development Process

I’ve been wanting to create infographics for a while. Really good ones are few and far between, especially for astronomy, but the good ones are often amazing works of art. I don’t pretend that this is an amazing work of art, but hopefully it’s not ugly.

To me, the key is to have a lot of information crammed into a small space in an easy-to-understand way that you don’t have to be an expert to interpret. In my work, I deal a lot with multi-dimensional datasets and so already I have to come up with ways of displaying a lot of information in as few figures as possible and yet still make them readable.

The Idea

An idea that I came up with is based on the claim that “NASA hides all its pictures!” (This is often, hypocritically, almost immediately followed up with NASA spacecraft imagery showing claimed UFOs and other pseudoscientific claims.)

And so, I wanted to investigate this: How many images really have been taken and are available publicly, for free, on the internet? After several days of research, I had the results, and I assembled them into the above infographic.

The Numbers

I was surprised by some of the numbers and not surprised by others. One thing that did not surprise me was that the outer planets have relatively few photographs of them, while most imagery has focused on Mars and the Moon (fully 86%).

But, I was not prepared for how very few photographs were taken by our early probes to the outer solar system. Pioneers 10 and 11 were the first craft to venture out, and yet, because of the (now) archaic method of imaging and slow bandwidth, they collectively took a mere 72 images of Jupiter and Saturn combined. Compare that with the ongoing Lunar Reconnaissance Orbiter around the moon, which has publicly released over 1.1 million images.

You can also see the marked effect of the Galileo high-gain antenna failure: Only 7.4% of the photos we have of Jupiter were taken by Galileo, despite it being an orbiter in the 1990s. Compare that with the Cassini orbiter of Saturn, which has returned nearly 50 times as many images, despite no dramatic change in technology between the two craft. This means that only 0.4% of our images of planets and moons are of Jupiter, while 1.9% are of Saturn.

You can also see the marked success of modern spacecraft and the huge volumes of images that (I repeat) are publicly available. The pie slices in the infographic are color-coded by approximate spacecraft operation era. Well over 90% of all images were taken after 1995, and the current suite of the latest NASA spacecraft (MESSENGER around Mercury, Lunar Reconnaissance Orbiter around the Moon, and Mars Reconnaissance Orbiter around Mars) account for a sizable fraction of the returned data for that body — especially MESSENGER, which accounts for 98.1% of all Mercury images.

What was I most surprised by? The Clementine mission to the moon. It returned and has publicly archived just shy of 1.5 million images of the lunar surface. I expected the Lunar Reconnaissance Orbiter to have surpassed that. And, it still may, as it continues to operate and return data. We shall see.

Why the Conspiracy Theorists Are Wrong

As I said, one of the primary reasons I made this was to investigate the claim by conspiracy theorists that these space agencies hide photographs. The blame rests almost entirely on NASA by most conspiracists’ accounts. This infographic proves them wrong in two significant ways.

First, at least for the Moon, Mars, and Venus, sizable numbers of images have been taken by and publicly released by non-NASA sources. I specifically have data from the European Space Agency (SMART-1, Venus Express, and Mars Express), and Japanese Space Agency (SELENE / Kaguya). While both the Indian and Chinese space agencies have also sent spacecraft to the moon and Mars (Mars for the Indians with the recently-in-orbit “MOM” craft), and Russia has sent craft to Venus, Moon, and Mars, I could not find the public repositories – if they exist – for these missions. Therefore, I could not include them. But, a lack of those two does not affect the overall point, that non-NASA agencies have released photos of these bodies.

Second, as I’ve repeated throughout this post, these are the publicly released images. Not private. Public. To public archives. In the bottom-left corner, I have the sources for all of these numbers. (Please note that they were compiled in late October and may have increased a bit for ongoing missions — I’ll update periodically, as necessary.)

The total number of lunar images? About 3 million.

Mars? Around 1.6 million. Venus? Over 350,000. Mercury? Over 210,000.

It’s hard to claim that NASA hides lots of images when these numbers are staring you in the face.

What Conspiracists Could Still Claim

I think the only “out” at this point, given this information (and if they acknowledge this information), is for conspiracists to claim that NASA and other space agencies simply obfuscate the “interesting” stuff. I suppose that’s possible, though they’d need armies of people to do it on the millions of returned images. And they apparently do a pretty bad job considering all the images that conspiracists post, claiming that features within them are of alien-origin.

It’s amazing how the “powers that be” are so powerful, and yet so sloppy. Apparently.

What This Infographic Does Not Show

I had to decide to clip a lot of information. We’ve imaged a lot of asteroids and a lot of comets. Those are out. We have had landers on the three closest bodies (Moon, Mars, Venus). Those images were not included.

Also, I focused on visible-light images, mostly. There are some instruments that take more UV images, or far-IR images, or various other wavelengths, but this infographic focused on the visible or near-visible light camera data.

Pretty much the only exception to this is for the Magellan mission at Venus, which took radar swaths of the planet to “image” the surface. I included this because, in early test audiences, I did not have Venus at all, and they requested it. When I then added Venus but left out Magellan, the test audiences wondered what happened to it. Describing why that data was not present made things wordy and more cluttered, so, in the end, I simply included it and put a footnote explaining the Magellan data.

This also fails to show the volume of data as measured by (or approximated by, for the older craft) pixel count. If I were doing this by number of pixels returned, the Moon and Mars would be far larger in comparison, and the Lunar Reconnaissance Orbiter and Mars Reconnaissance Orbiter would be much larger fractions of their respective bodies.

Final Thoughts

I’m releasing this under the Creative Commons license with attribution required, non-commercial distribution, and no derivative works (please see the CC stamp at the bottom of the infographic). This is so that I can at least have some semblance of version control (see release date at lower right).

I hope you find it useful and interesting. And at least somewhat purdy. If you like it, share it.

December 16, 2014

Podcast Episode 122: Comet 67P/Churyumov-Gerasimenko and Rosetta Conspiracies


Conspiracies of
Comet 67P …
Few, but they are weird.

A timely and listener-requested episode! What’s not to love!? In the episode I talk about several of the conspiracies I’ve seen surrounding the Rosetta mission and Comet 67P. From artificiality (Hoagland makes a guest appearance) to singing so as to raise our consciousness to angelic levels when 2012 failed, I spend nearly a half hour going through 2 to 4 claims (depending on how you count them) that have been making the rounds. I also get to touch on image analysis.

There is also one New News segment this episode, and it refers to the death of the Venus Express mission around (oddly enough) Venus. The news relates to the episodes on uncertainty. Not sure what the connection is? Listen to the episode! The episode also comes in at just over 30 minutes, my target length.

November 14, 2014

The Good and Bad of NASA Publishing Spacecraft Images Online


This was my second blog post for Swift, published late last week:

You wouldn’t know it by listening to many conspiracy theorists, but NASA is by far the most open space agency in the world when it comes to publishing data from spacecraft. By law, the teams that built and run the instruments on these missions must publish their data within six months of it being taken, except in rare cases when an additional six-month extension can be granted.

Contrast that with the Chinese and Indian space agencies, which still haven’t openly published data from missions that completed several years ago. Japan is better, as is the European Space Agency (ESA), but neither of them supply data as readily and easily as NASA.

In addition to the rules for depositing the raw, unprocessed data, NASA’s PR department, along with the PR arms of most missions, publish some of the data online almost as soon as it’s taken. This is great for the public; it’s also terrible for skeptics.

Allow me to explain by way of example: The LCROSS mission. This was the Lunar Crater Observation and Sensing Satellite that infamously sparked conspiracies that NASA was “bombing” the moon. The mission was to launch a projectile at the lunar south pole where there are permanently shadowed regions, and have the spacecraft fly through the plume formed by the projectile’s impact to try to detect water. If water were found, it would be a boon for crewed missions to the moon because astronauts could mine the water there instead of bringing their own.

The big event took place the night (in the US) of October 9, 2009. Within just a few days, photographs taken by the spacecraft were published by NASA online.

This was really good for the public. We got to see early results of what had been a very hyped event with observing parties taking place across the nation, including at the White House. It helped keep public interest longer than just one evening. It shared data with the people who paid for it: taxpayers.

LCROSS Landing Site

So what’s the problem? These images show several things:

  • The most basic of photographic processing, without things like dust on the camera removed (which is always done for science images).
  • Color, even though the camera was black-and-white; the color is completely an artifact of the press release image.
  • Brightness enhanced a lot, such that most of the surface is white.
  • A JPG file format for the PR release, meaning that there are JPG compression artifacts that manifest as blocky blobs.

For most of us, that doesn’t matter. We get the point that this is showing a bright glow caused by the impact of the spacecraft’s projectile. In NASA’s before shots, that bright glow is not present: a tiny flash of light that the world was watching for, with tens of thousands of people across the night side of the Earth staring upwards. (Unfortunately, it was cloudy where I was.)

Pseudoscientists, on the other hand, don’t get that. There exists a large group of space anomalists who look for anything in a space photograph that they don’t immediately understand and use it to claim fill-in-the-blank. One of the most prolifically published modern practitioners is Richard C. Hoagland. He took the NASA press release, increased the brightness even more, and claimed that the rectilinear, colored structures were in reality infrastructure (tubes and pipes) built by the “secret space program,” and that the public space program had bombed them because the folks at NASA had finally found out about the secret bases on the Moon.

NASA Image PIA10214 with a Close-Up of "BigFoot"

This will seem absurd to most people. But not to some. And, this is just one example; innumerable others exist. Every image published online on the easy-to-access public websites of the Mars rovers is pored over by anomaly hunters in the same way. Among other things, they search for rocks that are then said to look like apartment complexes, fossils, Bigfoot, all kinds of terrestrial and aqueous animal life, boots, a pump, and very recently, a water shut-off valve (to just name a few). Most of these are basic examples of pareidolia (creating a pattern in otherwise random data), or imprints actually caused by the rover equipment, but these are usually facilitated by the low-resolution and highly compressed JPG image format.

Do I think that NASA should stop being so open? No. I think that people are always going to find ways to find anomalies in images and claim it means something special. It’s the nature of the phenomenon, and pseudoscientists are always going to find something anomalous with something. And, the moment that NASA starts to restrict access to data, claims of censorship and hiding things will become even louder than they currently are.

I’m part of the planning team for the New Horizons mission that will reach Pluto in July of 2015. When the PI (Principal Investigator) of the mission, Alan Stern, announced that some of the data would be released on the web as low-resolution JPG images as soon as we get them, I have to admit I cringed just a little bit. And I felt bad for doing it. Dr. Stern has the absolute best of intentions, and he wants to keep people interested in the mission and share the data and let people see results from what is probably a once-in-a-lifetime mission, especially since the data downlink to Earth is going to be done over several weeks (due to the craft’s vast distance from Earth).

But, he will be making it very easy for anomaly hunters to find anomalies made by an intelligence — just not understanding that that intelligence was the software that produced the image.

Going forward, I don’t think there’s any good solution. But, this is something the skeptical community should be aware of, and it shows that there’s always a downside to things, even when you think there isn’t.

June 23, 2014

Podcast Episode 113: The Blue-Haze Limb of Mars


While the color of
Mars is red, some photos show
Blue on the limb. Why?

While I’ve already addressed the True Color of Mars (episode 74), one remaining – and unmentioned – twist is the blue haze limb that is sometimes visible as the upper atmosphere in color images taken from Earth orbit; this episode addresses those. And, it’s a completely different phenomenon than just a crappy understanding of image analysis. Real science ensues!!

Feedback makes up over half of this episode. I talk about Episodes 112 (why Russell Humphreys thinks that magnetic fields should decay to begin with and how he made his prediction), 109 (a follow-up interview of Marshall Masters from just a few days ago), and 111 (general feedback and criticisms of the Cydonia movie).

Finally, TAM is less than 2.5 weeks away, and I’d love to meet my adoring fans (er, you folks who tolerate listening to me every now-and-then). Please let me know if you’re going AND interested in meeting up. Otherwise, I may have to spend all my time with a Hershey chocolate-lover, and we don’t want that now, do we?

And über-finally, I got a special e-mail while I was recording this episode. Listen to it all the way through to hear it. :)

Oh, and super-düper-finally, about the release schedule: Some of you may have noticed it has been a bit off lately. The excuses are the usual, but ostensibly, the podcast is “supposed” to come out on the 1st, 11th, and 21st of the month. And that’s how I date them in the RSS feed. But, in the intro, I state that this is an episode for a certain third of the month, so that’s been the justification in my head for being able to get it out a little late. And looking at my upcoming schedule, I think that you can probably expect more of the same at least until September. They should be on or about the 1st, 11th, and 21st, but won’t necessarily be exactly on those dates.

May 31, 2014

Announcing Vodcast 1 and Podcast 111: The Cydonia Region of Mars


Anomalies do
Abound, but, are they really
That rare, unus’al?

Welp, this is it! My first new attempt to create a video that I’m reasonably proud of and that shows things the way I’d like them to be shown. On YouTube: You can click this link. Or, there’s a link to the 720p version here. And, of course, the link to the shownotes for the podcast version.

The differences are: On YouTube, you can view up to 1080p (“Hi-Def”), while the version released to the podcast feed is 720p, fewer pixels. The podcast (audio, episode 111) itself is an audio extension of the movie, explaining some of the math (or “maths” for peeps “across the pond”) in more detail and discussing one or two deleted scenes — additional bits that weren’t central to the story so didn’t make it into the final cut of the movie.

As I say at the end, I really do want feedback on this. If negative, then make it constructive. If you’re a fan of Richard Hoagland’s work, and you disagree with the movie, then let me know WHY, not just that you disagree because I’m wrong. That gets us nowhere and is useless.

And, if you like the movie, then make sure to share it around. Delusions of grandeur don’t manifest on their own, gosh darnit!

May 1, 2014

Is Camera Noise Evidence for Ancient Advanced Civilization on the Moon?


Introduction

Richard C. Hoagland. Yes, another post about some of his claims (not him). Things had been quiet from Mr. Hoagland for several months, apparently because of his latest work, spending 4 months attempting to show that the Chinese lunar mission, Chang’e 3 (嫦娥三号), and its rover Yutu (Jade Rabbit, 玉兔), had found evidence of the same thing he thinks he sees in Apollo photographs (that was likely dirt on his scanner): Ancient glass towers on the horizon.

Where to start? What to address? His “paper” on the subject, which you can find at his “Enterprise Mission dot com” website (sorry, I’m not going to link), is massive. And has 136 markup errors and 23 warnings according to the W3C markup validation service. It is nearly 14,000 words long, Richard says it has over 100 photographs, and if I were to print it from Safari, it would come out at 86 pages.

That is a long way of saying there’s no way I’m going to even come close to addressing it all, or even try to. There’s even so much I could write about the specific topic I want to talk about – image noise and sensor non-uniformity – that I’m only going to be able to cover it in broad brushstrokes, but hopefully it’ll be understandable.

What I’m Not Talking About

Richard has been making the circuit of the late-night paranormal shows, podcasts, etc. Tonight he was on Jimmy Church’s show, the one I was on two weeks ago. I think that, given that I heard his Coast to Coast interview, his “The Unexplained” interview, and “Fade to Black” interview, I have a reasonable idea of his argument (keep in mind, that’s about 5.5 non-commercial-hours of listening to Richard talk about this, so forgive me for not reading another 13,700 words).

I’m not going to talk about his numerology:

  • 19.5° … anything.
  • Landed at 19.5° … LONGITUDE, not latitude.
  • Landed at 44°N which was a message to the 44th President … Obama.

I’m not going to talk about his conspiracy and symbolism:

  • Obama made some mention of carrot seeds in a gift to the Pope, which was a hidden message about the Jade Rabbit lunar rover.
  • All his previous NASA conspiracy stuff coming in.
  • Disclosure is going to happen within a few months (I seem to recall him saying 2010 was the Year of Disclosure and then in 2011 when being called on it (a rare instance of being called out), he said it was, we just hadn’t noticed it).
  • Brookings Report

I’m not going to discuss his pareidolia, since that doesn’t really play much of a role in this set of claims (to the extent it does with grids, that will be discussed).

And so, of the four things that comprise the vast majority of Richard Hoagland’s claims (numerology, conspiracy, pareidolia, shoddy image analysis), it will be the image analysis that I will delve into.

Why not this other stuff? Why isn’t that as important? Because none of it is actual objective evidence for anything. It is supposition, ancillary to the claimed photographic evidence that I’ll be discussing in the rest of this post. Since the photography is the only (or the most) objective part, that’s what’s important to examine.

The Images (One of Them)

Here is the hallmark image that Richard has been sending to radio hosts. And I have included the original caption.

Richard Hoagland's Lunar Glass Towers from Chang'e 3

“Equalized version” of another official Chang’e-3 lunar surface image, revealing another set of the Moon’s startling “glittering glass towers” standing only a few miles northeast of the Chang’e-3 landing site. Careful examination of the image will reveal an amazingly coherent geometry to these ancient, heavily meteor-eroded glass structures … including, the surface placement of the still-glowing “colored blue and red panels” appearing at these structures’ base and to the extreme right — apparently energized colored panels “embedded in the ancient glass.”

Noise

Ah, noise. Most of us are familiar with audio noise. Turn speakers on, when they’re not connected to anything else, and put the gain up all the way. You will hear static. That’s random electrons being picked up by the circuitry and being amplified as, literally, noise.

The same thing happens with digital cameras. They work by converting photons (little packets of light) into electric charge, and recording how much charge each pixel collected. Some pixels are more sensitive than others. Some pixels are always on, some are always off. Usually, because of the manufacturing process, entire rows and/or columns of pixels will be slightly more sensitive than others. And there are the statistical fluctuations that have to do with counting statistics.

When cameras get warm, the molecules have more energy (definition of heat), and are more likely to randomly emit an electron that will be recorded … as in, noise. That is why professional – and even enthusiast – astronomy CCDs are cooled, sometimes with liquid nitrogen. It reduces the noise. If your sensor is unevenly heated, that can cause uneven noise across it (more noise where it’s warmer). Just a degree temperature difference will do it.

All of those mean that ANY digital detector will have noise – I don’t care how good it is, how much you paid for it, how many pixels it has, if it’s color or B&W … whatever about it – it will have noise. The fact that it has a temperature above absolute zero means it will have noise.
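To make the row-and-column banding concrete, here is a minimal numpy sketch (all numbers are hypothetical, chosen just for illustration) of a sensor staring at pure blackness. A count or two of per-row and per-column bias is invisible at normal contrast, but stretch those few counts across the full brightness range and the banding suddenly looks like “geometry”:

```python
import numpy as np

rng = np.random.default_rng(42)

# A hypothetical 8-bit sensor imaging pure blackness: tiny random noise,
# plus slightly different sensitivities per row and per column (banding).
rows, cols = 64, 64
row_bias = rng.integers(0, 2, size=(rows, 1))   # some rows read ~1 count high
col_bias = rng.integers(0, 2, size=(1, cols))   # some columns, too
shot = rng.integers(0, 2, size=(rows, cols))    # random per-pixel noise
dark_frame = (row_bias + col_bias + shot).astype(np.uint8)

# Every pixel sits within 0-3 counts out of 255: visually, pure black...
print(dark_frame.max())  # at most 3

# ...but stretch those 0-3 counts over the full 0-255 range (Equalize, in
# spirit) and the row/column banding becomes a glaring grid pattern.
stretched = (dark_frame.astype(float) / dark_frame.max() * 255).astype(np.uint8)
print(stretched.max())   # 255
```

The stretched frame has visible horizontal and vertical stripes even though nothing was ever in front of the lens.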

Here is an excellent tutorial on image noise. If at this point you don’t know what I’m really talking about, please read it, or at least look at the images. There is a small link to a part 2 at the bottom. Going forward, I’m going to assume that you have a reasonable grasp of noise. This is already a long post.

What Is “Equalization”?

“I just brightened up the images a little bit.” –RCH

This is a hallmark of much of Richard Hoagland’s types of claims. Brightening the image, increasing contrast, increasing saturation, etc.

Equalization itself can have innumerable types of algorithms, but the basic idea is this: Many photographs of a typical scene have a little bit of dark, a little bit of bright, and a lot in the middle. That’s not how you have to shoot a photo, but it’s typical (go to this post and look for “Histograms”). What Equalize does is put the same number of pixels at every brightness level.

So, in that example, it will move some of the slightly darker middle colors to be darker, and it will move slightly brighter middle colors lighter. That way, if your image has 256 pixels, and you’re in 8-bit mode so there are 256 levels of brightness, one pixel will have a brightness 0, one will have a brightness 1, one will have a brightness 2, and so on.

Inevitably, this has the effect of stretching at least some brightness levels in the image. More on that in a bit.
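For the curious, the basic algorithm is easy to sketch. Below is a generic 8-bit histogram equalization in Python with numpy – my own toy version of the standard CDF-remapping approach, not Photoshop’s actual code – run on a fake image that is half near-black “sky” (0–10 counts) and half bright “moon” (100–255 counts):

```python
import numpy as np

def equalize(img):
    """Generic histogram equalization for an 8-bit greyscale image:
    remap brightness levels so the cumulative histogram is (nearly) flat."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()                       # cumulative pixel counts
    cdf_min = cdf[cdf > 0].min()              # first occupied brightness level
    # Standard remapping: each level moves to its normalized CDF position.
    lut = np.round(np.clip(cdf - cdf_min, 0, None)
                   / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[img]

# Fake "night sky" image: half nearly black (0-10), half bright (100-255).
rng = np.random.default_rng(0)
img = np.concatenate([rng.integers(0, 11, 2048),
                      rng.integers(100, 256, 2048)]).astype(np.uint8)
out = equalize(img)

# The near-black half gets violently stretched upward: its original 0-10
# range now spans roughly 0-120, i.e., middle grey.
print(out[:2048].max())
```

Because the dark pixels make up about half the image, Equalize is forced to spread their tiny 0–10 range across roughly half the brightness scale – exactly the effect discussed in the rest of this post.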

This can be good! You take a wedding photo and Equalize can help bring out detail in both the bride’s white dress and the groom’s black tux (if we’re talking about a Western-style heterosexual marriage). That’s because the image, as-shot, would have a lot of dark pixels and a lot of bright pixels, so Equalize will bring them more to the middle.

But this can also be bad or silly, as I show in the next section.

Stuart’s Example

Below is an image I took of the moon last year.

Example of Why Equalization Is Sometimes Stupid

An original image of the Moon showing what happens when you “Equalize” blackness and the structure of noise.

The top image shows my nice, well-exposed photograph of the moon.

Then I saved the image as a JPG. The middle row shows what happened when I pressed Photoshop’s “Equalization” option. The left column is from the original image, before I saved it. The right column shows what happened when I pressed Equalize on the JPG-saved image. The bottom row is what happened after I converted both to greyscale, just for completeness.

So, what is this showing? Noise! (The pixel noise I talked about before and the JPG compression artifacts, though I’m not going to talk about those JPG artifacts in this post.) As I talked about above, different rows and columns of pixels are very slightly more or less sensitive than others. It doesn’t matter how good your sensor is, it will still have imperfections.

Since this particular sensor is three-color (RGB pixels), then different rows and columns of colors have different sensitivities, hence the red/pink checkerboard feature. The green pixels in this sensor apparently had better noise properties than the red and blue.

Notice also that it’s brighter around the moon. As if it’s surrounded by tall glass structures! After all, all of this stuff is showing perfect rectilinear geometry.

But why did I say that using equalization on this is silly? It’s because it is. The moon was surrounded by black sky. If you go to the original image on my computer (please don’t hack me), the pixel values – the number of photons recorded – scaled between 0 and 255, are 0-2 in that dark area. That, dear reader, is noise.

What about close to the moon? They rise to 8-20. (The Moon itself is 150-230.) The 8-20 pixel brightnesses are both noise AND, more importantly in this case, scattered light. This gets to another thing about optics: I don’t care how good your optics are, what kind of special coatings they have … unless you are in a clean room with ZERO dust, and perhaps using the clearest of crystals as your optics, your optics are not perfect and they will scatter light. Meaning the light won’t just pass through as it should; a few photons will be deflected and go somewhere else.

What that means for this case is that the moon is a bright circle on this image. A few of those photons are going to scatter within the optics of my camera and the probably 15 different pieces of glass that form the lenses. Probably, they won’t scatter far. That’s why right next to the moon, it’s 8-20 photons. But just 10% of the image away, we’re back at the background level of 0-2.

This all gets back to Richard’s images. I can’t figure out exactly which image Richard used as his main one, but another he uses comes from here, the bottom one with the caption 嫦娥三号着陆器地形地貌相机拍摄的“玉兔”号月球车照片。(A photo of the “Yutu” rover taken by the Chang’e 3 lander’s terrain camera.)

Here’s how Richard presents it:

RCH's Processing of a Chang'e 3 Image

There — from the institution which forms the foundation of China’s very 21st Century existence–
Was the ghostly … repetitive … glistening glass geometry of “an ancient, Mare Imbrium dome …”–
With the official “Chinese People’s Liberation Army” logo plastered right on top of it (below)!

When I look at the image, the pixel values in the sky on the left half are 0-5. The right half is 5-10. The lunar features are around 100-250. It’s the same on a higher-resolution image that “Dee” found and posted over on Expat’s Dork Mission blog. Again, noise. And, I think, some scattered light.

The moon is dusty. And even if it weren’t, the scattered light in the right half makes sense, getting back-scatter from the sun coming from the right, scattered in all directions but a bit more back to the right, and into the camera lens on that side. Another explanation is that the right half of the sensor was very slightly warmer than the left half. That will also give you noise of this exact type.

And, if these are glass towers, one must also ask why they stop just above the horizon!? On Richard’s “Enhancement” via Equalization, he shows the lunar surface, and just above it is a black line, and then are his glass towers.

But back to Equalize, what happened? Well, about half the image is really dark, pixel brightness values between about 0 and 10. Half the image is middle to bright, with brightness values between about 100 and 255. Because Equalize demands that the same number of pixels be at every level of brightness, it has to make a lot of those dark pixels brighter. Since half-way between 0 and 255 is about 127, it barely has to do anything to the lunar surface part. It’ll make some of the pixels a little darker, and some of them a little brighter, but the most drastic change will be to the sky area because that’s half the image, and so that half the image now must be mapped instead from 0-10 brightness to about 0-127 brightness. (Since it’s a little less than half, it only gets mapped up to 90, but you get the idea.)

Richard Says it Could Be Noise, But It’s Not Because It’s Geometric and Not Below Horizon

Richard said words to that effect at about 9:04 PM on Jimmy’s radio program (just over 2 hrs into it). He said it could either be noise or glass structures. He said it’s not noise because it’s geometric and because it doesn’t show below the horizon (the surface of the moon).

I reject both of those as explanations for why it’s not noise. The geometry argument because of my example above with my image of the moon, and see that tutorial on banding noise. If he thinks that image noise is not geometric (the noise from the sensor and noise from JPG compression), he is either more ignorant or delusional than I thought or, well, not telling the truth. Sorry, it’s hard to listen to him for 3 hrs and write 2800 words and not get in a small ad hominem.

I reject the part about it not showing below the horizon as evidence it’s not noise because of … the numbers. Even Richard often says, “It’s about the numbers!” In this case, you are talking about pixel values of about 5-10 brightness. Let’s say that’s noise. Just give that to me for a moment. Now look at the actual lit part of the moon. Pixel values 100-250. Noise of 5-10 photons on top of 0 is HUGE. Noise of 5-10 photons on top of 100-250 is minuscule. In other words, I say that the noise is still there in the part below the horizon, you just don’t see it.
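In numbers (all hypothetical, just chosen to match the ranges above): the same ~8 counts of noise is 100% of the reading on the black sky but only about 4% of the reading on the lit surface:

```python
noise = 8        # ~5-10 counts of sensor noise (hypothetical middle value)
sky = 0          # black sky: no real signal at all
surface = 175    # lit lunar surface: somewhere in the ~100-250 count range

sky_reading = sky + noise          # 8 counts -- ALL of it is noise
surface_reading = surface + noise  # 183 counts -- the noise is buried

# Noise as a fraction of what the pixel records:
print(noise / surface_reading)     # ~0.044, i.e., under 5% on the surface
```

Stretch the sky and that 100%-noise signal dominates; the identical noise sitting on top of the bright surface is invisible.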

Again, I’ll refer to the tutorial I linked to, specifically the first two images. The top one shows the same noise level, but a large signal (like the lunar surface). The second one shows the same noise level, but a very small signal (like the sky, though I’d say there’s no signal, it’s all noise).

Other Relevant, Miscellaneous Statements

“The data are replicable” therefore the fact that he sees this in Apollo and the Chang’e 3 images means it’s real, it’s “stunning confirmation.”

Yes, it certainly does mean that image noise is real and banding noise is also a real type.

“The Chinese have gone to the Moon and sent back the message — ‘Hoagland was right.'”

This one in particular struck Expat from the Dork Mission blog. I kinda agree. I find it typical of a decent number of claims by conspiracists in general and (personally) I find it somewhat arrogant to think that THEY are the only ones who can decode these secret messages, and even that the encoders are speaking directly to them!

Other “Enterprise Mission scientists” agree with him.

That is not peer review, that is an echo chamber. There’s a reason that Richard Hoagland is making the circuit on the paranormal shows and not anything else with this stuff.

Since the images are still up on the various Chinese websites, even after Richard Hoagland’s disclosure last week, that means that they are admitting that this stuff is real.

No, it means that most of us laugh this off as not understanding anything about photographic noise. Those of us among the very small community of scientists who actually follow these kinds of topics.

Where Are These “Miles-High, Miles-Across” Features in Meter-Scale Orbital Photographs?

As with the lunar ziggurat saga, and even as Richard stated (just his example doesn’t qualify), science demands repetition of objective data. Richard has claimed that these features are massive, miles across and miles high and miles wide. The Lunar Reconnaissance Orbiter’s Narrow Angle Camera records images with pixel scales of around 0.5 meters. Wide-Angle Camera is ~60 meters per pixel. The Japanese Kaguya craft had a terrain camera that recorded images at ~10 meters per pixel. And that’s just in the last few years.

But, none of these craft show these features. Even Richard hasn’t pointed to any of these images that would allegedly show them.

And yet, Richard claims that they are shimmering, glittering, multi-colored, glassy … and miles across. Where are they in other imagery? Yes, you’re looking down from above, but glass refracts light (index of refraction is 1.5-2.0-ish) so we should see weird distortions. And, he says it glitters, so we should see specular reflections, especially off the parts that are “damaged” (as he put it) and haven’t been repaired yet by robots (which he said are there in the C2C interview).

So either they should be there in the other images, or they don’t exist. Or, massive conspiracy. *cough*

Final Thoughts

I wasn’t going to talk about this stuff. When I first listened to it on C2C last week, my almost knee-jerk response was, “This falls into the category of Not Even Wrong.” In other words, there is simply so little correct, so little grounding in reality, that (a) there’s no way to even start to address it all, and (b) there’s usually not any reason to.

As RationalWiki put it, a Not Even Wrong statement is not of the form “2 + 2 = 6,” but rather, “2 + zebra ÷ glockenspiel = homeopathy works!”

Then I saw how much press he was getting on these various shows. On “The Unexplained,” he made a comment to the effect that he can reach 2% of the population (US population? world population?) “without breaking a sweat.” And he’s right. Or at least close to it. If the audience numbers that I’ve heard reported for these various shows are correct, he’s probably easily reached over a million listeners at this point, or 1/3 of a percent of the US population.

I don’t believe for a Planck time that all those people believe what Richard says. I also, honestly, don’t think it’s incredibly important whether they do or not. But, the credulity that even entertaining this kind of “analysis” fosters transfers into other fields. Like medicine. Or believing in a soon-to-be apocalypse. And there, your choices, your belief in various “alternative” views, can kill you. Or, in cases of psychics, astrologers, and others, they can bankrupt you.

April 10, 2014

Alien Lights or Cosmic Rays on Mars


Introduction

I was not going to talk about this because I didn’t think I had much to add. And I thought it was stupid. And, I’ve had run-ins with UFO Sightings Daily before (well, one).

But, people keep talking about it, so it at least deserves a mention here.

Origin Story

Everybody likes a good origin story. Wolverine made quite a lot of money.

The timeline, so far as I can tell, is that UFO Sightings Daily “discovered,” on April 6, 2014, and then posted, on April 7, 2014, the following:

Light on Mars in Curiosity Image (from UFO Sightings Daily)

An artificial light source was seen this week in this NASA photo which shows light shining upward from…the ground. This light was discovered by Streetcap1 of Youtube. This could indicate there is intelligent life below the ground and they use light as we do. This is not a glare from the sun, nor is it an artifact of the photo process. Look closely at the bottom of the light. It has a very flat surface giving us 100% indiction that it is from the surface. Sure NASA could go and investigate it, but hey, they are not on Mars to discovery life, but there to stall its discovery. SCW

Houston Chronicle Posts

It would’ve been relegated to everything else of random bright spots in images except that the Houston Chronicle‘s reporter Carol Christian decided to write a story about it.

And then two people posted to my podcast’s Facebook page (thanks Linda and Maryann). And Doubtful News picked it up, as did Phil Plait.

What Is It?

It’s a cosmic ray. >99% chance. Here’s what happens: High-energy particles constantly stream throughout the universe. We’ve been detecting them for decades, and their energy varies considerably.

Electronic imagers typically work when a photon – a bit of light – kicks up an electron within a pixel. Those electrons are counted after the exposure is done, and that’s how you get your image.

When high-energy particles randomly stream into a detector, they are higher-energy than the photons we’re usually trying to collect, and they appear as bright streaks. Digital cameras that you use for normal photography have algorithms to remove those as known noise sources, so you typically never see them. We also see them more rarely on Earth because many are blocked by the atmosphere.
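One standard rejection trick – a sketch of the general idea, not Curiosity’s actual pipeline – exploits the fact that a cosmic ray hits only one exposure: take several frames of the same scene and keep the per-pixel median. The spike survives in its one frame but not in the combined result:

```python
import numpy as np

rng = np.random.default_rng(1)

# Three hypothetical exposures of the same dark scene (0-4 counts of noise).
# A cosmic ray hits ONE frame at one pixel, spiking it far above the rest.
frames = rng.integers(0, 5, size=(3, 32, 32)).astype(np.int16)
frames[1, 10, 10] = 250   # the "artificial light source"

# Because the hit appears in only one frame, a per-pixel median across the
# exposures throws it away -- a standard trick in astronomy pipelines.
clean = np.median(frames, axis=0)
print(frames[1, 10, 10], clean[10, 10])   # 250 in the hit frame; ~0-4 cleaned
```

That is also, in miniature, why the two-camera comparison below is so telling: a real feature shows up in both cameras, while a cosmic ray hits only one detector.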

Those of us who use research-quality cameras on telescopes, however, see them all the time. In fact, Phil said the exact same thing: “I’ve worked with astronomical cameras for many, many years, and we see little blips like this all the time.” (It’s nice when we agree.)

Right now, some of my research is focusing on using images from the Cassini spacecraft in orbit of Saturn, studying some of Saturn’s moons.

Rhea from Cassini (W1594713967_1)

Here is one image of Rhea, taken by the ISS camera. It’s a raw image, about as original as you can get, in the sense that almost no processing has taken place. And look at all those stray bits of light! Pretty much every single one of them – the two long streaks and the dots alike – is a cosmic ray.

More evidence? Courtesy of Phil Plait, we have an animation:

Light, No Light (Phil Plait)

What’s nice is that this is from Curiosity’s NAVCAM, which has a pair of cameras. From the right camera, we have the bright spot. From the left camera, we don’t. The reason that you’re seeing a small shift in position is due to parallax between the two cameras (by design, since this helps tell distance). (FYI, Mike Bara, who addressed this just a half hour ago on Coast to Coast AM, claimed that the cosmic ray was the least likely explanation, and while he posts the parallax GIF on his website, he said he refused to name the source because “I dislike him [Phil Plait] intensely.” Despite showing another image that Phil linked to, so clearly he read Phil’s blog. Mike’s seemingly only explanation for why it was not a cosmic ray is that he said it didn’t look like other cosmic rays people are pointing to. That’s like me saying that a rose is not a plant because all the examples of plants you’re showing me are trees. It’s a class of object; every cosmic ray on a detector looks a little different, especially when you have blooming factored in (see the next section).)

Why a Rectangle?

Either the cosmic ray hit at an angle, so we see it as a streak (see above example ISS image), or, as is also common with CCD images, when an individual pixel collects too much light, it tends to overflow, and spill over into neighboring pixels, almost always along columns. We call this “blooming.”
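Here’s a toy model of blooming (hypothetical numbers; real CCDs are more complicated, and many have anti-blooming circuitry): dump more charge into one pixel than its well can hold, and the excess spills into the next pixel down the column, making a bright vertical streak:

```python
import numpy as np

FULL_WELL = 255  # hypothetical saturation level for an 8-bit-style well

def bloom(column, hit_index, charge):
    """Toy model of CCD blooming: deposit 'charge' into one pixel of a
    column; anything over saturation spills into the next pixel down."""
    col = column.astype(int)
    col[hit_index] += charge
    i = hit_index
    while col[i] > FULL_WELL and i + 1 < len(col):
        overflow = col[i] - FULL_WELL
        col[i] = FULL_WELL
        col[i + 1] += overflow
        i += 1
    return col

# A huge cosmic-ray hit on one pixel spills down the column:
col = bloom(np.zeros(8), hit_index=2, charge=900)
print(col)  # the hit pixel saturates, and so do the two pixels below it
```

The result is a saturated rectangle-ish streak along the column – which is why a “very flat” edge on a bright blob is evidence of a detector artifact, not of architecture.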

But Wasn’t It Seen In a Second Image in the Same Spot a Day Later?

Mike made this claim, and I saw it from a commenter on Phil’s blog. Thus far, no one has actually posted or linked to such a second image that I can find. If anyone has seen this claimed image, please let me know. And by “please let me know,” I mean providing the NASA image ID so I can find it. I know that Mike put an “Enhancement of April 3rd image” on his blog, but it’s useless for proving anything without the ID it came from.

Anything Else?

Maybe? This post might be slightly premature, and it’s a bit stream-of-consciousness, but I wanted to get it up before bed. The station on which I was listening to Mike on C2C decided to cut out the second half hour because of some crash somewhere, something about people dying, breaking news, etc. When I get the full audio, I may add to this, but it sounded like George was taking the interview in a separate direction after the bottom-of-the-hour break, though a caller may have brought it back up.

Let’s be clear about a few things, though:

1. The object is seen in one camera, not in another, despite the two cameras taking an image at the same time of the same spot.

2. There is a claim that it showed up in another image a day later, but so far as I can tell, this is just a claim and no one has pointed to that image. If it exists, I’d like to see it and I’ll re-examine my curt analysis.

3. We see similar artifacts in other Mars images, and we see them all the time in space-based cameras, and we see them generally in all electronic cameras (at least those that don’t get rid of them for us).

4. The story comes from UFO Sightings Daily and only became mainstream because a reporter at a somewhat mainstream paper picked it up.

So, what could it be? Aliens? Architecture that glints just right so it’s only in one camera of two that are right next to each other imaging something a few miles away? An impact flash from a crater forming? A dust devil reflecting the light just right? Lens flare?

Or a cosmic ray? I don’t think any of those previous explanations is likely; I think the cosmic ray is.

Bara, as with other UFO / aliens protagonists, says that Curiosity should live up to its name and drive over there and investigate. Yup, take days, power, money (gotta pay the ground crew), and investigate what is very likely to be a high-energy particle that made it through the atmosphere and onto a camera’s CCD.

What do you think?

Edited to Add (10 hrs later): Per Phil’s latest blog post: “Except not really. Another expert on Mars hardware said it may have actually been a “light leak”, a bit of sunlight that somehow got into the camera through a hole, or crack, or seam somewhere in the hardware. He also says it may be a sharp reflection of sunlight off a glinty rock. Those are certainly plausible, though right now we don’t have enough evidence to say for sure which of these explanations may or may not be the right one.” Yup, another possibility. As is a defect in the camera sensor itself (see discussion in the comments to this blog post).
