Exposing PseudoAstronomy

July 22, 2015

#NewHorizons #PlutoFlyby – The Pseudoscience Flows #6: Data Download


Introduction

I know I’ve promised other parts to this series, but this one will be quick* and I want to get it out there because it feeds into a lot of the varied conspiracies related to NASA’s New Horizons mission to the Pluto-Charon system, and I’ve even seen many misconceptions on normal science blogs / websites (not to be named): Where’s the data!?

Deep breath people: It’s coming. Slowly.

*I thought it would be quick, but it turned out to be nearly 2000 words. Oops…

The Slowness of Spacecraft Data Transfer

Every space mission – save for one very recent, experimental one – relays data via radio signal. In other words, light. The amount of power the spacecraft can muster sets the data rate it can sustain. Think of it a bit like this: If you have the Bat Signal but you use a flashlight, you’d be lucky if someone could even see the flashlight aimed up at the sky. There’s no way they could see details of a bat cut-out. But if you use a really, really bright spotlight, you can see it farther, and you can even stick a detailed bat cutout over its front and make out that cutout.

Perhaps a bad analogy, but that’s kinda the idea here: If you have a very strong signal, then you can include a lot of detail really quickly. If you have a weak signal, then the data rate is slower. Oh– better analogy: bad wifi reception. You know you have low signal strength when it gets really slow.

Moving on, the New Horizons REX antenna does not have a huge amount of power. New Horizons launched with less plutonium for power than originally intended, and it needs power for running the spacecraft. It has so little power for the antenna that only the 70-meter dishes in NASA’s Deep Space Network (DSN) are big enough to receive the signal at Earth, which arrives at a paltry 3 × 10⁻¹⁹ watts. (Compare that with a 100 W light bulb.) To me, first off, it’s amazing that we can even receive that faint a signal.
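Just to drive home how faint that is, here is the one-line comparison in Python (a quick back-of-the-envelope sketch using only the two numbers above):

```python
# How much stronger is a household bulb than the signal the DSN receives?
received_power = 3e-19   # watts, New Horizons signal at the 70-meter dish (from above)
light_bulb = 100.0       # watts, the comparison bulb from above

print(f"The bulb is about {light_bulb / received_power:.0e} times stronger.")
# prints roughly 3e+20, i.e. about 20 orders of magnitude
```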

But once you get over that amazement, the DSN also has to be able to detect changes in that tiny signal. That’s how we get data. Like blinking your flashlight in Morse code, or putting the Bat Signal stencil up. If we have very little signal strength, we can’t change our signal very quickly, or the DSN may not be able to read it. Change it more slowly, and they can.

For planning purposes, we were able to send data at 1296 bits per second. I’m old enough (sigh…) to remember dial-up modems in the 1990s. My family’s first modem was the dreaded 14.4 kbps modem, which was painfully slow at pulling up AOL’s e-mail. Or Hamster Dance. But even that was over 10 times faster than New Horizons’ data rate. And let’s convert it to real things: bytes. There are 8 bits to a byte, so 1296 bits per second is only 162 bytes per second. I have a thumbdrive attached to my computer that holds 64 GB, or 64 gigabytes. It would take about 4572 days, at the average New Horizons download rate, to fill that fairly modest thumb drive. That’s roughly 12.5 years.
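If you want to check that arithmetic, here it is spelled out (a minimal sketch; the 64 GB thumb drive is just my example from above, and I’m using decimal gigabytes):

```python
# How long to fill a 64 GB thumb drive at New Horizons' average planning rate?
bit_rate = 1296              # bits per second
byte_rate = bit_rate / 8     # 162 bytes per second
drive_bytes = 64e9           # 64 GB, decimal

seconds = drive_bytes / byte_rate
days = seconds / 86400
print(f"{days:,.0f} days, or about {days / 365.25:.1f} years")
# prints about 4,572 days, roughly 12.5 years
```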

Keep in mind that the spacecraft is still taking data. Keep in mind that there are only three 70-meter DSN dishes at the correct latitudes to see the spacecraft, ever, from Earth. Keep in mind that there are other missions out there that need the DSN to communicate with Earth. Keep in mind that 1296 bps is an average planning bit rate, and while the Canberra and Goldstone dishes get more like 2000 bps, Madrid tends to get less due to the elevation of the spacecraft above the horizon.

So, from the get-go, just from considering the data rate (power requirements on the spacecraft, distance to the spacecraft, and timetable of receiving stations on Earth), one should be able to see that it will take a painfully long time to get the data from the spacecraft.

While we could keep up with the data rate and did a large download a month before encounter (which is why data weren’t taken in late May), there’s no way we could get back all the data taken during the encounter soon afterward, which is why the craft flew with two 8 GB storage drives, and it filled up 60 Gb during encounter (see what I did there, switching between bit and byte?).
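The same arithmetic shows why the full encounter data set takes so long to get home (again a sketch, using the 60 Gb figure and the planning rates quoted above, and pretending the DSN could listen continuously, which it can’t):

```python
# Downlink time for ~60 gigabits of encounter data at different rates.
encounter_bits = 60e9   # ~60 Gb recorded during encounter (bits, not bytes)

for bps in (1296, 2000):   # average planning rate vs. a good Canberra/Goldstone pass
    days = encounter_bits / bps / 86400
    print(f"At {bps} bps: about {days:,.0f} days of continuous downlink")
# prints ~536 days at 1296 bps and ~347 days at 2000 bps
```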

There’s Other Data Besides Images!

And that’s any kind of data. There aren’t just images and “pretty pictures” that many of us want. There is one B&W camera on the craft, but there’s also a color camera, two spectrometers, a dust counter, two plasma instruments, the antenna itself took data, and there’s basic spacecraft housekeeping and telemetry that says things like, “Yes, I really did fire my thrusters at this time when you wanted me to!”

Basic Download Plan

I can discuss this because the basics have been made public. It’s just not “sexy” like pretty pictures so it’s not that easily findable.

Leading up to encounter, data were prioritized as though we were going to lose the spacecraft at any time, so the most important, “Tier 1” science data were downloaded first, along with critical optical navigation images.

After encounter, the same thing happened, where compression algorithms were used on the data on-board the spacecraft and that lossy-compressed data were sent back to Earth to fulfill as many Tier 1 science goals as possible. That’s how – and why – in the last week we’ve already revolutionized what we know about Pluto. Those first high-res (0.4 km/px) images of the surface were planned out based on Hubble Space Telescope maps of the surface and the spacecraft timing and trajectory to get images that cover different brightness and color patches. (Which takes care of another, minor conspiracy that I’ve seen that claims we “knew” where to point the cameras because the Secret Space Program had leaked us information about what would be interesting.)

But now that we’re more than a week from closest approach, thoughts are turning to what to do next. Originally, a “browse” data set of all the lossy data (only the imagers and spectrometers store lossy-compressed versions in addition to lossless) was going to be returned first, along with the lossless data from other instruments. That would at least let us understand the surface at lossy JPG quality and let the plasma folks do their science.

But now people are discussing scrapping that and bringing down the lossless data instead, albeit many times slower because of the larger file sizes.

Planning, Fairness

But, believe it or not, planning of what’s downloaded when is made no more than a few weeks out (except for the closest approach weeks). Right now, we’re working on the late August / September load of commands and deciding what data to bring down in what order.

Each of the four science theme teams (geology geophysics & imaging (GGI), atmospheres, composition (COMP), and particles & plasma (P&P)) puts together a list of their top priorities based on what we’ve seen so far. The Pluto Encounter Planning (PEP) team then sits down and looks at how much they can bring down in what time and puts things in order. The sequencers then take that and try to make it happen in the test computers. Then we iterate. Then it gets reviewed. Extensively. Only then does it get uploaded to the spacecraft to execute.

But besides that priority list, it’s the Principal Investigator who decides how much data each science team gets. For example, while I’m on PEP (it’s what I was initially hired to do), I’ve been adopted by GGI. Wearing my GGI hat, I want images from the LORRI instrument. All the time, and only LORRI. I don’t care what the plasma instrument PEPSSI recorded. But by the same token, the P&P folks don’t care anything about images; they want to know what their instruments recorded as the craft passed through the Pluto system to see how the solar wind interacted with escaping particles from Pluto – or even if it did. (Which it did, as was released in a press conference last Friday.)

So Alan Stern has to make the decision of how to be “fair” to so many competing interests within the large – and broad – science team. So while COMP may want to have 5 DSN playback tracks in a row to bring back just one of their very large spectra data cubes, Alan has to make sure that GGI gets their images and P&P gets their data, too.
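To make that juggling act concrete, here is a toy sketch of the bookkeeping involved. To be clear: the team names come from this post, but every number, the 8-hour pass, the allocation shares, and the greedy take-it-if-it-fits logic are invented for illustration; this is not how the actual PEP tools work.

```python
# Toy downlink planner: fill one hypothetical DSN pass, honoring per-team shares.
track_bits = 1296 * 8 * 3600   # one made-up 8-hour pass at 1296 bps

allocation = {"GGI": 0.45, "COMP": 0.30, "P&P": 0.15, "Atmospheres": 0.10}  # invented shares

requests = [  # (team, product, size in bits, team priority); all sizes invented
    ("GGI",         "LORRI frame",            9e6, 1),
    ("COMP",        "LEISA spectral cube",   4.0e7, 1),
    ("P&P",         "PEPSSI encounter dump", 2.5e6, 1),
    ("Atmospheres", "REX occultation chunk",   3e6, 2),
]

budget = {team: share * track_bits for team, share in allocation.items()}
plan = []
for team, product, size, priority in sorted(requests, key=lambda r: r[3]):
    if size <= budget[team]:       # take it only if the team's share still has room
        budget[team] -= size
        plan.append(product)

print("This pass:", plan)
# The big COMP cube doesn't fit in its share of a single pass, which is exactly
# why a team might ask for several playback tracks in a row.
```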

The Plan

The decision was made several months ago that after this initial batch of data – what we saw last week, what we see this week – all of the “low speed” data will come down in August. That’s housekeeping & telemetry, that’s things like how many dark pixels are in any given LORRI image, it’s the two plasma instruments and data recorded by the antenna and dust counter, and that’s about it. After that, we get back to the imagers and spectrometers, per the balance discussed above.

And since it’s not sequenced, and it’s not public, I can’t tell you any more than that.

So we are, unfortunately, not going to see any new images for practically a month, beyond the two navigation images that should come down tomorrow and Friday.

Conspiracy!

Due to the nature of this blog, obviously this is going to fuel conspiracies: NASA’s hiding the data, NASA’s manipulating the data, NASA’s [whatevering] the data, etc.

It’s just not true.

I have known for years that these conspiracies – NASA somehow intercepting the data and manipulating it before even us naïve scientists can get our hands on it – would be very difficult to pull off, but being on this mission has made me realize that sustaining such a conspiracy would be even harder than I had thought.

Literally, as the data are received by the DSN – before a download is even complete – they’re on our processing servers and in the high-cadence processing pipeline. On Monday morning, when we were supposed to get four new images, we were literally sitting in the GGI room hitting the refresh button and marveling over each new line of pixels that we were getting back in practically real-time. To use a religious analogy, it was every Christmas morning rolled into a one-hour marathon of hitting the refresh button.

And we were all there watching — over 20 of us. And other science team members kept coming in to look.

The idea of secretly having one or two people intercepting the data, “airbrushing” things in or out of it, and only then giving it from On High to the scientists just shows how out of touch from reality conspiracists are. (By the way, I use the term “airbrushing” here because that’s how many conspiracists still talk. Obviously, no one is physically airbrushing things anymore — and I doubt anyone younger than 30 even knows what a real airbrush is.)

To sustain the conspiracy, I can see only two choices: (1) all of us scientists are in on it, in which case it becomes ridiculously large and unsustainable, and scientists suck at keeping secrets about exciting new things, or (2) somehow there’s super secret advanced tech that intercepts the spacecraft signal and at the speed of light “airbrushes” things out and retransmits it to the DSN to get into our processing pipeline. Because we know when stuff is supposed to appear on Earth. Because we write the sequence that does it.

Final Thoughts

Not that I expect this to convince any conspiracy theorist of their folly. The lack of image data over the next month and the lossy JPG data we have now both contribute to the little anomalies that don’t immediately make sense, which the average conspiracist can easily spin into something they’re not.

July 20, 2015

#NewHorizons #PlutoFlyby — The Pseudoscience Flows, Part 4


An eagle-eyed Facebooker on the Facebook page for my podcast (thanks Warwick!) pointed this ‘un out to me on the Before It’s News website. Something like the equivalent of the Daily Mail but with more of a UFO bent.

Apparently there are huge cities on Pluto.

This was one of the pseudosciences that I knew going in was going to be prevalent, though I expected it to be more explicit first from Richard Hoagland and Mike Bara who are more vocal about their pareidolia and reading into image artifacts.

The entire crux of this guy’s arguments is that he sees a blockiness in the released images. He claims that he knows and can prove they are not JPEG artifacts for two reasons:

(1) He’s using a TIFF image and not JPEG, and
(2) the blockiness runs at a diagonal and not parallel to the image edges.

For the second reason, I have two words and two punctuation symbols for you: “Rotate & crop.” To add a few more words: Most released full-disk images have been rotated such that the north pole is “up” in the image. The spacecraft didn’t take them that way. We rotated them to be consistent. Therefore, the original blocky compression artifacts ran parallel to the image edges, but now they run diagonally because the image has been rotated! Pretty simple. Yet it has eluded this conspiracist.

He has similarly missed a simple fact: just because your current file format is one type, that does not mean the original file format was that type. To be more explicit, just because the NASA press release you got this image from happens to be a TIFF on the NASA website, that does not mean that the original image downlinked from the spacecraft was not lossy-compressed JPEG. Which it was. No image downlinked from the craft since July 12 has been lossless; they have all been lossy, at a 10:1 compression ratio, meaning they are very lossy. Just because I can take a JPEG and use any image software to re-save it as a TIFF does not mean that the TIFF will not contain those original JPEG artifacts.

The JPEG blocks are 8 pixels on a side, and many of the released images have been up-sized (I don’t know why, I argued against that, but I have no influence over NASA’s or APL’s graphics people).
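You can demonstrate this to yourself in a few lines of Python with the Pillow imaging library (a minimal sketch; the filenames are placeholders, and any heavily compressed JPEG you have lying around will do):

```python
# JPEG artifacts live in the pixels, not in the file extension.
from PIL import Image

img = Image.open("lossy_release.jpg")      # any heavily compressed JPEG

# Rotate it the way a north-up reprojection would, then save it losslessly.
rotated = img.rotate(37, expand=True)
rotated.save("same_image.tif")             # TIFF container, same blocky pixels

# Open same_image.tif: the 8x8 JPEG blocks are still there, and they now
# run diagonally across the frame, exactly like the "city blocks" in the video.
```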

He also assigns the wrong image credit, as “NASA/JPL-Caltech/MSSS.” It’s “NASA/APL/SwRI.” That’s not hard to get right. It’s called reading the caption for the image that you take from NASA.

I’d say that this is one of the sillier conspiracies I’ve heard so far, but it’s really hard to choose. Especially with what’ll be parts 5 and 6.

Perhaps the best line from the video: “No one’s actually accused me of, uh, pixelation, yet, but I’m sure someone will. Uh, some— some spook, probably, some guy from MI6 will come on here … heh! Who knows? Or someone from NASA will try and debunk it. But we’ll see! We’ll see what they have to say.”

Ah…. if only I worked for NASA, or MI6. Maybe I’d drive a nicer car. But it doesn’t take someone from British Intelligence to tell the guy that IT’S JUST BEEN ROTATED!!!

P.S.: Also within the video are other various claims. Like a large hexagonal crater (no, that’s his mind trying to break a circle into line segments), and that NASA purposely brightened the image so that it washes out detail near the pole. No, that’s called the sub-solar point, which is where the sun is directly shining, so you can’t see any topography, only brightness differences of the actual material on the surface. It washes everything out.

June 23, 2015

Podcast Episode 134: Big Bang Denial


The Big Bang theory:
Tot’ly explains the cosmos?
Or, is it a dud?

This episode follows a bit from the Black Hole Denial episode, but this time with another aspect of cosmology: The Big Bang. I was able to use a few old blog posts, too, that I wrote practically 7 years ago.

As mentioned, I’m now on a weird – though backdating – release schedule due to the piling on of work as the New Horizons craft nears Pluto. But I’m still trying to do 2 episodes/month, at least.

June 12, 2015

Are We on the Verge of Discovering an Earth-Like Exoplanet?


I announced a while ago that I was on episode 347 of the Canadian podcast “The Reality Check,” where I talked about exoplanets and some hype – deserved or otherwise – about almost but never quite yet discovering Earth-sized exoplanets.

While they post a lot of links and other things on their website, they don’t post transcripts of what we actually talk about. Since I spent a good chunk of time writing and editing my segment’s text, I thought I’d post it here:

There’s lots of ways to talk about exoplanets, but I’m going to take the traditional approach and start with a very broad but brief overview of how we have found the few-thousand known extra-solar planets, or “exoplanets” for short. There are five main ways.

The most obvious is the most difficult: Direct Imaging. This is where you take your telescope, look at a star, and see the planet around it. This is almost impossible with current technology, and we have fewer than 20 exoplanets found this way. It’s so hard because the star is so bright relative to the planet and because most star systems are so far away. And obviously, if the planet is larger and farther away from the star, it’ll be easier to see.

The second main method has also only produced about 20 planets so far: Gravitational Microlensing. Einstein showed that large masses bend light, and we can see this in space when an object that’s far away passes behind a massive object that’s a lot closer. The light from the background object gets distorted and magnified, much like a lens … a lens caused by gravity. If the foreground object happens to be a star, and that star has a planet, then that planet can make a detectable contribution to the lensing, not only in amount, but in the exact shape of the lensing effect.

The earliest actual successful method was a special form of what’s called the Timing Method, specifically in this case, pulsar timing. Pulsars are incredibly dense stars called neutron stars, and we get a blast of radio waves every time one of its poles sweeps in the direction of Earth. These are so regular that any tiny perturbation can be detected and attributed to something weird, like a tiny planet tugging on it and so changing that regular spinning signal.

This is the same concept as the highly successful method that found the most exoplanets until a few years ago: Radial Velocity. The idea is that we normally think of a planet, like Earth, orbiting the sun. But it doesn’t really. It *and* the sun orbit a mutual gravitational point called the “barycenter” that is between the two. For Earth and the sun, that point is VERY close to the sun’s center, but it’s not quite in the center. That means that over the course of a year, as Earth goes around that point, the sun will, too (on the opposite side of that point). So, it will wobble very very slightly as it orbits the barycenter.

We can’t possibly observe this tiny tiny motion of other stars. BUT, we can use the light that star emits to do it by using the Doppler shift. That’s the phenomenon where if something is moving towards you, the waves it emits become compressed, and if it’s moving away from you, the waves get stretched out. The common example is a train whistle going from high to low pitch, but in astronomy, this is where the light is shifted to blue and then to red.

So, if the planet around another star is at its closest point to us, the star emits light and we see it all normal. As the planet starts to move away from us, the star starts to move very slightly toward Earth, and so its light will be very slightly blue-shifted. Then, the planet gets to its farthest point, and starts to move towards Earth, which means the star starts to move away, and we see its light red-shifted. This is an incredibly tiny effect, and the smaller the planet, the smaller the shift in the light. Or the pulsar timing change.
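To get a feel for how tiny that wobble is, conservation of momentum gives the star’s speed directly: the star moves more slowly than the planet by the ratio of their masses. A rough sketch with round solar-system numbers (circular orbits, no inclination):

```python
# Reflex ("wobble") speed of a star due to one planet on a circular orbit.
import math

G = 6.674e-11        # gravitational constant, SI units
M_SUN = 1.989e30     # kg
AU = 1.496e11        # m

def reflex_speed(planet_mass_kg, orbit_radius_m, star_mass_kg=M_SUN):
    v_planet = math.sqrt(G * star_mass_kg / orbit_radius_m)   # planet's orbital speed
    return v_planet * planet_mass_kg / star_mass_kg           # momentum conservation

print(f"Jupiter's tug on the Sun: {reflex_speed(1.898e27, 5.2 * AU):.1f} m/s")   # ~12.5 m/s
print(f"Earth's tug on the Sun:   {reflex_speed(5.972e24, 1.0 * AU):.2f} m/s")   # ~0.09 m/s
```

That sub-10-centimeter-per-second Earth signal is why detecting a true Earth analog with radial velocity is such a challenge.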

There was a lot of progress throughout the late 1990s and early 2000s in very high-resolution spectroscopy in order to get better and better at observing smaller and smaller planets. The easiest ones to observe are the largest because they make the biggest shift in the star’s light, and ones that are closest to their star are easier because you don’t have to observe as long. To observe a planet that has a 10-day orbit, you just have to observe that star for about a month from Earth to get decent statistics.

That’s why all the exoplanets discovered early on were what are called “Hot Jupiters,” since they were very large and very close to their stars.

The final method is the Transit Method. If a fly passes in front of a bright light, you can see a slight decrease in the light. If a bird passes in front of a light, you’ll see a larger decrease in the light. Same thing here: A planet passes in front of the star and temporarily blocks part of the light from the star that we would see at Earth. The big issue with this method is that you have to have the fortuitous geometry alignment where the planet’s orbit is just right so that it passes in front of its star as seen from Earth. The first one wasn’t detected until 1999, but a decade later, the dedicated spacecraft COROT and then Kepler were launched to look for these, monitoring the same fields of the sky, tens of thousands of stars, moment after moment, looking for those brief transits. In 2014, Kepler released over 800 planets discovered with this method, more than doubling the total number known, and that was on top of its other releases and, to-date, it’s found over 1000.

The transit method, despite the issue of geometry, is probably the best initial method. If you have the planet going in front of its star, then you know its alignment and you can follow-up with the radial velocity method and get the mass. Otherwise, the radial velocity method can only give you a minimum mass because you don’t know how the system is oriented, you only know that radial component of velocity, hence its name.

With the transit method, you can see how much light is blocked by the planet. Knowing the star’s type, you can get a pretty good estimate for the star’s size, and knowing how much light is blocked means you can get the cross-sectional area of the planet and hence its diameter. For example, Jupiter would block 1% of the sun’s light, and since area is the square of length, that means Jupiter is about 10% the sun’s diameter. Since the sun is a type G V star, we have a good model for its radius, though of course we know its radius very well because we’re in orbit of it. But that means not only can we get mass, but we can get size and density.
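The arithmetic behind that Jupiter example is a one-liner (a quick sketch; the 1% depth is the example from the paragraph above, and the Earth and Sun radii are round textbook values):

```python
# Transit depth -> planet size: depth = (R_planet / R_star)^2
depth = 0.01                              # Jupiter blocks ~1% of the Sun's light
print(f"R_planet / R_star = {depth ** 0.5:.2f}")   # prints 0.10, i.e. ~10% of the Sun's diameter

# And the other direction: how deep is an Earth-sized transit of a Sun-like star?
earth_depth = (6371.0 / 696000.0) ** 2    # radii in km
print(f"Earth-sized transit depth: {earth_depth:.2%}")   # about 0.01%
```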

The transit method also lets us see if there’s a large atmosphere. If the light from the star instantly blinks down to the lower level when the planet passes in front of it, then any atmosphere is really thin or nonexistent. If there’s a gradual decrease, then the atmosphere is extended. If it’s extended, we can follow up with something like the Hubble Space Telescope and actually figure out what that atmosphere is made of by looking at what colors of light from the star are absorbed as it passes through the planet’s atmosphere.

And as with the radial velocity and timing methods, we know how long it takes to go around its parent star, and along with the star’s mass from what kind of star it is, we can get the distance of the planet from the star.
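That last step is just Kepler’s third law. A minimal sketch, using a Sun-like star for illustration:

```python
# Kepler's third law: orbital period + stellar mass -> orbital distance.
import math

G = 6.674e-11      # SI units
M_SUN = 1.989e30   # kg
AU = 1.496e11      # m

def semimajor_axis_au(period_days, star_mass_kg=M_SUN):
    T = period_days * 86400.0
    a = (G * star_mass_kg * T**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)
    return a / AU

print(f"365-day orbit around a Sun-like star: {semimajor_axis_au(365.25):.2f} AU")  # ~1.00 AU
print(f"10-day orbit around the same star:    {semimajor_axis_au(10):.3f} AU")      # ~0.09 AU
```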

Okay, so much for a brief overview. But for me, I’ve left out a lot.

Moving on, it should be somewhat apparent that the bigger the planet, and the closer to its star, the easier it is to observe with pretty much ANY of these techniques, except direct imaging or microlensing where you want a big planet that’s far from its star. Big means big effect. Fast orbit means you don’t have to observe it for very long to show that it’s a regular, repeating signal best explained by a planet.

So, the question is then, can we detect an Earth-sized planet, and can we detect an Earth-like orbit? These are really two different questions, and they depend on the technique you’re using. If we want to focus on the two main methods – radial velocity and transit – then the unsatisfying answer to the second is that we finally do have good enough technology; it is just a matter of finding one. With the 2014 Kepler data release, there were over 100 exoplanets that are less than 1.25 times Earth’s size. With the 2015 release, there are a total of 5 planets smaller than Earth or Venus, but they orbit their 11.2-billion-year-old star in just 3.6 to 9.7 days.

Even if we have observations for more than a year or two, for something as small as Earth the signal relative to the noise in the experiment is still pretty small, and you want a big signal relative to the noise. It’s best to build up multiple years’ worth of data to average out the noise to be able to really say that we have an Earth-like planet. For something like Jupiter, which orbits our sun in about 12 years, we’d need to observe at least two transits. That means we’re just now approaching the time when some ground-based surveys would have a long enough baseline of data, but that also assumes we catch that planet during the few hours or days when it goes in front of its star versus the years and years that it doesn’t, that we do this repeatedly, and that we don’t chalk it up to sunspots.

This is why we really need long-term, dedicated surveys to just stare at the same place in space, constantly, measuring the light output of these stars to see if we can detect any sort of dimming, that’s repeated, from a likely planet.

But, even if we find an Earth-like planet in terms of mass and diameter and location in its solar system, that’s not enough to say it’s Earth-like in terms of atmosphere and surface gravity and overall long-term habitability. It’s just a first step. A first step we have yet to truly reach, but one that is reasonably within our grasp at this point.

But it’s from the existing planets we know of that we get some of the hype that hits the headlines every few months, like “astronomers estimate billions of Earth-like planets exist in our galaxy alone.” I’m not going to say that’s fantasy, but it’s loosely informed speculation based on extrapolating from a few thousand examples we now have from a very, VERY young field of astronomy.

Or, we’ll get articles where the first sentence says, “Astronomers have discovered two new alien worlds a bit larger than Earth circling a nearby star.” It’s in the next paragraph that we learn that “a bit larger than Earth” means 6.4 and 7.9 times our mass, and they orbit their star in just a few days.

So as always, this is a case where, when we see headlines, we need to be skeptical, NOT susceptible to the hype, and read deeper. But that said, it is entirely possible that any day now we will find an exoplanet that is at least like Earth in mass, size, and distance from its host star.

May 26, 2015

Podcast Episode 132 – In Search Of Planet X (Live from Denver ComicCon)


In Search: Planet X.
An overview of common
Ideas about it.

This episode is another recording of one of my live presentations, modeled a little after Leonard Nimoy’s “In Search Of” television series. It was presented in front of a live audience at the Denver ComicCon on May 24, 2015, to about 75-100 people. I was bordered on two sides by other sessions that had more people and a lot of laughter, so I played to that a little bit when there were opportune moments. I also suffered a minor A/V issue in the middle but recovered, so you’ll hear some fumbling there.

Unfortunately, there is also some popping that comes in about 10 minutes into the recording. I applied all the filters that I know of in my Audacity toolkit, and the pops are less of an issue than they were, but they are definitely still present.

I also need to announce that it is that time of year when work is going to get crazy, so episodes may come out a little less regularly, especially during July. I’m still going to keep to the two per month schedule, but they may not be out on exactly the first and sixteenth of the month.

And with that in mind, I have to head to the airport in 45 minutes for more work, after just being back home for 3.5 days. So …

February 16, 2015

Podcast Episode 126: The Facts and Misconceptions Behind Funding in Science, with Dr. Pamela Gay @starstryder


The sordid subject
Of the coin: How scientists
Are – and are not – paid.

This is another episode where I don’t focus on debunking a specific topic of astronomy, geology, or physics pseudoscience, but rather I focus on a topic of misconceptions related to science in general: How scientists are funded. This is done via an interview and a bit of discussion with Dr. Pamela Gay, who cohosts the very famous “AstronomyCast” podcast and is the director of CosmoQuest.

The topics are varied, but the episode remains focused on some of the misconceptions of how research is funded and the real process behind it. It’s also a bit depressing, but I can’t always have light-hearted topics like “Planet X isn’t coming to kill you.”

Since this is an interview, it is a somewhat longer episode (54 minutes), there is no transcript, and there are no other segments.

The episodes for the next two months should be focused on Comet Hale-Bopp, with a brief interlude for another interview – this one with the chair of the program committee for a major planetary science conference, about what they do when they get submissions that seem like pseudoscience.

January 1, 2015

Podcast Episode 123: The Science and Pseudoscience of Communicating with Aliens with @KarenStollznow


Karen Stollznow talks
‘Bout the issues of ET
Communication.

I wanted to start the New Year off on a lighter and different kind of topic, so I interviewed linguist Dr. Karen Stollznow about alien communication. This was based a bit on her TAM 2014 talk, and we got into a lot of issues not only with how communication is portrayed in popular media, but also with how communication is problematic amongst people, different language groups, and different species on our own planet. We then discussed – within that context – some people who claim they are in contact with aliens and how linguistic analysis shows the claimed languages to be poorly constructed variations on what they already know.

This interview was only meant to be a half hour long, but even after editing, it is just under an hour. That editing included removing a headset issue and two phone calls from my mother (family emergency). I tried to find a possible natural break to get it to two 30-minute episodes, but I found none: the conversation flowed very well, I thought.

There are no other segments in this episode because it is just over an hour long. The next episode should be about black hole denial.

December 30, 2014

My First Infographic: What Have Our Planetary Space Probes Photographed Since 1970?



Introduction

This has been over two months in the making: I’m finally releasing my first infographic. It’s entitled, “Planets and Major Moons: Distribution of Non-Lander Spacecraft Photos Since 1970.” (Suitable for printing on A-size paper with a bit of top and bottom margin to spare.) The purpose is to show the number of images taken by different space probes of the planets (and major satellites), the percentage of the total images that were for each body, and for each body, the percentage taken by each different spacecraft.

PDF Version of Spacecraft Imagery Infographic (3.5 MB)
PNG Version of Spacecraft Imagery Infographic (4.7 MB)

Number of Images of Planets Taken by Spacecraft Infographic

Development Process

I’ve been wanting to create infographics for a while. Really good ones are few and far between, especially for astronomy, but the good ones are often amazing works of art. I don’t pretend that this is an amazing work of art, but hopefully it’s not ugly.

To me, the key is to have a lot of information crammed into a small space in an easy-to-understand way that you don’t have to be an expert to interpret. In my work, I deal a lot with multi-dimensional datasets and so already I have to come up with ways of displaying a lot of information in as few figures as possible and yet still make them readable.

The Idea

An idea that I came up with is based on the claim that “NASA hides all its pictures!” (This is often, hypocritically, almost immediately followed up with NASA spacecraft imagery showing claimed UFOs and other pseudoscientific claims.)

And so, I wanted to investigate this: How many images really have been taken and are available publicly, for free, on the internet? After several days of research, I had the results, and I assembled them into the above infographic.

The Numbers

I was surprised by some of the numbers and I was not surprised by others. One thing that did not surprise me was that the outer planets have very few photographs (relatively speaking) of them, while most imagery has focused on Mars and the Moon (fully 86%).

But, I was not prepared for how very few photographs were taken by our early probes to the outer solar system. Pioneers 10 and 11 were the first craft to venture out, and yet, because of the (now) archaic method of imaging and slow bandwidth, they collectively took a mere 72 images of both Jupiter and Saturn. Compare that with the ongoing Lunar Reconnaissance Orbiter around the moon, which has publicly released over 1.1 million images.

You can also see the marked effect of the Galileo high-gain antenna failure: Only 7.4% of the photos we have of Jupiter were taken by Galileo, despite it being an orbiter in the 1990s. Compare that with the Cassini orbiter of Saturn, which has returned nearly 50 times as many images, despite no dramatic change in technology between the two craft. This means that only 0.4% of our images of planets and moons are of Jupiter, while 1.9% are of Saturn.

You can also see the marked success of modern spacecraft and the huge volumes of images that (I repeat) are publicly available. The pie slices in the infographic are color-coded by approximate spacecraft operation era. Well over 90% of all images were taken after 1995, and the current suite of the latest NASA spacecraft (MESSENGER around Mercury, Lunar Reconnaissance Orbiter around the Moon, and Mars Reconnaissance Orbiter around Mars) account for sizable fractions of the returned data for their respective bodies – especially MESSENGER, which accounts for 98.1% of all Mercury images.

What was I most surprised by? The Clementine mission to the moon. It returned and has publicly archived just shy of 1.5 million images of the lunar surface. I expected the Lunar Reconnaissance Orbiter to have surpassed that. And, it still may, as it continues to operate and return data. We shall see.

Why the Conspiracy Theorists Are Wrong

As I said, one of the primary reasons I made this was to investigate the claim by conspiracy theorists that these space agencies hide photographs. The blame rests almost entirely on NASA by most conspiracists’ accounts. This infographic proves them wrong in two significant ways.

First, at least for the Moon, Mars, and Venus, sizable numbers of images have been taken by and publicly released by non-NASA sources. I specifically have data from the European Space Agency (SMART-1, Venus Express, and Mars Express) and the Japanese space agency (SELENE / Kaguya). While both the Indian and Chinese space agencies have also sent spacecraft to the moon and Mars (Mars for the Indians with the recently-in-orbit “MOM” craft), and Russia has sent craft to Venus, the Moon, and Mars, I could not find the public repositories – if they exist – for those missions. Therefore, I could not include them. But a lack of those missions does not affect the overall point, that non-NASA agencies have released photos of these bodies.

Second, as I’ve repeated throughout this post, these are the publicly released images. Not private. Public. To public archives. In the bottom-left corner, I have the sources for all of these numbers. (Please note that they were compiled in late October and may have increased a bit for ongoing missions — I’ll update periodically, as necessary.)

The total number of lunar images? About 3 million.

Mars? Around 1.6 million. Venus? Over 350,000. Mercury? Over 210,000.
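Just using those round numbers, the lopsidedness is obvious (a quick sketch; these are only the bodies whose approximate totals I quoted above, so the outer planets are omitted here):

```python
# Approximate public image counts quoted above.
counts = {"Moon": 3_000_000, "Mars": 1_600_000, "Venus": 350_000, "Mercury": 210_000}
total_listed = sum(counts.values())

for body, n in counts.items():
    print(f"{body:8s} {n:>10,}  ({100 * n / total_listed:.1f}% of the bodies listed)")
# The Moon and Mars alone make up roughly 89% of even this partial tally,
# consistent with the ~86% of the full data set mentioned earlier.
```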

It’s hard to claim that NASA hides lots of images when these numbers are staring you in the face.

What Conspiracists Could Still Claim

I think the only “out” at this point, given this information (and if they acknowledge this information), is for conspiracists to claim that NASA and other space agencies simply obfuscate the “interesting” stuff. I suppose that’s possible, though they’d need armies of people to do it on the millions of returned images. And they apparently do a pretty bad job considering all the images that conspiracists post, claiming that features within them are of alien-origin.

It’s amazing how the “powers that be” are so powerful, and yet so sloppy. Apparently.

What This Infographic Does Not Show

I had to decide to clip a lot of information. We’ve imaged a lot of asteroids and a lot of comets. Those are out. We have had landers on the three closest bodies (Moon, Mars, Venus). Those images were not included.

Also, I focused on visible-light images, mostly. There are some instruments that take more UV images, or far-IR images, or various other wavelengths, but this infographic focused on the visible or near-visible light camera data.

Pretty much the only exception to this is for the Magellan mission at Venus, which took radar swaths of the planet to “image” the surface. I included this because, in early test audiences, I did not have Venus at all, and they requested it. Then, I did not include Magellan, but the test audiences wondered what happened to it. Describing why that data was not present made things wordy and more cluttered, so I, in the end, simply included it and put a footnote explaining the Magellan data.

This also fails to show the volume of data as measured by – or, for the older craft, approximated by – pixel count. If I were doing this by the number of pixels returned, the Moon and Mars would be far larger in comparison, and the Lunar Reconnaissance Orbiter and Mars Reconnaissance Orbiter would be much larger fractions of their respective bodies.

Final Thoughts

I’m releasing this under the Creative Commons license with attribution required, non-commercial distribution, and no derivative works (please see the CC stamp at the bottom of the infographic). This is so that I can at least have some semblance of version control (see release date at lower right).

I hope you find it useful and interesting. And at least somewhat purdy. If you like it, share it.

December 16, 2014

Podcast Episode 122: Comet 67P/Churyumov-Gerasimenko and Rosetta Conspiracies


Conspiracies of
Comet 67P …
Few, but they are weird.

A timely and listener-requested episode! What’s not to love!? In the episode I talk about several of the conspiracies I’ve seen surrounding the Rosetta mission and Comet 67P. From artificiality (Hoagland makes a guest appearance) to singing so as to raise our consciousness to angelic levels when 2012 failed, I spend nearly a half hour going through 2 to 4 claims (depending on how you count them) that have been making the rounds. I also get to touch on image analysis.

There is also one New News segment this episode, and it refers to the death of the Venus Express mission around (oddly enough) Venus. The news relates to the episodes on uncertainty. Not sure what the connection is? Listen to the episode! The episode also comes in at just over 30 minutes, my target length.
