Future spy satellites may unfold like origami birds, collecting image data along long, flat sensor arrays that weigh almost nothing. By replacing the bulky telescopic lenses that make today’s spy sats among the biggest and most expensive things in space, light-sensitive microchips promise far cheaper access to orbital imagery.
Last month, Lockheed Martin released the first images from its experimental Segmented Planar Imaging Detector for Electro-optical Reconnaissance, or SPIDER, program. Funded by the Defense Advanced Research Projects Agency, or DARPA, SPIDER is basically a telescope on a microchip. But it collects light data very differently from a conventional telescope.
A regular telescope, of the sort you might find in the Hubble Space Telescope or a Keyhole satellite pointed at North Korea, is fundamentally modeled on a human eye. The eye collects data on light intensity (that is, it sees) by filtering light through its lens and iris onto the retina. Similarly, conventional telescopes and cameras collect light through lenses and pass it to a detector. In old cameras, that detector was film; in digital cameras, it is a bed of capacitors. The number of photons that hits the detector over a given period of time gives you the light intensity. As that intensity varies across the scene you are photographing, shapes and objects emerge.
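To make the photon-counting idea concrete, here is a minimal Python sketch; every number in it is hypothetical, invented for illustration rather than taken from the article. Each pixel simply accumulates photons during an exposure, and the count per unit time is the intensity a conventional detector reports.

```python
import numpy as np

# Toy model of a conventional intensity detector: each pixel counts the
# photons that land on it during the exposure; the count rate is the intensity.
rng = np.random.default_rng(seed=0)
exposure_s = 0.01  # hypothetical exposure time, in seconds

# Hypothetical per-pixel photon arrival rates (photons/second): a bright
# square target on a dim background.
rate = np.full((8, 8), 100.0)
rate[2:6, 2:6] = 5000.0

# Photon arrivals are well modeled as Poisson-distributed counts.
counts = rng.poisson(rate * exposure_s)

# Dividing by exposure time recovers the intensity map; the bright square
# shows up as a block of larger values, i.e., a visible shape.
intensity = counts / exposure_s
print(intensity)
```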
To get a decent image with a conventional telescope, your detector must sit at a set distance behind the lens: the focal length. Shorten the telescope too much and the image becomes too small to be useful. And everything must stay aligned to tiny tolerances, which is difficult and power-intensive to maintain in a space-based telescope.
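The geometry behind that trade-off is the standard small-angle imaging relation; it is textbook optics rather than anything stated in the article. An object that subtends an angle theta on the sky forms an image whose linear size scales with the focal length:

```latex
% Small-angle imaging relation (standard optics, not from the article):
% an object subtending angle \theta forms an image of linear size s
% behind a lens of focal length f.
s \approx f \, \theta
```

Halve the focal length f and the image size s halves with it, which is why an over-shortened telescope produces images too small to be useful.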
The SPIDER chip works differently. Instead of measuring light's intensity, it collects data on the light's phase and amplitude (intensity is really just the amplitude squared). A computer then calculates what the intensity would be from the amplitude and phase data. That removes much of the size, power requirements, and complexity of telescopes modeled after eyes.
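Here is a minimal Python sketch of that general idea; it is not Lockheed's actual pipeline, and it assumes, purely for illustration, a fully sampled grid of spatial frequencies, whereas an interferometric sensor like SPIDER samples them sparsely through pairs of lenslets. The point is the flow: treat the amplitude and phase measurements as complex values, recombine them, and let software compute the intensity image a lens would otherwise have formed.

```python
import numpy as np

# Illustrative reconstruction from amplitude-and-phase data, assuming a
# fully sampled spatial-frequency grid (real instruments sample sparsely).
n = 64
scene = np.zeros((n, n))
scene[24:40, 24:40] = 1.0  # stand-in target the sensor is pointed at

# What the chip measures: amplitude and phase at each spatial frequency.
spectrum = np.fft.fft2(scene)
amplitude = np.abs(spectrum)
phase = np.angle(spectrum)

# What the computer does afterward: recombine the two measurements into
# complex values, invert the transform to recover the light field, then
# square its magnitude ("intensity is just the amplitude squared").
field = np.fft.ifft2(amplitude * np.exp(1j * phase))
intensity = np.abs(field) ** 2
print(intensity.round(2)[22:26, 22:26])  # the bright block re-emerges
```

Because the measurement is complex-valued, the focusing and correction that a conventional telescope does with precisely aligned glass can happen in software instead, which is the point of Duncan's remark below.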
“You’re measuring a more fundamental characteristic of the light that’s carrying the information that you want when you measure amplitude and phase. That gives you the ability to manipulate that information in algorithms and software that you don’t have if you only measure intensity somewhere,” said Alan Duncan, a senior fellow at Lockheed Martin.