International Space Station Sensor Watches for Lightning
Lightning is one of the most common natural hazards on Earth, and our warming planet is only beginning to feel the effects of a future with more severe thunderstorms and more frequent lightning strikes. But there’s a lot that atmospheric scientists don’t understand about how lightning works. Better lightning data could improve severe weather forecasts and warnings, and could help researchers understand where hazards will increase in the future, along with associated impacts such as wildfires and the need for lightning-proofed infrastructure.
A unique optical sensor that just spent two years on the International Space Station could help fill those gaps. Researchers at Western Sydney University, supported by the U.S. Air Force Research Lab, demonstrated the use of an event-based vision sensor (EBVS) to record lightning strike details from above—at lower cost, higher resolution, and lower data rates than before. “The technology is based on how biology works and can see things that a normal camera can’t,” says Gregory Cohen, the deputy director of Western Sydney University’s International Centre for Neuromorphic Systems.
Lightning is bright and extremely fast. We can see it in images, but we don’t know how much energy is emitted at which points during the discharge process. Nor do we understand how lightning is initiated within thunderclouds, how it travels through a cloud, or how often it may discharge. Better understanding and forecasting require capturing features of cloud-to-cloud and cloud-to-ground lightning that are difficult to pin down.
Vanna Chmielewski, a research physical scientist at the U.S. National Oceanic and Atmospheric Administration’s (NOAA’s) National Severe Storms Laboratory, is one researcher who would appreciate more observations. “We have observations from the ground of where we think lightning strikes or transfers current, and we have a broad view from space,” she says. But trying to align ground and space observations, and understand what different instruments are seeing, poses a challenge.
International and regional networks like the World Wide Lightning Location Network (WWLLN) generate regular maps of lightning activity by monitoring radio signals with ground-based sensors. In space, the Geostationary Lightning Mapper on the Geostationary Operational Environmental Satellites (GOES)—a joint NOAA/NASA operation—detects lightning day and night over the Americas using a megapixel near-infrared CCD camera with a detector spatial resolution of 10 kilometers. The European Space Agency’s Lightning Imager recently began similar operations for Europe and Africa, and the China Meteorological Administration is launching satellites with similar capabilities.
Lightning researcher Steven Goodman, emeritus senior scientist on the GOES program, says that “by 2040 we’ll have a ring of geostationary satellites mapping lightning around the planet.” But returning reams of data from geostationary orbit is difficult, slow, and expensive. And there is a limit to the spatial resolution possible when imaging the full Earth from that distance. “If you think about a cloud, and a lightbulb going off inside, the light gets scattered, but that creates a pool of light that you see from the cloud tops,” Goodman says. Observing the pool’s fine-scale details requires new tools.
That’s where the Falcon Neuro instrument comes in. It’s a pair of neuromorphic vision sensors, or event-based sensors. Instead of capturing an image frame by frame, these devices capture changes in light level within a scene. The raw output is a stream of “events” for each pixel: every instant at which that pixel’s brightness change exceeds a certain threshold, recorded with microsecond timing. The result is high-contrast, high-speed observations that break the relationship between frame rate and the volume of data produced. “We get all the benefits that we need for space—low power, high speed, low data rate, high dynamic range—by changing the fundamental way in which we generate information from the visual world,” says Cohen.
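To make that event-generation idea concrete, here is a minimal sketch in Python of how a single pixel might turn a brightness signal into timestamped events. The function name, contrast threshold, and synthetic flash are illustrative assumptions, not details of the Falcon Neuro hardware.

```python
import numpy as np

def pixel_events(times_us, intensity, threshold=0.2):
    """Convert one pixel's brightness samples into ON/OFF events.

    An event is emitted whenever log-intensity has changed by more than
    `threshold` since the last event, which is how an event-based sensor
    reports changes instead of frames. Returns (timestamp_us, polarity) pairs.
    """
    log_i = np.log(np.clip(intensity, 1e-6, None))
    ref = log_i[0]                      # brightness level at the last event
    events = []
    for t, li in zip(times_us, log_i):
        delta = li - ref
        while abs(delta) >= threshold:  # a big jump can emit several events
            polarity = 1 if delta > 0 else -1
            events.append((int(t), polarity))
            ref += polarity * threshold
            delta = li - ref
    return events

# A brief, bright flash against a dark background produces a burst of ON
# events at its onset and OFF events as it fades; a static scene produces
# no events at all, which is why the data rate stays so low.
t_us = np.arange(0, 5000, 10)                          # sample times, microseconds
flash = 0.05 + np.exp(-((t_us - 2000) / 300.0) ** 2)   # synthetic brightness pulse
print(pixel_events(t_us, flash)[:5])
```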
Cohen, Geoff McHarg at the U.S. Air Force Academy, and their colleagues designed the imager using two heavily modified sensors, one looking down and one looking forward. The imager gathers light across the visible and near-IR spectrum from its perch aboard the ISS. The team developed a feature-finding algorithm that identifies probable lightning events in the raw data, which is processed into images with meter-scale resolution.
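The article doesn’t spell out the team’s feature-finding algorithm, but the basic idea of flagging bursts of events that cluster tightly in time can be sketched roughly as follows; the window length, count threshold, and array layout are assumptions chosen for illustration.

```python
import numpy as np

def find_event_bursts(events, window_ms=2.0, min_events=200):
    """Flag time windows containing an unusually dense burst of events.

    `events` is an (N, 3) array of (t_ms, x, y) rows. Events are counted in
    fixed time bins, and the start times of bins whose counts exceed
    `min_events` are returned as candidate lightning detections.
    """
    t = events[:, 0]
    edges = np.arange(t.min(), t.max() + 2 * window_ms, window_ms)
    counts, edges = np.histogram(t, bins=edges)
    return edges[:-1][counts >= min_events]

# A fuller version would also require the events to cluster in pixel space,
# not just in time, before calling a burst a probable lightning flash.
```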
To confirm the presence of lightning in data from January 2022 to August 2023, the researchers compared suspected strikes with ones detected by the ground-based radiofrequency sensors of the Global Lightning Detection Network (GLD360). They found that every time the GLD360 recorded a lightning flash, Falcon Neuro recorded a significant increase in clusters of events happening at the same time and place. Moreover, the instrument routinely detected multiple lightning flashes in the same cloud within just tens of milliseconds while the GLD360 recorded only a single flash, indicating that Falcon Neuro was capturing details of lightning progression that its ground-based counterpart missed. And these high-speed recordings, equivalent to 500 to 1,000 frames per second, generated only 3 to 4 megabits per second of data, a rate no other sensor could achieve from a small satellite right now.
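As a rough, hypothetical illustration of that cross-check, the sketch below counts how many candidate detections fall within a tolerance window of each ground-network flash time; the function name, tolerance, and data layout are assumptions, not the published analysis.

```python
import numpy as np

def match_to_ground_network(candidate_times_ms, gld360_times_ms, tol_ms=50.0):
    """Count candidate detections within +/- tol_ms of each GLD360 flash.

    Several candidates matching a single ground-detected flash suggests
    sub-flash structure that the ground network did not resolve.
    """
    cand = np.sort(np.asarray(candidate_times_ms, dtype=float))
    matches = {}
    for t_flash in gld360_times_ms:
        lo = np.searchsorted(cand, t_flash - tol_ms)
        hi = np.searchsorted(cand, t_flash + tol_ms)
        matches[t_flash] = hi - lo
    return matches

# Example: two ground-detected flashes; the sensor saw three bursts near
# the first one and a single burst near the second.
print(match_to_ground_network([100.0, 110.0, 130.0, 900.0], [120.0, 905.0]))
```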
Chmielewski says that this first in-orbit test is “exciting.” Meanwhile, Goodman points out that space-based sensors are best for detecting in-cloud lightning and ground-based ones for detecting cloud-to-ground. Ultimately, global lightning monitoring will benefit from a combination of instruments and approaches, with each filling a different gap.
Cohen adds that the demo uses camera components from the 1990s, with focal plane arrays that are “a postage stamp compared to what we’ll send up next year.” He’s working on future versions that will make high-speed lightning recordings in different wavelength bands. Goodman looks forward to that: “The UV tends to be less bright, so we don’t use it so much. But people want to know what these other spectral emission lines look like,” he says.
Still, comparing the sensor’s detections with higher-resolution lightning arrays at optical wavelengths remains essential. If it continues to perform well, the sensor is a strong proof of concept for other platforms. “Imagine flying it on research planes above storms,” Chmielewski says. Cohen is exploring high-altitude balloons for this purpose, and for monitoring bush fires. “Space is the perfect application of neuromorphic sensors,” he says.