CuttingEdge Fall 2022

Page 1

Volume 19 Issue 1 Fall 2022 www.nasa.gov
National Aeronautics and Space Administration

Office of the Chief Technologist Announces FY22 IRAD Innovator of the Year: Electrical Engineer Scott Hesh

Goddard’s Office of the Chief Technologist has named engineer Scott Hesh its FY22 Internal Research and Development (IRAD) Innovator of the Year, an honor the office bestows annually on individuals who demonstrate the best in innovation.

An electrical engineer with an insatiable curiosity, Scott Hesh is being honored for groundbreaking work developing a sub-payload dispersal system that tracks and consolidates communications for four to 16 science packages ejected from sounding rockets. Since 2017, Scott Hesh and his team have worked hand-in-glove with science investigators to develop a technology to sample Earth’s upper atmosphere in multiple dimensions with accurate time and location data.

“Scott has this enthusiasm for what he does that I think is really contagious,” said Sounding Rocket Program technologist Cathy Hesh. “He’s an electrical engineer by education, but he has such a grasp on other disciplines as well, so he’s sort of like a

in this issue:

Fall 2022

2 Chief Technologist Announces FY22 IRAD Innovator of the Year: Scott Hesh

> Novel Platform Lofts Science Payloads Into Space

> Innovating at Speed

6 Eye on Autonomous Navigation

> Precision Location by Scanning the Horizon

> Self-Guiding Spacecraft to Cross a Sea of Stars

10 An Artificial Intelligence for Seeing

12 Efficient, High Resolution Infrared Cameras

14 Measuring Helium in the Exosphere

16 A Smaller, Simpler Mass Spectrometer for Exploration

About the Cover

Wallops electrical engineer Scott Hesh prepares to test a sub-payload cannister along with a sounding rocket in a special chamber at Wallops Flight Facility near Chincoteague, Virginia. Hesh received this year’s Internal Research and Development Innovator of the Year Award for his work on the Swarm Communications technology. This innovation provides scientists a novel platform to sample data across a wide area of Earth’s upper atmosphere by ejecting science instruments from a sounding rocket in flight and streaming their data back to the ground.

cutting edge • goddard’s emerging technologies PAGE 2 Volume 19 • Issue 1 • Fall 2022 www.nasa.gov/gsfctechnology
Scott Hesh works on a sub-payload cannister that will be part of a science experiment and a demonstration of his Swarm Communications technology. Photo Credit: Berit Bland

systems engineer. If he wants to improve something, he just goes out and learns all sorts of things that would be beyond the scope of his discipline.”

Mechanical Engineer Josh Yacobucci has worked with Scott Hesh for more than 15 years. He said he always learns something when they collaborate.

“Scott brings this great perspective,” Yacobucci

said. “He could help winnow out things in my designs that I hadn’t thought of.”

For his interdisciplinary leadership resulting in game-changing improvements for atmospheric and solar science capabilities, Scott Hesh earned Goddard’s Innovator of the Year Award. v

Sought-After Sounding Rocket Technology Enables Science Investigations

If you build a better sounding rocket science platform, scientists will beat a path to your payload bay.

Scientists from NASA and partner institutions have a new vehicle to collect multi-point measurements from Earth’s upper atmosphere. The so-called Swarm Communications technology ejects four to 16 science instrument packages from sounding rockets. These sub-payloads spread up to 25 miles while streaming telemetry and science data in real time through the host rocket’s communications system to the ground.

Sounding Rocket Program technologist Cathy Hesh and electrical engineer Scott Hesh assembled a team of engineers to develop this technology in 2017 in response to requests from scientists in NASA’s Sounding Rocket Working Group. Today, after four Internal Research and Development (IRAD) grants and three test flights, they can provide a science vehicle offering unprecedented accuracy for monitoring Earth’s atmosphere and solar weather.

“We were lucky enough that we had this core team with the right skill set to get this done,” Scott Hesh said. “We were able to turn this around from concept to flight in under three years.”

Starting with a vapor tracer ejection system used to measure upper atmosphere winds, the team developed a standardized sensor platform and data collection architecture.

The Swarm Communications team displays their sub-payload cannisters at NASA’s Wallops Flight Facility in Chincoteague, Virginia. In the first row (left to right), Taylor Green, Steve Bundick, and Josh Yacobucci hold three of the four swarm sub-payloads that flew on the first tech demo flight, which proved the swarm technology worked. On the right, Cathy Hesh holds a 3D-printed prototype used to develop the concept prior to building the flight hardware. In the back row (left to right) are Brian Banks, Christian Amey, Scott Hesh, Chris Lewis, and Alex Coleman (now retired). Not pictured are Marta Shelton and Gerald Freeman. Photo Credit: Berit Bland

Mechanical engineer Josh Yacobucci said team members came together with a singular focus on this project.

“We knew from the beginning that we wanted one sub-payload that would be either spring- or rocket-ejected and not have to rely on separate designs for each option,” he said. “Every time we got together, diverse perspectives on the team led to improvements in different systems.”

Sub-payloads deployed with springs can carry larger payloads, but they eject from the rocket at 8 feet per second. This speed allows up to a 0.6-mile radius of separation from the main payload. Adding a small rocket motor limits space inside the cannister but increases ejection velocity by a factor of 48, for a 15-mile separation. They dubbed their project Swarm Communications, Scott Hesh said, because it communicates with a large number of sub-payloads, though individual cannisters do not operate independently as in other NASA swarm initiatives.

Sounding rockets are sub-orbital rockets launched from locations like NASA’s Wallops Flight Facility near Chincoteague, Virginia. They provide a low-cost platform to test new space-bound technology and conduct science experiments that cannot be accomplished on the ground. The sounding rocket program, along with balloons and aircraft, is part of NASA’s Affordable Access to Space program, which brings these opportunities to scientists, educational institutions, and students.

So far, the swarm technology has been well received in those communities.

Riding on the third launch this August, researcher Dr. Aroh Barjatya’s Sporadic-E ElectroDynamics Demonstration mission, or SpEED Demon, traveled up to 100 miles in altitude on a Terrier-Improved Malemute rocket. He sought to measure conditions of a transitory Sporadic E event, where a cloud of evaporated micrometeor metals can reflect radio signals at a level in the ionosphere that doesn’t normally reflect radio.

“This was an excellent mission,” Barjatya said. “Preliminary analysis shows that we flew through a Sporadic E event on the down leg and the data looks great. We’ll be looking at the performance of all instruments to get us ready for a 2024 launch.”

Barjatya directs the Space and Atmospheric Instrumentation Lab at Embry-Riddle Aeronautical University in Daytona Beach, Florida.

“There has been huge interest in our experimenter community to put their sensor packages on this platform,” Scott Hesh said. “It’s really not that hard for them to build these sub-payloads now that we have a platform with standard data interfaces and a standardized power supply. That takes a lot of design effort off them.”

“We can’t build the sub-payloads fast enough to keep our customers happy,” he added. “That’s a good problem to have.”

Working with scientists from the beginning allowed the team to direct their efforts to provide better science results, Cathy Hesh said.

“We ended up taking on several instruments and they got a lot of science data back even on our first test flight,” she said. “We also got good, real-time feedback from the scientists to help improve the whole project.” v

Mechanical Engineer Josh Yacobucci prepares to test a sounding rocket sub-payload cannister in the anechoic chamber at NASA’s Wallops Flight Facility near Chincoteague, Virginia. Photo Credit: Berit Bland

CONTACTS: Scott.V.Hesh@NASA.gov or 757.824.2102; Catherine.L.Hesh@NASA.gov or 757.824.1408

Innovating at Speed

Engineer Scott Hesh credits the NASA Sounding Rocket Program for providing a fast and agile framework for engineering and innovation, giving scientists low-cost access to space and a fast turnaround that averages two to three years to develop and fly a mission.

That cadence also provides an ideal environment for engineers to ‘lean forward smartly’ and improve on each new iteration.

Between each of the three swarm test flights, the Wallops team improved the quality and reliability of critical systems – communications, telemetry, and core electronics – and maximized the volume of each sub-payload cannister set aside for science instruments.

One hurdle to collecting science data from distributed instruments is knowing precisely where and when each sensor collected each data point. Sounding rockets’ fast turnaround from concept to flight and the swarm team’s ability to advance the design for each flight solved this challenge.

Hesh’s first launch in 2019 used inertial sensors to provide acceleration, then calculated speed and direction over time for location plots. On the second launch, they added GPS (Global Positioning System) capability to each cannister; however, the system was unable to acquire a GPS lock in flight. For the third flight a year later, the team improved the GPS design based on lessons learned. The new version provided accurate time and position data to the science team in a small, cost-effective system.
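The first flight’s inertial approach can be pictured as simple dead reckoning: integrate acceleration once for velocity and again for position. The sketch below is an illustrative one-dimensional toy, not the flight code; the constant-acceleration input and time step are invented. Accumulated drift in exactly this kind of integration is part of what makes adding GPS attractive.

```python
# Illustrative dead-reckoning sketch: integrate inertial-sensor
# acceleration samples twice to estimate velocity and position.
# The values and time step are hypothetical, not flight data.

def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """Integrate 1-D acceleration samples (m/s^2) taken at a fixed
    time step dt (s) into velocity and position histories."""
    v, x = v0, x0
    velocities, positions = [], []
    for a in accels:
        v += a * dt   # velocity from acceleration
        x += v * dt   # position from velocity
        velocities.append(v)
        positions.append(x)
    return velocities, positions

# Constant 2 m/s^2 for 10 s in 0.1 s steps: velocity ends near 20 m/s.
v_hist, x_hist = dead_reckon([2.0] * 100, 0.1)
```

In practice every small sensor bias also gets integrated twice, so position error grows quickly over a flight, which is why the team kept refining an onboard GPS solution.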

Communicating to the ground is tricky with four sub-payloads and a rocket, and most launch facilities don’t have the antenna infrastructure to track that many payloads simultaneously, Hesh said. Instead, each sub-payload sends its data to the sounding rocket, where they combine into a single 6-megabit-per-second or higher stream that is transmitted to the ground, essentially providing five data sets in one. They can expand up to 40 megabits per second, he said, allowing a single rocket to eject and track 12 or more sub-payloads.
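The “five data sets in one” idea can be sketched as simple stream multiplexing: tag each data chunk with its source so the ground side can split the combined downlink back apart. This is a hypothetical illustration of the concept, not the actual telemetry format.

```python
# Hypothetical sketch of combining several sub-payload streams into
# one downlink, tagging each chunk with a source ID so the ground
# station can de-interleave them. Stream IDs and data are invented.

def multiplex(streams):
    """Round-robin interleave. streams maps source ID -> list of
    byte chunks (equal lengths); returns (id, chunk) frames."""
    frames = []
    for chunks in zip(*streams.values()):
        for sid, chunk in zip(streams.keys(), chunks):
            frames.append((sid, chunk))
    return frames

def demultiplex(frames):
    """Recover per-source chunk lists from the combined stream."""
    out = {}
    for sid, chunk in frames:
        out.setdefault(sid, []).append(chunk)
    return out

streams = {"sub1": [b"a", b"b"], "sub2": [b"c", b"d"]}
frames = multiplex(streams)
recovered = demultiplex(frames)
```

Tagging frames by source is what lets a single rocket transmitter stand in for the per-payload tracking antennas most ranges lack.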

This, and a portable ground hardware rack, allows the system to be deployed from remote facilities around the world with fewer communication resources.

In ground control, that data is presented in a visual format that doesn’t overload the human technicians and scientists monitoring their measurements. Color-coded streams of data from each sub-payload allow a single person to scan for sensor anomalies as well as data on the sounding rocket’s telemetry and control systems.

The Swarm Communications technology has optimized the capacity of the cannister form factor, Hesh said. Making that into a truly distributed swarm mission would require expanding it to the next level.

He said he envisions leveraging more mature and larger-format CubeSat technology. “Then you could add reaction wheels to allow a stabilized platform, expand core flight software, test new CubeSat technologies, or even have the sub-payloads talk to each other,” Hesh said. v

Volume 19 • Issue 1 • Fall 2022 PAGE 5 cutting edge • goddard’s emerging technologies www.nasa.gov/gsfctechnology
Four identical sub-payload cannisters sit on a bench at NASA’s Wallops Flight Facility near Chincoteague, Virginia. They will fly on a sounding rocket into the upper atmosphere to conduct science observations that depend on multi-point observations across a wide area. Photo Credit: NASA’s Wallops Flight Facility/Swarm Communications Team

Eye on Autonomous Navigation

The farther NASA reaches out into the solar system and beyond, the more missions will need to function autonomously. Critical maneuvers require immediate onboard responses, and for smaller or more distant missions, communications bandwidth to Earth is stretched thin.

The Goddard-developed autonomous Navigation Guidance and Control (autoNGC) software system provides a fully autonomous framework for missions of all sizes, said Cheryl Gramling, Goddard’s mission engineering technology lead. Over the past year, investigators gave autoNGC an eye-opening infusion of optical navigation enhancements to improve autonomy for a variety of mission profiles.

“We are committed to performing missions autonomously when it comes to navigation, guidance and control,” Gramling said. “In optical navigation, for instance, autonomy means you don’t have to send images back to Earth for navigation processing, allowing communications bandwidth to be reserved for science products. Images are some of the highest-density data, and these spacecraft will be taking a lot of images for OpNav.”

Goddard innovators are currently working on multiple ways to improve or expand autonomy in space missions, including celestial OpNav capability to track moons, planets, and other bodies against background stars (see Page 9) and a panoramic navigation feature that makes sense of the Moon’s rocky horizon (Page 6), among other functions. Goddard’s Internal Research and Development, or IRAD, funding either enabled many of these technologies or played a role in their early development.

At its heart, autoNGC works in the core Flight Software (cFS) environment, a standardized software solution for spacecraft operations designed to save each mission the cost and labor of reinventing its operating system. The autoNGC system incorporates a wide variety of cFS-compatible interfaces to navigation hardware and technology: measuring velocity through Doppler shifts in communications signals, onboard inertial sensors, radar, lidar, and optical navigation using data from a variety of cameras. It could even use X-ray pulsar navigation – a concept developed out of the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) mission – to navigate deep space by tracking millisecond pulsars.

“We have a lot of different sensor complements we can fuse together for the navigation solution,” Gramling said, not to mention guidance and control functions such as autonomous trajectory planning and closed-loop control systems.
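As a toy illustration of fusing several sensor measurements into one navigation estimate, the classic inverse-variance weighting rule combines each measurement according to how much it is trusted. This is a textbook technique standing in as a sketch; it is not the autoNGC filter, and the numbers are invented.

```python
# Minimal illustration of fusing measurements from different sensors
# (Doppler, inertial, optical, ...) into a single estimate by
# weighting each with the inverse of its variance. A textbook
# sketch, not the autoNGC implementation.

def fuse(measurements):
    """measurements: list of (value, variance) pairs.
    Returns the fused estimate and its (smaller) fused variance."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    estimate = sum(w * val for w, (val, _) in zip(weights, measurements)) / total
    return estimate, 1.0 / total

# Two range estimates with equal variance: the fused value falls
# halfway between them, and the fused variance is halved.
est, var = fuse([(100.0, 4.0), (102.0, 4.0)])
```

The same principle, generalized to time-varying state and many sensor types, is what a real onboard navigation filter carries out continuously.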

“It also has different modes it can operate in depending on the mission phase and distance to the target,” she added. “As you get closer to your target, you can tune the system to improve the solution accuracy from better-resolution data, such as from a point cloud of lidar data returned on approach to an asteroid.”

The OSIRIS-REx spacecraft navigated to take a sample of asteroid Bennu’s surface autonomously using onboard image software known as Natural Feature Tracking (NFT) – a form of optical navigation. NFT guides the spacecraft by comparing an onboard image catalog with real-time navigation images taken during descent, orienting the spacecraft by specific landmarks on Bennu. This navigation technique allows the spacecraft to accurately target small sites while dodging potential hazards. Image Credit: NASA’s Goddard Space Flight Center

The highlights in this issue of CuttingEdge reveal only the tip of the iceberg of OpNav applications. Other ongoing adaptations being developed within the same cFS and autoNGC system architecture include: efforts to improve 3D imaging by lidar for steering, navigation, and hazard avoidance; developing an active wavelength scanning lidar for the Concurrent Artificially-intelligent Spectrometry and Adaptive Lidar System (CASALS) science lidar technology; improving optical navigation techniques for future small-body sampling missions; improving combined optical and lidar navigation solutions for lunar landers; developing and testing a Space Qualified Rover Lidar (SQRLi) for use on planetary missions similar to the Mars Perseverance rover; and investigating how autoNGC could benefit future distributed space missions.

Scalability, adaptability, and a universal operating system for spacecraft make the combination of cFS, autoNGC, and optical navigation a powerful constellation to guide future deep space and planetary missions.

CONTACT

Cheryl.J.Gramling@NASA.gov or 301.286.8002

Steering by Landmarks – On the Moon

Much like how familiar landmarks can give travelers a sense of direction, a NASA engineer is teaching a machine to use horizon features to navigate on the Moon. Goddard research engineer Alvin Yew, started with digital elevation models from sources such as the Lunar Orbiting Laser Altimeter (LOLA) aboard the Lunar Reconnaissance Orbiter. He then uses those models to recreate features on the horizon as they would appear to an explorer on the lunar surface. Those digital panoramas can be used to correlate

known boulders and ridges with those visible in pictures of the lunar horizon taken by a rover or astronaut, providing accurate location identification for any given region.
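The matching idea can be sketched as profile correlation: render a horizon elevation profile for each candidate location from the elevation model, then score it against the observed profile over every possible heading. Everything below (the site names, profiles, and scoring rule) is a hypothetical illustration, not Yew’s implementation.

```python
# Hedged sketch of horizon matching: compare an observed horizon
# elevation profile (one elevation value per azimuth bin) against
# panoramas rendered from elevation-model data. The circular shift
# accounts for the explorer's unknown heading.

def match_score(observed, rendered):
    """Lowest sum-of-squared differences over all circular shifts."""
    n = len(rendered)
    best = float("inf")
    for shift in range(n):
        err = sum((observed[i] - rendered[(i + shift) % n]) ** 2
                  for i in range(n))
        best = min(best, err)
    return best

def locate(observed, candidates):
    """candidates: dict of location -> rendered horizon profile.
    Returns the location whose panorama best matches the view."""
    return min(candidates,
               key=lambda loc: match_score(observed, candidates[loc]))

# Toy data: "site_a" seen with a rotated heading matches exactly.
candidates = {"site_a": [0, 1, 2, 3], "site_b": [5, 5, 0, 0]}
observed = [2, 3, 0, 1]
best_site = locate(observed, candidates)
```

A real system would score many candidate locations over a local terrain tile, which is why carrying only a regional subset of LOLA data keeps the search tractable on a handheld device.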

“Conceptually, it’s like going outside and trying to figure out where you are by surveying the horizon and surrounding landmarks,” Yew said. “While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet. This accuracy opens the door to a broad range of mission concepts for future exploration.”

The rim of de Gerlache crater – an Artemis target destination – is well illuminated, while a smaller crater within is draped in shadow. Research engineer Alvin Yew used an image processing tool to identify horizon lines and highlight them in red. He is working on an artificial intelligence application that would compare those lines on images taken by explorers to a 3D model of the Moon like this one created by NASA’s Solar System Exploration Research Institute. Image Credit: NASA/MoonTrek/Alvin Yew

Making efficient use of LOLA data, a handheld device could be programmed with a local subset of terrain and elevation data to conserve memory. According to work published by Goddard researcher Erwan Mazarico, a lunar explorer can see at most up to about 180 miles (300 kilometers) from any unobstructed location on the Moon. Even on Earth, Yew’s location technology could help explorers in regions where Global Positioning System (GPS) signals are not dependable, or during solar storms that interfere with GPS accuracy.

“Equipping an onboard device with a local map would support any mission, whether robotic or human,” Yew said. “For safety and science geotagging, it’s important to know exactly where they are.”

Yew’s geolocation system will leverage the capabilities of GIANT, the Goddard-developed Image Analysis and Navigation Tool. This optical navigation tool, developed primarily by Goddard engineer Andrew Liounis, independently verified navigation data processed by the primary navigation team for the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer, or OSIRIS-REx, mission (CuttingEdge, Summer 2021, Page 15).

In contrast to radar or laser-ranging tools that pulse radio waves and light at a target to analyze the returning signals, GIANT quickly and accurately analyzes images to measure the distance to and between visible landmarks. cGIANT is the portable flight version of GIANT, a derivative library within Goddard’s autonomous Navigation Guidance and Control system, or autoNGC, which provides mission autonomy solutions for all stages of spacecraft and rover operations.

In addition to Yew’s technology, isolated regions on the lunar surface may require overlapping solutions derived from multiple sources to assure safety.

To support lunar exploration, NASA is working with industry and other international agencies to develop an interoperable communications and navigation architecture on the Moon: LunaNet. LunaNet will bring “internet-like” capabilities to the Moon and provide astronauts, rovers, orbiters, and more with networking, navigation, and situational awareness services.

“It’s critical to have dependable backup systems when we’re talking about human exploration,” Yew said. “The motivation for me was to enable lunar crater exploration, where the entire horizon would be the crater rim.” v

CONTACT
Alvin.G.Yew@NASA.gov or 301.286.3734

The collection of ridges, craters, and boulders that form a lunar horizon can be used by an artificial intelligence to accurately locate a lunar traveler. A system being developed by research engineer Alvin Yew would provide a backup location service for future explorers, robotic or human. Image Credit: NASA/MoonTrek/Alvin Yew

Self-Guiding Spacecraft to Cross a Sea of Stars

Ancient mariners learned to guide their ships to new worlds by tracking the stars above. Future spacecraft might sail themselves across that sea of stars by looking back at Earth – and tracking everything else that’s not a star.

An expanded capability of Goddard’s vaunted optical navigation infrastructure would steer spacecraft by triangulating against solar system objects. This capability would provide deep space navigation without tying up precious communications or ground control resources, said Principal Investigator Andrew Liounis.

“Celestial navigation tells you where you are with respect to Earth, the Sun, and other solar system objects by detecting three or more objects and triangulating your location,” Liounis said.

“If you’re looking at the object you’re trying to get to, that’s optical navigation,” he clarified. “If you’re looking at other objects in the solar system, that’s celestial navigation.”

In addition to providing a kilometer-level location in the solar system, the expanded capabilities measure the streak a moving object leaves against the field of stars. This allows the navigation system to calculate speed and direction of travel based on the relative movement of other solar system bodies. His celestial navigation project expands on the core Flight System (cFS) Goddard Image Analysis and Navigation Tool (cGIANT), a streamlined, C++, autonomous, flight-software version of the ground-based Goddard Image Analysis and Navigation Tool (GIANT). GIANT performed critical backup navigation support for the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) mission (CuttingEdge, Summer 2021, Page 15).
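The triangulation at the heart of celestial navigation can be illustrated with a two-dimensional toy: each sighting of a known body pins the observer to a line through that body along the measured direction, and two sightings intersect at the observer. Real OpNav solves this in three dimensions against ephemerides with least squares; the sketch below, with invented positions and directions, shows only the geometric core.

```python
# 2-D toy of celestial triangulation: two sightings of bodies at
# known positions determine the observer's location. Positions and
# direction vectors here are invented for illustration.

def triangulate(p1, u1, p2, u2):
    """p1, p2: known body positions (x, y); u1, u2: unit direction
    vectors measured from the observer toward each body.
    Solves p1 - t1*u1 == p2 - t2*u2 for the observer position."""
    # Rearranged: t1*u1 - t2*u2 = p1 - p2, a 2x2 linear system.
    a, b = u1[0], -u2[0]
    c, d = u1[1], -u2[1]
    rx, ry = p1[0] - p2[0], p1[1] - p2[1]
    det = a * d - b * c          # assumes non-parallel sight lines
    t1 = (rx * d - b * ry) / det
    return (p1[0] - t1 * u1[0], p1[1] - t1 * u1[1])

# Observer at the origin: body 1 at (3, 4) seen along (0.6, 0.8),
# body 2 at (5, 0) seen along (1, 0).
obs = triangulate((3.0, 4.0), (0.6, 0.8), (5.0, 0.0), (1.0, 0.0))
```

With three or more sightings the system becomes overdetermined, and a least-squares solution also absorbs measurement noise, which is the form an onboard navigator would actually use.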

This new capability offers an autonomous visual navigation capability to planetary missions during deep-space flight, approach to the destination body, and for missions that orbit Lagrange points. cGIANT is part of Goddard’s autonomous Navigation, Guidance, and Control (autoNGC) software suite.

Liounis credited Co-investigator Chris Gnam with much of the accomplishment in integrating celestial navigation into cGIANT.

“Celestial navigation is a two-step process,” Gnam said. “On the ground, the mission plan is pre-processed to identify the kinds of objects we expect to see, so we know where and when we expect them to be found. Then, during the mission, our primary focus is on tracking those objects and how they change their relative position as you cross the solar system.”

Liounis and Gnam are working on a follow-up IRAD to design and execute a software-in-the-loop test, an isolated simulation that ensures the code can function without errors within the full autoNGC suite before being tested in the field.

Taken together, the software that makes up autoNGC provides plug-and-play solutions for smaller missions and future investigations of outer solar system planets, as well as any mission where access to precious communications networks and ground control resources is limited.

The software packages work in the core Flight Software (cFS) environment. This system provides the full range of spacecraft operating software for missions like a CubeSat in Earth orbit, where access to ground control is cost-prohibitive for day-to-day operations. Beyond Earth, where precise location from Global Positioning System satellites is less reliable, cGIANT can incorporate optical navigation for accurate maneuvering, landing, and navigation, Liounis said. v

Volume 19 • Issue 1 • Fall 2022 PAGE 9 cutting edge • goddard’s emerging technologies www.nasa.gov/gsfctechnology
CONTACT Andrew.J.Liounis@NASA.gov or 301.286.2856
A streak against a field of stars identifies a moving body in the solar system. One celestial navigation technique involves detecting and tracking such streaks and comparing them against a catalog of known bodies expected to be found along a spacecraft’s journey. Co-Investigator Chris Gnam implemented a prototype streak detection algorithm in MATLAB and is incorporating it into optical navigation software. Image Courtesy: Chris Gnam

An Artificial Intelligence for Seeing

A NASA scientist is working to develop an A-Eye. Oceanographer John Moisan said artificial intelligence will direct his A-Eye, a movable sensor. After analyzing images, his AI will not just find known patterns in new data but also steer the sensor to observe and discover new features or biological processes.

Moisan said existing AI and machine learning technologies don’t come close to replicating the kind of human visual processing or intelligence needed to make real-time decisions about unfamiliar data.

“A truly intelligent machine needs to be able to recognize when it is faced with something truly new and worthy of further observation,” Moisan said. “Most AI applications are mapping applications trained with familiar data to recognize patterns in new data. How do you teach a machine to recognize something it doesn’t understand, stop and say ‘What was that? Let’s take a closer look.’ That’s discovery.”

Finding and identifying new patterns in complex data is still the domain of human scientists, and how humans see plays a large part. Goddard AI expert James MacKinnon said scientists analyze large data sets using visualizations to reveal relationships between variables.

A complex data visualization tool called an embedding space allows scientists to manipulate different aspects or dimensions of the data in a multi-dimensional visualization. By carefully manipulating the scale of specific dimensions, the trained human eye finds relationships between different aspects of the data, which can be investigated to identify key variables.

“You need some way to take a perception of a scene and turn that into a decision and that’s really hard,” he said. “The scary thing, to a scientist, is to throw away data that could be valuable. An AI might prioritize what data to send first or have an algorithm that can call attention to anomalies, but at the end of the day, it’s going to be a scientist looking at that data that results in discoveries.”

An image of a coastal marshland combines aerial and satellite views in a technique similar to hyperspectral imaging. Combining data from multiple sources gives scientists information that can support environmental management. Image Credit: NASA/John Moisan

Other investigators like Goddard’s Bethany Theiling also hope to train an AI to look for new patterns or absences of various chemicals or organic compounds. Theiling is preparing for ocean worlds missions by creating artificial oceans that are different from Earth’s bodies of water, then analyzing them for data sets to train algorithms.

Moisan is developing his AI to interpret hyperspectral mapping images from complex aquatic and coastal regions.

“How do you pick out things that matter in a scan?” Moisan asked. “I want to be able to quickly point that hyperspectral imager at something swept up in the scan, so that from a remote area we can get whatever we need to understand the environmental scene.”

Using real-time estimates of surface topology and spatial patterns from a high-resolution camera, thermal emission from an infrared camera, and, of course, reflected light from a wide-angle “pushbroom” optical camera in flight, Moisan’s onboard AI would scan the analyzed data in real time to search for significant features, then steer an optical pointing instrument (the eye in A-Eye) to collect subnanometer hyperspectral reflectance data on those identified areas of interest.
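That decision loop, scan the analyzed data, flag what deviates from expectation, and point the instrument there, can be sketched in a few lines. The region grid, baseline value, and scoring rule below are invented for illustration and are not Moisan’s algorithm.

```python
# Hypothetical sketch of the A-Eye targeting decision: score each
# region of an analyzed scene by how far it deviates from an
# expected baseline, and steer the pointing instrument toward the
# most anomalous one. All data here is invented.

def pick_target(regions, baseline):
    """regions: dict of (row, col) -> feature value from the scan.
    Returns the grid cell that deviates most from the baseline,
    i.e. the one worth a closer hyperspectral look."""
    return max(regions, key=lambda cell: abs(regions[cell] - baseline))

# A mostly uniform scene with one outlier region at (1, 0).
scene = {(0, 0): 1.0, (0, 1): 1.1, (1, 0): 5.0, (1, 1): 0.9}
target = pick_target(scene, baseline=1.0)
```

A real system would replace the baseline comparison with learned models of “familiar” scenes, which is exactly the part Moisan argues current AI still struggles with.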

This year, Moisan trained the AI using observations from prior flights over the Delmarva Peninsula. Follow-up funding would help him complete the optical pointing goal.

Thinking machines are set to play a larger role in future exploration of our universe. Sophisticated computers taught to recognize chemical signatures that could indicate life processes, or landscape features like lava flows or craters, promise to increase the value of science data returned from lunar or deep-space exploration. v

CONTACT

John.R.Moisan@NASA.gov or 757.824.1312

Genetic Programming and Intelligent Swarms

The A-Eye is Goddard oceanographer John Moisan’s latest pursuit, but he has been working for NASA in artificial intelligence and distributed missions for years.

Moisan said an artificial intelligence technique called Genetic Programming might help develop the types of AI NASA needs to solve a variety of exploration challenges, from distributed missions, to identifying high-value science data.

“I’ve been doing genetic programming for 15 years,” he said, “combining segments of code to perform certain functions, analyzing their fitness capability and breeding what you’re looking for into successive generations of code.”

Much like natural selection guides biological evolution using DNA as the code, genetic programming incorporates a form of mutation by random substitutions and uses repeated rounds of selection to keep the fittest programs for reproduction. Each generation of code performs better than those that came before until the desired function is achieved.
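A minimal selection-and-mutation loop captures the flavor of this process, though real genetic programming evolves fragments of code rather than a single number. Everything here, the fitness function, mutation rule, and population sizes, is a toy illustration.

```python
# Toy evolutionary loop in the spirit of genetic programming:
# rank candidates by fitness, keep the best, and refill the
# population with mutated copies, so each generation's best is at
# least as fit as the last. The target-seeking problem is invented.

import random

def evolve(fitness, mutate, population, generations, keep=4):
    """Generic loop: fitness is minimized; the top `keep` survivors
    breed mutated offspring to refill the population each round."""
    for _ in range(generations):
        population.sort(key=fitness)
        survivors = population[:keep]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(len(population) - keep)]
    return min(population, key=fitness)

# Toy problem: evolve a number toward 42.
random.seed(1)
best = evolve(
    fitness=lambda x: abs(x - 42.0),
    mutate=lambda x: x + random.uniform(-1, 1),
    population=[random.uniform(0, 100) for _ in range(20)],
    generations=200,
)
```

Because survivors are carried forward unmodified, fitness can only improve generation over generation, the property Moisan describes as breeding the desired behavior into successive generations of code.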

Moisan also helped NASA tackle another AI problem related to an agency priority for future explorations: training a distributed swarm to efficiently explore a region without duplicating efforts.

“We would send autonomous boats onto the surface of the ocean to collect information,” he said. “We struggled with how to give each individual craft the ability to choose its own path without them all ending up in the same place. They needed central direction and decision making.”

Although the AI technology available at that time could not keep the boats from clumping together, Moisan is undeterred in pursuing innovations to unlock those higher functions.

Volume 19 • Issue 1 • Fall 2022 PAGE 11 cutting edge • goddard’s emerging technologies www.nasa.gov/gsfctechnology
Infrared images like this one from a marsh area on the Maryland/Virginia Eastern Shore coastal barrier and back bay regions reveal clues to scientists about plant health, photosynthesis, and other conditions that affect vegetation and ecosystems. Image Courtesy: John Moisan

Efficient, High Resolution Infrared Cameras

A new camera expands Goddard’s legacy of broad-spectrum infrared imagers by capturing more infrared wavelengths in an easy-to-reproduce format suited to many science applications.

Researchers using Goddard’s Type II Strained-Layer Superlattice (SLS) detectors say the new camera will be able to resolve molecular spectral signatures in the infrared fingerprint region (approximately 2 to 14 microns in wavelength) with high accuracy and resolution.

The expanded range enables a variety of science investigations: measuring atmospheric trace gases, sea ice properties, and infrared ocean color; analyzing vegetation; identifying and characterizing wildfires; surveying minerals; and studying how Earth’s atmosphere stores and emits heat.

The camera provides both high spectral resolution and sufficient image sharpness for these investigations, Principal Investigator Dr. Tilak Hewagama said.

“When you start looking at molecular compositions in the atmosphere,” he said, “then you need to have higher spectral resolution to resolve molecules’ signatures as seen in their reflected sunlight and thermal emission.”

His current project builds on a strong Goddard Internal Research and Development heritage. The innovative SLS detectors were developed by Murzy Jhabvala and his engineering team. Hewagama’s spectrometer builds on the successful Compact Thermal Imager (CTI) that flew on the International Space Station (CuttingEdge, Summer 2018, Page 19) and also used the SLS detector. CTI yielded 15 million images of Earth scenes at 40-meter spatial resolution in two spectral bands.

The new instrument's hyperspectral capabilities take advantage of the SLS detectors' resolution and their dependability, Hewagama said. “We can reliably manufacture those detectors with their proven performance and adaptability.”

These cameras operate at cryogenic temperatures using compact, low-power commercial coolers and are cost-efficient compared with competing mercury-cadmium-telluride IR sensors. The SLS-based cameras can also be adapted with a variety of gratings and filters to suit specific missions.

The newly assembled thermal imaging camera sits on a bench. Based on proven detectors with a record of success in orbit, this camera expands on both pixel and infrared spectrum resolution to enable a wide variety of science investigations. Photo Credit: Tilak Hewagama

Hewagama is working with researchers including Jhabvala, Don Jennings, and Emily Kan to build the new spectrometer in close collaboration with Goddard Earth scientists Doug Morton, Luke Oman, Dong Wu, and others. The current project also builds on the Goddard-developed Thermal Infrared Composite Imaging Spectrometer (TIRCIS) by planetary scientist Terry Hurford.

Expanding the IR Spectrum

“When you start looking at molecular compositions in the atmosphere, then you need to have higher spectral resolution to resolve molecules’ signatures as seen in their reflected sunlight and thermal emission.” - Dr. Tilak Hewagama

“When you have finer spectral resolution,” Oman said, “it becomes easier to separate carbon dioxide from ozone while also enabling measurement of concentrations of different trace gases in the atmosphere. Those trace gases are important for looking at changes over time and understanding processes going on in the atmosphere. It’s often very subtle changes – tens of parts per billion can be very important. That’s what you need the higher spectral resolution to confirm.”

Enabling high-resolution spectroscopy at longer wavelengths with SLS can bring the sensor’s resolving capabilities to crucial infrared science investigations, said Goddard research scientist Luke Oman, from studying the ionized upper atmosphere, or aeronomy, to geology and climate studies.

The broad-spectrum camera also allows scientists to distinguish sunlight reflected off an object or gas molecule, typically in visible and near-infrared light, from thermal radiation emitted by that source, typically at longer infrared wavelengths. This helps improve scientists’ understanding of solar heating and other changes in the environment, Hewagama said. v

CONTACT: Tilak.Hewagama@NASA.gov or 301.286.9130

The image on the right shows the improved resolution of the Strained-Layer Superlattice, or SLS, detector array compared with that of its predecessor, the Quantum Well Infrared Photodetector technology at left. Image Credit: NASA

Measuring Helium in the Exosphere

When solar weather reaches Earth, it heats the gas in the outer atmosphere and increases drag against satellites and other spacecraft in orbit. To better predict these effects of space weather, scientists need more information on what exactly happens up there.

Goddard heliophysicist Hyunju Connor plans to modify an instrument developed for the Mars Atmosphere and Volatile Evolution (MAVEN) mission to study the extreme upper reaches of Earth’s atmosphere – called the exosphere. This thin realm starts above 310 miles (500 km), and Connor specifically wants to measure helium in the region from 430 to 930 miles (approximately 700 to 1,500 km).

The Neutral Gas and Ion Mass Spectrometer (NGIMS) built by Goddard planetary scientists Paul Mahaffy and Mehdi Benna can accurately measure concentrations as low as 1,000 helium atoms per cubic centimeter, Connor said. She is using Internal Research and Development (IRAD) funding to adapt NGIMS to a small satellite – something the size of a medium to large kitchen appliance.

“The technology itself is already flight proven,” Connor said. “Now I want to bring that instrument to Earth to understand how our outer atmosphere behaves.”

The NGIMS instrument was tested in Earth’s upper atmosphere before being sent to Mars, Benna said. “We honed our techniques on planetary missions,” he said. “Now we are bringing all those measurement improvements back to Earth science.”

Solar weather stirs the diffuse gases of the upper atmosphere and exosphere, causing unpredictable amounts of drag on spacecraft. That subtle pressure of atoms striking a moving object affects satellites’ orbits as well as how accurately mission controllers can point spacecraft communications arrays at ground stations to transmit data back to Earth.

The engineering unit of the Neutral Gas and Ion Mass Spectrometer (NGIMS) is identical to the instrument currently orbiting Mars aboard the MAVEN spacecraft. In the lab are (left to right) planetary scientist Mehdi Benna, heliophysicist Hyunju Connor, and engineer Juan Raymond. ISEND adapts NGIMS’ flight-proven, high-heritage instrument to study Earth’s exosphere under the influence of the dynamic space environment. Photo Credit: Michael Giunto

Currently, mission controllers infer density changes in the outer atmosphere by measuring changes in spacecraft orbit-keeping maneuvers and pointing tasks, then calculating the additional fuel or reaction-wheel spin those tasks require. Direct, in-situ measurements like those Connor’s ISEND mission could provide would improve our understanding of the weather up there and help predict changes in drag on spacecraft for more accurate maneuvering operations.
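The drag such density data would constrain follows the standard atmospheric drag relation, a = ½ρv²C·A/m. A small sketch shows why direct density measurements matter; the densities, drag coefficient, cross-section, and spacecraft mass below are purely illustrative values, not mission figures:

```python
def drag_acceleration(rho, v, cd=2.2, area=1.0, mass=100.0):
    """Standard drag model: a = 0.5 * rho * v^2 * Cd * A / m (SI units)."""
    return 0.5 * rho * v ** 2 * cd * area / mass

v = 7500.0                                   # typical orbital speed, m/s
quiet = drag_acceleration(rho=1e-13, v=v)    # illustrative quiet-Sun density, kg/m^3
storm = drag_acceleration(rho=3e-13, v=v)    # illustrative storm-time density

# Drag scales linearly with density: if solar weather triples the density,
# it triples the drag, and with it the fuel orbit-keeping requires.
print('drag ratio:', storm / quiet)
```

Because drag is directly proportional to the density ρ, every uncertainty in the density estimate passes straight through to the drag prediction, which is what in-situ measurements would tighten.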

Connor said her ISEND concept would complement NASA’s Geospace Dynamics Constellation (GDC) mission, currently under development. GDC would use a constellation of six identical satellites to measure helium and heavier atoms at about 220 miles (350 km), where the International Space Station and many satellites orbit.

“If you could fly several small satellites above the GDC constellation,” Connor said, “it would provide a third dimension to better understand how solar weather affects our spacecraft.”

She became interested in Earth’s exosphere after studying soft X-rays emitted when solar winds interact with hydrogen. These gases fill an increasingly thin, hazy bubble that can extend beyond the orbit of the Moon, or more than 240,000 miles from Earth.

Connor said she is focusing on helium, a noble gas that is easier to measure but behaves similarly to hydrogen in response to space weather. The data ISEND returns will help fill gaps in our understanding of these thin exosphere environments, she said.

Studying the dynamic space environment is critical to understanding how Earth’s atmosphere evolves, our planet’s relationship with its star, and how to improve remote sensing for future terrestrial and planetary missions, Connor said. v

This image of Earth’s hydrogen corona was captured by the Lyman Alpha Imaging Camera (LAICA) on January 9, 2015. The circle in the center shows Earth’s position and size, and the Sun is to the left. The image shows how hydrogen atoms are extended by solar radiation pressure away from the Sun. These gases fill an increasingly thin, hazy bubble that can extend beyond the orbit of the Moon – more than 240,000 miles from Earth. Image Credit: S. Kameda/Rikkyo University

A Smaller, Simpler Mass Spectrometer for Exploration

Measuring individual charged particles is no easy task, but a mass spectrometer being developed at Goddard may help identify a wide variety of particles present in Earth’s auroras, on the surface of the Moon, or coming in from interstellar space.

The Ion Velocity Mass Spectrometer (IVMS) being developed by physicists Ed Sittler and Robert Michell requires less power and is easier to build and operate than traditional mass spectrometers, Sittler said. It also provides high angular resolution, enabling three-dimensional imaging of the local environment, whether in Earth orbit, on the Moon, or beyond.

“We can track the solar wind and measure its interaction with the surface of the Moon,” Sittler said. “We can capture backscattered ions from the wind hitting the surface dust.”

The more common time-of-flight mass spectrometers accelerate incoming ions using a strong electric field of known strength. They require ultra-thin carbon foils, high voltages up to 15 kilovolts, and high-speed synchronized electronics to measure the time between an accelerated ion passing through the carbon foil and striking the detector. The measured flight time depends on the ion’s mass-to-charge ratio, which allows scientists to determine its composition. These instruments can be large and power-hungry, Sittler said, making them unsuitable for smaller missions.
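The time-of-flight principle can be sketched numerically. An ion of charge q accelerated through potential V gains kinetic energy qV, so its flight time over a drift length L is t = L·sqrt(m / 2qV), scaling with the square root of the mass-to-charge ratio. The drift length below is an assumed illustrative value, not a figure from any flight instrument:

```python
import math

E_CHARGE = 1.602e-19   # elementary charge, coulombs
AMU = 1.661e-27        # atomic mass unit, kg

def flight_time(mass_amu, charge_e, drift_m=0.5, volts=15000.0):
    """t = L * sqrt(m / (2 q V)): heavier ions arrive later at the detector."""
    m = mass_amu * AMU
    q = charge_e * E_CHARGE
    return drift_m * math.sqrt(m / (2.0 * q * volts))

t_h = flight_time(1.0, 1)    # proton (H+)
t_o = flight_time(16.0, 1)   # O+, 16 times the proton mass

# O+ takes sqrt(16) = 4 times as long as H+ over the same drift length,
# which is how timing electronics separate the two species.
print('O+/H+ flight-time ratio:', t_o / t_h)
```

Resolving these sub-microsecond arrival differences is what drives the high voltages and fast synchronized electronics, and hence the size and power draw, of conventional time-of-flight designs.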

An IVMS instrument can operate on a few watts of power. Instead of accelerating ions, it uses permanent magnets and a small electric field to admit ions into the instrument. It can resolve solar wind and minor ions by calculating their mass-to-charge ratio, Sittler said, but does not resolve each particle’s charge state. It can also detune the major ions, but not the minor ions, so it can measure a wider range of particle intensities.

“This is a strategic instrument to enable new missions,” Goddard heliophysics technology lead Nikolaos Paschalidis said of the project. “IVMS is a new-generation, velocity-analysis-based mass spectrometer, which makes it low power and easier to incorporate into smaller spacecraft.”

Sittler said the instrument parts have been fabricated, and the team is waiting on components to finish the control board before testing the instrument with a calibrated ion beam. The instrument will then fly on the Ground Imaging to Rocket investigation of Auroral Fast Features (GIRAFF) sounding rocket mission out of Poker Flat, Alaska, planned for early 2024.

Sittler and his team are looking for oxygen and hydrogen ions that they believe are responsible for certain electron pulsations visible in auroras. “If O+ is not there,” Sittler explained, “it could mean there’s something missing from our theory.” v

CuttingEdge is published quarterly by the Office of the Chief Technologist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. The publication describes the emerging, potentially transformative technologies that Goddard is pursuing to help NASA achieve its missions. For more information about Goddard technology, visit the website listed below or contact Chief Technologist Peter Hughes, Peter.M.Hughes@NASA.gov. If you wish to be placed on the publication’s distribution list, contact Editor Karl Hille, Karl.B.Hille@NASA.gov.

Publication Number: NP-2022-9-867-GSFC

CONTACT: Edward.C.Sittler@NASA.gov or 301.286.9215
IVMS promises a smaller, simpler mass spectrometer for exploration. Image Courtesy: Ed Sittler