Sunday, August 28, 2011

Peering Inside the Flame: Fusion Imaging of the Final Space Shuttle Launch


Louise Walker and J.T. Heineck of the Experimental Aero-Physics Branch at NASA's Ames Research Center, Moffett Field, Calif., are learning how to see shape and detail in blindingly bright plumes of rocket fire. The two researchers were funded by the Space Shuttle Program to document the final shuttle launch, STS-135, with their distinctive images.

They first tested the technique as a challenge from a co-worker. "We were approached by an acoustics guy here at Ames who had a hobby rocket video," explained Walker. "He showed us the video and said, 'Can you take a better shot than this?' It had the typical view of a launch you see on film -- white blown-out flame on a dark background. Basically the flame is over-exposed. We knew that we needed image fusion to really see what was going on."

Image fusion is a technique that begins with image files taken simultaneously at nearly identical angles and positions, each with a different filter. The images are processed through minute alignment and warping to match camera angles precisely and account for the inches between each camera's position. The files are then transferred to software that combines each set of now identically framed images to highlight the different levels of detail captured in each. The processing software digitally removes saturated pure black or pure white pixels from one image and replaces them with the most detailed pixels in the set. The resulting image is sometimes called a high dynamic range image, referring to the different dynamic ranges -- exposure and brightness -- captured in each image.
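The pixel-selection step described above can be sketched in a few lines of NumPy. This is only an illustrative toy, not the researchers' actual pipeline: it assumes the frames are already aligned and warped, works in grayscale, and uses a simple "closeness to mid-gray" weighting to decide which frame's pixel is best exposed. The function and variable names are invented for the example.

```python
import numpy as np

def fuse_exposures(frames: np.ndarray) -> np.ndarray:
    """Combine aligned grayscale frames of shape (N, H, W), values in [0, 1].

    For each pixel, pick the value from the frame where that pixel is best
    exposed -- closest to mid-gray, farthest from the saturated extremes of
    pure black (0.0) or pure white (1.0).
    """
    # "Well-exposedness" weight: 1.0 at mid-gray, 0.0 at either extreme.
    weight = 1.0 - np.abs(frames - 0.5) * 2.0      # shape (N, H, W)
    best = np.argmax(weight, axis=0)               # best frame index per pixel
    rows, cols = np.indices(best.shape)
    return frames[best, rows, cols]

# Synthetic example: a bright "plume" gradient that saturates the long
# exposure while the short exposure keeps detail in the bright regions.
scene = np.linspace(0.0, 2.0, 16).reshape(4, 4)    # true radiance
short = np.clip(scene * 0.4, 0.0, 1.0)             # underexposed frame
long_ = np.clip(scene * 1.5, 0.0, 1.0)             # overexposed frame
fused = fuse_exposures(np.stack([short, long_]))
print(fused.shape)  # (4, 4)
```

In the fused result, the blown-out pixels of the long exposure are replaced by the still-detailed values from the short exposure, which is the essence of what the processing software does across the full camera set.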

Realizing this technique could be developed and applied to much larger rockets, Walker and Heineck began improving how such images might be taken. The researchers looked within Ames' labs for materials.

"I found some cameras that matched and some scrap aluminum, and built the frame," explains Walker. "Each camera sits on a brick-sized mount that rotates and slides, and the whole thing is sitting on top of a sturdy tripod we already had. It was the Apollo 13 game -- this is what we have, this is what we need to do, how do we make it work?"

After hearing about their initial results, researchers started asking them to image static rocket firing tests and launch abort motor tests, and finally a colleague from NASA's Marshall Space Flight Center, Huntsville, Ala., contacted them. Walker recalls, "Darrell Gaddy, a thermal analysis engineer, came to us and said, 'Hey, you guys should be doing a shuttle launch,' and we perked up and said, 'Yes, we agree!'"

Walker and Heineck arranged to image the STS-133 launch to support the shuttle debris tracking team, but the delays for that launch meant they had to leave before shooting it. For STS-134, they successfully shot the images that would create the first shuttle launch fusion video.

On June 27, 2011, Walker and Heineck trekked from California to NASA's Kennedy Space Center, Cape Canaveral, Fla., and set up their wall of cameras, affectionately called "Walle." At 1,250 feet from Atlantis on the launch pad, the team set up the equipment, aligned the cameras visually, then connected the control computer through a system of fiber optic networks provided by Kennedy's Experimental Imaging Lab and Photo Operations.

"All five visible cameras record to internal memory and we communicate to them through Ethernet connections," said Heineck. "Each camera goes to a network hub, and we talk to the hub from miles away through the fiber optic connection."

The STS-135 launch imaging has a couple of notable differences from the STS-134 images, including wider framing to capture more of the launch, and an added layer of non-visible data.

"For this last one, we worked with Darrell Gaddy to add a thermal infrared camera. This allows us to see detail in the plume that we can't see with cameras set up in the visible spectrum," said Walker. "Darrell has been fielding thermal imaging of launches for a while now, and we just jumped on his shoulders in adding these extra details."

"With the combined multiple layers, human eyes and brains can process what's going on and take it all in," Heineck said. "That's not possible using just your eyes while it's happening, or on a single camera's photograph or video."

The technique can have many other technical uses, including validating computer models of very bright events. With the layers of real data to compare against computer-generated information, researchers can better understand the structure of the plume when rockets fire, the motion of the flames flowing out of the rocket motor, and how to design optimal future motors.

"We're exploring options working with the arc jets at Ames, are looking at working with other labs, and have been working with a group making new hybrid sounding rockets," said Heineck. "With any high dynamic events -- welding, wildfires, industrial machining -- you can process much more data on detail and structure by using this technique than with a single setting in a camera."

The technique could have significant benefits for future space transportation systems, both by imaging new rocket motor development and by imaging the Ames arc jets, which simulate the aerothermodynamic heating a spacecraft endures during atmospheric re-entry in order to test thermal protection systems and materials.

"It was the intent all along to expand the image fusion techniques to include cameras with other parts of the spectrum -- X-ray, deep ultraviolet, and various other imaging methods can also be incorporated," said Heineck. "There are lots of applications we're anxious to try."

