Monday, June 28, 2010

NASA's Count Rises as More Land Slides: An Interview with Dalia Kirschbaum

When a deadly landslide killed nearly 100 people and forced the evacuation of 75,000 in Guatemala on May 30, NASA carefully documented it. And when more than 300 other rain-triggered landslides pulled the earth out from beneath towns and villages in China, Uganda, Bangladesh, Pakistan and other countries in 2010, NASA researched and documented each one.

Sudden, rain-induced landslides kill thousands each year, yet no one organization had consistently catalogued them to evaluate historical trends, according to landslide expert Dalia Kirschbaum of NASA's Goddard Space Flight Center. Three years ago, Kirschbaum set out to change that by creating a searchable inventory of landslides specifically triggered by rain.

WhatOnEarth spoke with Kirschbaum to understand how this tool might tell us more about when and where landslides are most likely to occur.

WhatOnEarth: What is a landslide?

Kirschbaum: Landslides occur when an environmental trigger like an extreme rain event -- often a severe storm or hurricane -- combines with gravity's downward pull to set soil and rock in motion. Conditions beneath the surface are often unstable already, so the heavy rains or other trigger act as the last straw that causes mud, rocks, or debris -- or all combined -- to move rapidly down mountains and hillsides. Unfortunately, people and property are often swept up in these unexpected mass movements.

Landslides can also be caused by earthquakes, surface freezing and thawing, ice melt, the collapse of groundwater reservoirs, volcanic eruptions, and erosion at the base of a slope from the flow of river or ocean water. But torrential rains most commonly activate landslides. Our NASA inventory only tracks landslides brought on by rain.

WhatOnEarth: What prompted you to develop the NASA landslide inventory?

Kirschbaum: The project was initially meant to evaluate a procedure for forecasting landslide hazards globally. Studying landslide hazards over large areas is a thorny, complicated task because data collection is not always accurate and complete from one country to another. Improving our record-keeping is a first step in determining how to move forward with landslide hazard and risk assessments.

As a byproduct, we knew the catalog would provide information on the timing, location, and impacts of the landslides, which is valuable for exploring the socio-economic effects of these disasters. The International Disaster Database, the largest of its kind, often does not record smaller landslide events or detail their human or property toll.

Each one of our landslide entries contains information on the date of the event; details about the location; the latitude and longitude; an indication of the size of the event; the trigger; economic or social damages; and the number of fatalities.
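
For illustration only, here is one way a single entry like that might be represented in code. The field names, types, and example values below are hypothetical, meant to mirror the attributes Kirschbaum lists; they are not NASA's actual database schema.

from dataclasses import dataclass
from typing import Optional

@dataclass
class LandslideEvent:
    # Hypothetical sketch of one inventory record; not the catalog's real schema.
    date: str                  # date of the event
    location: str              # descriptive details about the location
    latitude: float
    longitude: float
    size: str                  # indication of the size of the event
    trigger: str               # e.g. "rainfall"
    damages: Optional[str]     # reported economic or social damages, if any
    fatalities: Optional[int]  # number of fatalities, if reported

# Illustrative entry loosely based on the Guatemala event described above;
# the coordinates and counts are placeholders for demonstration only.
guatemala_may_2010 = LandslideEvent(
    date="2010-05-30", location="Guatemala", latitude=14.6, longitude=-90.5,
    size="large", trigger="rainfall", damages="75,000 evacuated",
    fatalities=100)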

WhatOnEarth: How is a landslide inventory useful or important?

Kirschbaum: As the catalog of events grows, we'll be able to extract more and more information about which countries have the highest number of landslide reports, highest number of fatalities, etc. We can also break down events by region, season, and latitude, which helps us identify some large-scale patterns. Though the database is limited by occasional reporting biases and incomplete data, the catalog indicates that the highest reported number of rainfall-triggered landslides and fatal landslides occur in South and Southeastern Asia.

We also believe that in the longer term, the catalog will enable us to identify patterns in the global and regional frequency of landslides with respect to El Niño and related climate effects.

WhatOnEarth: Have you used satellite observations for the inventory?

Kirschbaum: No. In a few instances we've been able to obtain satellite images over an area where a landslide is clearly visible. However, landslides typically occur over small areas. Satellites cannot generally "see" such fine ground details or do not pass over the affected area with the frequency necessary to capture when the landslide occurred.

We hope to use satellite imagery, for example from NASA’s Earth Observing 1 (EO-1) satellite, to evaluate the location and area of some larger landslides. This remains a work in progress.

WhatOnEarth: So, if satellites can't yet help you track landslides, how do you analyze each landslide event?

Kirschbaum: We have searched online literature – sources such as news reports, online journals and newspapers, and disaster databases -- for the years 2003 and from 2007 to the present. The landslide inventory is only as good as the availability and accuracy of the reports and sources used to develop it. The work can be tedious and time-consuming, so we've enlisted the help of several excellent graduate students to keep the inventory updated over the past three years.

Our database tries to capture as many rainfall-triggered landslides as possible, but this is often difficult due to limitations in the reporting of landslide hazards. The details surrounding an event -- especially when many landslides are triggered by a single very large rainfall event over a broad area -- can be incomplete or inaccurate, so we are continually trying to improve the cataloguing effort. At the end of this year we'll have a five-year record of events, which will give us more information with which to identify global trends.

WhatOnEarth: Is NASA's landslide inventory available only to scientists, or can anyone use it?

Kirschbaum: Our compilation methods were published in a scientific journal last year, and the actual inventory is now openly available to anyone on the Web. We'll be posting the inventory from January through June 2010 shortly.

Image Information: A massive landslide covered the Philippine village of Guinsaugon in 2006, killing roughly half of the 2,500 residents. Credit: U.S. Marine Corps/Lance Cpl. Raymond Petersen III (top). A map of landslide events in 2003, 2007, and 2008. Credit: NASA/Dalia Kirschbaum (above right).

-- Gretchen Cook-Anderson, NASA's Earth Science News Team

Friday, June 25, 2010

What is Consuming Hydrogen and Acetylene on Titan?

This artist's concept shows a mirror-smooth lake on the surface of the smoggy moon Titan.

Two new papers based on data from NASA's Cassini spacecraft scrutinize the complex chemical activity on the surface of Saturn's moon Titan. While non-biological chemistry offers one possible explanation, some scientists believe these chemical signatures bolster the argument for a primitive, exotic form of life or precursor to life on Titan's surface. According to one theory put forth by astrobiologists, the signatures fulfill two important conditions necessary for a hypothesized "methane-based life."

One key finding comes from a paper online now in the journal Icarus that shows hydrogen molecules flowing down through Titan's atmosphere and disappearing at the surface. Another paper online now in the Journal of Geophysical Research maps hydrocarbons on the Titan surface and finds a lack of acetylene.

This lack of acetylene is important because that chemical would likely be the best energy source for methane-based life on Titan, said Chris McKay, an astrobiologist at NASA Ames Research Center, Moffett Field, Calif., who proposed a set of conditions necessary for this kind of methane-based life on Titan in 2005. One interpretation of the acetylene data is that the hydrocarbon is being consumed as food. But McKay said the flow of hydrogen is even more critical because all of the mechanisms he proposed involve the consumption of hydrogen.

"We suggested hydrogen consumption because it's the obvious gas for life to consume on Titan, similar to the way we consume oxygen on Earth," McKay said. "If these signs do turn out to be a sign of life, it would be doubly exciting because it would represent a second form of life independent from water-based life on Earth."

To date, methane-based life forms are only hypothetical. Scientists have not yet detected this form of life anywhere, though there are liquid-water-based microbes on Earth that thrive on methane or produce it as a waste product. On Titan, where temperatures are around 90 Kelvin (minus 290 degrees Fahrenheit), a methane-based organism would have to use a substance that is liquid as its medium for living processes, but not water itself. Water is frozen solid on Titan's surface and much too cold to support life as we know it.

The list of liquid candidates is very short: liquid methane and related molecules like ethane. While liquid water is widely regarded as necessary for life, there has been extensive speculation published in the scientific literature that this is not a strict requirement.

The new hydrogen findings are consistent with conditions that could produce an exotic, methane-based life form, but do not definitively prove its existence, said Darrell Strobel, a Cassini interdisciplinary scientist based at Johns Hopkins University in Baltimore, Md., who authored the paper on hydrogen.

Strobel, who studies the upper atmospheres of Saturn and Titan, analyzed data from Cassini's composite infrared spectrometer and ion and neutral mass spectrometer in his new paper. The paper describes densities of hydrogen in different parts of the atmosphere and the surface. Previous models had predicted that hydrogen molecules, a byproduct of ultraviolet sunlight breaking apart acetylene and methane molecules in the upper atmosphere, should be distributed fairly evenly throughout the atmospheric layers.

Strobel found a disparity in the hydrogen densities that leads to a flow of hydrogen down to the surface at a rate of about 10,000 trillion trillion molecules per second. This is about the same rate at which the molecules escape out of the upper atmosphere.
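
For a rough sense of scale, that number can be converted into a mass flow. A back-of-the-envelope sketch, assuming "10,000 trillion trillion" means 1e28 molecules per second and using the standard molar mass of molecular hydrogen:

# Convert roughly 1e28 H2 molecules per second into kilograms per second.
AVOGADRO = 6.022e23        # molecules per mole
H2_MOLAR_MASS = 2.016e-3   # kilograms per mole of molecular hydrogen
flux_molecules = 1e28      # "10,000 trillion trillion" molecules per second
mass_flow = flux_molecules / AVOGADRO * H2_MOLAR_MASS
print(round(mass_flow, 1), "kg of H2 per second")   # roughly 30 kg/s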

"It's as if you have a hose and you're squirting hydrogen onto the ground, but it's disappearing," Strobel said. "I didn't expect this result, because molecular hydrogen is extremely chemically inert in the atmosphere, very light and buoyant. It should 'float' to the top of the atmosphere and escape."

Strobel said it is not likely that hydrogen is being stored in a cave or underground space on Titan. The Titan surface is also so cold that a chemical process that involved a catalyst would be needed to convert hydrogen molecules and acetylene back to methane, even though overall there would be a net release of energy. The energy barrier could be overcome if there were an unknown mineral acting as the catalyst on Titan's surface.

The hydrocarbon mapping research, led by Roger Clark, a Cassini team scientist based at the U.S. Geological Survey in Denver, examines data from Cassini's visual and infrared mapping spectrometer. Scientists had expected the sun's interactions with chemicals in the atmosphere to produce acetylene that falls down to coat the Titan surface. But Cassini detected no acetylene on the surface.

In addition, Cassini's spectrometer detected an absence of water ice on the Titan surface, but loads of benzene and another material, which appears to be an organic compound that scientists have not yet been able to identify. The findings lead scientists to believe that the organic compounds are shellacking the water ice that makes up Titan's bedrock with a film of hydrocarbons at least a few millimeters to centimeters thick, and possibly much deeper in places. The ice remains covered up even as liquid methane and ethane flow all over Titan's surface and fill up lakes and seas much as liquid water does on Earth.

"Titan's atmospheric chemistry is cranking out organic compounds that rain down on the surface so fast that even as streams of liquid methane and ethane at the surface wash the organics off, the ice gets quickly covered again," Clark said. "All that implies Titan is a dynamic place where organic chemistry is happening now."

The absence of detectable acetylene on the Titan surface can very well have a non-biological explanation, said Mark Allen, principal investigator with the NASA Astrobiology Institute Titan team. Allen is based at NASA's Jet Propulsion Laboratory in Pasadena, Calif. Allen said one possibility is that sunlight or cosmic rays are transforming the acetylene in icy aerosols in the atmosphere into more complex molecules that would fall to the ground with no acetylene signature.

"Scientific conservatism suggests that a biological explanation should be the last choice after all non-biological explanations are addressed," Allen said. "We have a lot of work to do to rule out possible non-biological explanations. It is more likely that a chemical process, without biology, can explain these results - for example, reactions involving mineral catalysts."

"These new results are surprising and exciting," said Linda Spilker, Cassini project scientist at JPL. "Cassini has many more flybys of Titan that might help us sort out just what is happening at the surface."

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. JPL, a division of the California Institute of Technology, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter was designed, developed and assembled at JPL.

For more information about the Cassini-Huygens mission visit http://www.nasa.gov/cassini and http://saturn.jpl.nasa.gov.

Thursday, June 24, 2010

NASA Langley to Break Ground on Hydro Impact Basin

What goes up must come down, and it will be NASA Langley Research Center's job to make sure that when astronauts return from space, they land safely.

On June 8, NASA Langley will break ground on a $1.7 million Hydro Impact Basin that will serve to validate and certify that future space vehicles, such as NASA's Orion crew module, are designed for safe water landings.

The water basin will be 115 feet (35 m) long, 90 feet (27.4 m) wide and 20 feet (6.1 m) deep and will be built at the west end of Langley's historic Landing and Impact Research Facility, also known as the Gantry, where Neil Armstrong trained for walking on the moon. Construction will begin mid-June and will be completed by December 2010.

A series of water impact tests will be conducted using Orion drop test articles beginning in the spring of 2011. These tests will initially validate and improve the computer models of impact and acoustic loads used in the design and engineering process, and will ultimately qualify the final vehicle design for flight.

"We are excited about being a part of the nation's next space vehicle and it's landing system," said Lynn Bowman, who is managing the series of tests for the Orion project. "Our team has been involved with furthering the knowledge and testing of space vehicle landing systems and their components for the past few years."

The skill sets that NASA Langley engineers and technicians bring to the table as well as the capability of the gantry are two of the reasons the basin is being built at the center.

Bowman explains: "The Gantry provides the ability to control the orientation of the test article while imparting a vertical and horizontal impact velocity, which is required for human rating vehicles."

"This existing capability when combined with the water basin will provide a complete facility needed for landing certification of any manned spacecraft for water landing," added Bowman. "Even vehicles that do not perform a nominal water landing will need to certify for launch abort landings into water."

Additionally, NASA Langley has more than 40 years' experience with conducting controlled impact and landing tests of instrumented vehicles, said Lisa Jones, head of the Structural Testing Branch at NASA Langley.

NASA Langley's Gantry, built in 1963, was originally used to model lunar gravity. But after the Apollo program ended, it was transformed into the Impact Dynamics Research Facility and was used to test the crashworthiness of aircraft and rotorcraft.

In 2006 the Gantry experienced a revitalization as the country shifted its focus back to space exploration. The 240-foot (73 m) high Gantry provided engineers and astronauts a means to prepare for Orion's return to Earth.

When testing began in 2006, it was thought that a dry landing on Earth would be the preferred landing for the Orion capsule as it returned from space. During this phase, engineers studied the use of airbags during landings and dropped a total of 73 test articles, including a full-scale model of the Crew exploration vehicle, with different generations of airbags attached to the bottom.

More tests followed, including a series that evaluated the crew module's energy absorbing seat system, which protects the crew during a wide range of landing conditions. Langley engineers designed and built a 20,000-pound (9,072 kg) piece of steel hardware called the Crew Impact Attenuation System (CIAS) test article, which was dropped onto crushable honeycomb material sized to represent a broad range of landing conditions Orion could face.

In all, 117 drop tests were performed.

"This team really cranked out high quality testing and excellent analysis," said Bowman, who managed the Orion Landing System Team. "117 tests is a record."

Now that the ground-landing tests are complete and the decision has been made to design Orion for water landings, the team at NASA Langley is ready to shift its focus to water. The team has already gotten its feet wet with a series of elemental water impact tests that began this past fall.

During these tests engineers dropped a 20-inch (50.8 cm) hemisphere from five feet (1.5 m) into a four-foot (1.2 m) deep pool so that they could build confidence in a design tool they might use to analyze data during the full-scale water impact tests to be done at the basin.
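
As a rough illustration of what such a drop involves, the impact speed at the water surface follows from the drop height alone. This is a simple free-fall estimate that neglects air drag, which is negligible over five feet:

import math

g = 9.81                   # gravitational acceleration, m/s^2
h = 1.5                    # drop height in meters (about 5 ft)
v = math.sqrt(2 * g * h)   # free-fall speed at water entry
print(round(v, 1), "m/s")  # about 5.4 m/s, roughly 12 mph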

Monday, June 21, 2010

NASA Images Show Oil's Invasion Along Louisiana Coast

Multiple cameras on JPL's MISR instrument on NASA's Terra spacecraft were used to create two unique views of oil moving into Louisiana's coastal wetlands.

These images, acquired on May 24, 2010 by the Multi-angle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra spacecraft, show the encroachment of oil from the former Deepwater Horizon rig into Louisiana's wildlife habitats. The source of the spill is located off the southeastern (bottom right) edge of the images.

Dark filaments of oil are seen approaching the shores of Blind Bay and Redfish Bay at the eastern edge of the Mississippi River delta, and also nearing Garden Island Bay and East Bay farther to the south. These areas are home to many varieties of fish. To the north, the arc-shaped pattern of land and runoff is associated with the Chandeleur Islands, which are part of the Breton National Wildlife Refuge. This refuge is the second oldest in the United States and is a habitat for dozens of seabird, shorebird and waterfowl species. Oil is reported to have reached the islands on May 6. Eighteen days later, this image shows filaments of oil crossing the island barrier -- which had been heavily eroded by Hurricane Katrina in 2005 -- and entering the Breton and Chandeleur Sounds.

The left-hand image contains data from MISR's vertical-viewing camera. It is shown in near-true color, except that data from the instrument's near-infrared band, where vegetation appears bright, have been blended with the instrument's green band to enhance the appearance of vegetation.

The Mississippi River delta is located below the image center. The slick is seen approaching the delta from the lower right, and filaments of oil are also apparent farther to the north (towards the top). The oil is made visible by sun reflecting off the sea surface at the same angle from which the instrument is viewing it, a phenomenon known as sunglint. Oil makes the surface look brighter under these viewing conditions than it would if no oil were present. However, other factors can also cause enhanced glint, such as reduced surface wind speed. To separate glint patterns due to oil from these other factors, additional information from MISR's cameras is used in the right-hand image.
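
A minimal sketch of the geometry at work may help: glint is strongest where the sensor's line of sight nearly coincides with the mirror reflection of the sun's rays off the water. The function below is illustrative only -- it assumes a perfectly flat surface and simple unit vectors in a local frame, and it is not MISR's actual glint calculation.

import numpy as np

def glint_angle(sun_dir, view_dir, normal=(0.0, 0.0, 1.0)):
    # Angle in degrees between the specular reflection of the incoming solar
    # ray and the direction from the surface toward the sensor. Small angles
    # mean strong sunglint. sun_dir points from the sun toward the surface;
    # view_dir points from the surface toward the sensor.
    s = np.asarray(sun_dir, float);  s /= np.linalg.norm(s)
    v = np.asarray(view_dir, float); v /= np.linalg.norm(v)
    n = np.asarray(normal, float);   n /= np.linalg.norm(n)
    r = s - 2.0 * np.dot(s, n) * n   # mirror reflection about the surface normal
    return np.degrees(np.arccos(np.clip(np.dot(r, v), -1.0, 1.0)))

# Sun 30 degrees from zenith, sensor looking straight down: the view is
# 30 degrees off the specular direction, so only modest glint at nadir.
sun = [np.sin(np.radians(30)), 0.0, -np.cos(np.radians(30))]
print(round(glint_angle(sun, [0.0, 0.0, 1.0]), 1))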

Previous MISR imagery of the spill shows that the contrast of the oil against the surroundings is enhanced by using a combination of vertical views and oblique-angle views. The right-hand panel was constructed by combining data from several MISR channels. In this false-color view, oil appears in shades of inky blue to black; silt-laden water due to runoff from the Mississippi River shows up as orange, red and violet; and land and clouds appear in shades of cyan.

The images cover an area measuring 110 by 119 kilometers (68 by 74 miles).

Read more at http://photojournal.jpl.nasa.gov/catalog/?IDNumber=pia13174

Friday, June 18, 2010

Backwards Black Holes Might Make Bigger Jets

An artist's concept of a galaxy with a supermassive black hole at its core.

Going against the grain may turn out to be a powerful move for black holes. New research suggests supermassive black holes that spin backwards might produce more ferocious jets of gas. The results have broad implications for how galaxies change over time.

"A lot of what happens in an entire galaxy depends on what's going on in the miniscule central region where the black hole lies," said theoretical astrophysicist David Garofalo of NASA's Jet Propulsion Laboratory in Pasadena, Calif. Garofalo is lead author of a new paper that appeared online May 27 in the Monthly Notices of the Royal Astronomical Society. Other authors are Daniel A. Evans of the Massachusetts Institute of Technology, Cambridge, Mass., and Rita M. Sambruna of NASA Goddard Space Flight Center, Greenbelt, Md.

Black holes are immense distortions of space and time with gravity that is so great, even light itself cannot escape. Astronomers have known for more than a decade that all galaxies, including our own Milky Way, are anchored by tremendous, so-called supermassive black holes, containing billions of suns' worth of mass. The black holes are surrounded and nourished by disks of gas and dust, called accretion disks. Powerful jets stream out from below and above the disks like lasers, and fierce winds blow off from the disks themselves.

The black holes can spin either in the same direction as the disks, called prograde black holes, or against the flow - the retrograde black holes. For decades, astronomers thought that the faster the spin of the black hole, the more powerful the jet. But there were problems with this "spin paradigm" model. For example, some prograde black holes had been found with no jets.

Garofalo and his colleagues have been busy flipping the model on its head. In previous papers, they proposed that the backward, or retrograde, black holes spew the most powerful jets, while the prograde black holes have weaker or no jets.

The new study links the researchers' theory with observations of galaxies across time, or at varying distances from Earth. They looked at both "radio-loud" galaxies with jets, and "radio-quiet" ones with weak or no jets. The term "radio" comes from the fact that these particular jets shoot out beams of light mostly in the form of radio waves.

The results showed that more distant radio-loud galaxies are powered by retrograde black holes, while relatively closer radio-quiet objects have prograde black holes. According to the team, the supermassive black holes evolve over time from a retrograde to a prograde state.

"This new model also solves a paradox in the old spin paradigm," said David Meier, a theoretical astrophysicist at JPL not involved in the study. "Everything now fits nicely into place."

The scientists say that the backward black holes shoot more powerful jets because there's more space between the black hole and the inner edge of the orbiting disk. This gap provides more room for the build-up of magnetic fields, which fuel the jets -- an idea known as the Reynolds conjecture, after the theoretical astrophysicist Chris Reynolds of the University of Maryland, College Park.
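
That gap can be made concrete with the standard general-relativistic formula for the innermost stable circular orbit around a spinning black hole (Bardeen, Press and Teukolsky, 1972). The sketch below is a textbook calculation offered for illustration, not the authors' model; spin a runs from -1 (maximally retrograde) to +1 (maximally prograde), and radii come out in units of GM/c^2.

import math

def isco_radius(a):
    # Innermost stable circular orbit for dimensionless spin a, in units of
    # GM/c^2. Positive a: the disk co-rotates with the hole (prograde);
    # negative a: the disk counter-rotates (retrograde).
    z1 = 1 + (1 - a * a) ** (1 / 3) * ((1 + a) ** (1 / 3) + (1 - a) ** (1 / 3))
    z2 = math.sqrt(3 * a * a + z1 * z1)
    sign = 1.0 if a >= 0 else -1.0
    return 3 + z2 - sign * math.sqrt((3 - z1) * (3 + z1 + 2 * z2))

print(round(isco_radius(0.9), 2))    # prograde spin 0.9:   ~2.32, disk edge close in
print(round(isco_radius(-0.9), 2))   # retrograde spin 0.9: ~8.72, a much larger gap

The several-fold difference in where the disk's inner edge sits is the extra room for magnetic field build-up that the Reynolds conjecture invokes.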

"If you picture yourself trying to get closer to a fan, you can imagine that moving in the same rotational direction as the fan would make things easier," said Garofalo. "The same principle applies to these black holes. The material orbiting around them in a disk will get closer to the ones that are spinning in the same direction versus the ones spinning the opposite way."

Jets and winds play key roles in shaping the fate of galaxies. Some research shows that jets can slow and even prevent the formation of stars not just in a host galaxy itself, but also in other nearby galaxies.

"Jets transport huge amounts of energy to the outskirts of galaxies, displace large volumes of the intergalactic gas, and act as feedback agents between the galaxy's very center and the large-scale environment," said Sambruna. "Understanding their origin is of paramount interest in modern astrophysics."

The California Institute of Technology, Pasadena, manages JPL for NASA.

Thursday, June 17, 2010

X-51A Makes Longest Scramjet Flight

An engine first validated in a NASA wind tunnel successfully made the longest supersonic combustion ramjet-powered hypersonic flight to date off the southern California coast on May 26.

The air-breathing scramjet engine, built by Pratt & Whitney Rocketdyne, burned for more than 200 seconds to accelerate the U.S. Air Force's X-51A vehicle to Mach 5, or five times the speed of sound. It broke the previous record for the longest scramjet burn in a flight test, set by NASA's X-43 vehicle.

"This is great news for the hypersonics community," said Jim Pittman, principal investigator for the Hypersonics Project of NASA's Fundamental Aeronautics Program. "It's also good for NASA's research into flight at Mach 5 or faster. We will receive the X-51 flight data for analysis and comparison to the data we obtained during ground tests at NASA Langley's 8-Foot High Temperature Tunnel and to predictions from our propulsion codes."

Air Force officials called the test -- the first of four planned -- an unqualified success. The flight is considered the first use of a practical hydrocarbon-fueled scramjet in flight.

"We are ecstatic to have accomplished most of our test points on the X-51A's very first hypersonic mission," said program manager Charlie Brink of the Air Force Research Laboratory at Wright-Patterson Air Force Base in Dayton, Ohio. "We equate this leap in engine technology as equivalent to the post-World War II jump from propeller-driven aircraft to jet engines."

The X-51A launched from Edwards Air Force Base in California, carried aloft under the left wing of an Air Force Flight Test Center B-52 Stratofortress. It was released while the B-52 flew at 50,000 feet over the Point Mugu Naval Air Warfare Center Sea Range in the Pacific Ocean. After release, an Army Tactical Missile solid rocket booster accelerated the X-51A to about Mach 4.8 before it and a connecting interstage were jettisoned. The launch and separation were normal, according to Brink.

The SJX61-2 engine that powered the X-51A test vehicle successfully completed ground tests simulating Mach 5 flight conditions at NASA's Langley Research Center, Hampton, Va., in 2008.

Once the X-51A was free of its booster and interstage, its SJY61 engine ignited, initially on a mix of ethylene, similar to lighter fluid, and JP-7 jet fuel, then exclusively on JP-7 jet fuel. The flight reached an altitude of about 70,000 feet and a peak speed of Mach 5.
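
For context, Mach number depends on the local speed of sound, which in turn depends on air temperature. A rough sketch, assuming a standard-atmosphere temperature of about 217 kelvins near 70,000 feet (an assumption for illustration, not a measured flight value):

import math

gamma, R = 1.4, 287.05          # ratio of specific heats and gas constant for air (J/kg/K)
T = 217.0                       # assumed stratospheric temperature at ~70,000 ft, kelvins
a = math.sqrt(gamma * R * T)    # local speed of sound
print(round(a), "m/s; Mach 5 is about", round(5 * a), "m/s, roughly 3,300 mph")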

Onboard sensors transmitted data to an airborne U.S. Navy P-3, as well as to ground systems at Point Mugu, Vandenberg and Edwards Air Force bases in California. The flight was terminated after about 200 seconds of engine operation because of a technical issue. The X-51A was not designed to be recovered for examination, so engineers are busily examining the data to identify the cause of the problem.

Four X-51A cruisers have been built for the Air Force and the Defense Advanced Research Projects Agency by industry partners Pratt & Whitney Rocketdyne, West Palm Beach, Fla., and The Boeing Company, Palmdale, Calif. Brink said the Air Force intends to fly the three remaining X-51A flight test vehicles this fall on virtually identical flight profiles, building knowledge from each successive flight.

"This first flight was the culmination of a six-year effort by a small, but very talented AFRL, DARPA, NASA and industry development team," Brink said. "Now we will go back and really scrutinize our data. No test is perfect, and I'm sure we will find anomalies that we will need to address before the next flight. But anyone will tell you that we learn just as much, if not more, when we encounter a glitch."

The engine can produce between 400 and 1,000 pounds of thrust. Like a conventional jet engine, the SJY61 is capable of adjusting thrust throughout the X-51's flight envelope.

Hypersonic flight presents unique technical challenges with heat and pressure, which make conventional turbine engines impractical. Program officials said producing thrust with a scramjet has been compared to lighting a match in a hurricane and keeping it burning.

Wednesday, June 16, 2010

AVIRIS

AVIRIS, the Airborne Visible/Infrared Imaging Spectrometer, flew over the Gulf oil spill in a NASA ER-2 aircraft from NASA's Dryden Flight Research Center, Edwards, Calif.

Tuesday, June 15, 2010

Small Near-Earth Object Probably a Rocket Part

Graphic depicting the trajectory of near-Earth object 2010 KQ.

Scientists at NASA's Near-Earth Object Program Office at NASA's Jet Propulsion Laboratory in Pasadena, Calif., have determined that a small object that safely passed Earth on May 21 is more than likely the upper stage of a rocket that carried a spacecraft on an interplanetary trajectory.

"The orbit of this object is very similar to that of the Earth, and one would not expect an object to remain in this type of orbit for very long," said Paul Chodas, a scientist at NASA's Near-Earth Object Program Office at the Jet Propulsion Laboratory in Pasadena, Calif.

Observations by astronomer S.J. Bus, using the NASA-sponsored Infrared Telescope Facility on Mauna Kea, Hawaii, indicate that 2010 KQ's spectral characteristics do not match any of the known asteroid types, and the object's absolute magnitude (28.9) suggests it is only a few meters in size.
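
The jump from absolute magnitude to size can be sketched with the standard brightness-to-diameter relation used for asteroids, D(km) = 1329 * 10^(-H/5) / sqrt(albedo). The reflectivity of a spent rocket stage is not known, so the albedo values below are illustrative assumptions spanning dark to moderately bright surfaces:

import math

def diameter_m(h_mag, albedo):
    # Standard asteroid size-from-brightness relation, converted to meters.
    return 1329.0 / math.sqrt(albedo) * 10 ** (-h_mag / 5.0) * 1000.0

for albedo in (0.05, 0.15, 0.25):
    print(albedo, "->", round(diameter_m(28.9, albedo), 1), "m")
# roughly 10 m for a dark surface down to about 4 m for a brighter one,
# consistent with "only a few meters in size"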

2010 KQ was discovered by astronomer Richard Kowalski at the NASA-sponsored Catalina Sky Survey in the mountains just north of Tucson, Ariz., on May 16. Five days later, it made its closest approach to Earth at a distance just beyond the moon's orbit. The object is departing Earth's neighborhood but will be returning in 2036.

"At present, there is a 6 percent probability that 2010 KQ will enter our atmosphere over a 30-year period starting in 2036," said Chodas. "More than likely, additional observations of the object will refine its orbit and impact possibilities. Even in the unlikely event that this object is headed for impact with Earth, whether it is an asteroid or rocket body, it is so small that it would disintegrate in the atmosphere and not cause harm on the ground."

NASA detects, tracks and characterizes asteroids and comets passing close to Earth using both ground- and space-based telescopes. The Near-Earth Object Observations Program, commonly called "Spaceguard," discovers these objects, characterizes a subset of them, and plots their orbits to determine if any could be potentially hazardous to our planet.

JPL manages the Near-Earth Object Program Office for NASA's Science Mission Directorate in Washington. JPL is a division of the California Institute of Technology in Pasadena.

More information about asteroids and near-Earth objects is at: http://www.jpl.nasa.gov/asteroidwatch.

Monday, June 14, 2010

NASA Takes to the Air With New 'Earth Venture' Research Projects

JPL's Carbon in Arctic Reservoirs Vulnerability Experiment will bridge critical gaps in our knowledge and understanding of Arctic ecosystems, links between the Arctic water and terrestrial carbon cycles, and the effects of fires and thawing permafrost.

Hurricanes, air quality and Arctic ecosystems are among the research areas to be investigated during the next five years by new NASA airborne science missions announced today.

The five competitively-selected proposals, including one from NASA's Jet Propulsion Laboratory, Pasadena, Calif., are the first investigations in the new Venture-class series of low-to-moderate-cost projects established last year.

The Earth Venture missions are part of NASA's Earth System Science Pathfinder program. The small, targeted science investigations complement NASA's larger research missions. In 2007, the National Research Council recommended that NASA undertake these types of regularly solicited, quick-turnaround projects.

This year's selections are all airborne investigations. Future Venture proposals may include small, dedicated spacecraft and instruments flown on other spacecraft.

"I'm thrilled to be able to welcome these new principal investigators into NASA's Earth Venture series," said Edward Weiler, associate administrator of the agency's Science Mission Directorate in Washington. "These missions are considered a 'tier 1' priority in the National Research Council's Earth Science decadal survey. With this selection, NASA moves ahead into this exciting type of scientific endeavor."

The missions will be funded during the next five years at a total cost of not more than $30 million each. The cost includes initial development and deployment through analysis of data. Approximately $10 million was provided through the American Recovery and Reinvestment Act toward the maximum $150 million funding ceiling for the missions.

Six NASA centers, 22 educational institutions, nine U.S. or international government agencies and three industrial partners are involved in these missions. The five missions were selected from 35 proposals.

The selected missions are:

1. Carbon in Arctic Reservoirs Vulnerability Experiment. Principal Investigator Charles Miller, NASA's Jet Propulsion Laboratory in Pasadena, Calif.

The release and absorption of carbon from Arctic ecosystems and its response to climate change are not well known because of a lack of detailed measurements. This investigation will collect an integrated set of data that will provide unprecedented experimental insights into Arctic carbon cycling, especially the release of important greenhouse gases such as carbon dioxide and methane. Instruments will be flown on a Twin Otter aircraft to produce the first simultaneous measurements of surface characteristics that control carbon emissions and key atmospheric gases.

2. Airborne Microwave Observatory of Subcanopy and Subsurface. Principal Investigator Mahta Moghaddam, University of Michigan

North American ecosystems are critical components of the global exchange of the greenhouse gas carbon dioxide and other gases within the atmosphere. To better understand the size of this exchange on a continental scale, this investigation addresses the uncertainties in existing estimates by measuring soil moisture in the root zone of representative regions of major North American ecosystems. Investigators will use NASA's Gulfstream-III aircraft to fly synthetic aperture radar that can penetrate vegetation and soil to depths of several feet.

3. Airborne Tropical Tropopause Experiment. Principal Investigator Eric Jensen, NASA's Ames Research Center in Moffett Field, Calif.

Water vapor in the stratosphere has a large impact on Earth's climate, the ozone layer and how much solar energy Earth retains. To improve our understanding of the processes that control the flow of atmospheric gases into this region, investigators will launch four airborne campaigns with NASA's Global Hawk remotely piloted aerial systems. The flights will study chemical and physical processes at different times of year from bases in California, Guam, Hawaii and Australia.

4. Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality. Principal Investigator James Crawford, NASA's Langley Research Center in Hampton, Va.

Satellites can measure air quality factors like aerosols and ozone-producing gases in an entire column of atmosphere below the spacecraft, but distinguishing the concentrations at the level where people live is a challenge. This investigation will provide integrated data of airborne, surface and satellite observations, taken at the same time, to study air quality as it evolves throughout the day. NASA's B-200 and P-3B research aircraft will fly together to sample a column of the atmosphere over instrumented ground stations.

5. Hurricane and Severe Storm Sentinel. Principal Investigator Scott Braun, NASA's Goddard Space Flight Center in Greenbelt, Md.

The prediction of the intensity of hurricanes is not as reliable as predictions of the location of hurricane landfall, in large part because of our poor understanding of the processes involved in intensity change. This investigation focuses on studying hurricanes in the Atlantic Ocean basin using two NASA Global Hawks flying high above the storms for up to 30 hours. The Hawks will deploy from NASA's Wallops Flight Facility in Virginia during the 2012 to 2014 Atlantic hurricane seasons.

"These new investigations, in concert with NASA's Earth-observing satellite capabilities, will provide unique new data sets that identify and characterize important phenomena, detect changes in the Earth system and lead to improvements in computer modeling of the Earth system," said Jack Kaye, associate director for research of NASA's Earth Science Division in the Science Mission Directorate.

Langley manages the Earth System Science Pathfinder program for the Science Mission Directorate. The missions in this program provide an innovative approach to address Earth science research with periodic windows of opportunity to accommodate new scientific priorities.

For information about NASA and agency programs, visit: http://www.nasa.gov .

Friday, June 11, 2010

NASA Satellite Spots Oil at Mississippi Delta Mouth

Oil from the Deepwater Horizon spill laps around the mouth of the Mississippi River delta in this May 24, 2010, image from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft. The oil appears silver, while vegetation is red.

On May 24, 2010, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft captured this false-color, high-resolution view of the very tip of the Mississippi River delta. Ribbons and patches of oil that have leaked from the Deepwater Horizon well offshore appear silver against the light blue color of the adjacent water. Vegetation is red.

In the sunglint region of a satellite image--where the mirror-like reflection of the sun gets blurred into a wide, bright strip--any differences in the texture of the water surface are enhanced. Oil smoothes the water, making it a better "mirror." Oil-covered waters are very bright in this image, but, depending on the viewing conditions (time of day, satellite viewing angle, slick location), oil-covered water may look darker rather than brighter.

The relative brightness of the oil from place to place is not necessarily an indication of the amount of oil present. Any oil located near the precise spot where the sun's reflection would appear, if the surface of the Gulf were perfectly smooth and calm, is going to look very bright in these images. The cause of the dark patch of water in the upper left quadrant of the image is unknown. It may indicate the use of chemical dispersants, skimmers or booms, or it may be the result of natural differences in turbidity, salinity or organic matter in the coastal waters.

Thursday, June 10, 2010

Astronomers Discover New Star-Forming Regions in Milky Way

An artist's conception of our Milky Way galaxy.

Astronomers studying the Milky Way have discovered a large number of previously unknown regions where massive stars are being formed. Their discovery, made with the help of NASA's Spitzer Space Telescope, provides important new information about the structure of our home galaxy and promises to yield new clues about its composition.

The star-forming regions the astronomers sought, called H II regions, are sites where hydrogen atoms are stripped of their electrons by intense radiation from massive, young stars. To find these regions, hidden from visible-light detection by the Milky Way's gas and dust, the researchers used infrared and radio telescopes.

"We found our targets by using the results of infrared surveys done with NASA's Spitzer Space Telescope and of surveys done with the National Science Foundation's Very Large Array radio telescope," said astronomer Loren Anderson of the Astrophysical Laboratory of Marseille in France, who worked on the project. "Objects that appear bright in both the Spitzer and Very Large Array images we studied are good candidates for H II regions."

Further analysis allowed the astronomers to determine the locations of the H II regions. They found concentrations of the regions at the end of the galaxy's central bar and in its spiral arms. Their analysis also showed that 25 of the regions are farther from the galaxy's center than the sun.

Read more at http://www.nrao.edu/pr/2010/gbthiiregions/ .

Wednesday, June 9, 2010

NASA's Webb Telescope Has 'Made It' to New York City!

The Webb telescope full-scale model lit up at night in Munich, Germany, in 2009.

The James Webb Space Telescope has finally made the "big time," at least according to the old Frank Sinatra song "New York, New York." The life-sized model of NASA's next-generation space telescope is being set up in New York City's Battery Park for the 2010 World Science Festival, which runs June 1-6. The opening ceremony will be held in front of the model on June 1.

As the song goes, "if (the Webb telescope) can make it there, it'll make it anywhere" and scientists are hoping that it will safely arrive in its orbit one million miles from Earth.

"The World Science Festival is a great opportunity for people to get a look at, and learn more about, the future of astronomy from space," said Eric Smith, NASA's Webb Program Scientist. "The Webb telescope full scale model dramatically highlights how far the next generation of space telescopes will be from its predecessors. It’s unlike any telescope you’ve ever seen."

The James Webb Space Telescope is the next-generation premier space observatory, exploring deep space phenomena from distant galaxies to nearby planets and stars. The telescope will give scientists clues about the formation of the universe and the evolution of our own solar system, from the first light after the Big Bang to the formation of star systems capable of supporting life on planets like Earth.

For six days in June, New York City residents can get a free look at the full-scale model of the Webb telescope as it sits on display in Battery Park. The model can be viewed from 9:00 a.m. EDT on Tuesday, June 1, through 9:00 p.m. EDT on Sunday, June 6. The full-scale model is highly detailed. It is constructed mainly of aluminum and steel, weighs 12,000 pounds, and is approximately 80 feet long, 40 feet wide and 40 feet tall -- about the size of a tennis court. The model requires two trucks to ship it, and assembly takes a crew of 12 approximately four days. The model will be lit from its base so that nighttime viewers can take in all the details.

The full-scale model of the James Webb Space Telescope was built by the prime contractor, Northrop Grumman, to provide a better understanding of the size, scale and complexity of this satellite.

Visitors will also be able to learn about what the Webb telescope is going to show scientists. They can play with interactive exhibits, watch videos about what the Webb will be exploring in the cosmos, and even ask a scientist about the telescope.

On Friday June 4, from 8-9:30 p.m. EDT, there will be a special event at the base of the full-sized model, called "From the City to the Stars," where scientists will talk about the possible discoveries that the Webb telescope could make.

The event is also free and open to the public. Dr. John Mather, Nobel laureate and the Webb telescope's senior project scientist; Dr. John Grunsfeld, astronaut, physicist and "chief repairman" of the Hubble Space Telescope; and planetary astronomer Dr. Heidi Hammel will be at the event to talk about the discoveries anticipated from the Webb telescope. NASA Deputy Administrator Lori Garver will be a featured speaker at the Festival kick-off. She will share with the New York audience NASA's strong commitment to continued scientific discovery, with missions like the Webb telescope, and talk about some of the other exciting endeavors on NASA's new path forward.

Since 2005, the model has journeyed to Florida, Germany, Ireland and Washington, D.C. The actual Webb space telescope is going a lot further, about a million miles from Earth!

Related Links:

> "From the City to the Stars"
> World Science Festival
> James Webb Space Telescope
> Model on display in Washington, DC - May 10-12, 2007

Tuesday, June 8, 2010

Why NASA Keeps a Close Eye on the Sun's Irradiance

Sunspots are darker areas of the sun that have lower solar irradiance than other areas.

For more than two centuries, scientists have wondered how much heat and light the sun expels, and whether this energy varies enough to change Earth's climate. In the absence of a good method for measuring the sun's output, the scientific conversation was often heavy with speculation.

That began to change in 1976, when Jack Eddy, a solar astronomer from Boulder, Colo., examined historical records of sunspots and published a seminal paper showing that some century-long variations in solar activity are connected with major climatic shifts. Eddy helped show that an extended lull in solar activity during the 17th century -- called the Maunder Minimum -- was likely connected to a decades-long cold period on Earth called the "Little Ice Age."

Two years after Eddy published his paper, NASA launched the first in a series of satellite instruments called radiometers, which measure the amount of sunlight striking the top of Earth's atmosphere, or total solar irradiance. Radiometers have provided unparalleled details about how the sun's irradiance has varied in the decades since. Such measurements have helped validate and expand upon Eddy's findings. And they've led to a number of other discoveries—and questions—about the sun.

Without radiometers, scientists would probably still wonder how much energy the sun emits and whether it varies with the sunspot cycle. They wouldn't know of the competition between dark sunspots and bright spots called faculae that drives irradiance variations.

And they’d have little chance of answering a question that continues to perplex solar experts today: Has overall irradiance changed progressively throughout the past three 11-year cycles, or are variations in the sun's irradiance limited to a single cycle?

The answer has important implications for understanding climate change, as some scientists have suggested that trends in solar irradiance account for a significant portion of global warming.

The next space radiometer, slated for launch this November aboard NASA's Glory satellite, should help chip away at the uncertainty surrounding the sun's role in climate change.

A Variable Sun

It's well known today that the sun's irradiance fluctuates constantly in conjunction with sunspots, which become more and less abundant every 11 years due to turbulent magnetic fields that course through the sun's interior and erupt onto its surface.

But as recently as the 1970s, scientists assumed that the sun’s irradiance was unchanging; the amount of energy it expels was even called the "solar constant."

It was data from radiometers aboard Nimbus 7, launched in 1978, and the Solar Maximum Mission, launched two years later, that sounded the death knell for the solar constant. Soon after launch, instruments aboard both satellites showed that solar irradiance changed significantly as patches of sunspots rotated across the sun's surface. Irradiance would fall, for example, when groups of sunspots faced Earth. And it would recover when the sunspots rotated to the far side of the sun.

Like sunspots, solar prominences are more likely to occur during the most active part of the solar cycle.

Likewise, in 2003, a radiometer aboard NASA's Solar Radiation and Climate Experiment (SORCE) satellite observed large sunspot patches that caused irradiance to drop by as much as 0.34 percent, the largest short-term decrease ever recorded.

"When you look at longer scales on the sun, it's the opposite," said Lean, a solar scientist at the U.S. Naval Research Laboratory in Washington, D.C., and a member of Glory's science team. "Overall, irradiance actually increases when the sun is more active even though sunspots are more common."

How can increases in dark, cool sunspots yield increases in irradiance? "It didn't make much sense until we were able to show that sunspots are just half of the story," said Lean.

Measurements collected during the 1980s and 1990s gave scientists the evidence they needed to prove that irradiance is actually a balance between darkening from sunspots and brightening from accompanying hot regions called faculae, a word meaning "bright torch" in Latin.

When solar activity increases, as it does every 11 years or so, both sunspots and faculae become more numerous. But during the peak of a cycle, the faculae brighten the sun more than sunspots dim it.

Overall, radiometers show that the sun’s irradiance changes by about 0.1 percent as the number of sunspots varies from about 20 sunspots or less per year during periods of low activity (solar minimum) to between 100 and 150 during periods of high activity (solar maximum).
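
To put those percentages into absolute terms, they can be scaled by the mean total solar irradiance, roughly 1,361 watts per square meter at the top of Earth's atmosphere (an approximate reference value, not a figure from this article):

TSI = 1361.0                # approximate mean total solar irradiance, W/m^2
cycle_swing = 0.001 * TSI   # the ~0.1 percent variation over a solar cycle
sunspot_dip = 0.0034 * TSI  # the 0.34 percent short-term drop SORCE saw in 2003
print(round(cycle_swing, 1), "W/m^2 over a cycle;",
      round(sunspot_dip, 1), "W/m^2 during the 2003 dip")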

“That may seem like a tiny amount, but it’s critical we understand even these small changes if we want to understand whether the sun's output is trending up or down and affecting climate,” said Greg Kopp, a principal investigator for Glory and scientist at the Laboratory for Atmospheric and Space Physics at the University of Colorado in Boulder.

Though most scientists believe the 0.1 percent variation is too subtle to explain all of the recent warming, it's not impossible that long-term patterns -- proceeding over hundreds or thousands of years -- could cause more severe swings that could have profound impacts on climate.

Although sunspots cause a decrease in irradiance, they're accompanied by bright white blotches called faculae that cause an overall increase in solar irradiance.

Searching for a Trend Line

A total of 10 radiometers have monitored the sun since Nimbus 7, and by patching all of the measurements together into one data stream, scientists have tried to identify whether the sun's irradiance has increased or decreased over the last three cycles.

However, melding the results from different instruments has proven complicated because many of the radiometers record slightly different absolute measurements. And the areas of overlap between instruments in the long-term record aren't as robust as scientists would like.

As a result, questions remain about how the sun's irradiance has changed. Richard Willson, principal investigator for NASA's Active Cavity Radiometer Irradiance Monitor (ACRIM), reported in a 2003 paper that the overall brightness of the sun was increasing by 0.05 percent per decade.

Subsequent assessments of the same data have come to a different conclusion. Other groups of scientists have shown that the apparent upward trend is actually an artifact of the radiometers and how they degrade in orbit. Complicating the issue further, an instrument aboard NASA's Solar and Heliospheric Observatory (SOHO) measured irradiance levels during a solar minimum in 2008 that were actually lower than the previous solar minimum.

Which measurements are right? Has the sun experienced subtle brightening or dimming during the last few solar cycles? Such questions remain controversial, but the radiometer aboard Glory, called the Total Irradiance Monitor (TIM), is ready to provide answers. The Glory TIM will be more accurate and stable than previous instruments because of unique optical and electrical advances. And each of its components has undergone a rigorous regime of calibrations at a newly-built facility at the University of Colorado.

“It’s a very exciting time to be studying the sun,” said Lean. “Every day there's something new, and we’re on the verge of answering some very important questions.”

Monday, June 7, 2010

Spacecraft Reveals Small Solar Events Have Large Scale Effects

NASA's Solar Dynamics Observatory, or SDO, has allowed scientists for the first time to comprehensively view the dynamic nature of storms on the sun. Solar storms have been recognized as a cause of technological problems on Earth since the invention of the telegraph in the 19th century.

The Atmospheric Imaging Assembly (AIA), one of three instruments aboard SDO, allowed scientists to discover that even minor solar events are never truly small scale. Shortly after AIA opened its doors on March 30, scientists observed a large eruptive prominence on the sun's edge, followed by a filament eruption a third of the way across the star's disk from the eruption.

"Even small events restructure large regions of the solar surface," said Alan Title, AIA principal investigator at Lockheed Martin Advanced Technology Center in Palo Alto, Calif. "It's been possible to recognize the size of these regions because of the combination of spatial, temporal and area coverage provided by AIA."

The AIA instrument also has observed a number of very small flares that have generated magnetic instabilities and waves with clearly observed effects over a substantial fraction of the solar surface. The instrument is capturing full-disk images in eight different temperature bands that span 10,000 to 36 million degrees Fahrenheit. This allows scientists to observe entire events that are very difficult to discern by looking in a single temperature band, at a slower rate, or over a more limited field of view.
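
For readers more comfortable with kelvins, those Fahrenheit figures convert as follows (a simple unit conversion, included only for orientation):

def fahrenheit_to_kelvin(f):
    # Standard Fahrenheit-to-Kelvin conversion.
    return (f - 32.0) * 5.0 / 9.0 + 273.15

print(round(fahrenheit_to_kelvin(1.0e4)))   # about 5,800 K, the low end of the range quoted above
print(round(fahrenheit_to_kelvin(3.6e7)))   # about 20 million K, the upper end of the range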

The data from SDO is providing a torrent of new information and spectacular images to be studied and interpreted. Using AIA's high-resolution and nearly continuous full-disk images of the sun, scientists have a better understanding of how even small events on our nearest star can significantly impact technological infrastructure on Earth.

Solar storms produce disturbances in electromagnetic fields that can induce large currents in wires, disrupting power lines and causing widespread blackouts. The storms can interfere with global positioning systems, cable television, and communications between ground controllers and satellites and airplane pilots flying near Earth's poles. Radio noise from solar storms also can disrupt cell phone service.

Launched in February 2010, the spacecraft completed its commissioning May 14, when NASA confirmed that all three of its instruments had successfully passed an on-orbit checkout, were calibrated and are collecting science data.

"We're already at five million images and counting," said Dean Pesnell, the SDO project scientist at NASA's Goddard Space Flight Center in Greenbelt, Md. "With data and images pouring in from SDO, solar scientists are poised to make discoveries that will rewrite the books on how changes in solar activity have a direct effect on Earth. The observatory is working great, and it's just going to get better."

Goddard built, operates and manages the SDO spacecraft for NASA's Science Mission Directorate in Washington. SDO is the first mission of NASA's Living with a Star Program. The program's goal is to develop the scientific understanding necessary to address those aspects of the sun-Earth system that directly affect our lives and society.