The current Geostationary Operational Environmental Satellite, GOES-13, captured this image of Hurricane Danielle heading for the north Atlantic (top center), Hurricane Earl with a visible eye hitting the Leeward Islands (bottom left) and developing Tropical Depression 8 (lower right) at 1:45 p.m. EDT on Aug. 30.
For more information visit http://www.nasa.gov/multimedia/imagegallery/image_feature_1749.html
A new experiment designed to reveal the origin and structure of the universe has reached its last stop on Earth before it’s set to ride into orbit aboard space shuttle Endeavour early next year.
The long-awaited Alpha Magnetic Spectrometer-2 (AMS) arrived Aug. 26 at NASA's Kennedy Space Center in Florida, secured in the belly of a U.S. Air Force C-5M cargo plane that made a late-morning touchdown on the shuttle's runway.
Nobel Prize-winning physicist Samuel Ting of the Massachusetts Institute of Technology nurtured AMS from concept to reality.
"I'm very pleased to be here," Ting said as he waited for the experiment's arrival. He was joined at the runway by several members of the international AMS team and the STS-134 astronaut crew.
Boasting a large magnet and state-of-the-art particle detector, AMS will use its lofty vantage point on the International Space Station's main truss to measure cosmic rays with unprecedented sensitivity and accuracy. In addition to a better understanding of cosmic radiation -- a major challenge of long-duration spaceflight -- the instrument could uncover evidence of mysterious dark matter or missing antimatter, discoveries that would help answer lingering questions about the universe and its beginnings.
"Over the last 50 years, all our knowledge about space has come from measuring light rays," Ting explained. "Hubble Telescope is a good example. But besides light rays, there are charged particles: electrons, positrons, protons, antiprotons, helium, and antihelium."
Ting and his scientific team believe that the best chance to detect these particles is in space, before they have hit Earth's atmosphere.
"And because it carries a charge, you need a magnet," he added.
Because AMS is the first experiment of its kind to fly in space for a long period of time, anything learned from it will be new knowledge.
"Nobody has really measured the charged-particle field precisely," Ting said. "So you enter into a new field."
The AMS instrument will be installed on the space station's main truss during the STS-134 mission, scheduled to be the last flight for space shuttle Endeavour. Led by Commander Mark Kelly, the mission's crew also comprises Pilot Gregory H. “Box” Johnson and Mission Specialists Michael Fincke, Greg Chamitoff, Andrew Feustel and European Space Agency astronaut Roberto Vittori.
AMS is expected to operate for the rest of the station's life, at least 10 years.
"It's a really neat design and as an astronaut, I appreciate the elegance of it," said Fincke. During the flight, the Endeavour astronauts will use the shuttle's robotic arm to remove AMS from the payload bay and hand it off to the station's arm.
"We're going to put it right on the space station. No bolts required, no human intervention," he explained. "Box Johnson's going to hit a couple buttons, and it's going to be captured automatically. The two umbilicals for power and data are going to stretch right in, and it'll be up and running."
Sponsored by the Department of Energy, AMS-2 was developed by an international team of 56 scientific institutions from 16 countries. The roughly 15,000-pound experiment was built and tested at the European Laboratory for Particle Physics, or CERN, in Switzerland.
"NASA's extremely excited to have AMS on board the International Space Station, because we think that it is a perfect experiment for the International Space Station," said Trent Martin, AMS project manager for the agency's Johnson Space Center in Houston.
"It shows you can bring together 500 physicists, engineers and technicians into a collaboration, build an experiment, launch it to the International Space Station, operate it for an extended period of time and hopefully get extremely exciting data that tells us something about the origins of the universe," Martin said.
Several members of the international AMS team gathered at the runway, excited to see the product of so many years of hard work finally on the ground at Kennedy. A cheer, followed by the clicking of camera shutters, met the cargo plane as it rolled onto the runway's parking apron for offloading.
Still in its packing crate, the 15-foot-wide, 13-foot-tall experiment was carefully removed from the cargo plane and transported to Kennedy's Space Station Processing Facility, where it will undergo final testing and integration before it's deemed ready to fly.
"We have our online testing that we have to do, which is basically making sure it works with the space station, making sure it can talk to the orbiter," said Joe Delai, payload mission manager for STS-134. "That should bring us to about the end of October, and in between October and February, the AMS folks will be calibrating their sensors. Then, we're ready for launch in February."
That's a sentiment shared by the entire team, including the STS-134 astronauts, who will have trained for this mission for about a year and a half when Endeavour is targeted to launch in February 2011.
"It's fitting that on its (Endeavour's) last assembly mission, the space station is going to be complete," STS-134 Commander Mark Kelly said. "It's important to note it's going to be completed with a very complex and, hopefully, very successful physics experiment. We look forward to seeing the results that Dr. Ting is going to produce over the next decade."
For more information visit http://www.nasa.gov/mission_pages/shuttle/behindscenes/ams_arrives.html
One of NASA's orbiting sentinels is expected to return to Earth in a few days. The agency's Ice, Cloud, and land Elevation Satellite (ICESat) completed a very productive scientific mission earlier this year. NASA lowered the satellite's orbit last month and then decommissioned the spacecraft in preparation for re-entry. It is estimated that the satellite will re-enter the Earth's atmosphere and largely burn up on or about August 29.
ICESat was launched in January 2003 as a three-year mission with a goal of returning science data for five years. It was the first mission of its kind -- specifically designed to study Earth's polar regions with a space-based laser altimeter called the Geoscience Laser Altimeter System, or GLAS.
ICESat's lasting legacy will be its impact on the understanding of ice sheet and sea ice dynamics. The mission has led to scientific advances in measuring changes in the mass of the Greenland and Antarctic ice sheets, polar sea ice thickness, vegetation-canopy heights, and the heights of clouds and aerosols. Using ICESat data, scientists identified a network of lakes beneath the Antarctic ice sheet. ICESat introduced new capabilities, technology and methods such as the measurement of sea ice freeboard -- the amount of ice and snow that protrudes above the ocean surface -- for estimating sea ice thickness.
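The freeboard technique rests on simple buoyancy: floating ice displaces its own weight in sea water, so the small fraction visible above the surface fixes the total thickness. The sketch below illustrates that idea only; the density values are nominal textbook assumptions, and ICESat's actual retrievals also account for snow loading and other corrections.

```python
# Sketch of the buoyancy (Archimedes) balance behind freeboard-based
# sea ice thickness estimates. Densities are nominal assumptions,
# not ICESat mission parameters, and snow load is ignored.

RHO_WATER = 1024.0  # kg/m^3, typical sea water
RHO_ICE = 917.0     # kg/m^3, typical sea ice

def thickness_from_freeboard(freeboard_m):
    """Estimate total ice thickness H from freeboard F (height of ice
    above the sea surface).

    Buoyancy balance: rho_ice * H = rho_water * (H - F)
    =>  H = F * rho_water / (rho_water - rho_ice)
    """
    return freeboard_m * RHO_WATER / (RHO_WATER - RHO_ICE)

# A small freeboard implies a roughly tenfold-larger total thickness:
# 0.4 m of freeboard corresponds to about 3.8 m of ice.
print(round(thickness_from_freeboard(0.4), 1))
```

Because the above-water fraction is only about 10 percent, small errors in the laser's freeboard measurement are amplified roughly tenfold in the thickness estimate, which is why altimeter precision matters so much for this technique.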
"ICESat has been a tremendous scientific success," said Jay Zwally, ICESat's project scientist at NASA's Goddard Space Flight Center in Greenbelt, Md. "It has provided detailed information on how the Earth's polar ice masses are changing with climate warming, as needed for government policy decisions. In particular, ICESat data showed that the Arctic sea ice has been rapidly thinning, which is critical information for revising predictions of how soon the Arctic Ocean might be mostly ice free in summer. It has also shown how much ice is being lost from Greenland and contributing to sea level rise. Thanks to ICESat we now also know that the Antarctic ice sheet is not losing as much ice as some other studies have shown."
The End of an Era
After seven years in orbit and 15 laser-operations campaigns, ICESat's science mission ended in February 2010 with the failure of its primary instrument. Because the spacecraft remained in operating condition, NASA's Science Mission Directorate accepted proposals for engineering tests to be performed using ICESat. These tests were completed on June 20. NASA's Earth Science Division then authorized the decommissioning of ICESat. After completing a review of decommissioning activities, the agency directed that ICESat be decommissioned by this August.
Mission flight controllers began firing ICESat's propulsion system thrusters on June 23 to lower its orbit. Thruster firings ended on July 14, safely reducing the lowest point of the spacecraft's orbit to 125 miles (200 km) above Earth's surface. The orbit has since naturally decayed. ICESat was successfully decommissioned from operations on Aug. 14. All remaining fuel on the spacecraft is now depleted, and atmospheric drag is slowly lowering ICESat's orbit until the spacecraft re-enters the Earth's atmosphere.
A statement from the Earth Science Mission Operations office summarized the achievement:
"The ICESat mission operations team is commended for its exceptional performance, working tirelessly for the past eleven years (four years of preparation and seven years of operations), overcoming several obstacles in the early years of the mission, and closing out the mission with a flawless series of orbital maneuvers before final decommissioning. The positive control maintained over the mission right to the end shows the quality and effort that went into designing, building, qualifying, launching, and operating a tremendously successful mission such as ICESat."
The Return to Planet Earth
The vast majority of ICESat will burn up in the atmosphere during re-entry. Of the spacecraft's total mass of about 2,000 pounds, only a small fraction will reach the surface of Earth: some pieces, weighing collectively about 200 pounds, are expected to survive re-entry. The risk of harm coming to anyone on Earth from this debris is estimated to be very low.
ICESat was not designed to perform a controlled re-entry and is unable to provide targeting to a particular location on Earth. ICESat circles the Earth from pole to pole, so surviving debris could land almost anywhere on the planet. Due to natural variability in the near-Earth environment, a precise location of where spacecraft debris will re-enter cannot be forecast. The U.S. Space Surveillance Network is closely monitoring the orbit of ICESat during its final days and will continue to issue periodic predictions of re-entry time and location. The NASA Orbital Debris Program Office will issue re-entry information based on these predictions.
NASA and international standards for space objects re-entering Earth's atmosphere do not require controlled re-entry but do have requirements and guidelines for the maximum risk posed by debris surviving re-entry.
"The ICESat team has done a marvelous job to ensure that the spacecraft is removed as a hazard to other spacecraft and as a potential source of future orbital debris," said Nicholas L. Johnson, NASA Chief Scientist for Orbital Debris at NASA's Johnson Space Center in Houston.
The Future Looks Bright
Despite the end of ICESat's mission, NASA's observations of Earth's polar regions continue. In anticipation of the ICESat mission coming to an end, and in accordance with the National Research Council's Decadal Survey of future NASA Earth science missions, NASA has begun development of ICESat-2, planned for launch in 2015. ICESat-2 will continue the science legacy of its predecessor, and improve our understanding of Earth's dynamic polar regions with new and advanced technology.
The Operation IceBridge airborne mission, started in 2009, is the largest airborne survey of Earth's polar ice ever flown. The mission is designed to partially fill the data gap between the ICESat and ICESat-2 satellite missions. For the next five years, instruments on NASA aircraft will target areas of rapid change to yield an unprecedented 3-D view of Arctic and Antarctic ice sheets, ice shelves, and sea ice. Targeted information from aircraft, combined with the broad and consistent coverage from satellites, contributes to a more complete understanding of Earth's response to climate change, helping scientists make better predictions of what the future might hold.
For more information visit http://www.nasa.gov/mission_pages/icesat/icesat-end.html
One of the instruments on a 2016 mission to orbit Mars will provide daily maps of global, pole-to-pole, vertical distributions of the temperature, dust, water vapor and ice clouds in the Martian atmosphere.
The joint European-American mission, ExoMars Trace Gas Orbiter, will seek faint gaseous clues about possible life on Mars. This instrument, called the ExoMars Climate Sounder, will supply crucial context with its daily profiling of the atmosphere's changing structure.
The European Space Agency and NASA have selected five instruments for ExoMars Trace Gas Orbiter. The European Space Agency will provide one instrument and the spacecraft. NASA will provide four instruments, including ExoMars Climate Sounder, which is coming from NASA's Jet Propulsion Laboratory, Pasadena, Calif.
Two of the other selected instruments are spectrometers -- one each from Europe and the United States -- designed to detect very low concentrations of methane and other important trace gases in the Martian atmosphere.
"To put the trace-gas measurements into context, you need to know the background structure and circulation of the atmosphere," said JPL's Tim Schofield, principal investigator for the ExoMars Climate Sounder. "We will provide the information needed to understand the distribution of trace gases identified by the spectrometers. We'll do this by characterizing the role of atmospheric circulation and aerosols, such as dust and ice, in trace-gas transport and in chemical reactions in the atmosphere affecting trace gases."
The ExoMars Climate Sounder is an infrared radiometer designed to operate continuously, day and night, from the spacecraft's orbit about 400 kilometers (about 250 miles) above the Martian surface. It can pivot to point downward or toward the horizon, measuring temperature, water vapor, dust and ices for each 5-kilometer (3-mile) increment in height throughout the atmosphere from ground level to 90 kilometers (56 miles) altitude.
Schofield and his international team have two other main goals for the investigation, besides aiding in interpretation of trace-gas detections.
One is to extend the climate mapping record currently coming from a similar instrument, the Mars Climate Sounder, on NASA's Mars Reconnaissance Orbiter, which has been working at Mars since 2006. The orbital geometry of the Mars Reconnaissance Orbiter mission enables this sounder to record atmospheric profiles only at about 3 p.m. and 3 a.m. during the Martian day, except near the poles. The ExoMars Trace Gas Orbiter will fly an orbital pattern that allows the spacecraft to collect data at all times of day, at all latitudes.
"We'll fill in information about variability at different times of day, and we'll add to the number of Mars years for understanding year-to-year variability," said Schofield. "The most obvious year-to-year change is that some years have global dust storms and others don't. We'd like to learn whether there's anything predictive for anticipating the big dust storms, and what makes them so variable from year to year."
A third research goal is to assist future landings on Mars by supplying information about the variable density of the atmosphere. At a chosen landing site, atmospheric density can change from one day to the next, affecting a spacecraft's descent.
"We want to provide background climatology for what to expect at a given site, in a given season, for a particular time of day, and also nearly real-time information for the atmospheric structure in the days leading up to the landing of a spacecraft launched after 2016," said Schofield.
The 2016 ExoMars Trace Gas Orbiter is the first in a series of planned Mars mission collaborations of the European Space Agency and NASA. A variable presence of small amounts of methane in the Martian atmosphere has been indicated from orbital and Earth-based observations. A key goal of the mission is to gain a better understanding of methane and other trace gases that could be evidence about possible biological activity. Methane can be produced both biologically and without life.
Besides the two spectrometers and the climate sounder, the orbiter's selected instruments include two NASA-provided imagers: a high-resolution, stereo, color imager, and a wide-angle, color, weather camera. The orbiter will also serve as a communications relay for missions on the surface of Mars and will carry a European-built descent-and-landing demonstration module designed to operate for a few days on the Mars surface. JPL, a division of the California Institute of Technology, manages NASA's roles in the mission.
For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-280
Unicorns and roses are usually the stuff of fairy tales, but a new cosmic image taken by NASA's Wide-field Infrared Survey Explorer (WISE) shows the Rosette nebula located within the constellation Monoceros, or the Unicorn.
This flower-shaped nebula, also known by the less romantic name NGC 2237, is a huge star-forming cloud of dust and gas in our Milky Way galaxy. Estimates of the nebula's distance vary from 4,500 to 5,000 light-years away.
At the center of the flower is a cluster of young stars called NGC 2244. The most massive stars produce huge amounts of ultraviolet radiation, and blow strong winds that erode away the nearby gas and dust, creating a large, central hole. The radiation also strips electrons from the surrounding hydrogen gas, ionizing it and creating what astronomers call an HII region.
Although the Rosette nebula is too faint to see with the naked eye, NGC 2244 is beloved by amateur astronomers because it is visible through a small telescope or good pair of binoculars. The English astronomer John Flamsteed discovered the star cluster NGC 2244 with a telescope around 1690, but the nebula itself was not identified until John Herschel (son of William Herschel, discoverer of infrared light) observed it almost 150 years later.
The streak seen at lower left is the trail of a satellite, captured as WISE snapped the multiple frames that make up this view.
This image is a four-color composite created by all four of WISE's infrared detectors. Color is representational: blue and cyan represent infrared light at wavelengths of 3.4 and 4.6 microns, which is dominated by light from stars. Green and red represent light at 12 and 22 microns, which is mostly light from warm dust.
JPL manages the Wide-field Infrared Survey Explorer for NASA's Science Mission Directorate, Washington. The principal investigator, Edward Wright, is at UCLA. The mission was competitively selected under NASA's Explorers Program managed by the Goddard Space Flight Center, Greenbelt, Md. The science instrument was built by the Space Dynamics Laboratory, Logan, Utah, and the spacecraft was built by Ball Aerospace & Technologies Corp., Boulder, Colo. Science operations and data processing take place at the Infrared Processing and Analysis Center at the California Institute of Technology in Pasadena. Caltech manages JPL for NASA.
For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-278
In early August 2005, Katrina was just a name. By September, it had become synonymous with the costliest and one of the deadliest tropical cyclones in U.S. history.
Five years later, NASA is revisiting Hurricane Katrina with a short video that shows the storm as captured by NASA satellites. NASA provides space-based satellite observations, field research missions, and computer climate modeling to further scientists' understanding of these storms. NASA also provides measurements and modeling of global sea surface temperatures, precipitation, winds and ocean heat content -- all ingredients that contribute to the formation of tropical cyclones (the general name for typhoons, tropical storms and hurricanes).
On Aug. 29, 2005, after passing over the Caribbean and Florida, Katrina made landfall along the Gulf Coast as a category 3 hurricane on the Saffir-Simpson scale. As hurricanes go, Katrina was actually only moderate in size when it reached the Mississippi and Louisiana coasts, having weakened from a category 5 the day before. However, Katrina had a very wide footprint, which caused a broad area of large ocean swells to develop within the Gulf of Mexico. As the hurricane made its final landfall, the resulting storm surge was massive and unrelenting. Ultimately, this storm surge was responsible for much of the damage as it flooded coastal communities, overwhelmed levees, and left at least 80 percent of New Orleans underwater.
By the time the hurricane subsided, Katrina had claimed more than 1,800 human lives and caused roughly $125 billion in damage.
As scientists and rescue organizations worked on the ground to prepare for the hurricane and assist in its wake, NASA provided data gathered from a series of Earth-observing satellites to help predict the hurricane's path and intensity. In the aftermath, NASA satellites also helped identify areas hardest hit.
In this 3 1/2-minute video created by NASA-TV producer Jennifer Shoemaker at NASA's Goddard Space Flight Center in Greenbelt, Md., viewers will see many different kinds of data NASA satellites gathered about the storm. The video contains a sampling of the kinds of things NASA studies about hurricanes. Various additional data products are created in hurricane and post-hurricane research that are not depicted in the video.
The video opens with Atlantic Ocean sea surface temperatures data from an instrument called AMSR-E (Advanced Microwave Scanning Radiometer - Earth Observing System) that flies aboard NASA's Aqua satellite. Warm ocean waters (of 80 degrees Fahrenheit or warmer) provided energy to fuel the growing storm. Next, the MISR (Multi-angle Imaging SpectroRadiometer) instrument on NASA's Terra satellite captured the growth of cloud tops in the gathering storm.
Just before landfall, the Tropical Rainfall Measuring Mission (TRMM) satellite data revealed "hot towers" hidden within the hurricane -- powerful thunderstorms that helped intensify Katrina. TRMM also captured data on rainfall amounts throughout the hurricane's lifecycle.
Finally, the video shows Landsat satellite imagery of New Orleans before and during the flooding, as well as a more recent view of a city still rebuilding from the hurricane some five years later.
Katrina was just one of 28 named tropical cyclones during the 2005 hurricane season, but due to the tragedy it caused, it remains the one most remembered. The World Meteorological Organization has since retired the name "Katrina" from its list of hurricane names. As such, there will never be another Hurricane Katrina.
Meanwhile, NASA satellites continue to provide data to study tropical cyclones around the world, helping forecasters make better predictions about storm behavior and helping hurricane response organizations prepare for storms yet to come. NASA also studies the effects of hurricanes long after a storm has passed, in order to better understand the impact of large storms -- knowledge that will ultimately aid restoration and preparation efforts in the future.
For more information visit http://www.nasa.gov/mission_pages/hurricanes/features/katrina-retrospective.html
Two extremely bright stars illuminate a greenish mist in this image from the Spitzer Space Telescope's "GLIMPSE360" survey. This mist is made of hydrogen and carbon compounds called polycyclic aromatic hydrocarbons (PAHs), which are also found on Earth in sooty vehicle exhaust and on charred grills. In space, PAHs form in the dark clouds that give rise to stars. These molecules give astronomers a way to visualize the peripheries of gas clouds and to study their structures in great detail. The PAHs are not actually green; they are color-coded in these images so that scientists can see their infrared glow.
This image is a combination of data from Spitzer and the Two-Micron All-Sky Survey (2MASS). The Spitzer data was taken after Spitzer's liquid coolant ran dry in May 2009, marking the beginning of its "warm" mission.
For more information visit http://www.nasa.gov/multimedia/imagegallery/image_feature_1736.html
This fall, NASA researchers will move one step closer to sailing among the stars.
Astrophysicists and engineers at the Marshall Space Flight Center in Huntsville, Ala., and the Ames Research Center in Moffett Field, Calif., have designed and built NanoSail-D, a "solar sail" that will test NASA's ability to deploy a massive but fragile spacecraft from an extremely compact structure. Much like the wind pushing a sailboat through water, solar sails rely on sunlight to propel vehicles through space. The sail captures the constant stream of particles of light, called photons, with giant sails built from a lightweight material. Over time, the steady pressure of these photons provides enough thrust for a small spacecraft to travel in space.
Many scientists believe that solar sails have enormous potential. Because they take advantage of sunlight, they don’t require the chemical fuel that spacecraft currently rely on for propulsion. Less fuel translates into lower launch weight, lower costs and fewer logistical challenges. Solar sails accelerate slowly but surely, capable of eventually reaching tremendous speeds. In fact, most scientists consider solar sailing the only reasonable way to make interstellar travel a reality.
Of course, it's not as easy as it sounds.
For scientists to really make use of solar sails, the sails must be huge. Because the push from each photon is so tiny, the sail needs to intercept as many photons as possible. It's almost like trying to fill a swimming pool with raindrops: the wider the pool, the more rain it captures. The same is true of solar sails and the sun's light. In fact, a NASA team in the 1970s predicted it would need a solar sail with a surface area of nearly 6 million square feet -- about the size of 10 square blocks in New York City -- to successfully employ a solar sail for space exploration.
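A back-of-the-envelope estimate shows why area matters so much. For a perfectly reflective sail facing the Sun at Earth's distance, the thrust is twice the intercepted photon momentum flux, F = 2SA/c. The sketch below is illustrative only: the solar constant, perfect reflectivity, and the sail areas (NanoSail-D's roughly 100 square feet and the 1970s concept's 6 million square feet, both from this article) are the stated assumptions.

```python
# Rough photon-pressure thrust on an ideal solar sail at 1 AU.
# Assumes a flat, perfectly reflective sail facing the Sun;
# real sails reflect imperfectly and are rarely face-on.

C = 2.998e8          # speed of light, m/s
SOLAR_FLUX = 1361.0  # solar constant at 1 AU, W/m^2

def sail_thrust_newtons(area_m2):
    # Perfect reflector: each photon's momentum is reversed,
    # so the force is twice the incident momentum flux: F = 2*S*A/c
    return 2.0 * SOLAR_FLUX * area_m2 / C

SQFT_TO_M2 = 0.0929  # square feet to square meters

# NanoSail-D-scale sail (~100 sq ft): tens of micronewtons
print(f"{sail_thrust_newtons(100 * SQFT_TO_M2):.1e} N")
# 1970s concept (~6 million sq ft): a few newtons
print(f"{sail_thrust_newtons(6e6 * SQFT_TO_M2):.1f} N")
```

Even the giant 1970s sail would feel only about the weight of an apple, which is why solar sails accelerate slowly but, with no fuel to exhaust, can keep accelerating indefinitely.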
That's where NanoSail-D comes in. As the first NASA solar sail deployed in low-Earth orbit, NanoSail-D will provide valuable insight into this budding technology.
"One of the most difficult challenges solar sails face is trying to deploy enormous but fragile spacecraft from extremely small and compact structures. We can't just attach a giant, fully spread sail to a rocket and launch it into space. The journey would shred the sail to pieces," said Dean Alhorn, NanoSail-D principal investigator and aerospace engineer at the Marshall Center.
"Instead, we need to pack it in a smaller and more durable container, launch that into space and deploy the solar sail from that container," Alhorn said. "With NanoSail-D, we're testing a technology that does exactly that."
One objective of the NanoSail-D project is to demonstrate the capability to pack and deploy a large sail structure from a highly compacted volume. This demonstration can be applied to deploy future communication antennas, sensor arrays or thin film solar arrays to power the spacecraft.
NanoSail-D will be deployed 400 miles up after it's launched this fall aboard a Minotaur IV rocket, part of the payload aboard the Fast, Affordable, Science and Technology Satellite, or FASTSAT. The relatively low deployment altitude means drag from Earth's atmosphere may dominate any propulsive push the sail gains from the sun, but the project represents a small first step toward eventually deploying solar sails at much higher altitudes.
When fully deployed, NanoSail-D has a surface area of more than 100 square feet and is made of CP1, a polymer no thicker than single-ply tissue paper. The first big challenge for researchers was to pack it into a container smaller than a loaf of bread and create a mechanism capable of unfolding the sail without tearing it.
"Think of how easily I can rip a piece of tissue paper with my hands," Alhorn said. "Designing a mechanism to unfurl a space sail about that thick without tearing is no easy task."
To accomplish their goal, engineers tightly wound the NanoSail-D sail around a spindle and packed it in the container.
During launch, NanoSail-D is stored inside FASTSAT. Once orbit is achieved, the NanoSail-D satellite will be ejected from the satellite bus and an internal timer will start counting down. When the timer reaches zero, four booms will quickly deploy and the NanoSail-D sail will start to unfold. Within just five seconds the sail will be fully unfurled.
"The deployment works in the exact opposite way of a carpenter's measuring tape," Alhorn explained. "With a measuring tape, you pull it out, which winds up a spring, and when you let it go it is quickly pulled back in. With NanoSail-D, we wind up the booms around the center spindle. Those wound-up booms act like the spring. Approximately seven days after launch, it deploys the sail off the center spindle."
Researchers designing NanoSail-D have faced more than their fair share of challenges. When the project was commissioned in 2008, NASA set a deadline of just four months to design and test the new technology. The team had to make decisions quickly, often using whatever parts happened to be available.
"It wasn't a question of going off and doing an exhaustive study of what components to use," Alhorn recalled. "There was no time for that. We said, 'Okay, this is the size of component we need, this is its function' -- and as soon as we found one that worked, we used it."
After months of work in 2008, researchers and engineers finally completed the sail, which was set to launch that August and orbit Earth for one to two weeks. Engineers integrated the flight unit on the Falcon 1, a launch vehicle designed and manufactured by SpaceX of Hawthorne, Calif., but unfortunately the rocket experienced launch failure and NanoSail-D never made it to orbit.
Fortunately, the team had built a spare. For the past two years, Alhorn and his team have worked to refine the second flight unit, hammering out manufacturing problems and cleaning up the spool and a few of the other internals. Besides flying in a higher orbit, the second NanoSail-D will remain in space for up to 17 weeks, a big increase from the original mission. The new orbit, 400 miles above Earth, also will allow more astronomers to photograph the sail as it glides across the night sky. Most of the mission has remained the same, however. For example, because the sail will deploy relatively close to Earth, researchers will have a difficult time detecting the slight solar effects.
After a few months, NanoSail-D will begin to drop out of orbit. This de-orbiting process will give NASA researchers information about how systems like NanoSail-D might one day be used to de-orbit old satellites at the end of their missions -- keeping them from becoming space junk.
For now, Alhorn and his team are anxiously awaiting NanoSail-D's second attempt.
"The most exciting thing about the upcoming launch is just being able to do it," he said. "To get a second chance is invigorating. You rarely get one like this -- that's what motivates me to get up and keep doing this."
After the NanoSail-D flight, Alhorn hopes to continue developing solar sails for NASA. He's already started to design FeatherSail, a next-generation solar sail that will rely on insights gained from the NanoSail-D mission to take solar sailing to the next level.
NanoSail-D, managed by the Marshall Center, will be the first NASA solar sail deployed in low Earth orbit. The sail payload was designed and built by NeXolve, a division of ManTech in Huntsville, Ala. The NanoSail-D project is a collaboration with the Nanosatellite Missions Office at Ames Research Center. The experiment is a combined effort between NASA; the U.S. Air Force's Space Development and Test Wing at Kirtland Air Force Base, N.M.; and the U.S. Army Space and Missile Defense Command and the Von Braun Center for Science & Innovation, both in Huntsville.
For more information visit http://www.nasa.gov/mission_pages/smallsats/10-109.html
This image shows the eruption of a galactic “super-volcano” in the massive galaxy M87, as witnessed by NASA's Chandra X-ray Observatory and NSF's Very Large Array (VLA). At a distance of about 50 million light years, M87 is relatively close to Earth and lies at the center of the Virgo cluster, which contains thousands of galaxies.
The cluster surrounding M87 is filled with hot gas glowing in X-ray light (and shown in blue) that is detected by Chandra. As this gas cools, it can fall toward the galaxy's center where it should continue to cool even faster and form new stars.
However, radio observations with the VLA (red) suggest that in M87 jets of very energetic particles produced by the black hole interrupt this process. These jets lift up the relatively cool gas near the center of the galaxy and produce shock waves in the galaxy's atmosphere because of their supersonic speed. The interaction of this cosmic “eruption” with the galaxy's environment is very similar to the 2010 eruption of the Eyjafjallajokull volcano in Iceland. With Eyjafjallajokull, pockets of hot gas blasted through the surface of the lava, generating shock waves that could be seen passing through the grey smoke of the volcano. This hot gas then rose up in the atmosphere, dragging the dark ash with it. This process can be seen in a movie of the Eyjafjallajokull volcano, where the shock waves propagating in the smoke are followed by the rise of dark ash clouds into the atmosphere.
In the analogy with Eyjafjallajokull, the energetic particles produced in the vicinity of the black hole rise through the X-ray emitting atmosphere of the cluster, lifting up the coolest gas near the center of M87 in their wake, much as the hot volcanic gases dragged up the clouds of dark ash. And just like the volcano here on Earth, shock waves can be seen when the black hole pumps energetic particles into the cluster gas.
For more information visit http://www.nasa.gov/multimedia/imagegallery/image_feature_1743.html
Earth has done an ecological about-face: Global plant productivity that once flourished under warming temperatures and a lengthened growing season is now on the decline, struck by the stress of drought.
NASA-funded researchers Maosheng Zhao and Steven Running, of the University of Montana in Missoula, discovered the global shift during an analysis of NASA satellite data. Compared with a six-percent increase spanning two earlier decades, the recent ten-year decline is slight -- just one percent. The shift, however, could impact food security, biofuels, and the global carbon cycle.
"We see this as a bit of a surprise, and potentially significant on a policy level because previous interpretations suggested that global warming might actually help plant growth around the world," Running said.
"These results are extraordinarily significant because they show that the global net effect of climatic warming on the productivity of terrestrial vegetation need not be positive -- as was documented for the 1980s and 1990s," said Diane Wickland, of NASA Headquarters and manager of NASA's Terrestrial Ecology research program.
Conventional wisdom based on previous research held that land plant productivity was on the rise. A 2003 paper in Science led by then University of Montana scientist Ramakrishna Nemani (now at NASA Ames Research Center, Moffett Field, Calif.) showed that global terrestrial plant productivity increased as much as six percent between 1982 and 1999. That's because for nearly two decades, temperature, solar radiation and water availability -- influenced by climate change -- were favorable for growth.
Setting out to update that analysis, Zhao and Running expected to see similar results as global average temperatures have continued to climb. Instead, they found that the impact of regional drought overwhelmed the positive influence of a longer growing season, driving down global plant productivity between 2000 and 2009. The team published their findings Aug. 20 in Science.
"This is a pretty serious warning that warmer temperatures are not going to endlessly improve plant growth," Running said.
The discovery comes from an analysis of plant productivity data from the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra satellite, combined with growing season climate variables including temperature, solar radiation and water. The plant and climate data are factored into an algorithm that describes constraints on plant growth at different geographical locations.
For example, growth is generally limited in high latitudes by temperature and in deserts by water. But regional limitations can vary in their degree of impact on growth throughout the growing season.
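The limiting-factor logic described above can be sketched as a toy model. The function below is purely illustrative -- the scalars, thresholds, and numbers are invented for the example and are not the actual MODIS productivity algorithm:

```python
def productivity(base_npp, temp_c, water_avail):
    """Toy plant-productivity model: a base growth rate scaled down by
    climate constraints (illustrative only, not the MODIS algorithm)."""
    # Temperature scalar: no growth at or below 0 C, full growth by 20 C
    t_scalar = min(max(temp_c / 20.0, 0.0), 1.0)
    # Water scalar: fraction of plant water demand actually met (0..1)
    w_scalar = min(max(water_avail, 0.0), 1.0)
    return base_npp * t_scalar * w_scalar

# High latitudes are temperature-limited; deserts are water-limited
arctic = productivity(100.0, 5.0, 0.9)   # cool but wet site
desert = productivity(100.0, 30.0, 0.1)  # hot but dry site
```

Because the scalars multiply, whichever constraint is smallest dominates -- which is why a regional drought can erase the benefit of a longer, warmer growing season.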
Zhao and Running's analysis showed that since 2000, high-latitude northern hemisphere ecosystems have continued to benefit from warmer temperatures and a longer growing season. But that effect was offset by warming-associated drought that limited growth in the southern hemisphere, resulting in a net global loss of land productivity.
"This past decade’s net decline in terrestrial productivity illustrates that a complex interplay between temperature, rainfall, cloudiness, and carbon dioxide, probably in combination with other factors such as nutrients and land management, will determine future patterns and trends in productivity," Wickland said.
Researchers are keen on maintaining a record of the trends into the future. For one reason, plants act as a carbon dioxide "sink," and shifting plant productivity is linked to shifting levels of the greenhouse gas in the atmosphere. Also, stresses on plant growth could challenge food production.
"The potential that future warming would cause additional declines does not bode well for the ability of the biosphere to support multiple societal demands for agricultural production, fiber needs, and increasingly, biofuel production," Zhao said.
"Even if the declining trend of the past decade does not continue, managing forests and croplands for multiple benefits to include food production, biofuel harvest, and carbon storage may become exceedingly challenging in light of the possible impacts of such decadal-scale changes," Wickland said.
For more information visit http://www.nasa.gov/topics/earth/features/plant-decline.html
The first flight of NASA's hurricane airborne research mission is scheduled to take off from Ft. Lauderdale, Fla., on Tuesday, Aug. 17. NASA's DC-8 research aircraft will be making a planned five-hour flight along the Gulf Coast from western Florida to Louisiana primarily as a practice run for the many scientific instruments aboard.
Mission scientists, instrument teams, flight crew and support personnel gathered in Fort Lauderdale this weekend to begin planning the six-week Genesis and Rapid Intensification Processes mission, or GRIP. NASA's DC-8, the largest of NASA's three aircraft taking part in the mission, is based at the Fort Lauderdale airport. The two other aircraft -- the WB-57 based in Houston and the autonomous Global Hawk flying out of southern California -- will join the campaign in about a week.
The target for Tuesday's "shakedown" flight is the remnants of Tropical Depression 5, a poorly organized storm system whose center is currently hugging the coasts of Mississippi and Louisiana and moving westward. While forecasters do not expect this storm system to strengthen significantly before it reaches landfall in Louisiana, the system offers the DC-8's seven instrument teams an opportunity to try out their equipment on possible convective storms. Rainfall rates, wind speed and direction below the airplane to the surface, cloud droplet sizes, and aerosol particle sizes are just some of the information that these instruments will collect.
GRIP science team members and project managers are now meeting daily at the airport to review weather forecasts and plan upcoming flights with their counterparts in two other airborne hurricane research missions sponsored by the National Oceanic and Atmospheric Administration (NOAA) and the National Science Foundation. Instrument teams are also working on their equipment onboard the DC-8 in preparation for the flight.
On Sunday, Aug. 15, NASA's Global Hawk completed a successful test flight over NASA's Dryden Flight Research Center in Edwards, Calif., that took the remotely piloted plane to an altitude of 60,000 feet. The last of three instruments being mounted on the Global Hawk for GRIP is being installed this week.
For more information visit http://www.nasa.gov/mission_pages/hurricanes/missions/grip/news/shakedown-flight.html
Rain drops are fat and snowflakes are fluffy, but why does it matter in terms of predicting severe storms?
We've all seen fat rain drops, skinny rain drops, round hailstones, fluffy snowflakes and even ice needles. This summer, NASA researchers are going to get a look at just how much these shapes influence severe storm weather. To do it, they'll have to look inside the guts of some of the world's fiercest storms. NASA recently assembled a team of hurricane scientists from across the country to carry out high-altitude-aircraft surveillance to explore in detail how storms form, intensify and dissipate.
Earth scientists and engineers at NASA's Marshall Space Flight Center in Huntsville, Ala., have redesigned one of their instruments, the Advanced Microwave Precipitation Radiometer, or AMPR, to better observe the different shapes of precipitation. In August and September, AMPR will fly at an altitude of 60,000 feet over the Gulf of Mexico and Atlantic Ocean. It will sit in the bomb bay of a WB-57 airplane, which is based at the NASA Johnson Space Center's Ellington Field in Houston.
During these flights, AMPR researchers will test a new build -- the instrument is an upgraded version of the original AMPR built at NASA Marshall in the early 1990s -- and use it to participate in NASA's upcoming hurricane study, the Genesis and Rapid Intensification Processes field campaign, better known as GRIP. The campaign involves three planes mounted with 14 different instruments, including AMPR. The instruments will all work together to create the most complete view of a hurricane to date.
Researchers hope the hurricane campaign will help them answer some of nature's most perplexing questions. As tropical storms grow, they produce massive amounts of rain -- a key element in the development of full-scale hurricanes. Scientists will use AMPR along with the other instruments, such as data from the Tropical Rainfall Measuring Mission or TRMM satellite, to figure out just how hard it's raining inside these ferocious storms, and how much of that rain is associated with the production of ice during intensification.
"If you don't know how hard it's raining or where the rain is forming in the atmosphere, you don't know hurricanes," said Dr. Walt Petersen, AMPR principal investigator and Marshall Center earth scientist. "AMPR provides us an opportunity to see their precipitation structure by using an instrument like those currently flying on, for example, the TRMM and Aqua satellites in space."
That's because AMPR doesn't just give scientists new information about hurricanes. The instrument also enables them to test equipment currently in space. Every day, numerous weather satellites orbit Earth to measure the rainfall rate of storms across the globe. They work much like AMPR except over much larger scales. Because they're so far above the Earth and moving so fast, they can take only one measurement every few miles along their track. Scientists can correct for such coarse measurements, but to do so they need highly accurate data. AMPR can take several measurements per mile, giving scientists the data they need to verify that weather satellites continue to provide accurate data.
"It's like the pixels in your computer screen," Petersen said. "When satellites take measurements, they have really big pixels, and we might lose some of the finer details of what's happening on the ground. AMPR has much smaller pixels, much higher resolution, and allows us to see a much clearer picture. It's a part of our arsenal to make sure what we're measuring from space makes sense. We'd hate to send something up and not have it accurately measure what's happening on the ground."
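The validation idea Petersen describes -- many small airborne pixels averaged up to one big satellite pixel -- can be sketched in a few lines. The numbers here are invented for illustration:

```python
def coarsen(fine_pixels, factor):
    """Average consecutive blocks of fine-resolution measurements down
    to the coarser pixel size a satellite would see (a simple sketch)."""
    return [sum(fine_pixels[i:i + factor]) / factor
            for i in range(0, len(fine_pixels), factor)]

# Eight airborne rain-rate samples along a flight track, averaged into
# two satellite-scale pixels; the satellite's own retrievals for those
# two pixels could then be compared against these averages.
fine = [1.0, 2.0, 3.0, 2.0, 0.0, 0.0, 4.0, 0.0]
coarse = coarsen(fine, 4)
```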
That information translates into better predictions of hurricane track and intensity -- how hard it's going to rain in a certain area when a hurricane hits, for example, aiding in early flood warnings.
AMPR doesn't just measure how hard rain falls. Within the last several years, the AMPR team has worked vigorously to upgrade the instrument. These upgrades will enable AMPR to more accurately detect what kind of precipitation is in the storm. By identifying the shape of the precipitation, AMPR may present scientists with recognizable signatures that define different types of precipitation. For example, varying combinations of fat or skinny rain drops, snow, ice or hail distributed throughout the depth of the storm will produce different brightness temperatures when viewed at different angles. A storm may develop and behave differently depending on these variations.
Engineers packed the 380-pound AMPR payload with a delicate set of instruments and computer hardware. AMPR gathers data by measuring the amount of microwave radiation rising from the surface beneath -- often the ocean. Because rain water is a better emitter of microwave radiation than ocean water, the radiation measured from rainfall is actually greater during a big storm. This measurement is converted to a "brightness temperature," which correlates to how much radiation is being generated. The more rain, the higher the brightness temperature.
Alternatively, if a hurricane's clouds are full of ice or hail, as they usually are, much of the microwave radiation is scattered away. The corresponding brightness temperature is much lower than the anticipated surface measurement. Scientists can use those changes to determine how hard it's raining inside a storm or how much ice a given storm might contain.
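The emission picture in the two paragraphs above reduces to a one-line toy model: in the microwave regime, the brightness temperature an instrument sees is roughly the scene's emissivity times its physical temperature. The emissivity values below are illustrative, not AMPR's actual retrieval:

```python
def brightness_temp(physical_temp_k, emissivity):
    """Toy microwave brightness temperature (Rayleigh-Jeans limit):
    roughly emissivity * physical temperature, in kelvins."""
    return emissivity * physical_temp_k

# Ocean water is a poor microwave emitter; rain is a much better one,
# so heavy rain over the ocean raises the observed brightness
# temperature -- while scattering by ice or hail aloft lowers it again.
ocean = brightness_temp(290.0, 0.45)  # calm ocean background
rain = brightness_temp(275.0, 0.85)   # rain layer over the same ocean
```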
"Whether rain drops are fat or skinny, and whether ice is round or bumpy, these factors are critical when we're trying to estimate rainfall rates," Petersen explained. "Because of air drag, the rate at which these precipitation particles fall through the air depends on their thickness or shape. A fat rain drop falls more slowly than a hail stone of the same size, for example -- that factor enables you to determine rainfall rate."
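The size-and-fall-speed dependence Petersen describes can be made concrete with a toy flux calculation: rainfall rate is essentially drop volume times fall speed times number density, summed over drop sizes. All the sizes, speeds, and counts below are invented for illustration:

```python
import math

def rain_rate(diameters_mm, fall_speeds_mps, counts_per_m3):
    """Toy rainfall rate (mm/hr): water flux through a horizontal
    surface, summed over drop-size classes. Fall speed depends on a
    drop's size and shape through air drag, which is why shape matters."""
    flux = 0.0  # water depth accumulating, in m/s
    for d_mm, v, n in zip(diameters_mm, fall_speeds_mps, counts_per_m3):
        volume_m3 = (math.pi / 6.0) * (d_mm * 1e-3) ** 3  # spherical drop
        flux += volume_m3 * v * n
    return flux * 1000.0 * 3600.0  # m/s of depth -> mm/hr

# A spectrum of many slow small drops plus a few fast large ones
rate = rain_rate([1.0, 3.0], [4.0, 8.0], [1000.0, 50.0])
```

Assume the wrong particle shape -- and therefore the wrong fall speeds -- and the retrieved rain rate is off in proportion.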
After the GRIP experiment ends in September, Petersen and his team will unload the data and begin analyzing it, adding their findings to the increasingly large body of hurricane knowledge.
"The GRIP experiment will give us information about how a hurricane circulates and how it intensifies. Basically we have a bunch of theories about the role of precipitation in hurricanes, and we need to test them. That's where instruments like AMPR come in," Petersen said.
After this summer’s hurricane study, AMPR will continue to fly in storm campaigns. It's already scheduled for a major joint NASA and U.S. Department of Energy study in April 2011 to support the Global Precipitation Measurement mission.
Petersen loves the challenge. Storms have fascinated him ever since his junior year of high school, when lightning struck just inches away from him while he was at a drive-in movie.
"The thing that excites me is looking inside a storm that we can't fly into," he said. "We can't fly inside these big storms because they're just too nasty. The only way to get information about what's going on inside is to do what AMPR does.
"Being able to look at the guts of a storm and figure out what's going on, that's the key thing for me," he added.
With any luck, AMPR's look into hurricanes will put scientists one step closer to predicting some of the world's fiercest storms.
For more information visit http://www.nasa.gov/mission_pages/hurricanes/missions/grip/news/ampr.html
A STORRM is brewing aboard space shuttle Endeavour.
The next generation in docking and rendezvous technology will make its debut early next year during the STS-134 mission, scheduled to be the final space shuttle flight. Officially called the Sensor Test for Orion Relative Navigation Risk Mitigation, the "STORRM" system was installed Aug. 10 inside Endeavour's payload bay, where it will fly as a Development Test Objective, or DTO -- in other words, an in-flight experiment.
Designed for use on the Orion capsule, STORRM includes the Visual Navigation Sensor, or VNS, along with an advanced docking camera. The VNS relies on a light-based remote sensing technology called lidar to provide extremely accurate data while the docking camera offers high-resolution docking imagery.
When the STORRM's two hardware components -- the Sensor Enclosure Assembly (SEA) and Avionics Enclosure Assembly (AEA) -- were lowered into place in Endeavour's payload bay, an unusually large crowd of enthusiastic agency and contractor representatives was on hand to observe and celebrate the milestone.
"I'd have to say this is the most people I've ever seen come for a payload installation," said NASA's Vehicle Manager for Endeavour, Shelley Ford, as she surveyed a crowd of about 30 people vying for the best views among the myriad of access platforms surrounding the orbiter. "It's exciting that Endeavour will be contributing to the technology development for our future space program."
STORRM was developed at NASA's Johnson Space Center in Houston, which is responsible for program management, technology evaluation, flight test objectives, operational concepts, contract management and data post-processing. Engineers at NASA's Langley Research Center in Virginia were in charge of engineering management, design and build of the avionics, STORRM software application and reflective elements. They are also responsible for the integration, testing and certification of these components. Industry partners Lockheed Martin Space Systems and Ball Aerospace Technologies Corp. handled the design, build and testing of the VNS and docking camera.
Installation began with the Sensor Enclosure Assembly, a 52-pound box about the size of a microwave oven. United Space Alliance Lead Mechanical Technician Tim Keyser, serving as move director, oversaw the installation as technicians using a jib hoist carefully lifted the SEA over several levels of platforms, then lowered it into the forward end of Endeavour's payload bay.
The SEA was mounted in place in front of the shuttle's airlock, alongside the existing Trajectory Control System. The location of the docking camera offers an accurate snapshot of how the system would handle on the Orion capsule, and provide precise visual cues to the crew.
"This works great for us," said Scott Cryan, Orion relative navigation hardware subsystem manager at Johnson. "The docking camera in the SEA is right in line with the orbiter's center line."
Next, the team picked up the 82-pound Avionics Enclosure Assembly, which provides power distribution, data recording and memory for the camera and navigation system. The AEA is mounted in bay 3 on the port side of the payload bay.
According to Deputy Project Manager Rick Walker, visiting from Langley, the assembly's location in the payload bay is due to the large volumes of high-speed data the hardware will have to digest. But placing it in the bay resulted in the need for radiation-tolerant memory. The team succeeded by using a blend of commercial and Langley-developed technologies, completing the work in nearly half the time it would normally take.
"This was done in 14 months -- a pretty quick turnaround," Walker said after the AEA was bolted into place. "Now, this is the exciting part. You see the hard work, long hours and travel away from home come together. This is what it's all about."
Electrical connections were completed the next day, followed by a round of functional testing that verified the STORRM hardware is ready to fly.
"The team successfully completed the test and checkout of the STORRM payload yesterday, so after the test cables are demated and some final inspections are accomplished, it will be ready for flight," Ford said after the testing wrapped up. "We'll be cheering the STORRM folks on and wishing for their success when Endeavour docks to the ISS early next year."
For more information visit http://www.nasa.gov/mission_pages/shuttle/behindscenes/storrm_install.html
On Aug. 5, 2010, an enormous chunk of ice, about 251 square kilometers (97 square miles) in size, or roughly four times the size of Manhattan, broke off the Petermann Glacier along the northwestern coast of Greenland. The Petermann Glacier lost about one-quarter of its 70-kilometer-long (43-mile) floating ice shelf, according to researchers at the University of Delaware in Newark, Del. The recently calved iceberg is the largest to form in the Arctic in 50 years.
Icebergs calving off the Petermann Glacier are not unusual. Petermann Glacier's floating ice tongue is the Northern Hemisphere's largest, and it has occasionally calved large icebergs.
Scientists are monitoring the movement of the iceberg closely. If it moves into the narrow Nares Strait, it could interfere with or block the flow of sea ice out of the Arctic Ocean into Baffin Bay, which connects the Arctic and Atlantic Oceans. The ice could also eventually pose a hazard to shipping.
This image of Petermann Glacier and the new iceberg was acquired from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft on Aug. 12, 2010. It covers an area of 49.5 by 31.5 kilometers (30.7 by 19.5 miles) and is located at 81.1 degrees north latitude, 61.7 degrees west longitude.
For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-268
Where you live may say a lot about your socioeconomic status. It also may suggest how vulnerable you are to long periods of excessively hot weather.
Researchers at NASA’s Johnson Space Center, Arizona State University and the University of California at Riverside are studying the relationship between temperature variations and socioeconomic variables across metropolitan Phoenix. They have found that the urban poor are the most vulnerable to extreme heat.
Those with higher incomes tend to live in cooler areas, either in neighborhoods surrounded by abundant vegetation, such as lush lawns and canopy trees, or on higher-elevation hillslopes above the hotter Salt River valley floor. The urban poor tend to live in the urban core of metro Phoenix, where the heat island effect is intense. These neighborhoods are located near industrial areas, commercial centers and transportation corridors. There are few amenities, such as parks, and the landscaping has little or no grass or trees.
Propelled by a $1.4 million grant from the National Science Foundation as part of its Dynamics of Coupled Natural and Human Systems Program, the research team is compiling a history of the development of the metro Phoenix urban heat island. Urban heat islands result when soil and grass are replaced with materials such as asphalt and concrete that absorb heat during the day and reradiate it at night, raising temperatures, especially after dark.
Sharon Harlan, a sociologist in the School of Human Evolution and Social Change at ASU, has pulled together the interdisciplinary team, which comprises social and natural scientists, public health experts, and educators.
Harlan is excited about the potential for this pioneering research.
“The problem of heat-related deaths and illnesses is very serious,” said Harlan. “Each year, heat fatalities in the U.S. occur in greater numbers than mortality from any other type of weather disaster. Global climate changes and rapidly growing cities are likely to compound and intensify the adverse health effects of heat islands around the world. Our research is integrating data with sophisticated modeling tools to analyze urban systems while keeping health equity considerations and the well-being of vulnerable populations at the center of attention. We want our research to be used to promote better decision-making about climate adaptation in cities.”
The primary objective of the research is to study high heat wave events—unexpected long-duration heat waves. Many cities including Chicago, Phoenix and Paris have encountered these events over the past several years.
Data from numerous sources, including remotely sensed imagery from NASA, are being used to create an historical record of how temperatures and vegetation patterns changed across metro Phoenix from the early 1970s to 2000. William Stefanov, senior geoscientist with Jacobs Technology in JSC’s Astromaterials Research and Exploration Science Directorate, is providing the orbital view of the metropolitan area.
The remotely sensed information is collected from satellites or airplanes and includes vegetation, temperature and land cover. Together it provides a map of the urban and suburban surface at a moment in time. In addition, researchers will use the data to do what is called change detection analysis. Images from one year or one season can be compared with those from another. The changes, such as those in vegetation, can be highlighted.
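Change detection analysis of the kind described can be sketched with a simple vegetation-index difference between two dates. The reflectance values below are invented for the example; the project's actual processing is far more involved:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance; higher values indicate greener vegetation."""
    return (nir - red) / (nir + red)

def change_map(scene_t1, scene_t2):
    """Per-pixel NDVI difference between two dates; strongly negative
    values flag vegetation loss (e.g., a lawn paved over)."""
    return [ndvi(n2, r2) - ndvi(n1, r1)
            for (n1, r1), (n2, r2) in zip(scene_t1, scene_t2)]

# Two pixels as (NIR, red) pairs: an unchanged park, and a lot that
# lost its vegetation between the two acquisition dates
t1 = [(0.5, 0.1), (0.5, 0.1)]
t2 = [(0.5, 0.1), (0.2, 0.18)]
deltas = change_map(t1, t2)
```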
“We’re using a series of Landsat data for historical vegetation and surface temperature, high-resolution airborne imagery to get detailed maps of the land cover in our study neighborhoods and the Advanced Spaceborne Thermal Emission and Reflection Radiometer, or ASTER, a Japanese sensor on board the NASA Terra satellite, for current surface temperature data,” said Stefanov.
An airborne data flight over Phoenix by the NASA MODIS/ASTER Simulator, or MASTER, sensor is planned for next year to coincide with a ground data collection campaign. Among other biophysical information, high-resolution measurements of ground surface temperature will be obtained from the MASTER data throughout the metropolitan area to compare with and validate other airborne and satellite data sets used in the project.
According to several global climate change models, the southwestern United States is predicted to experience higher temperatures and more droughts over the coming century. If that happens, Phoenix is expected to experience more heat wave events.
The remotely sensed data are fed into high-resolution urban climate models to build predictive simulations of what will happen to the Phoenix metropolitan area if predicted climate change occurs there. Maps of “riskscapes” produced by this project will show where people in Phoenix are most vulnerable to high heat events.
“This project has theoretical aspects, but it also has an applied focus,” said Stefanov. “We are trying to develop tools that city planners and emergency responders can use. Urban planners also can use this data so that they can help plan the city’s growth and perhaps replace materials that absorb heat with those that are more reflective.”
“A lot of urban development is taking place around the world in arid or semiarid climates,” said Stefanov. “By studying Phoenix, researchers can better understand what these developing cities may face and how their environments may change as populations expand.”
For more information visit http://www.nasa.gov/topics/earth/features/phoenix_heatwaves_feature.html
A long-exposure Hubble Space Telescope image shows a majestic face-on spiral galaxy located deep within the Coma Cluster of galaxies, which lies 320 million light-years away in the northern constellation Coma Berenices.
The galaxy, known as NGC 4911, contains rich lanes of dust and gas near its center. These are silhouetted against glowing newborn star clusters and iridescent pink clouds of hydrogen, the existence of which indicates ongoing star formation. Hubble has also captured the outer spiral arms of NGC 4911, along with thousands of other galaxies of varying sizes. The high resolution of Hubble's cameras, paired with considerably long exposures, made it possible to observe these faint details.
NGC 4911 and other spirals near the center of the cluster are being transformed by the gravitational tug of their neighbors. In the case of NGC 4911, wispy arcs of the galaxy's outer spiral arms are being pulled and distorted by forces from a companion galaxy (NGC 4911A), to the upper right. The resultant stripped material will eventually be dispersed throughout the core of the Coma Cluster, where it will fuel the intergalactic populations of stars and star clusters.
The Coma Cluster is home to almost 1,000 galaxies, making it one of the densest collections of galaxies in the nearby universe. It continues to transform galaxies at the present epoch, due to the interactions of close-proximity galaxy systems within the dense cluster. Vigorous star formation is triggered in such collisions.
Galaxies in this cluster are so densely packed that they undergo frequent interactions and collisions. When galaxies of nearly equal masses merge, they form elliptical galaxies. Merging is more likely to occur in the center of the cluster where the density of galaxies is higher, giving rise to more elliptical galaxies.
This natural-color Hubble image, which combines data obtained in 2006, 2007, and 2009 from the Wide Field Planetary Camera 2 and the Advanced Camera for Surveys, required 28 hours of exposure time.
The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI) conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington, D.C.
For more information visit http://www.nasa.gov/mission_pages/hubble/science/island-universe.html
Gaze up at a cloud-filled sky, and you may spot the white, fluffy shape of a dragon, fish or elephant. Looking at the same sky, Graeme Stephens sees a different vision -- a possible future for Earth's climate.
Stephens, a professor at Colorado State University in Ft. Collins, is principal investigator of NASA's CloudSat mission, launched in 2006 to improve our understanding of the role clouds play in our complicated climate system. Stephens says that as Earth's global temperature continues to rise, water vapor -- the most abundant greenhouse gas on Earth, which traps heat much as carbon dioxide does -- will continue to build, with uncertain results.
"We're seeing that now," Stephens said. "We just don't know what this will mean for how clouds might change, and for Earth's temperature and climate. Although a small change of clouds--for example, more low clouds--in the right direction would mitigate the effects of increased carbon dioxide, a small change of clouds in a different direction--for example, more high clouds--would amplify the warming caused by increasing carbon dioxide."
Calculating the balance between the cooling or warming effect of clouds and the warming effect of greenhouse gases is a complex problem for researchers, given their current understanding of clouds on Earth. And it's just one of many questions Stephens and fellow scientists are working to address with observations from CloudSat, an experimental satellite built and managed by NASA's Jet Propulsion Laboratory, Pasadena, Calif. CloudSat's goal is to learn about clouds and their effect on climate by studying them from space.
Floating Facts of Life
Clouds are an inescapable, and necessary, part of life. Aside from making for spectacular sunsets, they also create weather as we know it, from drizzly spring afternoons to the dark, dreary days of winter. "In all ways, shapes and forms, clouds influence life on Earth -- including our climate," says Stephens.
They also play a major role in making Earth habitable. As the sun's rays shine on our planet, flat, low-altitude stratus clouds reflect most of this heat back into space, keeping Earth cool with their shade. At the same time, thin, wide cirrus clouds high in the atmosphere trap heat on Earth's surface, keeping the planet warm. This delicate balance helps to create a comfortable climate, where life flourishes.
Clouds also play a primary role in how life-giving water circulates around our planet. As water on Earth's surface heats, it evaporates into water vapor and rises. As this vapor cools in the atmosphere, the molecules begin to clump together around stray particulates and condense to form clouds. When the clumps become too big, they drop back onto Earth's surface in the form of rain or snow. The never-ending global process of evaporation, precipitation, freezing and melting circulates water around the world -- while also providing the freshwater we need to live. This cycle, which is closely linked to natural exchanges of energy among the atmosphere, ocean and land, helps define our climate.
It's difficult to say what our world would be like if there were no clouds. But, says Stephens, "It's certain that our world without clouds would be nothing like what we know today."
Mars: A World Without Clouds (Mostly)
In fact, it might be much like Mars, says JPL planetary scientist David Kass. The Red Planet today has relatively few clouds compared to Earth. That's because the Martian atmosphere contains less than a tenth of a percent of the amount of water vapor found in Earth's atmosphere. Without much water vapor, and with temperatures averaging 80 degrees Celsius (144 degrees Fahrenheit) colder than on Earth, only thin ice clouds form. They tend to look like a thinner version of Earth's wispy cirrus clouds.
"We don't think that clouds on Mars get to the point where you couldn't see the sun through them, but they might get thick enough that you could look at the sun through them without hurting your eyes," says Kass.
Mars also has thicker clouds made of frozen carbon dioxide -- commonly called dry ice -- that form both high in the atmosphere and at the poles during winter, where the sun never rises for half the Mars year. These clouds are dense enough to dim the sun's light by about 40 percent (although the polar clouds are never actually illuminated by the sun), but because they are found only in limited regions near the planet's poles and equator, they are unlikely to affect the Martian climate as a whole.
Scientists theorize that the relatively sparse clouds on Mars allow temperatures to rise and fall dramatically. Without the cooling effect of significant cloud shade or the insulating effect of thick cloud blankets, the surface of Mars heats drastically during the day -- reaching temperatures around 18 degrees Celsius (65 degrees Fahrenheit) at the equator -- before the temperature plummets at night -- to equatorial surface temperatures as cold as 130 degrees Celsius below freezing (minus 202 degrees Fahrenheit).
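The temperature figures above can be checked with a short sketch. One point worth noting: a temperature *difference* (such as "80 degrees Celsius colder") converts with the 9/5 factor alone, while an absolute temperature also needs the 32-degree offset.

```python
def c_to_f(temp_c):
    """Convert an absolute temperature from Celsius to Fahrenheit."""
    return temp_c * 9 / 5 + 32

def c_diff_to_f(delta_c):
    """Convert a temperature difference (no 32-degree offset)."""
    return delta_c * 9 / 5

# Martian equatorial surface temperatures cited above
print(c_to_f(18))       # daytime high: about 65 F
print(c_to_f(-130))     # nighttime low: -202.0 F
print(c_diff_to_f(80))  # Mars-vs-Earth average gap: 144.0 F colder
```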
But researchers don't yet know for certain how exactly Martian clouds affect the planet's climate. "It's not clear yet how big a role clouds play in Mars' climate," says Kass. "This is really on the cutting edge right now." As planetary climate models become more sophisticated, they will include the radiative effects of the clouds seen in data from the Mars Climate Sounder on NASA's Mars Reconnaissance Orbiter. Kass says the modelers will be able to incorporate that data and examine cases with and without clouds to see their impacts. "We hope to know more soon," Kass adds.
Venus: A Greenhouse Girl Gone Wild
If Mars is what an Earth without many clouds might look like, then Venus shows what our world might look like with far more.
Venus' skies are stuffed with brilliant white clouds that stretch around the entire planet without a single break. As a result, they -- and other molecules in the atmosphere -- reflect more than 80 percent of the sun's light back out into space. For many years, planetary scientists thought this would keep the surface of Venus relatively cool. Yet when the Soviet probe Venera 4 reached Venus in 1967, it measured a temperature of 482 degrees Celsius (900 degrees Fahrenheit). That's hot enough to melt lead.
"At that point, we realized two things: Venus' atmosphere is very thick -- about 100 times thicker than Earth's -- and greenhouse gases are important to climates," said Kevin Baines, a planetary scientist at JPL and senior research scientist at the University of Wisconsin-Madison.
Venus' thick clouds are surrounded by carbon dioxide, a greenhouse gas that traps heat on the planet's surface. The little heat from the sun that makes it through the reflective cloud barrier has little chance of escape, and as that heat builds -- if only a little bit at a time -- the surface of Venus gets hotter and hotter.
The heating of Venus' clouds could also cause the planet's extreme air circulation. The excess heat, Baines says, seems to whip the entire atmosphere up to hurricane-force winds, causing the atmosphere at cloud level to circulate 60 times faster than the planet rotates.
"Venus is a planet of extremes," says Baines. "It's very hostile and very hot; you can't survive very long there."
Titan: Partly Cloudy, With a Chance of Methane Rain
There is a middle ground between Mars' relatively clear skies and Venus' cloud-choked heavens. Scattered clouds float above the icy surface and liquid lakes of Titan, the largest of Saturn's many moons. These clouds, which are made mostly of methane, punctuate the sky more in the winter than in the summer, just like clouds on Earth. By trapping in the little heat that makes it through Titan's upper level of thicker atmospheric clouds, the scattered clouds warm the surface to a frigid minus 183 degrees Celsius (minus 297 degrees Fahrenheit) on average, keeping the moon's methane lakes and rivers liquid.
NASA's Cassini-Huygens spacecraft studies Titan and its climate, in part to learn more about how cloud cover and other variables affect climate.
CloudSat: Revealing the Inner Secrets of Earth's Clouds
So what have the first four years of CloudSat operations taught us about our mysterious friends in the sky? Stephens says the mission has already yielded a number of important findings.
Among the highlights, the satellite has gathered the first statistics on global vertical cloud structure, including overlapping clouds, to create three-dimensional maps of Earth's cloud cover. It measured the percentage of clouds giving off rain at any given time (13 percent) to better understand how efficiently clouds convert condensed water into rain. It has monitored nighttime storms at Earth's poles from space for the first time. And it has revealed connections between storms at the poles and very high clouds that help create ozone.
"Before CloudSat, we essentially had photos of the tops of clouds from other satellites and photos of the bottoms of clouds from ground-based telescopes," says Deborah Vane, CloudSat deputy principal investigator and JPL project manager for the mission. "CloudSat's advanced radar slices into clouds and looks into their inner structure."
By viewing this complete picture of how clouds operate both inside and out for the first time, and monitoring it on a global scale, CloudSat is offering climatologists the data they need to create better models of Earth's climate -- and help predict what the surface of our planet will probably look like in the future.
So could Earth ultimately turn into a steady inferno like Venus or a fluctuating icebox like Mars? Fortunately, says Stephens, data from CloudSat and other sources show that Earth's clouds are not about to shrink drastically or engulf our skies anytime soon.
"With CloudSat, we're getting information that's critical to understanding how changes to clouds will ultimately take place," said Stephens. "If we can confirm that the assumptions climate models make are right -- or wrong -- then we can have a major influence on their ability to predict the future."
For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-262
The next spacewalk to complete the removal of a failed ammonia pump module and installation and activation of a new pump module on the International Space Station’s S1 Truss will take place no earlier than Wednesday.
Expedition 24 Flight Engineers Doug Wheelock and Tracy Caldwell Dyson completed the first spacewalk to remove and replace the pump module at 3:22 p.m. EDT Saturday. As the result of an ammonia leak in the final line that needed to be disconnected from the failed pump module, the day’s tasks were only partially completed. The decision was made to reconnect the line on the pump module and install a spool positioning device to maintain proper pressure internal to the ammonia line.
Teams on the ground are evaluating the impact of the leak on plans to replace the failed pump, as well as possible fixes for the leak. The completion of the process will most likely require at least two additional spacewalks.
Saturday’s excursion lasted 8 hours, 3 minutes, making it the longest expedition crew spacewalk in history and the sixth longest in human spaceflight history.
Wheelock conducted the fourth spacewalk of his career. Caldwell Dyson made her first spacewalk. Flight Engineer Shannon Walker operated Canadarm2, the station’s robotic arm, and assisted the spacewalkers from inside the station.
After the loss of one of two cooling loops July 31, ground controllers powered down and readjusted numerous systems to provide maximum redundancy aboard the orbiting laboratory. The International Space Station is in a stable configuration, the crew is safe and engineers continue reviewing data from the failed pump.
For more information visit http://www.nasa.gov/mission_pages/station/main/index.html
On July 6 this summer, Virginia's Department of Environmental Quality issued the region's first "unhealthy" air alert since 2008.
The culprit? "Bad" ozone and other air pollution that had combined to produce an abnormally high reading of 119 parts per billion in Suffolk and 70-80 parts per billion in other parts of southeastern Virginia. That compares with the natural background ozone concentration of about 10 parts per billion that was the norm more than a century ago.
Ozone spikes are part of a pattern of increasing O3 levels globally, in even the most remote areas, says Dr. Jack Fishman, senior research scientist in the Science Directorate at NASA Langley Research Center in Hampton, Va.
"I think what we have to dispel is that ozone pollution is confined to places like Los Angeles and Houston," says Fishman. "Despite emission controls that have resulted in notable reductions in many American cities, O3 concentrations in non-urban areas in both the U.S. and around the world are increasing, with negative impacts to all living things -- plants, animals, and people."
Fishman is an expert in the composition of the troposphere, which is the part of the atmosphere that extends from the ground up to four to 12 miles (6 to 19 kilometers), depending on where it is measured. In general, the troposphere is deeper in the tropics than at higher latitudes.
The troposphere contains about 75 percent of the atmosphere's mass, 99 percent of its water vapor and is where weather occurs.
Although 'good' ozone high in the stratosphere -- the layer just above the troposphere -- provides a shield to protect life on Earth, direct contact with it is harmful to plants and animals, including humans.
According to the Environmental Protection Agency, exposure to ozone levels greater than 80 parts per billion for eight hours or longer is unhealthy. Harmful effects can include throat and lung irritation or aggravation of asthma or emphysema.
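The 80 parts-per-billion guideline is evaluated against an eight-hour running average of hourly readings. As a minimal sketch (the hourly readings below are hypothetical, shaped to peak near the 119 ppb Suffolk value cited above), an exceedance check might look like:

```python
def max_8hr_average(hourly_ppb):
    """Return the highest 8-hour running mean of hourly ozone readings (ppb)."""
    window = 8
    if len(hourly_ppb) < window:
        raise ValueError("need at least 8 hourly readings")
    return max(
        sum(hourly_ppb[i:i + window]) / window
        for i in range(len(hourly_ppb) - window + 1)
    )

# Hypothetical readings over a 12-hour period, peaking at 119 ppb
readings = [55, 62, 70, 85, 98, 110, 119, 115, 104, 90, 72, 60]
peak = max_8hr_average(readings)
print(peak)        # highest 8-hour mean: 99.125 ppb
print(peak > 80)   # exceeds the 80 ppb guideline: True
```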
Ground-level 'bad' ozone forms when nitrogen oxide gases from vehicle and industrial emissions react with volatile organic compounds -- carbon-containing chemicals that evaporate easily into the air, such as gasoline and paint thinners.
In addition to impacting human health, rising ozone levels are measurably reducing crop yields, says Fishman.
Among the crops affected are soybeans, rice, alfalfa, barley, cotton, oat, peanut, potato and wheat. Research by Fishman and others suggests that globally, the cost of crop damage by surface ozone is as much as $26 billion annually.
And it's likely to get worse.
"Coupling our recently published crop productivity statistical findings with a global model that simulates the formation and transport of ozone pollution, our findings suggest that we are now at a crossroads with respect to agricultural productivity," the St. Louis native says.
Surface ozone, Fishman adds, knows no geographic or political boundaries. Indeed, he says, "the influx of pollution from east Asia might have been a factor that led to crossing a threshold concentration in the U.S. so that the impact of such pollution is now observable. In other words, if we had done the same analysis using agricultural and ozone data from the 1980s or even 1990s, the impact of ozone on crops would not have been seen."
"Certainly, in the 19th and early 20th century, background surface ozone concentrations were so low that an increase of 25 percent would not have affected living organisms," says Fishman. "But with the IPCC-projected increase on the order of 10 to 20 percent in the next decade or two, the currently observable effects on crop productivity will be significantly exacerbated."
How data are gathered
Ozone data have been collected from space by NASA's Total Ozone Mapping Spectrometer (TOMS) aboard several satellites that flew between 1978 and 2005 and now from the Ozone Monitoring Instrument (OMI) on the Aura satellite, launched in 2004.
At NASA's Langley Research Center in Hampton, Va., air-quality monitoring is performed onsite daily by Virginia's Department of Environmental Quality (DEQ) and the U.S. Environmental Protection Agency. The monitoring station opened this past spring under an agreement between NASA and the DEQ.
Measured are pollutants -- from factories, power plants and cars -- that can damage human health, plants, the environment and infrastructure. The pollutants include ozone, carbon monoxide, nitrogen oxides, sulfur oxides and airborne particulates.
The site "will bring together the partnership of NASA, DEQ, and the EPA in a coordinated effort to assess the relationship between space-based observations and surface observations of air quality," Langley scientist Margaret Pippin said last April. She is the scientist coordinating the DEQ's move to Langley.
The Langley site will house a complementary instrument that is essentially a ground-based version of the Ozone Monitoring Instrument. In parallel, both instruments will provide unique insight into how satellites can be used to improve our understanding of the formation of widespread air pollution episodes.
Ground-based air-quality measuring stations are located around the world. Although these provide valuable data, their coverage is limited. Satellites can provide a more global picture of air quality, but the quantities they measure from space are dependent on other factors in addition to the concentration measured at the surface. In addition to the DEQ/EPA/NASA venture, a Langley-led campaign will make trace gas and particulate measurements from instruments aboard NASA aircraft. The campaign is intended to improve the use of satellites for monitoring air quality, and to better understand the relationship between ground and satellite measurements.
The five-year effort will draw on researchers at Langley, Goddard Space Flight Center in Greenbelt, Md.; Ames Research Center, outside San Francisco; and multiple universities. The campaign is called DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality).
For more information visit http://www.nasa.gov/topics/earth/features/bad_ozone.html
New NASA airborne radar images of Southern California near the U.S.-Mexico border show Earth's surface is continuing to deform following the April 4 magnitude 7.2 temblor and its many aftershocks that have rocked Mexico's state of Baja California and parts of the American Southwest.
The data, from NASA's airborne Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR), reveal that some faults in the area west of Calexico, Calif., have continued to move at Earth's surface, most likely in the many aftershocks. This fault motion is likely to be what is known as "triggered slip," caused by changes in stress in Earth's crust from the main quake rupture. The new maps, called interferograms, were created by combining data from flights on April 13, 2010, and July 1, 2010.
The first image shows a UAVSAR interferogram swath measuring 87 by 20 kilometers (54 by 12.5 miles) overlaid atop a Google Earth image. Each colored contour, or fringe, of the interferogram represents 11.9 centimeters (4.7 inches) of surface displacement. The different shades in the image represent ground surface motions of up to a few inches upward or downward. Yellow shaded regions moved to the south or downward, regions in blue moved to the north or upward, and regions shaded in magenta showed no motion. Major fault lines are marked in red, and recent aftershocks are denoted by yellow, orange and red dots, with older earthquakes shown as gray dots.
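The 11.9-centimeter fringe spacing is not arbitrary: it is half of UAVSAR's L-band radar wavelength (about 23.8 centimeters, a published instrument parameter not stated in the article itself), because the round-trip radar path changes by one full wavelength for every half-wavelength of ground motion. A minimal sketch of reading displacement off an interferogram:

```python
# UAVSAR operates at L-band; the ~23.8 cm wavelength is assumed from
# the instrument's published specifications.
WAVELENGTH_CM = 23.8

def fringe_displacement_cm(num_fringes):
    """Line-of-sight displacement implied by a count of interferogram fringes.

    Each full color cycle (fringe) corresponds to half a wavelength,
    because the radar signal travels the aircraft-to-ground path twice.
    """
    return num_fringes * WAVELENGTH_CM / 2

print(fringe_displacement_cm(1))  # one fringe: 11.9 cm, matching the caption
print(fringe_displacement_cm(3))  # three fringes: about 35.7 cm of motion
```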
An enlargement of the interferogram is shown in the second image. This image focuses on the area where most of the aftershocks have been located, west of Calexico. The enlargement, which covers an area measuring about 28 by 18 kilometers (18 by 11.5 miles), reveals many small "cuts," or discontinuities, in the interferogram color. These are caused by ground motions on small faults that have occurred since April 13, ranging from less than a centimeter to a few centimeters (half an inch to a few inches). The thin, colored lines represent faults previously mapped by the U.S. and California Geological Surveys through the end of 2009.
A science team at NASA's Jet Propulsion Laboratory, Pasadena, Calif., is using the JPL-developed UAVSAR to measure surface deformation from the quake. The radar flies at an altitude of 12.5 kilometers (41,000 feet) on a Gulfstream-III aircraft from NASA's Dryden Flight Research Center, Edwards, Calif.
The team uses a technique that detects minute changes in the distance between the aircraft and the ground over repeated, GPS-guided flights.
JPL geophysicist Andrea Donnellan, principal investigator of the UAVSAR project to map and assess seismic hazard in Southern California, said the latest flight provides valuable new data that researchers can use to monitor the continued readjustment of Earth's crust since April's major quake. "The region was reflown with UAVSAR to monitor continued activity, including quiet motions--which are movements of faults that do not result in earthquakes--as Earth's crust readjusts, as well as large aftershocks, such as the magnitude 5.7 quake observed on June 14," she said.
The April 4, 2010, El Mayor-Cucapah quake was centered 52 kilometers (32 miles) south-southeast of Calexico, Calif., in northern Baja California. The quake, the region's largest in nearly 120 years, was also felt in southern California and parts of Nevada and Arizona. There have been thousands of aftershocks, extending from near the northern tip of the Gulf of California to a few miles northwest of the U.S. border. The area northwest of the main rupture, along the trend of California's Elsinore fault, has been especially active.
Geologists used the first UAVSAR interferogram, which included the April 4th quake, to map many new, small fault ruptures in the field. JPL geophysicist Eric Fielding said, "This new interferogram shows that some of the faults have continued to slip since our overflight on April 13th. Such mapping is important for understanding the fault structure in this area between the main fault ruptures on April 4th farther south in Baja California and the faults farther to the north in Southern California, including the Elsinore and San Jacinto faults."
Fielding's studies of interferograms from Japanese and European Space Agency satellites indicate the largest fault movement visible in the new UAVSAR interferogram occurred between May 21 and June 6 along a northeast-southwest trending fault known as the Yuha fault, visible to the right of the center of the interferogram. This fault previously had slipped about 2 to 4 centimeters (1 to 2 inches) in the first days after the April 4th earthquake, as was shown in the earlier UAVSAR and satellite interferograms. Since April 13, the fault has slipped about another 2 centimeters (1 inch).
The fact that the Yuha fault has a trend towards the northeast is also significant, Fielding added, because it is very different from the northwest-trending major Elsinore and San Jacinto fault systems and the faults in Mexico that ruptured in the main earthquake. "This adds to evidence that the faults in Mexico are not directly connected to the Elsinore and San Jacinto faults and may explain why the magnitude 7.2 April 4 earthquake stopped before it reached California," he said.
UAVSAR is part of NASA's ongoing effort to apply space-based technologies, ground-based techniques and complex computer models to advance our understanding of quakes and quake processes. The radar flew over Hispaniola earlier this year to study geologic processes following January's devastating Haiti quake. The data are giving scientists a baseline set of imagery in the event of future quakes. These images can then be combined with post-quake imagery to measure ground deformation, determine how slip on faults is distributed, and learn more about fault zone properties.
UAVSAR is also serving as a flying test bed to evaluate the tools and technologies for future space-based radars, such as those planned for a NASA mission currently in formulation called the Deformation, Ecosystem Structure and Dynamics of Ice, or DESDynI. That mission will study hazards such as earthquakes, volcanoes and landslides, as well as global environmental change.
For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-258