
International Global Precipitation Measurement Mission Data Goes Public

WASHINGTON, Sept. 4, 2014 /PRNewswire-USNewswire/ -- The most accurate and comprehensive collection of rain, snowfall and other types of precipitation data ever assembled is now available to the public. This new resource for climate studies, weather forecasting, and other applications is based on observations by the Global Precipitation Measurement (GPM) Core Observatory, a joint mission of NASA and the Japan Aerospace Exploration Agency (JAXA), with contributions from a constellation of international partner satellites.

The GPM Core Observatory, launched from Japan on Feb. 27, carries two advanced instruments to measure rainfall, snowfall, ice and other precipitation. The advanced and precise data from the GPM Core Observatory are used to unify and standardize precipitation observations from other constellation satellites to produce the GPM mission data. These data are freely available through NASA's Precipitation Processing System at Goddard Space Flight Center in Greenbelt, Maryland.

"We are very pleased to make all these data available to scientists and other users within six months of launch," said Ramesh Kakar, GPM program scientist in the Earth Science Division at NASA Headquarters, Washington.

In addition to NASA and JAXA, the GPM mission includes satellites from the U.S. National Oceanic and Atmospheric Administration, U.S. Department of Defense's Defense Meteorological Satellite Program, European Organisation for the Exploitation of Meteorological Satellites, Indian Space Research Organisation, and France's Centre National d'Études Spatiales.

Instruments on the GPM Core Observatory and partner satellites measure energy naturally emitted by liquid and frozen precipitation. Scientists use computer programs to convert these data into estimates of rain and snowfall. The individual instruments on the partner satellites collect similar data, but the absolute numbers for precipitation observed over the same location may not be exactly the same. The GPM Core Observatory's data are used as a reference standard to smooth out the individual differences, like a principal violinist tuning the individual instruments in an orchestra. The result is data that are consistent with each other and can be meaningfully compared.
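As a loose illustration of this cross-calibration idea (not the mission's actual algorithm; the function name and sample numbers below are hypothetical), a partner sensor's estimates can be rescaled so that, over locations both sensors observe, they agree on average with the reference:

```python
# Illustrative sketch only: remove the mean bias of a partner sensor's
# precipitation estimates relative to a reference sensor, using
# co-located observations. GPM's real intercalibration is far more
# sophisticated than a single scale factor.

def cross_calibrate(partner_obs, reference_obs):
    """Scale partner observations so their mean matches the reference.

    partner_obs, reference_obs: pairwise co-located rain-rate
    estimates (mm/hr) over the same locations and times.
    """
    if len(partner_obs) != len(reference_obs) or not partner_obs:
        raise ValueError("observations must be non-empty and pairwise co-located")
    mean_partner = sum(partner_obs) / len(partner_obs)
    mean_reference = sum(reference_obs) / len(reference_obs)
    scale = mean_reference / mean_partner
    return [scale * x for x in partner_obs]

# A partner sensor reading ~10% high over shared ground tracks:
adjusted = cross_calibrate([2.2, 5.5, 1.1, 3.3], [2.0, 5.0, 1.0, 3.0])
# adjusted ≈ [2.0, 5.0, 1.0, 3.0]
```

Once each constellation sensor is adjusted toward the common reference this way, precipitation totals from different satellites over the same region can be meaningfully compared.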

With the higher sensitivity to different types of precipitation made possible by the GPM Core Observatory's Microwave Imager (GMI) and Dual-frequency Precipitation Radar (DPR), scientists can for the first time accurately measure the full range of precipitation from heavy rain to light rain and snow. The instruments are designed not only to detect rain and snow in the clouds, but to measure the size and distribution of the rain particles and snowflakes. This information gives scientists a better estimate of water content and a new perspective on winter storms, especially near the poles where the majority of precipitation is snowfall.
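The link from particle sizes to water content can be sketched with a toy calculation (a hypothetical illustration, not GPM's retrieval algorithm): liquid water content is the integral of drop volume times number density over the measured size distribution.

```python
# Toy sketch (not GPM's retrieval): liquid water content from a binned
# drop size distribution, W = (pi/6) * rho_w * sum over bins of D^3 * N(D) * dD.
import math

def water_content(diameters_mm, number_density_per_mm_m3, d_bin_mm):
    """Integrate drop volumes over a binned size distribution.

    diameters_mm: bin-center drop diameters (mm)
    number_density_per_mm_m3: drops per m^3 per mm of diameter, per bin
    d_bin_mm: bin width (mm)
    Returns liquid water content in g/m^3.
    """
    rho_w = 1.0e-3  # density of water in g/mm^3
    total_mm3_per_m3 = 0.0
    for d, n in zip(diameters_mm, number_density_per_mm_m3):
        # spherical drop volume (pi/6) D^3, weighted by number density
        total_mm3_per_m3 += (math.pi / 6.0) * d**3 * n * d_bin_mm
    return total_mm3_per_m3 * rho_w

# Single hypothetical bin: 1 mm drops at 1000 per m^3 per mm
w = water_content([1.0], [1000.0], 1.0)  # ≈ 0.52 g/m^3
```

This is why measuring the size and distribution of particles, not just their presence, improves estimates of how much water a storm actually holds.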

"With this GPM mission data, we can now see snow in a way we could not before," said Gail Skofronick-Jackson, GPM project scientist at Goddard Space Flight Center.  "Cloud tops high in the atmosphere have ice in them. If the Earth's surface is above freezing, it melts into rain as it falls. But in some parts of the world, it's cold enough that the ice and snow falls all the way to the ground."

One of the first storms observed by the GPM Core Observatory, on March 17 in the eastern United States, showed that full range of precipitation. Heavy rain fell over the North and South Carolina coasts. As the storm moved northward, West Virginia, Virginia, Maryland and Washington were covered with snow. The GMI observed a 547-mile-wide (880-kilometer) track of precipitation on the surface, while the DPR imaged every 820 feet (250 meters) vertically to capture the three-dimensional structure of the rain and snowfall, layer by layer inside the clouds.

"What's really clear in these images is the melting layer, the place in the atmosphere where ice turns into rain," said Skofronick-Jackson. "The melting layer is one part of the precipitation process that scientists don't know well because it is in such a narrow part of the cloud and changes quickly. Understanding the small scale details within the melting layer helps us better understand the precipitation process."

The combined snowfall and rainfall measurements from GPM will fill in the picture of where and how water moves throughout the global water cycle.

"Scientists and modelers can use the new GPM data for weather forecasts, estimating snowpack accumulation for freshwater resources, flood and landslide prediction, or tracking hurricanes," Skofronick-Jackson said. "This revolutionary information also gives us a better grasp of how storms and precipitating systems form and evolve around the planet, providing climate modelers insight into how precipitation might change in a changing climate."

GPM data are freely available to registered users from Goddard's Precipitation Processing System (PPS) website. The data sets are currently available in strips called swaths that correspond to the satellites' overpasses. Daily and monthly global maps from all the sensors are also available. In the coming months, the PPS will merge the data from all partner satellites and the Core Observatory into a seamless map that shows global rain and snow at a 6-mile (10-kilometer) resolution every 30 minutes.
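A toy sketch of what turning swath samples into a gridded map involves (hypothetical function and sample values, not the PPS merging algorithm): each observation is assigned to a regular latitude-longitude cell, and samples falling in the same cell are averaged.

```python
# Illustrative sketch only: bin swath observations of (lat, lon, rain
# rate) onto a regular ~0.1-degree grid by averaging the samples that
# land in each cell. The operational merged product does much more
# (time interpolation, sensor weighting, quality control).
import math
from collections import defaultdict

def grid_swath(samples, cell_deg=0.1):
    """Average (lat, lon, rain_mm_per_hr) samples into grid cells.

    Returns {(lat_index, lon_index): mean rain rate} where indices are
    floor(coordinate / cell_deg).
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, rain in samples:
        cell = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        sums[cell] += rain
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Hypothetical swath samples: two near Washington, one near Charlotte
swath = [(38.91, -77.04, 1.2), (38.95, -77.01, 1.8), (35.22, -80.84, 6.0)]
grid = grid_swath(swath)  # two occupied cells; the D.C. pair averages to 1.5
```

Averaging within cells is the simplest possible choice; the point is only that swath-shaped overpass data and a fixed global grid are different representations of the same observations.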

The GPM Core Observatory was the first of five scheduled NASA Earth science missions launching within a year. NASA monitors Earth's vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. NASA also develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. The agency freely shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

For more information about NASA's Earth science activities, visit:

http://www.nasa.gov/earthrightnow

For more information about GPM, visit:

http://www.nasa.gov/gpm

To access the newly released data, visit:

http://pmm.nasa.gov/data-access

SOURCE NASA
