By PR Newswire
March 25, 2014 11:21 AM EDT
WASHINGTON, March 25, 2014 /PRNewswire-USNewswire/ -- NASA and the Japan Aerospace Exploration Agency (JAXA) have released the first images captured by their newest Earth-observing satellite, the Global Precipitation Measurement (GPM) Core Observatory, which launched into space Feb. 27.
The images show precipitation falling inside a March 10 cyclone over the northwest Pacific Ocean, approximately 1,000 miles east of Japan. The data were collected by the GPM Core Observatory's two instruments: JAXA's Dual-frequency Precipitation Radar (DPR), which imaged a three-dimensional cross-section of the storm; and, NASA's GPM Microwave Imager (GMI), which observed precipitation across a broad swath.
"It was really exciting to see this high-quality GPM data for the first time," said GPM project scientist Gail Skofronick-Jackson at NASA's Goddard Spaceflight Center in Greenbelt, Md. "I knew we had entered a new era in measuring precipitation from space. We now can measure global precipitation of all types, from light drizzle to heavy downpours to falling snow."
The satellite's capabilities are apparent in the first images of the cyclone. Cyclones such as the one imaged -- an extra-tropical cyclone -- occur when masses of warm air collide with masses of cold air north or south of the tropics. These storm systems can produce rain, snow, ice, high winds, and other severe weather. In these first images, the warm front ahead of the cyclone shows a broad area of precipitation -- in this case, rain -- with a narrower band of precipitation associated with the cold front trailing to the southwest. Snow is seen falling in the northern reaches of the storm.
The GMI instrument has 13 channels that measure natural energy radiated by Earth's surface and also by precipitation itself. Liquid raindrops and ice particles affect the microwave energy differently, so each channel is sensitive to a different precipitation type. With the addition of four new channels, the GPM Core Observatory is the first spacecraft designed to detect light rain and snowfall from space.
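The channel set described above can be sketched as a simple lookup. This is an illustrative snippet, not flight or processing software; the frequencies and polarizations below are taken from published GPM instrument documentation, and the helper function is a hypothetical convenience for picking out the four high-frequency channels added for light rain and snowfall.

```python
# Illustrative sketch of the 13 GMI channels (frequency in GHz, polarization),
# per published GPM instrument documentation. Not actual mission software.
GMI_CHANNELS = [
    (10.65, "V"), (10.65, "H"),
    (18.7, "V"), (18.7, "H"),
    (23.8, "V"),
    (36.5, "V"), (36.5, "H"),
    (89.0, "V"), (89.0, "H"),
    (166.0, "V"), (166.0, "H"),
    (183.31, "V, +/-3 GHz"), (183.31, "V, +/-7 GHz"),
]

def high_frequency_channels(min_ghz=100.0):
    """Return the channels above min_ghz -- the four new channels
    (166 and 183 GHz) that make light rain and snowfall detectable."""
    return [(f, p) for f, p in GMI_CHANNELS if f > min_ghz]
```

Because liquid drops and ice particles scatter and emit microwave energy differently at each of these frequencies, comparing brightness temperatures across the channels is what lets the retrieval distinguish precipitation types.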
In addition to seeing all types of rain, GMI's technological advancements allow the instrument to identify rain structures as small as about 3 to 9 miles (5 to 15 kilometers) across. This higher resolution is a significant improvement over the capability of an earlier instrument flown on the Tropical Rainfall Measuring Mission, launched in 1997.
"You can clearly see them in the GMI data because the resolution is that much better," said Skofronick-Jackson.
The DPR instrument adds another dimension to the observations that puts the data into high relief. The radar sends signals that bounce off the raindrops and snowflakes to reveal the 3D structure of the entire storm. Like GMI, its two frequencies are sensitive to different rain and snow particle sizes. One frequency senses heavy and moderate rain. A new, second radar frequency is sensitive to lighter rainfall and snowfall.
"Both return independent measurements of the size of raindrops or snowflakes and how they are distributed within the weather system," said DPR scientist Bob Meneghini at Goddard. "DPR allows scientists to see at what height different types of rain and snow or a mixture occur -- details that show what is happening inside sometimes complicated storm systems."
The DPR data, combined with data from GMI, also contribute to more accurate rain estimates. Scientists use the data from both instruments to calculate the rain rate, which is how much rain or snow falls to Earth. Rain rate is one of the Core Observatory's essential measurements for understanding where water is on Earth and where it's going.
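As a rough illustration of how a radar measurement becomes a rain rate, the sketch below uses the classic Marshall-Palmer Z-R power law. To be clear, this is a textbook single-frequency relation, not the GPM dual-frequency retrieval algorithm, and the coefficients `a=200`, `b=1.6` are the standard stratiform-rain defaults, not GPM values.

```python
def rain_rate_from_reflectivity(dbz, a=200.0, b=1.6):
    """Estimate rain rate R (mm/h) from radar reflectivity (dBZ) using the
    classic Marshall-Palmer relation Z = a * R**b. Illustrative only --
    GPM's actual retrieval combines two radar frequencies with GMI data."""
    z = 10.0 ** (dbz / 10.0)      # convert dBZ to linear Z (mm^6 / m^3)
    return (z / a) ** (1.0 / b)   # invert the power law for R
```

For example, a moderate echo of 40 dBZ works out to roughly 11.5 mm/h of rain under these default coefficients; GPM's advantage is that the second radar frequency and the GMI channels constrain the drop-size distribution instead of assuming it.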
"All this new information comes together to help us better understand how fresh water moves through Earth's system and contributes to things like floods and droughts," said Skofronick-Jackson.
GMI was built by Ball Aerospace & Technologies Corp. in Boulder, Colo., under contract to NASA. DPR was developed by JAXA with the National Institute of Information and Communications Technology.
These first GPM Core Observatory images were captured during the first few weeks after launch, when mission controllers at the NASA Goddard Mission Operations Center put the spacecraft and its science instruments through their paces to ensure they were healthy and functioning as expected. The engineering team calibrates the sensors, and Goddard's team at the Precipitation Processing System verifies the accuracy of the data.
These initial science data from the GPM Core Observatory will be validated and then released for free by September online at:
For more information about the GPM mission, visit:
The GPM Core Observatory was the first of five planned Earth science launches for the agency in 2014. The joint NASA/JAXA mission will study rain and snow around the world, joining with an international network of partner satellites to make global observations every three hours.
NASA monitors Earth's vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.
For more information about NASA's Earth science activities in 2014, visit: