Moody’s Analytics Adds Stress Testing and Interest Rate Risk Models to RiskFrontier™ Software

Moody’s Analytics, a leader in risk measurement and management, today announced the release of the RiskFrontier™ 4.0 software, the latest version of its award-winning portfolio management and economic capital solution for banks, insurance companies, asset management firms, and corporations. The release includes two significant modeling innovations: the GCorr Macro Model, an expanded correlation model that enables clients to perform portfolio-level stress testing; and the ability to model an exposure’s future cash flows while accounting for both credit and interest rate risk.

The GCorr Macro model supports single-period, simulation-based stress testing and reverse stress testing, as well as the multi-period stress testing required by the Federal Reserve’s Comprehensive Capital Analysis and Review (CCAR). The simulation-based approach uses simulation output from the RiskFrontier software, taking into account portfolio effects such as concentration, diversification, and credit migration. This enables clients to apply stress scenarios to their entire portfolios, measuring the resulting losses and the portfolio’s sensitivity to each scenario.
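
To make the mechanics concrete, the following is a minimal, self-contained sketch of the general idea behind simulation-based portfolio stress testing: a one-factor Gaussian (Vasicek-style) default model in which the stress is applied by conditioning on adverse realizations of the systematic factor. This is an illustration only, not the GCorr Macro model itself, and every portfolio parameter below is invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# --- Hypothetical portfolio (all parameters invented for illustration) ---
n = 200                                   # number of loans
pd_ = rng.uniform(0.005, 0.05, n)         # annual default probabilities
lgd = rng.uniform(0.3, 0.6, n)            # loss given default
ead = rng.uniform(0.5, 2.0, n)            # exposure at default ($MM)
rho = 0.20                                # asset correlation to the macro factor

# --- Monte Carlo simulation of correlated defaults ---
n_sims = 20_000
z = rng.standard_normal(n_sims)           # systematic (macro) factor
eps = rng.standard_normal((n_sims, n))    # idiosyncratic shocks
thresholds = norm.ppf(pd_)                # per-loan default thresholds
assets = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
losses = ((assets < thresholds) * lgd * ead).sum(axis=1)

# --- Stress test: condition on severely adverse macro outcomes ---
stressed = losses[z < np.quantile(z, 0.02)]      # worst 2% of macro draws
print(f"expected loss: {losses.mean():6.2f} $MM")
print(f"stressed loss: {stressed.mean():6.2f} $MM")
```

In GCorr Macro the conditioning runs on named macroeconomic variables (such as the CCAR scenario set) rather than a single abstract factor, but the portfolio-level logic is the same in spirit: concentration, diversification, and migration effects enter through the correlated simulation.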

For example, using a $35 billion sample corporate portfolio and the 2013 CCAR variables, a simulation-based stress test attributes 54% of the portfolio loss to the CCAR variables, with the remaining 46% due to other factors, such as industry or regional effects.
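
That 54/46 split can be pictured with a toy decomposition (purely synthetic numbers, not the Moody’s Analytics methodology): regress simulated portfolio losses on the scenario’s macro variables and read the explained share of loss variation as the macro-driven portion, with the residual attributed to other factors.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims = 10_000

# Synthetic drivers: three stand-ins for CCAR macro variables plus one
# residual systematic factor (e.g., an industry effect).
macro = rng.standard_normal((n_sims, 3))
industry = rng.standard_normal(n_sims)
losses = macro @ np.array([1.0, 0.8, 0.5]) + 1.2 * industry

# Least-squares attribution: share of loss variation explained by the
# macro variables vs. everything else.
X = np.column_stack([np.ones(n_sims), macro])
beta, *_ = np.linalg.lstsq(X, losses, rcond=None)
macro_share = 1 - np.var(losses - X @ beta) / np.var(losses)
print(f"macro-driven share: {macro_share:.0%}, other factors: {1 - macro_share:.0%}")
```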

“GCorr Macro allows clients to see the effect macroeconomic scenarios have on an entire portfolio that might span commercial and industrial, small-medium enterprises, commercial real estate and retail loans,” said Dr. Amnon Levy, Head of Portfolio Research at Moody’s Analytics. “Clients can use the model to determine which variables have the greatest impact on a portfolio, or to determine which sectors are the most sensitive to specific variables. It also allows users to leverage their existing infrastructure, so implementation is relatively straightforward.”

Moody’s Analytics also implemented a bottom-up approach in the RiskFrontier 4.0 software that evaluates losses accounting for both credit and interest rate risk at the instrument level. While these two risks have historically been evaluated in isolation, Moody’s Analytics built a framework that allows clients to model their interactions consistently. For example, during simulation the embedded call option on a fixed-rate callable bond is exercised optimally based on both the interest rate environment and the credit quality of the issuer.
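
As a concrete (and heavily simplified) illustration of why the interaction matters, here is a sketch of a single-call-date callable bond in which the issuer’s decision to call depends jointly on the simulated risk-free rate and the issuer’s simulated credit spread. The dynamics, correlation, and bond terms are all invented for illustration; the RiskFrontier implementation is far richer.

```python
import numpy as np

rng = np.random.default_rng(2)
n_sims = 100_000

# --- Hypothetical callable bond (all terms invented) ---
coupon, face = 0.06, 100.0
t_call, t_mat = 2.0, 5.0                  # call date and maturity (years)

# Jointly simulate the risk-free rate and the issuer's credit spread at
# the call date, with correlated shocks linking the two risks.
corr = 0.3
shocks = rng.multivariate_normal([0.0, 0.0], [[1.0, corr], [corr, 1.0]], n_sims)
rate = 0.03 + 0.01 * shocks[:, 0]
spread = np.maximum(0.02 + 0.015 * shocks[:, 1], 0.0)
refi_cost = np.maximum(rate + spread, 1e-4)   # issuer's all-in funding cost

# The issuer calls only when refinancing is cheaper than the coupon --
# a decision driven by BOTH the rate environment and credit quality.
called = refi_cost < coupon

# Value the remaining cash flows on each path (flat annual discounting,
# purely for illustration).
t_rem = t_mat - t_call
annuity = (1 - (1 + refi_cost) ** -t_rem) / refi_cost
value_if_held = coupon * face * annuity + face * (1 + refi_cost) ** -t_rem
value_at_call = np.where(called, face, value_if_held)
print(f"call frequency: {called.mean():.1%}, "
      f"mean value at call date: {value_at_call.mean():.2f}")
```

Valuing the two risks in silos would, in this toy setup, evaluate the call decision against rates alone and miss the paths where a deteriorated issuer cannot refinance cheaply even when risk-free rates are low.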

“Financial institutions have long struggled to integrate credit and interest rate risk, often having no choice but to account for these risks in silos and then combine them using crude approaches,” said Chris Shayne, Head of Portfolio & Valuation Products, Moody’s Analytics. “The Moody’s Analytics integrated credit and interest rate risk model is the first of its kind, natively combining the two sources of risk and improving the accuracy of results.”

The RiskFrontier 4.0 software will help financial institutions measure the impact of rising interest rates, which economists broadly forecast for 2014 and which put downward pressure on fixed-rate bond portfolios. Users can also measure the effect of growing volatility on call options and forecast economic losses resulting from changes in credit quality.
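
As a back-of-the-envelope illustration of the rate sensitivity in question (with invented numbers), the first-order price impact of a parallel rate shift on a fixed-rate bond is its modified duration times the yield change:

```python
# First-order sensitivity: dP ≈ -D_mod * Δy * P   (toy numbers)
price = 98.50          # current bond price
mod_duration = 6.8     # modified duration (years)
dy = 0.01              # +100bp parallel rate shift
approx_change = -mod_duration * dy * price
print(f"approx price change for +100bp: {approx_change:+.2f}")   # about -6.70
```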

Financial institutions globally use the RiskFrontier software for credit portfolio management, valuation, capital optimization, risk-based pricing, performance management, and stress testing. It provides a granular analysis of a portfolio’s risk drivers through advanced analytics and modeling methodologies.

For more information, please visit http://www.moodysanalytics.com/riskfrontier2014.

About Moody’s Analytics

Moody’s Analytics helps capital markets and risk management professionals worldwide respond to an evolving marketplace with confidence. The company offers unique tools and best practices for measuring and managing risk through expertise and experience in credit analysis, economic research and financial risk management. By providing leading-edge software, advisory services and research, including proprietary analyses from Moody’s Investors Service, Moody’s Analytics integrates and customizes its offerings to address specific business challenges. Moody’s Analytics is a subsidiary of Moody’s Corporation (NYSE:MCO), which reported revenue of $2.7 billion in 2012, employs approximately 8,300 people worldwide and has a presence in 31 countries. Further information is available at www.moodysanalytics.com.
