EQECAT Releases RQE (Risk Quantification & Engineering) Catastrophe Modeling Platform

Today EQECAT announced the release of its RQE™ (Risk Quantification & Engineering) catastrophe risk modeling platform that enables clients to quantify and manage the potential financial impact of natural hazards.

RQE version 13 is the result of a multiyear initiative involving extensive collaboration with clients, prospects, and industry experts, and is the single largest release of its kind. While the new platform includes many improvements, EQECAT has preserved and leveraged its robust methodology and unique treatment of uncertainty, the hallmarks of EQECAT risk modeling.

“We are thrilled that RQE will provide significant and increased value to the global re/insurance market,” commented Bill Keogh, president of EQECAT. “With so much that differentiates us competitively, we look forward to satisfying the pent-up demand for our analytics. Having collaborated closely with leaders from virtually every segment and geography in the global re/insurance business, we are confident that RQE will disrupt the status quo of catastrophe risk modeling. All of us at EQECAT thank our existing and new clients for their collaboration throughout this development process and their confidence in RQE.”

Highlights of RQE v. 13 include:

  • Comprehensive portfolio aggregation
  • New financial model
  • Improved user interface
  • Improved import workflow
  • New database schema with 4-tier hierarchy
  • Significant improvements in import run times
  • Catastrophe model updates

Comprehensive Portfolio Aggregation

Standard output includes both an Event Loss Table (ELT) for occurrence-based metrics and a Year Loss Table (YLT) for aggregate loss metrics. 3G Correlation™ allows users to combine one or more YLTs into an aggregate YLT. Reports can be generated from the YLT for annual losses, Exceedance Probability (EP) curves, and the associated ELT.
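To make the relationship between a YLT and an EP curve concrete, the following is a minimal sketch in Python. It is purely illustrative: it assumes the simplest possible aggregation (summing year loss tables aligned by simulation year, then ranking annual losses) and does not represent EQECAT's 3G Correlation™ methodology, which handles correlation between portfolios. All names and figures are hypothetical.

```python
import numpy as np

def aggregate_ep_curve(ylts, return_periods=(10, 50, 100, 250, 500)):
    """Combine year loss tables (one annual loss per simulated year,
    aligned by year) and read off an aggregate exceedance-probability curve.

    Illustrative only: production models correlate portfolios at the
    event level rather than simply summing aligned simulation years.
    """
    combined = np.sum(ylts, axis=0)                  # aggregate annual loss per year
    n_years = combined.size
    losses = np.sort(combined)[::-1]                 # largest annual loss first
    exceed_prob = np.arange(1, n_years + 1) / (n_years + 1)  # plotting positions

    curve = {}
    for rp in return_periods:
        p = 1.0 / rp
        # largest rank whose exceedance probability is still <= 1/RP
        idx = np.searchsorted(exceed_prob, p, side="right") - 1
        curve[rp] = float(losses[idx]) if idx >= 0 else float("nan")
    return curve

# Example: two hypothetical portfolios simulated over the same 10,000 years
rng = np.random.default_rng(42)
ylt_a = rng.gamma(shape=0.5, scale=2e6, size=10_000)
ylt_b = rng.gamma(shape=0.3, scale=5e6, size=10_000)
print(aggregate_ep_curve([ylt_a, ylt_b]))
```

The key idea is that an EP curve is simply the ranked annual losses plotted against their exceedance probabilities; the value of an approach like 3G Correlation™ is that portfolios can be combined at this level without re-running the underlying analyses.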

New Financial Model

The implementation of the new 4-tier database hierarchy facilitates enhancements that enable more complete loss modeling for excess and surplus (E&S) lines, "step" policies, and other complex financial structures.
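As general background for readers unfamiliar with these structures, the sketch below shows the standard excess-of-loss layer calculation that any catastrophe financial model must support. It is generic industry math, not a description of RQE's financial engine; the function and figures are hypothetical.

```python
def layer_loss(ground_up: float, attachment: float, limit: float) -> float:
    """Loss to an excess layer: the portion of ground-up loss above the
    attachment point, capped at the layer limit.

    Generic illustration; a full financial model handles far richer
    structures (step policies, E&S terms, multi-layer programs, etc.).
    """
    return min(max(ground_up - attachment, 0.0), limit)

# Example: a $5M xs $10M layer against a $12M ground-up loss -> $2M to the layer
print(layer_loss(12_000_000, attachment=10_000_000, limit=5_000_000))
```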

Improved User Interface

The release offers significant improvements to the import process, exposure management, and report selection, providing an enhanced user experience.

Improved Import Workflow

RQE v. 13 includes a new import workflow to enable easier input and editing of exposure data.

New Database Schema

This release includes an improved, uniform database schema with a 4-tier hierarchy at the account and portfolio levels.

Significant Improvements in Import Run-time

Clients will experience faster run times thanks to new and improved import capabilities.

Catastrophe Model Updates

RQE v. 13 updates 178 country/peril models, covering vulnerability, hazard, and correlation/simulation. Hazard and vulnerability have been updated for a number of models to incorporate new scientific research and detailed analyses of claims and exposure data from recent events. Correlation and simulation updates were made to all country/peril models so that, using 3G Correlation™, multiple portfolios can be combined without the need to re-analyze.

EQECAT will host a multi-day conference to help clients own their view of risk by providing a thorough understanding of RQE v. 13 and the entire EQECAT catastrophe modeling process. The catastrophe modeling conference will be held at the Ritz-Carlton in Fort Lauderdale, Florida, on April 9–11, 2013.

Learn more about RQE catastrophe modeling, or read the press release online.

EQECAT connects re/insurance and financial services clients with the world’s leading scientific minds to quantify and manage exposure to catastrophic risk. Leveraging decades of experience, EQECAT’s comprehensive methodology is distinguished by a unique treatment of uncertainty that helps clients set rational expectations about risk.

RQE™ (Risk Quantification & Engineering), EQECAT’s new catastrophe risk modeling platform, will provide enhanced functionality and user experience with a new financial model, import workflow, and user interface. Increased analytical speed, expanded reporting, and improved integration capabilities provide clients with increased transparency and faster access to results for 180 natural hazard software models for 96 countries spanning six continents.

EQECAT, a subsidiary of ABSG Consulting Inc., was founded in 1994 and is headquartered in Oakland, California.

For more information, contact:
[email protected]
