EQECAT Releases RQE (Risk Quantification & Engineering) Catastrophe Modeling Platform

Today EQECAT announced the release of its RQE™ (Risk Quantification & Engineering) catastrophe risk modeling platform that enables clients to quantify and manage the potential financial impact of natural hazards.

RQE version 13 is the result of a multi-year initiative involving extensive collaboration with clients, prospects, and industry experts, and it is the single largest release of its kind. While there are many improvements within the new platform, EQECAT has preserved and leveraged its robust methodology and unique treatment of uncertainty, which are the hallmarks of EQECAT risk modeling.

“We are thrilled that RQE will provide significant and increased value to the global re/insurance market,” commented Bill Keogh, president of EQECAT. “With so much that differentiates us competitively, we look forward to satisfying the pent-up demand for our analytics. Having collaborated closely with leaders from virtually every segment and geography in the global re/insurance business, we are confident that RQE will disrupt the status quo of catastrophe risk modeling. All of us at EQECAT thank our existing and new clients for their collaboration throughout this development process and their confidence in RQE.”

Highlights of RQE v.13 include:

  • Comprehensive portfolio aggregation
  • New financial model
  • Improved user interface
  • Improved import workflow
  • New database schema with 4-tier hierarchy
  • Significant improvements in import run times
  • Catastrophe model updates

Comprehensive Portfolio Aggregation

Standard output includes both an Event Loss Table (ELT) for occurrence-based metrics and a Year Loss Table (YLT) for aggregate loss metrics. 3G Correlation™ allows users to aggregate one or more YLTs to create an aggregate YLT. Reports can be generated from the YLT for annual losses, Exceedance Probability (EP) curves, and the associated ELT.
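
For readers less familiar with these loss tables, the following is a minimal sketch of the general idea, written in Python and not taken from EQECAT's software: a YLT records the simulated aggregate loss for each year, several YLTs built over the same simulated years can be summed year by year into an aggregate YLT, and an EP curve follows from ranking the annual losses. The portfolio names and loss figures are invented for illustration.

    # Illustrative sketch only -- not EQECAT's implementation.
    # A Year Loss Table (YLT) maps a simulated year to its aggregate annual loss.
    from collections import defaultdict

    def aggregate_ylts(*ylts):
        """Sum several YLTs year by year into one aggregate YLT.

        Assumes all YLTs were simulated over the same set of years, which is
        what allows portfolios to be combined without re-analysis."""
        combined = defaultdict(float)
        for ylt in ylts:
            for year, loss in ylt.items():
                combined[year] += loss
        return dict(combined)

    def ep_curve(ylt):
        """Return (loss, exceedance probability) pairs derived from a YLT."""
        n_years = len(ylt)
        losses = sorted(ylt.values(), reverse=True)
        return [(loss, (rank + 1) / n_years) for rank, loss in enumerate(losses)]

    # Two hypothetical portfolio YLTs over the same five simulated years.
    property_ylt = {1: 1.2e6, 2: 0.0, 3: 4.5e6, 4: 0.3e6, 5: 9.8e6}
    marine_ylt = {1: 0.4e6, 2: 2.1e6, 3: 0.0, 4: 0.0, 5: 1.5e6}

    for loss, ep in ep_curve(aggregate_ylts(property_ylt, marine_ylt)):
        print(f"loss >= {loss:12,.0f}   exceedance probability {ep:.2f}")

In practice the annual losses come from the simulated event set and ELT rather than being typed in; the point of the sketch is the year-by-year aggregation and the ranking step behind an EP curve.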

New Financial Model

The new 4-tier database hierarchy enables more complete loss modeling for excess and surplus (E&S) lines, "step" policies, and other complex financial structures.
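
To make "complex financial structures" concrete, here is a generic sketch of two such structures; it is not the RQE financial model, and the terms and figures are hypothetical. An excess-of-loss layer responds above an attachment point up to a limit, while a "step" policy pays fixed amounts as loss thresholds are crossed.

    # Generic illustration of complex policy terms -- not the RQE financial model.

    def excess_layer_loss(ground_up_loss, attachment, limit):
        """Loss ceded to an excess-of-loss layer: the slice of ground-up loss
        above the attachment point, capped at the layer limit."""
        return min(max(ground_up_loss - attachment, 0.0), limit)

    def step_policy_payout(ground_up_loss, steps):
        """'Step' policy: a fixed payout is triggered at each loss threshold.
        `steps` is a list of (threshold, payout) pairs."""
        return sum(payout for threshold, payout in steps if ground_up_loss >= threshold)

    loss = 7_500_000  # hypothetical ground-up loss
    print(excess_layer_loss(loss, attachment=5_000_000, limit=10_000_000))          # 2500000.0
    print(step_policy_payout(loss, [(1_000_000, 250_000), (5_000_000, 750_000)]))   # 1000000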

Improved User Interface

The release offers significant improvements to the import process, exposure management, and report selection, providing an enhanced user experience.

Improved Import Workflow

RQE v.13 includes a new import workflow to enable easier input and editing of exposure data.

New Database Schema

This release includes an improved, uniform database schema with a 4-tier hierarchy at the account and portfolio levels.
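
The release does not name the four tiers; a common exposure hierarchy in catastrophe modeling runs portfolio → account → location → coverage, and the minimal sketch below assumes that layout purely for illustration.

    # Hypothetical 4-tier exposure hierarchy -- tier names assumed for
    # illustration, not taken from the RQE v.13 schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Coverage:
        peril: str            # e.g. "wind" or "earthquake"
        limit: float
        deductible: float

    @dataclass
    class Location:
        address: str
        coverages: List[Coverage] = field(default_factory=list)

    @dataclass
    class Account:
        name: str
        locations: List[Location] = field(default_factory=list)

    @dataclass
    class Portfolio:
        name: str
        accounts: List[Account] = field(default_factory=list)

In a schema along these lines, financial terms such as the layer and step structures sketched earlier can be attached at the account or portfolio level as well as per coverage.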

Significant Improvements in Import Run-time

Clients will experience faster run-times with new and improved import capabilities.

Catastrophe Model Updates

RQE v.13 updates 178 country/peril models, with changes to vulnerability, hazard, and correlation/simulation components. Hazard and vulnerability have been updated for a number of models to incorporate new scientific research and detailed analyses of claims and exposure data from recent events. Correlation and simulation updates were made to all country/peril models so that multiple portfolios can be combined using 3G Correlation™ without the need to re-analyze.

EQECAT will host a multi-day conference to help clients own their view of risk by providing a thorough understanding of RQE v.13 and the entire EQECAT catastrophe modeling process. The catastrophe modeling conference will be held at the Ritz-Carlton in Fort Lauderdale, Florida, on April 9–11, 2013.

Learn more about RQE catastrophe modeling, or read the press release online.

EQECAT connects re/insurance and financial services clients with the world’s leading scientific minds to quantify and manage exposure to catastrophic risk. Leveraging decades of experience, EQECAT’s comprehensive methodology is distinguished by a unique treatment of uncertainty that helps clients set rational expectations about risk.

RQE™ (Risk Quantification & Engineering), EQECAT’s new catastrophe risk modeling platform, will provide enhanced functionality and user experience with a new financial model, import workflow, and user interface. Increased analytical speed, expanded reporting, and improved integration capabilities provide clients with increased transparency and faster access to results for 180 natural hazard software models for 96 countries spanning six continents.

EQECAT, a subsidiary of ABSG Consulting Inc., was founded in 1994 and is headquartered in Oakland, California.

For more information, contact:
[email protected]

