Tackling Information Overload: Equipping the Digital Risk Professional
Emerging technologies place ever more data at the fingertips of insurance risk professionals, but with mixed results. Leveraging the right information at the right time requires an optimal mix of technologies, one that allows risk to be analyzed and priced profitably.
2018 will be recorded as another challenging year for commercial property insurance carriers, continuing the soft market of the previous three years.1 Paradoxically, this period of low premium growth coincides with natural disasters that have produced record claims.2 With climate change expected to increase the frequency and severity of such disasters, the ability of insurers to model these events and the resulting risks is essential.
This mounting pressure on the top and bottom lines makes accurate and efficient risk assessment the key to underwriting profitable business. Today’s Risk Professional leverages expertise, experience, gut feeling and data to assess the internal and external risks facing a potential client’s business.3 Thanks to emerging technologies, the volume of data available to the Risk Professional, today and just over the horizon, will grow exponentially. No single technology will provide the silver bullet for delivering increased accuracy and efficiency in calculating risk. Instead, the successful Risk Professional requires the right information at the right time, delivered cost-efficiently; and this, in turn, requires the right ecosystem of technologies to collect, collate, visualize, analyze and price risk.
Tapping into the potential of data
Commercial insurers are motivated to seek better risk information and efficiencies in their underwriting process to boost performance and cost savings. «Lean Underwriting», «Expedited submission handling» and «Express Quote» are some of the buzzwords that describe industry initiatives to improve speed to quote and hit ratio.4 In this austere and competitive market, the Risk Professional must expedite the delivery of high-quality risk assessments, including the information needed to quantify and qualify risk exposure.
Much of the technology investment to date has focused on the submission process. Just a few years ago, the industry standard was to key in submission information manually, often by relatively low-cost workers offshore. Today, insurers increasingly employ robotic process automation (RPA), text mining and other automation tools to improve data quality and value while avoiding manual and error-prone data entry. This enables insurers to extract, classify, enrich and analyze client data faster than ever before. But is it enough? These technologies do not necessarily guarantee a better or faster understanding of the overall risk. For this, a holistic approach is required, combining speed and accuracy. For an overview of the key pain points across the risk assessment value chain and the technological solutions available to alleviate them, please see fig. 1. The ability to interpret information early on and to steer the risk assessment process accordingly is key to improving underwriting.
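By way of illustration, the sketch below shows in highly simplified form how rule-based extraction might turn an unstructured submission text into structured fields. The field names and patterns are hypothetical and merely stand in for the far more capable RPA and text-mining tools described above.

```python
import re
from dataclasses import dataclass
from typing import Optional


@dataclass
class SubmissionRecord:
    """Structured fields extracted from an unstructured submission text (illustrative only)."""
    insured_name: Optional[str] = None
    total_insured_value: Optional[float] = None
    occupancy: Optional[str] = None


def extract_submission_fields(raw_text: str) -> SubmissionRecord:
    """Very simple rule-based extraction standing in for RPA/text-mining tooling."""
    record = SubmissionRecord()

    name_match = re.search(r"Insured:\s*(.+)", raw_text)
    if name_match:
        record.insured_name = name_match.group(1).strip()

    tiv_match = re.search(r"Total Insured Value:\s*\$?([\d,]+)", raw_text)
    if tiv_match:
        record.total_insured_value = float(tiv_match.group(1).replace(",", ""))

    occ_match = re.search(r"Occupancy:\s*(.+)", raw_text)
    if occ_match:
        record.occupancy = occ_match.group(1).strip()

    return record


if __name__ == "__main__":
    # Hypothetical submission excerpt, used only to demonstrate the extraction step.
    sample = """Insured: Acme Warehousing LLC
    Total Insured Value: $12,500,000
    Occupancy: Cold storage warehouse"""
    print(extract_submission_fields(sample))
```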
With advances in technology, a plethora of data sources providing risk insights is available, and this information pool will continue to grow. However, such information is only useful when it supports the decision-making process without overwhelming the recipient. Technologists argue that «the difference between a modernized insurer and one stuck in the last century is the ability to procure the right data at the right time».5 The Risk Professional therefore requires an easy-to-use solution that aggregates, categorizes and filters all relevant data.

Enabling the digital Risk Professional
What is required of an insurer to ensure that Risk Professionals have actionable data available at the right time to make analytics-driven decisions while improving accuracy and speed? To answer this question, we must take a close look at the ecosystem of technologies that will underpin this process.
Enabling tools for triaging risk data:
A variety of technologies can be leveraged to produce an instantaneous overview of insurable objects based on internal and external sources. Reports submitted by the broker and carrier can be analyzed by machine learning or semantic rule-based tools (e.g. «Cogito» by «Expert System») to extract and score relevant text passages. This information can then be considered alongside claims histories and benchmarks based on proximity, occupancy or construction similarity to further refine the risk analysis.
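As a minimal sketch of such triaging, the snippet below scores report passages against a hand-written keyword dictionary. The keywords, weights and threshold are purely illustrative assumptions; tools like «Cogito» rely on far richer semantic rules or trained models rather than a lookup table.

```python
# Hypothetical keyword weights; a production system would use a trained model or a
# semantic rules engine rather than a hand-written dictionary like this one.
RISK_KEYWORDS = {
    "sprinkler out of service": 5.0,
    "combustible storage": 3.0,
    "hot work": 2.0,
    "sprinklered": -2.0,  # protective features reduce the score
}


def score_passage(passage: str) -> float:
    """Score a single text passage from a broker or carrier report."""
    text = passage.lower()
    return sum(weight for phrase, weight in RISK_KEYWORDS.items() if phrase in text)


def flag_passages(report_passages: list[str], threshold: float = 3.0) -> list[tuple[str, float]]:
    """Return passages whose score exceeds the triage threshold, highest first."""
    scored = [(p, score_passage(p)) for p in report_passages]
    return sorted([s for s in scored if s[1] >= threshold], key=lambda s: -s[1])


if __name__ == "__main__":
    # Illustrative passages only.
    passages = [
        "Sprinkler out of service on the second floor; combustible storage observed nearby.",
        "The site is fully sprinklered and housekeeping is good.",
    ]
    for passage, score in flag_passages(passages):
        print(f"{score:+.1f}  {passage}")
```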
To enable data collection from external sources, the deployment of an API-oriented landscape is critical. Application programming interfaces (APIs) serve as the vehicle for obtaining a vast amount of third-party risk information. Service providers like «CatNet®», «NatCatSERVICE» and «HazardHub» provide data to better understand the natural catastrophe side of exposure. Artificial intelligence-driven solutions like «Betterview’s Property Profile» transform geospatial imaging into risk insights to better assess the type and condition of building construction. Pertinent news alerts and web-crawling algorithms reveal occupancy, activity or claims-history related information. For an overview of the various data source types, see fig. 2.
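The snippet below sketches what such an API integration might look like for enriching a location schedule with hazard data. The endpoint, parameters and response fields are hypothetical placeholders, not the actual interfaces of «CatNet®», «NatCatSERVICE», «HazardHub» or «Betterview»; each provider defines its own schema and authentication.

```python
import requests

# Hypothetical endpoint and response shape, used only to illustrate the pattern.
HAZARD_API_URL = "https://api.example-hazard-provider.com/v1/location-risk"


def fetch_location_hazards(latitude: float, longitude: float, api_key: str) -> dict:
    """Retrieve natural-catastrophe indicators for a single location via a REST API."""
    response = requests.get(
        HAZARD_API_URL,
        params={"lat": latitude, "lon": longitude},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    # Example (assumed) payload: {"flood_zone": "AE", "wind_tier": 2, "wildfire_score": 17}
    return response.json()


def enrich_locations(locations: list[dict], api_key: str) -> list[dict]:
    """Attach third-party hazard data to each insured location in a schedule."""
    for loc in locations:
        loc["hazards"] = fetch_location_hazards(loc["lat"], loc["lon"], api_key)
    return locations
```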

Enabling tools for data synthesis and analysis:
While APIs are the highways for delivering risk data, one problem still remains: What does one do with the huge amount of data once it arrives? In order for a Risk Professional to handle the data, it must be standardized and synthesized. Powerful, insurance industry-specific tools using machine learning and big data analytics are required to quickly sift through large amounts of data. This enables insurance carriers to clean, organize and present information to the user via an interconnected dashboard, visualizing the most important indicators in a clear, condensed way. Having achieved this, the Risk Professional can decide how to proceed with the risk assessment.
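One minimal way to standardize heterogeneous indicators into a dashboard-ready summary is sketched below. The value ranges and the equal weighting are illustrative assumptions; a carrier’s own underwriting framework would define the real indicators and weights feeding its dashboard.

```python
def normalize(value: float, low: float, high: float) -> float:
    """Map a raw indicator onto a 0-1 scale so different sources become comparable."""
    if high == low:
        return 0.0
    return max(0.0, min(1.0, (value - low) / (high - low)))


def risk_dashboard_row(location: dict) -> dict:
    """Condense heterogeneous indicators into the handful of figures a dashboard shows."""
    # Assumed raw ranges (0-100 wildfire score, 0-10 claims count, 0-10 text score).
    nat_cat = normalize(location.get("wildfire_score", 0), 0, 100)
    claims = normalize(location.get("claims_last_5y", 0), 0, 10)
    text_flags = normalize(location.get("report_risk_score", 0), 0, 10)

    # Illustrative equal weighting; real weightings come from the carrier's framework.
    composite = round((nat_cat + claims + text_flags) / 3, 2)
    return {
        "location_id": location.get("id"),
        "nat_cat": round(nat_cat, 2),
        "claims": round(claims, 2),
        "text_flags": round(text_flags, 2),
        "composite_risk": composite,
    }
```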
Overcoming industry barriers
Industry, legal and financial issues stand in the way of realizing benefits from advanced, automated risk assessment solutions. Risk assessment, and the commercial insurance industry more broadly, lack standardization and integration. Insurers use different frameworks, methodologies and guidelines to differentiate good from bad risk. Terminology, rating criteria and processes vary, making it difficult to transfer risk information.
As information can be easily obtained, one must be careful not to violate contractual and/or legal agreements related to data usage and storage. Risk assessments are often based on proprietary or even confidential information, particularly for business interruption coverage. Data governance and process compliance need to be prioritized. An additional challenge is financing such solutions, particularly in a soft market where budgets are tight. Benefits like a higher hit ratio or time savings are straightforward and, hence, easy to justify. However, the benefit case of a particular technology solution might be difficult to measure. Furthermore, investments are more likely to be questioned if the only goal is to gain a higher confidence level in the underwritten exposure. Subjectivity and uncertainty will always remain no matter how much time or resources are used. Technology investment for the sake of risk assessment must be justified by its ability to identify outliers and patterns that would not be detectable through existing means.
The right tools for the road ahead
Accuracy and speed are regarded as trade-offs in commercial insurance. Technology cannot fully resolve this tension, but it will enable both to be achieved to a greater extent than they are today. Having the tools and infrastructure to act on the right data will be the key differentiator of successful insurers in delivering superior risk analysis and, ultimately, superior underwriting.