Into – and out of – the valley
Unfortunately, it may be necessary to experience the uncanny valley effect to properly understand it.
Video 1, below, features Jordan Durlester, RGA Vice President of Data Analytics, talking about the essentials of a term life policy for a fictional insurance company. Video 2 features an AI-created representation of Jordan delivering the same script.
Have you ever felt uneasy watching a lifelike robot or CGI character? Watch the uncanny valley effect at work.
There is little to nothing overtly non-human about the Jordan Durlester seen in Video 2. Yet viewers asked to point out what is “wrong” with AI Jordan often struggle to identify exactly what is making them so uncomfortable.
That is the essence of the uncanny valley effect.
The original hypothesis1 behind this effect stated that, as the appearance of a robot is made more human, many observers’ emotional response is increasingly positive – until that appearance reaches a certain closeness to an actual human’s likeness. At that point, the emotional response quickly slides into strong revulsion.
Only after going through that valley and coming out the other side – to a virtually exact likeness of a human – does a person’s emotional response return to positive. Behavioral science – an umbrella term that encompasses a variety of disciplines, including psychology, economics, and neuroscience – seeks to understand why, as well as any implications of that “why.”
Defining the problem: Inside the comprehension gap
The proportion of US households covered by life insurance has steadily declined since the 1960s. The 2024 LIMRA Barometer shows that life insurance penetration dropped from 77% in 1989 to 63% in 2011 and 51% in 2024. This trend is concerning for the future of the individual life insurance industry, as it suggests consumers are failing to see the value that life insurance offers.
One reason for the long-term and ongoing decline may be the way in which life products are now sold. Insurers are increasingly turning to digital channels and direct-to-consumer (D2C) sales. These channels are often seen as beneficial because sales journeys can, in theory, be completed quickly, easily, and virtually. But it is not clear whether D2C sales are as effective at reaching consumers as anticipated. By removing the adviser from the process, a gap emerges that needs to be filled to help customers understand the products.
And that’s how the uncanny valley effect enters the discussion.
Video messaging in an AI age
RGA’s behavioral science team developed a study testing the impact of using both human- and AI-generated videos. It measured comprehension of the message based on the participants’ (N=2,005) ability to answer questions related to the digital journey. The study found that presenting information in video format significantly improves comprehension – a 15% gain when video was added on top of other behavioral science techniques.
Digital marketers often state that “people process visuals 60,000 times faster than text.”2 Although this may be an exaggeration, research suggests that people process visual information roughly six to 600 times faster than information presented as text.3 One marketing study found that viewers retain 95% of a message presented in a video, compared to 10% of the same material presented as text.4
Increasingly, videos are being narrated by computer-generated avatars rather than by humans. AI video messengers have key advantages over human versions. They are cheaper to use, increasingly easy and quick to create, and can be tailored to a specific role.
Overall, the interaction quality with AI beings can be similar to that achieved with human messengers.5 RGA’s research explored whether AI-generated avatars could become effective messengers for life insurance information.
What the research showed
The study tested how the messengers in the two video versions – the human-featured video (Video 1) and the AI-featured video (Video 2) – were perceived by measuring a range of characteristics and modeling the effect of these perceptions on comprehension and user experience. Participants were asked to what extent they agreed or disagreed that the characteristics described the on-screen subject.
Figure 1 shows the degree to which participants agreed or disagreed that a particular characteristic described the presenter they were randomly assigned to view.
Across most characteristics, participants viewed the human presenter as significantly more credible, expert, and likeable, and more similar in appearance, personality, and cultural background to themselves than the AI messenger. Behavioral science has demonstrated that these characteristics make messengers more effective – trusted, believed, and persuasive.
Notably, the AI messenger was perceived as significantly more unsettling than the human messenger.
This is the uncanny valley effect in action.
That unsettled feeling has a quantifiable impact on comprehension, which is why studying this phenomenon is so important for our industry. Whereas perceiving the messenger as credible, expert, and similar in cultural background led to gains in comprehension of the message being delivered, experiencing that messenger as unsettling significantly reduced comprehension (Figure 2).
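To make the modeling idea concrete, here is a minimal, purely illustrative sketch of how a perception rating can be related to a comprehension score. The data below are invented for illustration only – they are not the study’s data, and the actual analysis may have used a different model and scales.

```python
# Illustrative sketch only: invented data, not the RGA study's data.
# Shows how a perception rating (e.g., "unsettling") can be correlated
# with a comprehension score to quantify the direction of the effect.
import random

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Simulated participants: assume higher "unsettling" ratings (1-7 scale)
# pull down comprehension scores (0-10 scale), plus random noise.
unsettling = [random.uniform(1, 7) for _ in range(500)]
comprehension = [10 - 0.8 * u + random.gauss(0, 1) for u in unsettling]

r = pearson(unsettling, comprehension)
print(f"correlation between 'unsettling' and comprehension: {r:.2f}")
```

A negative correlation in a sketch like this corresponds to the pattern the study reports: the more unsettling the messenger feels, the less of the message lands.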
Conclusion
There is already a place for AI-enhanced customer engagement in insurance. For example, respondents in a separate RGA study on improving mental health disclosures said they were most comfortable sharing their conditions with website AI chatbots compared to independent advisers, online forms, phone sales agents, or their doctors (Figure 3).
That said, the uncanny valley effect is a real phenomenon. Until AI voice and character generation can more closely match actual human appearance, inflection, and mannerisms, AI avatar agents and customer care professionals may be best left on the sidelines.
Download the full RGA study for more details on how behavioral science can make life insurance product information simpler, more effective, and easier to understand.