May 2023

Improving Smoking and Alcohol Disclosures Using Behavioral Science

In Brief

In this article, we discuss RGA’s latest behavioral science research, which assessed the trade-offs between optimizing questions for alcohol and smoking disclosure, applicants’ experience of answering the questions, and what the results might teach about designing simple and effective applications.

Capturing accurate disclosures from applicants is vital to insurers. However, it is similarly vital that applicants have a smooth and pleasant experience when completing application forms. Behavioral techniques to improve questions often include breaking the “ask” into smaller, more digestible items. This might appear to make a question longer, but it can also make each item simpler to respond to, making the process easier for the applicant and more effective at capturing disclosures overall.


Introduction

Research has shown that even small adjustments to certain questions on life insurance applications, informed by behavioral science techniques, can lead to better disclosure rates[1] by mitigating certain psychological sources of misdisclosure.[2]

The techniques can include:

  • Reframing binary “yes or no” questions to conceal the underlying underwriting rule, making purposeful misdisclosure more difficult
  • Cueing applicants’ memory of their behaviors by providing greater specificity in questions, for example, by asking separate questions about the various alcoholic drink types
  • Reducing cognitive load (i.e., the mental effort required to answer a question) by breaking down general questions requiring complex thought into smaller pieces
  • Reducing the possibility of an applicant experiencing stigma (i.e., feelings of shame or embarrassment) around sensitive questions by providing response scales and wordings that subtly normalize the target behaviors

At first glance, these techniques might appear to make the specific questions longer, with more elements for an applicant to consider. Questions revised with these techniques might also take up more visual space. This raises concerns for some insurers about applicant experience: does it make an application more time-consuming and/or challenging to complete?

Recent RGA behavioral science research into application question design indicates that these concerns may be unwarranted. Although the enhanced questions appear longer, they turn out to be cognitively and emotionally easier for applicants to answer.

Designing Simple Underwriting Questions

When it comes to question design, simplicity is generally the goal. Often, insurers assume that simplicity comes from removing information or items in a question so that it appears shorter on a page. However, a psychological view suggests that achieving optimal simplicity comes from understanding how applicants process the information needed to produce a cogent and informative answer. Hence, there may be little trade-off overall between improved disclosures and applicant experience when using questions that allow people to process smaller amounts of information in one go but appear longer. Indeed, there may even be positive impacts.

We tested this hypothesis in a study using a randomized controlled trial format. The study drew on a pool of 8,000 participants from the U.S., Canada, South Africa, and Australia in nationally representative samples (i.e., samples that proportionally represented the demographic distribution of each country). Different versions of application questions were presented to participants as part of a survey that simulated an insurance application, and we measured subjective user experience, the time spent on questions, and the amount of information disclosed for each question format. We then used regression models to explore the impact of the different question types on the amount of information provided in responses while holding other variables, such as the devices participants used and their demographics, constant.
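
To make the analysis concrete, the sketch below shows the kind of regression that could be run on such data. It is illustrative only: the study's analysis code is not published, and the simulated data, field names, and choice of Python's statsmodels library are assumptions rather than RGA's actual pipeline.

```python
# Illustrative sketch only: simulated data and hypothetical field names
# (disclosed, question_version, device, country, age), not the study's code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 8_000  # matches the study's overall sample size

df = pd.DataFrame({
    "question_version": rng.choice(
        ["binary", "options_4", "options_6", "options_8"], size=n),
    "device": rng.choice(["mobile", "desktop"], size=n),
    "country": rng.choice(["US", "CA", "ZA", "AU"], size=n),
    "age": rng.integers(18, 70, size=n),
})

# Simulated outcome: 1 = disclosed smoking within the last two years.
p = 0.15 + 0.03 * df["question_version"].isin(["options_6", "options_8"])
df["disclosed"] = rng.binomial(1, p)

# Logistic regression: effect of question version on disclosure, holding
# device, country, and age constant (the binary question is the reference).
model = smf.logit(
    "disclosed ~ C(question_version, Treatment('binary'))"
    " + C(device) + C(country) + age",
    data=df,
).fit()
print(model.summary())
```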

Here is what we found for questions on smoking and alcohol use.

Smoking

We tested enhanced versions of an insurance application smoking question alongside the typical binary-style question (“Have you smoked or used nicotine products in the last two years? Yes/No”).

The enhanced versions asked, “When was the last time you smoked or used nicotine products?” and provided four, six, or eight response options, each relating to a different usage time period. For example, the four-option version let the respondent choose among “in the last 12 months,” “between 12 months and 2 years ago,” “2 or more years ago,” and “never.” The six- and eight-option versions let the applicant provide even greater specificity.

Providing these options both hides the underwriting rule and subtly destigmatizes smoking, making it more difficult for applicants to purposefully misdisclose and psychologically easier for them to be honest.
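
As a rough illustration of how the variants differ, the sketch below encodes the binary and four-option questions described above and shows how responses can still be mapped to the two-year underwriting rule without exposing it to the applicant. The dictionary schema and helper function are hypothetical, not a real survey-platform format; the six- and eight-option wordings are not given in the article.

```python
# Illustrative only: question and option wordings for the binary and
# four-option variants follow the article; the schema itself is hypothetical.
SMOKING_QUESTIONS = {
    "binary": {
        "text": "Have you smoked or used nicotine products in the last two years?",
        "options": ["Yes", "No"],
    },
    "enhanced_4": {
        "text": "When was the last time you smoked or used nicotine products?",
        "options": [
            "In the last 12 months",
            "Between 12 months and 2 years ago",
            "2 or more years ago",
            "Never",
        ],
    },
    # The six- and eight-option variants follow the same pattern, splitting
    # the time periods into finer bands (exact wording not specified).
}

def smoked_within_two_years(variant: str, answer: str) -> bool:
    """Apply the underwriting rule without revealing it in the question itself."""
    if variant == "binary":
        return answer == "Yes"
    return answer in {"In the last 12 months", "Between 12 months and 2 years ago"}
```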

Findings

The enhanced versions with six or eight response options, but not the four-option version, led to significantly increased disclosure rates: each yielded an additional three percentage points of respondents disclosing they had smoked within the past two years compared to the typical binary question.

The key concern for insurers is whether adding more options might make for a lengthier response process. We found, however, that the difference was very small. The typical binary question took an average of one and a half seconds less to answer than the four-option version, two seconds less than the six-option enhanced version, and two and a half seconds less than the eight-option version (Figure 1).

Interestingly, those who disclosed smoking took slightly longer to respond to the smoking questions than those who did not, but only when answering the enhanced versions. This suggests that the enhanced questions stimulated additional thought before a response was produced.[3] Indeed, the multiple-option response technique is purposefully used in question design to help users engage mindfully.[4] Non-disclosers, by contrast, showed no increase in response time.

This is a benefit for the applicant and the insurer, as the small increase in response time when using the enhanced questions yielded increased smoking disclosures, while response time for non-smokers was not affected.

We then explored how study participants experienced the actual process of answering the smoking question. We found no meaningful differences among the question types in terms of how easy or quick the question was to answer or how confident the participants felt about the accuracy of their responses. In fact, the enhanced questions were found to improve participants’ recall of the last time they smoked.

Overall, the trade-offs between increased disclosure rates and applicant experience favor using behavioral science-enhanced questions: the increase in response time is negligible and present only for those disclosing smoking, while the one noticeable difference in respondents’ perceptions of answering the enhanced questions is a positive one.


Alcohol

Alcohol consumption is usually assessed in life insurance applications using an open-format question, such as “In a typical week, how many alcoholic drinks do you consume?”

There are two concerns with this approach:

  • It is complicated for applicants to work out and then provide an accurate answer. They must think through and remember their alcohol consumption, decide what a typical week’s consumption consists of, and then create a reasonable estimate.
  • High levels of alcohol consumption carry a stigma, so applicants might be embarrassed to admit how much they actually consume.

The open-format question generally results in applicants estimating the number of drinks in a given time period that “feels” right and seems socially acceptable, rather than making the effort to provide a more accurate accounting. This reflects a behavioral science concept known as “satisficing,” in which people make “good enough” choices and decisions rather than taking the time to make optimal or well-thought-out decisions and judgments.[5]

We previously found that grouping types of alcoholic drinks and the frequency of their consumption into categories, such as “How many pints of beer/glasses of wine/shots of hard liquor do you drink in a typical week?”, and then asking applicants to respond using a numerical scale could improve disclosure rates.[1] Such questions, it was thought, would more effectively cue an applicant’s recall, reduce the mental effort required to answer, and anchor their perception of what is socially acceptable. If a respondent’s level of alcohol consumption sits, for example, towards the middle of the scale provided, it may feel less embarrassing to indicate as much on the application.

As with the smoking question, providing more choices for the applicant to select from may make the actual question appear longer, but may also improve applicants’ experience and disclosures.

We tested this hypothesis by structuring questions that split alcohol consumption into beverage categories and used either free-text input, checkboxes, or a sliding scale as the response mechanism. For comparison, some respondents instead received a typical free-text question that asked generically about total alcohol consumption, providing a control group.
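
The sketch below illustrates one way the control and enhanced alcohol questions could be represented, with a row per beverage category and a configurable response mechanism. The control wording and category labels follow the article; the 0-to-“10 or more” scale and the schema itself are illustrative assumptions.

```python
# Illustrative only: the control question wording follows the article; the
# scale values and structure are assumptions about how such a grid might be built.
CONTROL_QUESTION = {
    "text": "In a typical week, how many alcoholic drinks do you consume?",
    "response": "free_text",
}

BEVERAGE_CATEGORIES = ["Pints of beer", "Glasses of wine", "Shots of hard liquor"]

def build_enhanced_question(mechanism: str) -> dict:
    """Build an enhanced variant: one row per beverage category, answered via
    the chosen mechanism ('free_text', 'checkbox', or 'slider')."""
    scale = [str(n) for n in range(10)] + ["10 or more"]
    return {
        "text": "In a typical week, how many of each of the following do you drink?",
        "rows": [
            {
                "category": category,
                "mechanism": mechanism,
                # Checkboxes and sliders put a visible, normalizing scale in front
                # of the applicant; free text leaves the estimation entirely to them.
                "scale": scale if mechanism in {"checkbox", "slider"} else None,
            }
            for category in BEVERAGE_CATEGORIES
        ],
    }

checkbox_version = build_enhanced_question("checkbox")
```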

Findings

We first replicated previous findings by splitting types of alcoholic beverages (wine, beer, hard liquor) into categories and providing either free-text input boxes, a sliding scale, or a series of checkboxes for applicant responses. As with prior research, we again found that the individual-drink checkbox and sliding-scale versions increased disclosure rates. In fact, in this study, participants disclosed more than three times as much alcoholic beverage consumption as they did when responding to a typical free-text question.

Our focus, as previously mentioned, was on whether the enhanced questions might affect applicants’ experience in responding to them. We found that using beverage categories also increased the average response time to the question, although, as with the smoking question, the differences were small (Figure 2). The checkbox version was quicker for respondents to answer than the other enhanced versions and took around 15 seconds longer to answer than the free-text question.

We also found no differences in terms of perceived ease, speed, confidence in the accuracy, or effortful mental processing between the free text and the checkbox versions. Indeed, the checkboxes were seen as significantly more useful to respondents in helping them remember their alcohol consumption than the other question types.

Overall, the improvements in disclosure and mental processing were a significant win in exchange for an average response-time increase of 15 seconds.

Enhanced Questions Bring Order to Underlying Complexities

We had set out to test whether trade-offs might exist between increased disclosure rates and applicant experience when using behaviorally enhanced questions. In the case of both smoking and alcohol, the enhanced questions clearly produced better disclosure rates with minimal impact on applicants’ experience.

In both categories, response times were only slightly higher when using the enhanced versions. For example, an additional three percentage points of participants disclosed smoking activity in exchange for roughly two additional seconds of average response time. This would seem a strong reason for using enhanced questions.

Participants’ subjective experience of answering the questions was not negatively impacted by the behavioral enhancements. In fact, they described an improved ability to remember their behavior for both smoking and alcohol-related activities. This supports our hypothesis that categories stimulate participants’ memories, making disclosures easier to elicit.

To think through one’s behavior and categorize it appropriately can be deceptively challenging. Typically, when a question requires a binary or free-text response, the easiest route for respondents is to automatically respond with “no” or to estimate small, socially acceptable numbers. By presenting categories within enhanced question structures, a portion of the mental processing is already done for the applicant. The enhancements create, in effect, the structure that applicants would otherwise have to build mentally for themselves, disrupting the automatic “no” or “small amount” responses that are otherwise easiest to give.

And here we come back to simplicity. As Apple’s former chief designer, Jony Ive, has stated, simplicity is “much more than the absence of clutter. It’s about bringing order to complexity.”[6] When designing application forms, insurers would benefit from taking this stance. Visually longer questions may seem counterintuitive, but because these formats do the mental categorizing and organizing work that applicants would otherwise have to do themselves, the end result is not only better disclosure rates but also a noticeable improvement in applicants’ ability to remember their answers, without negative impacts on their experience.


Meet the Authors & Experts

Peter Hovard
Lead Behavioural Scientist, Risk and Behavioral Science

References

1. https://www.rgare.com/knowledge-center/media/articles/behavioral-economics-disclosure-gaps-and-customer-journeys
2. https://www.rgare.com/knowledge-center/media/articles/how-can-life-insurers-improve-the-dtc-application-process-a-behavioral-science-analysis
3. https://www.insurancethoughtleadership.com/customer-experience/there-such-thing-positive-friction
4. https://uclic.ucl.ac.uk/content/4-publications/0-design-frictions-for-mindful-interactions-the-case-for-microboundaries/cox.chi.2016.pdf
5. https://psycnet.apa.org/record/1957-01985-001
6. https://techcrunch.com/2013/06/11/jony-ives-debutes-ios-7-bringing-order-to-complexity/?guccounter=1