Does the way we talk about our Offers during Digital Conversations affect our Conversion Rates?


Digital conversations are increasingly prevalent throughout almost every customer and sales engagement, with both businesses and consumers. These technologies sometimes enhance the capabilities of a live customer service or sales representative; other times they replace the person altogether. They drive efficiencies within companies and even provide convenience for the buyer and customer.

Digital conversations often leverage automation and AI (artificial intelligence), in the form of sales cadence tools, customer engagement and orchestration platforms, and even chatbot technologies. As productive as these technologies can be, we need to look at the language used in these automated conversations to make sure we are effective, not just efficient.

In an attempt to improve the effectiveness of our customers’ digital conversations with a marketing bot, one area we decided to analyze was the “Offer” language used during those conversations. We looked into how we introduce an offer to further engage the customer or prospect, ultimately to schedule a meeting with a live person. One of the most common offers on a website is to have a salesperson provide a demo of a product or solution. From the data we analyze across our entire customer base, a “Demo Offer” is often one of the most effective offers at converting website traffic into sales-funnel opportunities.

With the marketing team of one of our customers, we decided to run an experiment around the digital conversation language within a Chatbot, and the words used to describe the “Demo Offer”.

The Strategy:

To test our thesis, we decided to run a “Controlled Experiment.” A controlled experiment is a scientific test that is directly manipulated by a scientist in order to test a single variable at a time. The variable being tested is the independent variable and is adjusted to see the effects on the system being studied. The controlled variables are held constant to minimize or stabilize their effects on the subject.

In our “Demo Offer” experiment, our independent variable (the element we tested) was whether the bot’s initial greeting named a specific product or no product at all.

We tested the language on two of the primary products our customer sells. On each section of their website that provides information on these two products, we presented a bot that used either product-specific or generic language. We evaluated the variations on two success criteria:

  1. Bot Engagements to Email Collected.
  2. Bot Engagements to Meeting Booked.

The Experiment:

The following were the “Demo Offer” Greeting Test Variations:

  1. Control Variable – “Let’s get your personalized demo set up!”
  2. Independent Variable #1 – “Let’s get your personalized [Product Name #1] demo set up!”
  3. Independent Variable #2 – “Let’s get your personalized [Product Name #2] demo set up!”
“Demo Offer” Greeting Bot Variations

The Results:

The results of the experiment surprised us. We expected the test variations where we used product-specific language in the greeting of the bot to produce much higher conversion rates of both emails captured and meetings booked. However, this was not the case. The results were largely the same regardless of the language we used relating to the product. In fact, people seemed to slightly prefer generic language that didn’t specify product names.

This is how the experiment fared by the numbers:

| ChatFunnels Results Analysis | Bot Engaged | Emails Captured | Meetings Booked | Email Conversion Rate | Meeting Conversion Rate |
|---|---|---|---|---|---|
| Product 1 Control | 387 | 19 | 9 | 4.90% | 2.32% |
| Independent Variable #1 | 370 | 18 | 7 | 4.86% | 1.89% |
| Product 2 Control | 563 | 18 | 8 | 3.19% | 1.42% |
| Independent Variable #2 | 553 | 17 | 7 | 3.07% | 1.26% |
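As a sanity check (not part of the original write-up), the conversion-rate columns can be reproduced from the raw counts. The percentages in the table appear to be truncated, rather than rounded, to two decimal places, so the sketch below truncates as well:

```python
import math

# Raw counts from the results table: (bot engaged, emails captured, meetings booked).
rows = {
    "Product 1 Control":       (387, 19, 9),
    "Independent Variable #1": (370, 18, 7),
    "Product 2 Control":       (563, 18, 8),
    "Independent Variable #2": (553, 17, 7),
}

def pct(numerator, denominator):
    """Conversion rate as a percentage, truncated to two decimals
    (matching how the table's figures appear to have been produced)."""
    return math.floor(10000 * numerator / denominator) / 100

for label, (engaged, emails, meetings) in rows.items():
    print(f"{label}: {pct(emails, engaged)}% email, {pct(meetings, engaged)}% meeting")
```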

These were the A/B experiment test results:

Controlled Experiment “Demo Offer” – Test Results

Lessons Learned:

The experiment suggests that when the bot did not use the product name in the initial greeting, people felt slightly more comfortable providing their email (a 1–4% relative difference) and eventually booking a meeting through the bot (roughly 12–19% more). These results were contrary to our expectations; we anticipated that increased specificity would lead to increased conversions. The explanation is not clear to us. It is possible that the longer sentence length affected continued engagement, or that the more specific language felt limiting. However, we are only speculating on possible causes.
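One way to quantify "largely the same" (our own check, not part of the original analysis) is a pooled two-proportion z-test on each control/variant pair. With counts this small, none of the observed differences come close to conventional statistical significance:

```python
import math

def two_proportion_p_value(x1, n1, x2, n2):
    """Two-sided, pooled two-proportion z-test. Returns the p-value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

# Control vs. variant counts (successes, trials) from the results table.
comparisons = {
    "Product 1 emails":   (19, 387, 18, 370),
    "Product 1 meetings": (9, 387, 7, 370),
    "Product 2 emails":   (18, 563, 17, 553),
    "Product 2 meetings": (8, 563, 7, 553),
}

for name, args in comparisons.items():
    print(f"{name}: p = {two_proportion_p_value(*args):.2f}")
```

Every p-value here comes out well above 0.05, which supports reading the differences as noise rather than a real effect of the greeting language.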

Takeaways from this Experiment:

  1. It is best to use generic rather than product-specific language when describing offers in bot dialogues.
  2. As always, do not trust intuition; experiment to identify the highest-performing language for your bot dialogues.