
Guest Article: Are Thoughtless Analytics Driving Your Customers Away?

The thoughtless algorithm drives thoughtless analytics. The algorithm that makes Amazon send me an email a day about shoes after I couldn’t find the brand I wanted on their site. The one that put a father’s deceased daughter in his year in review and said, “Go brag about how great 2014 was.” The one that shows me testing-software ads on LinkedIn because I ran a software test team ten years ago. These are fire-and-forget algorithms that work enough of the time to be kept in use, but not well enough to really be useful. They used to be an annoyance but are quickly becoming a customer turn-off.

There’s a growing backlash against thoughtless analytics. As data science is applied to customers more often, those customers are starting to set expectations for how well the algorithms should work. As I talk to customers about the analytics and targeting they see every day, I’m starting to hear a deep hatred for the thoughtless algorithm.

“I know they have all this information about me.  How do they not know?” – Your Customers

Customers know businesses are gathering data about them.  That’s been going on for long enough to sink into our collective consciousness.  What’s new is customers expect a return for all the data they’re giving away for free.

That list of expectations includes:

  • Smarter targeted marketing & offers.
  • Greater simplicity in researching & buying products.
  • Increasing levels of human sensitivity in automation.
  • Greater personalization of brand experiences & loyalty rewards.
  • Less intrusive advertising.

My recent experience with Amazon is a great example of the new customer expectation. I ran a search on Amazon that returned no results, then left the site to make my purchase elsewhere. As a result, for two weeks I was flooded with emails pitching shoes I didn’t want. One email I could understand: maybe I didn’t find what I was looking for anywhere, so here’s a list of alternatives. Nice, thanks Amazon, but I’m good. I told them that by not clicking on anything in the email. Yet the emails continued.

Somewhere in Amazon’s massive complex of servers and analytics, a thoughtless algorithm kept the deluge coming. It really highlighted two things for me: Amazon doesn’t have the brands I want, and they don’t know me well enough to have figured that out. From a customer lifetime value (CLV) perspective, that’s brand suicide.

Customers Expect a Return for the Data They Allow Businesses to Collect

The backlash has hit companies thought to be data science masters.  Facebook’s year in review feature has been making headlines for showing deceased relatives and ex-wives as reasons to brag about how great 2014 was.  The negative press signifies the growing expectations customers have when it comes to how their data is used.

Disney is a great example of a company giving customers a great return in exchange for their data.  They ask new customers how often they want to receive offers and what types of offers are most interesting.  They set the expectation for what the customer will be getting from the Disney brand in terms of personalization by asking.  They live up to that expectation using a hybrid of survey data and analytics.

What Disney understands, and companies using thoughtless algorithms don’t, is the limits of their analytics capabilities. A good algorithm for targeted marketing starts out accurate somewhere between the high 60s and low 80s percent of the time. With refinement it reaches the mid-to-high 90s. What Disney looked at was the impact on customers when the algorithm got it wrong during that refinement process. For their brand, that impact was unacceptable.

A Hybrid Approach: Recognizing the Limits & Impacts of Analytics

The hybrid approach is Disney’s, and a growing number of other companies’, solution to using analytics while avoiding the customer disenchantment that comes from thoughtless algorithms. The success rates of data-science-based marketing are leaps ahead of what businesses were achieving just four or five years ago, and so are customer expectations. However, the companies successfully navigating this new reality are admitting that even 80% to 90% accuracy just isn’t enough. In response, they are letting surveys and other traditional methods of data gathering fill in the information that the algorithms can’t.

Where algorithms provide high levels of certainty (greater than 95%), they’re leveraged. Rather than abandoning areas where the algorithms’ uncertainty is unacceptable, traditional methods are used as a stopgap. The data gathered from those methods allows the algorithm to be refined until it achieves a level of certainty the business is comfortable with. As time goes on, these now highly accurate algorithms can replace the traditional methods.
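
Here is one way that gating could look in practice. This is a minimal sketch, assuming a hypothetical model object that returns a confidence score along with its prediction, plus a simple dictionary of survey-stated preferences; it is not any particular company’s implementation.

```python
# A minimal sketch of the hybrid gating idea: trust the algorithm only above
# a confidence threshold, otherwise fall back to what the customer has told
# the business directly. The model interface, the threshold, and the survey
# lookup are hypothetical placeholders.

CONFIDENCE_THRESHOLD = 0.95  # act only on predictions the model is very sure about


def choose_offer(customer_id, model, survey_preferences):
    """Return a marketing offer, using the model only when it is confident."""
    probability, offer = model.predict_with_confidence(customer_id)  # hypothetical API

    if probability >= CONFIDENCE_THRESHOLD:
        # High certainty: leverage the algorithm.
        return offer

    # Uncertain: fall back to the customer's stated preferences from surveys.
    stated = survey_preferences.get(customer_id)
    if stated is not None:
        return stated

    # No stated preference either: send nothing rather than a thoughtless offer.
    return None
```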

There is a massive gap between analytics hype and a disciplined, integrated approach to data science. One key point to keep in mind is that applied data science is only about ten years old, starting with Google in 2004. The majority of data scientists have between one and five years of experience. Only a handful of us have been on enough projects to understand the implications of the algorithms we’re implementing. That’s the biggest contributor to the hype: the difference between the theoretical and the applied. In theory we can do a lot with data. In reality there are unintended consequences of overreaching practical applications.

Think Before You Deploy Analytics & Test Everything

There are some important questions to ask before putting an algorithm into production, and a data science steering committee is the perfect forum to discuss them. A quick sizing example follows the list below.

  • How many customers will be impacted by the margin of error or uncertainty?
  • What’s the potential impact when the algorithm gets it wrong?
  • What are the downstream and future impacts of an inaccurate prediction?
  • What’s the benefit of this algorithm when it’s accurate vs. the harm of failure?
  • What would the customer reaction be if they learned the business is using this algorithm?
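
To make the first two questions concrete, a quick back-of-the-envelope calculation helps; the audience size and accuracy figures below are purely illustrative assumptions.

```python
# Back-of-the-envelope sizing of the error impact. The numbers are
# illustrative assumptions, not figures from any real campaign.

audience_size = 2_000_000   # customers the algorithm will touch
accuracy = 0.90             # how often the targeting gets it right

wrongly_targeted = audience_size * (1 - accuracy)
print(f"Customers who get a thoughtless experience: {wrongly_targeted:,.0f}")
# With these assumptions, a 90%-accurate algorithm still misfires on 200,000 people.
```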

The underlying theme is to assess what could go wrong, but that’s not always an easy thing to do. Much of applied data science is highly innovative; first-of-their-kind algorithms are common even in smaller data science groups. The only way to understand the impacts is to run traditional quality assurance on data science applications. A good data test team is worth its weight in gold. Having an early adopter group to beta test analytics-driven features is another great way to discover unexpected, undesirable functionality.
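
As one illustration of what that QA could look like, here is a minimal pytest-style sketch. The email_campaign module and its should_send_followup function are hypothetical, standing in for whatever analytics-driven feature the data test team is checking.

```python
# A minimal sketch of traditional QA applied to an analytics-driven feature.
# The email_campaign module and should_send_followup() are hypothetical,
# standing in for the feature under test.

from email_campaign import should_send_followup


def test_followups_stop_after_no_engagement():
    """A customer who ignores the first offers should not keep receiving them."""
    history = {
        "emails_sent": 3,
        "emails_clicked": 0,
        "days_since_failed_search": 14,
    }
    assert should_send_followup(history) is False


def test_single_followup_after_failed_search_is_reasonable():
    """One alternatives email right after a failed search is acceptable."""
    history = {
        "emails_sent": 0,
        "emails_clicked": 0,
        "days_since_failed_search": 1,
    }
    assert should_send_followup(history) is True
```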

Avoid thoughtless algorithms at all costs.  Your customers will thank you with their loyalty.

Author Bio:

Vineet Vashishta is the founder of V-Squared Consulting, a leading-edge data science services company. He has spent the last 20 years in retail/eComm, gaming, hospitality, and finance building the teams, infrastructure, and capabilities behind some of the most advanced analytics companies in the US.
You can follow him on Twitter:  @V_Vashishta.
