Marketing Automation: Utopia or Dystopia?

Businesses need to consider consumer psychology and resist the temptation to maximize short-term profits at the expense of consumers.

From segmentation to pricing, virtually every process involved in marketing can now be automated. The ability to track individuals’ online behavior and merge data sources increasingly allows marketers to target consumers at a granular level. Using machine learning-based algorithms, individuals can receive tailored product offers and advertisements, all in real time.

Such precise targeting increases business profitability, while allowing consumers to benefit from conveniences and offers tailored to their needs. However, it can also have negative economic and psychological consequences for consumers. The question becomes: how do you ensure that marketing automation doesn’t create a dystopia?

Profit Maximization   

Businesses maximize their profits when they sell their product or service at the highest price each customer is willing to pay. In the past, marketers could not easily determine individual willingness to pay (WTP), so consumers often paid less than they would have been willing to and kept some of the surplus. Today, machine learning-based prediction algorithms can provide ever more accurate estimates of a consumer’s WTP.
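To make this mechanism concrete, here is a minimal sketch of what WTP-based individualized pricing could look like. Everything in it is a hypothetical illustration: the synthetic data, the feature set, and the personalized_price rule with its 0.95 margin are assumptions made for exposition, not any firm’s actual system.

```python
# A minimal, purely illustrative sketch of WTP-based individualized pricing.
# The data, features, and pricing rule are hypothetical assumptions, not any
# firm's actual system.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(seed=0)

# Hypothetical behavioral features (e.g., past spend, browsing intensity)
# and historical purchase prices standing in for each customer's revealed WTP.
X = rng.normal(size=(1000, 5))
true_weights = np.array([8.0, 3.0, -2.0, 5.0, 1.0])
observed_wtp = 50 + X @ true_weights + rng.normal(scale=4.0, size=1000)

# Train a model to predict WTP from the behavioral features.
model = GradientBoostingRegressor().fit(X, observed_wtp)

def personalized_price(features: np.ndarray, margin: float = 0.95) -> float:
    """Quote a price just below the predicted WTP to capture most of the surplus."""
    return margin * float(model.predict(features.reshape(1, -1))[0])

# One customer's price under individualized pricing vs. a uniform price.
customer = rng.normal(size=5)
print(f"personalized: {personalized_price(customer):.2f}, "
      f"uniform: {observed_wtp.mean():.2f}")
```

The crude rule of quoting a fixed fraction of predicted WTP is only for illustration; the economic point is that once WTP can be predicted per person, the uniform price, and the surplus it left consumers, tends to disappear.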

In one experiment, the job marketplace ZipRecruiter.com found it could increase profits by more than 80% by adopting algorithm-based individualized pricing using more than a hundred customer variables. Uber already uses machine learning to set route-specific and time-of-day prices, and it could easily use customer trip histories and other personal data to personalize prices further.

These developments can be alarming for consumers. While personalized pricing may benefit consumers with lower WTP who might otherwise be shut out of the market, many consumers will likely end up paying prices closer to their WTP.

Low remuneration of personal data

As a rule, consumers give away the information needed to infer their preferences and their WTP for free. But shouldn’t they receive more in return than the convenience of customization? Companies, for their part, argue that consumers are rewarded with better deals and free services such as YouTube videos and social media.

In research I conducted with INSEAD’s Daniel Walters and Geoff Tomaino, we found that consumers systematically undervalue their private data when they barter it for goods or services rather than selling it for cash. Take users of social media platforms: they “pay” for these services with private data, which the platforms use to generate advertising profits. Our experiments suggest that consumers undervalue their private data in such non-monetary exchanges, even though they know how profitable social media platforms are. This unequal exchange of value likely contributes to the extraordinary valuations of dominant tech companies.

Autonomy loss

We all value being autonomous in our choices, free from outside influence. But such autonomy requires privacy: without it, we become predictable. Algorithms can then easily predict anything from our risk of credit default to our likelihood of buying certain products.

Other experiments I conducted with Wharton’s Rom Schrift and Yonat Zwebner showed that consumers act as if their autonomy is threatened when they learn that algorithms can predict their choices. When participants learned that an algorithm could predict what they would choose, they chose less-preferred options to reassert their sense of agency. To maximize the acceptance of prediction algorithms, marketers will need to frame them in a way that does not threaten consumers’ perceived autonomy.

Algorithms as a black box

The complexity of these algorithms often makes them difficult to explain, and many cannot be made transparent for competitive reasons. Regulators worry — and consumers get angry — when they don’t understand why an algorithm does what it does, such as when it blocks a desired financial transaction or grants a specific credit limit.

Articles 13 to 15 of the GDPR oblige companies to provide customers with “meaningful information about the logic involved” in these automated decisions. In another set of experiments, we found that informing rejected consumers of an algorithm’s goals was just as meaningful to them as explaining how the algorithm arrived at its negative evaluation. Consumers derived a sense of fairness from understanding the goal of the algorithm.

How to mitigate the dystopia associated with marketing automation

Preventing dystopian outcomes is usually the purview of regulators, but companies must also put policies in place to address consumer concerns. Marketing automation poses complex challenges that require an array of solutions. These include data privacy regulations, mechanisms to ensure effective prices for personal data, and the deployment of fair privacy policies by companies. The following measures should also have a mitigating effect.

Regulation to support both privacy and competition

To improve market efficiency (by preventing the collection of personal data without adequate compensation for consumers), regulators must both protect consumer privacy and encourage competition. This poses a conundrum: policymakers need to protect innovation and competition among data-driven companies so that no company can monopolize its market too easily. But fostering competition requires sharing consumers’ personal data between companies, which implies less privacy (witness Apple’s iOS requirement that apps obtain user permission before tracking them across other companies’ apps and websites, which has hampered, among other things, Facebook’s ad-targeting capabilities). This paradox requires a delicate balancing act. One solution could be to give consumers legal ownership of their data and create mechanisms for them to sell or rent it, fostering competition.

Data Transparency

Instead of opposing regulators’ efforts, companies should give consumers more say over their own data. Transparency about the collection and use of personal data can help restore consumer confidence in automated marketing. Ceding some control over consumer data may limit opportunities for price discrimination, but it will protect brands and long-term profits.

Framing algorithms in a positive light

Although algorithms sometimes breed mistrust, they can be more efficient and accurate than humans and improve our lives. However, companies must consider consumer and regulatory concerns when designing them; otherwise they risk triggering widespread resistance. Rather than emphasizing that algorithms can predict what a consumer will do, marketers should portray them as tools that help consumers make choices in line with their own preferences. Transparency about how algorithms work can further reduce skepticism. Where full transparency isn’t possible, explaining the algorithms’ goals can go a long way toward reducing fears associated with AI-based decisions.

Avoiding a marketing automation dystopia is in everyone’s best interest, at least in the long term. To that end, companies must take consumer psychology into account and resist the temptation to maximize short-term profits at the expense of consumers.

This article is adapted from an original article published in the NIM Marketing Intelligence Review.

Klaus Wertenbroch is the Novartis Chaired Professor of Management and the Environment and Professor of Marketing at INSEAD. He directs the Strategic Marketing Programme, one of INSEAD’s executive education programmes.
