Personalisation is one of the most important components of modern marketing strategy. To see why, just think back to how many times you have ignored standard marketing emails from brands but clicked on the ones that seemed relevant to your needs. Personalisation has many dimensions, and in this blog post we focus on one unique dimension: the emotional state of the user interacting with the brand. At ParallelDots, our AI research group has taken some steps in this direction and created a unique algorithm that detects the underlying emotion behind a string of text.
Earlier, we gave you a heads-up on emotion detection technology and how we do it at ParallelDots. In this article, we discuss the applications of emotion detection technology in marketing in particular. The term "marketing" probably has the most dynamic definition ever, yet if you think about it, we haven't completely achieved dynamic marketing. That is where emotion detection technology comes in.
Emotional Connect – The secret ingredient in effective marketing campaigns
Emotions drive users to purchase. Brands have started shifting to more emotionally resonant advertisements, a trend that emerged prominently in 2015's Super Bowl commercials. Consumers used words like "empowering," "positive," "moving," "inspirational," "touching," and "uplifting" to describe the ads. They forwarded, shared, and celebrated ads from The Ad Council, Gillette, Mattel, Lean Cuisine, and Microsoft, making them five of the top viral emotive ads of 2015. These ads appealed to people across ages, genders, and other demographic breaks, irrespective of the product or subject matter, and delivered messages that people overwhelmingly declared the "best thing about the ad". The positive emotions such campaigns evoke become associated with the brand and boost its image.
Understanding emotions – Explicit Structured Feedback
What could be better than customers actually telling you how they felt about a marketing campaign by casting a vote for a particular emotion out of a structured list of options? This sounds good in theory but is almost impossible to implement in practice. However, one exceptional company is able to pull it off: Facebook. Now that Facebook has incorporated emotional reactions to posts, it can better personalise the browsing experience for each person and better understand what kind of ads a customer likes. The implications of how this can be used to improve the platform are huge.
Understanding Emotions – Analysing Unstructured Text
With increased digitisation, there are many avenues where brands can get feedback from customers in the form of text, such as emails, chats, and social media messages. With the advent of deep learning, the ability of algorithms to understand the meaning behind text has improved considerably. At ParallelDots, we have built one such algorithm that analyses a string of text and deduces whether the underlying emotion is happy, excited, angry, sad, or indifferent. Some of our clients use this algorithm to analyse how people react to their marketing campaigns on social media platforms like YouTube, Facebook, Twitter, and Instagram. This gives them objective feedback about what kind of content is working and enables them to fine-tune their marketing strategy accordingly.
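To make this concrete, here is a minimal sketch of how such per-comment output can be rolled up into campaign-level feedback. The sample score dictionaries and the helper functions are our own illustration, not the actual ParallelDots API response format; we simply assume the classifier returns a confidence score for each of the five emotions per comment.

```python
from collections import Counter

# Hypothetical classifier output: one confidence-score dict per social-media
# comment. A real emotion-detection API would return something similar.
comments = [
    {"Happy": 0.71, "Excited": 0.15, "Angry": 0.04, "Sad": 0.05, "Indifferent": 0.05},
    {"Happy": 0.10, "Excited": 0.80, "Angry": 0.03, "Sad": 0.02, "Indifferent": 0.05},
    {"Happy": 0.05, "Excited": 0.07, "Angry": 0.75, "Sad": 0.08, "Indifferent": 0.05},
]

def dominant_emotion(scores):
    """Return the emotion with the highest confidence for one comment."""
    return max(scores, key=scores.get)

def campaign_summary(all_scores):
    """Count how many comments fall under each dominant emotion."""
    return Counter(dominant_emotion(s) for s in all_scores)

print(campaign_summary(comments))
```

A summary like this, computed daily per campaign, is the kind of objective signal that tells a marketing team which content is landing well and which is provoking anger or indifference.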
Companies can leverage such AI tools to classify customers into categories based on the emotion they express when interacting with the brand, and use this to drive targeted engagement. For example, if a brand wants to launch a limited edition of a new product, it can show ads only to those customers who expressed excitement about the previous product launch. Alternatively, if a customer is typically angry when interacting with the brand, targeted discounts can be offered to assuage them.
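The segmentation step above can be sketched in a few lines. The customer records and the `segment` helper are hypothetical, assuming each customer has already been tagged with the dominant emotion detected in their past interactions:

```python
# Hypothetical customer records: each holds the dominant emotion detected
# in that customer's interactions around the previous product launch.
customers = [
    {"id": "c1", "emotion": "Excited"},
    {"id": "c2", "emotion": "Angry"},
    {"id": "c3", "emotion": "Excited"},
    {"id": "c4", "emotion": "Indifferent"},
]

def segment(customers, emotion):
    """Pick the customers whose dominant emotion matches the target."""
    return [c["id"] for c in customers if c["emotion"] == emotion]

launch_audience = segment(customers, "Excited")   # ads for the limited edition
discount_audience = segment(customers, "Angry")   # targeted discounts

print(launch_audience)    # ['c1', 'c3']
print(discount_audience)  # ['c2']
```

In practice these segments would feed directly into an ad platform's audience-targeting settings rather than being printed.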
Understanding Emotions – Analysing Images
We thought of explaining how AI (convolutional neural networks, to be precise) can now effectively analyse the emotion behind images, but decided instead to let the picture below speak a thousand words. Using FaceApp, you can see how a portrait of Beethoven has been transformed to show how he would have looked as a child, as a woman, or when very happy. The fact that we can transform emotions in existing images leaves little doubt about our ability to detect emotions in an image.
In January 2016, Apple acquired Emotient, an emotion detection technology company. Emotient holds a patent for a method of collecting and labelling up to 100,000 facial images a day, supporting a computer's ability to recognise facial expressions. It is fair to expect that Emotient's emotion recognition technology will start appearing in iPhones and iPads soon, which could enable more targeted and dynamic engagement as users browse and interact with Apple's platforms.
Another company, Affectiva, claims to have built the world's largest database of facial expressions and their corresponding emotions. It uses its technology to help media companies, market research firms, and brands get more detailed consumer insights. By some estimates, the emotion detection market will be worth over $20 billion by 2020, which makes it a highly investible area. Companies like Unilever, P&G, Mars, Honda, Kellogg, and Coca-Cola are already using emotion analysis to understand their audiences.
A sneak peek into the future
It wouldn't be a stretch if, tomorrow, you complain about the battery life of a particular brand on social media and then see a personalised ad about the superior battery of a competing brand. The cameras in your phone and laptop could potentially record how your facial expressions change based on the ad you are watching. So if you are a vegan and cringe at an ad showing a bucket of KFC chicken wings, that subtle expression could become a signal for KFC not to show you that ad copy. We could see the rise of emotion-based A/B testing, where multiple copies of an ad are tested and the ones that generate the best emotional response are selected. Sounds too far-fetched? Global market research leader Nielsen is already testing media content for clients by measuring people's neural activity; the difference is that Nielsen does it by deploying medical hardware to track the neural and biometric signals of a focus group.
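Emotion-based A/B testing could be as simple as the sketch below. The reaction tallies, the positive/negative groupings, and the scoring rule are all our own illustrative assumptions; a real system would weight emotions according to the campaign's goals.

```python
# Hypothetical tallies of detected emotional reactions per ad variant.
reactions = {
    "variant_a": {"Happy": 120, "Excited": 80, "Angry": 30, "Sad": 10, "Indifferent": 60},
    "variant_b": {"Happy": 60, "Excited": 40, "Angry": 90, "Sad": 40, "Indifferent": 70},
}

POSITIVE = {"Happy", "Excited"}
NEGATIVE = {"Angry", "Sad"}

def emotional_score(tally):
    """Net positive reactions: positive minus negative, ignoring indifference."""
    pos = sum(v for k, v in tally.items() if k in POSITIVE)
    neg = sum(v for k, v in tally.items() if k in NEGATIVE)
    return pos - neg

# The winning variant is the one with the best net emotional response.
winner = max(reactions, key=lambda v: emotional_score(reactions[v]))
print(winner)  # variant_a (net score 160 vs -30)
```

The same loop generalises to more than two variants, which is exactly where automated emotion detection beats panel-based methods on cost and speed.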
At their core, brands are defined by the emotions they evoke in people. The key insight is that at an emotional level, we subconsciously form opinions that have a huge influence on our purchase decisions. With the advent of AI, it seems inevitable that brands will increasingly assess what kind of emotions they evoke in their customers, and the adoption of these techniques could be faster than you expect. It remains to be seen how exactly this will evolve, but at ParallelDots we have taken some steps in this direction with our emotion analysis API. If you have any thoughts on this topic, we would love to hear them.
ParallelDots is an Artificial Intelligence research and Deep Learning startup that provides AI solutions to clients in multiple domains. You can check out some of our text analysis APIs and reach out to us by filling out this form.