Putting it to the test
An A/B test for the Psychology of Engagement
While the previous discussions of self-determination theory and intrinsically motivating experiences make good intuitive sense, we weren’t satisfied to take them at face value. If modern businesses really do need to compete for attention like never before, we wanted hard, unarguable proof that a correct application of psychology can help companies win that competition and succeed in their growth objectives.
So, we did the only sensible thing and partnered with media giant Nielsen and the world-leading team at Lumen Research to design a scientific study that would test the effectiveness of all this psychological theory, one way or the other.
The idea for the study was simple. Take a single piece of content and reproduce it in two very different forms:
One “control” version, presented in a standard, unenhanced reading format.
One “psychologically enhanced” version, deploying all the lessons of self-determination theory, the picture superiority effect, and more.
The piece of content we landed on for the test was “The Challenge of Attention”, Nielsen’s own landmark study of audience engagement. It was many months in the making and was expected to generate significant interest irrespective of how it was presented, making it an excellent test case to prove or disprove the value of a psychological approach to content design.
Each version of the content would contain exactly the same chapters, words, and messages; only the format would differ between the two.
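As a rough illustration of the kind of split we’re describing (the function and participant IDs below are our own hypothetical sketch, not part of the actual study’s mechanics), a content platform might bucket each reader into one of the two versions deterministically:

```python
import hashlib

VARIANTS = ("control", "enhanced")  # standard format vs. psychologically enhanced

def assign_variant(participant_id: str) -> str:
    """Deterministically bucket a participant into one of the two versions.

    Hashing the ID gives a stable, roughly 50/50 split: the same person
    always sees the same version, and neither they nor we choose which.
    """
    digest = hashlib.sha256(participant_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Example: split a hypothetical panel of 150 participants.
panel = [f"participant-{n:03d}" for n in range(150)]
groups = {v: [p for p in panel if assign_variant(p) == v] for v in VARIANTS}
print({v: len(ps) for v, ps in groups.items()})
```

The point of hashing rather than choosing is that assignment stays stable and free of human bias, which is what makes the two groups comparable in the first place.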
Rather than providing readers with a traditional, linear “start at the top and scroll to the bottom” experience, we designed one that allowed each reader to take their own journey through the content, reading the sections of interest in whatever order made the most sense to them.
Again, rather than delivering a standard scrolling experience, we gave each reader the chance to take positive actions within the content, clicking through and navigating to the pieces that felt most interesting to them. This was designed to deliver a sense of competence, as each reader would need to explore the content in their own way.
Instead of just providing a one-way reading experience, we allowed readers to answer questions and engage with video and other interactive elements as they consumed the content. This was designed to create a two-way conversation, with the reader putting as much into the piece as they got out of it.
Each new chapter in the piece was accompanied by a contextually relevant image to help introduce the new subject and cement the concept in the reader’s mind for improved recollection.
Each page made extensive use of imagery and varied layouts to ensure every part of the experience was visually rich, striking, and distinctive, further increasing attention and retention of the information presented.
Numerous other psychological principles were deployed in addition to the above, but these require a far deeper level of explanation, which we will explore in future articles.
With the two versions of the piece complete and ready to go, attention turned to finding the perfect sample group of guinea pigs to test-drive them.
Usually, marketers are excluded from market research studies. But in this case, all parties agreed that senior marketers would prove far tougher critics of our psychologically enhanced content than members of other disciplines. So we asked Panelbase to find just that: 150 senior marketers to take part in our study.
With 150 carefully selected test subjects in place, Lumen Research’s state-of-the-art technology was used to run a forensic “attention analysis” of the two content forms. Their kit bag includes eye tracking, attention measurement techniques, and open-ended questioning to fully understand how the audience responded to each version.
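To make the comparison concrete, here is a minimal, purely hypothetical sketch of how dwell-time data from the two groups could be compared. The figures are invented, and this is not Lumen’s actual methodology or results:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical per-participant dwell times in seconds (75 readers per
# version); the means here are invented purely for illustration.
control = rng.normal(loc=95.0, scale=20.0, size=75)
enhanced = rng.normal(loc=120.0, scale=20.0, size=75)

# Welch's t-test: does the enhanced version hold attention for longer?
t_stat, p_value = stats.ttest_ind(enhanced, control, equal_var=False)

print(f"mean dwell (control):  {control.mean():.1f}s")
print(f"mean dwell (enhanced): {enhanced.mean():.1f}s")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```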
And as if the test weren’t thorough enough already, we decided it would be interesting to understand how (if at all) the effectiveness of the different forms would be affected by device type.
So, for good measure, we asked the audience to engage with the content on both mobile and laptop devices.
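Adding device as a second factor turns the study into a 2×2 design (version × device). A hypothetical tabulation, with every number invented for illustration only, might look like this:

```python
import pandas as pd

# Hypothetical session-level results; every value is invented and
# implies nothing about the real study's findings.
sessions = pd.DataFrame({
    "variant": ["control", "control", "enhanced", "enhanced"] * 2,
    "device":  ["mobile", "laptop"] * 4,
    "dwell_seconds": [88, 102, 115, 131, 92, 99, 118, 127],
})

# Mean attention per cell of the 2x2 (variant x device) design.
summary = sessions.groupby(["variant", "device"])["dwell_seconds"].mean()
print(summary.unstack())
```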