Time and again I see brilliant researchers and analysts getting frustrated that the businesses they serve don't listen to their insights, despite these being thoroughly researched and fully backed up with data. They present findings, explain their case, and nothing happens. Then along comes another person/agency/department who says the exact same thing, and THEIR feedback gets implemented.
Sometimes this is because the internal team isn't trusted, or isn't believed to be delivering accurate information. There are ways to address this, such as internal brand building for the discipline, but I shan't go into that here.
Too much information
What I think happens more often than not is that the team being presented to is given far too much data and detail. In this instance, it is counterproductive to show your workings. Let me explain what showing ALL your workings does to the receiving party:
The sheer volume of information overwhelms them, causing cognitive overload and leading them to dismiss what you are presenting as irrelevant
The need to prove your statements to the nth degree of detail signals to the audience that you lack confidence in those statements
Showing all the underlying data forces the audience to draw their own conclusions from the research and risks them applying a completely different filter to the findings - for better or worse
I have yet to meet a researcher or data analyst who hasn't struggled to find the right amount of insight to share - we're all geeks and find this fascinating, after all.
So what do we do to avoid the information overload trap?
We need to go back to the very beginning of the project to remind ourselves of the initial aim, regardless of how many exciting things we discovered along the way.
Good questions to stress-test with:
What was the original aim of the research or piece of analysis?
Have we answered that question?
What is the most interesting insight we uncovered along the way?
Why would the audience care?
What is the most concise way I can summarise my findings?
As analysts and strategists, our value lies in turning research insight into the "so what?". Teasing out implications can be painful, distillation often means leaving out some very juicy data, and it doesn't always work: occasionally key points get missed.
But mostly, the feedback we hear time and again is that concise, summarised findings make research far easier to understand and, crucially, to implement.
How do you solve this issue? Is there a better way? We’re all ears!
P.S. The irony of a long blog post that calls for brevity is not lost on us ;)