17th February 2021
This post is written by Julie Adams, a Communication & Content Specialist at DisplayNote. This is the first in our Psychology in Action Blog Series, exploring the research and relationships between psychology, communication, and collaboration.
Persuasion is regarded as both a science and an art form. There’s a wealth of literature out there dedicated to this valuable skill: how to be persuasive, how to leverage it in the workplace, how to pick up on when it’s being used. We’re told to speak clearly and confidently, make direct eye contact, use positive language, and direct questions to engage our audience.
Yet ask any person, and they’ll probably tell you they’d like to learn how to be more persuasive, particularly when it comes to the meeting room. Psychologists have put forward an explanation for why, unbeknownst to us, we might actually be weakening our arguments through the simple act of trying to be persuasive.
The dilution effect is a judgment bias in which people underutilize diagnostic information when non-diagnostic information is also present.
Diagnostic information: this refers to information that is useful in making a particular decision. For example, say you’re buying a second-hand car - diagnostic information would be the fuel type, model, and miles per gallon.
Non-diagnostic information: this is information that is not relevant to the decision being made. Using the same example - this could be how many children the seller has or whether they go to church. These details have nothing to do with the actual car, but a person may still use this non-diagnostic information when deciding.
When both kinds of information are present, people tend to under-rely on diagnostic information. Non-diagnostic information, therefore, weakens or dilutes the impact of diagnostic information on the judgment. The dilution effect results in less extreme assessments than those made using diagnostic information alone.
Behavioral scientist Christopher Hsee devised a clever example to test this principle in action. Students were asked how much they would hypothetically pay for a dinner set they wanted from a store. One set contains 24 pieces, all intact. The other contains 40 pieces: 9 are broken, and the remaining 31 are in perfect condition.
Time and time again, when each set was evaluated on its own, respondents in the study were willing to pay significantly more for the 24-piece intact set. Why? Because of the dilution effect. Even though the second set contained more intact pieces (31), the inclusion of the broken pieces diluted the perceived value of the whole set.
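One common way psychologists explain this result is with an averaging model of judgment: we intuitively value a set by the average quality of its items rather than their total. The short Python sketch below illustrates that idea using the numbers from Hsee’s dinner-set example. The averaging model and the placeholder values for intact and broken pieces are simplifying assumptions, not part of the original study.

```python
# Toy averaging model of the dilution effect (an illustrative
# assumption, not Hsee's actual analysis): perceived value tracks
# the AVERAGE quality of the items, not the total.

def average_quality(intact, broken, value_intact=1.0, value_broken=0.0):
    """Return the mean per-item quality of a dinner set."""
    total_value = intact * value_intact + broken * value_broken
    return total_value / (intact + broken)

set_a = average_quality(24, 0)   # 24 pieces, all intact
set_b = average_quality(31, 9)   # 40 pieces, 9 of them broken

print(set_a)  # 1.0
print(set_b)  # 0.775
```

Even though the second set offers strictly more usable pieces (31 vs. 24), its average quality is lower, so under an averaging model it feels like the worse deal. The same logic carries over to arguments: weak points drag down the average impression of the whole case.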
When you’re trying to convince someone of your viewpoint in a meeting (say you want to increase spending on a particular marketing campaign or change a process), you’ll often go in with a long list of arguments in favor of your position. This can actually harm your case. Why? The dilution effect. When it comes to assessing diagnostic and non-diagnostic information: less is more.
In short, when presenting an argument, it’s better to keep it concise. We tend to think that the more reasons we give someone to go with our idea, the more likely they are to go with it. In reality, we’re diluting our argument. By keeping it concise and of high quality, the person you’re trying to persuade is more likely to focus on the strength of each point. This will be more potent than giving them an endless list of ‘pros’ or advantages.
The same is true in marketing: when listing the benefits of a product or service to customers, including non-diagnostic information may weaken the perceived value. Saying less may feel counterintuitive, but the science is out there to back it up. Studies of financial auditors have found that their risk assessments are more likely to be clouded by non-diagnostic information, such as the auditee’s field of business.
Humans like to think categorically by nature; it’s what helps us to process the bombardment of information and sensory input we receive every second. But this categorization can also be harmful, as we often use available non-diagnostic information to assign a category or value to something or somebody (even though there may not be enough diagnostic information to back this up). So the next time you’re in a meeting, and you want to persuade the room to see your perspective, keep your argument short - it’ll be more impactful.