Today I want to show you what people are inclined to DO... I want to show you how you are inclined to act. Behavior. Different from thoughts. We think we want X, but we buy Y. I want to show you not only what people are thinking but also what they do...

An example. If I say, "Pick a number from 1 to 10," you are inclined to say "7." The majority of people do in the English-speaking culture. If I ask you to think of a shape and say it out loud, you are inclined to say "triangle." The majority of people do in the English-speaking culture.

But knowing what someone is going to say or think is matched in importance by knowing what people are likely to DO. If you see what people do in given situations, you can predict their behavior in future situations. For decades, each nation's military has typically been afforded the most current information about human behavior... it's helpful to know what your opponent will do, how they will respond and react... and what their next move will be.

The same is true in sales and marketing. Some decisions to buy something are made under great "pressure." Others are decided in the absence of "pressure." Knowing what will cause someone to decide X can be pretty helpful information! Knowing WHY they decide X can change how you communicate with people forever. Knowing HOW people perceive you and the rest of the world... is downright invaluable.

Research into how people make decisions while under pressure could help the U.S. military improve training for its leaders and lead to better decision-support systems. Studies have shown that when people process information, they develop unconscious strategies – or biases – that simplify their decisions. Now, research at the Georgia Tech Research Institute (GTRI) is revealing how these biases affect people when they're dealing with lots of information – and little time to form conclusions.

The same research applies to all of us in all decision-making situations. Watch and see what I mean as I check out the "biases" that GTRI has listed as being the most "important." The examples I give are not those from GTRI but ones I am giving YOU to optimize influencing others. These are the biases (unconscious strategies) GTRI has revealed are important in decision making:
b) Availability. Recent events or well-known conjecture provide convenient explanations. If I read it in the paper today, it's got more impact than something I read last month... If it is in THIS month's issue of the magazine or TODAY's edition of the newspaper, it means A LOT. If it's in yesterday's, the value is GREATLY diminished.

c) Oversensitivity to consistency. People give more weight to multiple reports of information, even if the data came from the same source. Repetition is VERY powerful. If people start to hear the same thing over and over again, it makes little difference where they hear it... it simply becomes... true.

d) Persistence of discredited information. Information once deemed relevant continues to influence even after it has been discredited. Someone can tell you they lied to you about where they were on a given night... but a few weeks from now, you will continue to believe they were there. Once you tell someone something, it's what they will remember, if anything. Saying later that you might have been wrong, or that someone else was wrong, or that it "might not be right," simply doesn't matter.

e) Randomness. People perceive a causal relationship when two or more events share some similarity, although the events aren't related. People will believe anything. I had a guy take my seat at the blackjack table because I won $5,000. You think that stool knows what cards are coming out next??? People mistake what causes what... almost all the time.

f) Sample size. Evidence from small samples is seen as having the same significance as evidence from larger samples. People just don't have a clue as to how unimportant their own personal experience is... or the experience of a friend or a relative. The fact is that a single well-told story will convince with far greater magnitude than a computer filled with statistical PROOF of the opposite. (The quick simulation right after this list makes the point with numbers.)

g) Vividness. When people perceive information directly, it has greater impact than information they receive secondhand, even if the secondhand information has more substance. You can read it in a comic book and it has more weight than what a scientist tells you he learned at the neuroscience convention...
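Don't take my word for it on the sample-size trap. Here is a quick simulation, a sketch of my own and not anything from the GTRI trials: flip a fair coin in batches of different sizes and count how often a batch looks lopsided (80/20 or worse). The function name and the cutoff are just my choices for illustration.

```python
import random

def lopsided_rate(sample_size, trials=10_000, cutoff=0.8):
    # Fraction of samples where a fair 50/50 process looks skewed:
    # at least `cutoff` heads, or at most (1 - cutoff) heads.
    lopsided = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        rate = heads / sample_size
        if rate >= cutoff or rate <= 1 - cutoff:
            lopsided += 1
    return lopsided / trials

for n in (5, 20, 100, 500):
    print(f"n={n:>3}: looks lopsided in {lopsided_rate(n):.1%} of samples")
```

A sample of five looks wildly skewed more than a third of the time; a sample of 500 essentially never does. Your friend's story, or your own experience, is a sample of one... and it will still out-persuade the statistics.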
To test the effects of these biases, Folds had experiment subjects view an inbox on a computer screen containing a variety of text messages, maps, photographs and video and audio recordings. Subjects (the majority being Georgia Tech ROTC students) were instructed to report certain military situations, such as incidents of sniper fire or acts of suspected sabotage. They were not to report other events, such as normal accidents in an urban area unrelated to enemy activity.

To decide whether or not an event should be reported, subjects reviewed a series of messages that contained bona fide evidence as well as information created to trigger the biases that cause poor decisions. In each trial, subjects were allowed enough time to spend an average of 20 seconds per data element, plus one additional minute for reporting; they were also asked to attach information that supported their decision.

In the first experiment, all seven biases appeared, with the greatest number of errors caused by vividness and oversensitivity to consistency. In addition, Folds discovered two new biases that can hinder the quality of rapid decisions. These are the newly discovered biases (unconscious strategies):

i) Sensationalist Appeal. Items containing exaggerated claims or threats influence a decision-maker even when there is no substance to the content.

Folds was surprised at how well subjects could perform the task while under pressure, he said. Although he expected an accuracy rate of about 50 percent, subjects correctly reported 70 percent of incidents.

In a second experiment, researchers divided subjects into two groups, using one as a control group while training the other group to spot conditions that spark decision-making biases. Subjects who received training were able to detect about twice as many "false-alarm opportunities" as the control group. The biggest difference between the two groups involved the "persistence of discredited information" and "sample size" biases. Forty-eight percent of trained subjects were able to recognize when a "persistence" bias existed, compared to 18 percent of the control group. Fifty percent of trained subjects caught the "sample-size" traps, versus 11 percent of the control group.

Although training helped participants recognize when traps existed, it didn't help them identify the specific bias. "When subjects were under pressure to make decisions rapidly, the distinctiveness of the categories fell apart," Folds explained. "That's significant, because it helps us tailor training efforts."

The experiments also revealed what kind of information is meaningful to decision-makers, Folds noted. Software designed especially for the trials tracks when subjects open a document for the first time and when they go back for a second or third look. The amount of time subjects spent reviewing data, along with the data they attached to reports, showed a decided preference for text messages over other formats.

Folds' team is conducting more research: two new sets of trials are examining how decision-making errors occur in groups, while another experiment is trying to pinpoint how rapidly individuals can make good decisions.
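To picture what that trial software was measuring, here is a minimal sketch of that kind of instrumentation. This is my own hypothetical reconstruction, not GTRI's code; the class, method and item names are invented. It simply records first opens, repeat looks, and rough time spent per inbox item.

```python
import time
from collections import defaultdict

class ReviewTracker:
    """Records when a subject opens each inbox item, how many times
    they come back to it, and roughly how long they spend on it."""

    def __init__(self):
        self.looks = defaultdict(int)    # item_id -> number of opens
        self.dwell = defaultdict(float)  # item_id -> total seconds viewed
        self._current = None             # (item_id, opened_at) while viewing

    def open_item(self, item_id):
        self.close_item()                # subjects view one item at a time
        self.looks[item_id] += 1
        self._current = (item_id, time.time())

    def close_item(self):
        if self._current is not None:
            item_id, opened_at = self._current
            self.dwell[item_id] += time.time() - opened_at
            self._current = None

    def summary(self):
        self.close_item()
        return {item: {"looks": n, "seconds": round(self.dwell[item], 1)}
                for item, n in self.looks.items()}

# Hypothetical usage: a subject reads a text message, checks a photo,
# then returns to the text message for a second look.
tracker = ReviewTracker()
tracker.open_item("text_msg_14")
time.sleep(0.2)
tracker.open_item("photo_03")
time.sleep(0.1)
tracker.open_item("text_msg_14")     # second look
time.sleep(0.2)
print(tracker.summary())
```

Aggregated over many subjects, look counts and dwell times like these are the kind of data that revealed the preference for text messages over other formats.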