Selection Bias and Skewed Perceptions
Back in World War II, the U.S. military worked with a statistical research group at Columbia University to ascertain the optimal ways to improve armor protection on warplanes, since many, especially bombers, were being shot down. It would be impossible to put armor over the entire plane, because that would be too heavy, so they needed to find the priority target zones and protect them.
Columbia undertook an analysis of aircraft that returned from missions to see where the planes were taking the most bullet hits and determine where armor should be added. They gathered a large data set and mapped the bullet holes in a cluster diagram. They recommended adding armor to the places with the highest concentration of hits. Seems like the right approach, right?
Well, the Columbia group included an astute statistician named Abraham Wald, a Hungarian Jew who had managed to escape the Nazis. Wald examined the plan, saw that it was flawed, and convinced a whole bunch of military brass that they were looking at the problem the wrong way.
Wald pointed out that by concentrating on the bullet holes of surviving planes, the analysis was neglecting the bullet holes of the planes that didn't survive. In other words, Wald argued, the best place to put the armor was in the places where there weren't any bullet holes, because, presumably, hits in those spots were the ones that kept planes from coming home. The planes that came back with bullet holes, after all, were the ones that survived.
Once you see it, it seems so obvious. But at first glance, it's not obvious at all. That's because of selection bias, just one of the many heuristics, or cognitive biases, that limit our ability to understand the world in a comprehensive way. In selection bias (in this case specifically called "survivorship bias"), we mistakenly draw conclusions from an incomplete picture. We use data sets that omit something very important: the elements that don't "survive" to make it into the final round of analysis.
It's worth considering where this might affect our lives.
Consider our perceptions of ourselves relative to others. There's all kinds of ways it might play a role in generating feelings of envy. Think about the skewed impressions we get of people's lives from what they choose to put on social media. If we find ourselves feeling envious of other people's seemingly beautiful lives, we shouldn't forget that this curated imagery is only what has been selected. There's a whole lot of ugly stuff, failure, and messiness that doesn't make it into people's posts. By contrast, we're keenly aware of our messiness. That only sharpens the potential for envy, right?
Same for our skewed impressions of people who make it to the top of various hierarchies. Our initial thought might be to assume they've been lucky or to attribute their success to some kind of privilege. We might even consider them suspect for their success. We see only their remarkable resume, competitive salary, or esteemed social recognition. We aren't seeing the ugly failures that they most assuredly suffered along the way.
Consider our impression of institutions that claim remarkable successes and track records. They might be leaving out a few things. For example, if a university claims a 99% acceptance rate into competitive medical schools among its alumni, that looks very impressive. But what if half the students who enroll there regularly drop out instead of becoming alumni? Those are the planes that didn't make it back.
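A quick back-of-the-envelope calculation makes the gap concrete. Using the hypothetical numbers from the example above (not real admissions data), the advertised rate and the honest rate diverge sharply:

```python
# Hypothetical numbers from the example above; not real admissions data.
enrolled = 200                 # students who start the program
dropouts = enrolled // 2       # half never graduate, so never become alumni
alumni = enrolled - dropouts   # 100 students survive to graduation

accepted = round(alumni * 0.99)  # 99% of alumni get into medical school

# The advertised rate counts only the "surviving" alumni...
advertised_rate = accepted / alumni
# ...while the honest rate counts everyone who enrolled.
effective_rate = accepted / enrolled

print(f"Advertised: {advertised_rate:.0%}")  # 99%
print(f"Effective:  {effective_rate:.1%}")   # 49.5%
```

The denominator is the whole trick: swap "alumni" for "everyone who enrolled," and the impressive number is cut nearly in half.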
This is very important to keep in mind when you're reading sensational journalism, or even not-so-sensational scientific studies that cite data sets. If a broad and compelling survey reports some result among a population, for example, bear in mind that only the people willing to take the survey are included in that data set. Understood well, the innate tendency toward selection bias becomes a very useful (and very sneaky) tool for those with, say, an ideological or political agenda. It's an easy thing for savvy people to exploit, unfortunately.
Point being, when we're aiming to derive meaning from numbers, ratios, or the appearance of things, it's very important to consider the data points that aren't included in any given analysis. This could be intentional, meant to deceive or skew things. But most often, it's just our own or others' selection bias at work.
As much as we might think we're smarter than that, it's actually pretty natural to omit what isn't visible. Those military analysts weren't unintelligent. They just weren't accounting for what they couldn't see. Most of us would probably draw their conclusions, not Wald's. Seeing past selection bias takes work. We have to be intentional about considering what else is going on besides the numbers or images or impressions that are immediately evident.
#heuristics #cognitive #bias #cognitive_bias #selection_bias #selection #discernment #envy #self_worth #social_media #media