I’m currently listening to an audiobook of Jim Collins’ Good to Great, in which he attempts to develop a data-driven model of why some companies shift from mediocrity to excellence. To define “excellence,” he uses the proxy of matching versus substantially beating the general stock market over a fifteen-year period. It’s a very interesting read so far, and I’ve quite enjoyed the emphasis with which Collins, who narrates the audiobook, makes his points. (For example, he complains vehemently that “leadership” serves as a semantic stopsign for modern audiences trying to understand how businesses succeed or fail, à la élan vital.)

One idea of Collins’ that most struck me is “red flag mechanisms,” a practice intended to help create an environment where the truth can – and will – be heard.

We live in an information age, when those with more and better information supposedly have an advantage. If you look across the rise and fall of organizations, however, you will rarely find companies stumbling because they lacked information…. The key, then, lies not in better information, but in turning information into information that cannot be ignored.

It seems to me that it’s not just large publicly traded corporations that have trouble ignoring data that’s right in front of them; this happens at the level of individuals, too. And Collins’ principle – editing the way you receive data so that it’s difficult or impossible to ignore – seems like an important technique for avoiding self-deception and rationalization. For example, using my phone’s GPS to track how fast I go when I’m exercising automatically makes it easier to work hard, because I can see whether I’m “measuring up” against my past performance, and whether I’m improving. I could ignore this information about whether I’m working hard, but that’s much harder than ignoring fuzzy sense data about whether my legs feel tired yet.
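As a toy illustration (all numbers and names here are hypothetical, not from the post), the GPS comparison amounts to replacing a fuzzy judgment with an explicit check against your own history:

```python
# A minimal sketch of "information that cannot be ignored":
# compare today's pace against the recorded history instead of
# against a vague feeling of tiredness.

def pace_report(past_paces_kmh, todays_pace_kmh):
    """Summarize today's pace relative to past performance."""
    average = sum(past_paces_kmh) / len(past_paces_kmh)
    best = max(past_paces_kmh)
    return {
        "average": average,
        "best": best,
        "beat_average": todays_pace_kmh > average,
        "beat_best": todays_pace_kmh > best,
    }

# Hypothetical data: three past runs, then today's run.
report = pace_report([9.5, 10.1, 9.8], 10.4)
```

The point of the exercise isn’t the arithmetic, which is trivial; it’s that once the comparison is computed and displayed, “am I improving?” has a yes-or-no answer you can’t rationalize away.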

Collins describes a much more drastic information-promotion technique that he used in one of his classes: he gave each student a bright red sheet of paper to be used only once during the quarter. He explained to his students:

If you raise your hand with your red flag, the classroom will stop for you… You can use it to voice an observation, share a personal experience, challenge a CEO guest, respond to a fellow student, make a suggestion, or whatever. There will be no penalty whatsoever for any use of a red flag. Your red flag can be used only once during the quarter.

…In one situation, a student used her red flag to state, ‘Professor Collins, I think you are doing a particularly ineffective job of running class today. You are leading too much with your questions and stifling our independent thinking. Let us think for ourselves.’

… A student survey at the end of the quarter would have given me that same information. But the red flag – real time, in front of everyone in the classroom – turned information about the shortcomings of the class into information that I absolutely could not ignore.

In this case, rather than making the data more precise or easier to interpret, he is increasing the cost of ignoring it. If he ignored the student’s comment by continuing to run the class as planned, he would lose face in front of his students, and probably feel like a jerk. Perhaps the comparable technique for my exercise example would be a Beeminder graph, where I would lose money if I didn’t usually match or beat my previous paces. Or, if I wanted to use social incentives as well, I could post the graph publicly to my Facebook page, so all my friends could see whether I was keeping up with my goals.

Are there areas in your life where you find yourself flinching away from (or consciously ignoring) information that’s important to you? Have you found other ways to make important data harder to ignore? Ideas for other folks to try? I’m rather excited about experimenting with something similar to Collins’ red flag system in large-group meetings at CFAR, to make sure that everyone’s most important ideas are heard.