The first story is usually too small
Why a little information can be more dangerous than none at all
A founder loses a deal and says the market is not ready.
A sales leader observes three weak conversations and says the team needs better training.
A manager has one difficult employee and starts redesigning their hiring profile.
A board hears from two frustrated customers and suddenly everyone is talking about a shift in the market.
This is how a lot of bad decisions begin. Not with stupidity or laziness or even a lack of information. They begin with a small amount of information that feels larger than it is. When we have no information, we usually know we are guessing. We are cautious and ask more questions. We leave room for uncertainty because the uncertainty is staring us in the face.
But when we have a little information, our mindset changes. The analytical part of our brain gets involved. It starts connecting dots, filling gaps, building a story, and defending the story before we have enough evidence to know whether the dots belong together in the first place.
That initial story can feel incredibly convincing. It feels like pattern recognition, and sometimes it is. But more often it is the brain's default habit of inflating small sample sizes into large conclusions.
One customer says the price is too high. Maybe the price is too high or the value was not clear. Maybe the person you were talking to was never the buyer or the timing was wrong. Maybe this was the wrong customer. Or maybe they say price is the issue because that is the easiest thing to say.
One team member misses a deadline. Maybe they lack accountability or the expectation was unclear. Maybe they did not have the authority to get it done or they did not understand why it mattered. Maybe they are overwhelmed. Or maybe you avoided the conversation six weeks ago that would have prevented this moment.
The point is not that the first explanation is always wrong; it is that the first explanation is usually incomplete. Incomplete explanations are dangerous because they often contain just enough truth to survive.
Which is what makes them hard to challenge. A completely wrong idea usually collapses quickly under scrutiny whereas a partially true idea can run a company for months. It can shape hiring decisions, sales strategy, product direction, team structure, and leadership behaviour long before anyone realizes the original diagnosis was built on a sample size of almost nothing.
The problem is not the data point but rather what we do with it. A single data point should make you curious. It should not make you certain.
That said, leaders are often under pressure to sound certain. The team wants confidence, investors want a story, and customers want conviction. The board wants a clean explanation. Nobody wants to hear, “I’m not sure yet, but I’m paying attention.”
So the leader turns a single signal, or a small handful of them, into a narrative.
The market is price sensitive.
The team lacks urgency.
The candidate pool is weak.
Customers do not get it.
The timing is wrong.
So the question becomes: how do you know for sure?
Most bad decisions do not happen because leaders fail to think. They happen because leaders think too quickly around the wrong centre. They take the first believable explanation and start building around it. They start breaking down the first available wall between them and where they think they want to go.
Once that happens, the explanation becomes sticky. People start gathering evidence to support it. They notice the next customer who complains about price and ignore the one who bought because the value was clear. They notice the next employee who misses a deadline and ignore the part where the system keeps producing for most employees.
The mind does this naturally. It wants coherence and for the discomfort to end. It wants a story it can hold. The mind is designed to be lazy.
Being a leader requires the discipline to stay with the unfinished thought a little longer, without hiding behind it.
There is a version of “we need more data” that is just avoidance. Leaders use it when they are trying not to make a hard call and they keep researching because acting would make them measurable. They ask for one more report, one more meeting, one more round of input, when everyone in the room already knows enough to move.
That is not wisdom. That is hiding. The real skill is knowing the difference.
A weak signal deserves an appropriate amount of attention. A repeated pattern deserves a reasonable level of investigation. A real wall deserves decisive action.
That is where judgment lives. Not in being cautious all the time or in being bold all the time. In knowing what kind of reality you are dealing with. But you cannot do any of that with fog. Fog just keeps you circling. That is what small samples often create: fog with the confidence of fact.
The leader’s job is to slow the conversion from signal to story long enough to ask better questions, gather more data, and recognize that a few pieces of information can be just as dangerous as no information at all.


