Your Next Big Discovery May Be the Thing You're About to Clean Up

A local friend of mine, Andrea Lius, recently had an article published in Smithsonian Magazine about water-breathing bumblebees. She sent it my way, and what started as a quick read turned into a couple of hours of thinking about serendipitous discovery. Which probably tells you something about how I spend my weekends.
The story begins a few years ago, when conservation biologist Sabrina Rondeau opened a laboratory refrigerator expecting to find bumblebee queens quietly hibernating in their soil-filled tubes. What she found instead was a flooded mess, the tubes filled with condensation water, the queens submerged. She assumed they were dead. Then they started moving.
Rondeau could have toweled off the tubes, muttered something impolite about the fridge, and moved on. Instead she paid attention, repeated the submersion deliberately with 126 queens, and eventually partnered with ecological physiologist Charles Darveau at the University of Ottawa to figure out the mechanism. The result, published this month in Proceedings of the Royal Society B, is remarkable: diapausing bumblebee queens can actually breathe underwater, a capability nobody expected to find in a terrestrial insect. The discovery has real conservation implications as climate change increases the frequency of winter flooding in the soil habitats where bumblebee queens overwinter.
What strikes me most about the story isn't the biology. It's the moment of decision: clean it up, or look closer?
The prepared mind, and its limits
Louis Pasteur's line about fortune favoring the prepared mind is one of science's great truisms, and it holds up. Rondeau needed enough entomology to understand that submerged queens moving around was genuinely anomalous, not just bees being bees. Alexander Fleming, discovering penicillin in 1928 after a mold contaminated his petri dishes, needed enough bacteriology to recognize that the mold wasn't just spoiling his experiment; it was actively killing the bacteria around it. The preparation isn't incidental. Without it, the anomaly is invisible.
But here's the problem with stopping at "prepared mind" as the full explanation: other researchers almost certainly saw the same mold appear on contaminated plates and threw them away. Fleming had colleagues. They had prepared minds too. So the prepared mind is necessary but not sufficient, and that's where it gets interesting.
Zen master Shunryu Suzuki observed that "in the beginner's mind there are many possibilities, in the expert's mind there are few." Taiichi Ohno made the same point from the factory floor: observe without preconceptions, with a blank mind. I've written before about how this concept of shoshin, or beginner's mind, connects lean thinking and Zen practice, and it's directly relevant here. Deep expertise can close off the very perception it enables. You see what your mental model predicts, and you rationalize away what it doesn't.
The researchers who made the great accidental discoveries weren't just prepared. They were prepared and open simultaneously, which is harder than it sounds. It requires holding expertise and curiosity in tension, using what you know without letting it tell you what you're allowed to find.
The unexpected is information
Most organizations, and most people, treat anomalies as problems. Something went wrong; fix it, document it, move on. The mental model is: expected outcomes are signal, unexpected outcomes are noise. Get back to signal as quickly as possible.
That framing is exactly backwards.
The unexpected is information. Often it's the most valuable information available, precisely because it wasn't generated by your existing assumptions. Expected results confirm what you already believe. Unexpected results have the potential to change what you believe, which is a much rarer and more valuable thing. Fleming's contaminated plate wasn't a failed experiment. It was a more important experiment than the one he intended to run.
This is why the lean concept of the andon cord runs deeper than it first appears. The foundational principle isn't just "stop when there's a defect." It's "stop when something doesn't match what you expected," because the mismatch itself is worth understanding. The cord isn't a failure signal; it's a learning signal. Rondeau pulled the andon cord on her flooded refrigerator, and the result was a discovery that nobody in entomology was looking for.
The organizational conditions that enable this aren't complicated, but they're surprisingly rare. Psychological safety to say "this is strange and I want to investigate it" without being seen as inefficient. Enough slack in the schedule to pursue an anomaly past the first inconvenient data point. And crucially, access to collaborators with different mental models, because sometimes you need someone else's framework to understand what you're actually looking at.
That last point matters more than it gets credit for. Rondeau's initial discovery was serendipitous, but the mechanism paper required collaboration. She brought her observation to Darveau, who was already studying bumblebee metabolism from a different angle. The cross-disciplinary collision, an entomologist's anomaly meeting a physiologist's toolkit, produced the deeper insight. Similarly, Barnett Rosenberg was a physicist studying electric fields in bacterial cultures in 1965 when he noticed the cells stopped dividing near his platinum electrodes. The culprit was cisplatin leaching from the electrode; today it's one of the most widely used chemotherapy drugs. Rosenberg had to reach across into oncology to understand what he'd found. Neither discovery was fully accessible from within a single discipline.
3M's longstanding policy of giving engineers unstructured time for personal projects (you've heard the Post-it note story, so I'll spare you) is the institutional version of the same idea: build in slack, create space for anomalies to surface and be pursued without immediately needing to justify the detour against a deadline.
The real constraint on serendipitous discovery isn't funding structures or research design. It's the quieter pressure to move quickly toward expected results, to treat the unexpected as a disruption rather than a dispatch from unexplored territory.
Not just for researchers
It would be easy to read all of this as a story about science and leave it there. But the same dynamics play out in any environment where learning matters, which is to say, everywhere.
In manufacturing, in business strategy, in personal development, anomalies appear constantly. A process metric that doesn't behave the way the model predicts. A customer complaint that doesn't fit the usual pattern. A habit you've tried to build that keeps breaking down in the same specific circumstance. Most of the time we clean up the flooded tubes and move on, because we're busy and the anomaly is inconvenient and we already have a hypothesis we're trying to confirm.
Continuous improvement, real continuous improvement, requires treating the unexpected as a resource rather than an interruption. That means cultivating the prepared mind through deliberate learning, while simultaneously protecting the beginner's mind against the calcification that expertise naturally produces. It means building the personal and organizational habits that make "huh, that's strange" the beginning of an inquiry rather than the end of one.
Rondeau's bumblebees were underwater for a week. She almost missed it entirely. What are you currently cleaning up and moving past?