I was always a big fan of Dan and Chip Heath’s book Made to Stick: Why Some Ideas Survive and Others Die (2007). It’s one of the few that has remained in my library and not been donated (although it’s been loaned out a few times) despite book purges with every move. While I would not put Upstream in that same category, its one big idea was definitely “Made to Stick.”
Like a lot of business books, the author has one big idea and then a series of examples illustrating that idea. For Heath, the big idea in Upstream: The Quest to Solve Problems Before They Happen is that the world would benefit from more upstream interventions in problem solving. He suggests that we too often focus on downstream (or reactionary) solutions to problems and not enough on the upstream solutions that could prevent the problems in the first place.
Heath points out the many reasons why we do this.
- Upstream solutions often take time to come to fruition.
- Measuring ROI of problems “not” happening is difficult.
- There’s no glory in preventing problems.
- There’s no urgency in implementing upstream solutions. Acting is optional; downstream problems demand reaction.
- Downstream solutions, in comparison, are often necessary, tangible, easier to see, and easier to measure.
So why focus on upstream interventions?
“Downstream efforts are narrow and fast and tangible. Upstream efforts are broader, slower, and hazier — but when they work, they really work. They can accomplish massive and long-lasting good.” Heath, Dan. Upstream: The Quest to Solve Problems Before They Happen (p. 9).
Another major impediment to moving from downstream (or, as I call it, “whack-a-mole”) problem solving to upstream is blocking off the time to think objectively and creatively. Upstream intervention means detecting problems early, deconstructing complex systems, and pioneering new ways to solve the problem and measure the solution. This is innovative, and innovation does not happen in an organization that is constantly putting out fires. Heath refers to this as “slack” — a reserve of time or resources that can be spent on problem solving. In today’s lean, shareholder-focused companies, there is often no room for slack. But slack is what we need for upstream actions that could make companies more efficient, more sustainable, and even more profitable in the long run.
Upstream solutions require what is called systems thinking. What is systems thinking? There are many definitions out there, but in general it involves recognizing the interconnectedness of the world: seeing the world in patterns and interdependencies, and in circular flows — like circular economies and biological systems — rather than linear ones. Structurally, it’s the ability to break complex problems into parts and to build sustainable systems out of those parts.
“You do not rise to the level of your goals. You fall to the level of your systems.”
This is one of my favorite quotes from James Clear’s Atomic Habits: An Easy & Proven Way to Build Good Habits & Break Bad Ones (2018). Unfortunately, corporate America today is so goal-focused that not enough attention is paid to systems, or to the systems thinkers within their midst.
So how can an organization adopt a systems thinking approach, and how can systems thinkers be more effective? Heath breaks it down into seven questions:
- How will you unite the right people?
- How will you change the system?
- Where can you find a point of leverage?
- How will you get early warning of the problem?
- How will you know you’re succeeding?
- How will you avoid doing harm?
- Who will pay for what does not happen?
Here are a few key takeaways:
Build a team that surrounds the problem from every dimension. One of Heath’s examples is Iceland’s success in reducing teens’ use of drugs and alcohol by involving the teenagers themselves — along with parents, teachers, coaches, and anyone else who influences or touches teenage life — in creating a solution.
Make the solution clear, provide a constant stream of useful data from a variety of sources, and then leave the team alone. While the solution may be more general, Heath notes that aligning efforts toward preventing specific instances of the problem keeps the problem solving from becoming a policy discussion. This is a key point. Policy doesn’t fix things; actions do. And people can’t take action unless they know what needs to be done. Heath effectively uses an example about reducing domestic violence by looking at the patterns of both abusers and the abused and developing a Danger Assessment Tool. The team then used that tool to create specific community plans for potential victims. It made the policy actionable on a personal and individual level.
Preventing problems means changing the systems that create them in the first place. Systems produce results, so if your results are not good, you need to go back and look at your systems. Changing systems is never easy because they tend to be built into the culture of an organization. An example Heath uses is a philanthropic program that provided financial coaching to low-income people. It was a program in which many people got money (coaches, foundation employees, the foundation’s investment managers) — everyone except the low-income people, who got only coaching. This example seems quite relevant in the days of COVID-19, when no amount of coaching is going to create jobs where there are none.
The problem with a cost-benefit approach to upstream solutions is that it’s difficult to measure ROI.
Discussions of upstream interventions always seem to circle back to ROI: Will a dollar invested today yield us more in the long run? If we provide housing to the homeless, will it pay for itself in the form of fewer social service needs? If we provide air conditioners to asthmatic kids, will the units pay for themselves via fewer ER visits?
These aren’t irrelevant questions — but they aren’t necessary ones, either. Nothing else in health care, other than prevention, is viewed through this lens of saving money. Your neighbor with the heroic all-bacon diet — when he finally ends up needing heart bypass surgery, there’s literally no one who is going to ask whether he “deserves” the surgery or whether the surgery is going to save the system money in the long haul. When he needs the procedure, he’ll get it. But when we start talking about preventing children from going hungry, suddenly the work has to pay for itself.
Heath, Dan (pp. 127–128).
If everything is cause for alarm, nothing is cause for alarm. Have you ever worked for a company where everything was a crisis or a priority? As a result, you end up playing “whack-a-mole” with system failures rather than focusing resources on fixing the system. This is why it’s necessary to design “warning signs” for real system failures. But these warning signs need to also provide you with enough time and information to act effectively.
Be careful what you measure/reward. People will game the system to meet the numbers for which they will be rewarded, even if it goes against the ultimate mission.
Any upstream effort that makes use of short-term measures — which, presumably, is most of them — should devote time to “pre-gaming,” meaning the careful consideration of how the measures might be misused. Anticipating these abuses before the fact can be productive and even fun, in sharp contrast to reacting to them after the fact. Here are four questions to include in your pre-gaming:
The “rising tides” test: Imagine that we succeed on our short-term measures. What else might explain that success, other than our own efforts, and are we tracking those factors?
The misalignment test: Imagine that we’ll eventually learn that our short-term measures do not reliably predict success on our ultimate mission. What would allow us to sniff out that misalignment as early as possible, and what alternate short-term measures might provide potential replacements?
The lazy bureaucrat test: If someone wanted to succeed on these measures with the least effort possible, what would they do?
The defiling-the-mission test: Imagine that years from now, we have succeeded brilliantly according to our short-term measures, yet we have actually undermined our long-term mission. What happened? …
The unintended consequences test: What if we succeed at our mission — not just the short-term measures but the mission itself — yet cause negative unintended consequences that outweigh the value of our work? What should we be paying attention to that’s offstage from our work?
Heath, Dan. (pp. 168–169).
Go forth with humility and patience. It’s quite possible that any changes to upstream systems will have unintended consequences and will take years of resolve. This is why the goal of an upstream intervention is to create data points from which you can navigate and record progress. But don’t get too lost in aggregated data.
The lesson is clear: You can’t help a thousand people, or a million, until you understand how to help one. That’s because you don’t understand a problem until you’ve seen it up close.
Heath, Dan. (p. 236).
Bottom line: COVID-19 has revealed the underlying weaknesses in almost every organization, and it’s time to bring in the systems thinkers and let them do their thing.
If you are a systems thinker, this book might not teach you anything new about systems thinking, but it may be very helpful in understanding the resistance to upstream problem solving in established organizations. In that sense, it might make you more effective in incorporating systems thinking into your organization.
If you are a leader of an organization, consider whether your systems need improving, and who might be the players best able to improve them. It will definitely not be the ones who are satisfied with the status quo. Once you find the right team members, allow them the time and resources to innovate.
Diane Danielson is the founder of the Future Proof Research Collaborative. She has always been interested in the future of work, communities, and transportation, and she works with companies that seek to “Future Proof” their businesses.