Building Better Care Through Experimentation: Inside Our Data-Driven Approach
Medbridge Chief Product Officer Sarah Jacob Singh shares a behind-the-scenes look at our iterative product testing process and how it's helping us ensure that Pathways truly engages clinicians and patients.
March 28, 2025
7 min. read

Henry Ford once said, “If I had asked people what they wanted, they would have said faster horses.” This is a great reminder that in order to innovate successfully, we often need to push beyond our initial assumptions to discover what actually works best. And in healthcare product development, this lesson is especially relevant, as our work can be full of fascinating surprises. We start with what we believe to be a great idea, one that’s logical, well thought-out, and even informed by experience. But after testing that concept to see how it performs, our viewpoint can shift in unexpected ways.
At Medbridge, we take an iterative, data-driven approach to refining all of our products, including Pathways—testing, learning, and making improvements based on evidence rather than guesswork. This experimentation process isn’t just about making our product more robust; it’s about ensuring that Pathways truly engages clinicians and patients.
Let’s take a look at some of the testing processes we’ve been using with Pathways to achieve even better outcomes.
Why Experimentation Is Core to Pathways
With the products we develop at Medbridge, every feature is tied to a broader objective. In the case of Pathways, that’s primarily patient and clinician engagement. But as we’ve seen time and time again, what we expect to work doesn’t always align with how people actually use the product. That’s why it’s so important for us to take a data-driven approach to our product iteration.
Our experimentation typically follows this methodology:
Identify an objective, such as increasing patient engagement or improving clinician workflow.
Develop a hypothesis. For example, “Simplifying the Pathways home screen will make it easier for patients to navigate.”
Test through A/B-type experiments in which we compare different versions of features to see which performs better.
Analyze results and refine. If the data supports our hypothesis, we move forward; if not, we adjust and test again.
This process allows us to refine our product to better serve our users and gives providers and partners evidence that our approach works.
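The analyze-and-refine step above often comes down to a simple statistical question: is the difference between variant A and variant B real, or just noise? As a rough illustration (the function name and sample numbers here are hypothetical, not Medbridge’s actual tooling), a two-proportion z-test is one common way to check whether an experiment’s result supports the hypothesis:

```python
import math

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Compare the conversion rates of variants A and B with a two-proportion z-test."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF (erf-based, no external dependencies).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 1,000 patients per arm; variant B enrolls more.
z, p = two_proportion_z_test(120, 1000, 165, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the p-value falls below a pre-chosen threshold (0.05 is typical), the data supports moving forward with the winning variant; otherwise, the team adjusts and tests again.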
Key Learnings from Recent Experiments
So what have we learned lately through our testing? Here’s a rundown of some of the useful insights we’ve discovered about patient and clinician engagement.
SMS Messaging Can Help Boost Engagement
One of our most significant new findings is that texting can boost patient engagement when certain parameters are followed. We learned that:
Opt-in messaging gets results. When patients were asked to opt into SMS reminders, those who agreed engaged 2.5 more times per month than those who didn’t. That might sound minor, but over the course of a care program, it adds up to a significant improvement in adherence. We also tested how the wording of the opt-in message influenced response rates. When we included the provider’s clinic name in the message, opt-in rates jumped by 38 percent. This single tweak made a meaningful difference in patient engagement.
Follow-up reminders increase enrollment. Patients who received an invitation to join Pathways didn’t always act on it. By adding a follow-up reminder one day and one week after the initial invite, we saw enrollment rates increase by 37 percent. Simple, well-timed reminders helped ensure that more patients followed through.
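The reminder cadence described above, one nudge a day after the invite and another a week after, is straightforward to express in code. This is a minimal sketch with illustrative names (not Medbridge’s actual scheduling system), assuming reminders are only queued for patients who haven’t yet enrolled:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Follow-up cadence mirroring the experiment described above.
FOLLOW_UP_OFFSETS = [timedelta(days=1), timedelta(weeks=1)]

@dataclass
class Reminder:
    patient_id: str
    send_at: datetime

def schedule_follow_ups(patient_id: str, invited_at: datetime, enrolled: bool) -> list[Reminder]:
    """Queue follow-up reminders for a patient who hasn't enrolled yet.

    Patients who already enrolled get no further nudges.
    """
    if enrolled:
        return []
    return [Reminder(patient_id, invited_at + offset) for offset in FOLLOW_UP_OFFSETS]

invite_time = datetime(2025, 3, 1, 9, 0)
for reminder in schedule_follow_ups("patient-42", invite_time, enrolled=False):
    print(reminder.send_at)  # one day, then one week, after the invite
```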
A Simple User Experience Isn’t Always Better
One of the biggest lessons we took away from designing the homepage for the Pathways app is that simpler isn’t always better—at least, not in the way we initially expected. Our first homepage design included a wide range of features, but patients felt like it was too cluttered. To help streamline the experience, we redesigned it with a more focused, one-action-at-a-time approach inspired by platforms like TikTok. But then engagement unexpectedly dropped, because users felt they had lost visibility into other important tasks.
We realized that while it's important to focus on one action at a time, patients still need easy access to other actions so that they don't feel locked into a rigid workflow. Now, we’re refining a middle-ground approach that keeps the interface clean while ensuring users can easily find what they need and see next steps. Throughout this process we’ve seen once again that experimentation is key to building an experience that truly works.
Structured Pathways Improve Patient Adherence—If Clinicians Are Involved
In healthcare, there’s often a tension between intuition and data, as clinical decision-making is at times guided more by experience than by empirical evidence. Clinicians want the best for their patients and often have strong instincts about what will be most effective for them. They tend to prefer more control over patient care and might believe that manual interventions will deliver better outcomes than automated pathways. But when we run experiments, the data sometimes tells a different story.
For example, our data shows that structured pathways with limited customization actually improve adherence. Yet it’s still essential for clinicians to be closely involved. One of our biggest takeaways came from testing the effectiveness of a case management dashboard for clinicians. With the dashboard, clinicians can monitor patient activity, engagement levels, potential drop-offs, and areas of concern. The results were striking: Patients who had a clinician monitoring their progress engaged 2.4 times more than those who managed their care program independently. This reinforced that having a clinician oversee a patient’s journey makes a measurable difference in adherence and outcomes.
For organizations such as employer health plan providers that rely on self-managed patient care, even small interventions like having a single clinician monitor patients with our case management dashboard can make a big impact.
Innovating Responsibly: AI with Human Oversight
Looking ahead, we’re exploring how AI can improve patient engagement in lower-risk areas such as wellness pathways. AI-powered check-ins could help with less complex, proactive wellness programs, such as posture improvement and general fitness, where patients might not need to work closely with a clinician. However, for serious conditions, patients need human intervention and closer oversight. Not only is the technology not yet mature enough for those cases, but patients with complex conditions are less likely to follow through on care plans if they feel like they’re engaging with an impersonal system rather than a clinician who’s invested in their recovery. Our goal is to leverage automation where it’s effective, but keep human oversight where it matters most.
Bridging the Gap Between Intuition and Data
We’ve seen firsthand that some assumptions hold true—but others don’t. As Pathways evolves, we’re working to bridge the gap between intuition and data. While clinician expertise will always be at the core of patient care, data-driven experimentation helps us make stronger decisions and design solutions that truly help patients move, feel, and live better.
So what’s next?
We’re going to test whether patients engage more if we ask them via SMS whether they’ve done their exercises and then log the exercises for them, versus asking them to log in and record the exercises themselves. Will that increase engagement? We’re about to find out.
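The “log it for them” arm of that experiment could be sketched as an inbound-reply handler: an affirmative SMS reply records the session on the patient’s behalf, while anything else falls back to the self-service flow. Everything here, names, reply keywords, and the in-memory log, is illustrative, not a description of the actual implementation:

```python
# Replies treated as confirmation that the exercises were completed.
AFFIRMATIVE_REPLIES = {"y", "yes", "yep", "done"}

def handle_sms_reply(patient_id: str, reply_text: str, session_log: dict) -> str:
    """Route an inbound SMS reply for the exercise check-in experiment."""
    if reply_text.strip().lower() in AFFIRMATIVE_REPLIES:
        # Record the session server-side so the patient doesn't have to sign in.
        session_log[patient_id] = session_log.get(patient_id, 0) + 1
        return "logged"
    # Otherwise the patient is asked to log in and record it themselves.
    return "self_service"

log = {}
print(handle_sms_reply("patient-42", " YES ", log))  # logged
print(log)  # {'patient-42': 1}
```

Comparing engagement between patients routed through this handler and those asked to self-report would answer the question the experiment poses.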
At Medbridge, experimentation never stops—because every insight gets us one step closer to building the best possible patient and clinician experience.