Why engineering management needs an evidence science

April 12, 2023

Author: Cat Hicks, PhD | Read time: ~5 mins

Imagine you’ve just become a new manager for a team of software developers. It’s a long-awaited promotion: you’re excited to step up into engineering management, you’re working on technology you care about, and you can’t wait to put so many of your thoughts and ideas about engineering into practice with your brand-new team.

But as the weeks tick down to your role change, your excitement is met with a hefty dose of new information. Most alarmingly, you’re told that this team is struggling. Maybe your leader has told you, “they keep missing deadlines, they’re not shipping properly, I need you to get this team on track.” Maybe your new manager colleagues have said, “Our teams have felt some friction planning with this team. There’s a lack of trust, we need you to fix it.” 

You hear a lot about the outcomes and the frictions being felt by other teams, but you know that your job is to understand what’s really happening with this team. And lots of people are telling you there’s a problem, but no one is telling you how to fix it.

What do you do next?

The above example is a vignette that I’ve used in behavioral research with developers: an example story that helps people think through the real and specific challenges that they face in complex work. In our research, we try to gather and learn from managers’ stories about grappling with the “what next” in common situations. And this particular one resonates with a lot of managers. Understanding why a team is struggling to ship and deliver software is a common challenge.

In our research we’ve learned that thriving software teams place a high value on diagnosis, observing their processes, making their effort visible, and truly using evidence to inform their decisions. This has led me to believe that engineering managers can learn a lot from thinking of themselves as evidence scientists. 

An evidence scientist is someone who tries to implement and then evaluate change in the real world, in situated, specific contexts. For an engineering manager, this might be a new code review process, a change in tooling, or an investment in behavior changes such as creating a culture of more positive and frequent feedback within a team. These solutions might be very different in their specifics–process, tools, or culture–but the mindsets and skillsets we bring to making sure we’re making good decisions about them are the same. Evidence science is practical; it includes evaluating the relative impact of different changes, and asking what’s the biggest thing we can change for our teams. 

In the Developer Success Lab, we’ve gathered examples of what managers said they would do in this scenario. Across 465 engineering managers and leaders, we saw the vast majority talk about gathering better data about why the failures and frictions happened in the first place, and then using that data to make a change. This is a continuous cycle, not a one-time fix. Almost every single manager in our research emphasized that no team actually starts “from scratch” with data. Developers have their own work histories, knowledge, and insight. Managers emphasized the importance of building trust with a new team by modeling an open-minded and caring commitment to measurement as a shared practice.

So let’s imagine again you’re that new manager. What do you do next? Here are some of the key strategies we heard from the experienced managers in our research:

1. Create space for real talk.

Many managers highlighted the importance of making sure you truly understand why planning was failing to reflect reality. In our previous research, we’ve found serious inconsistencies and uncertainties in what information developers have to learn from; for example, many developers don’t know if their managers use software metrics at all! Managers suggested spending your initial time with the team gathering past examples of projects and doing a blameless, collaborative evaluation of whether the planning had really worked and where there were gaps in understanding or information sharing.

2. Make it easier to access and then use information.

Managers frequently mentioned that both they and their teams can struggle to accurately recall “off-the-cuff” information about how long tasks took. Suggested strategies to reduce this load included visualizing work velocity and rates over time, and putting reflection moments on the calendar. Many managers noted that it was important to represent different types of information together, which underlines the importance of remembering that developer productivity is multidimensional, and that we should get clear on the differences between measures of production, productivity, and performance.
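To make this concrete, here is a minimal sketch (my own illustration, not an artifact from the research) of what “visualizing velocity and rates over time” might start from: a few lines of Python that roll hypothetical task cycle times up by the week they finished. The task records and field layout are assumptions for the example.

```python
from datetime import date
from collections import defaultdict

# Hypothetical task records: (task_id, started, finished)
tasks = [
    ("T-1", date(2023, 3, 6), date(2023, 3, 9)),
    ("T-2", date(2023, 3, 7), date(2023, 3, 14)),
    ("T-3", date(2023, 3, 13), date(2023, 3, 15)),
    ("T-4", date(2023, 3, 14), date(2023, 3, 24)),
]

def weekly_cycle_times(tasks):
    """Group tasks by the ISO week they finished and average their
    cycle time (calendar days from start to finish)."""
    buckets = defaultdict(list)
    for _, started, finished in tasks:
        cycle_days = (finished - started).days
        year, week, _ = finished.isocalendar()
        buckets[(year, week)].append(cycle_days)
    # Average per week, sorted chronologically
    return {wk: sum(v) / len(v) for wk, v in sorted(buckets.items())}

for (year, week), avg in weekly_cycle_times(tasks).items():
    print(f"{year}-W{week:02d}: avg cycle time {avg:.1f} days")
```

Even a simple rollup like this, shared on a dashboard or in a retro, gives the team a common artifact to reflect on instead of relying on off-the-cuff recall.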

3. Make your team your most valuable source of evidence.

Woven throughout managers’ responses in our research was a constant emphasis that developers need to feel “invited in” and like partners in the detective work of evidence science. This connection echoes a finding from our previous research on Developer Thriving, which found that developers’ agency to speak out about the ways technical work is measured and evaluated is significantly associated with increased productivity. Managers suggested strategies such as interviews with individuals, holding team-level historical reviews of story points or cycle times, and ensuring that all perspectives are given a way to contribute to the “diagnosis” of the team.

When I talk to engineering leaders and managers about our research on why measuring developer experience matters, I frequently repeat something that an engineering leader once told me: “Never forget that truly listening is a first step, is an action.” 

Rigorous evidence science doesn’t mean ignoring human experience. From the managers in our research, we’ve learned that one of the most productive things managers can do when they face a moment of friction is elevate information cycles and reflection practices on their teams. I believe in this because we have robust empirical research that shows its impact, but also because it’s the same approach I take as a scientist working to help developer teams. We need to bring real data together with compassionate understanding, and work to identify the key levers and causal factors behind the patterns that emerge over and over again.

With the tools of evidence science, managers can become pattern masters, helping their teams synthesize and spot unknown connections. Evidence science lets us remove the burden of being the only one with the “fix,” and invite our teams in to solve problems together. As one manager told us, the difficult moments of helping a team improve are “the real opportunity to deliver a higher quality product.” 


Cat Hicks, PhD

Social & Data Scientist

Cat Hicks is the Director of the Developer Success Lab and a social science leader in tech with expertise leading applied research teams to explore complex areas of human behavior, empirical intervention, and evidence science. Cat is a research affiliate in STEM education research at UC San Diego and an advocate for increasing education access. She holds a Ph.D. in Quantitative Experimental Psychology from UC San Diego, was an inaugural Fellow in the UC San Diego Design Lab, and has led research at organizations such as Google and Khan Academy.