This article is part of a series called “Data-Driven Engineering,” interviews with software leaders who are using data to help their teams become even more effective.
Andrew Templeton is the Engineering Director at Tuple Labs, a digital product agency based in Austin. Andrew is an advanced AWS Trainer, and one of fewer than 80 people worldwide to hold all possible Amazon Web Services certifications, including AWS Certified DevOps Engineer. We recently had a chance to speak with Andrew to learn about how he uses data to lead his team, and what makes software developers productive.
Can you tell us a bit about Tuple and what you do?
Sure! Tuple is a digital product consultancy. We work much like you’d expect R&D + Design to operate at a product company—where we handle everything from market research, design, and user testing to engineering and operations. We design and build ambitious products for our customers. We target the higher end of the market, tackling projects that are custom, complex, and creative.
When did you recognize that data could be useful to help lead your engineering team?
A few years ago I began asking leaders I admire from other industries about how they’ve built and grown successful teams. Over and over I kept hearing that they used “KPIs” (Key Performance Indicators) to help set goals and help everyone on their team understand how well they are doing.
There’s a longstanding opinion that software engineering is more art than science. That it can’t be measured. I began to wonder whether that’s actually true… whether engineering can be measured.
We’ve never really had a meaningful way for software developers to see progress in their career path with data. So there’s a lot of ambiguity around whether people are doing well. Everything is subjective. I want everyone on my team to be able to go home confident they’ve done their job well, and not worry about whether I’ve just had a bad day.
That was the tipping point for me. I began socializing the idea with my team and searching for ways to measure team productivity.
If we’ve never really had any good metrics in software engineering, why was that a problem?
Tuple runs a bit like a professional sports team. You can feel the friendly competition. Everyone wants to know how they're doing, and if they’re pulling their weight. If everyone is individually improving, the entire team is improving.
In a company like ours, where we work on many different projects and tasks over time, it hasn’t been easy to see how everyone is doing and what the trends are. Everyone wanted a way to do that.
Alright, so it seems like you had been thinking about this for a while. What is the most misunderstood aspect of measuring software developer productivity?
I think too many people in the industry—managers, developers—subscribe to pretty flawed metrics. Some that come to mind are butt-in-seat time, or arbitrary things like “business value,” or bugs, or tickets and story points.
Butt-in-seat time clearly falls down once we recognize that some developers are more productive than others on a per-hour basis, a point everyone agrees on.
Measuring "business value" or feature and bug counts does not really work, because different projects face different challenges. Truly ambitious projects may not see business-visible results for some time while they are being worked on.
Story points fail because of a well-known problem with estimations in software engineering—story points are very, very subjective. By using these flawed metrics, stakeholders tend to over- or underestimate how productive an individual or team is.
What would good metrics actually look like?
In the book Lean Analytics, Croll and Yoskovitz define a “good metric” as something that’s a) comparative, b) understandable, c) uses a ratio or a rate, and d) changes the way you behave.
That last one is interesting, because using metrics invariably changes the way a team behaves. That’s why tracking the wrong things is so toxic, and measuring the right things is so important.
Since business value is subjective, as are story points, our team has settled on metrics that are objective and quantitative. If it’s going to be quantitative, it has to be code-focused. Commit size, frequency, consistency, code rework levels, and code volume are fair indicators when evaluated in the context of the languages and project.
Our team pays attention to:
- Commit frequency: we encourage everyone to check in multiple times daily
- Commit size: keeping work surface area small to lower risk
- Code churn: in retrospectives and to notice when someone is stuck
- Legacy refactoring: to notice when people are cleaning up the codebase
- Helping others: to recognize team players
- Review speed: because it can be tied directly to business value delivered
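As a rough illustration (this is not the tool Andrew’s team uses—GitPrime computes these for them), the first two of those metrics can be sketched from plain git history. The snippet below is a minimal sketch that assumes log output produced with `git log --numstat --pretty=format:'@%an'`; the sample data is fabricated for demonstration.

```python
from collections import defaultdict

def parse_numstat(log_text):
    """Compute per-author commit counts and average commit size
    (lines added + removed per commit) from git numstat output."""
    commits = defaultdict(int)  # author -> number of commits
    lines = defaultdict(int)    # author -> total lines touched
    author = None
    for line in log_text.splitlines():
        if line.startswith("@"):          # our assumed commit marker: @<author>
            author = line[1:]
            commits[author] += 1
        elif line.strip() and author:
            added, removed, _path = line.split("\t")
            if added != "-":              # numstat reports "-" for binary files
                lines[author] += int(added) + int(removed)
    avg_size = {a: lines[a] / commits[a] for a in commits}
    return dict(commits), avg_size

# Fabricated log output in the assumed format:
sample = "\n".join([
    "@alice",
    "10\t2\tsrc/app.py",
    "@alice",
    "3\t1\tREADME.md",
    "@bob",
    "100\t40\tsrc/big_refactor.py",
])
counts, sizes = parse_numstat(sample)
print(counts)  # {'alice': 2, 'bob': 1}
print(sizes)   # {'alice': 8.0, 'bob': 140.0}
```

Even this toy version shows why context matters: bob’s single 140-line commit might be a risky change or a legitimate refactoring, which is exactly why these numbers are evaluated per language and per project rather than in isolation.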
How did your team react when you proposed the idea of tracking metrics?
Initially I socialized the idea in small one-on-one type settings. Invariably, the team had a healthy dose of skepticism, and I had conversations like this multiple times:
Me: “Hey, what do you think about incorporating metrics on our team?”
Engineer: “I don’t like metrics.”
Me: “Why not?”
Engineer: “Because software development can’t be quantified.”
Me: “You mean not quantifiable at all? Or just that the usual metrics are unfair?”
Engineer: “Unfair. Common metrics suck. Lines of code sucks. Tickets and story points suck.”
Me: “What metrics would be fair? …”
The team went on to propose metrics in a similar vein to the ones I just described. Once people felt comfortable with the metrics and realized they would be fair and objective, everyone was pretty open to the idea.
What have been the results with your group?
We are a very transparent team. 100% of the metrics and reports we track are open to every person on every team in the whole engineering organization.
In the six months we’ve been using GitPrime, we've seen productivity go up—both qualitatively and quantitatively.
Qualitatively, it feels like things have picked up. There’s a perceivable uptick in energy, and things just feel a bit snappier. People are more conscious of pushing code every day; checking in early and often just feels better. And the "high-risk" commit metric has made people more wary of large, risky commits, so we see fewer regressions and issues.
Quantitatively, commits are about half their original average size, we're seeing 2.5x as many commits, 15% more code volume, and 10% less churn and rework.
Culturally, what has been the impact of incorporating metrics into your team?
Retrospectives are much more matter-of-fact. Conversations tend to contain less emotion, less ego, and less emphasis on artistry.
Fear-based feelings like anxiety and imposter syndrome just don’t creep in anymore. Those come up when people don’t understand how well they’re doing and try to hide it or overcompensate. Here, everyone knows where they stand at all times. In fact, we sometimes see friendly competition and laughs over lunch.
Anything else you’d like to add about your journey as a data-driven engineering leader?
Software engineering is still in its infancy.
Compare it to civil engineering: people have been building bridges for thousands of years, and as a result we know how to estimate those projects. We’ve only been writing software for decades. To compound things, our building materials are hours and brainpower, not physical goods.
The biggest disconnect in organizations comes from the mutual distrust between engineering and everyone else. That’s because among stakeholders, from the product team to C-level execs, no one really understands what software engineers do, how they work, or the unique challenges they face.
People work together best when they can see their coworkers’ contributions translated into quantifiable measures. Software development metrics are the way to help engineers improve, and to help the rest of the organization finally understand software engineering.
Andrew Templeton is the Engineering Director at Tuple Labs, a digital product agency based in Austin. He is an advanced AWS Trainer, and one of fewer than 80 people worldwide to hold all possible Amazon Web Services certifications. You can follow him on Twitter @ayetempleton.