Analytical 42: Activity

18 June, 2020


Showing activity is considered by many to be THE critical prerequisite to managing their team's productivity. What if they're wrong?

Transcript available
[Music] Everything can be improved, iterated and refined and if you don't think that's true, maybe you haven't analyzed it enough. Calculator choices carefully considered. Absolutely analytical. Analytical is part of the Engineered Network. To support our shows, including this one, head over to our Patreon page and for other great shows visit today. Activity. Are you working today? Are you being productive today? Are you demonstrating activity today? You might think activity is a thing that others need to see, but I wonder if that's really completely true. When we combine produce with activity we get productivity. On balance though, It's more about expectation over a certain time horizon to get a result. So, if we were to look at an activity that a major corporation, let's take for example Apple because they're a good example, why not? And let's say that they release a new phone every year or maybe an incremental update every year and then a major update every two years. So, when people look at what Apple is doing in the meantime between those two sorts of release dates, every 12 months, to 24 months, there's a lot of speculation, a lot of guesswork, a few leaks of information sometimes from the inside. But ultimately, though, there's very, very rarely any actual evidence that there's any activity happening. There's no proof that anything is going on. We all assume, we all know, but there's no evidence to suggest it. There's no external visibility that anything is happening. So, so far as we know, Apple's not doing anything. Well, we know that that's not true though, because after 12 months or 24 months, we see the results. So we know, and we can infer in the interim that activity was happening. When the next device is released, it's probably amazing. People love it. Well, probably, maybe they do, maybe they don't, but faith is restored. There was activity all that time. Of course there was and we didn't need evidence of that to get a good result at the end. 
So our expectations are set because we understand implicitly, trained by observing a cycle of productivity, that these things, like making a new phone, take a certain amount of time, effort and energy, and that's normal and that's okay. A lack of visible activity isn't indicative of a lack of productive progress. As we break that down, it comes back to an underlying trust that actions are occurring.

So let's consider the opposite extreme: micromanagement and metric-driven reporting. Let's say you're a new manager coming into a business and you're trying to understand what a team is actually doing. You have no long-term view, no real time horizon, no set expectations, no evidence of a productive output and hence no evidence of an end result. On the surface the team isn't visibly doing much, it's not producing much visible activity, and the pressure is on to demonstrate that activity is happening. To do that, the new manager sets some activity-based KPIs and starts reporting those KPIs to the next layer of management above them. The team shifts from a trust-based system of work handover and internal progress to pushing to close bug reports faster, open change control requests, and formalize interactions with other team members, and everything starts to take on a transactional overhead. The problem is that overhead isn't free. Let's say that overhead costs 25% of the team's overall efficiency: energy previously spent doing the doing is instead spent proving the doing, and the final delivery is proportionally late, or the hours the team works increase to compensate for all that extra overhead.
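To put a number on that, here's the arithmetic behind "proportionally late" as a tiny sketch. The 25% figure is the hypothetical from above, and the 12-week schedule is an invented example:

```python
# If 25% of the team's effort shifts from doing the work to proving the
# work, the same amount of real work now takes 1 / (1 - 0.25) times as
# long at constant hours per week.
overhead = 0.25
planned_weeks = 12                        # hypothetical original schedule
actual_weeks = planned_weeks / (1 - overhead)
print(actual_weeks)                       # 16.0: a four-week slip
```

The same ratio works in the other direction: to hold the original date, weekly hours have to rise by a third, not a quarter, which is why the cost of the overhead is so easy to underestimate.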
The manager, though, is demonstrating that activity in the team is happening, and their management above have great visibility of that activity, but it's come at a huge cost. Trust is eroded, interactions in the team and between teams become transactional and impersonal, actual productive work is slowed down, and the result is a net loss overall. Morale starts to drop, people become more distant and more hostile, and an unhealthy tension builds between teams. Silos begin to be built for some perceived self-preservation, or superiority versus inferiority: this team is doing better than that team, so we're not going to help them, and they're not going to help us. That works directly against teamwork in the context of the whole organization.

The infection of KPI-driven or metric-driven activity measurement seems to come from ignorance, I think, and a belief that humans can be evaluated as discrete productive entities with different levels of performance for different activities. That kind of mindset completely ignores the fact that humans are complicated. Invisible work like tidy-up activities is a real thing, as is a desire to help others in need, along with learning curves. And we humans change focus based on our outside lives, pressures and influences. That changes any equation that people might ever document to say, "This person can produce this amount of work in this amount of time," because all of it is extremely variable.

Ultimately, if you're going down the path of metric-driven activity measurement, with a push down the chain to demonstrate that movement is occurring in a team and that activity is happening, then the longer the time horizon for the final deliverables, the more the pressure will build. You get comments like, "Oh, are they still working on that? It's been two weeks, it's been two months, it's been a year."
When in fact it has always taken a year between phones, just like every other year for the past decade. Your ignorance of the past is not a measure of the team's performance; it's just a measure of your own ignorance. All it takes is a lack of trust, or even a so-called respectful challenge of "why is this taking so long, can we go faster?", and that drives evidence-based management instead of trust-based management. Now don't misunderstand: I think it's very important to acknowledge that cost tracking is important, sure. Milestone reporting is also important, and reporting up the management chain about roadblocks and disruptions is very, very important. But I do think that reporting on activity alone is entirely and completely pointless on every single level. And yet it seems to persist, probably because it's easier for some people to quantify and understand, despite the fact that it doesn't actually add any value.

Now, I've been talking in very general terms, so let's talk about a specific example in programming terms. One activity-based metric is bugs closed over a time period. We could just as easily be talking about service tickets and so on; it doesn't really matter, but let's go with bugs closed over time. Assume there are three teams, all working on different sections of the code: maybe different code libraries, maybe physically different controllers, it really doesn't matter. The business is trying to determine performance between those teams, to grade them, by measuring activity metrics. Team A closes 5 bugs, Team B closes 10 and Team C closes 50 bugs, all over the same time period. So Team C, with 50 bugs: they're the highest performing team, aren't they? Well, maybe they are, maybe they're not. What if I told you that Team A was objectively the highest performing team?
Because when we deep dive into it, it turns out that Team A worked 25% more hours than the other two teams and resolved issues that had caused system crashes and catastrophic failures, issues that both Teams B and C had been trying, and failing, to fix for months. It turns out that 30 of the bugs on Team C's list were typographical errors, and the balance were very straightforward, low-priority bugs that didn't add any specifically huge value, yet they all counted towards the metric, and it looked amazing, but it just wasn't.

The problem with the metric, you might say, is the lack of depth of detail, since not all bugs are created equal in terms of the effort to resolve them and their impact on the system. That's true, but then we reach the point of how we honestly, objectively and fairly assess just how difficult a bug is to resolve. Sometimes we measure the priority of bugs: a P1, priority one, can't ship without it; a P2 can wait for the next service pack; and so on. But estimating the effort to fix a bug is extremely difficult, even for experienced developers. There's always an element of interpretation, professional judgement and balance in determining which team was actually the highest performing, and ultimately only a manager who technically understands what their team is doing can interpret this. If they can't understand it, their only option is to metricize their team's work. Of course there are other options: maybe they don't want to spend the time to understand (as in, they're too lazy), or maybe the management layer above demands that they must have a metric. So what strikes me when I come across the situation of activity measurement as a measure of output?
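The raw-count trap above can be sketched in a few lines of code. This is purely illustrative: the weights and the bug lists are made-up numbers chosen to mirror the three teams in the example, not data from any real tracker, and (as argued above) any real weighting would itself be a difficult judgement call:

```python
# Hypothetical illustration: raw bug counts versus an effort/impact
# weighting rank the same three teams completely differently.

# Made-up weights: a P1 crash fix counts far more than a typo fix.
WEIGHT = {"P1": 50, "P2": 5, "typo": 1}

# Bugs closed by each team over the same period, as (priority, count).
closed = {
    "Team A": [("P1", 5)],                 # 5 catastrophic crash fixes
    "Team B": [("P2", 10)],                # 10 routine fixes
    "Team C": [("typo", 30), ("P2", 20)],  # 30 typos + 20 routine fixes
}

raw = {team: sum(n for _, n in bugs) for team, bugs in closed.items()}
weighted = {team: sum(WEIGHT[p] * n for p, n in bugs)
            for team, bugs in closed.items()}

print(sorted(raw, key=raw.get, reverse=True))       # Team C tops the count
print(sorted(weighted, key=weighted.get, reverse=True))  # Team A tops the score
```

Under the raw count, Team C "wins" 50 to 5; under the weighted score, Team A comes out on top. The point isn't that the weighted score is right, it's that the ranking flips entirely depending on a weighting someone had to invent.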
The things that strike me are that either A) the manager doesn't understand what their team does, or B) they're too lazy to do their job, or C) they've allowed their management to push irrelevant activity metrics upon them. For any of these reasons, they shouldn't be in management; try something else, like cross stitch or something. Thinking of it like a mathematical equation for a second: productive plus activity equals productivity, which is what we care about; whereas activity minus productive equals, well, just activity, which is a waste of time, a waste of money, just a waste of everything.

If you're enjoying Analytical and want to support the show, you can via Patreon. With a thank you to all of our patrons, and a special thank you to our silver producers Mitch Biegler, John Whitlow, Joseph Antonio, Kevin Koch, Oliver Steele and Shane O'Neill, and an extra special thank you to our gold producer known only as R. Patrons have access to early-release, high-quality, ad-free episodes as well as bonus episodes, and this is done via Patreon. Visit our Patreon page to learn how you can help this show continue to be made. Of course, there are lots of other ways to help, like favoriting the episode in your podcast player app, or sharing the episode or the show with your friends or via social media. Some podcast players let you share audio clips of episodes, so if you have a favorite segment, feel free to share that too. All of these things help others discover the show and can make a big difference.

You can follow me on the Fediverse at [email protected], on Twitter @JohnChidgey, or the network at @engineered_net. Accept nothing. Question everything. It's always a good time to analyze something. I'm John Chidgey. Thanks so much for listening. [Music]
Duration: 13 minutes and 4 seconds

Show Notes


Episode Gold Producer: 'r'.
Episode Silver Producers: Mitch Biegler, John Whitlow, Joseph Antonio, Kevin Koch, Oliver Steele and Shane O'Neill.
Premium supporters have access to high-quality, early-released episodes with a full back-catalogue of previous episodes.


John Chidgey


John is an Electrical, Instrumentation and Control Systems Engineer, software developer, podcaster, vocal actor and runs TechDistortion and the Engineered Network. John is a Chartered Professional Engineer in both Electrical Engineering and Information, Telecommunications and Electronics Engineering (ITEE) and a semi-regular conference speaker.

John has produced and appeared on many podcasts including Pragmatic and Causality and is available for hire for Vocal Acting or advertising. He has experience and interest in HMI Design, Alarm Management, Cyber-security and Root Cause Analysis.

Described as the David Attenborough of disasters, and a Dreamy Narrator with Great Pipes by the Podfather Adam Curry.

You can find him on the Fediverse and on Twitter.