Best practices sound good in isolation, but they can suck the life out of developers.
The other day, at lunch, I had a bit of an epiphany. I also had the pulled pork, but that’s another story. In any event, something that had been bothering me below the surface for a long time finally came into focus.
Over the past few years, the software industry has become increasingly focused on process and metrics as a way to ensure “quality” code. If you were to follow all the best practices now, you would be:
- Doing full TDD, writing your tests before you wrote any implementing code.
- Requiring some arbitrary percentage of code coverage before check-in.
- Having full code reviews on all check-ins.
- Using tools like Coverity to generate code complexity numbers and requiring developers to refactor code that has too high a complexity rating.
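For readers who haven't lived it, the first two items look like this in practice. Here's a minimal test-first sketch in Python (the `shipping_cost` function and its pricing rules are invented for illustration): under TDD, the failing tests come first, and the implementation is written afterward, just enough to make them pass.

```python
import unittest

def shipping_cost(weight_kg):
    # Under TDD, this body is written *after* the tests below,
    # with just enough logic to make them pass.
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    return 5.0 + 1.5 * weight_kg  # flat fee plus per-kilo rate

class TestShippingCost(unittest.TestCase):
    # These tests are written first, while shipping_cost doesn't exist yet.
    def test_flat_fee_plus_per_kilo(self):
        self.assertEqual(shipping_cost(2), 8.0)

    def test_rejects_nonpositive_weight(self):
        with self.assertRaises(ValueError):
            shipping_cost(0)

if __name__ == "__main__":
    unittest.main()
```

Multiply this by every function in a large application, and the time spent on test scaffolding starts to rival the time spent on the application itself.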
In addition, if your company has drunk the Scrum Kool-Aid, you would also be spending your days:
- Generating epics, stories and tasks.
- Grooming stories before each sprint.
- Sitting through planning sessions.
- Tracking your time to generate burn-down charts for management.
In short, you’re spending a lot of your time on process, and less and less actually coding the applications. I’ve worked on projects where the test cases took two or three times as long to write as the code they tested, or where shoehorning in shims to make unit tests work reduced the readability of the code. I’ve also seen developers game the tools to make their line-coverage or code-complexity numbers meet targets.
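To make the metrics-gaming point concrete, here's a contrived Python sketch (the function and test names are invented): both "tests" execute every line of `total`, so a line-coverage gate counts them identically, but only the first one can actually catch a bug.

```python
def total(prices):
    return sum(prices)

def test_total_real():
    # Exercises the code AND verifies its behavior.
    assert total([1, 2, 3]) == 6

def test_total_gamed():
    # Exercises every line of total() but asserts nothing,
    # so it inflates line coverage without testing anything.
    total([1, 2, 3])

test_total_real()
test_total_gamed()
```

The coverage number goes up either way; the quality of the code does not.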
The underlying feedback loop making this progressively worse is that passionate programmers write great code, but process kills passion. Disaffected programmers write poor code, and poor code makes management add more process in an attempt to “make” their programmers write good code. That just makes morale worse, and so on.
Now, I’m certainly not advocating some kind of Wild-West approach where nothing is tested, developers code what they want regardless of schedule, etc. But the blind application of process best practices across all development is turning what should be a creative process into chartered accountancy with a side of prison. While every one of these hoops looks good in isolation (except perhaps Scrum …), making developers jump through all of them will demoralize even the most passionate geek.
I don’t have a magic bullet here, but companies need to start acknowledging that there is a qualitative difference between developers. Making all of them wear the same weighted yokes to ensure the least among them doesn’t screw up is detrimental to the morale and efficiency of the team as a whole.
Now, this may sound a little arrogant: “I’m an experienced developer, I don’t need any of these new-fangled practices to make my code good.” But, for example, maybe junior (or specialized) developers should be writing the unit tests, leaving the more seasoned developers free to concentrate on the actual implementation of the application. Maybe you don’t need to micro-manage them with daily updates to VersionOne to make sure they’re going to make their sprint commitments. Perhaps an over-the-shoulder code review would be preferable to a formal code review process.
And as an aside, if you’re going to say you’re practicing agile development, then practice agile development! A project where you decide before you start a product cycle the features that must be in the product, the ship date, and the assigned resources is a waterfall project. Using terms like “stories” and “sprints” just adds a crunchy agile shell, and it’s madness to think anything else. And frankly, this is what has led to the entire Scrum/burndown chart mentality, because development teams aren’t given the flexibility to “ship what’s ready, when it’s ready.”
Unless the problems I’m talking about are addressed, I fear that the process/passion negative feedback loop is going to continue to drag otherwise engaged developers down into a morass of meetings and metrics-gaming.
James Turner / O’Reilly Radar | @blackbearnh