Do the right thing. Assuming you know what that is.
I have kind of a love/hate thing going for Roy Osherove's blog. The "hate" part comes because he always challenges my perspective when I least expect it. Some of his posts seem like they are baiting people and it is easy to discount them as biased based on his position with TypeMock.
But given what I think I know about him, I read these "inflammatory" posts with a different view. That is, as someone who is challenging the view of my personal echo chamber. I read tons of posts extolling the virtues of mock objects and Rhino Mocks. So it hurts my little brain to see someone I respect saying it's okay to keep your current "creative" design and still be able to test it.
It starts, "Yeah, whatever. Like I'd ever do that." Then I start mulling it over and grumble to myself, "well, I guess that would have made sense in this past situation", then "actually, that makes sense for quite a few scenarios." Eventually, the train of though leads to my lying on my bed sobbing on the phone with my old employer to apologize for the colourful analogies I made to describe his project when I was unceremoniously let go for rocking the boat too much. Which is an interesting thing to account for when drawing up the timesheet for my current contract.
Anyway, this Roy-love isn't the real reason for this post but hopefully, you're used to the hillbilly's verbose lead-ins.
Roy's most recent post is another example of one that seemed bound to give me a headache from the title alone. But luckily, it touches on a subject I've thought about before, particularly when dealing with the fledgling Bahamian software industry.
I won't paraphrase because chances are, you've read it already, but the question it tickled in my mind was, "Are the best practices I've adopted over the last two years practical?"
I've mentioned this a little before from the perspective of a small application for my family who can't afford a senior developer should I take up shark-baiting in the near future. And it's going to be hard to talk about this without sounding elitist so I'll call up the good will of my 98 previous posts at CodeBetter and hope y'all assume the best.
I took JP's Nothing But .NET course over a year ago and had an absolute blast. But quite a lot of people struggled with it. Since then, I've made a more concerted effort to learn more about things like mocking, dependency injection, the SOLID principles, etc, etc, and so on and so forth. It hasn't always been easy but it's been tremendously rewarding. Learning all of it has made development fun again. Plus it's allowed me to connect with a ton of other people both as mentor and learner. And as a mentor, I've seen my share of people struggle with it. Even before I started, I saw a lot of people fight to understand things like AJAX calls, NAnt scripts, even CSS.
By all accounts, these are reasonably bright people. They want to do a good job and are receptive to new ideas. But quite frankly, some of these things are hard. Let no one forget that learning how to properly use mock objects is *not* an easy task and until you "get" them, they will seem like unnecessary overhead. I resisted TDD for many a moon. Even today, it's still not quite second nature. That's mostly because I'm stuck in Livelink-land these days, which contains code that would make Michael Feathers shake his head in defeat.
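To put something concrete behind that claim, here's a wee sketch of the ceremony a newcomer sees (Java rather than C#, purely for illustration, and every name in it, OrderProcessor, Notifier and so on, is invented for the example): an interface and a hand-rolled test double that exist for no other reason than to let the test isolate the class under test.

```java
import java.util.ArrayList;
import java.util.List;

// The seam: an interface the production code depends on,
// introduced so a test can substitute a fake.
interface Notifier {
    void send(String message);
}

class OrderProcessor {
    private final Notifier notifier;

    // The dependency is injected rather than constructed internally.
    OrderProcessor(Notifier notifier) {
        this.notifier = notifier;
    }

    void process(String orderId) {
        // ...imagine real work here...
        notifier.send("Processed order " + orderId);
    }
}

// The hand-rolled test double: records calls instead of notifying anyone.
class RecordingNotifier implements Notifier {
    final List<String> messages = new ArrayList<>();

    public void send(String message) {
        messages.add(message);
    }
}

public class MockSketch {
    public static void main(String[] args) {
        RecordingNotifier fake = new RecordingNotifier();
        new OrderProcessor(fake).process("42");

        // The "test": verify the interaction against what the fake recorded.
        if (!fake.messages.contains("Processed order 42")) {
            throw new AssertionError("expected notification was not recorded");
        }
        System.out.println("Recorded: " + fake.messages);
    }
}
```

A tool like Rhino Mocks generates the RecordingNotifier part for you, but you still have to design around interfaces and injected dependencies to use it at all, and until that payoff clicks, those two extra types look like pure ceremony.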
When I entertain these thoughts, it's usually a battle between "should we cool our heels a bit until we hit the tipping point" and "should we keep going full tilt until the message starts getting across". As Roy mentions in his post, the learning curve is steep. Do we keep pushing so that more people get over it or do we lower the learning curve until most people get it, then raise it a little?
I'd like to think we can do the first. The second seems like giving up. And worse, if we start to "dumb things down" in actual projects, it will be that much harder to actually lower the learning curve because no one will be pushing the boundaries. No Fluent NHibernate, no MvcContrib (or even ASP.NET MVC), no StructureMap. We've all seen what teaching to the lowest common denominator has done for the North American education system.
Yes, we can put the onus on programmers to "do a better job" but let's face it, these people have lives. My dad is a land surveyor and my mom was a registered nurse. By all accounts, they were/are very good at their jobs and I remember very few instances of them taking their work home or advancing their learning outside of work. With software development, it is almost expected that if you want to get better, you need to do it on your own time. And when you're done, you face going in to work to find that the rest of your team hasn't done their part and thus isn't amenable to the changes you want to make.
So how far can we reasonably push people? The work needs to actually get done and it seems a good chunk of programmers still don't put any special effort into making code maintainable over the long term. Do we need to change the message? Or the medium? Are user groups and code camps doing their part or simply enforcing the status quo? Is alt.net making a difference or fragmenting the industry? Or simply being ignored?
I'll be interested in hearing people's comments on this as I actually have an over-arching reason for this line of questioning.
Lord Tunderin' Jayzus, what a friggen essay this turned out to be. Now I gotta go dig up a couple of halfway-relevant images to balance out this tome. Serves me right for thinkin'.
Kyle the Introspective