
I think a bigger epidemic is that we're putting too much emphasis on "do this" and "do that" and "if you don't do this then you're a terrible programmer". While that may sometimes be true, what matters much more is having competent, properly trained professionals, who can reason and think critically about what they're doing, and who have a few years of experience under their belt. Just like in other skilled trades, there's a certain kind of knowledge that you can't simply explain or distill into a set of rules; you just have to know it. And I see that in the first example in this article, where the junior programmer is writing terrible tests because he doesn't yet know why they're bad tests.


It seems to me that management is taught that a dependence on expensive experts is a problem to be optimized away. They want to manage the development process in a way that allows them to easily swap out one developer or team for another, or to ramp up production simply by increasing the number of developers assigned to a task.

It is almost as if they see the success that we have had in dev/ops with the "cattle, not pets" philosophy, and want to apply that in their own field. Making the subject behave consistently and predictably, whether it is a machine or a human professional, would be a prerequisite for that.


Will they succeed?


Sure, as long as their users unanimously abide by a very large rulebook ;)


I'd say they are "succeeding" in the sense that they continue to make money. Whether they're truly successful in the sense of providing enough value to their customers to justify that money, and treating their programmers right, is another story. I've seen enough cases where they do neither, while the executives make enough money to live comfortably for years by exploiting both customers and their own programmers.


That's why I'm highly skeptical when I hear the word "best practices".

Sure, the intention is good, but it promotes mindless repeating of patterns over thinking about what really helps.


On the other hand, every discipline I can think of has its own set of best practices. Why should software development be any different? I know a lot of people are prone to mindless repetition, but that's not a fault of best practices.


I'm not against the idea per se, just highly skeptical when I hear it used, because more often than not, I've seen it used to promote mindless repetition of previously used patterns.



