There is debate over the term "10x developer": the idea that some developers are ten times as productive as the average programmer. The idea was popularized by early research that was later refuted, but it lives on. And if it's true, how can we all become "10x developers"?

From the time I began taking programming classes in school, which is when I was first exposed to other people's programming abilities, I knew that skill levels varied, and varied greatly. In an arithmetic class, the best student might score 20 or 30 percent higher than the average, but there's no way to score 100 times the average; that level of ability simply isn't measured in class. In a programming class, however, there will be a number of students who can't complete a working program in the given amount of time, and a student who codes slowly but eventually gets something functioning is infinitely better than one who never does. Those who can't typically don't go on to become career programmers, though they might need to pass the class for some IT-related business management degree and go on to become Excel wizards.

But among professional programmers, what is the real spread in productivity levels?

© 2010-2014 Saigonist.