
"Practice" as it applies to development

CNN Money had an interesting article on "deliberate practice".   It essentially says that the top performers in any field get there by relentlessly learning to be better at what they do, rather than due to inborn talent, and that "deliberate practice" is not what most people do.

This is not a new idea, but it seems to be a good one.  Ronny Max blogged about it over two years ago, citing "Freakonomics" (first published in 2005) as her source.
Geoff Colvin (the author of the CNN piece) cited eight elements that distinguish deliberate practice from what most people consider practice:
  1. It is designed specifically to improve performance.
  2. It is usually most effective when repeated a lot.
  3. Continuous feedback is available.
  4. It is mentally demanding.
  5. It is difficult.
  6. Practitioners set specific goals focused on process (not just outcome).
  7. Practitioners "think about their thinking" while practicing.
  8. Practitioners evaluate their own performance carefully.
I like the concept for a couple of reasons, not least of which is that it offers hope for improvement in any endeavor.  I disagree with one of the author's central points - that "talent is overrated".  I suspect that you need a base level of talent in a field to reach the pinnacle of achievement in it, and that practice is how you turn "base talent" into "world class".


But how does this apply to software development?


Good question.  This isn't like practicing a physical skill, and most of the examples in the article were based on sports.  Here are three concepts that might prove useful:

Design the Goal (Items 1 & 6 above)
Decide in advance which skill you want to improve with any given programming exercise.  Make practicing that skill part of the goal (i.e., improving the skill is the primary goal), and decide ahead of time how you will measure your performance toward that goal.  Some example skills include (a brief sketch of one such exercise follows the list):
  • Retaining and applying knowledge of the toolset (e.g., using a specific set of libraries or a design approach).
  • Writing code rapidly without constantly referring to a book or API doc.
  • Improving fidelity to requirements.
  • Writing thorough tests quickly.
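For example (a hypothetical sketch, not from the article): suppose the skill you want to practice is writing thorough tests quickly.  You might pick a small throwaway kata - a Roman numeral converter here - write the tests first, and measure yourself by how many meaningful cases you cover in a fixed time box.  Something like the following (JUnit 4; the RomanNumerals kata and the 15-minute time box are illustrative assumptions):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Hypothetical kata class: convert small integers to Roman numerals.
    class RomanNumerals {
        private static final int[]    VALUES  = {10, 9, 5, 4, 1};
        private static final String[] SYMBOLS = {"X", "IX", "V", "IV", "I"};

        static String convert(int n) {
            StringBuilder out = new StringBuilder();
            for (int i = 0; i < VALUES.length; i++) {
                while (n >= VALUES[i]) {
                    out.append(SYMBOLS[i]);
                    n -= VALUES[i];
                }
            }
            return out.toString();
        }
    }

    // The practice goal is "writing thorough tests quickly"; the measurement,
    // decided in advance, is how many meaningful cases you cover in a fixed
    // time box (say, 15 minutes).
    public class RomanNumeralsTest {

        @Test
        public void convertsSingleSymbols() {
            assertEquals("I", RomanNumerals.convert(1));
            assertEquals("V", RomanNumerals.convert(5));
            assertEquals("X", RomanNumerals.convert(10));
        }

        @Test
        public void convertsAdditiveCombinations() {
            assertEquals("III", RomanNumerals.convert(3));
            assertEquals("XXVII", RomanNumerals.convert(27));
        }

        @Test
        public void convertsSubtractiveCombinations() {
            assertEquals("IV", RomanNumerals.convert(4));
            assertEquals("IX", RomanNumerals.convert(9));
            assertEquals("XXXIX", RomanNumerals.convert(39));
        }
    }

The point is not the kata itself; it is that the goal (thorough tests) and the measurement (cases covered per time box) were chosen before you started typing.
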
Give Yourself Constant Feedback (Items 3 & 7)
Ask yourself "how am I doing?" regularly.  Review the goals that you set for yourself to ensure that you're doing and learning what you intended.  Also consider your mental state: are you staying focused on the task?  Are you letting yourself get distracted?  Do you really want to attain this goal?

Evaluate Your Performance (Item 8)
As soon as you're done with the process part of the exercise, consider carefully what you should do differently next time. Take some notes!  You want to do it right next time, rather than avoiding the situation entirely.

