A friend of mine pointed me to a talk by Greg Wilson, an author of software engineering books. The point of his talk is that there are a lot of bogus claims and practices around software that aren't backed up with data (for instance, the usefulness of UML in code design). He then listed several rules of thumb that *are* backed up by data, and many of them rang true with my 20 or so years of experience. Here they are:
Adding 25% more features doubles the complexity.
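A back-of-the-envelope way to read that claim (my own arithmetic, not something from the talk): if you assume complexity grows as a power of the feature count, the implied exponent is about 3, i.e. roughly cubic growth. A quick check in Python:

```python
import math

# Assumption (mine): complexity ~ (feature count)**k for some exponent k.
# "25% more features doubles the complexity" then means 1.25**k == 2.
k = math.log(2) / math.log(1.25)
print(f"implied exponent: {k:.2f}")  # -> 3.11, roughly cubic growth
```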
Project failure tends to come from poor estimation and unstable requirements.
If you have to rewrite more than 20% of a component, start from scratch.
Reading code (code review) is the most effective technique known for finding bugs. Caveat #1: Most of the value comes from the first review, during the first hour. Caveat #2: The most code a person can effectively review is ~200 lines in an hour.
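Taken together, the caveats suggest budgeting reviews rather than marathoning them. Here's a toy sketch of that arithmetic (mine, not Wilson's; the function name and sample sizes are illustrative):

```python
import math

MAX_LINES_PER_HOUR = 200  # the rule-of-thumb ceiling from the talk

def review_sessions(changed_lines: int) -> int:
    """One-hour review sessions needed at ~200 lines per hour."""
    return math.ceil(changed_lines / MAX_LINES_PER_HOUR)

print(review_sessions(150))  # 1 -- fits inside the high-value first hour
print(review_sessions(650))  # 4 -- too big for a single effective pass
```

The practical takeaway is to keep changesets small enough that one fresh hour covers them.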
Software reflects the organization that wrote it. Case in point: A Microsoft study of Windows Vista showed that fault rates were most dependent on organizational chart distance!
Lines of code is as strong a metric as any: other, more complicated metrics tend to just scale with lines of code, so they add little information.
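Part of the appeal is that LOC costs almost nothing to compute. A minimal sketch (the non-blank-line definition and the "src" path are my assumptions):

```python
from pathlib import Path

def count_loc(root: str, suffix: str = ".py") -> int:
    """Count non-blank lines in files under root with the given suffix."""
    return sum(
        1
        for path in Path(root).rglob(f"*{suffix}")
        for line in path.read_text(errors="ignore").splitlines()
        if line.strip()
    )

print(count_loc("src"))  # crude, but fancier metrics track it anyway
```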
Nobody uses UML.