Size Matters

Some say that software development is challenging because of complexity. This might be true, but the observation by itself does not help us find ways to reduce complexity. We need a better way to explain complexity to non-technical people.

The reality is that when coding a project, size matters. Size is measured by the number of pathways through a code base, not by the number of lines of code, and it is roughly proportional to the number of function points in a project.

There are many IT people who succeed with programs of a certain size and then fail miserably when taking on programs that are more sophisticated. Complexity increases with size because the number of pathways grows exponentially in large programs.

Virtually anyone (even CEOs 🙂 ) can build a hello, world! application: an application with a single pathway through it, about as simple as code gets. Some CEOs write this trivial program and incorrectly convince themselves that development is easy.

#include <stdio.h>

int main(void) {
    printf("hello, world\n");
    return 0;
}

If you have an executive who can’t even complete hello, world! then you should take away his computer 🙂

Complexity Defined

As programs get more sophisticated, the number of decisions that have to be made increases and the depth of the call tree increases. Every non-trivial routine has multiple pathways through it.

If your average call depth is 10 with an average of 4 pathways through each routine, then this represents over 1 million pathways (4^10). If the average call depth is 15, it represents over 1 billion pathways (4^15 ≈ 1.07 billion).
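
As a rough back-of-the-envelope illustration (a hypothetical sketch that assumes a uniform branching factor, which real call trees never have), the pathway count is simply the branching factor raised to the power of the call depth:

#include <stdio.h>
#include <math.h>

int main(void) {
    double branches = 4.0;         /* average pathways through each routine */
    int depths[] = { 10, 15 };     /* average call depths to compare        */

    for (int i = 0; i < 2; i++) {
        /* pathways = branches ^ depth */
        printf("call depth %2d -> about %.0f pathways\n",
               depths[i], pow(branches, depths[i]));
    }
    return 0;
}

/* call depth 10 -> about 1048576 pathways    (over 1 million)
   call depth 15 -> about 1073741824 pathways (over 1 billion) */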

Increasingly sophisticated programs have greater call depth than ever, and distributed applications increase the call depth even further because the call depths of cooperating systems are additive. This is what we mean by complexity; it is impossible for us to test all of the different pathways in a black box fashion.

Now, in reality, not every combination of pathways is possible, but you only have to leave holes in a few routines and you will have hundreds, if not thousands, of pathways where calculations and decisions can go wrong.

In addition, incorrect calculations or decisions higher up in the call tree can lead to difficult-to-find defects that blow up far away from the source of the problem.

What are Defects?

Software defects occur for very simple reasons: an incorrect calculation is performed that causes an output value to be wrong. Sometimes there is no calculation at all because input data is not validated for consistency, and that data is either stored incorrectly or goes on to cause incorrect calculations later.

We only recognize that we have a defect when we see an output value and realize that it is incorrect. More likely, QA sees it and tells us that we are wrong.

Basically, we follow a pathway that is correct through nodes 1, 2, 3, 4, and 5. At node 6 we make a miscalculation, then we have incorrect values at nodes 7 and 8, and we finally discover the problem at node 9.

So once we have a miscalculation, we will either continue to make incorrect calculations, or make incorrect decisions and go down the wrong pathways (where we will then make further incorrect calculations).
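
To make this concrete, here is a small, purely hypothetical C sketch (the function names and numbers are invented): the miscalculation happens in one routine, the bad value flows silently through its callers, and the symptom only shows up at the output, far from the source.

#include <stdio.h>

double monthly_rate(double annual_rate) {
    return annual_rate / 10.0;    /* the defect lives here: should divide by 12 */
}

double monthly_interest(double principal, double annual_rate) {
    return principal * monthly_rate(annual_rate);   /* silently propagates the bad value */
}

int main(void) {
    /* The wrong number only becomes visible here, two calls away from the miscalculation. */
    printf("monthly interest: %.2f\n", monthly_interest(100000.0, 0.05));
    return 0;
}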

Not all Defects are Equal

It is clear that the greater the distance between a miscalculation and its discovery, the harder the defect is to find. The greater the call depth, the greater the chance of a large distance between origin and detection; in other words:

Size Matters

Today we build sophisticated systems out of many cooperating applications, and the number of pathways grows exponentially with the call depth of the overall system. This is what we mean by complexity in software.

Reducing Complexity

Complexity is reduced for every function where:

  • You can identify when inconsistent parameters are passed to a function
  • All calculations inside of a function are done correctly
  • All decisions through the code are taken correctly

The best way to address all three issues is through formal planning and development. Two methodologies that focus directly on planning at the personal and team level are the Personal Software Process (PSP) and the Team Software Process (TSP), both created by Watts Humphrey.

Identifying inconsistent parameters is easiest when you use Design by Contract (DbC), a technique pioneered by the Eiffel programming language. It is important to use DbC on all functions that are on the core pathways of an application.
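
Eiffel builds contracts into the language; in most other languages you have to approximate the idea. Here is a minimal sketch in C (the function and its checks are hypothetical), using assertions as preconditions and a postcondition so that inconsistent parameters are caught at the function boundary instead of corrupting later calculations:

#include <assert.h>
#include <stddef.h>

double average(const double *values, int count) {
    /* Preconditions: reject inconsistent parameters immediately. */
    assert(values != NULL);
    assert(count > 0);

    double sum = 0.0;
    for (int i = 0; i < count; i++)
        sum += values[i];

    double result = sum / count;

    /* Postcondition: the result is a real number (NaN fails both comparisons). */
    assert(result <= 0.0 || result > 0.0);
    return result;
}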

Test Driven Development (TDD) helps ensure that all calculations inside a function are done correctly, but only if you write tests for every pathway through the function.
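
Here is a minimal sketch of the idea in C (the function and its tests are hypothetical): the function under test has exactly two pathways, and there is one test per pathway, written before the body under TDD.

#include <assert.h>

long final_price_cents(long price_cents, int is_member) {
    if (is_member)
        return price_cents * 90 / 100;   /* pathway 1: members get a 10% discount   */
    return price_cents;                  /* pathway 2: everyone else pays full price */
}

int main(void) {
    assert(final_price_cents(10000, 1) == 9000);    /* covers the member pathway     */
    assert(final_price_cents(10000, 0) == 10000);   /* covers the non-member pathway */
    return 0;
}

Note that it is the one-test-per-pathway discipline, not TDD by itself, that gives you confidence in the calculations; a single test would leave one branch unverified.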

Making sure that all calculations are done correctly inside a function and that correct decisions are made through the code is best done through code inspections (see Inspections are not Optional and Software Professionals do Inspections).

All of the techniques that can be used to reduce complexity and prove the correctness of your program are covered in Debuggers are for Losers. N.B. The debugger as your only formalism will only work well for systems with low call depth and low branching.

Conclusion

Therefore, complexity in software development is about making sure that all the code pathways are accounted for.  In increasingly sophisticated software systems the number of code pathways increases exponentially with the call depth. Using formal methods is the only way to account for all the pathways in a sophisticated program; otherwise the number of defects will multiply exponentially and cause your project to fail.

Only projects with low complexity (i.e. small call depth) can afford to be informal and rely on debuggers alone to get control of the system pathways. As a system gets larger, only the use of formal mechanisms can reduce complexity enough to develop sophisticated systems. Those formal mechanisms include:

  • Personal Software Process and Team Software Process
  • Design by Contract (via Aspect Oriented Programming)
  • Test Driven Development
  • Code and Design Inspections

NO Experience Necessary!!!

Did you know that we have never found a relationship between a developer’s years of experience and code quality or productivity?

The original study that found huge variations in individual programming productivity was conducted in the late 1960s by Sackman, Erikson, and Grant (1968).

This study has been repeated at least 8 times over 30 years and the results have not changed! (see below)

Sackman et al studied professional programmers with an average of 7 years’ experience and found that:

  • the ratio of initial coding times was about 20 to 1
  • the ratio of debugging times was over 25 to 1
  • the ratio of program execution speeds was about 10 to 1
  • the ratio of program sizes was 5 to 1

They found no relationship between a programmer’s number of years of experience and code quality or productivity. That is, there was NO correlation between experience and productivity (i.e. the ability to produce code) and NO correlation between experience and quality (i.e. minimizing defects).

Think about that for a minute…

That is, the worst programmers and the best programmers formed distinct groups, and each group contained people with both low and high levels of experience. Whether training helps developers is not indicated by these findings; they show only that years of experience do not matter.

Legal considerations aside, this means that it makes more sense to let go of expensive poor performers with many years of experience and hire good performers with few years of experience!

Results Have Been Confirmed for 30 Years!

There were flaws in the study; however, even after accounting for them, the data still shows more than an order of magnitude difference between the best programmers and the worst, and that difference was not related to experience. In the years since the original study, the general finding that “there are order-of-magnitude differences among programmers” has been confirmed by many other studies of professional programmers (full references at the end of the article):

  • Curtis 1981
  • Mills 1983
  • DeMarco and Lister 1985
  • Curtis et al. 1986
  • Card 1987
  • Boehm and Papaccio 1988
  • Valett and McGarry 1989
  • Boehm et al 2000

Technology is More Sophisticated, Developers are not

You might think that we know much more about software development today than we knew in 1968; after all, today:

  • we have better computer languages
  • we have more sophisticated technology
  • we have better research on effective software patterns
  • we have formal software degrees available in university

It turns out that all of these things are true, yet we still see order-of-magnitude differences among programmers, and the difference is not related to years of experience. That means there is some other x-factor that drives productive developers; that x-factor is probably the ability to plan and make good decisions.

The bad news is that if you are not a productive developer writing quality code, then you will probably not get better simply by accumulating years of experience.

Developers face decisions on how to structure their code every day. There is always a choice when it comes to:

  • laying out code pathways
  • packaging functions into classes
  • packaging classes into packages/modules

Because developers face coding decisions, many of which are complex, the best developers plan their work and make good decisions. Bad developers just ‘jump in’; they assume that they can always rewrite code or make up for bad decisions later. Bad developers are not even aware that their decision processes are poor and that they could become much better by planning their work.

Solution might be PSP and TSP

Watts Humphrey tried to get developers to understand the value of estimating, planning development, and making decisions in the Personal Software Process (PSP) for individuals and the Team Software Process (TSP) for teams, but only a handful of organizations have embraced them. Capers Jones has analyzed over 18,000 projects and discovered that1:

PSP can raise productivity by 21.2% and quality by 31.2%
TSP can raise productivity by 20.9% and quality by 30.9%

All of these findings should have a profound effect on the way that we build our teams. Rather than having large teams of mediocre developers, it makes much more sense to have smaller teams of highly productive developers that know how to plan and make good decisions.  The PSP and TSP do suggest that the best way to rehabilitate a poor developer is to teach them how to make better decisions.

Be aware that there is a difference between knowledge of technologies, which is gained over time, and the ability to be productive and write quality code.

Conclusion

We inherently know this; we just don’t do it. If the senior management of organizations only knew about these papers, we could make sure that productive people get paid what they are worth and that non-productive people seek employment in some other field. This would not only reduce the cost of building software but also increase the quality of the software that is produced.

Unfortunately, we are doomed to religious battles where people debate methodologies, languages, and technologies for the foreseeable future. The way that most organizations develop code makes voodoo look like a science!

Eventually we’ll put the ‘science’ back in Computer Science, I just don’t know if it will be in my lifetime…

Check out Stop It! No… Really stop it. to learn about 5 worst practices that need to be stopped right now to improve productivity and quality.

Bibliography

Boehm, Barry W., and Philip N. Papaccio. 1988. “Understanding and Controlling Software Costs.” IEEE Transactions on Software Engineering SE-14, no. 10 (October): 1462-77.

Boehm, Barry, et al, 2000. Software Cost Estimation with Cocomo II, Boston, Mass.: Addison Wesley, 2000.

Card, David N. 1987. “A Software Technology Evaluation Program.” Information and Software Technology 29, no. 6 (July/August): 291-300.

Curtis, Bill. 1981. “Substantiating Programmer Variability.” Proceedings of the IEEE 69, no. 7: 846.

Curtis, Bill, et al. 1986. “Software Psychology: The Need for an Interdisciplinary Program.” Proceedings of the IEEE 74, no. 8: 1092-1106.

DeMarco, Tom, and Timothy Lister. 1985. “Programmer Performance and the Effects of the Workplace.” Proceedings of the 8th International Conference on Software Engineering. Washington, D.C.: IEEE Computer Society Press, 268-72.

1Jones, Capers. 2008. Scoring and Evaluating Software Methods, Practices, and Results.

Mills, Harlan D. 1983. Software Productivity. Boston, Mass.: Little, Brown.

Valett, J., and F. E. McGarry. 1989. “A Summary of Software Measurement Experiences in the Software Engineering Laboratory.” Journal of Systems and Software 9, no. 2 (February): 137-48.


Understanding your chances of having a successful software project

We have been building software systems for over 50 years, and yet success rates remain extremely low (see Dan Galorath for some information). Different reports put the success rate at different levels, but successful projects rarely exceed 30-40%.

Report                   Year   Successful   Challenged   Failure
Standish Chaos Reports   2009   32%          44%          24%
Sauer & Cuthbertson      2003   16%          74%          10%
Tata Consultancy         2007   38%          62% (combined)

It might seem strange, but we don’t all have the same definition of success for software projects. For me, success is when a project delivers the expected benefits within 10% of cost and schedule. A project is NOT successful when:

  • It does not deliver what was promised
  • It has cost or time overruns of over 10%
  • It has its scope dramatically reduced so that victory can be claimed
  • It does not have a positive net present value, i.e. it never breaks even

Under those conditions, I’m guessing that there are even fewer successful projects out there. Let’s make software success extremely concrete. Imagine that you are on a street corner watching people cross the street to the other corner, and that out of every 10 people trying to cross, only 3 make it across successfully; the other 7 get maimed or killed.

How interested would you be in crossing the street?

Sun Tzu wrote:

War is of vital importance to the state; hence it is a subject of inquiry which can on no account be neglected.

In our modern world, software is of vital importance to your organization.   If you can solve your business issues by building software consistently and reliably you will gain a tremendous advantage over your competition.

One misconception is that software projects fail because “I am surrounded by idiots!” Just because we get frustrated at being unable to get software built does not make this statement true. In fact, the exact opposite is true: the average IQ of Computer System Analysts is 111.3[1], and any IQ above 110 is considered Superior Intelligence[2].

Don’t get me wrong; you might have a bunch of developers from the shallow end of the gene pool, but that does not explain how thousands of organizations fail to build quality software.  The point is that software does not fail because there are not enough smart people looking at the problem.

There are plenty of consultants, a.k.a. snake oil salesmen, who are willing to sell you “silver bullet” solutions that will solve your every problem. Have you ever seen any of these really work? Each of these solutions generally solves one aspect of your problem and leaves you with a larger one to fix later; they will also leave big holes in your budget. Unfortunately, we often succumb to “silver bullet” solutions because they tell us what we want to hear.

Really fixing your software development problems requires a better understanding of basic principles; after all, there are software projects that succeed out there. Not surprisingly, the companies that have figured out how to develop software consistently and reliably tend to have the fewest failures. Learning how to develop software consistently and reliably requires that you learn how the following 5 elements intersect and affect each other:

  1. Requirements
  2. Project Management
  3. Principles
  4. Developers
  5. Executives

Each of these elements intersects with all of the others. You will continually stumble through software development until you reach a minimum level of execution and synchronization between these 5 rings.

Organizations that do not understand these 5 rings will create organizational structures and processes that are doomed to fail. Poor organizational structure and processes create systemic problems that lead to:

  • Constant fire fighting (blog)
  • Inflexible software (blog)
  • Poor architecture (blog)

Problems from poor organizational structure and processes will lead to failed software projects because of the SYSTEM, not the PEOPLE.  However, people always assume that someone is to blame; they rarely look for problems inside the system.  This will lead to severe morale problems and the loss of competent personnel.

Fixing your software development is a matter of understanding the principles of good organizational and process design. Once you understand how to balance the 5 elements, you will begin to experience success in building software.

 Appendix: Modern IQ Ranges for Various Occupations

According to modern IQ ranges, computer system analysts have one of the highest intelligence quotients of all professions.


[1] Average IQ by occupation (estimated from wordsum scores), January 22, 2011.  Available from http://anepigone.blogspot.ca/2011/01/average-iq-by-occupation.html

[2] What Different IQ Scores Mean, April 12, 2004.   Available from http://wilderdom.com/intelligence/IQWhatScoresMean.html
