The difficulties of grasping complexity
Gerald Weinberg has suggested in his writings that we face a natural law he calls the Square Law of Computation: "Human brain capacity is more or less fixed, but software complexity grows at least as fast as the square of the size of the program."
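One common intuition for this quadratic growth (my own illustration, not Weinberg's derivation) is that every pair of components in a program can potentially interact, and the number of pairs grows as the square of the number of components:

```python
def pairwise_interactions(n: int) -> int:
    """Number of potential pairwise interactions among n components: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(n, pairwise_interactions(n))
# A program 10x bigger has roughly 100x more potential interactions:
# 10 -> 45, 100 -> 4950, 1000 -> 499500
```

So even if each component stays simple, the web of possible interactions between them is what outgrows our fixed brain capacity.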
The first response of companies is often to try to hire the smartest, most talented people they can... and I won't claim that is a mistake, but the magic soon evaporates once ambition rises again. The law is the law!
They thus strive to find innovative ways to overcome this complexity.
All our efforts to perfectly understand the present, or to guess and control the future, are doomed, but it is not forbidden to build approximate models... As Weinberg (again) states, "we'll never have complete control, but neither are we victims".
We can easily distinguish two families of effort:
- The scientific approaches try to build a complete theory describing a coherent system.
Some examples that come to mind:
- MBTI for human psychology
- Newtonian gravity, then Einsteinian relativity, for astrophysics
- Taylorism for the car industry
- The waterfall model for software engineering
- Modularity and low coupling for software design (to divide and conquer)
- XP principles and practices as an approach to software projects
- Lean Thinking for continuous improvement in any construction process
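To illustrate the "divide and conquer" bullet above with a toy model of my own (an assumption for illustration, not a claim from any of these methodologies): splitting n freely coupled components into m modules that only interact through their interfaces replaces one big web of interactions with several much smaller ones.

```python
def interactions(n: int) -> int:
    # Potential pairwise interactions among n freely coupled components.
    return n * (n - 1) // 2

def modular_interactions(n: int, m: int) -> int:
    # n components split evenly into m low-coupled modules:
    # intra-module pairs plus pairs between the m module interfaces.
    per_module = n // m
    return m * interactions(per_module) + interactions(m)

n = 120
print(interactions(n))              # 7140 pairs when fully coupled
print(modular_interactions(n, 10))  # 10*66 + 45 = 705 pairs when modular
```

Under this (admittedly idealized) model, modularity does not repeal the Square Law, but it applies it to much smaller numbers: ten modules of twelve components expose an order of magnitude fewer potential interactions than one tangle of a hundred and twenty.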
The much younger software industry, somewhat confusedly, tries to imitate its elder siblings, with mixed success: the same recipes disappoint, because the properties and constraints of software development are not quite the same! Still, maybe we are not so far from reaching our next plateau, thanks to a subtle combination of XP and Lean, the latter allowing us to scientifically measure, control, and improve the former. Unfortunately, I do not have enough hindsight and experience to go further, but I hope to be able to experiment in my current project.
Special thanks to Regis for our very interesting discussion on the topic, which inspired this post.