Evolutionary Software Development


In this short paper I am proposing that the software development methods and systems in place today are not only ineffective but can never be effective.

The premises underlying modern systems development and software development in particular are not so much broken as inappropriate to the task.

Instead I propose that we consider software to be like a living organism and that its growth, development, and success will be the result of an evolutionary process that recognises useful and useless aspects and builds on them to eventually make the system required by the users and patrons.

One of the important aspects of this is not that the final system reflects some predetermined design, but rather that it meets the needs of the sponsors in both predicted and unpredicted ways.

This is the new definition of correct software, and it changes the concept of completed software: it recognises that there is no such thing, but rather that the current version is part of a continuum of design and implementation, moving at all times in and around the target.

Earlier work on this was done by Eric Raymond in his now seminal work ‘The Cathedral and the Bazaar’.


Somewhere in the mythology of Unix (and therefore Linux) is the question: why did Ken Thompson write Unix? According to the mythology (and as far as I know it is correct) Ken wanted to keep playing a space game, but the machine it ran on was to be decommissioned and he needed to get an operating system working on the new machine (a PDP-7). Between Dennis Ritchie writing C to provide the portable programming language and Ken writing the operating system, we have the beginnings of the backbone software of the entire internet and most large corporate systems of today. Were those Dennis’s and Ken’s goals? Improving on Multics was a major driver, but I like the idea that Ken just wanted to keep playing computer games. Did AT&T have a plan to dominate an as yet non-existent technology? No. Of course not. And yet here we are today: the whole planet totally dependent on the evolution of an idea that made a computer game available to a geek working deep in the bowels of the largest telephone company in the world.

After 40 years writing software, much of it complex, it is time to reflect on why so many of my software projects can be considered successful and why in general the accepted failure rate for software projects is 80%+.

This essay is a reflection on how successful software is written and what happens in the industry that leads to so much failure.

The really interesting thing is that very large pieces of software can and should be regarded as successful. These successful works are things like Windows, Unix/Linux, compilers, database managers, social networks, word processors, the large body of open source software and many proprietary ERP and HR systems. Also successful are ‘apps’ and many disruptive programs.

The failures seem to be almost exclusively the domain of corporate and government projects. Large specialised systems designed to meet an important need. In this context even small projects can ‘fail’.

A lot of time and thought has gone into discovering the reasons for failure. This is a doomed effort. The reasons for failure are many. There seems however to be one overriding reason for success and that should be explored.


Software systems are arguably the most complex things human beings have built. Even small systems have a complexity that makes the great architectural and engineering works we see around us seem simple by comparison.

We know this because one of the gifts of the study of software systems is the concept of function points; counting them tells us how complex something is.

When we calculate complexity we don’t count components, we count functions. A large building has many components but it has relatively few functions. Modern ships and airplanes have many, many components, but again not many function points. When you deliver buildings, bridges, planes, cars, ships, you simply can’t allow them to be too complex or they won’t be delivered.

Software on the other hand is, by its very nature, complex. A large program can easily comprise millions of function points, and a large system can comprise thousands of such large programs.

The most successful programmers must therefore be comfortable working in a complex environment. This is not to everyone’s taste, and it’s not something that most people just ‘do’. Human beings who can work in complex environments have almost always grown into it, in the same way that people get better and better at the skills their industry demands, from music to brain surgery, waiting tables to driving trains. There might be some training to start, but over time experience grows the skills and abilities of the person. Working in complex environments is the same: over time a person learns to cope with the increasing complexity of the things they manage. In software it is often because we add complexity and grow with it.

Overview of Unibase

Unibase was my attempt to build a piece of software that could help me deal with the complexity of building software. Yes, I meant what I said there.

Most of the software I build is for business, and it didn’t take long to realise the truth of an old saying: a COBOL programmer writes the same program 15 times, and that’s their career.

The truth is that commercial software is variations on a few themes. Today we see that in the browser application architecture MVC – Model, View, Controller. [Reference required]

Unibase is an early attempt at such an architecture and today continues to provide that framework with one addition – a data dictionary.

Unibase consists of a few core programs – a report writer, a transactional screen program, data capture, and a forms frontend to scripts – plus a small number of supporting JavaScript functions. For scripts it is linked to Tcl as the script language of choice. Python has been considered, and can be used, but Tcl provides, in this author’s opinion, the most complete solution for complex scripts. [References on comparisons required]

The unique feature of Unibase is the inclusion of a data dictionary. This provides a central repository for knowledge about the tables used in an application, their relationships, and the calculations used in the application. Importantly, the calculations can involve all tables in the application, and the built-in summing function eliminates looping code. There are some other minor innovations to do with arrays, repetitive text, and so on.
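Unibase’s dictionary syntax isn’t reproduced in this paper; the following is only a rough Python sketch of the idea – the tables, attributes, and the ‘invoice_total’ calculation are all invented for illustration:

```python
# A toy "data dictionary": one central place that knows the tables,
# their relationships, and derived calculations spanning tables.
# All table and attribute names here are invented examples.

invoices = [
    {"id": 1, "customer": "acme"},
    {"id": 2, "customer": "acme"},
]
invoice_lines = [
    {"invoice": 1, "qty": 2, "price": 10.0},
    {"invoice": 1, "qty": 1, "price": 5.0},
    {"invoice": 2, "qty": 3, "price": 4.0},
]

dictionary = {
    # A cross-table calculation with summing built in: the relationship
    # invoice_lines.invoice -> invoices.id lives here, once, rather than
    # in looping code repeated throughout the application.
    "invoice_total": lambda inv: sum(
        line["qty"] * line["price"]
        for line in invoice_lines
        if line["invoice"] == inv["id"]
    ),
}

# Application code asks the dictionary; nobody writes the loop again.
print(dictionary["invoice_total"](invoices[0]))  # 25.0
```

The point of the sketch is only the shape of the thing: the relationship and the calculation are defined once, centrally, and every program consults them.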

The Unibase programs all support an option that lets you ask Unibase how it is going to access the data in the various tables.

The inclusion of the data dictionary as a core part of the system enables any team member to find out how something is done without the need for extensive supporting documentation. Designers can plan changes to database tables knowing that the dictionary change is the programming.

A report is simply a series of text formats that refer to the database using tables and attributes that can be looked up in the dictionary and from there Unibase can derive an execution plan.

Reports, like other outputs, are broken into components that match the layout of a report: record layouts, headers, footers, sub-totals, and so on. Each is specified in a simple declarative syntax which, when combined with the dictionary, becomes an action plan. The plan is not concerned with what will be done with the outputs: it can be used to write a web page, a Tcl script, XML, PostScript, Python, PHP – in fact any text you like.
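As an illustrative sketch only – the spec format below is invented, not Unibase’s actual declarative syntax – a report can be a handful of text formats naming attributes, with the engine deciding how to fill them in:

```python
# A miniature declarative report: header, record layout and footer are
# plain text formats that refer to attributes; the engine fills them in.
# The spec format and data are invented for illustration.

spec = {
    "header": "Sales report",
    "record": "{product:<10} {qty:>5}",
    "footer": "Total lines: {count}",
}

rows = [
    {"product": "widget", "qty": 3},
    {"product": "gadget", "qty": 7},
]

def run_report(spec, rows):
    # Walk the data once, expanding each declarative format as we go.
    lines = [spec["header"]]
    for row in rows:
        lines.append(spec["record"].format(**row))
    lines.append(spec["footer"].format(count=len(rows)))
    return "\n".join(lines)

print(run_report(spec, rows))
```

Because the formats are just text, the same engine could equally emit HTML, XML, or script source – which is the property the paragraph above describes.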

Unibase works on tables stored in the current working directory, making it automatically a multi-tenanted system (long before that became a buzzword).
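A minimal sketch of the working-directory idea – the directory layout, file names, and the read_table helper are invented here, not Unibase’s actual mechanism:

```python
# Multi-tenancy by working directory: each tenant's tables live in the
# directory the programs run from, so the same code serves every tenant.
# Directory and file names are invented for illustration.
import os
import tempfile

def read_table(name: str) -> str:
    # Resolve the table relative to the current working directory,
    # so "which tenant" is decided purely by where we are.
    with open(os.path.join(os.getcwd(), name)) as f:
        return f.read()

base = tempfile.mkdtemp()
for tenant, motto in [("acme", "roadrunner parts"), ("globex", "world domination")]:
    os.makedirs(os.path.join(base, tenant))
    with open(os.path.join(base, tenant, "company.txt"), "w") as f:
        f.write(motto)

# Changing directory changes tenant; the program itself never changes.
os.chdir(os.path.join(base, "acme"))
print(read_table("company.txt"))  # roadrunner parts
```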

A set of programs and a built-in session system mean that direct outside access to an application’s code and programs is not possible. This is the byproduct of a carefully designed HTTP access structure.

More details, manuals, and examples at http://unibase.zenucom.com.

Unibase and Determining Correctness

One of the challenges I faced early on in the development of Unibase was the problem of how to specify that data is wrong. Today you see this concept in ‘form validation’.

Forms themselves have been with us at least as long as that wonderful British invention – bureaucracy. One of the great delights of the typical bureaucrat is tormenting the humble petitioner over failures in filling out the form, rather than understanding why the petitioner subjects themselves to the humiliation of the form in the first place.

How do we know an entry in a form field is wrong? We can do it two ways. We can list all the wrong answers and check if the entry matches; or we can assert the things that must be true. The latter is always a relatively short list, while the former is long and always incomplete.

Example: to be completed
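A minimal sketch of the assertion approach – the field, its rules, and the validate_age function are invented examples, not Unibase syntax:

```python
# Validate a form field by asserting what must be true, rather than
# enumerating every possible wrong answer. The field and its rules
# are hypothetical examples.

def validate_age(value: str) -> list[str]:
    """Return the list of violated assertions; empty means acceptable."""
    errors = []
    if not value.strip().isdigit():
        errors.append("age must be a whole number")
    elif not (0 < int(value) < 130):
        errors.append("age must be between 1 and 129")
    return errors

# A blacklist ("-1 is wrong, 'abc' is wrong, ...") can never be
# complete; the two assertions above cover every case.
print(validate_age("42"))   # []
print(validate_age("abc"))  # ['age must be a whole number']
```

Two short assertions replace an unbounded list of wrong answers – the same move Epicurus makes below.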

Epicurus in defining pleasure didn’t give a long list of pleasurable things. He instead defined it as the absence of pain and fear.

How would you define pleasure? A fine meal; extraordinary sex; fast cars; walking in the national park; dinner with friends; and so on. Worse, the list is different for everyone.

Epicurus may or may not have found the perfect definition of pleasure, but by defining it as he did, any activity can be compared and if it involves the absence of pain and fear then it can be deemed pleasurable.


The Bureaucratic Problem and Project Management

Bureaucrats and middle management have a big problem. They are usually commissioned to obtain results when they don’t know how to achieve the results and they don’t understand how someone else will achieve the results.

To cope they build systems and processes designed to ‘manage’ and in particular to control expenses while still having the appearance of reaching goals.

That this works at all is amazing. That it fails in software development is to be expected.

Project Management has an underlying assumption that all the steps required to do something can be first identified and then accurately planned and costed.

In the complex world of software development this is simply not true. Ask any practitioner to write down in advance all the things that need to be done to write a piece of software and then compare them later with what was actually done. Any correlation is mostly accidental.

But the managers persist.

At one time we debated the merits of top down design vs bottom up design. Most ended up with a compromise – sort of top down broad brush, bottom up details, and a shifting of both ends as the problem and its difficulty became better understood.

Then came the Waterfall model, which is basically a top down approach that has been shown to be inadequate but persists because supposedly it can be costed and measured. Costing and measuring something that doesn’t work just means you have spent a lot of time getting answers that are wrong.

Agile development came along as a new silver bullet. Big goals are replaced by a bottom up approach of smaller, easier to achieve goals. This has been better but at the expense of accurate costing. A bit like the weather report, the next few days can be planned but next week is a mystery.

Today Agile is under fire as management realise that they are losing control of projects and budgets. With little or nothing to measure development against, they worry about unknown and unmanageable costs coupled with uncertain outcomes.

To some extent this is justified because Agile is the bastard child of top down control and creative enterprise. It was always going to have a difficult life.


The Myth of Software Development Methodology

The persistence of software project failure should be taken as an indicator that software development methodology itself is the problem.

The premise of software development methodology is that if we can just get the process right then we can get any piece of software developed on time and budget.

For this to be true it would mean we have mastered complexity. It would mean we can know the unknown before we know we don’t know it. It would mean that clients know exactly what they want and they can communicate it in unambiguous mathematical terms.

None of these things can be or are true.

It is time to admit that no effective software development methodology, let alone a good one, exists. This in turn means the Master’s degree in IT that you worked hard for is not only meaningless but a disadvantage, as it sets you pursuing the wrong way to get a result.

The Evolutionary Principle


The Evolutionary Principle applied to software is very simple.

A small piece of software is found to do something useful. A user thinks well that was good, but what if it could do this as well? A developer thinks about that and says yes, I can get it to do that. After many iterations a large, complex, functional, and competent system emerges.

After millions of years a light-sensitive cell progresses to become a myriad of complex eye structures. This is how complex systems come about.

Software versioning is a testimony to the approach. Windows 10 – why not just Windows? There are many answers put forward: the technology didn’t exist when Microsoft shipped MS-DOS; graphics had to be built; and the list goes on. The truth is that a mix of ideas, technology, customer demands, programmer abilities, etc. grew over time, allowing Windows 10 to exist.

Other Evolutionary Activities

Evolutionary development isn’t a new concept. Science in general should be considered as the evolution of ideas.

Very little, if anything, in physics, maths, medicine, biology, etc stands alone. Even the revered “Relativity” stands on the shoulders of many ideas and problems that came before.

In computer science we have a major dilemma. The worst problem of all is the P vs NP question: do there exist problems whose answers can be checked in polynomial time but that cannot be solved in polynomial time? (Think real time.) The trouble is that we don’t even know how hard the question itself is, and that must be determined before we can hope to settle the underlying problem.

And so we are left with some uncomfortable and difficult problems and decisions.

To show just how bad this is, consider the ‘Intelligent Design’ people’s favourite example: the eye. How could that possibly come about from evolution? Deeper problem: why design an eye in the first place? Forget the complexity of the organ – why should it exist?

So now we are left with the existential problem of software development. Why should that function/program exist?

It exists because it is shown to be useful to someone or something. I design it, not because it is a good idea, but because it meets a need.

Evolutionary Practice

This is all well and good. We know that project planning and management in software is predominantly a very bad guess about what we want to do and how to do it.

When we build a large software system, even if we are inspired by a need, the important thing is that we don’t work at that need directly. It will no doubt arise, but it will arise from our play and experimentation with lots of things, many of which will be failures or diversions, but the net result of which will be a working system.

If the system has evolved then it will almost certainly exceed the original specifications in ways that make it even more suitable for use.

How do we do this?

Well it’s rather simple really.

  • Don’t have rigid goals. They stifle creativity by prohibiting play and rejecting irrelevant (at the time) innovation.
  • Encourage play. Play and experimentation are the core ways to find out new and possibly helpful things. One of the great oxymorons of today (at least in Australia) is government mandated innovation. You can’t require that people or businesses be innovative but you can allow them to play which will likely result in innovation.
  • Sack the bureaucrats. IT managers and project managers exist to control the programmers and designers – the opposite of what you want. Instead assign developers to users or user groups that need things. As part of a department with shared goals, these developers will help advance it through digital assistance.
  • Dispense with budgets. Time and monetary budgets squash innovation and evolution. That doesn’t mean they shouldn’t be monitored, and the players themselves reviewed and called on to explain, but starting with a budget immediately changes the focus from producing useful things to not exceeding budgets – two unrelated objectives.
  • Change your view. The organisation and the technology supporting it is actually a living thing. It consumes energy to reduce entropy (maintain order). Technology is an important part of how the organisation lives and grows; get over it. It is not a cost centre. It is as vital to the life of the organisation as cells to an organism.

How will we get the projects completed?

This question is itself a big part of the problem. You want a better organisation and have decided (top down) that doing X will achieve that. What will actually make a better organisation is for the members of the organisation to be enabled and assisted to do a better job. There are no projects, no budgets, no timelines – just people getting better at their jobs; an organisation, as a result, delivering better products and/or services; a complex entity growing and refining itself and its mission as it goes (well, actually it’s just doing things and discovering they are useful along the way).


All of this started in the late 1970s when a shoe retailer, Andrew Herzfeld, came to a lecture I was giving on programming, where I proclaimed with the bravado and confidence of youth that C would become the most important programming language. I had my reasons, and to date they have been borne out.

Andrew had a pile of retail performance reports under his arm, and the organiser of the event had a small computer he was trying to sell: a Processor Technology microcomputer with a Z80, 64K of RAM, and 1.75MB on four 8.5-inch floppy disks. In the pre-personal-computer days that was a major piece of personal technology. Spreadsheets were not generally available.

The project to build a sophisticated retail management system on minimal hardware had begun.

Does this work?

detailed examples for REMAS and shoeman

The ultimate question. The answer could be 42. Why 42? Doesn’t matter that it’s 42, just that there is an answer. The question, by the way, doesn’t matter either.

After 40 years writing large, complex systems this is my conclusion. Not only does it work, but it works better than anything else.

How can one unknown guy sitting in an unknown office produce several systems that are way ahead of the best efforts of billion dollar behemoths?

It works. You just have to ask the customers.

Unibase is a big part of the success. Itself the result of evolution, like Windows, Word, Excel, Linux, and Apache it has benefited from development that has not been limited by commercial direction. The same goes for your car, plane, building, paint, phone, etc. Unibase is a piece of software that supports evolutionary development of larger systems. It grows in ability according to the needs of the larger systems.

shoeman is a system built for one customer that today includes a comprehensive feature set that is, as far as I know, unmatched.

REMAS and FORUM are the gold standard in retail management and have been extensively copied.

Denari is point-of-sale software that is complete and adaptable. Many POS systems have some of our attributes, but not all of them.

You don’t have to use Unibase (or any of our products), but they are proof of how powerful the model is: a platform that grows with the user can build an unexpectedly complex and competent system that could not be created by trying to second-guess the functionality and purpose of the system before it is built.