Overall, programming languages have been relatively stable for several decades. Almost all modern languages are derived originally from Algol and, more directly, from C.
While there continues to be development of existing languages such as C++ and Java, and of new languages such as Python, Ruby and Groovy, these are recognisable as incremental improvements on an existing paradigm rather than new paradigms, and therefore exploit widely available programming skills. Notable exceptions to this lineage are COBOL and FORTRAN, which are firmly established in particular industries and are also stable, provided that skills are maintained.
Similarly, programming tools such as compilers, interpreters and debuggers have improved over many years. The introduction of integrated development environments (IDEs) just over a decade ago provided a significant increase in programming productivity, and these environments continue to improve year on year.
No other technologies are in sight that might offer significant productivity increases and, therefore, current attention is focused on ‘agile’ development methodologies, which seek to offer shortened development cycles and increased confidence in the outcomes of development projects.
For the most part, these methods are based on iterative development techniques in which a subset of functionality can be demonstrated early in a development project, reviewed against user needs, and then enhanced or refined as the project progresses. The success of these techniques rests primarily on refining the specification rather than the development process itself. In other words, they answer the question ‘am I developing the right thing?’ rather than ‘am I developing the thing right?’
In the light of the potential risks of project failure or error-prone operations, some respected authorities have suggested that all development projects should commence with a formal specification of requirements, expressed in a mathematically precise notation.
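To give a flavour of what such a notation involves, consider a hypothetical requirement of my own devising (not drawn from any particular project), such as ‘a withdrawal must never leave an account overdrawn’. Stated precisely as a precondition and postcondition on an operation withdraw(account, amount), the precondition is that amount > 0 and balance(account) >= amount, and the postcondition is that the new balance equals the old balance minus amount. Every term is defined unambiguously, which is exactly the kind of precision that formal notations such as Z and VDM are designed to provide.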
Whilst this advice has been followed in a few safety-critical industries such as air traffic control, it is ignored in almost all other industries, for simple yet valid reasons: the requirements are not initially known in sufficient detail to write such a specification, and they change within the lifetime of the development project. The result is that any large, multi-year development project that does not include a process for refining and revising requirements during its course has a significant probability of failure. As noted above, agile development methods are the primary response to this challenge.
One particular way in which requirements are changing at present is the rapid evolution of the technological environment in which business applications are expected to operate. The evolution from operating within privately owned and controlled networks supporting fixed-function terminal devices to operating within the publicly accessible internet and supporting programmable end-user personal computers was already demanding.
The current evolution to support a wider range of end-user devices, including mobile devices, tablets, smart TVs and sensors, and the increasing likelihood that these devices are personally owned and/or managed, is much more demanding. There are as yet few stable standards for the silicon ‘systems on chips’, operating systems or programming environments used in these devices, and therefore projects that commit to building applications for such environments must carry significant risk.
By contrast, environments for the server-based segments of business applications are much more stable, with essentially no new server architectures, operating systems or application execution environments having emerged in the last decade, and many manufacturers are currently emphasising upward compatibility with future releases of hardware and software. Pressure for change arises from the rapid, cost-driven consolidation of data centre systems using virtualisation techniques and from the further potential for cost savings by outsourcing applications using cloud-based services.
Both of these approaches offer worthwhile gains with relatively low levels of change to existing applications. However, longer-term gains, including decreased operational costs, increased flexibility in provisioning services and improvements in the speed with which new applications can be introduced, depend on reducing complexity in the data centre environment.
This can only be achieved by taking a strategic view of the IT infrastructure deployed, including server architecture, storage, networks, operating systems and application execution environments, and may therefore require change to some application systems. Organisations that succeed in reducing complexity will benefit from much improved capability to respond to change in the business environment.
Grady Booch, who gave 2013’s BCS Lovelace Lecture, has in the past referred to object-oriented design and development methods for applications. He has been a proponent of these methods, and they can help to simplify some coding. However, a well-known index of programming languages (see link below) shows that C (which is not object oriented) is still the most popular language for new projects, so object-oriented development methods have not, as yet, been universally adopted.
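As a rough illustration of the kind of simplification object orientation can offer (a minimal sketch in Java, using invented class names rather than anything taken from Booch’s own work), polymorphism lets each object supply its own behaviour, so new cases can be added without editing a central switch statement:

    // Each shape reports its own area, so adding a new shape
    // requires no change to the code that uses shapes.
    interface Shape {
        double area();
    }

    class Circle implements Shape {
        private final double radius;
        Circle(double radius) { this.radius = radius; }
        public double area() { return Math.PI * radius * radius; }
    }

    class Rectangle implements Shape {
        private final double width;
        private final double height;
        Rectangle(double width, double height) { this.width = width; this.height = height; }
        public double area() { return width * height; }
    }

    public class TotalArea {
        public static void main(String[] args) {
            Shape[] shapes = { new Circle(2.0), new Rectangle(3.0, 4.0) };
            double total = 0.0;
            for (Shape s : shapes) {
                total += s.area(); // polymorphic call; no type switch needed
            }
            System.out.println("Total area: " + total);
        }
    }

The equivalent C-style approach would typically use a struct with a type tag and a switch on that tag in every routine that handles shapes, so adding a new shape means finding and updating every such switch.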
There is a common theme of simplification here, but I am suggesting the need to simplify at a coarser level of granularity: in terms of hardware/software platforms and so on, where many organisations have a multiplicity of heterogeneous bases on which they run different applications, mainly for historical reasons.
Choosing which of these to maintain and enhance is sometimes a difficult decision, as it involves discarding some and losing or replacing the applications that run on them, but it is necessary to avoid an eventual ‘hardening of the arteries’ in which any change at all becomes difficult.