From chicken farms to server farms: Matt Lucas FBCS looks at what the poultry industry can teach us about efficient computing.
Let's talk about chickens.
Did you know that we end up consuming only around 81% of the chicken that arrives in our stores? Around 15% is wasted in the kitchen, and a further 4% is wasted in the shop.
From an environmental point of view, not only does this lead to problems of disposal, it also means that we're breeding more chickens than would otherwise be necessary, which imposes its own costs on the planet.
In order to be green, we need to make more effective use of the resources we have, and that means minimising wastage like this.
The wastage problem
Now let's move back up the supply chain and look at chicken production: here virtually no part of a chicken goes unused. After accounting for the cuts we commonly buy, the remaining edible parts are exported, turned into stock or converted into pet food. Non-edible parts such as feathers are used in the production of materials such as paper and plastics. Even the faeces (sorry) are used as an organic fertiliser.
The point is, businesses hate waste. Every resource that cannot be used or sold harms profit; scaled over the kind of volumes that producers often deal with, the effects can be huge - both in terms of a company's bottom line and the environment.
Wastage is also relevant when talking about computing resources. Think about the energy wasted keeping your laptop or phone idle. Much of the time, these devices are doing nothing useful at all, or they're busy optimising, phoning home or downloading updates - services which may ultimately be useful, but which impose a constant drain on system resources for what should be relatively infrequent operations.
Again, scale out this problem to the size of a typical business. Fifteen years ago, it was not uncommon for a new IT project to come with its own hardware budget, which usually meant wheeling in new servers. Machines were sized for maximum potential capacity, plus some leeway for growth. All this meant that, historically, IT hardware ended up being around 20% utilised, thereby wasting a huge amount of resource.
Then, a couple of things happened…
Innovations
Vendors such as Amazon recognised the idle capacity of the computers in their own data centres and spied a business opportunity to sell it on, thus giving rise to the concept of cloud computing. Simultaneously, virtualisation technology grew in popularity, allowing multiple logical servers to share the same physical hardware.
These innovations, while successful at driving up average processor utilisation, led to new inefficiencies. Sharing computing resources by virtualising workloads requires partitioning, so that applications cannot interfere with each other. In practice, this meant each virtual machine carried its own copy of the operating system and key services, duplicating them many times over on the same hardware and increasing the overall resource requirements.
To solve this problem, an increasing number of businesses are now using containers, in which individual applications run in their own sandbox, but share access to an operating system as well as common services, such as logging and monitoring. Containers are portable and can be easily scaled, which allows for workloads to be directed to the places in which they can make the most efficient use of available resources.
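To make this concrete, here is a minimal sketch of how an application might be containerised with Docker. The app.py script and the myapp image name are hypothetical stand-ins for a real application:

    # Dockerfile: package a small Python application and its runtime into a single image
    FROM python:3.12-slim     # a slim base image keeps the container lean
    WORKDIR /app              # working directory inside the container
    COPY app.py .             # copy the (hypothetical) application script into the image
    CMD ["python", "app.py"]  # the command the container runs when it starts

It would then be built and run with:

    docker build -t myapp .   # build the image from the Dockerfile
    docker run --rm myapp     # run it in its own sandbox, sharing the host's kernel

Because each container shares the host's operating system kernel rather than booting one of its own, the same hardware can pack in far more of these sandboxes than it could virtual machines.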
So are containers the future of driving efficiency in business computing? Maybe, but we're not there yet.
Personal and environmental incentives
Building an efficient containerised application takes careful thought. While much of the traditional software on which businesses rely has been containerised, simply putting a monolithic application into a container is unlikely to be effective; a badly containerised application can easily dwarf the system requirements of the equivalent traditional application. Tools such as IBM's Mono2Micro can help with this process, but it's early days for sure.
We also need to address the consumer side. The use of more online services for day-to-day tasks means that raw computing requirements are moving out of the home and into more efficiently utilised data centres, but this comes at the cost of network bandwidth and of the myriad routers and gateways that also need power and processors.
Efficiency in computing, as with any finite resource, comes from aligning personal incentives with those of the environment. Make it beneficial for people to minimise waste - just as waste hurts a business's bottom line - and they will work to those incentives. This is not a problem we can solve purely with technology - it will also require political and social will.
As for the chickens - they will play a key part in making computer processors work more efficiently. How will they do that? Overclucking.