“Layers are Leaking”

Andrew Richards (@codeandrew), the CEO of Codeplay, posted an article, “Layers are Leaking”, which inspired some thoughts – tangential and perpendicular – beyond what I could reasonably fit into a stream of tweets (not that that stopped me trying).

From the article:

If you look at any computer system architecture diagram (and there are many) you will almost always see a series of “layers” on top of each other. Each layer is based on a layer underneath. It is a very common pattern in computers, and I believe that it is fundamental to the success of the development of computer technology. What layers allow you to do, is let individual people concentrate on only one part of a problem.

This brought to mind the first chapter of Bell & Newell’s Computer Structures book from 1971, which illustrates this kind of layering idea. It also reminded me of Jeannette Wing’s excellent Computational Thinking article, which (amongst other things) makes the case that this type of thinking, associated with computer science, is worth teaching and applying more widely.

But, recently, the layers have started leaking.

I don’t really agree that this is a recent thing (layers leak, abstractions change – some leaks are just more obvious than others), but the article refers to the significant changes that have come about in the eternal quest for faster computation, and the difficulties involved in replacing established abstractions. Renegotiating the interface between two layers is difficult when the people responsible for each layer don’t speak the same technical language. Worse still is trying to change the interface to one layer without adequately considering the consequences for the other layers in the system. I think time-consuming discussion and debate are inevitable, and they are not inherently a reason to avoid changing an otherwise broken abstraction.

A number of the problems (or – dare I cliché it – opportunities) that are appearing with regard to increasingly parallel computation are not new. Consider Peter Denning’s recent article. I find it interesting that a number of the suggested solutions involve changes to deeply ingrained abstractions (like programming languages) for which there would be (and is) significant resistance to change.

I read an interview with Fred Brooks recently, and the following quote stood out:

Software is not the exception; hardware is the exception. No technology in history has had the kind of rapid cost/performance gains that computer hardware has enjoyed. Progress in software is more like progress in automobiles or airplanes: We see steady gains, but they’re incremental.

This echoes something that Andrew said in another article, “Why Intel Larrabee Really Stumbled” (Google cache is here):

Infinite Truth: Software takes longer to develop than Hardware

A lot of time has been spent on steady, incremental gains for software that attempts to work around the limitations of many common layer abstractions – examples that spring to mind include compiling general-purpose programs for small register sets, converting scalar programs to utilise vector instructions, and trying to extract parallelism from inherently single-threaded code. We can’t necessarily redesign our systems to make these problems go away, but I think we can strive to address them in the best way possible – which will likely involve reshuffling some layers in the process. It’s going to be messy, time-consuming, and require a lot of hard work.
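To make the scalar-to-vector example concrete, here is a minimal sketch in C (the function names are mine, purely for illustration). The scalar loop is trivial to write; the hand-vectorised SSE equivalent only works under preconditions – aligned pointers, a length that is a multiple of four – that the scalar version never imposed, and that a compiler can’t always prove hold.

    #include <xmmintrin.h>  /* SSE intrinsics */

    /* Scalar version: one float addition per iteration. */
    void add_scalar(float *dst, const float *a, const float *b, int n)
    {
        for (int i = 0; i < n; i++)
            dst[i] = a[i] + b[i];
    }

    /* Hand-vectorised version: four additions per iteration in a
     * 128-bit SSE register. Note the preconditions the scalar code
     * never had: n must be a multiple of 4, and all three pointers
     * must be 16-byte aligned for the aligned load/store to be safe. */
    void add_vector(float *dst, const float *a, const float *b, int n)
    {
        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(a + i);
            __m128 vb = _mm_load_ps(b + i);
            _mm_store_ps(dst + i, _mm_add_ps(va, vb));
        }
    }

Multiply those preconditions across every loop in a real codebase – plus the fallback paths for when they don’t hold – and it is easy to see why the gains are incremental.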

Sounds like fun :)

My thanks to Andrew for his thought-provoking article.

Additional:
Stephen Hill (@self_shadow) tweeted links to the Wikipedia article on leaky abstractions, and a related article by Joel Spolsky.
