
The Usefulness of Broken Glasses

Glasses are better than computers, because: they are lighter than laptops; they don't need batteries; and you can buy them in many exciting fashionable sizes, colours, and shapes. In the UK, thick plastic sidearms came into fashion two years ago. Now, by the same kind of stylistic extension that I imagine generated flares and drainpipe jeans, entire frames of thick plastic are "in" again, for the first time in a decade. But it's the computational properties I'm interested in today: graceful degradation, profligacy, and rolling downhill.

Graceful degradation is what a three-legged dog does. Its gait is not graceful; but still it walks. Not perfectly; but better than an air-traffic control program would fly planes if you deleted a quarter of its source code. Glasses degrade gracefully too, once you've found the fragments.

"Graceful degradation" is a phrase well-known to computing. But "profligate" and "rolling downhill" are my own coinages, because I couldn't think of existing adjectives. You needn't log in to a pair of glasses. There is no user database restricting access to those the administrator believes are truly entitled to that hard-won output from heroic algorithms achieved under Mission Impossible odds against almost insuperable time and memory pressure. Sit behind a glasses-wearer on the bus, and divert yourself by watching the informational overspill from their lenses. This may not be useful to your eyesight; but it is so cheap that all optical companies, even those who charge house-mortgage prices for their precision SLR cameras, just give it away. The computations are profligate.

By "rolling downhill", I mean going with the flow of physics, processing what the computational substrate naturally processes. Lenses just "do it", in the way that the program you were sweating over since 7:30 this morning, and all weekend because of that deadline so you missed your son's school sports day, does not.

Graceful degradation makes me think of a concept that topologists call "continuity". It was probably developed — I don't know the history in detail — from attempts to define precisely what it means for numerical functions to have breaks or jumps. The intuition everyone learns first is that a function is continuous if you can draw its graph without lifting your pencil from the paper. Nowadays, topology has several sophisticated and closely-related definitions of continuity. One consequence of these is that if f is a continuous function, the image of every connected set under f is connected. The gist of this is that if you apply a continuous function to an unbroken stretch of values, it won't introduce breaks into the result.
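For anyone who wants the statement the textbooks actually make, here it is in standard notation: the open-set definition, plus the connectedness fact that follows from it.

    f \colon X \to Y \text{ is continuous} \iff f^{-1}(V) \text{ is open in } X \text{ for every open } V \subseteq Y;

    f \text{ continuous and } C \subseteq X \text{ connected} \;\Longrightarrow\; f(C) \text{ connected}.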

I believe we deadline-fighting programmers need a language that is a continuous function from textual representation to behaviour.
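To pin that slogan down a little (this is my own gloss, not anything official: run, d_edit, d_B and K are all names I'm inventing here), read the language as a map run from program texts to behaviours, measure distance between texts by edit distance d_edit, and suppose we have some distance d_B between behaviours. Since edit distance only takes whole-number values, literal epsilon-delta continuity comes for free; what I really want is the Lipschitz-flavoured promise that small edits can only buy proportionally small changes in behaviour:

    d_B\bigl(\mathrm{run}(p),\ \mathrm{run}(q)\bigr) \;\le\; K \cdot d_{\mathrm{edit}}(p, q) \qquad \text{for all programs } p, q \text{ and some modest constant } K.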

Let me explain what continuity would mean. I'll use my three-legged dog but make him a centipede because dogs have too few legs. And I'll call my language SMOOTHOL. (It's late and I can't think of a better name.)

So imagine a chunk of perfect bug-free SMOOTHOL source code. Think of this as a perfect hundred-legged centipede. Think of your source code with one typo as a centipede with one leg lost. Think of your source code with two typos as a centipede with two legs lost. Each centipede, note, is pretty similar in structure to its neighbour with one leg more and its other neighbour with one leg less.

Now run each centipede — er, program — through your SMOOTHOL compiler. Observe their outputs. Being a continuous function, SMOOTHOL will not introduce jumps or breaks. So the hundred-legged centipede will behave pretty much like its neighbour with one leg lost, which will behave pretty much like its neighbour with two legs lost.
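SMOOTHOL doesn't exist yet, so here is a rough thought-experiment harness in Python instead. Everything in it (the tiny area program, the run helper, the one-character-deletion mutants) is my own invention, purely for illustration: it deletes one character at a time from a small program, the way the centipede loses one leg at a time, and counts how many mutants still behave like the original.

    # Thought-experiment harness: Python standing in for SMOOTHOL, which does
    # not exist. The program, the helper and the mutants are illustrative only.

    ORIGINAL = """
    def area(width, height):
        return width * height

    result = area(3, 4)
    """

    def run(source):
        """Execute a program fragment; return its 'result', or None if it breaks."""
        namespace = {}
        try:
            exec(source, namespace)
            return namespace.get("result")
        except Exception:
            return None

    baseline = run(ORIGINAL)                        # the hundred-legged centipede
    survivors = 0
    for i in range(len(ORIGINAL)):
        mutant = ORIGINAL[:i] + ORIGINAL[i + 1:]    # centipede minus one leg
        if run(mutant) == baseline:
            survivors += 1

    print(f"{survivors} of {len(ORIGINAL)} one-character mutants "
          f"still behave like the original")

In Python the surviving fraction is tiny, because most one-character deletions are syntax errors; the whole point of SMOOTHOL would be to keep that fraction close to one.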

Now extend the idea to hardware. Then I could put an axe through my laptop, and both halves would continue to do something fairly useful.

So tell me, how do we invent SMOOTHOL?