High Performance Computing

Many New Colours of HPC Clouds

This is a guest blog post by Peter Coffee. Peter is VP for Strategic Research at Salesforce. He has been with the company for nine years, following nineteen years as a columnist and editor with the industry publications PC Tech Journal, PC Week and eWEEK. He works with IT managers and application developers to build a global community on the Salesforce1 cloud platform, combining the Force.com, Heroku and ExactTarget Fuel service portfolios. Peter holds an engineering degree from MIT and an MBA from Pepperdine University. He is the author of two books, How To Program Java and Peter Coffee Teaches PCs. He is a winner of the Neal Award for excellence in business journalism and the McGan Silver Antenna Award for service to amateur radio.

Follow Peter on Twitter or read more of his publications here.


As we look toward the next generations of high-performance computing, we will do well to recognise and challenge anything that resembles incremental thinking. Many names have been attached to the observation that the light bulb wasn’t invented by continuously improving the candle.

The high-performance cloud, in particular, must be recognised and developed as more than a relocation and scale-up of the way that things are currently done. Far better to imagine the cloud as a place where unconventional and rapidly evolving models of computing can be offered, tested, and improved in rapid cycles, with heterogeneous toolkits and high levels of interoperability, rather than merely as massive minimal-cost capacity for plain-vanilla processing.

Incremental progress is neither trivial nor automatic, and talented people will always find employment in moving things to the next level along a path that's easily seen. What passes for prediction, though, may sometimes be just the solution of an equation for a future value of time (Moore's Law, for example, is as much a business plan as a technical forecast), when what is needed is the readiness to imagine an equation with substantially different variables and constraints.
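To make that concrete, here is a minimal sketch of what "solving an equation for a future value of time" looks like in practice; the two-year doubling period and the starting transistor count are illustrative assumptions, not figures from this post.

```python
# "Prediction" as merely solving an equation for a future value of time:
# the classic Moore's Law extrapolation. The doubling period and starting
# transistor count below are illustrative assumptions, not data from the post.
def transistors_after(years: float, now: float = 5e10, doubling_years: float = 2.0) -> float:
    """Extrapolate a transistor count 'years' into the future."""
    return now * 2 ** (years / doubling_years)

for horizon in (2, 10, 20):
    print(f"{horizon:>2} years out: {transistors_after(horizon):.1e} transistors")
```

The exercise is trivial by design: plugging a future time into a fixed curve is bookkeeping, not imagination, which is exactly the point being made above.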

On the hardware side, for example, FPGAs (soft hardware) were barely emerging as a credible option eleven years ago (when I wrote about an FPGA specification-generating language just about to make its debut). Today, Intel and others are driving FPGA technology into a variety of accelerators for some of the most compute-intensive aspects of the things that we increasingly want to do.

Image: Altera SoC FPGA graphic

People with bare-metal, low-level machine-code expertise may find it counterintuitive (almost heretical) to speak of softer hardware as a path to higher performance, but that's nothing compared to the heresy of intentionally making machines less accurate. We have to think hard, though, about making good use of approximate-computing alternatives that might accelerate image processing a hundredfold while cutting its power consumption by a factor of fifty.

You would not propose approximation architectures as the general-case successor to COBOL; most of us want our paychecks accurate to the penny, even if it makes the hardware work harder. But a good toolbox includes rasps as well as razor blades.
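As a toy illustration of that trade, the sketch below applies the "rasp versus razor blade" choice to a simple image blur. It is a software sketch only, assuming NumPy; the hundredfold speed and fifty-fold power gains cited above come from approximate hardware, not from changing dtypes in Python.

```python
# A toy illustration of trading accuracy for efficiency, in the spirit of the
# approximate-computing argument above. Software sketch only (assumes NumPy).
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(0)
image = rng.random((256, 256))            # stand-in for a grayscale image in [0, 1]

def box_blur(img, k=5):
    """Naive k-by-k box blur; edges are simply cropped to keep the sketch short."""
    kernel = np.ones((k, k), dtype=img.dtype) / (k * k)
    windows = sliding_window_view(img, (k, k))
    return (windows * kernel).sum(axis=(-1, -2))

exact = box_blur(image)                          # "paycheck" precision: float64
approx = box_blur(image.astype(np.float16))      # "rasp" precision: float16

# The low-precision result is four times smaller to store and move, and its
# error (printed below) is small compared with the [0, 1] signal range.
err = np.abs(exact - approx.astype(np.float64)).max()
print(f"max absolute error: {err:.2e}")
```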

A final example of something that the cloud may hugely accelerate is our understanding of what we can actually do with quantum computing: working with values that are not merely inaccurate but actually ambiguous. IBM's Quantum Experience is among the present opportunities to explore this model, now more than theoretical, without needing a megacorp's budget just to get access to an experimental machine.
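For readers who want to try the "ambiguous values" idea immediately, here is a minimal sketch; it assumes the open-source Qiskit toolkit with its local Aer simulator (whose package layout has shifted between versions), rather than IBM's hosted hardware, which requires an account and a cloud backend.

```python
# A minimal sketch of "ambiguous" values: one qubit in superposition.
# Assumes the qiskit and qiskit-aer packages; running on real IBM hardware
# (the descendant of the Quantum Experience mentioned above) needs an account
# and a cloud backend instead of the local simulator used here.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1, 1)
qc.h(0)             # Hadamard gate: the qubit is now neither 0 nor 1, but both
qc.measure(0, 0)    # measurement forces a definite answer

counts = AerSimulator().run(qc, shots=1024).result().get_counts()
print(counts)       # roughly {'0': ~512, '1': ~512}
```

Each shot collapses the ambiguity into a definite 0 or 1, about half the time each, which is the simplest possible taste of the model the post is describing.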

Technology forecasts of this kind are much too important to be left to the world of non-fiction. People who know what’s possible today, well enough to improve upon it (or even just to describe it with rigor and depth), may struggle to overcome their knowledge of what’s impossible.

Counterexamples include fiction writers (often with no technical credentials) whose forecasts have held up well, or have even become the vocabularies and templates of actual inventions. Star Trek has inspired any number of engineers to make real its visions of what technology might someday enable: Martin Cooper's conception of the cellphone was driven by Star Trek's communicators, with the first flip-phone, the StarTAC, deriving its design as well as its name from that 1960s TV series.

Some writers are definite in their suspicion that more knowledge might have limited their vision. In 1962, Vernor Vinge wrote a visionary story ("Bookworm, Run") on the subject of intelligence amplification. "Perhaps it's fortunate," he later wrote, "that at the time I had no technical knowledge of computers. I might have become discouraged, ended up writing really hard-core science fiction about punch cards and batch processing." We see Vinge's concern made real in the pilot episode of Lost in Space, aired in 1965, in which the 1997 control room for an interstellar flight is filled with refrigerator-sized mechanical-switch cabinets and desk-filling blinking-light displays.

In the world that we call, today, the cloud, it's generally agreed that the word "cyberspace" is a coinage of fiction writer William Gibson, appearing in his 1982 short story Burning Chrome and described there as "a consensual hallucination… a graphic representation of data abstracted from the banks of every computer in the human system." (The word appeared earlier, in the late 1960s, as a description of adaptive artwork and architecture, but Gibson's usage is by far the most common today.)

Note that Gibson puts the "cyber" word in the context not of what the machinery does, but of the reason why people want it and the experience that they seek from it: as he put it, a "consensual hallucination" that helps people make sense of otherwise overwhelming quantities and velocities of data. What's happening today at Salesforce, to mention one example, is very much in the vein of putting enormous computational power into a purely background role as a facilitator of experience. This is not the old HPC of building tools that merely enable the human expert to carry out more calculations: rather, it's about elevating the performance of many more people to levels of comprehensive knowledge and value-adding insight not previously achieved.

Gibson was not merely imagining what could be built, but was rather envisioning what people would find useful: we’ll do well to favour that point of view in the real world, where we’ll try to live up to the most compelling fiction, as we collaborate to co-create the real-world future of HPC.
