Some might think that's a rather harsh way to look at things, but when I read the title of this HPCwire article, I couldn't help but think of the dozens of claims other big names in the tech industry have made over the years that have been proven wrong time and time again. One of the most famous blunders of this kind has to be the quote attributed to Bill Gates that "640K ought to be enough for anybody." The legitimacy of this quote is still up for debate, but the idea still stands: big tech people love to make sensationalist claims about the long-term future of technology.
Now before anyone gets the idea that I'm trying to discredit Sterling here, I will point out he has some important qualifiers in this statement. The quote itself is really:
These words may be thrown back in my face, but I think we will never reach zettaflops, at least not by doing discrete floating point operations. We are reaching the anvil of the technology S-curve and will be approaching an asymptote of single program performance due to a combination of factors including atomic granularity at nanoscale.

I'm glad he added his reasoning here, because I do agree with his stance that doing things exactly the same way as we do them today will not get us to zettascale. In fact, the big irons as we see them today may not even be enough to get us into exascale on their own -- that's something we'll have to watch in the decade to come.
Sterling goes on to say:
Of course I anticipate something else will be devised that is beyond my imagination, perhaps something akin to quantum computing, metaphoric computing, or biological computing. But whatever it is, it won’t be what we’ve been doing for the last seven decades. That is another unique aspect of the exascale milestone and activity. For a number, I’m guessing about 64 exaflops to be the limit, depending on the amount of pain we are prepared to tolerate.

This is exactly why I think it's quite a claim to say we won't reach zettascale -- we don't know what's coming! Every day, there are advancements in alternate computing methods such as quantum computing and biological computing, and there's most likely other research going on that isn't being talked about yet simply because there's nothing worth reporting so far. Ten years ago, 3TB hard drives seemed like an impossibility because we hadn't yet seen perpendicular recording used in hard drives.
Is reaching zettascale or exascale going to be easy? Not at all. There are a ton of hurdles we're going to have to overcome in order to achieve these feats. As Sterling points out, even if we can achieve the hardware advancements we need, we'll also need far more advanced software to take advantage of these future beasts. That being said, unless there's a time machine I haven't heard about, none of us knows what's to come in the next few decades. For all I know, there could be an advancement in using cats that will launch us into the exascale and beyond!