Supercomputing: Then and now

Supercomputers first made their appearance in the 1960s, and companies like CDC and Cray Research dominated the sector for almost thirty years.

These units went from relatively weak single-chip architectures to today’s massively parallel processing units, which score well into the petaflops. Flops stands for floating-point operations per second, a standard measure of performance in high-performance computing. Peta is derived from the Greek word ‘pente’ (five) and denotes a factor of 10^15 (one thousand million million).

Peta isn’t something I’ve ever used to describe anything in my life, and I have an incredibly difficult time making that number mean anything besides huge.

I’ll try to put it into context. Today’s fastest supercomputer, China’s Sunway TaihuLight, runs with a LINPACK benchmark of 93 petaflops. My desktop runs an Intel i7-6700T @ 2.80GHz and, according to QwikMark, manages 75 gigaflops. Giga means billion, a factor of one million smaller than peta. Do the math and the TaihuLight computes roughly 1.2 million times faster than my desktop.
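To make that back-of-the-envelope comparison concrete, here is a minimal sketch in plain Python using only the figures quoted above (the constants and variable names are mine, purely for illustration):

```
# Rough comparison of TaihuLight's LINPACK score to my desktop,
# using the figures quoted in the article.
PETA = 10**15  # peta = one thousand million million
GIGA = 10**9   # giga = one billion

taihulight_flops = 93 * PETA  # Sunway TaihuLight, LINPACK benchmark
desktop_flops = 75 * GIGA     # Intel i7-6700T, per QwikMark

ratio = taihulight_flops / desktop_flops
print(f"TaihuLight is about {ratio:,.0f} times faster")  # ~1,240,000
```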

This colossal crunching power makes supercomputing perfect for studying quantum mechanics, climate and weather prediction, oil and gas exploration, protein folding, molecular modeling, cryptanalysis, and physical modeling for things such as astrodynamics, aerodynamics, fluid dynamics and nuclear fusion.

Supercomputing was a scientific curiosity in the 70s. Even in the 90s, it was a poorly understood national badge of honor and the sole property of an esoteric academia, but as today’s information age came to fruition, supercomputers moved from oddity to omnipresence.

According to Wikipedia, there are approximately 500 supercomputers in operation today, using 23 million processor cores for a combined peak performance of 453 petaflops. That’s a crapload of computing power, but why would anyone, besides someone in a lab coat, care?
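For a rough sense of what those aggregate numbers mean per chip, here is the same kind of napkin math in Python (the figures are the ones cited above; the calculation itself is just illustrative):

```
# Spread the combined peak performance of the ~500 machines across
# their total core count to get an average per-core throughput.
PETA = 10**15
GIGA = 10**9

total_flops = 453 * PETA   # combined peak of today's supercomputers
total_cores = 23 * 10**6   # roughly 23 million processor cores

per_core_gflops = total_flops / total_cores / GIGA
print(f"Average of about {per_core_gflops:.0f} gigaflops per core")  # ~20
```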

Supercomputers have become integral to national security as well as research. Because of this, the United States is contending with some disturbing recent developments.

China’s ascendancy to the top of high-performance computing (HPC) in 2013 didn’t bode well for Americans, but since the Chinese were doing it with Intel hardware, it didn’t matter as much. The TaihuLight, however, is a completely different kettle of fish, using only Chinese-produced CPUs and hardware.

When you consider that we are predicted to emulate the human brain by 2025 and, by 2030, to have zettaflop machines capable of accurately predicting the weather two weeks ahead, the division of intellectual power becomes paramount.

So, it is of little surprise that the US has things like the National Strategic Computing Initiative and the Department of Energy’s exascale program. Then, in the EU, there are programs like Horizon 2020, which has a very specific long-term agenda driving toward the convergence of deep learning and supercomputing.

The game is on, most clearly illustrated by yesterday’s announcement that the Department of Energy had opened up its purse and dropped $258.0 million on six companies – AMD, Cray, HPE, IBM, Intel and Nvidia – to build America’s first exascale supercomputer.

China intends to have the first exascale supercomputer up and running in 2020, well before anyone else, and the newly minted technological giant wants to keep that superiority, as evidenced by its recent announcement that it will invest $161.0 billion in its semiconductor industry to start competing globally.

What does this mean for the future of supercomputing?

Short term, it’s hard to tell. Right now, it seems the U.S. is in a state of political flux. The current administration may decide at three in the morning over a fried bologna sandwich that supercomputing is covfefe and shouldn’t be funded to the extent it is.

If saner heads prevail, we could see supercomputing make a move toward the cloud while processor vendors, contending with a slowed Moore’s Law, develop more robust cooling features and power delivery methodologies to meet the ever-increasing performance demands made by users.

Does that mean true AI is just around the corner?

We’re years away from a singularity, but deep learning systems are making incredible advances. Take the report the Facebook Artificial Intelligence Research Lab released today, which stated that a bot-to-bot negotiation between chat-bots it had created "led to divergence from human language as the agents developed their own language for negotiating."

This lingual autonomy is as frightening as it is exciting.

So, what about quantum computing? Where does it fit into all this? Will it become the next “supercomputer?”

Yes and no.

While Burnaby-based D-Wave is pushing the technological envelope with the world’s first line of commercially available quantum computers, there are very few on the market (three, I think) and the systems require a whole new understanding to operate, an understanding that doesn’t currently exist in the mainstream.

However, as improvements are made to the physical architecture and a new generation of quantum programmers is trained to properly code, query and interpret quantum machines, a universe of possibilities will open to us.

That said, the answer will probably lie in hybridization: a triumvirate of neuromorphic, quantum and HPC systems may very well make up the next generation of "supercomputers."

Whatever they end up as, supercomputers are the cornerstone of the advancement of our 21st-century digital world as they protect, predict and model our future.

 

–Gaalen Engen

http://twitter.com/gaalenengen
