If the U.S. wants to worry about China's tech rise, it should look to supercomputing first

By ARCHITECHT • Issue #165 • View online
I’m not hugely concerned about the China vs. U.S. artificial intelligence race, as I’ve explained before.  To recap, that’s because (1) so much of early AI usage is in the consumer space, (2) so much AI work is done in the open (even if it’s not always open source), and (3) the United States is still home to Google, OpenAI, Microsoft, Facebook and lots of other leading AI research institutions. Yeah, the U.S. government could do a better job mapping out its AI future, like China has done, but at the moment the state of AI in the U.S. looks pretty good.
Such is not the case in the world of supercomputing, a field once dominated by the United States. For the last several years, Chinese systems have topped the list, and this year Chinese systems claim the top two slots. The highest-ranked U.S. system, Oak Ridge National Lab’s Titan supercomputer, comes in at No. 5. (An optimist might point out that the United States holds spots 5 through 8 – giving it four top-10 systems compared with China’s two – but Chinese systems account for 35.4 percent of the list’s total performance compared with 29.6 percent for U.S. systems.)
While there’s some debate about the benchmark historically used to rank these systems – and, indeed, U.S. systems do look better on newer and arguably more applicable benchmarks – large supercomputers like these typically run inside national laboratories (managed by the Dept. of Energy in the United States) and handle some very important workloads: physics, climate change, national security, you name it. It turns out they can also be pretty good for training deep learning models – both in the lab and at large companies like Baidu.
Even if the United States does retake the throne next year, which it promises to do with an IBM system called Summit at ORNL, China expects to beat the world by developing the first-ever exascale system by 2020. 
Nobody thinks that having the fastest supercomputer makes you the world’s biggest superpower, but it can be a pretty big deal symbolically. It suggests a country that takes science, and computer science, very seriously. And, practically speaking, bigger systems can handle bigger, more complex applications.
It’s possible all of this international one-upmanship is a waste of time and money, but if we’re going to concern ourselves with it, we probably don’t want to overlook current challenges because we’re too distracted by newer, shinier ones.

Sponsor: Cloud Native Computing Foundation
New podcast episodes every week!

ARCHITECHT delivers the most interesting news and information about the business impacts of cloud computing, artificial intelligence, and other trends reshaping enterprise IT. Curated by Derrick Harris.

Check out the Architecht site at https://architecht.io
