
Energy Dept. shoots for exascale computer in a national lab by 2021

Thursday, August 31, 2017
Posted by: Lisa Arafune
Interview (Exascale discussion starts at 7:30):

https://1yxsm73j7aop3quc9y5ifaw3-wpengine.netdna-ssl.com/wp-content/uploads/2017/08/AoM-8-18.mp3?_=1



By David Thornton | August 21, 2017 3:33 pm

The Energy Department is taking another step toward the next generation of computing: exascale computers. In June, it awarded contracts to six companies for research and development into overcoming obstacles that so far have prevented exascale computers from being built.

“The goal of the Path Forward program is for the six companies who got contracts to develop technologies that are likely to lead to exascale computers that will be installed at DoE national laboratories as well as other sites,” said Paul Messina, senior strategic adviser at Argonne National Laboratory and director of the Exascale Computing Project.

Exascale computing refers to systems able to make a billion billion (one quintillion, or 10^18) calculations per second, roughly 50 times faster than the most powerful computers today. That is estimated to be roughly equal to the human brain’s neural processing power.
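The “50 times faster” figure can be sanity-checked with quick arithmetic (a sketch; the 20-petaflop baseline is an assumption, roughly the peak of the fastest U.S. system at the time, not a number given in the article):

```python
# One exaflop: a quintillion (10^18) calculations per second.
exaflop = 1e18
# Assumed baseline: a ~20-petaflop system, roughly the fastest U.S. machine in 2017.
baseline_petaflops = 20e15

print(exaflop / baseline_petaflops)  # 50.0
```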

DoE isn’t expecting an actual system to come out of this right away; the goal is to have one installed at a national laboratory by 2021.

What the department is looking for are answers to some of the biggest problems standing in the way, such as power consumption. Using today’s technology, an exascale system would require half a gigawatt to operate, enough to power a small city. Messina said such a machine could cost around $500 million per year to power.

Another obstacle is the increase in parallelism. Today’s most powerful computers have roughly 1 million to 2 million hardware components working at the same time.

“Multiply that by 1,000, almost,” Messina told the Federal Drive with Tom Temin. “So billion-way parallelism. How do you tackle that? With better design of the pieces of the system, better ways to connect them, because you’re talking about lots of things that are talking to each other. And we have to worry about how to program a system with a billion-way parallelism.”
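Messina’s arithmetic in the quote above can be checked directly (a trivial sketch; the 1 million figure is the low end of the range the article gives):

```python
# Hardware components working simultaneously in today's largest systems (low end).
current_parallelism = 1_000_000
# "Multiply that by 1,000, almost" -- the jump Messina describes.
scale_factor = 1_000

print(current_parallelism * scale_factor)  # one billion: billion-way parallelism
```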

Finally, Messina said that memory technology hasn’t advanced as quickly as computing technology, so with a system of that scale, it would be difficult to retrieve data quickly enough without improved memory.

Each of the companies is working independently and would get to keep the intellectual property developed from the research as part of a deal where the companies foot 40 percent of the bill.

The research is aimed at two areas: applications development and software development.

Messina said applications development is aimed at mission-critical issues like development of better materials for better, more efficient batteries. It’s complex and involves heavy lifting on issues like physics, chemistry, engineering and sometimes even machine learning. The Energy Department is currently funding 25 of these projects.

Meanwhile, there’s software. Messina said vendors mostly handle things like the operating system, but there are also projects covering programming languages, mathematics libraries, tools for measuring performance and finding bugs, and tools for visualizing the results, which is often the only way to determine what you’ve learned.

The overall system, when complete, will comprise a few hundred cabinets roughly the size of a refrigerator, Messina said. After the Energy Department installs the first at a national laboratory, it hopes to propagate them through other labs, agencies and industry.


Founded in 1989, the Coalition for Academic Scientific Computation (CASC) is an educational nonprofit 501(c)(3) organization with 86 member institutions representing many of the nation’s most forward thinking universities and computing centers.