News Release

Neural nets used to rethink material design

Rice lab's new strategy puts evolution of microscopic structures on fast track

Peer-Reviewed Publication

Rice University

Image: Engineers at Rice University and Lawrence Livermore National Laboratory are using neural networks to accelerate the prediction of how microstructures of materials evolve. This example predicts snowflake-like dendritic crystal growth.

Credit: Mesoscale Materials Science Group/Rice University

HOUSTON - (April 30, 2021) - The microscopic structures and properties of materials are intimately linked, and customizing them is a challenge. Rice University engineers are determined to simplify the process through machine learning.

To that end, the Rice lab of materials scientist Ming Tang, in collaboration with physicist Fei Zhou at Lawrence Livermore National Laboratory, introduced a technique to predict the evolution of microstructures -- structural features between 10 nanometers and 100 microns -- in materials.

Their open-access paper in the Cell Press journal Patterns shows how neural networks (computer models that mimic the brain's neurons) can train themselves to predict how a structure will grow under a certain environment, much like a snowflake forms from moisture in nature.

In fact, snowflake-like, dendritic crystal structures were one of the examples the lab used in its proof-of-concept study.

"In modern material science, it's widely accepted that the microstructure often plays a critical role in controlling a material's properties," Tang said. "You not only want to control how the atoms are arranged on lattices, but also what the microstructure looks like, to give you good performance and even new functionality.

"The holy grail of designing materials is to be able to predict how a microstructure will change under given conditions, whether we heat it up or apply stress or some other type of stimulation," he said.

Tang has worked to refine microstructure prediction for his entire career, but said the traditional equation-based approach makes it difficult for scientists to keep up with the demand for new materials.

"The tremendous progress in machine learning encouraged Fei at Lawrence Livermore and us to see if we could apply it to materials," he said.

Fortunately, there was plenty of data from the traditional method to help train the team's neural networks, which view the early evolution of microstructures to predict the next step, and the next one, and so on.

"This is what machinery is good at, seeing the correlation in a very complex way that the human mind is not able to," Tang said. "We take advantage of that."

The researchers tested their neural networks on four distinct types of microstructure evolution: plane-wave propagation, grain growth, spinodal decomposition and dendritic crystal growth.

In each test, the networks were fed between 1,000 and 2,000 sets of 20 successive images illustrating a material's microstructure evolution as predicted by the equations. After learning the evolution rules from these data, the network was then given from 1 to 10 images to predict the next 50 to 200 frames, and usually did so in seconds.
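As a rough illustration of that training setup (again hypothetical; the `make_training_pairs` helper, the frame size and the history length are assumptions, not taken from the paper), each 20-frame simulated sequence can be sliced into supervised input/target pairs, and a short seed can later be extended with a rollout routine like the one sketched above.

```python
# Hypothetical sketch of organizing the training data described above:
# each simulated 20-frame sequence is cut into overlapping
# (history, next-frame) pairs for supervised training.

import torch


def make_training_pairs(sequence: torch.Tensor, history: int = 4):
    """Slice one simulated sequence of shape (T, H, W) into (input, target) pairs."""
    inputs, targets = [], []
    for t in range(history, sequence.shape[0]):
        inputs.append(sequence[t - history:t])        # the previous `history` frames
        targets.append(sequence[t:t + 1])             # the frame to be predicted
    return torch.stack(inputs), torch.stack(targets)  # (N, history, H, W), (N, 1, H, W)


# Illustration only: stand-ins for the ~1,000-2,000 simulated sequences of
# 20 frames mentioned in the release, here filled with random numbers.
sequences = [torch.rand(20, 64, 64) for _ in range(1000)]
x, y = make_training_pairs(sequences[0])
print(x.shape, y.shape)  # torch.Size([16, 4, 64, 64]) torch.Size([16, 1, 64, 64])
```

At prediction time, a seed of 1 to 10 frames (shaped like a single entry of `x`) would be handed to the rollout step to produce the 50 to 200 future frames.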

The new technique's advantages quickly became clear: Running on graphics processors, the neural networks sped up the grain-growth computations by a factor of up to 718 compared with the previous algorithm. Even when run on a standard central processor, they were still up to 87 times faster than the old method. Predictions of the other types of microstructure evolution showed similar, though less dramatic, speed increases.

Comparisons with images from the traditional simulation method proved the predictions were largely on the mark, Tang said. "Based on that, we see how we can update the parameters to make the prediction more and more accurate," he said. "Then we can use these predictions to help design materials we have not seen before.

"Another benefit is that it's able to make predictions even when we do not know everything about the material properties in a system," Tang said. "We couldn't do that with the equation-based method, which needs to know all the parameter values in the equations to perform simulations."

Tang said the computational efficiency of neural networks could accelerate the development of novel materials. He expects that will be helpful in his lab's ongoing design of more efficient batteries. "We're thinking about novel three-dimensional structures that will help charge and discharge batteries much faster than what we have now," Tang said. "This is an optimization problem that is perfect for our new approach."

###

Rice graduate student Kaiqi Yang is lead author of the paper. Co-authors are Rice alumnus Yifan Cao and graduate students Youtian Zhang and Shaoxun Fan; and researchers Daniel Aberg and Babak Sadigh of Lawrence Livermore. Zhou is a physicist at Lawrence Livermore. Tang is an assistant professor of materials science and nanoengineering at Rice.

The Department of Energy, the National Science Foundation and the American Chemical Society Petroleum Research Fund supported the research.

Read the paper at https://www.cell.com/patterns/fulltext/S2666-3899(21)00063-5.

This news release can be found online at https://news.rice.edu/2021/04/30/neural-nets-used-to-rethink-material-design/.

Follow Rice News and Media Relations via Twitter @RiceUNews.

Related materials:

Mesoscale Materials Science Group: http://tanggroup.rice.edu/research/

Department of Materials Science and NanoEngineering: https://msne.rice.edu

George R. Brown School of Engineering: https://engineering.rice.edu

Video:

https://youtu.be/nWXuAb_JJ0Y

Image for download:

https://news-network.rice.edu/news/files/2021/04/0503_MICROSTRUCTURE-1-WEB.jpg

Engineers at Rice University and Lawrence Livermore National Laboratory are using neural networks to accelerate the prediction of how microstructures of materials evolve. This example predicts snowflake-like dendritic crystal growth. (Credit: Mesoscale Materials Science Group/Rice University)

Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation's top 20 universities by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences and is home to the Baker Institute for Public Policy. With 3,978 undergraduates and 3,192 graduate students, Rice's undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction and No. 1 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger's Personal Finance.

Jeff Falk
713-348-6775
jfalk@rice.edu

Mike Williams
713-348-6728
mikewilliams@rice.edu

