Deep Operator Networks can accelerate simulation solvers for materials processing!

We develop a framework that integrates a convolutional autoencoder architecture with a deep operator network (DeepONet) to learn the dynamic evolution of a two-phase mixture and accelerate time-to-solution for predicting mesoscale microstructure evolution in materials.

The phase-field method has emerged as a powerful, heuristic tool for modeling and predicting mesoscale microstructural evolution in a wide variety of material processes. The Cahn–Hilliard equation, a nonlinear diffusion equation, is one of the most commonly used governing equations in phase-field models. It describes phase separation, the process by which a two-phase mixture spontaneously separates and forms domains pure in each component. The Cahn–Hilliard equation finds applications in diverse fields ranging from complex fluids to soft matter, and it serves as the starting point of many phase-field models of microstructure evolution.
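For reference, the Cahn–Hilliard equation in its standard form reads (the notation here may differ slightly from the paper's):

```latex
\frac{\partial c}{\partial t} = \nabla \cdot \left( M \, \nabla \mu \right),
\qquad
\mu = \frac{\partial f(c)}{\partial c} - \kappa \nabla^{2} c,
```

where c(x, t) is the concentration field, M the mobility, f the bulk free-energy density, and κ the gradient-energy coefficient.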

Although the phase-field method accurately models mesoscale morphological and microstructural evolution in materials, it is computationally expensive. The presence of numerous time-evolving, high-gradient regions in the concentration field that defines the microstructure makes it difficult for conventional neural-network-based architectures to directly learn the dynamics of the underlying system in the primitive space. To overcome this difficulty, we demonstrate the effectiveness of learning the dynamics in the latent space of an autoencoder.

Our model consists of an encoder, a decoder, and a Deep Operator Network (DeepONet). As the first step, we train the autoencoder (encoder and decoder) to learn a nonlinear transformation to a low-dimensional latent space. This can be interpreted as a dimensionality-reduction step. Next, we train the DeepONet to learn the dynamics in the latent space previously learned by the encoder. The predictions made by the DeepONet in the latent space are transformed back using the pre-trained decoder to obtain the predicted microstructure at the query time.
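To make the pipeline concrete, here is a minimal PyTorch sketch of the three components. The layer sizes, latent dimension, and grid resolution below are illustrative assumptions, not the configurations reported in the paper.

```python
# Minimal sketch of the autoencoder-DeepONet pipeline.
# All layer sizes and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

LATENT_DIM = 64  # assumed latent dimension

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 128 -> 64
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 64 -> 32
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, LATENT_DIM),
        )
    def forward(self, c):            # c: (batch, 1, 128, 128) concentration field
        return self.net(c)           # z: (batch, LATENT_DIM)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(LATENT_DIM, 32 * 32 * 32)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 64
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),              # 64 -> 128
        )
    def forward(self, z):
        h = self.fc(z).view(-1, 32, 32, 32)
        return self.net(h)           # reconstructed field: (batch, 1, 128, 128)

class LatentDeepONet(nn.Module):
    """Branch net encodes the initial latent state; trunk net encodes query time t."""
    def __init__(self, p=128):
        super().__init__()
        self.branch = nn.Sequential(nn.Linear(LATENT_DIM, 128), nn.Tanh(),
                                    nn.Linear(128, LATENT_DIM * p))
        self.trunk = nn.Sequential(nn.Linear(1, 128), nn.Tanh(),
                                   nn.Linear(128, p))
        self.p = p
    def forward(self, z0, t):        # z0: (batch, LATENT_DIM), t: (batch, 1)
        b = self.branch(z0).view(-1, LATENT_DIM, self.p)
        tr = self.trunk(t)                          # (batch, p)
        return torch.einsum('blp,bp->bl', b, tr)   # latent state at time t
```

At inference time, the initial microstructure is encoded once, the DeepONet advances its latent code to the query time t, and the decoder reconstructs the predicted field: `c_t = decoder(deeponet(encoder(c0), t))`.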

We compare the predictions of the surrogate autoencoder-DeepONet model with solutions generated by the high-fidelity numerical model in Figure 2. The predicted results align well with the true microstructures. Increasing the weight given to the earlier time steps, where the microstructure evolves rapidly, enabled us to endow the DeepONet with an inductive bias to learn the smaller features and fast dynamics accurately.
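One simple way to realize such a weighting is an exponentially decaying factor on the per-sample loss. The schedule and time scale below are hypothetical illustrations, not the exact weighting used in the paper.

```python
import torch

def weighted_latent_loss(z_pred, z_true, t, tau=0.25):
    """MSE in latent space, up-weighting early times where dynamics are fast.
    The exponential schedule and time scale tau are illustrative assumptions."""
    w = 1.0 + torch.exp(-t / tau).squeeze(-1)    # larger weight for small t
    err = ((z_pred - z_true) ** 2).mean(dim=1)   # per-sample latent MSE
    return (w * err).mean()
```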

Our proposed framework can also be used for extrapolation tasks: it can be integrated with the phase-field numerical solver to accelerate predictions for initial microstructures and parameters that lie outside the training distributions. To demonstrate this point, we devised a hybrid approach that integrates the autoencoder-DeepONet framework with our high-fidelity phase-field solver, the Mesoscale Multiphysics Phase Field Simulator (MEMPHIS). This hybrid model unites the efficiency and computational speed of the autoencoder-DeepONet framework with the accuracy of high-fidelity phase-field numerical solvers. The hybrid 'leap in time' strategy demonstrated in this study achieves a speed-up of 29% compared to the numerical solver alone.
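Schematically, the hybrid strategy alternates surrogate 'leaps' with short bursts of high-fidelity relaxation. The sketch below reuses the modules from the earlier snippet; `solver_step`, the leap size, and the number of relaxation steps are assumed placeholders standing in for the actual MEMPHIS interface.

```python
import torch

def hybrid_rollout(c0, t_final, leap, encoder, deeponet, decoder, solver_step,
                   n_relax=10):
    """Hypothetical 'leap in time' loop.
    c0: initial concentration field, shape (batch, 1, H, W)
    leap: time advanced per surrogate call (assumed constant)
    solver_step: one step of the high-fidelity phase-field solver (assumed API)
    n_relax: number of solver steps used to relax each surrogate prediction."""
    c, t = c0, 0.0
    while t < t_final:
        z = deeponet(encoder(c), torch.full((c.shape[0], 1), leap))
        c = decoder(z)                  # surrogate leap across 'leap' time units
        for _ in range(n_relax):        # high-fidelity correction / relaxation
            c = solver_step(c)
        t += leap
    return c
```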

In this work, we developed and applied a machine-learned framework based on neural operators and autoencoder architectures to efficiently and rapidly predict complex microstructural evolution. The architecture is not only computationally efficient and accurate but also robust to noisy data. The demonstrated performance makes it an attractive alternative to existing machine-learned strategies for accelerating predictions of microstructure evolution. It opens up a computationally viable and efficient path forward for discovering, understanding, and predicting materials processes where evolutionary mesoscale phenomena are critical, such as in materials optimization and design problems.

The paper is now published in:

Oommen, V., Shukla, K., Goswami, S. et al. Learning two-phase microstructure evolution using neural operators and autoencoder architectures. npj Comput Mater 8, 190 (2022). https://doi.org/10.1038/s41524-022-00876-7
