AUTHOR=Yang Zhenze, Buehler Markus J. TITLE=Words to Matter: De novo Architected Materials Design Using Transformer Neural Networks JOURNAL=Frontiers in Materials VOLUME=8 YEAR=2021 URL=https://www.frontiersin.org/journals/materials/articles/10.3389/fmats.2021.740754 DOI=10.3389/fmats.2021.740754 ISSN=2296-8016 ABSTRACT=

Transformer neural networks have become widely used in a variety of AI applications, enabling significant advances in Natural Language Processing (NLP) and computer vision. Here we demonstrate the use of transformer neural networks in the de novo design of architected materials using a unique approach based on text input, which enables the design to be directed by descriptive text such as “a regular lattice of steel”. Because transformer neural networks can convert data between distinct modalities, including text into images, such methods have the potential to serve as a natural-language-driven tool for developing complex materials designs. In this study we use the Contrastive Language-Image Pre-Training (CLIP) and VQGAN neural networks in an iterative process to generate images that reflect text-prompt-driven materials designs. We then use the resulting images to generate three-dimensional models that can be realized using additive manufacturing, yielding physical samples of these text-based materials. We present several such word-to-matter examples and analyze the 3D-printed material specimens with accompanying finite element analysis, focusing in particular on mechanical properties, including mechanism design. Such language-based design approaches, part of an emerging field, can have a profound impact, including the use of transformer neural networks to generate machine code for 3D printing, the optimization of processing conditions, and other end-to-end design environments that intersect directly with human language.
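
The abstract describes an iterative CLIP/VQGAN loop in which a candidate image is repeatedly decoded and scored against a text prompt. A minimal sketch of how such a loop is commonly wired together is given below; it is illustrative only and is not the authors' implementation. The `load_vqgan` helper, checkpoint name, latent shape, learning rate, and step count are assumptions for the sketch.

```python
# Rough sketch of CLIP-guided VQGAN text-to-image generation, in the spirit of
# the approach described in the abstract. NOT the authors' implementation:
# `load_vqgan`, the checkpoint name, the latent shape, and all hyperparameters
# are illustrative assumptions.
import torch
import torch.nn.functional as F
import clip  # OpenAI CLIP package: https://github.com/openai/CLIP

device = "cuda" if torch.cuda.is_available() else "cpu"

# CLIP supplies a joint text/image embedding space used as the guidance signal.
clip_model, _ = clip.load("ViT-B/32", device=device)
clip_model = clip_model.float().eval()

tokens = clip.tokenize(["a regular lattice of steel"]).to(device)
with torch.no_grad():
    text_emb = F.normalize(clip_model.encode_text(tokens), dim=-1)

# Placeholder (assumed): a pre-trained VQGAN whose decoder maps a latent grid
# to an RGB image in [-1, 1], e.g. from the taming-transformers project.
vqgan = load_vqgan("vqgan_imagenet_f16_1024.ckpt").to(device).eval()

# Optimize a continuous latent so the decoded image matches the text prompt.
z = torch.randn(1, 256, 16, 16, device=device, requires_grad=True)  # assumed latent shape
opt = torch.optim.Adam([z], lr=0.1)

# CLIP's image-normalization constants.
mean = torch.tensor([0.48145466, 0.4578275, 0.40821073], device=device).view(1, 3, 1, 1)
std = torch.tensor([0.26862954, 0.26130258, 0.27577711], device=device).view(1, 3, 1, 1)

for step in range(300):
    img = (vqgan.decode(z) + 1) / 2                          # map to [0, 1]
    img = F.interpolate(img, size=224, mode="bilinear", align_corners=False)
    img_emb = F.normalize(clip_model.encode_image((img - mean) / std), dim=-1)
    loss = 1.0 - (img_emb * text_emb).sum()                  # cosine distance to the prompt
    opt.zero_grad()
    loss.backward()
    opt.step()

# The final decoded image can then be thresholded and extruded into a
# three-dimensional model for additive manufacturing, as the abstract describes.
```

The key design point the sketch illustrates is that gradients flow from the CLIP similarity loss back through the frozen VQGAN decoder into the latent, so only the latent is updated while both pre-trained networks stay fixed.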