AUTHOR=Li Yang, Jian Pengpeng, Han Guanghui TITLE=Cascaded Progressive Generative Adversarial Networks for Reconstructing Three-Dimensional Grayscale Core Images From a Single Two-Dimensional Image JOURNAL=Frontiers in Physics VOLUME=10 YEAR=2022 URL=https://www.frontiersin.org/journals/physics/articles/10.3389/fphy.2022.716708 DOI=10.3389/fphy.2022.716708 ISSN=2296-424X ABSTRACT=
It is very challenging to accurately understand and characterize the internal structure of three-dimensional (3D) rock masses using geological monitoring and conventional laboratory measurements. One important route to obtaining 3D core images is to reconstruct their 3D structure from two-dimensional (2D) core images. However, traditional 2D–3D reconstruction methods are mostly designed for binary core images rather than grayscale images. Furthermore, the reconstructed structure cannot reflect the gray level distribution of the core. Here, by combining the dimension promotion theory of super-dimension (SD) reconstruction with a deep learning framework, we propose a novel convolutional neural network framework, the cascaded progressive generative adversarial network (CPGAN), to reconstruct 3D grayscale core images. Within this network, we propose a loss function based on the gray level distribution and the pattern distribution to preserve the texture information of the reconstructed structure. Simultaneously, by adopting the SD dimension promotion theory, we set the input and output of every single node of the CPGAN to be deep gray-padding structures of equivalent size. By cascading these single-node networks, we ensure continuity and variability between the reconstruction layers. In addition, we use 3D convolution to capture the spatial characteristics of the core. The reconstructed 3D results show that the gray level information in the 2D image is accurately reflected in 3D space. The proposed method can help us to understand and analyze various parameter characteristics of cores.
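To make two of the abstract's ideas concrete, the sketch below illustrates (i) a single cascaded node built from 3D convolutions that maps a gray-padded 3D block to an output block of the same size, so nodes can be chained, and (ii) a differentiable gray-level-distribution loss that compares soft histograms of generated and reference volumes. This is a minimal illustration under assumed choices (PyTorch, layer widths, bin count, soft-histogram kernel); it is not the authors' implementation, and all names are hypothetical.

# Minimal sketch (assumed PyTorch): one cascaded node as a 3D-convolutional
# generator plus an illustrative gray-level-distribution loss. Names, layer
# sizes, and the number of gray bins are hypothetical, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SingleNodeGenerator(nn.Module):
    """One node of the cascade: maps a gray-padded 3D block of fixed size
    to a refined 3D grayscale block of the same size (hypothetical layout)."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, channels, kernel_size=3, padding=1),   # 3D convolution
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv3d(channels, 1, kernel_size=3, padding=1),
            nn.Sigmoid(),                                        # gray values in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, D, H, W) gray-padded input; the output has the same
        # shape, so nodes can be cascaded without resizing.
        return self.net(x)


def gray_level_distribution_loss(fake: torch.Tensor,
                                 real: torch.Tensor,
                                 bins: int = 64) -> torch.Tensor:
    """Compare soft gray-level histograms of generated and reference volumes.
    An illustrative stand-in for a distribution-based loss term."""
    centers = torch.linspace(0.0, 1.0, bins, device=fake.device)
    width = 1.0 / bins

    def soft_hist(v: torch.Tensor) -> torch.Tensor:
        # Gaussian-kernel soft assignment keeps the histogram differentiable.
        d = v.reshape(-1, 1) - centers.reshape(1, -1)
        h = torch.exp(-0.5 * (d / width) ** 2).sum(dim=0)
        return h / h.sum()

    return F.l1_loss(soft_hist(fake), soft_hist(real))


if __name__ == "__main__":
    g = SingleNodeGenerator()
    block = torch.rand(2, 1, 32, 32, 32)     # toy gray-padded 3D blocks
    out = g(block)
    print(out.shape)                          # torch.Size([2, 1, 32, 32, 32])
    print(gray_level_distribution_loss(out, block).item())

Because each node preserves the block size, the cascade described in the abstract can, under these assumptions, be expressed as repeated application of such nodes, with the histogram term added to the usual adversarial objective.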