AUTHOR=He HongYing, Yin XiHao, Luo DianSheng, Xi RuiYao, Fang Jie, Fu FangYu, Luo GuangWei TITLE=Triple Extraction Technique for Power Transformer Fault Information Disposal Based on a Residual Dilate Gated Convolution and Self-Attention Mechanism JOURNAL=Frontiers in Energy Research VOLUME=10 YEAR=2022 URL=https://www.frontiersin.org/journals/energy-research/articles/10.3389/fenrg.2022.929535 DOI=10.3389/fenrg.2022.929535 ISSN=2296-598X ABSTRACT=

This article presents a triple extraction technique for power transformer fault information processing based on a residual dilate gated convolution and a self-attention mechanism. An optimized word input sequence is designed to improve the effectiveness of triple extraction. The residual dilate gated convolution is used to capture medium- and long-distance information in the text. The self-attention mechanism is applied to learn the internal information and capture the internal structure of the input sequences. An improved binary tagging method with position information is presented to mark the start and the end of an entity, which improves extraction accuracy. An object entity is then obtained through a specific relation r for a given subject. A nearest start-end pair matching principle and probability estimation are applied to acquire the optimal set of triples. Testing results show that the F1 score of the presented method is 91.98%, and its triple extraction accuracy is much better than that of the BERT and Bi-LSTM-CRF methods.
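The abstract does not give implementation details, but a residual dilate gated convolution block of the kind it names is commonly built as a dilated 1-D convolution whose output is split into a value path and a sigmoid gate, added back to the input through a residual shortcut. The sketch below is a minimal PyTorch-style illustration under that assumption; layer sizes, kernel width, and dilation schedule are illustrative choices, not values from the paper.

```python
import torch
import torch.nn as nn

class ResidualDilatedGatedConv1d(nn.Module):
    """One block: dilated Conv1d split into value and gate, plus a residual shortcut."""
    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        # Padding chosen so the sequence length is preserved.
        padding = (kernel_size - 1) // 2 * dilation
        # Produce 2*channels so one half can gate the other.
        self.conv = nn.Conv1d(channels, 2 * channels, kernel_size,
                              padding=padding, dilation=dilation)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, seq_len)
        value, gate = self.conv(x).chunk(2, dim=1)
        return x + value * torch.sigmoid(gate)  # gated output + residual connection

# Stacking blocks with growing dilation widens the receptive field, which is how
# such stacks capture medium- and long-distance context in a sentence.
encoder = nn.Sequential(*[ResidualDilatedGatedConv1d(128, dilation=d) for d in (1, 2, 4)])
h = encoder(torch.randn(8, 128, 60))  # e.g. 8 sentences, 60 tokens, 128-dim features
```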
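The binary tagging and nearest start-end pair matching step can likewise be illustrated with a small decoding sketch: each token receives a probability of being an entity start and of being an entity end, and every predicted start is paired with the nearest predicted end at or after it. The threshold, tensor shapes, and function name below are assumptions for illustration only.

```python
import torch

def decode_entities(start_prob: torch.Tensor, end_prob: torch.Tensor,
                    threshold: float = 0.5):
    """start_prob, end_prob: (seq_len,) probabilities that each token begins /
    ends an entity. Each start is matched to the nearest end at or after it."""
    starts = (start_prob >= threshold).nonzero(as_tuple=True)[0].tolist()
    ends = (end_prob >= threshold).nonzero(as_tuple=True)[0].tolist()
    spans = []
    for s in starts:
        # Nearest start-end pair matching: take the first end position >= start.
        candidates = [e for e in ends if e >= s]
        if candidates:
            spans.append((s, candidates[0]))
    return spans

# Example: tokens 2-4 form one entity span, token 7 a second one.
start_p = torch.tensor([0.1, 0.2, 0.9, 0.1, 0.1, 0.1, 0.1, 0.8])
end_p   = torch.tensor([0.1, 0.1, 0.1, 0.1, 0.9, 0.1, 0.1, 0.7])
print(decode_entities(start_p, end_p))  # [(2, 4), (7, 7)]
```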