Article Abstract
WU Jin (吴进), PANG Wenting, WANG Lei, ZHAO Bo. [J]. High Technology Letters, 2023, 29(2): 213-222
Micro-expression recognition algorithm based on graph convolutional network and Transformer model
  
DOI: 10.3772/j.issn.1006-6748.2023.01.012
Keywords: micro-expression recognition, graph convolutional network (GCN), action unit (AU) detection, Transformer model
Authors and Affiliations
WU Jin(吴进) (School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, P. R. China) 
PANG Wenting (School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, P. R. China) 
WANG Lei (School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, P. R. China) 
ZHAO Bo (School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, P. R. China) 
Abstract:
Micro-expressions are spontaneous, unconscious facial movements that reveal true emotions. Accurate facial movement information and effective network training methods are crucial for micro-expression recognition. However, most existing micro-expression recognition techniques focus on modeling a single category of micro-expression images and on the neural network structure. To address the low recognition rate and weak model generalization in micro-expression recognition, a micro-expression recognition algorithm based on a graph convolutional network (GCN) and a Transformer model is proposed. Firstly, action unit (AU) features are detected and extracted, and the facial muscle nodes in each neighborhood are divided into three subsets for recognition. Then, graph convolution layers are used to learn the dependency structure between AU nodes for micro-expression classification. Finally, the multiple attention features of each facial action are enriched with the Transformer model to incorporate more sequence information before the overall correlation of each region is calculated. The proposed method is validated on the CASME II and CAS(ME)^2 datasets, achieving a recognition rate of 69.85%.
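As a rough illustration of the pipeline described in the abstract, the following PyTorch sketch passes AU node features through graph convolution layers that model dependencies between action units, then applies a Transformer encoder (multi-head attention) across the nodes before classification. The node count, feature dimensions, adjacency matrix, and all module names are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of a GCN + Transformer micro-expression classifier.
    # All dimensions, the number of AU nodes, and the adjacency matrix are
    # hypothetical placeholders, not taken from the paper.
    import torch
    import torch.nn as nn

    class GCNLayer(nn.Module):
        """One graph convolution over AU nodes: relu(A_hat @ X @ W)."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, x, adj):
            # x: (batch, num_nodes, in_dim); adj: (num_nodes, num_nodes) normalized adjacency
            return torch.relu(self.linear(adj @ x))

    class AUGraphTransformer(nn.Module):
        """GCN over the AU dependency graph, then a Transformer encoder over the nodes."""
        def __init__(self, num_nodes=17, in_dim=64, hid_dim=128, num_classes=5):
            super().__init__()
            self.gcn1 = GCNLayer(in_dim, hid_dim)
            self.gcn2 = GCNLayer(hid_dim, hid_dim)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=hid_dim, nhead=4, batch_first=True)
            self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=2)
            self.classifier = nn.Linear(hid_dim, num_classes)

        def forward(self, au_features, adj):
            # au_features: (batch, num_nodes, in_dim) AU node features from an AU detector
            h = self.gcn1(au_features, adj)
            h = self.gcn2(h, adj)
            h = self.transformer(h)                 # multi-head attention across AU nodes
            return self.classifier(h.mean(dim=1))   # pool over nodes, then classify emotion

    # Toy usage with random inputs (17 AU nodes, batch of 2).
    adj = torch.eye(17)                 # placeholder normalized adjacency matrix
    x = torch.randn(2, 17, 64)
    logits = AUGraphTransformer()(x, adj)
    print(logits.shape)                 # torch.Size([2, 5])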