IEEE Transactions on Affective Computing

Papers
(The H4-Index of IEEE Transactions on Affective Computing is 46. The table below lists the papers whose CrossRef citation counts meet or exceed that threshold [max. 250 papers], covering publications from the past four years, i.e., from 2020-09-01 to 2024-09-01.)
Article | Citations
Deep Facial Expression Recognition: A Survey | 639
Review on Psychological Stress Detection Using Biosignals | 301
EEG-Based Emotion Recognition Using Regularized Graph Neural Networks | 293
AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups | 265
GCB-Net: Graph Convolutional Broad Network and Its Application in Emotion Recognition | 205
Survey on Emotional Body Gesture Recognition | 195
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 190
A Bi-Hemisphere Domain Adversarial Neural Network Model for EEG Emotion Recognition | 151
Issues and Challenges of Aspect-based Sentiment Analysis: A Comprehensive Survey | 134
Self-Supervised ECG Representation Learning for Emotion Recognition | 129
Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity | 125
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition | 122
An Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals | 122
Classifying Emotions and Engagement in Online Learning Based on a Single Facial Expression Recognition Neural Network | 120
Utilizing Deep Learning Towards Multi-Modal Bio-Sensing and Vision-Based Affective Computing | 117
Automatic Recognition Methods Supporting Pain Assessment: A Survey | 112
Deep Learning for Human Affect Recognition: Insights and New Developments | 111
Facial Expression Recognition With Visual Transformers and Attentional Selective Fusion | 100
Exploiting Multi-CNN Features in CNN-RNN Based Dimensional Emotion Recognition on the OMG in-the-Wild Dataset | 95
Beneath the Tip of the Iceberg: Current Challenges and New Directions in Sentiment Analysis Research | 94
Novel Audio Features for Music Emotion Recognition | 94
An EEG-Based Brain Computer Interface for Emotion Recognition and Its Application in Patients with Disorder of Consciousness | 84
A Mutual Information Based Adaptive Windowing of Informative EEG for Emotion Recognition | 74
Video-Based Depression Level Analysis by Encoding Deep Spatiotemporal Features | 71
A Review on Nonlinear Methods Using Electroencephalographic Recordings for Emotion Recognition | 69
Integrating Deep and Shallow Models for Multi-Modal Depression Analysis—Hybrid Architectures | 67
Facial Expression Recognition with Identity and Emotion Joint Learning | 65
Improving Cross-Corpus Speech Emotion Recognition with Adversarial Discriminative Domain Generalization (ADDoG) | 63
Spontaneous Speech Emotion Recognition Using Multiscale Deep Convolutional LSTM | 62
An Active Learning Paradigm for Online Audio-Visual Emotion Recognition | 60
The Ordinal Nature of Emotions: An Emerging Approach | 60
A Deeper Look at Facial Expression Dataset Bias | 58
Multi-Task Semi-Supervised Adversarial Autoencoding for Speech Emotion Recognition | 58
An Improved Empirical Mode Decomposition of Electroencephalogram Signals for Depression Detection | 58
TSception: Capturing Temporal Dynamics and Spatial Asymmetry From EEG for Emotion Recognition | 57
The Biases of Pre-Trained Language Models: An Empirical Study on Prompt-Based Sentiment Analysis and Emotion Detection | 56
Strategies to Utilize the Positive Emotional Contagion Optimally in Crowd Evacuation | 54
Facial Action Unit Detection Using Attention and Relation Learning | 54
Spectral Representation of Behaviour Primitives for Depression Analysis | 53
Virtual Reality for Emotion Elicitation – A Review | 52
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 50
All-in-One: Emotion, Sentiment and Intensity Prediction Using a Multi-Task Ensemble Framework | 50
Induction and Profiling of Strong Multi-Componential Emotions in Virtual Reality | 50
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 49
Hybrid Contrastive Learning of Tri-Modal Representation for Multimodal Sentiment Analysis | 48
Facial Expression Recognition With Deeply-Supervised Attention Network | 48
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 46