IEEE Transactions on Affective Computing

Papers
(The H4-Index of IEEE Transactions on Affective Computing is 61. The table below lists the papers above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., from 2022-05-01 to 2026-05-01.)
Article | Citations
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 1160
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 705
ECPEC: Emotion-Cause Pair Extraction in Conversations | 428
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 298
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 298
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 294
ATTSF-Net: Attention-Based Similarity Fusion Network for Audio-Visual Emotion Recognition | 282
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 248
DGC-Link: Dual-Gate Chebyshev Linkage Network on EEG Emotion Recognition | 225
Perceived Conversation Quality in Spontaneous Interactions | 204
Mouse-Cursor Tracking: Simple Scoring Algorithms That Make it Work | 201
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 187
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 166
Creating an Affective Robot That Feels Both Touch and Emotion | 158
CAETFN: Context Adaptively Enhanced Text-Guided Fusion Network for Multimodal Sentiment Analysis | 148
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 147
Identity-Free Artificial Emotional Intelligence via Micro-Gesture Understanding | 139
Sparse Emotion Dictionary and CWT Spectrogram Fusion With Multi-Head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 139
Hug Synchronization Enhances Social Presence and Prosociality in Computer-Mediated Communication | 139
From EEG to Eye Movements: Cross-Modal Emotion Recognition Using Constrained Adversarial Network With Dual Attention | 132
CausalSymptom: Learning Causal Disentangled Representation for Depression Severity Estimation on Transcribed Clinical Interviews | 131
Towards Contrastive Context-Aware Conversational Emotion Recognition | 130
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 125
Analyzing Emotions and Engagement During Cognitive Stimulation Group Training with the Pepper Robot | 118
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 115
The Diagnosis Method of Major Depressive Disorder Using Wavelet Coherence and State-Pathology Separation Network | 113
MPRNet: A Temporal-Aware Cross-Modal Encoding Framework for Personality Recognition | 110
AM-ConvBLS: Adaptive Manifold Convolutional Broad Learning System for Cross-Session and Cross-Subject Emotion Recognition | 110
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 109
Real-World Classification of Student Stress and Fatigue Using Wearable PPG Recordings | 108
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 104
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 102
Facial Image-Based Automatic Assessment of Equine Pain | 102
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 99
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 92
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 86
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 86
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 85
TouchTales: A Care-Centered Protocol for Recognizing Authentic Emotion from Naturalistic Touching and Telling | 81
EmoAgent: A Multi-Agent Framework for Diverse Affective Image Manipulation | 80
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 79
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 78
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach With Emotional EEG Style Transfer Network | 76
Brain-Machine Enhanced Intelligence for Semi-Supervised Facial Emotion Recognition | 73
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 73
PainFormer: A Vision Foundation Model for Automatic Pain Assessment | 72
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 72
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 72
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 72
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 72
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 72
CMCRD: Cross-Modal Contrastive Representation Distillation for Emotion Recognition | 69
Non-Invasive Measurement of Trust in Group Interactions | 68
MECA: Manipulation With Emotional Intensity-Aware Contrastive Learning and Attention-Based Discriminative Learning | 67
Age Against the Machine: How Age Relates to Listeners' Ability to Recognize Emotions in Robots' Semantic-Free Utterances | 67
Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli | 66
InterARM: Interpretable Affective Reasoning Model for Multimodal Sarcasm Detection | 65
A Micro-Expression Recognition Network Based on Attention Mechanism and Motion Magnification | 63
Nasal Dominance and Nostril Breathing Variability: Potential Biomarkers of Acute Stress | 62
Towards Efficient and Robust Linguistic Emotion Diagnosis for Mental Health via Multi-Agent Instruction Refinement | 62
An Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition | 62