IEEE Transactions on Affective Computing

Papers
(The median citation count of IEEE Transactions on Affective Computing is 6. The table below lists the papers above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., from 2022-05-01 to 2026-05-01.)
Article | Citations
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 1160
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 705
ECPEC: Emotion-Cause Pair Extraction in Conversations | 428
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 298
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 298
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 294
ATTSF-Net: Attention-Based Similarity Fusion Network for Audio-Visual Emotion Recognition | 282
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 248
DGC-Link: Dual-Gate Chebyshev Linkage Network on EEG Emotion Recognition | 225
Perceived Conversation Quality in Spontaneous Interactions | 204
Mouse-Cursor Tracking: Simple Scoring Algorithms That Make it Work | 201
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 187
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 166
Creating an Affective Robot That Feels Both Touch and Emotion | 158
CAETFN: Context Adaptively Enhanced Text-Guided Fusion Network for Multimodal Sentiment Analysis | 148
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 147
Sparse Emotion Dictionary and CWT Spectrogram Fusion With Multi-Head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 139
Hug Synchronization Enhances Social Presence and Prosociality in Computer-Mediated Communication | 139
Identity-Free Artificial Emotional Intelligence via Micro-Gesture Understanding | 139
From EEG to Eye Movements: Cross-Modal Emotion Recognition Using Constrained Adversarial Network With Dual Attention | 132
CausalSymptom: Learning Causal Disentangled Representation for Depression Severity Estimation on Transcribed Clinical Interviews | 131
Towards Contrastive Context-Aware Conversational Emotion Recognition | 130
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 125
Analyzing Emotions and Engagement During Cognitive Stimulation Group Training with the Pepper Robot | 118
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 115
The Diagnosis Method of Major Depressive Disorder Using Wavelet Coherence and State-Pathology Separation Network | 113
AM-ConvBLS: Adaptive Manifold Convolutional Broad Learning System for Cross-Session and Cross-Subject Emotion Recognition | 110
MPRNet: A Temporal-Aware Cross-Modal Encoding Framework for Personality Recognition | 110
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 109
Real-World Classification of Student Stress and Fatigue Using Wearable PPG Recordings | 108
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 104
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 102
Facial Image-Based Automatic Assessment of Equine Pain | 102
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 99
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 92
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 86
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 86
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 85
TouchTales: A Care-Centered Protocol for Recognizing Authentic Emotion from Naturalistic Touching and Telling | 81
EmoAgent: A Multi-Agent Framework for Diverse Affective Image Manipulation | 80
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 79
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 78
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach With Emotional EEG Style Transfer Network | 76
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 73
Brain-Machine Enhanced Intelligence for Semi-Supervised Facial Emotion Recognition | 73
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 72
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 72
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 72
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 72
PainFormer: A Vision Foundation Model for Automatic Pain Assessment | 72
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 72
CMCRD: Cross-Modal Contrastive Representation Distillation for Emotion Recognition | 69
Non-Invasive Measurement of Trust in Group Interactions | 68
MECA: Manipulation With Emotional Intensity-Aware Contrastive Learning and Attention-Based Discriminative Learning | 67
Age Against the Machine: How Age Relates to Listeners' Ability to Recognize Emotions in Robots' Semantic-Free Utterances | 67
Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli | 66
InterARM: Interpretable Affective Reasoning Model for Multimodal Sarcasm Detection | 65
A Micro-Expression Recognition Network Based on Attention Mechanism and Motion Magnification | 63
Towards Efficient and Robust Linguistic Emotion Diagnosis for Mental Health via Multi-Agent Instruction Refinement | 62
An Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition | 62
Nasal Dominance and Nostril Breathing Variability: Potential Biomarkers of Acute Stress | 62
GHA: A Gated Hierarchical Attention Mechanism for the Detection of Abusive Language in Social Media | 60
TFAGL: A Novel Agent Graph Learning Method Using Time-Frequency EEG for Major Depressive Disorder Detection | 59
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 58
Improved Video Emotion Recognition With Alignment of CNN and Human Brain Representations | 57
Effects of Algorithmic Transparency on User Experience and Physiological Responses in Affect-Aware Task Adaptation | 57
Towards Cyberbullying Detection: Building, Benchmarking and Longitudinal Analysis of Aggressiveness and Conflicts/Attacks Datasets From Twitter | 56
Video-Based Cross-Domain Emotion Recognition Via Sample-Graph Relations Self-Distillation | 56
LGSNet: A Two-Stream Network for Micro- and Macro-Expression Spotting With Background Modeling | 55
Rethinking Emotion Annotations in the Era of Large Language Models | 55
SEED-VII: A Multimodal Dataset of Six Basic Emotions With Continuous Labels for Emotion Recognition | 55
Dynamic Confidence-Aware Multi-Modal Emotion Recognition | 55
Guest Editorial Extremely Low-Resource Autonomous Affective Learning | 54
miMamba: EEG-Based Emotion Recognition With Multi-Scale Inverted Mamba Models | 54
Fusion and Discrimination: A Multimodal Graph Contrastive Learning Framework for Multimodal Sarcasm Detection | 54
Nonverbal Leadership in Joint Full-Body Improvisation | 54
Long Short-Term Memory Network Based Unobtrusive Workload Monitoring With Consumer Grade Smartwatches | 53
Guest Editorial: Special Issue on Affective Speech and Language Synthesis, Generation, and Conversion | 52
Dual-Channel Retrieval-Augmented In-Context Learning for Comparative Opinion Mining | 52
Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation | 51
Group Synchrony for Emotion Recognition Using Physiological Signals | 51
AMuSeD: An Attentive Deep Neural Network for Multimodal Sarcasm Detection Incorporating Bimodal Data Augmentation | 50
Annotate Smarter, not Harder: Using Active Learning to Reduce Emotional Annotation Effort | 50
An EEG-Based Multi-Source Domain Knowledge Transfer Framework for Cross-Session and Cross-Subject Emotion Recognition | 50
An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming | 50
Multi-Party Conversation Modeling for Emotion Recognition | 50
Modeling Category Semantic and Sentiment Knowledge for Aspect-Level Sentiment Analysis | 49
AGILE: Attribute-Guided Identity Independent Learning for Facial Expression Recognition | 49
Combining Neural Empathy-Aware Behavior Trees with Knowledge Graphs for Affective Human-AI Teaming | 49
How many raters do we need? Analyses of uncertainty in estimating ambiguity-aware emotion labels | 49
Emotion Transition Recognition Using Multimodal Physiological Signal Fusion | 48
A Feature-level Framework for Evaluating Demographic Biases in Facial Expression Recognition Models | 48
Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning | 48
Partial Label Learning for Emotion Recognition From EEG | 48
Prompt-Guided Domain Generalization for EEG Emotion Recognition | 47
Emotion Recognition Using Affective Touch: A Survey | 46
Multi-Label and Multimodal Classifier for Affective States Recognition in Virtual Rehabilitation | 46
A New Perspective on Stress Detection: An Automated Approach for Detecting Eustress and Distress | 46
Exploring Spontaneous Facial Micro-expressions in On-road Driver Behavior | 46
Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition | 45
Methodology to Assess Quality, Presence, Empathy, Attitude, and Attention in 360-degree Videos for Immersive Communications | 44
Dynamical Causal Graph Neural Network for EEG Emotion Recognition | 44
Hierarchical Shared Encoder With Task-Specific Transformer Layer Selection for Emotion-Cause Pair Extraction | 44
Exploring Complexity of Facial Dynamics in Autism Spectrum Disorder | 44
Geometric Graph Representation With Learnable Graph Structure and Adaptive AU Constraint for Micro-Expression Recognition | 43
Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN | 43
The Rhythm of Flow: Detecting Facial Expressions of Flow Experiences Using CNNs | 43
Objective Class-Based Micro-Expression Recognition Under Partial Occlusion Via Region-Inspired Relation Reasoning Network | 43
EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition | 42
Emotion Distribution Learning Based on Peripheral Physiological Signals | 42
Facial Expression Recognition for Chinese Elderly Using Edge and Semantic Features Dual Path Network With Two-Step Transfer Learning | 42
R2G³Net: A Novel Hierarchical Spatial-Temporal Neural Network With a Regional-to-Global Fusion Mechanism for Multimodal Emotion Recognition | 42
Classification of Interbeat Interval Time-Series Using Attention Entropy | 42
Deep Adaptation of Adult-Child Facial Expressions by Fusing Landmark Features | 42
Integrating Deep Facial Priors Into Landmarks for Privacy Preserving Multimodal Depression Recognition | 41
Hierarchical Encoding and Fusion of Brain Functions for Depression Subtype Classification | 41
Text-Based Fine-Grained Emotion Prediction | 40
Public Opinion Crisis Management via Social Media Mining | 40
STAA-Net: A Sparse and Transferable Adversarial Attack for Speech Emotion Recognition | 40
Datasets of Smartphone Modalities for Depression Assessment: A Scoping Review | 40
EmoTake: Exploring Drivers' Emotion for Takeover Behavior Prediction | 40
SMSAT: An Acoustic Dataset and Multi-Feature Deep Contrastive Learning Framework for Affective and Physiological Modeling of Spiritual Meditation | 39
Ordinal Logistic Regression With Partial Proportional Odds for Depression Prediction | 39
Aspect-Based Sentiment Quantification | 39
MTADA: A Multi-Task Adversarial Domain Adaptation Network for EEG-Based Cross-Subject Emotion Recognition | 39
MoDE: Improving Mixture of Depression Experts With Mutual Information Estimator for Depression Detection | 38
Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing | 38
Sifting Truth From Spectacle! A Multimodal Hindi Dataset for Misinformation Detection With Emotional Cues and Sentiments | 37
MCGC-Net: Multi-Scale Controllable Graph Convolutional Network on Music Emotion Recognition | 36
Exploring the Role of Randomization on Belief Rigidity in Online Social Networks | 36
Improving Emotion and Intent Understanding in Multimodal Conversations With Progressive Interaction | 36
FERMixNet: An Occlusion Robust Facial Expression Recognition Model With Facial Mixing Augmentation and Mid-Level Representation Learning | 35
A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios | 35
Leveraging Social Media for Real-Time Interpretable and Amendable Suicide Risk Prediction With Human-in-The-Loop | 35
Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis | 35
Estimating Affective Taste Experience Using Combined Implicit Behavioral and Neurophysiological Measures | 35
Versatile Audio-Visual Learning for Emotion Recognition | 35
Survey of Deep Representation Learning for Speech Emotion Recognition | 34
MERGE: A Bimodal Audio-Lyrics Dataset For Static Music Emotion Recognition | 34
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition | 34
Boosting Micro-Expression Recognition via Self-Expression Reconstruction and Memory Contrastive Learning | 33
Theory of Mind Abilities Predict Robot's Gaze Effects on Object Preference | 33
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition | 33
Self-Supervised ECG Representation Learning for Emotion Recognition | 33
Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation | 33
From What You See to What We Smell: Linking Human Emotions to Bio-Markers in Breath | 32
Capturing Dynamic Fear Experiences in Naturalistic Contexts: An Ecologically Valid fMRI Signature Integrating Brain Activation and Connectivity | 32
Comparative Analysis of Physiological and Speech Signals for State Anxiety Detection in University Students in STEM | 32
CorMulT: A Semi-Supervised Modality Correlation-Aware Multimodal Transformer for Sentiment Analysis | 32
Affective Touch via Haptic Interfaces: A Sequential Indentation Approach | 32
SalMIM: Saliency-Guided Masked Image Modeling Network for Visual Emotion Analysis | 31
Stimulus-Response Pattern: The Core of Robust Cross-Stimulus Facial Depression Recognition | 31
Modeling Multimodal Depression Diagnosis From the Perspective of Local Depressive Representation | 31
Fake News, Real Emotions: Emotion Analysis of COVID-19 Infodemic in Weibo | 31
Progressive Multi-Source Domain Adaptation for Personalized Facial Expression Recognition | 31
Towards Emotion-Aware Agents for Improved User Satisfaction and Partner Perception in Negotiation Dialogues | 31
The Role of Preprocessing for Word Representation Learning in Affective Tasks | 31
Charting the Unspoken: Causal Inference-Guided LLM Augmentation for Emotion Recognition in Conversation | 30
Enhanced Dynamic Representation via Gradient-Motion Modeling for Micro-Expression Recognition | 30
I Enjoy Writing and Playing, Do You?: A Personalized and Emotion Grounded Dialogue Agent Using Generative Adversarial Network | 30
Semantic and Emotional Dual Channel for Emotion Recognition in Conversation | 30
Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition? | 29
Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition | 28
Exploring Emotion Expression Recognition in Older Adults Interacting With a Virtual Coach | 28
A Survey of Textual Emotion Recognition and Its Challenges | 28
Emotion Recognition From Few-Channel EEG Signals by Integrating Deep Feature Aggregation and Transfer Learning | 28
Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition | 28
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | 28
Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods | 28
Distant Handshakes: Conveying Social Intentions Through Multi-Modal Soft Haptic Gloves | 28
The Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games | 28
DECEPTIcON: Bridging Gaps in In-the-Wild Deception Research | 27
Personalized Federated Learning for Session-based Affective Interaction Modeling | 27
Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals | 27
Derived Topic Propagation Model Based on Topic Relevance and User Sentiment | 27
An Effective 3D Text Recurrent Voting Generator for Metaverse | 27
A Residual Multi-Scale Convolutional Neural Network With Transformers for Speech Emotion Recognition | 27
Empathetic Response Generation Through Multi-Modality | 27
Towards Multimodal Sentiment Analysis Via Contrastive Cross-Modal Retrieval Augmentation and Hierarchical Prompts | 27
Incorporating Forthcoming Events and Personality Traits in Social Media Based Stress Prediction | 26
Effects of Physiological Signals in Different Types of Multimodal Sentiment Estimation | 26
VyaktitvaNirdharan: Multimodal Assessment of Personality and Trait Emotional Intelligence | 26
Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos | 26
SCARE: A Novel Framework to Enhance Chinese Harmful Memes Detection | 25
Emotion Embeddings — Learning Stable and Homogeneous Abstractions From Heterogeneous Affective Datasets | 25
Hierarchical Knowledge Stripping for Multimodal Sentiment Analysis | 25
Deep Learning Techniques for Text-Based Emotional Response Generation: A Systematic Review | 25
Mixture-of-Expert Large Language Models for text-based Personality Assessment from Asynchronous Video Interviews | 25
Emotions Like Human: Self-Supervised Emotion Label Augmentation for Emotion Recognition in Conversation | 25
Transformer-Based Physiological Emotion Recognition for Autism Intervention Support | 25
Percussion and Instrumentation in Music Emotion Recognition: A Feature Engineering Approach | 25
Multi-Order Networks for Action Unit Detection | 25
Recognizing, Fast and Slow: Complex Emotion Recognition With Facial Expression Detection and Remote Physiological Measurement | 25
SLAB: A Self-supervised Label Generation Framework to Reduce Annotation Overhead | 24
Social Image–Text Sentiment Classification With Cross-Modal Consistency and Knowledge Distillation | 24
Multi-Scale Hyperbolic Contrastive Learning for Cross-Subject EEG Emotion Recognition | 24
Unconstrained Facial Expression Recognition With No-Reference De-Elements Learning | 24
LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition Using Graph Neural Networks | 24
Emotion Intensity and its Control for Emotional Voice Conversion | 24
Facial Expression Animation by Landmark Guided Residual Module | 24
Step-Wise Prompting Meets Uncertainty-Aware Dynamic Fusion for Robust EEG-Visual Emotion Recognition | 23
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities | 23
Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis | 23
Affective Dynamics and Cognition During Game-Based Learning | 23
Deep Learning for Micro-Expression Recognition: A Survey | 23
LibEER: A Comprehensive Benchmark and Algorithm Library for EEG-Based Emotion Recognition | 23
A Multi-Level Alignment and Cross-Modal Unified Semantic Graph Refinement Network for Conversational Emotion Recognition | 23
Learning to Rank Onset-Occurring-Offset Representations for Micro-Expression Recognition | 23
Persuasion-Induced Physiology as Predictor of Persuasion Effectiveness | 23
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 23
Towards Participant-Independent Stress Detection Using Instrumented Peripherals | 22
Mental Stress Assessment in the Workplace: A Review | 22
Bootstrap Wayfinding Questions to Elicit Emotion Shift Reasoning with Large Language Models | 22
Autonomic Modulations to Cardiac Dynamics in Response to Affective Touch: Differences Between Social Touch and Self-Touch | 22
AT2GRU: A Human Emotion Recognition Model With Mitigated Device Heterogeneity | 22
SynSem-ASTE: An Enhanced Multi-Encoder Network for Aspect Sentiment Triplet Extraction With Syntax and Semantics | 22
Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders | 22
Global Entity Relationship Enhancement Network for Multimodal Sarcasm Detection | 21
Mind AI's Mind: A Clinically Aligned Explainable AI Pipeline for Depression Diagnosis via Large Language Models | 21
Examining Emotion Perception Agreement in Live Music Performance | 21
Conveying Emotions Through Device-Initiated Touch | 20
CSE-GResNet: A Simple and Highly Efficient Network for Facial Expression Recognition | 20
Unsupervised Time-Aware Sampling Network With Deep Reinforcement Learning for EEG-Based Emotion Recognition | 20
Implicit Knowledge and Emotional Cues-Enhanced Multimodal Sarcasm Detection Model | 20
MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection | 20
Affective-ROPTester: Capability and Bias Analysis of LLMs in Predicting Retinopathy of Prematurity | 20
Detection and Identification of Choking Under Pressure in College Tennis Based Upon Physiological Parameters, Performance Patterns, and Game Statistics | 20
Interview-Based Depression Detection Using LLM-Based Text Restatement and Emotion Lexicon | 20
Neuro or Symbolic? Fine-Tuned Transformer With Unsupervised LDA Topic Clustering for Text Sentiment Analysis | 20
Multi-Stage Graph Fusion Networks for Major Depressive Disorder Diagnosis | 20
STREL - Naturalistic Dataset and Methods for Studying Mental Stress and Relaxation Patterns in Critical Leading Roles | 20
Beyond Overfitting: Doubly Adaptive Dropout for Generalizable AU Detection | 19
Exploring Multivariate Dynamics of Emotions Through Time-Varying Self-Assessed Arousal and Valence Ratings | 19
Detecting Mental Disorders in Social Media Through Emotional Patterns - The Case of Anorexia and Depression | 19
Editorial | 19
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 19
Guest Editorial Neurosymbolic AI for Sentiment Analysis | 19
Aspect-Opinion Correlation Aware and Knowledge-Expansion Few Shot Cross-Domain Sentiment Classification | 18
A Reinforcement Learning Based Two-Stage Model for Emotion Cause Pair Extraction | 18
From the Lab to the Wild: Affect Modeling Via Privileged Information | 18
EmoSENSE: Modeling Sentiment-Semantic Knowledge with Hierarchical Reinforcement Learning for Emotional Image Generation | 18
Enhancing Emotional Congruence in Sensory Substitution | 17
Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation | 17
Analyzing the Visual Road Scene for Driver Stress Estimation | 17
Quantitative Personality Predictions From a Brief EEG Recording | 17
Investigating Cardiovascular Activation of Young Adults in Routine Driving | 17
ER-Chat: A Text-to-Text Open-Domain Dialogue Framework for Emotion Regulation | 17
AL-HCL: Active Learning and Hierarchical Contrastive Learning for Multimodal Sentiment Analysis With Fusion Guidance | 17
Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game | 17
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 17