IEEE Transactions on Affective Computing

Papers
(The TQCC of IEEE Transactions on Affective Computing is 19. The table below lists the papers at or above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers papers published in the past four years, i.e., from 2022-01-01 to 2026-01-01.)
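The selection rule above amounts to a simple filter: keep papers whose CrossRef citation count is at or above the journal's TQCC of 19, restrict to the 2022-01-01 to 2026-01-01 publication window, and cap the listing at 250 entries. The Python sketch below illustrates that filter; the record fields (title, citations, published) and the sample data are hypothetical and do not reflect the actual CrossRef API or this site's pipeline.

```python
from datetime import date

# Hypothetical records standing in for CrossRef metadata; the field names
# (title, citations, published) are illustrative only.
papers = [
    {"title": "Example Paper A", "citations": 1040, "published": date(2023, 5, 1)},
    {"title": "Example Paper B", "citations": 12, "published": date(2022, 3, 15)},
]

TQCC = 19                                      # journal-level citation threshold
WINDOW = (date(2022, 1, 1), date(2026, 1, 1))  # publication window
MAX_PAPERS = 250                               # listing cap

def select_highly_cited(records):
    """Keep papers at or above the TQCC inside the window, capped at MAX_PAPERS."""
    eligible = [
        p for p in records
        if p["citations"] >= TQCC and WINDOW[0] <= p["published"] < WINDOW[1]
    ]
    eligible.sort(key=lambda p: p["citations"], reverse=True)
    return eligible[:MAX_PAPERS]

for p in select_highly_cited(papers):
    print(f'{p["title"]} | {p["citations"]}')
```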
Article | Citations
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 1040
Mouse-Cursor Tracking: Simple Scoring Algorithms That Make it Work | 604
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 528
CausalSymptom: Learning Causal Disentangled Representation for Depression Severity Estimation on Transcribed Clinical Interviews | 379
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 304
Sparse Emotion Dictionary and CWT Spectrogram Fusion With Multi-Head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 267
CAETFN: Context Adaptively Enhanced Text-Guided Fusion Network for Multimodal Sentiment Analysis | 261
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 247
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 247
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 227
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 225
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 218
From EEG to Eye Movements: Cross-Modal Emotion Recognition Using Constrained Adversarial Network With Dual Attention | 182
Hug Synchronization Enhances Social Presence and Prosociality in Computer-Mediated Communication | 177
ATTSF-Net: Attention-Based Similarity Fusion Network for Audio-Visual Emotion Recognition | 167
Creating an Affective Robot That Feels Both Touch and Emotion | 162
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 151
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 146
DGC-Link: Dual-Gate Chebyshev Linkage Network on EEG Emotion Recognition | 145
Towards Contrastive Context-Aware Conversational Emotion Recognition | 129
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 128
Analyzing Emotions and Engagement During Cognitive Stimulation Group Training with the Pepper Robot | 118
ECPEC: Emotion-Cause Pair Extraction in Conversations | 118
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 118
Perceived Conversation Quality in Spontaneous Interactions | 118
Brain-Machine Enhanced Intelligence for Semi-Supervised Facial Emotion Recognition | 117
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 115
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 107
AM-ConvBLS: Adaptive Manifold Convolutional Broad Learning System for Cross-Session and Cross-Subject Emotion Recognition | 104
The Diagnosis Method of Major Depressive Disorder Using Wavelet Coherence and State-Pathology Separation Network | 103
PainFormer: A Vision Foundation Model for Automatic Pain Assessment | 102
MPRNet: A Temporal-Aware Cross-Modal Encoding Framework for Personality Recognition | 99
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 98
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 97
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 97
Real-World Classification of Student Stress and Fatigue Using Wearable PPG Recordings | 96
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 93
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 92
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 90
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 89
Facial Image-Based Automatic Assessment of Equine Pain | 89
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 89
Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration | 87
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 87
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach With Emotional EEG Style Transfer Network | 84
Review on Psychological Stress Detection Using Biosignals | 83
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 81
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 77
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 75
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 73
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 71
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 70
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 68
Effects of Algorithmic Transparency on User Experience and Physiological Responses in Affect-Aware Task Adaptation | 67
Towards Cyberbullying Detection: Building, Benchmarking and Longitudinal Analysis of Aggressiveness and Conflicts/Attacks Datasets From Twitter | 67
Non-Invasive Measurement of Trust in Group Interactions | 66
Leveraging the Deep Learning Paradigm for Continuous Affect Estimation from Facial Expressions | 66
Age Against the Machine: How Age Relates to Listeners' Ability to Recognize Emotions in Robots' Semantic-Free Utterances | 65
MECA: Manipulation With Emotional Intensity-Aware Contrastive Learning and Attention-Based Discriminative Learning | 65
A Micro-Expression Recognition Network Based on Attention Mechanism and Motion Magnification | 65
Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli | 64
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 63
An Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition | 61
GHA: A Gated Hierarchical Attention Mechanism for the Detection of Abusive Language in Social Media | 61
Improved Video Emotion Recognition With Alignment of CNN and Human Brain Representations | 61
Rethinking Emotion Annotations in the Era of Large Language Models | 60
miMamba: EEG-Based Emotion Recognition With Multi-Scale Inverted Mamba Models | 60
TFAGL: A Novel Agent Graph Learning Method Using Time-Frequency EEG for Major Depressive Disorder Detection | 59
Fusion and Discrimination: A Multimodal Graph Contrastive Learning Framework for Multimodal Sarcasm Detection | 58
LGSNet: A Two-Stream Network for Micro- and Macro-Expression Spotting With Background Modeling | 58
SEED-VII: A Multimodal Dataset of Six Basic Emotions With Continuous Labels for Emotion Recognition | 58
Guest Editorial Extremely Low-Resource Autonomous Affective Learning | 58
Dynamic Confidence-Aware Multi-Modal Emotion Recognition | 58
Nonverbal Leadership in Joint Full-Body Improvisation | 57
Long Short-Term Memory Network Based Unobtrusive Workload Monitoring With Consumer Grade Smartwatches | 56
Multi-Party Conversation Modeling for Emotion Recognition | 55
Objective Class-Based Micro-Expression Recognition Under Partial Occlusion Via Region-Inspired Relation Reasoning Network | 55
Emotion Recognition Using Affective Touch: A Survey | 54
Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation | 54
Guest Editorial: Special Issue on Affective Speech and Language Synthesis, Generation, and Conversion | 53
AMuSeD: An Attentive Deep Neural Network for Multimodal Sarcasm Detection Incorporating Bi-modal Data Augmentation | 52
Group Synchrony for Emotion Recognition Using Physiological Signals | 52
Emotion Transition Recognition Using Multimodal Physiological Signal Fusion | 52
Annotate Smarter, not Harder: Using Active Learning to Reduce Emotional Annotation Effort | 51
Hierarchical Shared Encoder With Task-Specific Transformer Layer Selection for Emotion-Cause Pair Extraction | 51
Exploring Complexity of Facial Dynamics in Autism Spectrum Disorder | 50
How many raters do we need? Analyses of uncertainty in estimating ambiguity-aware emotion labels | 50
AGILE: Attribute-Guided Identity Independent Learning for Facial Expression Recognition | 50
Partial Label Learning for Emotion Recognition From EEG | 50
Methodology to Assess Quality, Presence, Empathy, Attitude, and Attention in 360-degree Videos for Immersive Communications | 49
Multimodal Deception Detection Using Real-Life Trial Data | 48
Modeling Category Semantic and Sentiment Knowledge for Aspect-Level Sentiment Analysis | 46
A New Perspective on Stress Detection: An Automated Approach for Detecting Eustress and Distress | 46
Multi-Label and Multimodal Classifier for Affective States Recognition in Virtual Rehabilitation | 45
An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming | 45
Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning | 45
Dynamical Causal Graph Neural Network for EEG Emotion Recognition | 45
Dynamic Micro-Expression Recognition Using Knowledge Distillation | 44
Combining Neural Empathy-Aware Behavior Trees with Knowledge Graphs for Affective Human-AI Teaming | 44
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition | 44
Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition | 43
Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN | 43
EmoTake: Exploring Drivers’ Emotion for Takeover Behavior Prediction | 42
Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing | 42
STAA-Net: A Sparse and Transferable Adversarial Attack for Speech Emotion Recognition | 42
Geometric Graph Representation With Learnable Graph Structure and Adaptive AU Constraint for Micro-Expression Recognition | 42
Hierarchical Encoding and Fusion of Brain Functions for Depression Subtype Classification | 42
State-Specific and Supraordinal Components of Facial Response to Pain | 42
EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition | 42
Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation | 42
Text-Based Fine-Grained Emotion Prediction | 41
MoDE: Improving Mixture of Depression Experts with Mutual Information Estimator for Depression Detection | 41
Deep Adaptation of Adult-Child Facial Expressions by Fusing Landmark Features | 41
Datasets of Smartphone Modalities for Depression Assessment: A Scoping Review | 40
Leveraging Social Media for Real-Time Interpretable and Amendable Suicide Risk Prediction With Human-in-The-Loop | 40
MTADA: A Multi-Task Adversarial Domain Adaptation Network for EEG-Based Cross-Subject Emotion Recognition | 39
The Rhythm of Flow: Detecting Facial Expressions of Flow Experiences Using CNNs | 39
Theory of Mind Abilities Predict Robot's Gaze Effects on Object Preference | 39
Estimating Affective Taste Experience Using Combined Implicit Behavioral and Neurophysiological Measures | 39
Improving Emotion and Intent Understanding in Multimodal Conversations with Progressive Interaction | 39
Public Opinion Crisis Management via Social Media Mining | 39
Micro and Macro Facial Expression Recognition Using Advanced Local Motion Patterns | 39
Ordinal Logistic Regression With Partial Proportional Odds for Depression Prediction | 38
A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios | 38
Aspect-Based Sentiment Quantification | 38
Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis | 38
Emotion Distribution Learning Based on Peripheral Physiological Signals | 38
MCGC-Net: Multi-scale Controllable Graph Convolutional Network on Music Emotion Recognition | 38
Classification of Interbeat Interval Time-Series Using Attention Entropy | 37
Integrating Deep Facial Priors Into Landmarks for Privacy Preserving Multimodal Depression Recognition | 37
Self-Supervised ECG Representation Learning for Emotion Recognition | 37
Versatile Audio-Visual Learning for Emotion Recognition | 37
Survey of Deep Representation Learning for Speech Emotion Recognition | 37
A Psychologically Inspired Fuzzy Cognitive Deep Learning Framework to Predict Crowd Behavior | 37
Boosting Micro-Expression Recognition via Self-Expression Reconstruction and Memory Contrastive Learning | 37
Sifting Truth From Spectacle! a Multimodal Hindi Dataset for Misinformation Detection With Emotional Cues and Sentiments | 36
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition | 36
From What You See to What We Smell: Linking Human Emotions to Bio-Markers in Breath | 36
FERMixNet: An Occlusion Robust Facial Expression Recognition Model With Facial Mixing Augmentation and Mid-Level Representation Learning | 36
Exploring the Role of Randomization on Belief Rigidity in Online Social Networks | 36
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition | 36
Affective Touch via Haptic Interfaces: A Sequential Indentation Approach | 35
Leveraging the Dynamics of Non-Verbal Behaviors For Social Attitude Modeling | 35
Modeling Multimodal Depression Diagnosis from the Perspective of Local Depressive Representation | 34
Progressive Multi-Source Domain Adaptation for Personalized Facial Expression Recognition | 34
Capturing Dynamic Fear Experiences in Naturalistic Contexts: an Ecologically Valid fMRI Signature Integrating Brain Activation and Connectivity | 34
Psychophysiological Reactions to Persuasive Messages Deploying Persuasion Principles | 34
Exploring Emotion Expression Recognition in Older Adults Interacting With a Virtual Coach | 34
SalMIM: Saliency-guided Masked Image Modeling Network for Visual Emotion Analysis | 33
Comparative Analysis of Physiological and Speech Signals for State Anxiety Detection in University Students in STEM | 33
Distant Handshakes: Conveying Social Intentions Through Multi-Modal Soft Haptic Gloves | 33
CorMulT: A Semi-Supervised Modality Correlation-Aware Multimodal Transformer for Sentiment Analysis | 32
Fake News, Real Emotions: Emotion Analysis of COVID-19 Infodemic in Weibo | 32
The Role of Preprocessing for Word Representation Learning in Affective Tasks | 32
I Enjoy Writing and Playing, Do You?: A Personalized and Emotion Grounded Dialogue Agent Using Generative Adversarial Network | 31
The Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games | 31
Semantic and Emotional Dual Channel for Emotion Recognition in Conversation | 31
Towards Emotion-Aware Agents for Improved User Satisfaction and Partner Perception in Negotiation Dialogues | 31
Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods | 31
Stimulus-Response Pattern: The Core of Robust Cross-Stimulus Facial Depression Recognition | 31
Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition? | 31
Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition | 31
A Survey of Textual Emotion Recognition and Its Challenges | 31
Incorporating Forthcoming Events and Personality Traits in Social Media Based Stress Prediction | 30
Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition | 30
Unconstrained Facial Expression Recognition With No-Reference De-Elements Learning | 30
An Effective 3D Text Recurrent Voting Generator for Metaverse | 30
Emotion Recognition From Few-Channel EEG Signals by Integrating Deep Feature Aggregation and Transfer Learning | 30
Affective Dynamics and Cognition During Game-Based Learning | 30
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | 30
Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals | 29
A Residual Multi-Scale Convolutional Neural Network With Transformers for Speech Emotion Recognition | 29
Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos | 29
Emotions Like Human: Self-Supervised Emotion Label Augmentation for Emotion Recognition in Conversation | 29
Multi-Scale Hyperbolic Contrastive Learning for Cross-Subject EEG Emotion Recognition | 29
Emotion Intensity and its Control for Emotional Voice Conversion | 29
SCARE: A Novel Framework to Enhance Chinese Harmful Memes Detection | 29
Recognizing, Fast and Slow: Complex Emotion Recognition With Facial Expression Detection and Remote Physiological Measurement | 29
A Multi-Level Alignment and Cross-Modal Unified Semantic Graph Refinement Network for Conversational Emotion Recognition | 29
LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition Using Graph Neural Networks | 29
Hierarchical Knowledge Stripping for Multimodal Sentiment Analysis | 29
Learning to Rank Onset-Occurring-Offset Representations for Micro-Expression Recognition | 28
Empathetic Response Generation Through Multi-Modality | 28
Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis | 27
Multi-Order Networks for Action Unit Detection | 27
Deep Learning Techniques for Text-based Emotional Response Generation: A Systematic Review | 27
Facial Expression Animation by Landmark Guided Residual Module | 27
DECEPTIcON: Bridging Gaps in In-the-Wild Deception Research | 27
Step-wise Prompting Meets Uncertainty-Aware Dynamic Fusion for Robust EEG-Visual Emotion Recognition | 26
VyaktitvaNirdharan: Multimodal Assessment of Personality and Trait Emotional Intelligence | 26
Effects of Physiological Signals in Different Types of Multimodal Sentiment Estimation | 26
Emotion Embeddings — Learning Stable and Homogeneous Abstractions from Heterogeneous Affective Datasets | 26
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 26
Percussion and Instrumentation in Music Emotion Recognition: a Feature Engineering Approach | 26
Social Image–Text Sentiment Classification With Cross-Modal Consistency and Knowledge Distillation | 26
Deep Learning for Micro-Expression Recognition: A Survey | 25
Persuasion-Induced Physiology as Predictor of Persuasion Effectiveness | 25
Interview-based Depression Detection Using LLM-based Text Restatement and Emotion Lexicon | 25
SynSem-ASTE: An Enhanced Multi-Encoder Network for Aspect Sentiment Triplet Extraction With Syntax and Semantics | 25
Mental Stress Assessment in the Workplace: A Review | 25
AT2GRU: A Human Emotion Recognition Model With Mitigated Device Heterogeneity | 25
Detection and Identification of Choking Under Pressure in College Tennis Based Upon Physiological Parameters, Performance Patterns, and Game Statistics | 25
Towards Participant-Independent Stress Detection Using Instrumented Peripherals | 25
STREL - Naturalistic Dataset and Methods for Studying Mental Stress and Relaxation Patterns in Critical Leading Roles | 25
Detecting Mental Disorders in Social Media Through Emotional Patterns - The Case of Anorexia and Depression | 24
Mind AI's Mind: A Clinically Aligned Explainable AI Pipeline for Depression Diagnosis via Large Language Models | 24
Beyond Overfitting: Doubly Adaptive Dropout for Generalizable AU Detection | 24
Autonomic Modulations to Cardiac Dynamics in Response to Affective Touch: Differences Between Social Touch and Self-Touch | 24
LibEER: A Comprehensive Benchmark and Algorithm Library for EEG-Based Emotion Recognition | 23
Neuro or Symbolic? Fine-Tuned Transformer With Unsupervised LDA Topic Clustering for Text Sentiment Analysis | 23
Conveying Emotions Through Device-Initiated Touch | 23
Examining Emotion Perception Agreement in Live Music Performance | 23
Affective-ROPTester: Capability and Bias Analysis of LLMs in Predicting Retinopathy of Prematurity | 23
MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection | 22
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities | 22
ENGAGE-DEM: A Model of Engagement of People With Dementia | 22
Implicit Knowledge and Emotional Cues-Enhanced Multimodal Sarcasm Detection Model | 22
Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders | 22
CSE-GResNet: A Simple and Highly Efficient Network for Facial Expression Recognition | 21
Deep Multi-Task Multi-Label CNN for Effective Facial Attribute Classification | 21
Guest Editorial Neurosymbolic AI for Sentiment Analysis | 21
Unsupervised Time-Aware Sampling Network With Deep Reinforcement Learning for EEG-Based Emotion Recognition | 21
Global Entity Relationship Enhancement Network for Multimodal Sarcasm Detection | 21
Facial Expression Recognition With Vision Transformer Using Fused Shifted Windows | 21
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 21
Improving Multi-Label Facial Expression Recognition With Consistent and Distinct Attentions | 21
Multi-Stage Graph Fusion Networks for Major Depressive Disorder Diagnosis | 21
AL-HCL: Active Learning and Hierarchical Contrastive Learning for Multimodal Sentiment Analysis with Fusion Guidance | 20
Analyzing the Visual Road Scene for Driver Stress Estimation | 20
Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation | 20
Investigating Cardiovascular Activation of Young Adults in Routine Driving | 20
Investigating the Effects of Sleep Conditions on Emotion Responses with EEG Signals and Eye Movements | 20
SeeNet: A Soft Emotion Expert and Data Augmentation Method to Enhance Speech Emotion Recognition | 20
Enhancing Emotional Congruence in Sensory Substitution | 20
Editorial | 20
Exploring Multivariate Dynamics of Emotions Through Time-Varying Self-Assessed Arousal and Valence Ratings | 20
Enhancing EEG-Based Decision-Making Performance Prediction by Maximizing Mutual Information Between Emotion and Decision-Relevant Features | 20
From Translation to Generative LLMs: Classification of Code-Mixed Affective Tasks | 19
From the Lab to the Wild: Affect Modeling Via Privileged Information | 19
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 19
Meta-Based Self-Training and Re-Weighting for Aspect-Based Sentiment Analysis | 19
ER-Chat: A Text-to-Text Open-Domain Dialogue Framework for Emotion Regulation | 19
Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game | 19
Towards Multimodal Prediction of Spontaneous Humor: A Novel Dataset and First Results | 19
A Reinforcement Learning Based Two-Stage Model for Emotion Cause Pair Extraction | 19
Aspect-Opinion Correlation Aware and Knowledge-Expansion Few Shot Cross-Domain Sentiment Classification | 19
RVISA: Reasoning and Verification for Implicit Sentiment Analysis | 19
EEG-Based Emotion Recognition via Neural Architecture Search | 19
Quantitative Personality Predictions From a Brief EEG Recording | 19