IEEE Transactions on Affective Computing

Papers
(The TQCC of IEEE Transactions on Affective Computing is 17. The table below lists the papers at or above that threshold, based on CrossRef citation counts [max. 250 papers]. The list covers publications from the past four years, i.e., from 2021-05-01 to 2025-05-01.)
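The selection described above is essentially a filter-and-rank over citation records: keep papers published within the four-year window whose citation count meets the TQCC threshold, sort by citations in descending order, and cap the list at 250 entries. The sketch below is a minimal illustration of that procedure under those stated assumptions, not the actual pipeline behind this page; the `Record` type, its field names, and `select_top_cited` are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record structure; field names are illustrative, not CrossRef's schema.
@dataclass
class Record:
    title: str
    citations: int
    published: date

def select_top_cited(records, tqcc=17, start=date(2021, 5, 1),
                     end=date(2025, 5, 1), cap=250):
    """Keep papers published in [start, end] with citations >= tqcc,
    ranked by citation count (descending), at most `cap` entries."""
    eligible = [r for r in records
                if start <= r.published <= end and r.citations >= tqcc]
    eligible.sort(key=lambda r: r.citations, reverse=True)
    return eligible[:cap]

# Usage with made-up data:
papers = [
    Record("Paper A", 818, date(2021, 6, 15)),
    Record("Paper B", 12, date(2022, 1, 10)),   # below threshold, dropped
    Record("Paper C", 17, date(2024, 11, 3)),
]
print([p.title for p in select_top_cited(papers)])  # ['Paper A', 'Paper C']
```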
Article | Citations
Effects of Computerized Emotional Training on Children with High Functioning Autism | 818
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 441
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 421
Sparse Emotion Dictionary and CWT Spectrogram Fusion with Multi-head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 276
From EEG to Eye Movements: Cross-modal Emotion Recognition Using Constrained Adversarial Network with Dual Attention | 255
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 195
Towards Contrastive Context-Aware Conversational Emotion Recognition | 190
Perceived Conversation Quality in Spontaneous Interactions | 187
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 183
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 171
Mouse-cursor Tracking: Simple Scoring Algorithms That Make It Work | 159
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 150
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 141
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 138
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 136
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 123
ECPEC: Emotion-Cause Pair Extraction in Conversations | 119
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 110
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 104
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 102
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 89
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 86
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 83
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 83
Facial Image-Based Automatic Assessment of Equine Pain | 81
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 78
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 78
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 77
Using Circular Models to Improve Music Emotion Recognition | 77
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach with Emotional EEG Style Transfer Network | 76
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 75
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 73
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 73
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 71
Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration | 70
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 69
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 69
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 68
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 67
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 66
Review on Psychological Stress Detection Using Biosignals | 66
MECA: Manipulation with Emotional Intensity-aware Contrastive Learning and Attention-based Discriminative learning | 62
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 62
TFAGL: A Novel Agent Graph Learning Method Using Time-Frequency EEG For Major Depressive Disorder Detection | 60
Fusion and Discrimination: A Multimodal Graph Contrastive Learning Framework for Multimodal Sarcasm Detection | 60
LGSNet: A Two-Stream Network for Micro- and Macro-Expression Spotting With Background Modeling | 58
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 57
Improved Video Emotion Recognition With Alignment of CNN and Human Brain Representations | 56
Non-Invasive Measurement of Trust in Group Interactions | 56
Dynamic Confidence-Aware Multi-Modal Emotion Recognition | 56
An Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition | 56
A Micro-Expression Recognition Network Based on Attention Mechanism and Motion Magnification | 55
Leveraging the Deep Learning Paradigm for Continuous Affect Estimation from Facial Expressions | 55
Towards Cyberbullying Detection: Building, Benchmarking and Longitudinal Analysis of Aggressiveness and Conflicts/Attacks Datasets from Twitter | 53
Effects of Algorithmic Transparency on User Experience and Physiological Responses in Affect-Aware Task Adaptation | 53
Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli | 53
GHA: a Gated Hierarchical Attention Mechanism for the Detection of Abusive Language in Social Media | 52
SEED-VII: A Multimodal Dataset of Six Basic Emotions with Continuous Labels for Emotion Recognition | 51
Multi-Label and Multimodal Classifier for Affective States Recognition in Virtual Rehabilitation | 51
Multi-Party Conversation Modeling for Emotion Recognition | 50
Long Short-Term Memory Network Based Unobtrusive Workload Monitoring With Consumer Grade Smartwatches | 50
Guest Editorial: Special Issue on Affective Speech and Language Synthesis, Generation, and Conversion | 49
A New Perspective on Stress Detection: An Automated Approach for Detecting Eustress and Distress | 49
Modeling Category Semantic and Sentiment Knowledge for Aspect-Level Sentiment Analysis | 48
Hierarchical Shared Encoder With Task-Specific Transformer Layer Selection for Emotion-Cause Pair Extraction | 48
Annotate Smarter, not Harder: Using Active Learning to Reduce Emotional Annotation Effort | 46
Exploring Complexity of Facial Dynamics in Autism Spectrum Disorder | 46
Nonverbal Leadership in Joint Full-Body Improvisation | 46
AGILE: Attribute-Guided Identity Independent Learning for Facial Expression Recognition | 46
An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming | 45
Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN | 44
Group Synchrony for Emotion Recognition Using Physiological Signals | 44
Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning | 44
Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition | 44
Partial Label Learning for Emotion Recognition from EEG | 44
Dynamic Micro-Expression Recognition Using Knowledge Distillation | 43
Improving Cross-Corpus Speech Emotion Recognition with Adversarial Discriminative Domain Generalization (ADDoG) | 42
Objective Class-Based Micro-Expression Recognition Under Partial Occlusion Via Region-Inspired Relation Reasoning Network | 41
Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation | 41
Multimodal Deception Detection Using Real-Life Trial Data | 40
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition | 40
Hierarchical Encoding and Fusion of Brain Functions for Depression Subtype Classification | 39
Methodology to Assess Quality, Presence, Empathy, Attitude, and Attention in 360-degree Videos for Immersive Communications | 39
State-Specific and Supraordinal Components of Facial Response to Pain | 39
Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing | 39
Multilevel Longitudinal Analysis of Shooting Performance as a Function of Stress and Cardiovascular Responses | 39
Aspect-Based Sentiment Quantification | 39
Leveraging Social Media for Real-Time Interpretable and Amendable Suicide Risk Prediction With Human-in-The-Loop | 37
Micro and Macro Facial Expression Recognition Using Advanced Local Motion Patterns | 37
STAA-Net: A Sparse and Transferable Adversarial Attack for Speech Emotion Recognition | 37
Deep Adaptation of Adult-Child Facial Expressions by Fusing Landmark Features | 37
Ordinal Logistic Regression With Partial Proportional Odds for Depression Prediction | 37
Geometric Graph Representation With Learnable Graph Structure and Adaptive AU Constraint for Micro-Expression Recognition | 36
Theory of Mind Abilities Predict Robot's Gaze Effects on Object Preference | 36
A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios | 36
Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis | 35
EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition | 35
Boosting Micro-Expression Recognition via Self-Expression Reconstruction and Memory Contrastive Learning | 35
Emotion Distribution Learning Based on Peripheral Physiological Signals | 34
A Scalable Off-the-Shelf Framework for Measuring Patterns of Attention in Young Children and Its Application in Autism Spectrum Disorder | 34
Versatile Audio-Visual Learning for Emotion Recognition | 33
Integrating Deep Facial Priors Into Landmarks for Privacy Preserving Multimodal Depression Recognition | 33
A Psychologically Inspired Fuzzy Cognitive Deep Learning Framework to Predict Crowd Behavior | 32
The Rhythm of Flow: Detecting Facial Expressions of Flow Experiences Using CNNs | 32
Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation | 32
Text-Based Fine-Grained Emotion Prediction | 31
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition | 31
Estimating Affective Taste Experience Using Combined Implicit Behavioral and Neurophysiological Measures | 31
Self-Supervised ECG Representation Learning for Emotion Recognition | 31
Survey of Deep Representation Learning for Speech Emotion Recognition | 31
Classification of Interbeat Interval Time-Series Using Attention Entropy | 31
FERMixNet: An Occlusion Robust Facial Expression Recognition Model with Facial Mixing Augmentation and Mid-Level Representation Learning | 31
EmoTake: Exploring Drivers’ Emotion for Takeover Behavior Prediction | 30
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition | 30
Semantic and Emotional Dual Channel for Emotion Recognition in Conversation | 30
Stimulus-Response Pattern: The Core of Robust Cross-stimulus Facial Depression Recognition | 29
Leveraging the Dynamics of Non-Verbal Behaviors For Social Attitude Modeling | 29
Distant Handshakes: Conveying Social Intentions Through Multi-Modal Soft Haptic Gloves | 29
Psychophysiological Reactions to Persuasive Messages Deploying Persuasion Principles | 29
Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition? | 28
Towards Emotion-Aware Agents for Improved User Satisfaction and Partner Perception in Negotiation Dialogues | 28
I Enjoy Writing and Playing, Do You?: A Personalized and Emotion Grounded Dialogue Agent Using Generative Adversarial Network | 28
Fake News, Real Emotions: Emotion Analysis of COVID-19 Infodemic in Weibo | 28
From What You See to What We Smell: Linking Human Emotions to Bio-Markers in Breath | 28
Exploring Emotion Expression Recognition in Older Adults Interacting with a Virtual Coach | 28
Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition | 27
CorMulT: a Semi-supervised Modality Correlation-aware Multimodal Transformer for Sentiment Analysis | 27
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | 27
The Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games | 27
Emotion Recognition From Few-Channel EEG Signals by Integrating Deep Feature Aggregation and Transfer Learning | 27
Affective Touch via Haptic Interfaces: A Sequential Indentation Approach | 27
Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods | 27
The Role of Preprocessing for Word Representation Learning in Affective Tasks | 27
A Survey of Textual Emotion Recognition and Its Challenges | 26
A Residual Multi-Scale Convolutional Neural Network with Transformers for Speech Emotion Recognition | 26
Hierarchical Knowledge Stripping for Multimodal Sentiment Analysis | 26
Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition | 26
VyaktitvaNirdharan: Multimodal Assessment of Personality and Trait Emotional Intelligence | 26
Recognizing, Fast and Slow: Complex Emotion Recognition With Facial Expression Detection and Remote Physiological Measurement | 25
Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis | 25
Multi-Order Networks for Action Unit Detection | 25
Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals | 25
Affective Dynamics and Cognition During Game-Based Learning | 25
Facial Expression Animation by Landmark Guided Residual Module | 25
SCARE: A Novel Framework to Enhance Chinese Harmful Memes Detection | 24
A Multi-Level Alignment and Cross-Modal Unified Semantic Graph Refinement Network for Conversational Emotion Recognition | 24
LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition using Graph Neural Networks | 24
Deep Facial Action Unit Recognition and Intensity Estimation from Partially Labelled Data | 24
An Effective 3D Text Recurrent Voting Generator for Metaverse | 24
Multi-Scale Hyperbolic Contrastive Learning for Cross-Subject EEG Emotion Recognition | 24
Emotion Intensity and its Control for Emotional Voice Conversion | 23
Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos | 23
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 23
Deep Learning for Micro-Expression Recognition: A Survey | 23
Effects of Physiological Signals in Different Types of Multimodal Sentiment Estimation | 23
Modeling Emotion in Complex Stories: The Stanford Emotional Narratives Dataset | 23
Incorporating Forthcoming Events and Personality Traits in Social Media Based Stress Prediction | 23
Unconstrained Facial Expression Recognition With No-Reference De-Elements Learning | 23
Autonomic modulations to cardiac dynamics in response to affective touch: Differences between social touch and self-touch | 22
Persuasion-Induced Physiology as Predictor of Persuasion Effectiveness | 22
Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders | 22
Social Image–Text Sentiment Classification With Cross-Modal Consistency and Knowledge Distillation | 22
Beyond Overfitting: Doubly Adaptive Dropout for Generalizable AU Detection | 22
SynSem-ASTE: An Enhanced Multi-Encoder Network for Aspect Sentiment Triplet Extraction With Syntax and Semantics | 22
CSE-GResNet: A Simple and Highly Efficient Network for Facial Expression Recognition | 22
Detection and Identification of Choking Under Pressure in College Tennis Based Upon Physiological Parameters, Performance Patterns, and Game Statistics | 22
Mental Stress Assessment in the Workplace: A Review | 21
ENGAGE-DEM: A Model of Engagement of People With Dementia | 21
Towards Participant-Independent Stress Detection Using Instrumented Peripherals | 21
Unsupervised Time-Aware Sampling Network With Deep Reinforcement Learning for EEG-Based Emotion Recognition | 21
Detecting Mental Disorders in Social Media Through Emotional Patterns - The Case of Anorexia and Depression | 21
Neuro or Symbolic? Fine-Tuned Transformer With Unsupervised LDA Topic Clustering for Text Sentiment Analysis | 21
MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection | 21
Examining Emotion Perception Agreement in Live Music Performance | 21
AT2GRU: A Human Emotion Recognition Model With Mitigated Device Heterogeneity | 20
Multi-Stage Graph Fusion Networks for Major Depressive Disorder Diagnosis | 20
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities | 20
Deep Multi-Task Multi-Label CNN for Effective Facial Attribute Classification | 19
Enhancing EEG-Based Decision-Making Performance Prediction by Maximizing Mutual Information Between Emotion and Decision-Relevant Features | 19
A Reinforcement Learning Based Two-Stage Model for Emotion Cause Pair Extraction | 19
Quantitative Personality Predictions From a Brief EEG Recording | 19
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 19
Aspect-Opinion Correlation Aware and Knowledge-Expansion Few Shot Cross-Domain Sentiment Classification | 19
Facial Expression Recognition With Vision Transformer Using Fused Shifted Windows | 19
Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation | 19
ER-Chat: A Text-to-Text Open-Domain Dialogue Framework for Emotion Regulation | 19
Conveying Emotions Through Device-Initiated Touch | 19
Investigating Cardiovascular Activation of Young Adults in Routine Driving | 19
Improving Multi-Label Facial Expression Recognition With Consistent and Distinct Attentions | 19
Guest Editorial Neurosymbolic AI for Sentiment Analysis | 19
SeeNet: A Soft Emotion Expert and Data Augmentation Method to Enhance Speech Emotion Recognition | 18
Meta-Based Self-Training and Re-Weighting for Aspect-Based Sentiment Analysis | 18
From Translation to Generative LLMs: Classification of Code-Mixed Affective Tasks | 18
From the Lab to the Wild: Affect Modeling Via Privileged Information | 18
RVISA: Reasoning and Verification for Implicit Sentiment Analysis | 18
Towards Multimodal Prediction of Spontaneous Humor: A Novel Dataset and First Results | 18
Exploring Multivariate Dynamics of Emotions Through Time-Varying Self-Assessed Arousal and Valence Ratings | 18
Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game | 18
Analyzing the Visual Road Scene for Driver Stress Estimation | 18
Avatar-Based Feedback in Job Interview Training Impacts Action Identities and Anxiety | 17
The ForDigitStress Dataset: A Multi-Modal Dataset for Automatic Stress Recognition | 17
EEG-Based Emotion Recognition via Neural Architecture Search | 17
A Multi-Stage Visual Perception Approach for Image Emotion Analysis | 17
Human Emotion Recognition With Relational Region-Level Analysis | 17
Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications | 17
Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition | 17
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 17