IEEE Transactions on Affective Computing

Papers
(The median citation count of IEEE Transactions on Affective Computing is 5. The table below lists the papers that exceed that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., from 2020-11-01 to 2024-11-01.)
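As an illustration of how such a list can be compiled, the sketch below queries the CrossRef REST API for works published in the journal within the stated date window, takes CrossRef's is-referenced-by-count field as the citation count, computes the median, and keeps the papers above it. The ISSN 1949-3045 and the exact endpoint and field choices are assumptions about how the counts were gathered, not details confirmed by this page.

# Minimal sketch, assuming the list was built from the CrossRef REST API.
# The ISSN, field names, and helper names below are illustrative assumptions.
import statistics
import requests

ISSN = "1949-3045"  # IEEE Transactions on Affective Computing (assumed ISSN)
BASE = f"https://api.crossref.org/journals/{ISSN}/works"
FILTER = "from-pub-date:2020-11-01,until-pub-date:2024-11-01"

def fetch_works():
    """Page through all works in the date window using CrossRef cursor paging."""
    cursor, items = "*", []
    while True:
        resp = requests.get(
            BASE,
            params={
                "filter": FILTER,
                "select": "title,is-referenced-by-count",
                "rows": 500,
                "cursor": cursor,
            },
            timeout=30,
        )
        resp.raise_for_status()
        message = resp.json()["message"]
        if not message["items"]:
            return items
        items.extend(message["items"])
        cursor = message["next-cursor"]

works = fetch_works()
counts = [w.get("is-referenced-by-count", 0) for w in works]
median = statistics.median(counts)  # reported as 5 in the note above

above = [w for w in works if w.get("is-referenced-by-count", 0) > median]
above.sort(key=lambda w: w["is-referenced-by-count"], reverse=True)

for w in above[:250]:  # [max. 250 papers]
    title = w["title"][0] if w.get("title") else "(untitled)"
    print(f"{title} | {w['is-referenced-by-count']}")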
Article | Citations
Deep Facial Expression Recognition: A Survey | 663
Review on Psychological Stress Detection Using Biosignals | 326
EEG-Based Emotion Recognition Using Regularized Graph Neural Networks | 321
AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups | 274
GCB-Net: Graph Convolutional Broad Network and Its Application in Emotion Recognition | 211
Survey on Emotional Body Gesture Recognition | 208
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 199
A Bi-Hemisphere Domain Adversarial Neural Network Model for EEG Emotion Recognition | 159
Self-Supervised ECG Representation Learning for Emotion Recognition | 140
Issues and Challenges of Aspect-based Sentiment Analysis: A Comprehensive Survey | 136
An Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals | 131
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition | 129
Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity | 129
Classifying Emotions and Engagement in Online Learning Based on a Single Facial Expression Recognition Neural Network | 126
Utilizing Deep Learning Towards Multi-Modal Bio-Sensing and Vision-Based Affective Computing | 118
Deep Learning for Human Affect Recognition: Insights and New Developments | 114
Automatic Recognition Methods Supporting Pain Assessment: A Survey | 113
Beneath the Tip of the Iceberg: Current Challenges and New Directions in Sentiment Analysis Research | 103
Facial Expression Recognition With Visual Transformers and Attentional Selective Fusion | 103
Exploiting Multi-CNN Features in CNN-RNN Based Dimensional Emotion Recognition on the OMG in-the-Wild Dataset | 97
An EEG-Based Brain Computer Interface for Emotion Recognition and Its Application in Patients with Disorder of Consciousness | 91
Video-Based Depression Level Analysis by Encoding Deep Spatiotemporal Features | 73
A Review on Nonlinear Methods Using Electroencephalographic Recordings for Emotion Recognition | 70
Integrating Deep and Shallow Models for Multi-Modal Depression Analysis—Hybrid Architectures | 68
TSception: Capturing Temporal Dynamics and Spatial Asymmetry From EEG for Emotion Recognition | 68
Facial Expression Recognition with Identity and Emotion Joint Learning | 67
Spontaneous Speech Emotion Recognition Using Multiscale Deep Convolutional LSTM | 65
Improving Cross-Corpus Speech Emotion Recognition with Adversarial Discriminative Domain Generalization (ADDoG) | 64
An Active Learning Paradigm for Online Audio-Visual Emotion Recognition | 62
An Improved Empirical Mode Decomposition of Electroencephalogram Signals for Depression Detection | 62
The Biases of Pre-Trained Language Models: An Empirical Study on Prompt-Based Sentiment Analysis and Emotion Detection | 62
The Ordinal Nature of Emotions: An Emerging Approach | 62
A Deeper Look at Facial Expression Dataset Bias | 59
Spectral Representation of Behaviour Primitives for Depression Analysis | 59
Multi-Task Semi-Supervised Adversarial Autoencoding for Speech Emotion Recognition | 58
Facial Action Unit Detection Using Attention and Relation Learning | 56
Virtual Reality for Emotion Elicitation – A Review | 55
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 55
Induction and Profiling of Strong Multi-Componential Emotions in Virtual Reality | 54
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition | 52
All-in-One: Emotion, Sentiment and Intensity Prediction Using a Multi-Task Ensemble Framework | 52
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 51
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 50
Facial Expression Recognition With Deeply-Supervised Attention Network | 49
Hybrid Contrastive Learning of Tri-Modal Representation for Multimodal Sentiment Analysis | 49
Dynamic Micro-Expression Recognition Using Knowledge Distillation | 47
First Impressions: A Survey on Vision-Based Apparent Personality Trait Analysis | 46
Computer Vision Analysis for Quantification of Autism Risk Behaviors | 44
Multi-Modal Pain Intensity Recognition Based on the SenseEmotion Database | 43
Adapting Software with Affective Computing: A Systematic Review | 42
Deep Learning for Micro-Expression Recognition: A Survey | 42
A Survey of Textual Emotion Recognition and Its Challenges | 42
Audio Features for Music Emotion Recognition: A Survey | 40
Multimodal Spatiotemporal Representation for Automatic Depression Level Detection | 40
Multi-Feature Based Network Revealing the Structural Abnormalities in Autism Spectrum Disorder | 40
Inter-Brain EEG Feature Extraction and Analysis for Continuous Implicit Emotion Tagging During Video Watching | 40
Feature Pooling of Modulation Spectrum Features for Improved Speech Emotion Recognition in the Wild | 40
GANSER: A Self-Supervised Data Augmentation Framework for EEG-Based Emotion Recognition | 39
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 39
Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition | 39
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 39
Beyond Mobile Apps: A Survey of Technologies for Mental Well-Being | 38
Automatic Detection of Mind Wandering from Video in the Lab and in the Classroom | 38
Modeling, Recognizing, and Explaining Apparent Personality From Videos | 38
Survey of Deep Representation Learning for Speech Emotion Recognition | 37
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 37
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition | 37
Multimodal Engagement Analysis From Facial Videos in the Classroom | 37
ACSEE: Antagonistic Crowd Simulation Model With Emotional Contagion and Evolutionary Game Theory | 36
A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios | 36
Recognizing Induced Emotions of Movie Audiences from Multimodal Information | 36
Meta-Based Self-Training and Re-Weighting for Aspect-Based Sentiment Analysis | 35
FLEPNet: Feature Level Ensemble Parallel Network for Facial Expression Recognition | 35
Artificial Emotional Intelligence in Socially Assistive Robots for Older Adults: A Pilot Study | 35
MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection | 35
Cross-Cultural and Cultural-Specific Production and Perception of Facial Expressions of Emotion in the Wild | 35
Emotion Recognition for Everyday Life Using Physiological Signals From Wearables: A Systematic Literature Review | 35
The MatchNMingle Dataset: A Novel Multi-Sensor Resource for the Analysis of Social Interactions and Group Dynamics In-the-Wild During Free-Standing Conversations and Speed Dates | 34
EmoBed: Strengthening Monomodal Emotion Recognition via Training with Crossmodal Emotion Embeddings | 34
Multi-Fusion Residual Memory Network for Multimodal Human Sentiment Comprehension | 34
Modeling Emotion in Complex Stories: The Stanford Emotional Narratives Dataset | 34
Adapted Dynamic Memory Network for Emotion Recognition in Conversation | 33
SchiNet: Automatic Estimation of Symptoms of Schizophrenia from Facial Behaviour Analysis | 33
Depression Level Prediction Using Deep Spatiotemporal Features and Multilayer Bi-LTSM | 33
Improving Attention Model Based on Cognition Grounded Data for Sentiment Analysis | 33
Interpretation of Depression Detection Models via Feature Selection Methods | 33
GMSS: Graph-Based Multi-Task Self-Supervised Learning for EEG Emotion Recognition | 33
The Multimodal Sentiment Analysis in Car Reviews (MuSe-CaR) Dataset: Collection, Insights and Improvements | 33
EmoNet: A Transfer Learning Framework for Multi-Corpus Speech Emotion Recognition | 32
A Bayesian Deep Learning Framework for End-To-End Prediction of Emotion From Heartbeat | 32
Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis | 32
Unsupervised Learning in Reservoir Computing for EEG-Based Emotion Recognition | 30
Acute Stress State Classification Based on Electrodermal Activity Modeling | 30
Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition | 30
Variational Instance-Adaptive Graph for EEG Emotion Recognition | 30
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | 30
Classification of Interbeat Interval Time-Series Using Attention Entropy | 29
A Deep Multiscale Spatiotemporal Network for Assessing Depression From Facial Dynamics | 29
A Novel Classification Strategy to Distinguish Five Levels of Pain Using the EEG Signal Features | 29
Evoking Physiological Synchrony and Empathy Using Social VR With Biofeedback | 29
A Comprehensive and Context-Sensitive Neonatal Pain Assessment Using Computer Vision | 28
Hierarchical Interactive Multimodal Transformer for Aspect-Based Multimodal Sentiment Analysis | 28
A Novel Sentiment Polarity Detection Framework for Chinese | 27
PARSE: Pairwise Alignment of Representations in Semi-Supervised EEG Learning for Emotion Recognition | 27
Exploring Self-Attention Graph Pooling With EEG-Based Topological Structure and Soft Label for Depression Detection | 27
E-Key: An EEG-Based Biometric Authentication and Driving Fatigue Detection System | 27
Deep Multi-Task Multi-Label CNN for Effective Facial Attribute Classification | 27
EEG-Based Emotion Recognition via Neural Architecture Search | 26
Conveying Emotions Through Device-Initiated Touch | 26
Micro and Macro Facial Expression Recognition Using Advanced Local Motion Patterns | 26
Neural Attentive Network for Cross-Domain Aspect-Level Sentiment Classification | 26
Empirical Evidence Relating EEG Signal Duration to Emotion Classification Performance | 26
Deep Learning for Spatio-Temporal Modeling of Dynamic Spontaneous Emotions | 26
Automatic Emotion Recognition for Groups: A Review | 26
A Psychologically Inspired Fuzzy Cognitive Deep Learning Framework to Predict Crowd Behavior | 25
Speech-Driven Expressive Talking Lips with Conditional Sequential Generative Adversarial Networks | 25
STCAM: Spatial-Temporal and Channel Attention Module for Dynamic Facial Expression Recognition | 25
Multiple Instance Learning for Emotion Recognition Using Physiological Signals | 25
Unraveling ML Models of Emotion With NOVA: Multi-Level Explainable AI for Non-Experts | 25
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 24
Local Temporal Pattern and Data Augmentation for Spotting Micro-Expressions | 24
Automatic Recognition of Facial Displays of Unfelt Emotions | 24
Multi-Label Emotion Detection via Emotion-Specified Feature Extraction and Emotion Correlation Learning | 24
Investigation of Speech Landmark Patterns for Depression Detection | 23
Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning | 23
Personality Traits Classification Using Deep Visual Activity-Based Nonverbal Features of Key-Dynamic Images | 23
PersEmoN: A Deep Network for Joint Analysis of Apparent Personality, Emotion and Their Relationship | 23
EEG-Based Emotion Recognition With Emotion Localization via Hierarchical Self-Attention | 23
BReG-NeXt: Facial Affect Computing Using Adaptive Residual Networks With Bounded Gradient | 23
Self-Supervised Learning of Person-Specific Facial Dynamics for Automatic Personality Recognition | 23
MERASTC: Micro-Expression Recognition Using Effective Feature Encodings and 2D Convolutional Neural Network | 23
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 23
Spatio-Temporal Encoder-Decoder Fully Convolutional Network for Video-Based Dimensional Emotion Recognition | 22
On the Influence of Affect in EEG-Based Subject Identification | 22
DepecheMood++: A Bilingual Emotion Lexicon Built Through Simple Yet Powerful Techniques | 22
ENGAGE-DEM: A Model of Engagement of People With Dementia | 22
Multimodal Deception Detection Using Real-Life Trial Data | 22
Stimulus Sampling With 360-Videos: Examining Head Movements, Arousal, Presence, Simulator Sickness, and Preference on a Large Sample of Participants and Videos | 21
Exploring the Contextual Factors Affecting Multimodal Emotion Recognition in Videos | 21
Self Supervised Adversarial Domain Adaptation for Cross-Corpus and Cross-Language Speech Emotion Recognition | 21
EmoSen: Generating Sentiment and Emotion Controlled Responses in a Multimodal Dialogue System | 21
Regression Guided by Relative Ranking Using Convolutional Neural Network (R3 CNN) for Facial Beauty Prediction | 21
Towards Transparent Robot Learning Through TDRL-Based Emotional Expressions | 21
On the Effect of Observed Subject Biases in Apparent Personality Analysis From Audio-Visual Signals | 21
Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN | 20
Two-Stage Fuzzy Fusion Based-Convolution Neural Network for Dynamic Emotion Recognition | 20
Chunk-Level Speech Emotion Recognition: A General Framework of Sequence-to-One Dynamic Temporal Modeling | 20
Exploring Individual Differences of Public Speaking Anxiety in Real-Life and Virtual Presentations | 20
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 20
Deep Multi-Modal Network Based Automated Depression Severity Estimation | 20
ICA-Evolution Based Data Augmentation with Ensemble Deep Neural Networks Using Time and Frequency Kernels for Emotion Recognition from EEG-Data | 20
Identifying Cortical Brain Directed Connectivity Networks From High-Density EEG for Emotion Recognition | 19
Deep Siamese Neural Networks for Facial Expression Recognition in the Wild | 19
A Multi-Modal Stacked Ensemble Model for Bipolar Disorder Classification | 19
EEG-Based Subject-Independent Emotion Recognition Using Gated Recurrent Unit and Minimum Class Confusion | 19
A Transfer Learning Approach to Heatmap Regression for Action Unit Intensity Estimation | 19
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 19
Deep Temporal Analysis for Non-Acted Body Affect Recognition | 19
Phase Space Reconstruction Driven Spatio-Temporal Feature Learning for Dynamic Facial Expression Recognition | 19
Emotion Intensity and its Control for Emotional Voice Conversion | 19
A Scalable Off-the-Shelf Framework for Measuring Patterns of Attention in Young Children and Its Application in Autism Spectrum Disorder | 18
Toward Robust Stress Prediction in the Age of Wearables: Modeling Perceived Stress in a Longitudinal Study With Information Workers | 18
Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition? | 18
What’s Your Laughter Doing There? A Taxonomy of the Pragmatic Functions of Laughter | 18
Ethics and Good Practice in Computational Paralinguistics | 18
Bio-Inspired Deep Attribute Learning Towards Facial Aesthetic Prediction | 18
Recognition of Advertisement Emotions With Application to Computational Advertising | 18
Towards a Prediction and Data Driven Computational Process Model of Emotion | 18
Joint Feature Adaptation and Graph Adaptive Label Propagation for Cross-Subject Emotion Recognition From EEG Signals | 17
Multimodal Classification of Stressful Environments in Visually Impaired Mobility Using EEG and Peripheral Biosignals | 17
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 17
The Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games | 17
Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition | 17
EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition | 17
Disentangling Identity and Pose for Facial Expression Recognition | 17
A Multimodal Non-Intrusive Stress Monitoring From the Pleasure-Arousal Emotional Dimensions | 16
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 16
Graph-Based Facial Affect Analysis: A Review | 16
LQGDNet: A Local Quaternion and Global Deep Network for Facial Depression Recognition | 16
Detecting Mental Disorders in Social Media Through Emotional Patterns - The Case of Anorexia and Depression | 16
Depression Recognition Using Remote Photoplethysmography From Facial Videos | 16
Affective Dynamics: Causality Modeling of Temporally Evolving Perceptual and Affective Responses | 16
The Arousal Video Game AnnotatIoN (AGAIN) Dataset | 16
Aspect-Based Sentiment Analysis with New Target Representation and Dependency Attention | 15
FaceEngage: Robust Estimation of Gameplay Engagement from User-Contributed (YouTube) Videos | 15
Longitudinal Observational Evidence of the Impact of Emotion Regulation Strategies on Affective Expression | 15
An Emotion Recognition Method for Game Evaluation Based on Electroencephalogram | 15
Driver Emotion Recognition With a Hybrid Attentional Multimodal Fusion Framework | 15
Personal-Zscore: Eliminating Individual Difference for EEG-Based Cross-Subject Emotion Recognition | 15
Geometry-Aware Facial Expression Recognition via Attentive Graph Convolutional Networks | 15
On the Influence of Shot Scale on Film Mood and Narrative Engagement in Film Viewers | 15
Leaders and Followers Identified by Emotional Mimicry During Collaborative Learning: A Facial Expression Recognition Study on Emotional Valence | 14
Affect Estimation in 3D Space Using Multi-Task Active Learning for Regression | 14
4DME: A Spontaneous 4D Micro-Expression Dataset With Multimodalities | 14
An Overview of Facial Micro-Expression Analysis: Data, Methodology and Challenge | 14
Leveraging Affective Hashtags for Ranking Music Recommendations | 14
Boosting Facial Expression Recognition by A Semi-Supervised Progressive Teacher | 14
Designing an Experience Sampling Method for Smartphone Based Emotion Detection | 14
Multi-Target Positive Emotion Recognition From EEG Signals | 14
Learning to Learn Better Unimodal Representations via Adaptive Multimodal Meta-Learning | 14
Prediction of Car Design Perception Using EEG and Gaze Patterns | 14
Audio-Visual Automatic Group Affect Analysis | 14
Improving the Performance of Sentiment Analysis Using Enhanced Preprocessing Technique and Artificial Neural Network | 14
Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders | 14
Toward Automated Classroom Observation: Multimodal Machine Learning to Estimate CLASS Positive Climate and Negative Climate | 13
Active Learning With Complementary Sampling for Instructing Class-Biased Multi-Label Text Emotion Classification | 13
Multiview Facial Expression Recognition, A Survey | 13
Multi-Label Multi-Task Deep Learning for Behavioral Coding | 13
Induction of Emotional States in Educational Video Games Through a Fuzzy Control System | 13
Capturing Emotion Distribution for Multimedia Emotion Tagging | 13
Morality Classification in Natural Language Text | 13
Two Birds With One Stone: Knowledge-Embedded Temporal Convolutional Transformer for Depression Detection and Emotion Recognition | 13
Embodied Robot Models for Interdisciplinary Emotion Research | 13
To Be or Not to Be in Flow at Work: Physiological Classification of Flow Using Machine Learning | 13
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 12
Autoencoder for Semisupervised Multiple Emotion Detection of Conversation Transcripts | 12
Affect in Multimedia: Benchmarking Violent Scenes Detection | 12
Multimodal Affective States Recognition Based on Multiscale CNNs and Biologically Inspired Decision Fusion Model | 12
Touching Virtual Humans: Haptic Responses Reveal the Emotional Impact of Affective Agents | 12
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities | 12
Probabilistic Attribute Tree Structured Convolutional Neural Networks for Facial Expression Recognition in the Wild | 12
Altered Brain Dynamics and Their Ability for Major Depression Detection Using EEG Microstates Analysis | 12
Layered-Modeling of Affective and Sensory Experiences using Structural Equation Modeling: Touch Experiences of Plastic Surfaces as an Example | 12
Multi-Stage Graph Fusion Networks for Major Depressive Disorder Diagnosis | 12
Affective Impression: Sentiment-Awareness POI Suggestion via Embedding in Heterogeneous LBSNs | 12
Embedding Refinement Framework for Targeted Aspect-Based Sentiment Analysis | 12
Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation | 12
THIN: THrowable Information Networks and Application for Facial Expression Recognition in the Wild | 11
Semi-Structural Interview-Based Chinese Multimodal Depression Corpus Towards Automatic Preliminary Screening of Depressive Disorders | 11
Brain-Computer Interface for Generating Personally Attractive Images | 11
Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game | 11
Examining Emotion Perception Agreement in Live Music Performance | 11
Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods | 11
Modeling Multiple Temporal Scales of Full-Body Movements for Emotion Classification | 11
Weakly-Supervised Learning for Fine-Grained Emotion Recognition Using Physiological Signals | 11
Quality-Aware Bag of Modulation Spectrum Features for Robust Speech Emotion Recognition | 11
Affective Dynamics: Principal Motion Analysis of Temporal Dominance of Sensations and Emotions Data | 11
Jointly Aligning and Predicting Continuous Emotion Annotations | 11
Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation | 11
Sentiment- Emotion- and Context-Guided Knowledge Selection Framework for Emotion Recognition in Conversations | 11
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 11
ECPEC: Emotion-Cause Pair Extraction in Conversations | 11
Learning Person-Specific Cognition From Facial Reactions for Automatic Personality Recognition | 10
First Impressions Count! The Role of the Human's Emotional State on Rapport Established with an Empathic versus Neutral Virtual Therapist | 10
Exploring Complexity of Facial Dynamics in Autism Spectrum Disorder | 10
Audio-Visual Emotion Recognition With Preference Learning Based on Intended and Multi-Modal Perceived Labels | 10
A Review of Affective Computing Research Based on Function-Component-Representation Framework | 10
Applying Probabilistic Programming to Affective Computing | 10
Discriminative Few Shot Learning of Facial Dynamics in Interview Videos for Autism Trait Classification | 10
Adapting the Interplay Between Personalized and Generalized Affect Recognition Based on an Unsupervised Neural Framework | 10