IEEE Transactions on Affective Computing

Papers
(The median citation count for IEEE Transactions on Affective Computing is 5. The table below lists the papers above that threshold, based on CrossRef citation counts (max. 250 papers). Only papers published in the past four years, i.e., from 2020-09-01 to 2024-09-01, are included.)
Article | Citations
Deep Facial Expression Recognition: A Survey | 639
Review on Psychological Stress Detection Using Biosignals | 301
EEG-Based Emotion Recognition Using Regularized Graph Neural Networks | 293
AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups | 265
GCB-Net: Graph Convolutional Broad Network and Its Application in Emotion Recognition | 205
Survey on Emotional Body Gesture Recognition | 195
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 190
A Bi-Hemisphere Domain Adversarial Neural Network Model for EEG Emotion Recognition | 151
Issues and Challenges of Aspect-based Sentiment Analysis: A Comprehensive Survey | 134
Self-Supervised ECG Representation Learning for Emotion Recognition | 129
Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity | 125
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition | 122
An Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals | 122
Classifying Emotions and Engagement in Online Learning Based on a Single Facial Expression Recognition Neural Network | 120
Utilizing Deep Learning Towards Multi-Modal Bio-Sensing and Vision-Based Affective Computing | 117
Automatic Recognition Methods Supporting Pain Assessment: A Survey | 112
Deep Learning for Human Affect Recognition: Insights and New Developments | 111
Facial Expression Recognition With Visual Transformers and Attentional Selective Fusion | 100
Exploiting Multi-CNN Features in CNN-RNN Based Dimensional Emotion Recognition on the OMG in-the-Wild Dataset | 95
Beneath the Tip of the Iceberg: Current Challenges and New Directions in Sentiment Analysis Research | 94
Novel Audio Features for Music Emotion Recognition | 94
An EEG-Based Brain Computer Interface for Emotion Recognition and Its Application in Patients with Disorder of Consciousness | 84
A Mutual Information Based Adaptive Windowing of Informative EEG for Emotion Recognition | 74
Video-Based Depression Level Analysis by Encoding Deep Spatiotemporal Features | 71
A Review on Nonlinear Methods Using Electroencephalographic Recordings for Emotion Recognition | 69
Integrating Deep and Shallow Models for Multi-Modal Depression Analysis—Hybrid Architectures | 67
Facial Expression Recognition with Identity and Emotion Joint Learning | 65
Improving Cross-Corpus Speech Emotion Recognition with Adversarial Discriminative Domain Generalization (ADDoG) | 63
Spontaneous Speech Emotion Recognition Using Multiscale Deep Convolutional LSTM | 62
The Ordinal Nature of Emotions: An Emerging Approach | 60
An Active Learning Paradigm for Online Audio-Visual Emotion Recognition | 60
A Deeper Look at Facial Expression Dataset Bias | 58
Multi-Task Semi-Supervised Adversarial Autoencoding for Speech Emotion Recognition | 58
An Improved Empirical Mode Decomposition of Electroencephalogram Signals for Depression Detection | 58
TSception: Capturing Temporal Dynamics and Spatial Asymmetry From EEG for Emotion Recognition | 57
The Biases of Pre-Trained Language Models: An Empirical Study on Prompt-Based Sentiment Analysis and Emotion Detection | 56
Strategies to Utilize the Positive Emotional Contagion Optimally in Crowd Evacuation | 54
Facial Action Unit Detection Using Attention and Relation Learning | 54
Spectral Representation of Behaviour Primitives for Depression Analysis | 53
Virtual Reality for Emotion Elicitation – A Review | 52
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 50
All-in-One: Emotion, Sentiment and Intensity Prediction Using a Multi-Task Ensemble Framework | 50
Induction and Profiling of Strong Multi-Componential Emotions in Virtual Reality | 50
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 49
Hybrid Contrastive Learning of Tri-Modal Representation for Multimodal Sentiment Analysis | 48
Facial Expression Recognition With Deeply-Supervised Attention Network | 48
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 46
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition | 44
Dynamic Micro-Expression Recognition Using Knowledge Distillation | 44
Computer Vision Analysis for Quantification of Autism Risk Behaviors | 43
Multi-Modal Pain Intensity Recognition Based on the SenseEmotion Database | 43
A Survey of Textual Emotion Recognition and Its Challenges | 42
First Impressions: A Survey on Vision-Based Apparent Personality Trait Analysis | 41
Feature Pooling of Modulation Spectrum Features for Improved Speech Emotion Recognition in the Wild | 40
Adapting Software with Affective Computing: A Systematic Review | 40
Inter-Brain EEG Feature Extraction and Analysis for Continuous Implicit Emotion Tagging During Video Watching | 39
Multi-Feature Based Network Revealing the Structural Abnormalities in Autism Spectrum Disorder | 39
Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition | 38
Deep Learning for Micro-Expression Recognition: A Survey | 38
Beyond Mobile Apps: A Survey of Technologies for Mental Well-Being | 37
Discrete Probability Distribution Prediction of Image Emotions with Shared Sparse Learning | 37
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 37
A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios | 36
Audio Features for Music Emotion Recognition: A Survey | 36
Modeling, Recognizing, and Explaining Apparent Personality From Videos | 36
ACSEE: Antagonistic Crowd Simulation Model With Emotional Contagion and Evolutionary Game Theory | 36
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 36
GANSER: A Self-Supervised Data Augmentation Framework for EEG-Based Emotion Recognition | 35
Recognizing Induced Emotions of Movie Audiences from Multimodal Information | 35
Meta-Based Self-Training and Re-Weighting for Aspect-Based Sentiment Analysis | 35
Multimodal Spatiotemporal Representation for Automatic Depression Level Detection | 35
Artificial Emotional Intelligence in Socially Assistive Robots for Older Adults: A Pilot Study | 34
Cross-Cultural and Cultural-Specific Production and Perception of Facial Expressions of Emotion in the Wild | 34
The MatchNMingle Dataset: A Novel Multi-Sensor Resource for the Analysis of Social Interactions and Group Dynamics In-the-Wild During Free-Standing Conversations and Speed Dates | 34
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 34
Multimodal Engagement Analysis From Facial Videos in the Classroom | 33
The Multimodal Sentiment Analysis in Car Reviews (MuSe-CaR) Dataset: Collection, Insights and Improvements | 33
FLEPNet: Feature Level Ensemble Parallel Network for Facial Expression Recognition | 33
Emotion Recognition for Everyday Life Using Physiological Signals From Wearables: A Systematic Literature Review | 33
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition | 33
SchiNet: Automatic Estimation of Symptoms of Schizophrenia from Facial Behaviour Analysis | 33
MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection | 33
Multi-Fusion Residual Memory Network for Multimodal Human Sentiment Comprehension | 33
EmoBed: Strengthening Monomodal Emotion Recognition via Training with Crossmodal Emotion Embeddings | 33
EmoNet: A Transfer Learning Framework for Multi-Corpus Speech Emotion Recognition | 32
Modeling Emotion in Complex Stories: The Stanford Emotional Narratives Dataset | 32
Depression Level Prediction Using Deep Spatiotemporal Features and Multilayer Bi-LTSM | 32
Improving Attention Model Based on Cognition Grounded Data for Sentiment Analysis | 32
Interpretation of Depression Detection Models via Feature Selection Methods | 32
Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis | 31
Automatic Recognition of Children Engagement from Facial Video Using Convolutional Neural Networks | 31
Survey of Deep Representation Learning for Speech Emotion Recognition | 31
A Bayesian Deep Learning Framework for End-To-End Prediction of Emotion From Heartbeat | 31
Acute Stress State Classification Based on Electrodermal Activity Modeling | 30
Physiological Detection of Affective States in Children with Autism Spectrum Disorder | 30
Classification of Interbeat Interval Time-Series Using Attention Entropy | 29
Evoking Physiological Synchrony and Empathy Using Social VR With Biofeedback | 29
Unsupervised Learning in Reservoir Computing for EEG-Based Emotion Recognition | 29
Segment-Based Methods for Facial Attribute Detection from Partial Faces | 29
A Novel Classification Strategy to Distinguish Five Levels of Pain Using the EEG Signal Features | 28
Adapted Dynamic Memory Network for Emotion Recognition in Conversation | 28
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | 28
GMSS: Graph-Based Multi-Task Self-Supervised Learning for EEG Emotion Recognition | 27
A Comprehensive and Context-Sensitive Neonatal Pain Assessment Using Computer Vision | 27
Variational Instance-Adaptive Graph for EEG Emotion Recognition | 27
A Deep Multiscale Spatiotemporal Network for Assessing Depression From Facial Dynamics | 27
Automatic Emotion Recognition for Groups: A Review | 26
Conveying Emotions Through Device-Initiated Touch | 26
A Novel Sentiment Polarity Detection Framework for Chinese | 26
Deep Learning for Spatio-Temporal Modeling of Dynamic Spontaneous Emotions | 26
Micro and Macro Facial Expression Recognition Using Advanced Local Motion Patterns | 25
Exploring Self-Attention Graph Pooling With EEG-Based Topological Structure and Soft Label for Depression Detection | 25
Deep Multi-Task Multi-Label CNN for Effective Facial Attribute Classification | 25
Empirical Evidence Relating EEG Signal Duration to Emotion Classification Performance | 25
Neural Attentive Network for Cross-Domain Aspect-Level Sentiment Classification | 25
Speech-Driven Expressive Talking Lips with Conditional Sequential Generative Adversarial Networks | 25
Hierarchical Interactive Multimodal Transformer for Aspect-Based Multimodal Sentiment Analysis | 25
A Psychologically Inspired Fuzzy Cognitive Deep Learning Framework to Predict Crowd Behavior | 25
STCAM: Spatial-Temporal and Channel Attention Module for Dynamic Facial Expression Recognition | 25
Multiple Instance Learning for Emotion Recognition Using Physiological Signals | 25
Automatic Recognition of Facial Displays of Unfelt Emotions | 24
Local Temporal Pattern and Data Augmentation for Spotting Micro-Expressions | 24
E-Key: An EEG-Based Biometric Authentication and Driving Fatigue Detection System | 24
Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition | 24
EEG-Based Emotion Recognition via Neural Architecture Search | 24
MERASTC: Micro-Expression Recognition Using Effective Feature Encodings and 2D Convolutional Neural Network | 23
DepecheMood++: A Bilingual Emotion Lexicon Built Through Simple Yet Powerful Techniques | 22
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 22
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 22
Spatio-Temporal Encoder-Decoder Fully Convolutional Network for Video-Based Dimensional Emotion Recognition | 22
Investigation of Speech Landmark Patterns for Depression Detection | 22
ENGAGE-DEM: A Model of Engagement of People With Dementia | 22
PARSE: Pairwise Alignment of Representations in Semi-Supervised EEG Learning for Emotion Recognition | 22
On the Influence of Affect in EEG-Based Subject Identification | 22
BReG-NeXt: Facial Affect Computing Using Adaptive Residual Networks With Bounded Gradient | 22
Stimulus Sampling With 360-Videos: Examining Head Movements, Arousal, Presence, Simulator Sickness, and Preference on a Large Sample of Participants and Videos | 21
Towards Transparent Robot Learning Through TDRL-Based Emotional Expressions | 21
Personality Traits Classification Using Deep Visual Activity-Based Nonverbal Features of Key-Dynamic Images | 21
Self-Supervised Learning of Person-Specific Facial Dynamics for Automatic Personality Recognition | 21
PersEmoN: A Deep Network for Joint Analysis of Apparent Personality, Emotion and Their Relationship | 21
Multi-Label Emotion Detection via Emotion-Specified Feature Extraction and Emotion Correlation Learning | 21
Regression Guided by Relative Ranking Using Convolutional Neural Network (R3 CNN) for Facial Beauty Prediction | 21
Multimodal Deception Detection Using Real-Life Trial Data | 21
Unraveling ML Models of Emotion With NOVA: Multi-Level Explainable AI for Non-Experts | 21
Self Supervised Adversarial Domain Adaptation for Cross-Corpus and Cross-Language Speech Emotion Recognition | 20
Exploring Individual Differences of Public Speaking Anxiety in Real-Life and Virtual Presentations | 20
On the Effect of Observed Subject Biases in Apparent Personality Analysis From Audio-Visual Signals | 20
Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN | 20
EEG-Based Emotion Recognition With Emotion Localization via Hierarchical Self-Attention | 20
Deep Multi-Modal Network Based Automated Depression Severity Estimation | 20
EmoSen: Generating Sentiment and Emotion Controlled Responses in a Multimodal Dialogue System | 20
Deep Temporal Analysis for Non-Acted Body Affect Recognition | 19
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 19
Deep Siamese Neural Networks for Facial Expression Recognition in the Wild | 19
A Multi-Modal Stacked Ensemble Model for Bipolar Disorder Classification | 19
Phase Space Reconstruction Driven Spatio-Temporal Feature Learning for Dynamic Facial Expression Recognition | 19
Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning | 19
Two-Stage Fuzzy Fusion Based-Convolution Neural Network for Dynamic Emotion Recognition | 19
Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition? | 18
Recognition of Advertisement Emotions With Application to Computational Advertising | 18
ICA-Evolution Based Data Augmentation with Ensemble Deep Neural Networks Using Time and Frequency Kernels for Emotion Recognition from EEG-Data | 18
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 18
A Scalable Off-the-Shelf Framework for Measuring Patterns of Attention in Young Children and Its Application in Autism Spectrum Disorder | 18
What’s Your Laughter Doing There? A Taxonomy of the Pragmatic Functions of Laughter | 18
Chunk-Level Speech Emotion Recognition: A General Framework of Sequence-to-One Dynamic Temporal Modeling | 18
Identifying Cortical Brain Directed Connectivity Networks From High-Density EEG for Emotion Recognition | 18
Exploring the Contextual Factors Affecting Multimodal Emotion Recognition in Videos | 18
Emotion Intensity and its Control for Emotional Voice Conversion | 18
Bio-Inspired Deep Attribute Learning Towards Facial Aesthetic Prediction | 17
EEG-Based Subject-Independent Emotion Recognition Using Gated Recurrent Unit and Minimum Class Confusion | 17
The Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games | 17
Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition | 17
Multimodal Classification of Stressful Environments in Visually Impaired Mobility Using EEG and Peripheral Biosignals | 17
Towards a Prediction and Data Driven Computational Process Model of Emotion | 17
Affective Dynamics: Causality Modeling of Temporally Evolving Perceptual and Affective Responses | 16
Detecting Mental Disorders in Social Media Through Emotional Patterns - The Case of Anorexia and Depression | 16
The Arousal Video Game AnnotatIoN (AGAIN) Dataset | 16
A Transfer Learning Approach to Heatmap Regression for Action Unit Intensity Estimation | 16
EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition | 16
Ethics and Good Practice in Computational Paralinguistics | 16
A Multimodal Non-Intrusive Stress Monitoring From the Pleasure-Arousal Emotional Dimensions | 16
Aspect-Based Sentiment Analysis with New Target Representation and Dependency Attention | 15
FaceEngage: Robust Estimation of Gameplay Engagement from User-Contributed (YouTube) Videos | 15
On the Influence of Shot Scale on Film Mood and Narrative Engagement in Film Viewers | 15
Graph-Based Facial Affect Analysis: A Review | 15
Toward Robust Stress Prediction in the Age of Wearables: Modeling Perceived Stress in a Longitudinal Study With Information Workers | 15
An Emotion Recognition Method for Game Evaluation Based on Electroencephalogram | 15
Disentangling Identity and Pose for Facial Expression Recognition | 15
Depression Recognition Using Remote Photoplethysmography From Facial Videos | 14
Prediction of Car Design Perception Using EEG and Gaze Patterns | 14
Designing an Experience Sampling Method for Smartphone Based Emotion Detection | 14
Personal-Zscore: Eliminating Individual Difference for EEG-Based Cross-Subject Emotion Recognition | 14
Improving the Performance of Sentiment Analysis Using Enhanced Preprocessing Technique and Artificial Neural Network | 14
LQGDNet: A Local Quaternion and Global Deep Network for Facial Depression Recognition | 14
Driver Emotion Recognition With a Hybrid Attentional Multimodal Fusion Framework | 14
An Investigation of Partition-Based and Phonetically-Aware Acoustic Features for Continuous Emotion Prediction from Speech | 14
Longitudinal Observational Evidence of the Impact of Emotion Regulation Strategies on Affective Expression | 14
Geometry-Aware Facial Expression Recognition via Attentive Graph Convolutional Networks | 14
Leveraging Affective Hashtags for Ranking Music Recommendations | 14
Joint Feature Adaptation and Graph Adaptive Label Propagation for Cross-Subject Emotion Recognition From EEG Signals | 14
Audio-Visual Automatic Group Affect Analysis | 14
An Overview of Facial Micro-Expression Analysis: Data, Methodology and Challenge | 14
Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders | 14
Multi-Target Positive Emotion Recognition From EEG Signals | 13
Multiview Facial Expression Recognition, A Survey | 13
Boosting Facial Expression Recognition by A Semi-Supervised Progressive Teacher | 13
Induction of Emotional States in Educational Video Games Through a Fuzzy Control System | 13
Capturing Emotion Distribution for Multimedia Emotion Tagging | 13
Toward Automated Classroom Observation: Multimodal Machine Learning to Estimate CLASS Positive Climate and Negative Climate | 13
Embodied Robot Models for Interdisciplinary Emotion Research | 13
To Be or Not to Be in Flow at Work: Physiological Classification of Flow Using Machine Learning | 13
Multimodal Affective States Recognition Based on Multiscale CNNs and Biologically Inspired Decision Fusion Model | 12
Multi-Label Multi-Task Deep Learning for Behavioral Coding | 12
Affective Impression: Sentiment-Awareness POI Suggestion via Embedding in Heterogeneous LBSNs | 12
Probabilistic Attribute Tree Structured Convolutional Neural Networks for Facial Expression Recognition in the Wild | 12
Affect in Multimedia: Benchmarking Violent Scenes Detection | 12
Embedding Refinement Framework for Targeted Aspect-Based Sentiment Analysis | 12
4DME: A Spontaneous 4D Micro-Expression Dataset With Multimodalities | 12
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 12
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 12
Leaders and Followers Identified by Emotional Mimicry During Collaborative Learning: A Facial Expression Recognition Study on Emotional Valence | 12
Affect Estimation in 3D Space Using Multi-Task Active Learning for Regression | 12
Two Birds With One Stone: Knowledge-Embedded Temporal Convolutional Transformer for Depression Detection and Emotion Recognition | 12
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities | 12
Autoencoder for Semisupervised Multiple Emotion Detection of Conversation Transcripts | 12
Morality Classification in Natural Language Text | 12
Semi-Structural Interview-Based Chinese Multimodal Depression Corpus Towards Automatic Preliminary Screening of Depressive Disorders | 11
Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation | 11
Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game | 11
THIN: THrowable Information Networks and Application for Facial Expression Recognition in the Wild | 11
Jointly Aligning and Predicting Continuous Emotion Annotations | 11
Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation | 11
Learning to Learn Better Unimodal Representations via Adaptive Multimodal Meta-Learning | 11
Altered Brain Dynamics and Their Ability for Major Depression Detection Using EEG Microstates Analysis | 11
Layered-Modeling of Affective and Sensory Experiences using Structural Equation Modeling: Touch Experiences of Plastic Surfaces as an Example | 11
Touching Virtual Humans: Haptic Responses Reveal the Emotional Impact of Affective Agents | 11
Affective Dynamics: Principal Motion Analysis of Temporal Dominance of Sensations and Emotions Data | 11
Ordinal Logistic Regression With Partial Proportional Odds for Depression Prediction | 10
Sentiment- Emotion- and Context-Guided Knowledge Selection Framework for Emotion Recognition in Conversations | 10
Examining Emotion Perception Agreement in Live Music Performance | 10
Exploring Domain Knowledge for Facial Expression-Assisted Action Unit Activation Recognition | 10
A Review of Affective Computing Research Based on Function-Component-Representation Framework | 10
Weakly-Supervised Learning for Fine-Grained Emotion Recognition Using Physiological Signals | 10
Applying Probabilistic Programming to Affective Computing | 10
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 10
Active Learning With Complementary Sampling for Instructing Class-Biased Multi-Label Text Emotion Classification | 10
Brain-Computer Interface for Generating Personally Attractive Images | 10
Quality-Aware Bag of Modulation Spectrum Features for Robust Speech Emotion Recognition | 10
Discriminative Few Shot Learning of Facial Dynamics in Interview Videos for Autism Trait Classification | 10
First Impressions Count! The Role of the Human's Emotional State on Rapport Established with an Empathic versus Neutral Virtual Therapist | 10