IEEE Transactions on Affective Computing

Papers
(The H4-Index of IEEE Transactions on Affective Computing is 46. The table below lists the papers above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers papers published in the past four years, i.e., from 2021-02-01 to 2025-02-01.)
Article | Citations
2021 Index IEEE Transactions on Affective Computing Vol. 12 | 686
Recognizing, Fast and Slow: Complex Emotion Recognition With Facial Expression Detection and Remote Physiological Measurement | 343
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 341
Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis | 286
Modeling Emotion in Complex Stories: The Stanford Emotional Narratives Dataset | 226
Multimodal Self-Assessed Personality Estimation During Crowded Mingle Scenarios Using Wearables Devices and Cameras | 221
A Deeper Look at Facial Expression Dataset Bias | 208
Emotion Recognition for Everyday Life Using Physiological Signals From Wearables: A Systematic Literature Review | 159
To Be or Not to Be in Flow at Work: Physiological Classification of Flow Using Machine Learning | 158
Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals | 146
Affective Dynamics and Cognition During Game-Based Learning | 145
TSSRD: A Topic Sentiment Summarization Framework Based on Reaching Definition | 140
Early Detection of User Engagement Breakdown in Spontaneous Human-Humanoid Interaction | 136
Capturing Emotion Distribution for Multimedia Emotion Tagging | 129
BReG-NeXt: Facial Affect Computing Using Adaptive Residual Networks With Bounded Gradient | 122
Editorial: Transactions on Affective Computing – Another Year in the Shade of Covid-19 | 118
Multiview Facial Expression Recognition, A Survey | 113
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 103
An (E)Affective Bind: Situated Affectivity and the Prospect of Affect Recognition | 103
Speech-Driven Expressive Talking Lips with Conditional Sequential Generative Adversarial Networks | 100
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 91
Personality Traits Classification Using Deep Visual Activity-Based Nonverbal Features of Key-Dynamic Images | 86
First Impressions: A Survey on Vision-Based Apparent Personality Trait Analysis | 74
The Mediating Effect of Emotions on Trust in the Context of Automated System Usage | 70
Doing and Feeling: Relationships Between Moods, Productivity and Task-Switching | 68
Exploiting Evolutionary Algorithms to Model Nonverbal Reactions to Conversational Interruptions in User-Agent Interactions | 67
Receiving a Mediated Touch From Your Partner vs. a Male Stranger: How Visual Feedback of Touch and Its Sender Influence Touch Experience | 65
Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos | 64
Effects of Physiological Signals in Different Types of Multimodal Sentiment Estimation | 63
Towards Human-Compatible Autonomous Car: A Study of Non-Verbal Turing Test in Automated Driving With Affective Transition Modelling | 63
Perceived Conversation Quality in Spontaneous Interactions | 62
Guest Editorial Best of ACII 2021 | 59
Does gamified breath-biofeedback promote adherence, relaxation, and skill transfer in the wild? | 59
A Residual Multi-Scale Convolutional Neural Network with Transformers for Speech Emotion Recognition | 56
Probabilistic Attribute Tree Structured Convolutional Neural Networks for Facial Expression Recognition in the Wild | 55
A Comparative Data-driven Study of Intensity-based Categorical Emotion Representations for MER | 55
Multimodal Sentimental Privileged Information Embedding for Improving Facial Expression Recognition | 54
Automatic Context-Aware Inference of Engagement in HMI: A Survey | 51
Capturing Interaction Quality in Long Duration (Simulated) Space Missions With Wearables | 50
A Region Group Adaptive Attention Model For Subtle Expression Recognition | 49
Hierarchical Knowledge Stripping for Multimodal Sentiment Analysis | 49
Facial Expression Animation by Landmark Guided Residual Module | 48
Mechanoreceptive Aβ primary afferents discriminate naturalistic social touch inputs at a functionally relevant time scale | 47
Unsupervised Cross-Corpus Speech Emotion Recognition Using a Multi-Source Cycle-GAN | 47
“Emotions are the Great Captains of Our Lives”: Measuring Moods Through the Power of Physiological and Environmental Sensing | 47
Facial Expression Recognition in Classrooms: Ethical Considerations and Proposed Guidelines for Affect Detection in Educational Settings | 46