Journal on Multimodal User Interfaces

Papers
(The median citation count of the Journal on Multimodal User Interfaces is 2. The table below lists the papers above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., from 2020-11-01 to 2024-11-01.)
| Article | Citations |
| --- | --- |
| A survey of challenges and methods for Quality of Experience assessment of interactive VR applications | 36 |
| MUMBAI: multi-person, multimodal board game affect and interaction analysis dataset | 17 |
| A gaze-based interactive system to explore artwork imagery | 16 |
| Words of encouragement: how praise delivered by a social robot changes children’s mindset for learning | 16 |
| Multimodal analysis of personality traits on videos of self-presentation and induced behavior | 14 |
| Non-native speaker perception of Intelligent Virtual Agents in two languages: the impact of amount and type of grammatical mistakes | 13 |
| Internet-based tailored virtual human health intervention to promote colorectal cancer screening: design guidelines from two user studies | 12 |
| Comparing mind perception in strategic exchanges: human-agent negotiation, dictator and ultimatum games | 12 |
| Neighborhood based decision theoretic rough set under dynamic granulation for BCI motor imagery classification | 12 |
| Verbal empathy and explanation to encourage behaviour change intention | 10 |
| Facial expression and action unit recognition augmented by their dependencies on graph convolutional networks | 10 |
| SoundSight: a mobile sensory substitution device that sonifies colour, distance, and temperature | 9 |
| Training public speaking with virtual social interactions: effectiveness of real-time feedback and delayed feedback | 9 |
| An audiovisual interface-based drumming system for multimodal human–robot interaction | 9 |
| Grounding behaviours with conversational interfaces: effects of embodiment and failures | 9 |
| Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot | 9 |
| A review on communication cues for augmented reality based remote guidance | 8 |
| Informing the design of a multisensory learning environment for elementary mathematics learning | 8 |
| Theory-based approach for assessing cognitive load during time-critical resource-managing human–computer interactions: an eye-tracking study | 7 |
| The Audio-Corsi: an acoustic virtual reality-based technological solution for evaluating audio-spatial memory abilities | 6 |
| Virtual agents as supporting media for scientific presentations | 6 |
| Interactive exploration of a hierarchical spider web structure with sound | 6 |
| RFID-based tangible and touch tabletop for dual reality in crisis management context | 5 |
| Predicting multimodal presentation skills based on instance weighting domain adaptation | 5 |
| Ipsilateral and contralateral warnings: effects on decision-making and eye movements in near-collision scenarios | 5 |
| A wearable virtual touch system for IVIS in cars | 4 |
| PLAAN: Pain Level Assessment with Anomaly-detection based Network | 4 |
| Does an agent’s touch always matter? Study on virtual Midas touch, masculinity, social status, and compliance in Polish men | 4 |
| Review of substitutive assistive tools and technologies for people with visual impairments: recent advancements and prospects | 4 |
| Virtual reality can mediate the learning phase of upper limb prostheses supporting a better-informed selection process | 4 |
| Preliminary assessment of a multimodal electric-powered wheelchair simulator for training of activities of daily living | 4 |
| Advanced multimodal interaction techniques and user interfaces for serious games and virtual environments | 3 |
| Remote social touch framework: a way to communicate physical interactions across long distances | 3 |
| A novel focus encoding scheme for addressee detection in multiparty interaction using machine learning algorithms | 3 |
| Personality trait estimation in group discussions using multimodal analysis and speaker embedding | 3 |
| Exploring user-defined gestures for lingual and palatal interaction | 3 |
| In-vehicle air gesture design: impacts of display modality and control orientation | 3 |
| Augmented reality and deep learning based system for assisting assembly process | 2 |
| Behavior and usability analysis for multimodal user interfaces | 2 |
| Commanding a drone through body poses, improving the user experience | 2 |
| Comparing alternative modalities in the context of multimodal human–robot interaction | 2 |
| Identifying and evaluating conceptual representations for auditory-enhanced interactive physics simulations | 2 |
| Exploring visual stimuli as a support for novices’ creative engagement with digital musical interfaces | 2 |
| TapCAPTCHA: non-visual CAPTCHA on touchscreens for visually impaired people | 2 |
| Gesture-based guidance for navigation in virtual environments | 2 |
| Combining audio and visual displays to highlight temporal and spatial seismic patterns | 2 |
| An interdisciplinary journey towards an aesthetics of sonification experience | 2 |
| A SLAM-based augmented reality app for the assessment of spatial short-term memory using visual and auditory stimuli | 2 |
| Designing multi-purpose devices to enhance users’ perception of haptics | 2 |
| The Augmented Movement Platform For Embodied Learning (AMPEL): development and reliability | 2 |
| Understanding virtual drilling perception using sound, and kinesthetic cues obtained with a mouse and keyboard | 2 |