Machine Vision and Applications

Papers
(The H4-Index of Machine Vision and Applications is 18. The table below lists the papers above that threshold, ranked by CrossRef citation counts [max. 250 papers]. Only publications from the past four years are included, i.e., from 2021-05-01 to 2025-05-01.)
Article | Citations
End-to-end unsupervised learning of latent-space clustering for image segmentation via fully dense-UNet and fuzzy C-means loss | 110
Real estate pricing prediction via textual and visual features | 86
ECM: arbitrary style transfer via Enhanced-Channel Module | 64
Medtransnet: advanced gating transformer network for medical image classification | 59
Multi-shot person re-identification based on appearance and spatial-temporal cues in a large camera network | 50
A method for high dynamic range 3D color modeling of objects through a color camera | 48
Text-driven object affordance for guiding grasp-type recognition in multimodal robot teaching | 32
Development of a robust cascaded architecture for intelligent robot grasping using limited labelled data | 30
Class-aware cross-domain target detection based on cityscape in fog | 29
Obs-tackle: an obstacle detection system to assist navigation of visually impaired using smartphones | 27
A hybrid overlapping group sparsity denoising model with fractional-order total variation and non-convex regularizer | 25
Triple attention and global reasoning Siamese networks for visual tracking | 25
Enforced clustering for zero-to-one-shot texture anomaly detection | 22
Adaptive fast scale estimation, with accurate online model update based on kernelized correlation filter | 22
Global-guided cross-reference network for co-salient object detection | 21
A stereo vision SLAM with moving vehicles tracking in outdoor environment | 19
Motion-region annotation for complex videos via label propagation across occluders | 19
A motion direction detecting model for colored images based on the Hassenstein–Reichardt model | 19
Innovative surface roughness detection method based on white light interference images | 18