How robots perceive objects

L3S Best Paper of the Quarter 
Category: Haptic Perception 

Visuo-Haptic Object Perception for Robots: An Overview

Authors: Nicolás Navarro-Guerrero, Sibel Toprak, Josip Josifovski, Lorenzo Jamone

Published in the Q1 journal “Autonomous Robots”  
https://link.springer.com/article/10.1007/s10514-023-10091-y 

Springer Nature's journal Autonomous Robots highlighted our article “Visuo-haptic object perception for robots: an overview” as one of its most downloaded articles in 2023: https://link.springer.com/journal/10514/updates/20394502 

The paper in a nutshell:

We provide a holistic description of multimodal object perception for robots. We start with the biological basis of object perception and how it has inspired the development of sensor technologies and applications. We then describe the current state of research, including the challenges in collecting and exchanging data that arise from the heterogeneity of sensors, robots, and data collection procedures, and, finally, the challenges in multimodal signal processing (AI/ML) applied to multimodal object perception.

What is the potential impact of your findings?

Multimodal object perception would allow robots to interact with objects more proficiently and could unlock many more applications. Moreover, it has applications beyond autonomous robots: it could be used in smart prosthetics and in robot-assisted remote object manipulation, for example in space or underwater settings. Our article aims to lower the barrier to entry into this research area by describing the current scientific landscape of multimodal object perception and outlining promising future research directions.

What is new about your research?

In the project “ROMEO: Robot-MEdiated Object manipulation with haptic feedback”, we investigate technologies to translate robot tactile perception into haptic stimulation devices for humans. This research has applications in surgical robotics, remote object manipulation, and smart prosthetics. In the project “Vibro-Sense: Bioinspirierter taktiler Sensor für die Robotik” (Bio-inspired Tactile Sensor for Robotics), we investigate dynamic tactile sensing: we measure the vibrations generated and propagated when interacting with objects and process them to recognize textures, objects, slip, and the location of tactile stimuli. Another related project is “ACROSS: Adaptive Cross-Modal Representation for Robotic Transfer Learning”, in which we aim to translate tactile information across different sensor technologies, making it easier for researchers to exchange algorithms and data.
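
To give a flavor of what processing vibration signals for texture recognition can look like, here is a minimal illustrative sketch. It is not the pipeline used in Vibro-Sense; the sampling rate, frequency bands, classifier choice, and the synthetic data are all hypothetical, chosen only to make the example self-contained and runnable.

```python
# Minimal sketch (not the project's actual pipeline): classifying surface
# textures from 1-D tactile vibration traces using coarse spectral features.
# Sampling rate, frequency bands, labels, and data below are hypothetical.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def spectral_features(signal: np.ndarray, sample_rate: float = 1000.0) -> np.ndarray:
    """Summarize a vibration trace by the energy in a few frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate)
    bands = [(0, 50), (50, 150), (150, 300), (300, 500)]  # Hz, chosen arbitrarily
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

# Synthetic stand-in data: 200 one-second traces, 4 "texture" classes whose
# dominant vibration frequency differs, plus additive noise.
rng = np.random.default_rng(0)
labels = rng.integers(0, 4, size=200)
time = np.linspace(0.0, 1.0, 1000)
traces = np.stack([
    np.sin(2 * np.pi * (60 + 80 * y) * time) + 0.3 * rng.standard_normal(time.size)
    for y in labels
])

# Extract features and train/evaluate a simple classifier.
X = np.array([spectral_features(t) for t in traces])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"Texture recognition accuracy: {clf.score(X_test, y_test):.2f}")
```

In practice, dynamic tactile sensing pipelines differ mainly in which features are extracted from the vibration signal and how the classifier copes with sensor-specific characteristics, which is precisely the cross-sensor transfer problem addressed in ACROSS.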


Navarro-Guerrero, N., Toprak, S., Josifovski, J., & Jamone, L. (2023). Visuo-Haptic Object Perception for Robots: An Overview. Autonomous Robots, 47(4), 377–403. https://doi.org/10.1007/s10514-023-10091-y