

RoboTac 2026
IEEE/RAS International Workshop
June 1st, 2026
ICRA 2026, Vienna, Austria

Embodied Tactile Intelligence in Predictive Perception, Learning & Control in Grasp & Manipulation
The Emerging Role of Embodiment and Visuo-Tactile-LLM Foundation Models
Call for Papers and Contributions
Important dates:
- Paper submission deadline: March 1st, 2026
- Notification of acceptance: March 15th, 2026
- Camera-ready deadline: April 1st, 2026
- Workshop day: June 1st, 2026
Paper submission guidelines:
We solicit contributions in the form of extended abstracts (minimum 2 pages, maximum 4 pages) in IEEE paper format, to be presented at the workshop as posters. Please follow the author guidelines and use the LaTeX or MS Word templates for the ICRA conference. Outstanding contributions will be selected for oral presentations.
Accepted papers and any supplementary material will be made available on the workshop website.
Submission:
Please submit your contribution via EasyChair: https://easychair.org/conferences/?conf=robotac2026
In case of any questions, please contact the main organizer via email: info.robotac@bmw.de
Awards:
Outstanding papers, posters, and demos will receive awards.
Objectives
Haptics, or the sense of touch, enables humans to perceive and interact with their environment, playing a crucial role in grasping, manipulation, learning, and decision-making. Humans rely on haptic exploration, combined with vision and higher-level reasoning, to understand an object's shape, texture, and mechanical properties, using sensory feedback to plan and adapt actions in real time.
For robots, developing similar capabilities is essential to safely and efficiently operate in dynamic and unstructured environments, from industrial tasks to household assistance and care. Achieving robust haptic exploration and object recognition remains challenging due to soft or variable materials, sensor and motor noise, and delays in sensory processing.
Embodiment is key: by grounding perception, learning, and control in the physical body and its interactions, robots can achieve more effective, interactive, and predictive manipulation. Integrating tactile sensing with vision and large language models (LLMs) allows robots to combine local, detailed touch information with global visual context and high-level reasoning. This integration enables better anticipation of object properties, more robust planning, and reasoning about complex tasks, resulting in more dexterous, adaptive, and context-aware manipulation.
Organizers
Invited Speakers
Accepted Papers
Accepted papers will be listed here.
Overview of the Program
| Speaker / Session | Time |
|---|---|
| Welcome | 09:00 - 09:05 |
| Human Sense of Touch and Embodiment | |
| Prof. Marcia O'Malley | 09:05 - 09:20 |
| Prof. Yasemin Vardar | 09:20 - 09:35 |
| Prof. Gregory Gerling | 09:35 - 09:50 |
| Poster Teasers A | 09:50 - 10:05 |
| Panel Discussion Session 1 and Q/A | 10:05 - 10:35 |
| Tactile Sensing Technologies for Embodied Robots | |
| Prof. Oliver Brock | 10:35 - 10:50 |
| Prof. Matei Ciocarlie | 10:50 - 11:05 |
| Prof. Domenico Prattichizzo | 11:05 - 11:20 |
| Prof. Matteo Bianchi | 11:20 - 11:35 |
| Prof. Lorenzo Jamone | 11:35 - 11:50 |
| Poster Teasers B | 11:50 - 12:00 |
| Panel Discussion Session 2 and Q/A | 12:00 - 12:30 |
| Coffee Break, Poster and Demo Session | 12:30 - 12:45 |
| Embodied Visuo-Tactile & LLM Perception, Interaction, & Exploration | |
| Prof. Fumiya Iida | 12:45 - 13:00 |
| Prof. Matej Hoffmann | 13:00 - 13:15 |
| Lunch and Demos | 13:15 - 14:00 |
| Prof. Michael Wiertlewski | 14:00 - 14:15 |
| Prof. Huaping Liu | 14:15 - 14:30 |
| Panel Discussion Session 3 and Q/A | 14:30 - 15:00 |
| Embodied Visuo-Tactile Grasping: Prehensile and Nonprehensile Manipulation | |
| Prof. Sethu Vijayakumar | 15:00 - 15:15 |
| Prof. Patrick van der Smagt | 15:15 - 15:30 |
| Poster Teasers C | 15:30 - 15:45 |
| Coffee Break | 15:45 - 16:00 |
| Prof. Ad Spiers | 16:00 - 16:15 |
| Prof. Maria Pozzi | 16:15 - 16:30 |
| Prof. Alberto Rodriguez | 16:30 - 16:45 |
| Prof. Irene Kuling | 16:45 - 17:00 |
| Panel Discussion Session 4 and Q/A | 17:00 - 17:30 |
| Wrap-up, Socialization, and Group Photo | 17:30 - 18:00 |
Topics of Interest
(including, but not limited to)
Human Sense of Touch and Embodiment
• Touch physiology: from skin to brain, and its role in embodied intelligence
• Haptic perception and sensory-motor integration
• Action-perception loops for adaptive behavior
• Perception for learning in an embodied context
• Insights from human embodiment for robotic interactive and predictive perception
Tactile Sensing Technologies for Embodied Robots
• Conformable, compliant, and biomimetic materials
• Features enabled by embodied tactile sensors
• Sensor effects, self-healing properties, and strategies for robust robotic embodiment
• Sensor skins: design, fabrication, and integration for embodied manipulation
• Integration and read-out strategies in physically embodied robots
• Enabling technologies for fully integrated, multimodal robotic systems
Embodied Visuo-Tactile & Language: Perception, Interaction, and Exploration
• Exploitation of contact constraints and novel contact models in embodied contexts
• Object perception and exploration through touch and vision
• Tactile information processing and feature learning for embodied reasoning
• Tactile-based object modeling, localization, shape reconstruction, and classification
• Integration of vision and touch sensing with foundation models (LLMs, multimodal models) for reasoning, prediction, and planning
• Modeling and representation of embodied, multimodal sensory modalities
Embodied Visuo-Tactile & Language: Grasping and Manipulation
• Slip detection, grasp planning, and stability assessment in embodied systems
• Soft, in-hand, and whole-body manipulation
• Tactile-vision integration for predictive planning and control
• Embodied skill transfer, tactile transfer learning, and learning from human demonstrations
• Multi-robot manipulation, coordination, and collaborative assembly with embodied reasoning
• Whole-body, multi-contact planning and control
• Design and characterization of contact-exploiting, compliant robotic hands
• Leveraging foundation models to enhance dexterous manipulation, high-level task reasoning, and adaptive control
Sponsors
