
Program
Central European Time (CET)
27th September
(LIVE)

 

09:00-09:10 Welcome and opening

Lectures

09:10-09:35 (20 min + 5 min Q&A) - Video

Title: The physical basis of haptic perception    

Abstract: In this discussion I propose the idea that the sense of touch, supported by the somatosensory system, has developed to take advantage of the ambient physics. While this is certainly not an original idea, it has proven to be rich in surprises and practical consequences, which I will illustrate with examples.

09:35-10:00 (20 min + 5 min Q&A) - Video

Title: Touch in silico   

Abstract: Neuromimetic algorithms have the potential to provide a sense of touch for human-made devices in the near future. Here we discuss recent work on information processing, communication and recovery along the somatosensory axis, and introduce a biologically realistic and computationally efficient multilayer model of the sensory cortex that can learn from the experience of the agent.

10:00-10:25 (20 min + 5 min Q&A) - Video

Title: The Neural Basis of Haptic Perception

Abstract: Here I will discuss recent developments in our view of haptic information and its processing in the nervous system. A central tenet is that haptics is a highly dynamic process that engages large parts of the skin, and consequently a large number of sensors, but this is not well represented in the neuroscience literature. A consequence is that large parts of the nervous system also become engaged in haptic processing, which is what we have found in a recent series of experiments. The implications of these changing views for the design of artificially intelligent systems, or robots, which rely on haptics to ‘understand’, are also discussed.

10:25-10:50 (20 min + 5 min Q&A) - Video

Title: Sensing Tactile Contact Over Large, Soft Surfaces    

Abstract: Robots should be able to feel contacts that occur across all of their body surfaces, not just at their fingertips. Furthermore, tactile sensors should be soft in order to cushion contact and support the transmission of tangential force and torque. Today's robotic systems rarely have such sensing capabilities because artificial skin tends to be complex, bulky, rigid, delicate, and/or expensive. Taking inspiration from other successful sensor designs, my collaborators and I have created two families of soft sensors that can feel a distribution of contact forces across their large surfaces. First, Hyosang Lee has led a long-term project on large fabric-based tactile sensors that use piezoresistive laminated structures and electrical resistance tomography (ERT). These ERTac sensors estimate the distribution of normal force at fast frame rates, enabling capable perception of complex contacts and total normal force. The sensing hardware is relatively simple and robust, with point electrodes distributed across the surface, while the more complex sampling electronics and reconstruction algorithms provide interesting opportunities for system improvement. Second, Huanbo Sun, Georg Martius, and I have recently invented a tactile sensor that uses vision and deep learning to deliver all-over 3D tactile sensing in a package the size and shape of an extended human thumb. Called Insight, our sensor has a soft single-layer silicone skin that is over-molded on a stiff skeleton, lit by internal LEDs, and viewed from within by a camera. Extensive contact data was collected by an automatic testbed that applies both normal and shear forces at points across the surface. After training, the network estimates the distribution of 3D forces across Insight's skin from each camera image, capably capturing multiple complex contacts.
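
For readers unfamiliar with ERT, the reconstruction step such sensors rely on can be sketched as a one-step Tikhonov-regularized linear solve (a minimal sketch; the sensitivity matrix `J`, its dimensions, and the regularization weight are illustrative stand-ins, not the ERTac implementation):

```python
# Minimal sketch of a linearized ERT reconstruction: estimate the
# conductivity change (a proxy for the normal-force distribution) from
# boundary voltage changes. In practice J comes from a forward model of
# the sensor; here it is a random stand-in.
import numpy as np

def reconstruct_conductivity(J: np.ndarray, dv: np.ndarray, lam: float = 1e-2) -> np.ndarray:
    """J   : (n_measurements, n_pixels) sensitivity (Jacobian) matrix
    dv  : (n_measurements,) measured voltage changes
    lam : Tikhonov regularization weight"""
    A = J.T @ J + lam * np.eye(J.shape[1])
    return np.linalg.solve(A, J.T @ dv)

# Toy usage with random stand-ins for J and dv:
rng = np.random.default_rng(0)
J = rng.normal(size=(208, 400))   # e.g. 16 electrodes -> 208 measurements
dv = rng.normal(size=208)
sigma = reconstruct_conductivity(J, dv)
print(sigma.shape)                # (400,) pixel-wise estimate
```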

10:50-11:15 (20 min + 5 min Q&A) - Video

Title: Materials and Skins for Intelligent Machines

Abstract: We live in an increasingly hyper-connected environment where humans, smart devices and robots live in synergy. Flexible, wearable sensors and systems are accelerating this trend by generating ever greater amounts of data for AI algorithms to process and understand. Exciting new understanding and developments in the somatosensory sciences will further augment human abilities and aid applications such as health diagnostics, surgery and predictive analytics. We believe a multi-disciplinary approach, especially in materials design and processing, is essential to achieve near- or even superhuman capabilities in robotics. In the area of manipulation tasks, robots have yet to match human abilities despite progress in various sensing and actuator systems. We apply a neuromorphic approach to sensory systems as a potential pathway towards greater tactile and machine intelligence. I will discuss our approach and recent progress in developing new soft materials systems and neuromorphic approaches for robotic intelligence. Fusion of sensing modalities such as neuromorphic vision and touch will also facilitate robotic learning towards greater autonomy, especially as remote work and “digital twins” gain critical importance in pandemics and the future of work.

Title: Robots touching and touching robots

Abstract: Tactile sensors enable robots to react properly to contacts; they also allow a robot to sense the tactile features of touched objects and, ultimately, to recognize them.

In this talk I will present some experiments involving robots sensorized with large-area capacitive sensors based on the CySkin technology developed at the University of Genova; in particular, I will focus on the problem of recognizing human hand touch.

The coexistence of robots and humans has gained great relevance over the past few years, and the capability of recognizing tactile gestures is a key element for triggering safe human-robot interaction and for driving cooperative tasks or robot motions. Tactile data are acquired by sparse transducers, non-uniformly distributed over non-planar manifolds. These aspects, together with the complexity of the contacts that arise, make the processing of tactile information a difficult task. The approach we propose is based on geometric transformations of the tactile data from 3D maps, formed by the pressure measurements associated with taxels spread over the robot body, into tactile images representing the contact pressure distribution in two dimensions. Deep learning algorithms are then applied to recognize human hands and to compute the pressure distribution applied on the robot by the various hand segments: the palm and the individual fingers.
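
As a rough illustration of this pipeline (a hedged sketch, not the CySkin implementation), the code below flattens taxel pressures into a 2D tactile image via an assumed precomputed surface parameterization `uv`, then classifies it with a small CNN:

```python
# Sketch of the 3D-taxel-map -> 2D-tactile-image -> CNN pipeline the
# abstract outlines. The flattening and the network are illustrative
# stand-ins under the assumption that a 2D chart of the robot surface
# (uv coordinates per taxel) has been precomputed.
import numpy as np
import torch
import torch.nn as nn

def taxels_to_image(uv: np.ndarray, pressures: np.ndarray, res: int = 32) -> np.ndarray:
    """Rasterize sparse taxel pressures (uv in [0,1]^2) into a res x res image."""
    img = np.zeros((res, res), dtype=np.float32)
    ij = np.clip((uv * res).astype(int), 0, res - 1)
    for (i, j), p in zip(ij, pressures):
        img[j, i] = max(img[j, i], p)   # keep the strongest reading per cell
    return img

# A small CNN classifier over tactile images (e.g., hand vs. non-hand contact).
net = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(16 * 8 * 8, 2),
)

uv = np.random.rand(64, 2)        # taxel coordinates in the 2D chart
pressures = np.random.rand(64)    # one pressure reading per taxel
x = torch.from_numpy(taxels_to_image(uv, pressures))[None, None]
logits = net(x)                   # (1, 2) class scores
```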

Title: Force/Tactile Sensor Technology for Robotic Manipulation

Abstract: Nowadays, robotic systems use tactile sensing as a key enabling technology for implementing complex tasks. For example, manipulation and grasping problems strongly depend on the physical and geometrical characteristics of the objects; in fact, objects may be deformable or change their shape when in contact with the robot or the environment. For this reason, robot end effectors are often equipped with sensorized fingers which can estimate the objects' features, forces, and contact locations. The idea of designing and developing, in our laboratories, a tactile sensor based on optoelectronic technology dates to about a decade ago, within the FP7 European project DEXMART. Over the years, the evolution of optoelectronic devices and our experience in the field have allowed us to optimize our prototypes, with the latest versions reaching high measurement performance and a high level of mechatronic integration. The working principle is based on the idea of designing a deformable layer to be suitably assembled with a discrete number of optoelectronic sensing devices, with the objective of transducing external contacts into deformations measured by the optically sensitive points (typically called “taxels” in the literature). The sensing points, positioned below the deformable layer, provide a “tactile map” corresponding to spatially distributed information about the contact. Depending on the application task, the tactile map can be used to reconstruct contact properties, e.g., contact force, contact torque, or object shape. This contribution will present the technology behind the latest solution, in particular the one developed during the last two years within the H2020 European projects REFILLS and REMODEL. Different application scenarios will be presented in order to demonstrate the manipulation abilities based on the reconstructed forces and torques or on the direct use of the tactile map. All these abilities can be performed by simple parallel-jaw grippers equipped with the sensors. The slipping-avoidance ability consists of firmly grasping an object by applying the lowest grasp force that avoids slippage. The pivoting maneuver can be executed in two different modalities, called gripper pivoting and object pivoting, respectively. The first consists of keeping the object fixed in space while the gripper rotates about the grasp axis so as to change the relative orientation between the gripper and the object. The second modality is the dual one and consists of keeping the gripper fixed in space while the object rotates in a pendulum-like motion. Additionally, the tactile map can be directly used to estimate the shape of grasped Deformable Linear Objects (DLOs) and to recognize object features (e.g., wire diameters) by means of machine learning techniques.
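
The slipping-avoidance ability can be made concrete with a toy calculation under a plain Coulomb friction model (an assumption for illustration; the actual controller estimates friction online from the tactile map):

```python
# Toy sketch of the "lowest grasp force that avoids slippage" idea under a
# Coulomb friction assumption: for a parallel-jaw gripper, the tangential
# load (object weight plus inertial forces) must satisfy |F_t| <= mu * F_n
# summed over the contacts, plus a safety margin.

def min_grasp_force(f_tangential: float, mu: float, n_contacts: int = 2,
                    safety: float = 1.2) -> float:
    """Smallest normal force per finger that prevents slip, with a margin."""
    return safety * f_tangential / (mu * n_contacts)

# Example: a 0.5 kg object held vertically, rubber-on-plastic friction.
weight = 0.5 * 9.81
print(f"{min_grasp_force(weight, mu=0.6):.2f} N per finger")
```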

12:05-12:25 (20 min + 5 min Q&A) - Video

Title: Stretchable soft conductive composites for sensing force and touch

Abstract: Although stretchable soft conductive composites, consisting of silicone polymers and conductive fillers, have been used for electrical sensing of force in academic laboratories, these materials have not been able to move into real-world applications. This is largely due to the lack of robust, low-cost and geometrically scalable technologies to reliably connect chemically inert silicone composites with solid-state electronics. In this talk, I will present our recent work [1] on the nanoporous Si-Cu based electrical contact technology and how it enables a range of applications for silicone-based force sensors, especially for sensing touch in medical applications.

 

Reference: [1] Michael Kasimatis, Estefania Nunez-Bajo, Max Grell, Yasin Cotur, Giandrin Barandun, Ji-Seon Kim, and Firat Güder, “Monolithic Solder-On Nanoporous Si-Cu Contacts for Stretchable Silicone Composite Sensors”, ACS Applied Materials & Interfaces 2019, 11 (50), 47577-47586.

Paper presentation - I

     12:30-12:40

  • Low-pass filter effects in biological neurons as a feature to facilitate representation of tactile information

      Udaya B. Rongala and Henrik Jörntell        download paper.pdf

     12:40-12:50

  • A Local Filtering Technique for Robot Skin Systems

      Alessandro Albini, Giorgio Cannata and Perla Maiolino        download paper.pdf

     12:50-13:00

  • Sensor Fusion and Multimodal Learning for Robotic Grasp Verification

      Priteshkumar Gohil, Santosh Thoduka and Paul Plöger   download paper.pdf

13:00-13:25 (20 min + 5 min Q&A) - Video

Title: Grasping with a Sense of Touch 

Abstract: I will report on the new problems that arise in robotic grasping and manipulation with soft, adaptable hands, and on approaches that can be used in conjunction with tactile sensing capabilities. We will consider reactive grasping procedures that progressively refine an initial approximate grasp into a full form-closure grasp. I will also consider interaction with tight environmental constraints, and how manipulation can be planned in cases where classical randomized methods have difficulties.

13:25-13:50 (20 min + 5 min Q&A) - Video

Title: Soft Manipulation with Rigid and Magnetic Constraints

Abstract: Soft robotic hands are powerful end-effectors allowing compliant interactions with the environment and objects. The softness of the fingers makes the physical interaction very robust with respect to uncertainties. However, soft robotic hands are typically underactuated, and this, together with unpredictable deformations, affects the overall accuracy of manipulation tasks. Moreover, because they are underactuated to keep the hand design simple, soft robotic hands are not dexterous.

In this talk I will present how rigid and magnetic constraints can be exploited to improve soft manipulation without compromising the simplicity of the design of soft manipulation systems, along with some ideas on how to combine softness with rigid and magnetic constraints.

13:50-14:15 (20 min + 5 min Q&A) - Video

Title: Soft Robots that Feel – Multimodal Sensing Skins for Soft Robot Grasping

Abstract: By eliminating rigid materials and hard contacts, soft robot end effectors have the potential to revolutionize robot grasping and manipulation. However, their ability to sense and map objects is highly limited by the bulk and stiffness of existing sensor electronics. In this talk, I will present progress in creating soft electronic sensing skins that can be incorporated into soft robot grippers and enable a wide range of sensing modalities. These sensing skins utilize a variety of material architectures, from highly stretchable liquid metal circuits to soft magnetized elastomers. When combined with methods in machine learning, these skins can be used to enable soft robot grippers to perform a variety of closed-loop grasping tasks that were not previously possible with open-loop techniques. Moreover, they can also be used as wearable electronic stickers for monitoring health vitals. In addition to describing their material architecture and sensing properties, I will discuss the utilization of these sensors in a variety of applications, from humanoid robotics to healthcare.

 

14:15-14:40 (20 min + 5 min Q&A) - Video

Title: Soft Robots that Feel – Multimodal Sensing Skins for Soft Robot Grasping

Abstract: Robots should go beyond biological models for sensing. Human and animal tactile sensing is limited to moments when skin, hairs, or whiskers are in contact with an object. Thermal sensing can detect infrared radiation, sensing heat at a distance. Sensing of local wind currents and their temperature can indicate the movement of nearby objects. For robots, we can do better than this by extending tactile sensing with proximity sensing of nearby objects. Proximity sensing is useful for predicting contact time and location, and for generating priors for the contacted object's pose, as well as nearby object locations and poses. In work on a camera-based tactile sensor with transparent skin, we found that vision of nearby objects was useful for centering grasps, grasping with little force, measuring small forces, and letting go of an object without knocking it over. In recent work we have explored cameras collocated with tactile sensors, rather than using the same camera and optical path for both proximity and tactile sensing; this avoids some of the drawbacks of FingerVision. We are also exploring the use of radar for proximity sensing, including proximity sensing of occluded objects.
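
As a toy illustration of the contact-time prediction mentioned above (my sketch, not code from the talk), two successive proximity readings suffice under a constant approach-speed assumption:

```python
# Estimate time to contact from two proximity (distance) samples,
# assuming a roughly constant approach speed between samples.

def time_to_contact(d_prev: float, d_curr: float, dt: float) -> float:
    """Seconds until contact, or infinity if not approaching."""
    approach_speed = (d_prev - d_curr) / dt
    if approach_speed <= 0.0:
        return float("inf")   # object receding or stationary
    return d_curr / approach_speed

print(time_to_contact(0.10, 0.08, dt=0.02))  # 0.08 s to contact
```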

 

“I don’t want to be human. I want to see gamma rays, I want to hear X-rays, and I want to smell dark matter.”

—John Cavil, Cylon Model Number One, “No Exit”, Episode 15, Season 4, Battlestar Galactica (for this meeting a better quote would be: “I want to touch dark matter.”)

 

14:40-15:05 (20 min + 5 min Q&A) - Video

Title: Using Tactile Signals and Grasp Analysis for Real-time Stability Prediction

Abstract: Grasp analysis is a well-developed framework for predicting grasp stability, based on force and torque equilibrium including friction. While it has been widely used for grasp planning, it has not been exploited for real-time control for robot hands. We are developing a highly instrumented robot hand with contact sensors to estimate the quantities needed for stability prediction, namely finger-object contact locations, surface normal vectors at the contact locations, and contact force vectors. Initial results suggest that force signals are intrinsically noisy with the relatively stiff polymer materials typically used for robot fingertips. In addition, the observed frictional behavior does not follow the Coulomb models typically used in grasp analysis. This limits the ability to accurately predict when objects will slip within a grasp. This has implications for the design of effective robot hands, as well as reliable grasp control methods.
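
For context, the classical Coulomb friction-cone test that grasp analysis builds on, and whose real-world validity the abstract calls into question, can be sketched as follows (illustrative code, not the speakers' system):

```python
# Test whether a measured contact force lies inside the Coulomb friction
# cone defined by the surface normal and friction coefficient mu; if it
# does, no slip is predicted at that contact under the Coulomb model.
import numpy as np

def inside_friction_cone(force: np.ndarray, normal: np.ndarray, mu: float) -> bool:
    """True if the tangential component is within mu times the normal
    component (and the contact is pushing, not pulling)."""
    n = normal / np.linalg.norm(normal)
    f_n = float(force @ n)                  # normal component
    f_t = np.linalg.norm(force - f_n * n)   # tangential magnitude
    return f_n > 0.0 and f_t <= mu * f_n

f = np.array([0.3, 0.0, 1.0])   # measured contact force
n = np.array([0.0, 0.0, 1.0])   # estimated surface normal
print(inside_friction_cone(f, n, mu=0.5))   # True: 0.3 <= 0.5 * 1.0
```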

15:05-15:30 (20 min + 5 min Q&A) - Video

Title: MAT: Multi-Fingered Adaptive Tactile Grasping via Deep Reinforcement Learning

Abstract: Vision-based grasping systems typically adopt an open-loop execution of a planned grasp. This policy can fail for many reasons, including ubiquitous calibration error. Recovery from a failed grasp is further complicated by visual occlusion, as the hand usually occludes the vision sensor as it attempts another open-loop regrasp. This talk presents MAT, a tactile closed-loop method capable of realizing grasps provided by a coarse initial positioning of the hand above an object. Our algorithm is a deep reinforcement learning (RL) policy optimized through the clipped surrogate objective within a maximum entropy RL framework to balance exploitation and exploration. The method utilizes tactile and proprioceptive information to act through both fine finger motions and larger regrasp movements to execute stable grasps. A novel curriculum of action motion magnitude makes learning more tractable and helps turn common failure cases into successes. Careful selection of features that exhibit small sim-to-real gaps enables this tactile grasping policy, trained purely in simulation, to transfer well to real-world environments without the need for additional learning. Experimentally, this methodology substantially improves grasp success rate over a vision-only approach on a multi-fingered robot hand. When it is used to realize grasps from coarse initial positions provided by a vision-only planner, the system becomes dramatically more robust to calibration errors in the camera-robot transform.
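
The clipped surrogate objective named in the abstract is the PPO-style loss; a minimal numpy sketch with an entropy bonus (shapes and coefficients are illustrative) looks like this:

```python
# Clipped surrogate objective with an entropy bonus, as in PPO-style
# maximum entropy RL: take the minimum of the unclipped and clipped
# policy-ratio terms, then add a weighted entropy term. Maximize this.
import numpy as np

def clipped_surrogate(ratio, advantage, entropy, eps=0.2, ent_coef=0.01):
    """ratio = pi_new(a|s) / pi_old(a|s), elementwise over a batch."""
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    return np.mean(np.minimum(unclipped, clipped)) + ent_coef * np.mean(entropy)

ratio = np.array([0.9, 1.3, 1.05])   # policy probability ratios
adv = np.array([1.0, -0.5, 2.0])     # advantage estimates
ent = np.array([1.2, 1.1, 1.3])      # per-state policy entropy
print(clipped_surrogate(ratio, adv, ent))
```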

Paper presentation - II

     15:30-15:40

  • Active Tapping via Gaussian Process for Efficient Unknown Object Surface Reconstruction

      Su Sun and Byung-Cheol Min   download paper.pdf

     15:40-15:50

  • TIAGo RL: Simulated Reinforcement Learning Environments with Tactile Data for Mobile Robots

      Luca Lach, Robert Haschke, Francesco Ferro, Helge Ritter      download paper.pdf

     15:50-16:00

  • Towards a soft robotic, haptic feedback seat for autonomy level transitions in highly-automated vehicles

       Jan Peters, Bani Anvari, Annika Raatz and Helge A. Wurdemann       download paper.pdf

16:00-16:25 (20 min + 5 min Q&A) - Video

Title: Thermal and Tactile Sensing and the Development of Multi-sensory Cutaneous Displays

Abstract: When the hand makes contact with an object, its geometric and material properties are readily encoded by cutaneous mechanoreceptors that signal features such as the object’s shape, surface texture and compliance. Changes in skin temperature can also occur as the object is manipulated, with the thermal properties of the object and skin, as well as their initial temperatures, determining whether heat flux is conducted out of the skin or the object on contact. These changes in temperature provide information about the object’s thermal properties, which assists in identifying its material composition. Although the thermal cues are subtle and changes in temperature are strictly localized to the area of contact, we have demonstrated in a number of experiments that such signals not only enable the composition of objects to be identified and discriminated but also provide information about contact force and area. Over the range of forces typically used during manual exploration (0.1-6 N), skin temperature decreases by an average of 5-6 °C after 10 s, reflecting changes in blood flow to the finger pad as it is compressed. Thermal models that incorporate such contact conditions and material properties have been shown to capture these changes in skin temperature. When implemented in thermal displays, they enable users to identify and discriminate between simulated materials. Thermal feedback has also been combined with vibrotactile feedback in multisensory cutaneous displays to enhance user experience during object manipulation in virtual environments or when working with teleoperated robotic systems. Given the very different temporal and spatial processing properties of the tactile and thermal sensory systems, it is critical to determine the optimal temporal profiles for presenting such cues so that the signals are not masked. Our work has demonstrated that the perception of tactile cues can be enhanced or impeded depending on whether the skin is warmed or cooled, and that these effects are specific to particular features of the vibrotactile signals presented.
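
The baseline physics behind such thermal models can be sketched with the textbook semi-infinite-body contact formula, in which thermal effusivity e = sqrt(k*rho*c) weights the interface temperature (a simplification; the models described in the talk additionally account for contact force, area, and blood flow):

```python
# Interface temperature when two semi-infinite bodies touch: a weighted
# average of their initial temperatures, weighted by thermal effusivity
# e = sqrt(k * rho * c). Material values below are rough approximations.
import math

def effusivity(k: float, rho: float, c: float) -> float:
    """Thermal effusivity in W*s^0.5/(m^2*K)."""
    return math.sqrt(k * rho * c)

def contact_temperature(t_skin: float, e_skin: float,
                        t_obj: float, e_obj: float) -> float:
    return (e_skin * t_skin + e_obj * t_obj) / (e_skin + e_obj)

e_skin = effusivity(k=0.37, rho=1000, c=3500)    # approximate skin values
e_copper = effusivity(k=400, rho=8960, c=385)
e_wood = effusivity(k=0.15, rho=500, c=1600)
print(contact_temperature(34.0, e_skin, 24.0, e_copper))  # ~24 C: feels cold
print(contact_temperature(34.0, e_skin, 24.0, e_wood))    # ~32 C: feels warm
```

The large predicted skin-temperature drop for copper versus wood matches the everyday observation that metals feel colder than wood at the same room temperature.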

16:25-16:50 (20 min + 5 min Q&A) - Video

Title: Towards Proprioception and Exteroception for Soft Growing Robots

Abstract: Due to their ability to move without sliding relative to their environment, soft growing robots are attractive for exploring unknown environments and deploying distributed sensor networks in confined spaces. Sensing the state of such robots and their environment would add to their capabilities as human-safe, adaptable manipulators. However, incorporating sensors into soft growing robots is challenging because it requires an interface between stiff and soft materials, and the sensors need to undergo significant strain. In this work, we present two methods for adding distributed sensors to soft growing robots that use (1) bundled optical fibers for strain sensing using Optical Frequency Domain Reflectometry (OFDR) and (2) flexible printed circuit boards with self-contained units of microcontrollers and sensors encased in a laminate armor that protects them from unsafe curvatures. We demonstrate several capabilities of these sensing systems, including proprioception to measure growing robot shape and exteroception in the form of directional temperature and humidity information. This work advances the capabilities of soft growing robots, as well as the field of soft robot sensing.
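
As a hedged planar toy model (not the paper's OFDR pipeline) of how distributed strain yields proprioception: strain measured at a known offset from the neutral axis gives curvature, which integrates to shape:

```python
# Planar shape reconstruction from distributed strain: strain at offset d
# from the neutral axis gives curvature kappa = strain / d, and integrating
# curvature along arc length yields the backbone curve.
import numpy as np

def shape_from_strain(strain: np.ndarray, d: float, ds: float) -> np.ndarray:
    """Return (x, y) backbone points from per-segment strain samples."""
    kappa = strain / d
    theta = np.cumsum(kappa * ds)      # bending angle along the body
    x = np.cumsum(np.cos(theta) * ds)
    y = np.cumsum(np.sin(theta) * ds)
    return np.stack([x, y], axis=1)

strain = np.full(100, 5e-4)            # uniform strain -> circular arc
pts = shape_from_strain(strain, d=1e-3, ds=0.01)
print(pts[-1])                         # tip position of a 1 m robot
```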

16:50-17:15 (20 min + 5 min Q&A) - Video

Title: The Visiflex tactile and wrench-sensing fingertip

Abstract: Robot manipulation of, and contact with, rigid and nearly rigid objects requires compliance at the manipulator. In this talk I will describe the Visiflex, a tactile fingertip that uses a camera and a passive six-dof flexure to achieve the desired compliance and to simultaneously sense applied wrenches (forces and moments) and contact locations. I will also describe desirable symmetry properties of the flexures. These passively-compliant sensing fingertips enable the application of a theoretical framework for planning and controlling dexterous tasks such as in-hand sliding regrasps.
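
The underlying sensing principle can be sketched as a linear stiffness map from measured flexure deflection to applied wrench; the stiffness values below are invented for illustration, not the Visiflex calibration:

```python
# With a passive six-dof flexure of known stiffness, the applied wrench
# follows from the measured fingertip deflection via w = K @ dx. The
# camera-based pose measurement is outside the scope of this toy example.
import numpy as np

K = np.diag([800.0, 800.0, 1200.0,   # translational stiffness (N/m)
             2.0, 2.0, 3.0])          # rotational stiffness (N*m/rad)

def wrench_from_displacement(dx: np.ndarray) -> np.ndarray:
    """dx = (tx, ty, tz, rx, ry, rz) flexure deflection -> (force, torque)."""
    return K @ dx

dx = np.array([0.001, 0.0, -0.002, 0.0, 0.01, 0.0])
print(wrench_from_displacement(dx))   # [0.8, 0, -2.4, 0, 0.02, 0]
```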

17:15-17:40 (20 min + 5 min Q&A) - Video

Title: Haptic Sensing, Perception and Soft Mechanics

Abstract: The sense of touch is essential for skilled manipulation and object perception. Tactile sensing by humans and other animals is supported by biomechanical coupling in soft tissues, which transforms mechanical signals that are elicited by even localized touch contacts and distributes these signals to widespread tactile sensory neurons. In this talk I will discuss how these processes are revising our understanding of haptic perception and how they furnish new ideas and strategies for haptic and robotic engineering.

Paper presentation - III

     17:40-17:50​

  • An Active Extrinsic Contact Sensing for Generalizable Insertion Strategy

      Sangwoon Kim and Alberto Rodriguez   download paper.pdf

     17:50-18:00

  • Active Visuo-Tactile Object Pose Estimation

      Prajval Kumar Murali and Mohsen Kaboli      download paper.pdf

     18:00-18:10

  • Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors

       Abhinav Grover, Christopher Grebe, Philippe Nadeau and Jonathan Kelly   download paper.pdf

18:20-19:20 Panel Discussion 
Best paper and presentation award

Video
