Robot learning and tactile sensors

Multimaterial modular design through robot learning and touch sensor technology

Close-up view of a learning attempt: the robot learns to perform tactile-based architectural assembly with SL blocks.

Duration: 7.2020 – 3.2021


Project team:

Prof. Jan Peters, Ph.D. | FB 20, FG Intelligente Autonome Systeme (IAS)

Prof. Dr.-Ing. Oliver Tessmann | FB 15, FG Digitales Gestalten (DDU)


Project description:

This interdisciplinary research project aims to investigate and leverage a vision-based tactile sensing technology, recently developed in a collaboration between the departments of architecture and computer science, for building multi-material modular structures. The tactile sensor allows material properties such as type (e.g., concrete, wood) and texture (e.g., hard, soft) to be inspected in ways not achievable by other means. Furthermore, force and torque can be measured directly at a contact location. Based on such tactile input, a robotic assembly process can be developed that adapts to inter-material contact properties, enabling the autonomous construction of modular structures combining materials of various types and textures. Since such structures have not been robot-assembled before, an investigation into possible geometries, material arrangements, and interconnection types is required. The expected outcomes of the project are learning-based algorithms that adapt to material and contact properties, as well as guidelines and requirements for multi-material modular structures.
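To illustrate the kind of contact-adaptive behavior described above, the following is a minimal sketch, not the project's actual method: a placement loop in which the robot lowers a block until the tactile sensor reports a target normal force, then stops. All names and parameters (the toy spring-contact model, `target_force`, the gain) are illustrative assumptions; a real system would read forces from the vision-based tactile sensor instead of simulating them.

```python
def simulate_tactile_force(z: float, surface_z: float, stiffness: float) -> float:
    """Toy contact model (assumption, stands in for the real tactile sensor):
    zero force above the surface, linear spring force below it."""
    penetration = surface_z - z
    return stiffness * penetration if penetration > 0 else 0.0


def place_block(z_start: float, surface_z: float, target_force: float,
                stiffness: float = 100.0, gain: float = 0.005,
                max_steps: int = 1000, tol: float = 0.05) -> tuple[float, float]:
    """Lower the end effector until the sensed normal force reaches target_force.

    Returns the final height and the last sensed force.
    """
    z = z_start
    force = 0.0
    for _ in range(max_steps):
        force = simulate_tactile_force(z, surface_z, stiffness)
        error = target_force - force
        if abs(error) < tol:
            break  # contact established with the desired force
        # Proportional correction: descend while force is too low,
        # back off if the contact force overshoots the target.
        z -= gain * error
    return z, force
```

Softer materials (lower `stiffness`) yield deeper penetration before the same target force is reached, which is one simple way tactile feedback can let the same controller handle different material pairings.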


