Current smart objects are mainly controlled by voice or via mobile devices. Smart tools, however, require seamless interaction that does not interfere with the tool's operation. Touch interaction is the natural alternative, as it requires no additional devices. Touch gestures in particular are promising.
Most research on gesture interaction designs gestures through elicitation studies. Traditional mobile devices are designed to be held comfortably during use: their fronts are reserved for touch interaction, while the sides and back provide a secure grip. The surface of smart tools cannot be divided into such discrete areas because the grip area and the interaction surface lie close together. For smart tool gestures, the Midas touch problem therefore gains importance: the system must distinguish between holding the tool and deliberately operating it. Current gesture design methods do not consider whether gestures differ from simply holding the tool.
Therefore, the goal of this project is to develop methods for designing touch-based gestures for smart tools by deeply integrating into the design process both the distinguishability of explicit interaction from a natural grasp and the recognizability of gestures. We will extend established elicitation studies by introducing a discriminability assessment and a recognizability assessment. The discriminability score describes how well a gesture can be distinguished from naturally holding and using a device; the recognizability score describes how reliably a gesture can be recognized. Both scores can easily be combined with the commonly used agreement score. This will enable us to design gestures that are not affected by the Midas touch problem and are thus truly usable. We will demonstrate the new method by designing gestures for three exemplary tools: a drill, a cordless screwdriver, and a tool grinder.
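How the three scores might be combined can be sketched as follows. This is a minimal illustration only: the score names follow the project description, but the weighting scheme, the value range [0, 1], and the function name are assumptions for the sake of the example, not the project's actual method.

```python
# Hypothetical sketch: combining per-gesture scores from an elicitation study.
# Assumes all three scores lie in [0, 1]; the equal default weighting is an
# illustrative choice, not the project's defined combination rule.

def combined_score(agreement: float, discriminability: float,
                   recognizability: float,
                   weights=(1 / 3, 1 / 3, 1 / 3)) -> float:
    """Weighted mean of agreement, discriminability, and recognizability."""
    w_a, w_d, w_r = weights
    return w_a * agreement + w_d * discriminability + w_r * recognizability

# A gesture with high agreement but poor discriminability (easily confused
# with a natural grip) is penalized in the combined score.
print(combined_score(0.9, 0.2, 0.8))  # roughly 0.63
```

Under such a scheme, a gesture that participants agree on but that overlaps heavily with a natural grip would rank lower than a slightly less popular but clearly distinguishable gesture.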
After prototyping the three tool types with capacitive sensors, we will use them to collect touch data while the tools are naturally held and used. The collected dataset will reveal how gestures overlap with natural interaction, allowing us to derive discriminability and recognizability scores for a given gesture. We will then conduct elicitation studies that incorporate these scores and evaluate the resulting gesture sets. By comparing gesture sets designed with the conventional method and with our method, we will demonstrate an improved design method for gesture interaction with smart tools.
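One way such an overlap-based discriminability score could be computed is sketched below. This is a hedged illustration under assumptions: capacitive frames are flattened into feature vectors, and a gesture frame counts as distinct if it lies beyond a distance threshold from every naturally-held frame. The nearest-neighbour criterion, the Euclidean distance, and the threshold value are assumptions for this sketch, not the project's defined metric.

```python
# Hypothetical sketch of a distance-based discriminability score over
# capacitive touch frames, each represented as a flat feature vector.
import math

def discriminability(gesture_frames, grip_frames, threshold=1.0):
    """Fraction of gesture frames farther than `threshold` (Euclidean
    distance) from every frame recorded during natural holding/use."""
    distinct = sum(
        1 for g in gesture_frames
        if min(math.dist(g, h) for h in grip_frames) > threshold
    )
    return distinct / len(gesture_frames)

# Toy 2-D "frames": two of the three gesture frames lie far from the grips.
grips = [(0.0, 0.0), (0.1, 0.2)]
gesture = [(3.0, 3.0), (0.05, 0.1), (4.0, 0.0)]
print(discriminability(gesture, grips))  # 2 of 3 frames are distinct
```

A score near 1 would indicate a gesture that rarely resembles the natural grip data in the collected dataset, while a score near 0 would flag a gesture prone to the Midas touch problem.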
Institut für Information und Medien, Sprache und Kultur, Lehrstuhl für Medieninformatik, Prof. Dr. Niels Henze