By Krishna Kumar
| October 16th 2007 03:40 AM
Haptics - Virtuality Explored
Haptics refers to sensing and manipulation through touch. Since the early part of the twentieth century, the term haptics has been used by psychologists for studies on the active touch of real objects by humans. In the late nineteen-eighties, when we started working on novel
machines pertaining to touch, it became apparent that a new discipline was emerging that needed a name. Rather than concocting a new term, we chose to redefine haptics by enlarging its scope to include machine touch and human-machine touch interactions. Our working definition of haptics includes all aspects of information acquisition and object manipulation through touch by humans, machines, or a combination of the two; and the
environments can be real, virtual, or teleoperated. This is the sense in which substantial research and development in haptics is being pursued around the world today.
In order to organize the rapidly increasing multidisciplinary research literature, it is useful to define sub-areas of haptics. Haptics can be subdivided into three areas:
1. human haptics – the study of human sensing and manipulation through touch;
2. machine haptics – the design, construction, and use of machines to replace or augment human touch;
3. computer haptics – the algorithms and software associated with generating and rendering the touch and feel of virtual objects (analogous to computer graphics).
Consequently, multiple disciplines such as biomechanics, neuroscience, psychophysics, robot design and control, mathematical modeling and simulation, and software engineering converge to support haptics. A wide variety of applications has emerged, spanning many areas of human need such as product design, medical trainers, and rehabilitation.
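The rendering side of computer haptics is easiest to see with a concrete example. The sketch below shows a minimal penalty-based rendering rule for a rigid virtual sphere: when the user's probe point penetrates the surface, the reaction force is proportional to the penetration depth along the outward surface normal. The function name, geometry, and stiffness value are illustrative assumptions, not taken from any particular system.

```python
import math

def contact_force(probe_pos, sphere_center, sphere_radius, stiffness=1000.0):
    """Penalty-based haptic rendering of a rigid virtual sphere.

    Returns the reaction force pushing the probe out of the surface:
    F = stiffness * penetration_depth along the outward normal.
    Units and the stiffness value (N/m) are illustrative.
    """
    # Vector from sphere center to probe, and its length
    dx = [p - c for p, c in zip(probe_pos, sphere_center)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = sphere_radius - dist
    if penetration <= 0 or dist == 0:
        return (0.0, 0.0, 0.0)  # probe outside the sphere: free space
    normal = [d / dist for d in dx]  # outward unit normal at contact
    return tuple(stiffness * penetration * n for n in normal)
```

For example, a probe 1 mm inside a 10 cm sphere of stiffness 1000 N/m feels a 10 N force directed outward; outside the sphere the rendered force is zero, which is what gives the user the impression of free space between contacts.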
Haptics is poised for rapid growth. Just as primitive man forged hand tools to triumph over harsh nature, we need to develop smart devices to interface with information-rich real and virtual worlds. Given the ever-increasing quantities and types of information that surround us, and to which we need to respond rapidly, there is a critical need to explore new ways to interact with information. To be efficient in this interaction, it is essential that we utilize all of our sensorimotor capabilities. Our haptic system – with its tactile, kinesthetic, and motor capabilities, together with the associated cognitive processes – presents a uniquely bidirectional information channel to our brains, yet it remains underutilized. If we add force and/or distributed tactile feedback of sufficient range, resolution, and frequency bandwidth to match the capabilities of our hands and other body
parts, a large number of applications open up, such as haptic aids for a blind user surfing the net or a surgical trainee perfecting his trade. Ongoing engineering revolutions in information technology and the miniaturization of sensors and actuators are bringing this
dream ever closer to reality.
Virtual environments (VEs), generally referred to as virtual reality in the popular press, have caught the imagination of the lay public as well as of researchers working in a wide variety of disciplines. VEs are computer-generated synthetic environments with which a human user can interact to perform perceptual and motor tasks. A typical VE system consists of a helmet that can project computer-generated visual images and sounds appropriate to the gaze direction, and special gloves with which one can command a computer through hand gestures. The possibility that, by wearing such devices, one could be mentally transported
to and immersed in virtual worlds built solely through software is both fascinating and powerful. Applications of this technology include a large variety of human activities such as training, education, entertainment, health care, scientific visualization, telecommunication, design, manufacturing, and marketing. Virtual environment systems that engage only the visual and auditory senses of the user are limited in their capability to interact with the user. As in our interactions with the real world, it is desirable to engage the haptic sensorimotor system, which not only conveys the sense of touch and feel of objects but also allows us to manipulate them. In particular, the human hand is a versatile organ that is able to press, grasp, squeeze, or stroke objects; it can
explore object properties such as surface texture, shape and softness; it can manipulate tools such as a pen or a jack-hammer. Being able to touch, feel, and manipulate objects in an environment, in addition to seeing (and/or hearing) them, gives a sense of compelling immersion in the environment that is otherwise not possible. Real or virtual environments
that deprive the human user of the touch and feel of objects seem deficient and seriously handicap human interaction capabilities. It is likely that a more immersive experience in a VE can be achieved by the synchronous operation of even a simple haptic interface with a
visual and auditory display, rather than by large improvements in, say, the fidelity of the visual display alone.
Haptic interfaces are devices that enable manual interactions with virtual environments or teleoperated remote systems. They are employed for tasks that are usually performed using hands in the real world, such as manual exploration and manipulation of objects. In general, they receive motor action commands from the human user and display appropriate
tactual images to the user. Such haptic interactions may or may not be accompanied by the stimulation of other sensory modalities such as vision and audition.
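The command/display exchange just described is usually implemented as a high-rate servo loop: read the user's motion from the device, compute a reaction force, and command it back, many hundreds of times per second. The sketch below shows the basic shape; the `read_position` and `send_force` callbacks are hypothetical stand-ins for a device driver API, and real implementations run near 1 kHz on a hard real-time clock rather than `time.sleep`.

```python
import time

def haptic_servo_loop(read_position, render_force, send_force,
                      rate_hz=1000, duration_s=1.0):
    """Skeleton of a haptic interface servo loop.

    read_position / send_force stand in for a device driver API
    (hypothetical callbacks); render_force maps a probe position to
    a reaction force. Returns the number of servo steps executed.
    """
    period = 1.0 / rate_hz
    steps = int(duration_s * rate_hz)
    for _ in range(steps):
        pos = read_position()        # motor-action command from the user
        force = render_force(pos)    # e.g. penalty-based contact force
        send_force(force)            # tactual display back to the user
        time.sleep(period)           # placeholder for a real-time scheduler
    return steps
```

The high update rate is not incidental: if the loop runs much slower than about 1 kHz, stiff virtual surfaces feel soft or buzz under the user's hand, so the loop rate is itself a design constraint on haptic interface hardware and software.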
Although computer keyboards, mice, trackballs, and even instrumented gloves available in the market can be thought of as relatively simple haptic interfaces, they can only convey the user’s commands to the computer, and are unable to give a natural sense of touch and
feel to the user. Recent advances in the development of force-reflecting haptic interface hardware, as well as haptic rendering software, have caused considerable excitement. The underlying technology is becoming mature and has opened up novel and interesting research areas. However, to really enable the wide variety of known applications of haptics, and even more so the applications that we cannot yet imagine, it is critical to
understand the nature of touch interaction – how and what we perceive, how we manipulate, and how these relate to task performance. The challenge of haptics research is then two-fold: to gain a deep scientific understanding of our haptic sensorimotor system and to develop appropriate haptic interface technology. In this short introductory document, we primarily provide an overview of the major sub-areas of haptics and refer the reader to some of our more detailed reviews (which, in turn, have substantial references to works by us and others) for a more in-depth look. In the first section, we provide the basics of how we feel and how to mimic that feel. The following section is a basic introduction to human haptics, the study of the human sensorimotor system relevant to manual exploration and manipulation. The subsequent section is on machine haptics, concerned with the electromechanical devices used as haptic interfaces.