HAPTICS (pronounced HAP-tiks) is the science of applying touch (tactile) sensation and control to interaction with computer applications. Using special devices (joysticks, data gloves, or others), users can receive feedback from computer applications in the form of felt sensation in the hand or other parts of the body. In combination with a visual display, haptics technology can be used to train people for tasks requiring hand-eye coordination, such as surgery and spacecraft maneuvers. It can also be used in games in which you feel as well as see your interactions with the images. For example, you might play tennis with another computer user somewhere else in the world. Both of you can see the moving ball and, using the haptic device, position and swing your tennis racket and feel the impact of the ball.
Creating robots that can see, feel, smell and taste
It has taken millions of years of evolution to create the myriad of animal designs that roam the Earth today. But despite the incredible range of shapes and sizes that have resulted, many of these creatures share some very basic and vital senses: vision, touch, smell and taste.
So fundamental to survival are these senses that roboticists long overlooked the sheer complexity of their physiology. Now, after years of painstaking research, we are beginning to create robots that can interact with the world around them, and the result is almost life-like.
The standard list of five senses does not really give our bodies credit for all of the amazing things they can do. There are at least a dozen different things we can sense.
In order for us to have a sense, there needs to be a sensor. Each sensor is tuned to one specific sensation. For example, there are sensors in your eyes that can detect light, and that is all they can detect. To track down all of the different senses a person has, the easiest thing to do is to catalog all of the different sensors. Here is a reasonable list.
In your eyes, you have two different types of light sensors. One set of sensors, called the rods, senses light intensity and works in low-light situations. The other type, called cones, can sense colors and require fairly intense light to be activated.
In your inner ears, there are sound sensors. Also in your ears are sensors that let you detect your orientation in the gravitational field; they give you your sense of balance. In your skin, there are at least five different types of nerve endings, for sensing heat, cold, pain, itch and pressure. These cells give us the senses of touch, pain, temperature and itch.
The fight for sight
Artificial vision involves far more than simply attaching a camera to the head of a robot. If robots are to actually react in a suitable way, they must be able to interpret what they are seeing.
How do you get a robot to recognize an apple? You have to program it with enough information in its internal memory system, so that the apple could not possibly be anything else. But look around you; think how much information you would have to provide to account for all the objects in the room, let alone the world!
In the broad picture, artificial vision is still in its infancy. There are no HAL 9000s just around the corner, but when that day arrives, the robotics industry will take off like never before.
For now, scientists have been able to design visual robots capable of performing very specific jobs – driving cars, playing badminton and even putting out fires. Three-dimensional vision is just making its debut, enabling motion detection far more accurate than humans are capable of.
Alex Zelinsky is the founder of the company Seeing Machines in Australia. One of his products is a computerized camera, rigged up inside the cockpit of a car, to monitor the tiny movements in a driver’s face.
Information on gaze and eyelid activity is then analyzed to determine the level of fatigue. With over 1000 road deaths per year in Australia attributed to over-tired drivers, this could be a very important move.
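One widely used eyelid-activity measure is PERCLOS: the fraction of time, over a window, that the eyes are mostly closed. A minimal sketch of the idea (the threshold and the sample data are illustrative, not from Seeing Machines):

```python
def perclos(eye_openness, closed_threshold=0.2):
    """eye_openness: per-frame values in [0, 1], where 1 = fully open.
    Returns the fraction of frames the eye is considered closed."""
    if not eye_openness:
        return 0.0
    closed = sum(1 for v in eye_openness if v < closed_threshold)
    return closed / len(eye_openness)

# A driver whose eyes are closed in 3 of 10 sampled frames:
frames = [0.9, 0.8, 0.1, 0.05, 0.85, 0.9, 0.1, 0.9, 0.95, 0.9]
print(perclos(frames))  # 0.3
```

A real system would compute this over a sliding window of camera frames and raise an alarm when the value stays high.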
To feel for real
As you touch your keyboard now, millions of tiny nerves relay information to the brain about the position, texture and movement of the keys. Robots have no nerve impulses, so how do they manipulate screwdrivers and spanners to perform delicate tasks?
The story of artificial touch begins in an area of virtual reality called HAPTICS, which describes the physical handling of virtual objects. With your fingers placed in special thimbles, you can pluck non-existent objects out of a virtual environment and even watch your virtual hands doing it!
Mandayam Srinivasan, director of the popular Touch Lab, explains: “We could create a doughnut shape ... make it feel sticky on the outside, with a gooey virtual jelly centre.”
One of the most exciting applications of Haptics is telerobotic surgery. Surgeons can perform an operation without actually being present. Using virtual technology, they not only control the robot from afar, but also can actually feel their way through the operation.
In Australia, Professor Andy Russell of Monash University is creating robots that can sense temperature changes through touch. By placing a heater in the robot finger to mimic the heating of blood in our fingers, and a thermistor to sense the temperature change, he has created robots that can follow heat trails.
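To follow a heat trail, a robot only needs to compare readings from sensors on either side and steer toward the warmer one. A minimal sketch, assuming two hypothetical thermistor readings in degrees Celsius (this is an illustration, not Professor Russell's actual control law):

```python
def steer_toward_heat(left_temp, right_temp, dead_band=0.2):
    """Return a turn command based on which side is warmer.
    dead_band (degrees C) avoids twitching on sensor noise."""
    diff = left_temp - right_temp
    if diff > dead_band:
        return "turn_left"    # left sensor is warmer: the trail is to the left
    if diff < -dead_band:
        return "turn_right"   # right sensor is warmer
    return "straight"         # roughly centred on the trail

print(steer_toward_heat(31.5, 30.8))  # turn_left
print(steer_toward_heat(30.0, 30.1))  # straight
```

This bang-bang style of steering is the thermal analogue of classic line-following.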
“Heating the ground to about 50 degrees with a quartz halogen bulb gave a heat trail that could be detected 15 minutes later,” said Professor Russell. One of the most sophisticated robotic hands to date came from Chinese developer Liu Hong, with 96 sensors and four joints in each finger. Robotic limbs, you see, are not limited to the evolutionary constraints of our bodies. Theoretically, a robotic hand could have dozens of fingers able to withstand extreme temperatures and rotate 360 degrees, with handy tools that pop out of the fingertips when needed! One example is Robonaut, a robot of surpassing dexterity created by NASA to reach further than the human hand and expand our ability for space construction and discovery.
Robots that smell!
Like ants following their own pheromone trails back home, robots can be fitted with special quartz crystal microbalance (QCM) sensors to detect and follow specific chemicals along the ground. Again Professor Russell explains: “The QCM sensors actually weigh the odor molecules, and the extra weight reduces the crystal frequency.”
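The "weighing" Professor Russell describes is usually modeled by the Sauerbrey equation, which relates adsorbed mass to the drop in a quartz crystal's resonant frequency. A sketch using the standard constants for AT-cut quartz (the sample mass and crystal parameters below are illustrative):

```python
import math

# Standard AT-cut quartz constants (CGS units):
RHO_Q = 2.648        # density of quartz, g/cm^3
MU_Q = 2.947e11      # shear modulus of quartz, g/(cm*s^2)

def sauerbrey_shift(f0_hz, added_mass_g, area_cm2):
    """Frequency change (Hz) of a QCM crystal when mass is adsorbed.
    Negative: added mass lowers the resonant frequency."""
    return -2.0 * f0_hz**2 * added_mass_g / (area_cm2 * math.sqrt(RHO_Q * MU_Q))

# 1 microgram of odorant on a 1 cm^2, 5 MHz crystal:
shift = sauerbrey_shift(5e6, 1e-6, 1.0)
print(round(shift, 1))  # about -56.6 Hz
```

Counting nanograms through a frequency shift of tens of hertz is what makes QCM sensors sensitive enough to detect faint chemical trails.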
With the ability to sense particular odors in this way, the applications of robots will stretch yet further. For example, swarms of robots could move together without colliding by avoiding the smell of their neighbors.
Just a taster!
In January this year, a hand-held robotic tongue was unveiled to the world. Now threatening to replace professional tasters, this tongue is able to distinguish not only between two different wines from the same winery but also between different years! Sophisticated wine connoisseur it may be, but the science behind it is simple. The tongue's electric circuit contains four chemical sensors relating to the four basic tastes: sweet, sour, salty and bitter. These sensors absorb dissolved substances differently. Specific foods have a unique “fingerprint” of these substances, therefore affecting the conductivity of the circuit in their own unique way.
A multi-channel taste sensor, namely an electronic tongue, with global selectivity is composed of several kinds of lipid/polymer membranes that transform information about taste substances into electrical signals, which are input to a computer. The sensor output exhibits different patterns for chemical substances that have different taste qualities, such as saltiness, sourness and bitterness, whereas it exhibits similar patterns for chemical substances with similar tastes. The sensor responds to taste itself, as can be understood from the fact that taste interactions, such as the suppression effect that appears between sweet and bitter substances, can be reproduced well. The tastes of foodstuffs such as beer, coffee, mineral water, milk, sake, rice, soybean paste and vegetables can be discussed quantitatively using the taste sensor, which provides an objective scale for the human sensory system.
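Because similar tastes produce similar sensor patterns, a simple way to identify a sample is to compare its multi-channel reading against stored reference fingerprints and pick the nearest. A minimal nearest-neighbor sketch, with made-up four-channel readings (sweet, sour, salty, bitter) rather than real sensor data:

```python
import math

# Hypothetical four-channel fingerprints; the numbers are illustrative only.
reference = {
    "mineral water": (0.1, 0.1, 0.3, 0.1),
    "coffee":        (0.2, 0.4, 0.2, 0.9),
    "sake":          (0.6, 0.5, 0.2, 0.3),
}

def classify(sample):
    """Match a sample pattern to the closest reference fingerprint
    by Euclidean distance in the four-channel taste space."""
    return min(reference, key=lambda name: math.dist(sample, reference[name]))

print(classify((0.25, 0.45, 0.15, 0.85)))  # coffee
```

Real electronic tongues use more sophisticated pattern-recognition methods, but the principle is the same: tastes live close together in sensor space when they taste alike.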
ABOUT HAPTICS LABORATORY
The Haptics Laboratory works on the engineering and design of haptic interfaces, that is, systems comprising software and hardware components that engage the sense of touch.
They work on on-line computational models of interaction between objects (deformation, friction, cutting, etc.) which can provide high-fidelity simulations as needed, for example, in the construction of surgical simulators. They are also interested in the study of perceptual effects involving touch, and in taking advantage of them to create devices, visualization methods and tactile displays.
The laboratory is involved in exciting applications including rehabilitation, operator assistance in space, medicine, and computer music performance. In the past ten years, a variety of haptic devices were created, including the Pantograph and the Freedom-7. Other devices based on different principles are presently being developed. Actuators and their methods of control, being at the source of movement, are also of great interest.
The laboratory has also made contributions in robot programming, trajectory generation, mechanism design and computational collision detection. The Haptics Laboratory is part of the Centre for Intelligent Machines at McGill University.
What is haptic interaction?
"A haptic interface is a force reflecting device which allows a user to touch, feel, manipulate, create and/or alter simulated 3D objects in a virtual environment" haptic. (Adjective Grk: haptein) having to do with the sense of touch; tactile haptics = touch, tactile, force-feedback, using force/resistance, texture, heat, vibration
How does it work?
Force display technology works by using mechanical actuators to apply forces to the user. By simulating the physics of the user’s virtual world, we can compute these forces in real-time, and then send them to the actuators so that the user feels them.
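The simplest such real-time force computation treats a virtual wall as a stiff spring: once the device tip penetrates the wall, push back in proportion to the penetration depth (Hooke's law). A minimal sketch, with an illustrative stiffness value:

```python
def wall_force(position, wall_x=0.0, stiffness=500.0):
    """Spring-model virtual wall: push back proportionally to how far
    the device tip has penetrated past the wall plane (Hooke's law).
    position and wall_x in metres; stiffness in N/m; returns force in N."""
    penetration = wall_x - position   # positive once the tip crosses the wall
    if penetration <= 0:
        return 0.0                    # free space: no force
    return stiffness * penetration    # inside the wall: push the user out

# The tip is 2 mm inside the wall:
print(wall_force(-0.002))  # 1.0 (newtons, sent to the actuators)
```

In practice this calculation runs at around 1 kHz, since lower update rates make rigid surfaces feel soft or unstable.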
Why is it going to be important?
• Bill Buxton "hands on = finger on"
• Interfaces are not exploiting available computing power
• More 3D and VR environments in games and elsewhere
• Demand for richer input and output possibilities
• Reduction in fatigue
• Increase in productivity and comfort
• Decreased learning times
• Large reductions in manipulation errors
What sorts of products are being produced?
• The Phantom haptic interaction device
• Magnetic levitation interaction devices
• Exoskeleton devices
• The Freedom 7
Device turns computer into means for touching virtual objects
Like a high-tech version of a child's interactive "touch and feel" book, this computer interface lets users feel a virtual object to learn its shape, and whether it's rough as sandpaper, hard as rock or hot as fire. What use is that, one might ask? Plenty, if you're in the business of creating new ways for humans and computers to interact. One of the stickiest problems in developing advanced human-computer interfaces is finding a way to simulate touch. Without it, virtual reality isn't very real. Now a group of researchers at MIT's Artificial Intelligence Laboratory have found a way to communicate the tactile sense of a virtual object -- its shape, texture, temperature, weight and rigidity -- and let the user change those characteristics through a device called the PHANToM haptic interface.
For instance, you could deform a box's shape by poking it with your finger, and actually feel the side of the box give way. Or you could throw a virtual ball against a virtual wall and feel the impact when you catch the rebound.
"In the same way that a video monitor displays visual or graphic information to your eyes, the haptic 'display' lets you feel the physical information with your finger," said Dr. J. Kenneth Salisbury, a principal research scientist in the Department of Mechanical Engineering and head of haptics research at the AI Lab. "It's very unlike video in that you can modify a scene by touching it." The original PHANToM device was developed a few years ago by Thomas Massie, then an undergraduate, and Dr. Salisbury; inspiration for the device grew from collaboration between Dr. Salisbury and Dr. Mandayam Srinivasan of the Research Lab for Electronics. Since then, MIT's haptics researchers have continued to create enhancements, such as the ability to communicate a virtual object's temperature, texture and elasticity.
The PHANToM haptic interface is a small, desktop device that looks a bit like a desk lamp. Instead of a bulb on the end of the arm, it has a stylus grip or thimble for the user's fingertip. When connected to a computer, the device works sort of like a tactile mouse, except in 3-D. Three small motors give force feedback to the user by exerting pressure on the grip or thimble.
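The single-point, 3-D feedback described here can be sketched with the same spring idea extended to three dimensions: if the fingertip point sinks into a virtual object, push it back out along the surface normal in proportion to the penetration. A minimal sketch for a virtual sphere (the stiffness and geometry are illustrative, not PHANToM specifics):

```python
import math

def sphere_force(tip, center=(0.0, 0.0, 0.0), radius=0.05, k=400.0):
    """Force on a single-point haptic tip touching a virtual sphere.
    If the tip is inside the sphere, push it out along the surface
    normal, proportional to penetration depth (spring model)."""
    d = [t - c for t, c in zip(tip, center)]
    dist = math.sqrt(sum(x * x for x in d))
    penetration = radius - dist
    if penetration <= 0 or dist == 0:
        return (0.0, 0.0, 0.0)       # outside the sphere (or at dead centre)
    normal = [x / dist for x in d]   # unit vector pointing out of the sphere
    return tuple(k * penetration * n for n in normal)

# The tip is 1 cm deep inside a 5 cm sphere, along the +x axis:
f = sphere_force((0.04, 0.0, 0.0))
print(tuple(round(x, 6) for x in f))  # (4.0, 0.0, 0.0)
```

The three motors then apply the x, y and z components of this force to the thimble, so the user's finger feels a curved, springy surface.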
Although haptic devices predated PHANToM, those models primarily had been used to control remote robots that handled hazardous materials, said Dr. Salisbury. They cost up to $200,000 each and required a team of experts to develop the interface and software to adapt them for an application. "The beauty of the PHANToM is people can use it within minutes of getting it," he said. "They can plug it in and get started. I like to compare it to the PC revolution. Once computers were enormous and prohibitively expensive. Now PCs are everywhere."
The PHANToM interface's novelty lies in its small size, relatively low cost (about $20,000) and its simplification of tactile information. Rather than displaying information from many different points, this haptic device provides high-fidelity feedback to simulate touching at a single point. "Imagine closing your eyes, holding a pen and touching everything in your office. You could actually tell a lot about those objects from that single point of contact. You'd recognize your computer keyboard, the monitor, the telephone, desktop and so on," said Dr. Salisbury. The PHANToM haptic interface is currently being manufactured by SensAble Technology, Inc. in Cambridge. It has been sold in more than 17 countries to organizations such as Hewlett-Packard, GE, Toyota, Volkswagen, LEGO, Western Mining, Pennsylvania State University Medical School, and Brigham and Women's Hospital, which are using it for applications ranging from medical training to industrial design.