Computers Will Eventually Taste, Touch, Smell And More

December 20, 2012

Someday in the not-too-distant future, you’ll be able to order a wedding dress on your tablet and feel the fabric and the veil just by touching the screen.

When you feel an object, your brain interprets the pattern of vibrations on your skin as smooth, rough, sharp and so on. Computer sensors are becoming sophisticated enough to do that too.

Within the next five years, the vibration motors in smartphones will be precise enough to mimic the vibrations your fingers experience when they touch a particular surface. Even though you’ll just be touching glass, it will feel like you’re touching whatever object is displayed on the screen.
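To get a rough feel for how that mapping could work, here is a toy Python sketch, not IBM's actual method, that turns a surface-height profile into a vibration pattern. Every function name and threshold is an invented placeholder:

```python
# Toy sketch: convert a 1-D surface-texture profile into a vibration
# pattern a phone's haptic motor could play back. Rougher regions
# (abrupt height changes) map to stronger pulses; all values are
# illustrative assumptions, not a real haptics API.

def texture_to_vibration(heights, dt_ms=10, max_amplitude=255):
    """Map a surface-height profile to (duration_ms, amplitude) pulses.

    Amplitude is proportional to local slope: abrupt height changes
    feel "rough," flat stretches feel "smooth."
    """
    pattern = []
    for prev, cur in zip(heights, heights[1:]):
        slope = abs(cur - prev)
        amplitude = min(int(slope * max_amplitude), max_amplitude)
        pattern.append((dt_ms, amplitude))
    return pattern

# A smooth surface (gentle ramp) vs. a rough one (jagged steps).
smooth = [0.0, 0.01, 0.02, 0.03, 0.04, 0.05]
rough = [0.0, 0.9, 0.1, 1.0, 0.05, 0.95]

print(texture_to_vibration(smooth))  # low amplitudes -> faint buzz
print(texture_to_vibration(rough))   # high amplitudes -> coarse rumble
```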

“We’re not talking about fuzzy screens,” said Bernie Meyerson, IBM’s vice president of innovation. “You’re not going to have to dry clean your Samsung.”

In some ways, computers are already simulating touch — albeit in a crude form. When you’re driving a car in a video game, the controller vibrates when the car starts to veer off the road. It may not feel precisely like a steering wheel’s vibrations when you hit gravel, but within five years, that technology is expected to become even more lifelike.

IBM’s researchers are working on just that — creating applications for the retail and healthcare sectors that use haptic, infrared or pressure-sensitive technologies to simulate touch.

Today’s computers are very good at capturing and displaying images, but despite advances in image recognition software, computers are still pretty lousy at understanding what they’re “looking” at. Humans are still needed to tag friends, label photos and identify diseases.

In five years, all that will change, IBM says. Computers will be able to interpret images better than we can, analyzing colors and texture patterns and gaining insights from other visual media. They will even surpass doctors’ abilities to read medical imagery, including MRIs, CT scans, X-rays and ultrasounds.

Computers of the not-too-distant future will be able to see subtleties in images that can be invisible to the human eye. For instance, computers will be able to quickly differentiate healthy from diseased tissue on an MRI and cross-reference the image with a patient’s medical history and scientific literature to make a diagnosis.
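As a simplified illustration of that kind of pattern matching, here is a toy Python sketch that labels an image patch by comparing its texture statistics to reference values. Real diagnostic systems use far richer features and rigorous validation; the numbers below are made up:

```python
# Toy sketch: classify an image patch as "healthy" or "diseased" by
# comparing simple texture statistics (brightness, contrast) against
# hypothetical reference values. Not a real medical algorithm.
from statistics import mean, pstdev

def features(patch):
    """Reduce a patch (list of pixel intensities) to (mean, contrast)."""
    return (mean(patch), pstdev(patch))

def classify(patch, references):
    """Nearest-centroid: pick the label whose reference features are
    closest (squared Euclidean distance) to this patch's features."""
    m, s = features(patch)
    return min(
        references,
        key=lambda label: (m - references[label][0]) ** 2
                          + (s - references[label][1]) ** 2,
    )

# Hypothetical reference statistics "learned" from labeled scans.
references = {"healthy": (100.0, 5.0), "diseased": (140.0, 30.0)}

patch = [135, 150, 120, 160, 145, 130]  # bright, high-contrast region
print(classify(patch, references))      # -> "diseased"
```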

Imagine holding a smartphone up to your infant when she’s making a sound, and the app displaying a message: “I’m hungry.” That’s not as far off as you might think.

In five years, computers will be able to detect elements of sounds that humans can hear but aren’t able to understand. As every parent knows, the difference between normal babbling and a message that something is wrong can be extremely subtle. Computers of the near future will not only be able to detect whether a baby is upset, they’ll also be able to determine if the child is hungry, tired, hot or in pain.
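A real system would learn such distinctions from thousands of labeled recordings, but a toy Python sketch conveys the idea: reduce a cry to a couple of coarse audio features and map them to a label. The features, thresholds and labels here are all invented for illustration:

```python
# Illustrative sketch only: classify a cry from two coarse audio
# features, average pitch (Hz) and loudness (dB). Thresholds are
# invented placeholders, not clinical values.

def classify_cry(pitch_hz, loudness_db):
    if pitch_hz > 500 and loudness_db > 80:
        return "in pain"        # high, piercing cries
    if pitch_hz > 400:
        return "hungry"         # rhythmic, rising pitch
    if loudness_db < 60:
        return "tired"          # low-energy whimpering
    return "uncomfortable"      # everything else (hot, wet, etc.)

print(classify_cry(pitch_hz=550, loudness_db=85))  # -> "in pain"
print(classify_cry(pitch_hz=450, loudness_db=70))  # -> "hungry"
```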

By interpreting different sound pressures, vibrations and waves, computers will be able to predict when trees are about to fall, when landslides are imminent, or when cars are about to collide before humans can.
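One simple way a machine could spot such a warning sign, sketched here in toy Python with an arbitrary window size and threshold, is to watch a stream of sound-pressure readings for a sustained upward trend, the sort of pre-failure "groan" a stressed tree or slope might emit:

```python
# Hedged sketch: flag an imminent event from a stream of sound-pressure
# readings by detecting a sustained upward trend. Window and threshold
# are arbitrary assumptions for illustration.

def rising_trend(readings, window=5, threshold=0.2):
    """Return True if each of the last `window` readings rose by more
    than `threshold` over the previous one."""
    tail = readings[-window:]
    return len(tail) == window and all(
        b - a > threshold for a, b in zip(tail, tail[1:])
    )

calm = [1.0, 1.1, 1.0, 1.05, 1.1, 1.0]
strain = [1.0, 1.3, 1.6, 1.9, 2.3, 2.8]
print(rising_trend(calm))    # -> False: readings fluctuate normally
print(rising_trend(strain))  # -> True: pressure climbing steadily
```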

Computers are already starting to do this: In Galway Bay, Ireland, IBM researchers are capturing underwater noise levels to understand the impact that different sounds have on sea life.

Within the next five years, a computer will help you make the perfect recipe — not too sweet, not too salty, not too crunchy, but just the way you like it.

By breaking down foods to the molecular level, computers will be able to use complex algorithms to determine what flavor combinations are the most appealing. They could then develop recipes that provide the ideal flavor and texture of food. Think of it as the Watson of Top Chef.
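One well-known heuristic along these lines is food pairing: ingredients that share aroma compounds tend to taste good together. Here is a toy Python version; the compound lists below are simplified placeholders, not real chemistry data:

```python
# Toy food-pairing sketch: score ingredient pairs by the number of
# aroma compounds they share. Compound lists are made-up placeholders.

compounds = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "chocolate":  {"furaneol", "hexanal", "vanillin"},
    "basil":      {"linalool", "eugenol", "estragole"},
    "beef":       {"pyrazine", "hexanal", "furan"},
}

def pairing_score(a, b):
    """Number of aroma compounds two ingredients share."""
    return len(compounds[a] & compounds[b])

def best_partner(ingredient):
    others = [k for k in compounds if k != ingredient]
    return max(others, key=lambda o: pairing_score(ingredient, o))

print(best_partner("strawberry"))  # -> "chocolate" (two shared compounds)
```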

The technology could be used to help people eat better, IBM says. By making healthy foods taste better, people might crave vegetable dishes instead of sugary and fatty junk foods.

Though computers aren’t quite there yet, they are “tasting” things today. Specially designed microchips are being used in chemical and power plants to sense biohazards in the air. IBM researchers are working to adapt that technology to analyze the chemical structures in food.

Do you think you’re coming down with a cold? In five years, you’ll be able to breathe into your smartphone to find out.

IBM researchers are developing technology to analyze the odors in a person’s breath for signs of ailments, including liver and kidney disorders, asthma, diabetes and epilepsy. By determining which odors and molecules in the breath are associated with each disease, computers of the future will be able to instantly flag problems that today might be misdiagnosed or go undetected by a doctor.
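The screening logic could be as simple as comparing measured marker concentrations against reference ranges, as in this toy Python sketch. The marker-to-condition links (acetone and diabetes, ammonia and kidney disorders, nitric oxide and asthma) are real research directions, but every threshold below is an invented placeholder:

```python
# Sketch of breath screening: flag conditions whose marker molecule
# exceeds a reference level. Thresholds are illustrative, not medical.

THRESHOLDS_PPB = {            # parts per billion, placeholder values
    "acetone":      ("possible diabetes", 1000),
    "ammonia":      ("possible kidney disorder", 800),
    "nitric_oxide": ("possible asthma", 50),
}

def screen_breath(sample_ppb):
    """Return the conditions whose marker exceeds its threshold."""
    flags = []
    for molecule, (condition, limit) in THRESHOLDS_PPB.items():
        if sample_ppb.get(molecule, 0) > limit:
            flags.append(condition)
    return flags or ["no markers elevated"]

print(screen_breath({"acetone": 1500, "ammonia": 200, "nitric_oxide": 20}))
# -> ['possible diabetes']
```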

Computers will also be able to detect the harmful bacteria that cause staph infections in hospitals just by “smelling” the surroundings.

In a more rudimentary form, computers are smelling things now: Agricultural sensors smell soil to determine crop conditions, sensors in museums determine which gas levels are ideal to preserve paintings, and city sanitation departments use computers that can smell garbage and pollution to alert workers when conditions are getting dangerous.
