
Reach out and touch someone -- over a smartphone.

By 2017, technology will have advanced far enough that computers will be able to see, smell, touch, taste and hear via digital taste buds, newly sensitive pixels and other innovations. You’ll even be able to touch something through your phone, announced Bernie Meyerson, IBM fellow and vice president of innovation.

“If you’re buying a textured fabric, you should be able to run your hand across the screen and get the feeling of touching that fabric,” Meyerson told FoxNews.com. “That’s not as impossible as it may seem.”

Every year, IBM announces the "5 in 5": Five technology innovations that seem ripped from the pages of a sci-fi novel, yet are arguably very realistic based on the state of the industry.

This year, the company looked to the next era of computing, which IBM calls the era of cognitive systems: a new generation of machines that will learn, adapt, sense and experience the world as it really is. And these computers will mimic the human senses in their own way.

“Is a PC just a glorified calculator? Or can it learn?” Meyerson asked.

To enable a sense of touch in a smartphone or other screen, he predicted embeddable piezo-electric transducers -- nearly invisible elements that could produce vibrations and a sensation in the finger.

“You could mimic the sensation of moving your finger across something that had a rough degree of coarseness,” he explained.
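
How might that work in practice? Here is a minimal sketch, assuming a hypothetical driver for the screen's embedded piezoelectric elements: it maps a fabric's coarseness and the finger's speed to the frequency and strength of a vibration pulse. The HapticDriver class and the numbers in it are illustrative placeholders, not any real API or IBM implementation.

```python
class HapticDriver:
    """Stand-in for a screen's embedded piezoelectric transducer driver (hypothetical)."""
    def pulse(self, frequency_hz: float, amplitude: float) -> None:
        # A real driver would excite the transducers; here we just report the command.
        print(f"vibrate at {frequency_hz:.0f} Hz, amplitude {amplitude:.2f}")

def texture_feedback(driver: HapticDriver, coarseness: float, finger_speed_mm_s: float) -> None:
    """Rougher fabric and a faster-moving finger produce stronger, higher-frequency pulses."""
    # coarseness: 0.0 (smooth) to 1.0 (very rough); the mapping below is invented for illustration.
    frequency = 50 + 250 * coarseness * (finger_speed_mm_s / 100.0)
    amplitude = min(1.0, 0.2 + 0.8 * coarseness)
    driver.pulse(frequency, amplitude)

# Example: a fairly coarse fabric swatch with the finger moving at 120 mm/s.
texture_feedback(HapticDriver(), coarseness=0.7, finger_speed_mm_s=120)
```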

These sensing capabilities will help computers see through complexity, keep up with the speed of information, make more informed decisions, and more.

Take baby talk. Meyerson said future computers would be able to interpret an infant’s cries, listening through the background noise to parse emotion, sentiment, desire and more, simply by studying changes in pitch and frequency.

“Machines are good at sensing that stuff,” he told FoxNews.com.
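
As a loose illustration of what "studying changes in pitch and frequency" could mean in software, the sketch below estimates the dominant pitch of short audio frames and flags a sharply rising cry. The single-FFT-peak feature and the thresholds are invented for the example; a real system would rely on far richer acoustic models.

```python
import numpy as np

def dominant_pitch(frame: np.ndarray, sample_rate: int) -> float:
    """Return the strongest frequency component (Hz) of one short audio frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def describe_cry(frames: list, sample_rate: int = 16000) -> str:
    """Very rough labeling based on how the pitch moves; thresholds are illustrative."""
    pitches = [dominant_pitch(f, sample_rate) for f in frames]
    if pitches[-1] - pitches[0] > 100:   # pitch climbing sharply across the clip
        return "escalating distress"
    if np.std(pitches) < 20:             # pitch nearly flat
        return "steady fussing"
    return "variable -- needs more context"

# Synthetic example: a cry whose pitch climbs from about 350 Hz to 500 Hz.
t = np.linspace(0, 0.05, 800, endpoint=False)   # 50 ms frames sampled at 16 kHz
frames = [np.sin(2 * np.pi * f * t) for f in (350, 400, 450, 500)]
print(describe_cry(frames))
```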

In a similar vein, a machine that observes what you eat -- and "tastes" it at the same time -- would be able to fine-tune that food’s taste in real time.

“You could actually alter the taste of food by saying, no, I want a little more sugar, a little more starch.”

And as food goes bad, it releases tiny amounts of ammonia. A computer could sense that far better than a human could, and give you real-time status updates on the food in your fridge.

“It’s a viable thing. Five years out, that’s perfectly reasonable. We’re doing bits of it today,” Meyerson said.
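
A toy sketch of the spoiled-food idea, assuming a hypothetical fridge gas sensor that reports ammonia in parts per million; the interface and the thresholds below are assumptions for illustration, not IBM's design.

```python
from dataclasses import dataclass

@dataclass
class GasReading:
    item: str
    ammonia_ppm: float   # ammonia level reported by the (hypothetical) sensor

def freshness_status(reading: GasReading, baseline_ppm: float = 0.5) -> str:
    """Translate an ammonia level into a human-readable freshness update."""
    if reading.ammonia_ppm < 2 * baseline_ppm:       # near the baseline: still fine
        return f"{reading.item}: fresh"
    if reading.ammonia_ppm < 10 * baseline_ppm:      # elevated: eat it soon
        return f"{reading.item}: use soon"
    return f"{reading.item}: likely spoiled"         # well above baseline

for reading in (GasReading("salmon", 0.6), GasReading("chicken", 1.8), GasReading("leftovers", 7.5)):
    print(freshness_status(reading))
```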