IBM has issued its seventh annual look at what Big Blue researchers think will be the five biggest technologies for the next five years. In past prediction packages, known as "IBM 5 in 5," the company has had some success in predicting the future of password protection, telemedicine and nanotechnology.

The IBM 5 in 5 research is based on collective trends as well as emerging technologies from IBM's R&D labs around the world. This year's research points to the development of what IBM calls a new generation of machines that will learn, adapt, sense and begin to experience the world as humans do through hearing, sight, smell, touch and taste.

"Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges," writes Bernie Meyerson, IBM fellow and VP of innovation.

Your computer will reach out and touch somebody: According to IBM, in five years industries such as retail will be transformed by the ability to "touch" a product through your mobile device. IBM says its scientists are developing applications for retail, healthcare and other sectors that use haptic, infrared and pressure-sensitive technologies to simulate touch - for example, the texture and weave of a fabric as a shopper brushes a finger over the image of the item on a device screen.

Utilising the vibration capabilities of the phone, the technology will give every object a unique set of vibration patterns that represents the touch experience: short, fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material, IBM says.
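
IBM hasn't published how those patterns would be encoded, but the idea maps naturally onto hardware phones already have. A minimal sketch, assuming the standard Web Vibration API (navigator.vibrate) and an entirely hypothetical fabric-to-pattern table, might look like this:

```typescript
// Minimal sketch: mapping fabric textures to vibration patterns with the
// standard Web Vibration API. The fabric-to-pattern table is hypothetical;
// it only illustrates the "short fast" versus "long strong" encoding above.
type Fabric = "silk" | "linen" | "cotton";

// Alternating vibrate/pause durations in milliseconds.
const touchPatterns: Record<Fabric, number[]> = {
  silk: [15, 10, 15, 10, 15],   // short, fast pulses for a smooth weave
  linen: [60, 30, 60, 30, 80],  // longer, stronger bursts for a coarse texture
  cotton: [30, 20, 30, 20, 40], // somewhere in between
};

function simulateTouch(fabric: Fabric): void {
  // navigator.vibrate is only available in browsers that support the Vibration API.
  if (typeof navigator !== "undefined" && "vibrate" in navigator) {
    navigator.vibrate(touchPatterns[fabric]);
  }
}

// e.g. fired as the shopper's finger moves over the product image
simulateTouch("silk");
```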

Can you see me now? Within the next five years, IBM researchers think computers will not only be able to look at images but also help us understand the 500 billion photos we're taking every year (roughly 70 photos for each person on the planet). In the future, "brain-like" capabilities will let computers analyse features such as colour, texture patterns or edge information and extract insights from visual media.

One of the challenges of getting computers to "see" is that traditional programming can't replicate something as complex as sight. But with a cognitive approach - showing a computer thousands of examples of a particular scene - the machine can start to detect the patterns that matter, whether in a scanned photograph uploaded to the web or in video footage taken with a camera phone, IBM says.
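
IBM hasn't detailed its algorithms, but the example-driven idea can be sketched in miniature: reduce each labelled image to a handful of features (the feature names and numbers below are invented) and label a new image by its nearest match. Real systems would learn from thousands of examples and far richer features.

```typescript
// Illustrative sketch of learning from labelled examples: each training image
// is reduced to a feature vector, and a new image is labelled by its nearest
// neighbour among the examples.
interface LabelledImage {
  features: number[]; // e.g. [average colour, texture energy, edge density]
  label: string;
}

function distance(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

function classify(examples: LabelledImage[], features: number[]): string {
  let best = examples[0];
  for (const example of examples) {
    if (distance(example.features, features) < distance(best.features, features)) {
      best = example;
    }
  }
  return best.label;
}

// In practice the training set would hold thousands of scenes, not two.
const examples: LabelledImage[] = [
  { features: [0.8, 0.2, 0.1], label: "beach" },
  { features: [0.3, 0.6, 0.9], label: "city street" },
];

console.log(classify(examples, [0.75, 0.25, 0.15])); // -> "beach"
```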

Within five years, these capabilities will be put to work in healthcare, making sense of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture insights tailored to particular anatomy or pathologies.

What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images - such as differentiating healthy from diseased tissue - and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy, IBM says.

Stop, look, listen: IBM thinks that by 2017 or so a distributed system of what it calls "clever sensors" will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. The system will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn of danger, IBM says.
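
As a purely illustrative sketch - the sensor fields, baselines and thresholds below are invented, not anything IBM has described - such a warning could come from fusing a sensor's acoustic readings with ground-vibration data and comparing both against learned baselines:

```typescript
// Hypothetical forest-sensor reading: acoustic features plus a tactile
// (vibration) modality, scored against baselines "learned" from quiet periods.
interface SensorReading {
  soundPressureDb: number; // broadband sound pressure level
  lowFreqEnergy: number;   // energy in the lowest frequency band (ground movement)
  groundVibration: number; // accelerometer reading, the tactile modality
}

const learnedBaseline = { soundPressureDb: 45, lowFreqEnergy: 0.1, groundVibration: 0.05 };

function interpret(reading: SensorReading): string {
  const loud = reading.soundPressureDb > learnedBaseline.soundPressureDb + 20;
  const rumbling = reading.lowFreqEnergy > learnedBaseline.lowFreqEnergy * 5;
  const shaking = reading.groundVibration > learnedBaseline.groundVibration * 10;

  if (rumbling && shaking) return "warning: possible landslide";
  if (loud && shaking) return "warning: tree fall detected";
  return "normal";
}

console.log(interpret({ soundPressureDb: 70, lowFreqEnergy: 0.9, groundVibration: 0.8 }));
// -> "warning: possible landslide"
```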

Raw sounds will be detected by sensors, and - much as the human brain does - a system receiving this data will take into account other "modalities," such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and its ability to recognise patterns, IBM says. By learning about emotion and being able to sense mood, systems will pinpoint aspects of a conversation and analyse pitch, tone and hesitancy, helping us hold more productive dialogues that could improve customer call center interactions or allow us to interact seamlessly with different cultures.
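
On the conversational side, a similarly simplified and hypothetical sketch - the features, weights and threshold are invented, not an IBM model - might fold pitch variability, tone and hesitancy into a single mood score that a call center system could act on:

```typescript
// Hypothetical mood estimate for a call: combine pitch variability, tone and
// hesitancy (pause ratio) into one score. Weights and threshold are arbitrary
// placeholders for illustration only.
interface SpeechFeatures {
  pitchVariance: number; // 0 (flat, monotone) to 1 (highly varied)
  tone: number;          // 0 (tense) to 1 (relaxed), e.g. from a prior classifier
  hesitancy: number;     // fraction of the call spent in pauses, 0 to 1
}

function moodScore(f: SpeechFeatures): number {
  // Higher means calmer and more engaged.
  return 0.3 * f.pitchVariance + 0.5 * f.tone - 0.4 * f.hesitancy;
}

function suggestAction(f: SpeechFeatures): string {
  return moodScore(f) < 0.2
    ? "flag for supervisor: caller may be frustrated"
    : "continue: conversation sounds productive";
}

console.log(suggestAction({ pitchVariance: 0.2, tone: 0.3, hesitancy: 0.6 }));
// -> "flag for supervisor: caller may be frustrated"
```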

IBM goes so far as to say that "baby talk" will be understood as a language - telling parents or doctors what infants are trying to communicate.