Physical Analytics Research Manager Hendrik Hamann examines an array of
wireless sensors used to detect environmental conditions such as
temperature, humidity, gases and chemicals at IBM Research headquarters
in Yorktown Heights, NY, Monday, December 17, 2012. In five years,
technology advancements could enable sensors to analyze
odors or the molecules in a person’s breath to help diagnose diseases.
This innovation is part of IBM’s 5 in 5, a set of IBM annual predictions
that have the potential to change the way people work, live and
interact during the next five years. Credit: Jon Simon/Feature Photo Service for IBM.

The IBM 5 in 5 is based on market and societal trends
as well as emerging technologies from IBM's R&D labs around the
world that can make these transformations possible. This year's IBM 5 in
5 explores innovations that will be the underpinnings of the next era
of computing, which IBM describes as the era of cognitive systems. This
new generation of machines will learn, adapt, sense and begin to
experience the world as it really is. This year's predictions focus on
one element of the new era, the ability of computers to mimic the human
senses—in their own way, to see, smell, touch, taste and hear. These
sensing capabilities will help us become more aware and productive, and help us think, but not think for us. Cognitive computing systems will help
us see through complexity, keep up with the speed of information, make
more informed decisions, improve our health and standard of living,
enrich our lives and break down all kinds of barriers—including
geographic distance, language, cost and inaccessibility. "IBM scientists
around the world are collaborating on advances that will help computers
make sense of the world around them," said Bernie Meyerson, IBM Fellow
and VP of Innovation. "Just as the human brain relies on interacting
with the world using multiple senses, by bringing combinations of these
breakthroughs together, cognitive systems will bring even greater value
and insights, helping us solve some of the most complicated challenges."
Here are five predictions that will define the future:

Touch: You will be able to touch through your phone

Imagine using your smartphone to shop for your wedding dress and being able to feel the satin or silk of the gown, or the lace on the veil, all from the surface of the screen. Or imagine feeling the beading and weave of a blanket made by a local artisan halfway around the world. In five years, industries such as retail will
be transformed by the ability to "touch" a product through your mobile
device. IBM scientists are developing applications for the retail,
healthcare and other sectors using haptic, infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric, as a shopper brushes a finger over the image of the item on a device screen. Using the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short, fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material.
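To make the idea concrete, here is a minimal sketch, in Python, of how such a texture-to-vibration mapping might look. The material profiles and the vibrate() callback are invented for illustration; they are not an actual IBM interface.

```python
# A minimal, hypothetical sketch of texture-to-vibration encoding.
# The material profiles and the vibrate() callback are illustrative
# inventions, not an actual IBM API.

import time
from dataclasses import dataclass

@dataclass
class Pulse:
    duration_ms: int   # how long the motor runs
    intensity: float   # 0.0 (off) to 1.0 (full strength)
    gap_ms: int        # pause before the next pulse

# Each material gets a signature pattern: silk is short and light,
# linen longer and stronger, matching the "short, fast patterns, or
# longer and stronger strings of vibrations" idea above.
TEXTURES = {
    "silk":  [Pulse(8, 0.2, 4)] * 6,
    "linen": [Pulse(25, 0.7, 10)] * 3,
    "lace":  [Pulse(12, 0.5, 20), Pulse(6, 0.3, 20)] * 2,
}

def play_texture(material: str, vibrate) -> None:
    """Replay a material's vibration signature through a phone's motor;
    vibrate(duration_ms, intensity) is assumed to be supplied by the
    device platform."""
    for pulse in TEXTURES[material]:
        vibrate(pulse.duration_ms, pulse.intensity)
        time.sleep(pulse.gap_ms / 1000)
```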
Current uses of haptic and graphic technology in the gaming industry take the end user into a simulated
environment. The opportunity and challenge here is to weave the technology so seamlessly into everyday experiences that it brings greater context to our lives. This technology will become ubiquitous, turning mobile phones into tools for natural and intuitive interaction with the world around us.

Sight: A pixel will be worth a thousand words

We take 500 billion photos a year[1]. 72 hours of video
is uploaded to YouTube every minute[2]. The global medical diagnostic
imaging market is expected to grow to $26.6 billion by 2016[3].
Computers today only understand pictures by the text we use to tag or
title them; the majority of the information—the actual content of the
image—is a mystery. In the next five years, systems will not only be
able to look at and recognize the contents of images and visual data,
they will turn pixels into meaning, beginning to make sense of them much as a human views and interprets a photograph. In the
future, "brain-like" capabilities will let computers analyze features
such as color, texture patterns or edge information and extract insights
from visual media. This will have a profound impact on industries such
as healthcare, retail and agriculture. Within five years, these
capabilities will be put to work in healthcare by making sense out of
massive volumes of medical information such as MRIs, CT scans, X-rays
and ultrasound images to capture information tailored to particular
anatomy or pathologies. What is critical in these images can be subtle
or invisible to the human eye and requires careful measurement. By being
trained to discriminate what to look for in images—such as
differentiating healthy from diseased tissue—and correlating that with
patient records and scientific literature, systems that can "see" will
help doctors detect medical problems with far greater speed and
accuracy.
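As a rough illustration of the pipeline described above, the following sketch extracts simple color, texture and edge features from an image patch and trains a classifier to separate healthy from diseased tissue. The feature set, the labeled data and the scikit-learn model are assumptions for the example, not the actual systems IBM is building.

```python
# A toy sketch of the "see" pipeline: extract simple color/edge
# features from image patches, then train a classifier to separate
# healthy from diseased tissue. Patches and expert labels are assumed
# to come from annotated scans; everything here is illustrative.

import numpy as np
from sklearn.linear_model import LogisticRegression

def features(patch: np.ndarray) -> np.ndarray:
    """patch: HxW grayscale array with values in [0, 1]."""
    hist, _ = np.histogram(patch, bins=16, range=(0.0, 1.0), density=True)
    gy, gx = np.gradient(patch)               # crude edge information
    edge_strength = np.hypot(gx, gy).mean()
    texture = patch.std()                     # crude texture measure
    return np.concatenate([hist, [edge_strength, texture]])

def train(patches, labels):
    """labels: 1 = diseased, 0 = healthy, assigned by experts."""
    X = np.stack([features(p) for p in patches])
    return LogisticRegression(max_iter=1000).fit(X, labels)

def flag_suspicious(clf, patch, threshold: float = 0.5) -> bool:
    """True if the patch should be brought to a doctor's attention."""
    return clf.predict_proba(features(patch)[None, :])[0, 1] > threshold
```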
Hearing: Computers will hear what matters

Ever wish you could make sense of the sounds all around you and be able to understand what's
not being said? Within five years, a distributed system of clever
sensors will detect elements of sound such as sound pressure, vibrations
and sound waves at different frequencies. It will interpret these
inputs to predict when trees will fall in a forest or when a landslide
is imminent. Such a system will "listen" to our surroundings and measure
movements, or the stress in a material, to warn us if danger lies
ahead. Raw sounds will be detected by sensors and interpreted much as the human brain interprets them: a system that receives this data will take into account other "modalities," such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are
detected, the system will form conclusions based on previous knowledge
and the ability to recognize patterns.
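A toy version of this kind of pattern recognition might look like the following sketch, which fingerprints each sound class by its average frequency spectrum and labels new clips by the nearest fingerprint. The class names and the nearest-centroid approach are illustrative assumptions, not a description of IBM's system.

```python
# An illustrative sketch of sound classification from frequency
# content: compute a coarse spectrum for each clip, then label new
# sounds by the nearest known spectral "fingerprint".

import numpy as np

def spectrum(signal: np.ndarray, n_bands: int = 32) -> np.ndarray:
    """Coarse magnitude spectrum of a 1-D audio signal."""
    mag = np.abs(np.fft.rfft(signal))
    bands = np.array_split(mag, n_bands)
    feat = np.array([b.mean() for b in bands])
    return feat / (feat.sum() + 1e-9)         # normalize loudness away

class SoundClassifier:
    def __init__(self):
        self.centroids = {}                    # label -> mean spectrum

    def learn(self, label: str, clips) -> None:
        self.centroids[label] = np.mean([spectrum(c) for c in clips], axis=0)

    def classify(self, clip) -> str:
        feat = spectrum(clip)
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(feat - self.centroids[lbl]))

# Trained on hypothetical "creaking_tree", "soil_shift" and
# "background" clips, such a classifier could flag the sounds that
# precede a falling tree or a landslide.
```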
For example, "baby talk" will be understood as a language, telling parents or doctors what infants are
trying to communicate. Sounds can be a trigger for interpreting a baby's
behavior or needs. By being taught what baby sounds mean – whether
fussing indicates a baby is hungry, hot, tired or in pain – a
sophisticated speech recognition system would correlate sounds and
babbles with other sensory or physiological information such as heart
rate, pulse and temperature.
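As a purely hypothetical illustration of that correlation step, a fusion rule might combine a classified cry with physiological readings. The labels and thresholds below are invented placeholders for what a trained model would learn from data.

```python
# Hypothetical fusion of a classified baby sound with physiological
# readings. Rules and thresholds are illustrative placeholders only.

def interpret_cry(sound_label: str, heart_rate: int, temp_c: float) -> str:
    if temp_c > 37.8:
        return "possible fever: check temperature again"
    if sound_label == "rhythmic_cry" and heart_rate > 160:
        return "likely pain or discomfort"
    if sound_label == "low_fussing":
        return "likely tired or hungry"
    return "no clear signal; keep monitoring"
```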
In the next five years, by learning about emotion and being able to sense mood, systems will pinpoint aspects of a
conversation and analyze pitch, tone and hesitancy to help us have more
productive dialogues that could improve customer call center
interactions, or allow us to seamlessly interact with different
cultures. Today, IBM scientists are beginning to capture underwater
noise levels in Galway Bay, Ireland, to understand the sounds and
vibrations of wave energy conversion machines, and the impact on sea
life, by using underwater sensors that capture sound waves and transmit
them to a receiving system to be analyzed.

Taste: Digital taste buds will help you to eat smarter

What if we could make healthy foods taste
delicious using a different kind of computing system that is built for
creativity? IBM researchers are developing a computing system that
actually experiences flavor, to be used by chefs to create the tastiest and most novel recipes. It will break down ingredients to their
molecular level and blend the chemistry of food compounds with the
psychology behind what flavors and smells humans prefer. By comparing
this with millions of recipes, the system will be able to create new
flavor combinations that pair, for example, roasted chestnuts with other
foods such as cooked beetroot, fresh caviar, and dry-cured ham. A
system like this can also be used to help us eat healthier, creating
novel flavor combinations that will make us crave a vegetable casserole
instead of potato chips. The computer will be able to use algorithms to
determine the precise chemical structure of food and why people like
certain tastes. These algorithms will examine how chemicals interact
with each other, the molecular complexity of flavor compounds and their
bonding structure, and use that information, together with models of
perception, to predict the taste appeal of flavors.
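One simple way to ground this: score ingredient pairs by the flavor compounds they share, which is the intuition behind pairings like roasted chestnut and cooked beetroot. In the sketch below, the compound sets are invented placeholders, not real flavor-chemistry data.

```python
# A sketch of compound-based pairing: score ingredient pairs by the
# flavor compounds they share. Compound lists are invented
# placeholders for real flavor-chemistry data.

COMPOUNDS = {
    "roasted_chestnut": {"furaneol", "pyrazine_a", "vanillin"},
    "cooked_beetroot":  {"geosmin", "furaneol", "pyrazine_a"},
    "dry_cured_ham":    {"pyrazine_a", "hexanal", "vanillin"},
    "potato_chips":     {"hexanal", "methional"},
}

def pairing_score(a: str, b: str) -> float:
    """Jaccard overlap of flavor compounds: higher = more compatible."""
    sa, sb = COMPOUNDS[a], COMPOUNDS[b]
    return len(sa & sb) / len(sa | sb)

def best_pairings(ingredient: str):
    """Rank the other ingredients by shared-compound score."""
    others = [k for k in COMPOUNDS if k != ingredient]
    return sorted(others, key=lambda o: pairing_score(ingredient, o),
                  reverse=True)

print(best_pairings("roasted_chestnut"))
```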
Not only will it make healthy foods more palatable; it will also surprise us with unusual
pairings of foods actually designed to maximize our experience of taste
and flavor. In the case of people with special dietary needs such as
individuals with diabetes, it would develop flavors and recipes to keep
their blood sugar regulated while still satisfying their sweet tooth.

Smell: Computers will have a sense of smell

During the next five years, tiny
sensors embedded in your computer or cell phone will detect if you're
coming down with a cold or other illness. By analyzing odors, biomarkers
and thousands of molecules in someone's breath, doctors will have help
diagnosing and monitoring the onset of ailments such as liver and kidney
disorders, asthma, diabetes and epilepsy by detecting which odors are
normal and which are not. Today, IBM scientists are already sensing
environmental conditions and gases to preserve works of art. This
innovation is beginning to be applied to tackle clinical hygiene, one of
the biggest challenges in healthcare today. For example,
antibiotic-resistant bacteria such as Methicillin-resistant
Staphylococcus aureus (MRSA), which in 2005 was associated with almost
19,000 hospital stay-related deaths in the United States, are commonly found on the skin and can be easily transmitted wherever people are in
close contact. One way of fighting MRSA exposure in healthcare
institutions is by ensuring medical staff follow clinical hygiene
guidelines. In the next five years, IBM technology will "smell" surfaces
for disinfectants to determine whether rooms have been sanitized. Using novel wireless "mesh" networks, sensors will gather and measure data on various chemicals, continuously learning and adapting to new smells over time.
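Here is a minimal sketch of that learning loop, assuming each sensor reports a vector of chemical concentrations: learn a baseline from rooms known to be clean, then flag readings that deviate from it. The chemical names and thresholds are illustrative, not real calibration data.

```python
# A minimal sketch of how a sensor network might learn what a
# "sanitized room" smells like and flag deviations. Chemical names
# and thresholds are illustrative.

import numpy as np

class SmellBaseline:
    def __init__(self):
        self.mean = None
        self.std = None

    def learn(self, readings: np.ndarray) -> None:
        """readings: (n_samples, n_chemicals) from rooms known clean."""
        self.mean = readings.mean(axis=0)
        self.std = readings.std(axis=0) + 1e-9

    def is_sanitized(self, sample: np.ndarray, z_max: float = 3.0) -> bool:
        """Pass only if no chemical deviates strongly from baseline."""
        z = np.abs((sample - self.mean) / self.std)
        return bool((z < z_max).all())

# e.g. columns = [disinfectant_vapor, ammonia, co2]; a low disinfectant
# reading or a high ammonia reading would fail the check.
```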
Due to advances in sensor and communication technologies, in combination with deep learning systems, sensors can
measure data in places never thought possible. For example, computer
systems can be used in agriculture to "smell" or analyze the soil
condition of crops. In urban environments, this technology will be used
to monitor issues with refuse, sanitation and pollution – helping city
agencies spot potential problems before they get out of hand.