Sensors Poised to Transform User Experience

Touch, voice, GPS and motion detection expand device capabilities.

The technology behind how people interact with their PCs and mobile devices is advancing steadily, driven by the computing power now available. Combined with miniaturized mechanical sensors and smarter software, the user experiences once imagined in science fiction are drawing closer.

Sensors detect and interpret common human gestures.

“Sensors can detect where you are, what you’re doing and what you should do next,” said Marisa Ahmad, Intel product marketing engineer. “The device can tell you that you’re not in the right building or room for a meeting. It might know that you have a certain shopping list to complete, so it could suggest that you stop at a certain store you happen to be walking past. It will be a contextual platform that learns your behavior and gives you suggestions based on preferences. All the sensors will initially be used for simple apps, but eventually what they’re capable of doing is giving contextual information about you and your environment.”

Future devices are poised to enable breakthroughs in perceptual computing that push the limits of human-computer interaction, harnessing inputs from an array of sensors to provide user feedback and enable richer, fuller experiences.

“Look at all the sensors that are present now: an accelerometer, a gyro, a magnetometer, a light sensor, an orientation sensor, an inclinometer and an e-compass,” said Arvind Kumar, a principal engineer and sensor architect at Intel. “Proximity sensors and ambient-temperature sensors will be available in the near future and will help with energy optimization, humidity detection and more.”
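
The article doesn't tie Kumar's list to any particular programming interface, but Android's SensorManager is one widely used API that exposes most of these sensor types. The Kotlin sketch below is purely illustrative: the SensorSampler class is a hypothetical name, and which sensors are actually present varies by device.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Minimal sketch: subscribe to several of the sensor types Kumar mentions.
// Usable from any Android component that has a Context (e.g., an Activity).
class SensorSampler(context: Context) : SensorEventListener {

    private val manager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        listOf(
            Sensor.TYPE_ACCELEROMETER,  // motion
            Sensor.TYPE_GYROSCOPE,      // rotation rate ("gyro")
            Sensor.TYPE_MAGNETIC_FIELD, // magnetometer / e-compass input
            Sensor.TYPE_LIGHT           // ambient light
        ).forEach { type ->
            // Not every device has every sensor; getDefaultSensor() returns
            // null when the hardware is absent, so registration is optional.
            manager.getDefaultSensor(type)?.let { sensor ->
                manager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_NORMAL)
            }
        }
    }

    fun stop() = manager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // event.values holds the reading, e.g. m/s^2 per axis for the
        // accelerometer or lux for the light sensor.
        println("${event.sensor.name}: ${event.values.joinToString()}")
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```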

Devices already exist that can interpret gestures, identify features, provide strong security, calculate proximity measurements and enable augmented-reality applications. The growing sensor universe can be broken down into six categories:

Touchscreens rely on sensors to pick up the location of a pointing device such as your finger.

  • System Management. Device “health” monitors improve power efficiency, responsiveness and usability by analyzing sensor data and providing triggers to other platform components.
  • Location Analysis. Inputs for sensor-based inertial navigation improve location accuracy for GPS, mapping, traffic avoidance and driving instructions.
  • Enhanced Security. Supply environmental data such as location to trusted services, monitor usage scenarios, check in for lost-or-stolen status or shut down to protect data.
  • Gesture Control. Detect and interpret common human gestures to make user interfaces more natural, as in the shake-detection sketch after this list.
  • Context Management. Determine whether the platform is being used indoors or outdoors, recognize and monitor the environment and detect human proximity.
  • Gaming. Sensor data from touch, gestures and audio commands can serve as inputs, and can also measure tilt, rotation, panning, shaking and other motions for flying, driving and steering.
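
To make the gesture-control category concrete, here is a minimal, illustrative shake detector built on the same Android accelerometer feed. The ShakeDetector name and the threshold value are assumptions chosen for clarity, not a standard; a production detector would also debounce so a single shake doesn't fire the callback repeatedly.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Illustrative shake detector: invokes the callback whenever the measured
// acceleration, after subtracting gravity, exceeds a chosen threshold.
class ShakeDetector(
    private val thresholdMs2: Float = 12f,  // illustrative tuning value
    private val onShake: () -> Unit
) : SensorEventListener {

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ACCELEROMETER) return
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        // Magnitude of the acceleration vector minus Earth's gravity (~9.81 m/s^2).
        val excess = sqrt(x * x + y * y + z * z) - SensorManager.GRAVITY_EARTH
        if (excess > thresholdMs2) onShake()
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

An app would register it like any other listener, for example sensorManager.registerListener(ShakeDetector { /* react to the shake */ }, accelerometer, SensorManager.SENSOR_DELAY_GAME).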

As more sensors and sensor technologies come online, and as more devices elegantly harness those inputs, the computing platform will evolve. Ahmad said, “My hope is that we have not imagined all the great things yet to come for these platforms. Once the technology is in the hands of truly creative developers, they might come up with revolutionary usage models we haven’t even thought of yet.”