Intel demos perceptual computing software toolkit

Monday, February 25th 2013. | Software News

BARCELONA — Software engineers at Intel are exploring new ways people can use the human voice, gestures and head-and-eye movements to operate computers.

Intel’s Barry Solomon uses hand gestures in a demonstration of a perceptual computing toolkit being used by independent developers. (Photo by Matt Hamblen/Computerworld)

In coming years, their research is expected to help independent developers build computer games, help doctors control computers used in surgery, and help firefighters when they enter flaming buildings.

“We don’t really know what this work will become, but it’s going to be fascinating to watch it play out,” said Craig Hurst, Intel’s director of visual computing product management, in an interview at Mobile World Congress. “So far, what we’ve seen has gone beyond what we thought of originally.”

Intel’s visual computing unit, created two years ago, has grown to become a top priority for the chip maker, Hurst said. Last fall, the unit released several software toolkits that are used by independent developers to create a raft of new and sometimes unusual applications.

One of the toolkits, called the Perceptual Computing SDK (software development kit), was distributed to outside developers building applications that will be judged by Intel engineers. Intel is planning to award $1 million in prizes to developers in 2013 for the most original application prototypes, not only in gaming, but also in work productivity and other areas.

Barry Solomon, a member of the visual computing product group, demonstrated how the Intel software is being used by developers on Windows 7 and Windows 8 desktops and laptops. With a special depth-perception camera clipped to the top of his laptop lid and connected over USB to the computer, Solomon was able to show how the SDK software rendered his facial expressions and hand gestures on the computer screen, accompanied by an overlay of lines and dots to show the precise position of his eyes and fingers. A full mesh model can then be rendered.
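The "lines and dots" overlay Solomon showed is a common way to visualize tracking data: the SDK reports landmark coordinates for eyes and fingertips, and the application draws markers at those positions on the video frame. A minimal sketch of that idea, using NumPy and a made-up landmark list (this is illustrative only, not the Intel SDK's actual API):

```python
import numpy as np

def overlay_landmarks(frame, landmarks, radius=2):
    """Return a copy of `frame` with a white marker drawn at each landmark.

    `frame` is an H x W x 3 uint8 RGB image; `landmarks` is a list of
    (x, y) pixel coordinates, of the kind a tracking SDK might report
    for eye and fingertip positions.
    """
    out = frame.copy()
    h, w = out.shape[:2]
    for x, y in landmarks:
        # Clamp the marker square to the image bounds.
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        out[y0:y1, x0:x1] = 255  # white dot
    return out

frame = np.zeros((120, 160, 3), dtype=np.uint8)   # stand-in camera frame
marked = overlay_landmarks(frame, [(40, 30), (80, 60)])
```

A real application would feed in live landmark coordinates each frame instead of the fixed list above; a full mesh model is just this idea extended to many connected points.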

With that tracking information easily available, a developer can quickly insert a person’s face and hands into an augmented reality scenario. Or the person can be cut out of the frame and composited onto a new background, the same effect the green screens commonly used in weather and news broadcasts provide. The person’s gestures could then be mapped by a developer to functions in a game or productivity application.
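Because the camera measures depth, the green-screen effect described above does not need an actual green screen: pixels within a chosen distance band are treated as foreground and everything else is replaced with a new background. A minimal sketch of that depth-keying idea with NumPy (the threshold values and array shapes are assumptions for illustration, not SDK parameters):

```python
import numpy as np

def composite_by_depth(color, depth, background, near_mm=300, far_mm=900):
    """Composite foreground pixels onto `background` using a depth band.

    `color` and `background` are H x W x 3 uint8 images; `depth` is an
    H x W array of per-pixel distances in millimeters. Pixels whose
    depth falls within [near_mm, far_mm] (e.g., a person seated at a
    laptop) are kept; everything else shows the background.
    """
    mask = (depth >= near_mm) & (depth <= far_mm)  # True = foreground
    out = background.copy()
    out[mask] = color[mask]
    return out

color = np.full((2, 2, 3), 200, dtype=np.uint8)          # camera frame
depth = np.array([[500, 1500], [400, 2000]], np.uint16)  # mm per pixel
background = np.zeros((2, 2, 3), dtype=np.uint8)         # weather map, say
result = composite_by_depth(color, depth, background)
```

In the toy arrays above, the two near pixels (500 mm and 400 mm) survive into the result while the two distant ones are replaced by the background.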

A company called Touchcast is building a green-screen application that will be available later in 2013. The prototype camera, the Creative Interactive Gesture Camera, which Intel uses in its perceptual computing demonstrations with the SDK, will also go on sale later this year.

