This week’s Intel Developer Blog explores the world of perceptual computing and instinctive forms of interaction.
A friend recently told me that her two-year-old daughter is now so used to touch screen devices that when she picked up a book, she tried to touch the page to zoom in. It is both scary and intriguing. Children try to understand and learn about the world through touch – now it is our turn to get experimental.
Computing is moving in a new direction, where natural and instinctive forms of interaction with devices become the norm. The idea of ‘perceptual computing’ has emerged. I am sure you have heard this phrase thrown around, especially after CES, but to put it simply: perceptual computing is about enabling devices to understand us – through physical contact, verbal commands and gestures.
A demo that caught our attention at CES came from Intel. It used the popular game Where’s Wally to showcase new eye-tracking technology. The idea is simple, and gives the computer mouse a break: we move our eyes when we interact with a device, so why not harness that movement? 2013 could reveal some interesting advances in perceptual computing.
Intel conducted research* that showed users find touch-based computing faster, easier, and more intuitive, as well as more fun. Perceptual computing adds an extra dimension to our daily tasks. The touch screen helps nurture discoveries for all ages.
Before we know it, our computers’ ‘brains’ will be capable of interpreting our every want and need. Gone will be the days of needy devices constantly seeking input and reassurance, asking ‘is this junk?’ or ‘should I download this update?’. Our devices will gain emotional intelligence. With this, a real opportunity arises for developers. Now is the time to show off your coding skills and get creative. As perceptual computing is relatively new, there are no limits. Intel has created a set of ‘top ten resources for developers to get a head start with perceptual computing’.
One way to get ahead in this ever-changing interactive world is through the Perceptual Computing Challenge. This contest has been set up to encourage developers to add perceptual computing elements to their apps, from gesture control and voice recognition to facial tracking. The competition is now open, with a total of $1 million in prizes in store for the winners. To find out more, you can download the SDK** and get exploring.
Do you have more than one great idea? Then there is some good news: there is no limit to the number of entries you can submit!
*Research conducted by the Intel User Experience Group in the USA, China, Italy and Brazil. The testing saw 81 individuals use touch and non-touch Ultrabook devices, with qualitative observations made throughout the process. Over 4,000 device interactions were logged during the research.
** The SDK 2013 Beta is free and provides fast, easy programming access to perceptual computing functionality.
• This blog post is written by Softtalkblog, and is sponsored by the Intel Developer Zone, which helps you to develop, market and sell software and apps for prominent platforms and emerging technologies powered by Intel Architecture.