
Perceptual Computing: Do actions speak louder than words?

by Daniel Gumble

Apps / February 6th 2013 at 11:30AM

This week's Intel Developer blog looks at the ways in which perceptual computing could change our daily lives.

Following on from the recent ‘Perceptual computing: hands-down the best way to interact’ blog, I’ve been researching this topic to find out what is likely to change in computing and how it will impact us. I don’t expect to see such a radical change that robots will be opening my Xmas gifts in 11 months’ time, but the ‘sown seeds’ are starting to sprout and show us the future of technology today.

Intel senior vice president Mooly Eden recently gave us a glimpse of what comes next when he commented that “voice recognition will do to touch what touch has done to the physical keyboard.” He went on to explain the theory: “Why do people use touch instead of keyboards? Many people say it's because it's intuitive. What I'm saying is that voice will do to touch what touch did to keyboards.” Eden makes an interesting point: we do not touch each other in a business environment; in most cases it is highly frowned upon. If touch is no longer intuitive because we have learnt to stop ourselves from reaching out, then what comes next?

His viewpoint is both visionary and slightly daunting. How we interact with our devices is always evolving, and if touch is soon to be replaced, with voice becoming the new touch, then I can’t help but wonder: what will become the new voice in five years? For those reading this and thinking you have a great idea for what comes next, don’t forget to check out the Intel Perceptual Computing Challenge. You could be rewarded with cash prizes for your ideas.

For those who want a bit of inspiration, let me tell you about a few pieces of new tech that are expanding beyond just touch and that could make a difference to us all:

1. A smartphone that tells you if you've got bad breath or are over the alcohol limit for driving. Our smartphones might be able to digitise the senses of smell and taste to help us with healthcare or personal hygiene applications.

2. A car that responds to eye-tracking technology. If you want to adjust the heat in the car, change the radio volume or make a call, you can either wave your hand or simply glance at the control you want to change.

3. The line between technology and fashion seems to be blurring. I am talking about having displays on your fingernails. Yes, I did say ‘on’. With a coat of organic light-emitting materials, you could soon have a mini screen at your fingertips.

The possibilities are endless, but the important question is what can you do with all of these innovations when developing apps or software?

I am sure many of you already have several app ideas on the tip of your tongue. If so, then make sure you enter the Intel Perceptual Computing Challenge. Intel has set this challenge up to encourage developers to add perceptual computing elements to their apps, such as gesture control, voice recognition and facial tracking. The competition closes on 20th February and offers a total of $1 million in cash and prizes. To get started, download the SDK and start exploring!
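To give you a feel for how little code it takes, here is a minimal C++ sketch of gesture recognition built on the SDK's UtilPipeline helper class. The class, method and gesture-label names below are based on the samples shipped with the 2013 beta SDK and may differ in later releases, so treat this as an illustration rather than a recipe, and compare it against the samples bundled with your download.

    // Minimal gesture-recognition sketch for the Intel Perceptual Computing SDK
    // (2013 beta). Names are taken from the SDK's C++ samples and may differ
    // between releases -- consult the samples shipped with your version.
    #include <cstdio>
    #include "util_pipeline.h"  // UtilPipeline helper bundled with the SDK

    // UtilPipeline wraps the session and camera setup; overriding OnGesture()
    // lets us react to recognised gestures as each frame is processed.
    class GesturePipeline : public UtilPipeline {
    public:
        GesturePipeline() : UtilPipeline() {
            EnableGesture();  // switch on the gesture/hand-tracking module
        }

        // Called by the pipeline whenever a gesture event fires.
        virtual void PXCAPI OnGesture(PXCGesture::Gesture *data) {
            if (!data->active) return;
            switch (data->label) {
            case PXCGesture::Gesture::LABEL_POSE_THUMB_UP:
                std::printf("Thumbs up!\n");
                break;
            case PXCGesture::Gesture::LABEL_NAV_SWIPE_LEFT:
                std::printf("Swipe left - e.g. go to the previous page\n");
                break;
            default:
                break;
            }
        }
    };

    int main() {
        GesturePipeline pipeline;
        // Runs the capture-and-recognise loop until the preview window closes.
        pipeline.LoopFrames();
        return 0;
    }

The same pipeline object exposes similar Enable calls for the voice-recognition and face-tracking modules (check the SDK headers for the exact names), so combining several perceptual elements in one app, as the challenge encourages, follows the same pattern.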

This blog post is written by Softtalkblog, and is sponsored by the Intel Developer Zone, which helps you to develop, market and sell software and apps for prominent platforms and emerging technologies powered by Intel Architecture.
