Tech analyses 12 speech features, such as pitch and volume, to determine whether you're feeling happy or about to transform into a giant green gamma-infected beast.
Engineers at the University of Rochester are working on a computer programme that can read human feelings through speech. It analyses 12 features of the voice, such as volume and pitch, to determine whether someone is angry, happy, sad or neutral.
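The researchers haven't published which 12 features they use or how their classifier works, but the general idea of mapping acoustic features to emotion labels can be illustrated. The sketch below is purely hypothetical: it extracts just two stand-in features (a crude zero-crossing pitch estimate and RMS volume) and assigns the nearest of four invented emotion "centroids" — every number in it is made up for illustration.

```python
import math

def rms_energy(samples):
    # volume proxy: root-mean-square amplitude of the frame
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_pitch(samples, sample_rate):
    # crude pitch proxy: a pure tone of frequency f crosses
    # zero roughly 2*f times per second
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    return crossings * sample_rate / (2 * len(samples))

def classify(features, centroids):
    # nearest-centroid classification over (pitch, volume) features
    def dist(entry):
        label, centre = entry
        return sum((f - c) ** 2 for f, c in zip(features, centre))
    return min(centroids, key=dist)[0]

# synthetic one-second frame: a loud 300 Hz tone at 8 kHz sampling
sample_rate = 8000
tone = [0.8 * math.sin(2 * math.pi * 300 * t / sample_rate)
        for t in range(sample_rate)]
features = (zero_crossing_pitch(tone, sample_rate), rms_energy(tone))

# invented reference centroids: (pitch in Hz, RMS volume)
centroids = [
    ("happy",   (300.0, 0.55)),  # high pitch, high energy
    ("sad",     (120.0, 0.15)),  # low pitch, low energy
    ("angry",   (220.0, 0.70)),
    ("neutral", (160.0, 0.30)),
]
print(classify(features, centroids))  # the loud, high-pitched frame lands nearest "happy"
```

A real system would use many more features (jitter, speaking rate, spectral shape) and a trained statistical model rather than hand-picked centroids, but the pipeline — extract acoustic features, then compare against learned emotion profiles — is the same shape.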
Actors took part in the experiment and simply recited the date, with Wendi Heinzelman, professor of electrical and computer engineering, saying: "It really doesn't matter what they say, it's how they're saying it that we're interested in."
Accuracy currently sits at around 81 per cent, and the research has already been used to develop an app, which displays a happy or sad face after hearing the user's voice.
Heinzelman continued: "The research is still in its early days, but it is easy to envision a more complex app that could use this technology for everything from adjusting the colors displayed on your mobile to playing music fitting to how you're feeling after recording your voice."