Tuesday, August 25, 2015

Pick up any girl and sell any merchandise. Machines that read emotions can do it too – Gadżetomania

How do you recognize and exploit human emotions? There are already machines that handle this better than many people. Not only will they tell you what your interlocutor is feeling, they will also suggest what to say to arouse their interest.

Sad Robot

Have you noticed how much pop culture's approach to robots has changed? Not so long ago they were cast primarily as either a mindless workforce or Machiavellian tormentors trying to impose their yoke on mankind.
And while the theme of a machine torn by emotions appears as early as “Blade Runner” or “AI: Artificial Intelligence”, lately we have been encountering it more and more often. Take just the last several months.

“Her”, about the spiritual relationship between a man and an operating system, and “Ex Machina”, with Ava aware of her own spirituality. Add to that “Automata” – panned by critics for aping Kubrick, though I appreciated it – with its melancholic robots searching for their paradise; “Chappie”, about an artificial intelligence being corrupted in a Zef climate; or the showy “Vice”, with an alluring android tormented by erased memories – all of this turns the old scheme on its head.

What's more, in “Her” it is man who plays the part of the threat, while the machines are, in their own way – not always identical to ours – compassionate, empathetic and more human than people. Best of all, watching a sentient machine on screen, we cannot shake the awareness that yes, we are watching “science”, but no longer quite “fiction”.



US+: don't talk so much about yourself, ask about feelings

Machines have learned – or rather have been taught – to recognize and interpret human emotions. I deliberately do not write about feeling here, because soulless algorithms are responsible for all of it, but – from the standpoint of effectiveness – a robot is able to detect and, in its own way, understand people's emotions. Best of all, it can manage this better than many of us.

A few years ago, Lauren McCarthy and Kyle McDonald created a tool called US+ that works with Google Hangouts. US+ monitors our speech and performs linguistic analysis based on tools with mysterious names – LIWC (Linguistic Inquiry and Word Count) and LSM (Linguistic Style Matching) – and also analyzes the behavior of our interlocutors.

All of this happens in real time, so during a call US+ can show us information about its course, present clear statistics and, above all, make suggestions – e.g. to change the subject.
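To make this less mysterious, here is a minimal sketch of how an LSM-style score can be computed. The per-category formula – one minus the normalized difference in function-word rates, averaged across categories – follows the published LSM metric; the tiny word lists are my own illustrative stand-ins, not the actual LIWC dictionaries or anything from the US+ source:

```python
# A minimal sketch of Linguistic Style Matching (LSM). Real LIWC uses
# proprietary dictionaries with dozens of categories; these tiny word
# lists are illustrative stand-ins only.

FUNCTION_WORDS = {
    "pronouns":     {"i", "you", "we", "he", "she", "it", "they", "me"},
    "articles":     {"a", "an", "the"},
    "prepositions": {"in", "on", "at", "of", "to", "with", "for"},
    "negations":    {"no", "not", "never"},
}

def category_rates(text):
    """Percentage of words falling into each category (LIWC-style)."""
    words = text.lower().split()
    total = max(len(words), 1)
    return {cat: 100.0 * sum(w in vocab for w in words) / total
            for cat, vocab in FUNCTION_WORDS.items()}

def lsm_score(speaker_a, speaker_b):
    """Per-category matching 1 - |pa - pb| / (pa + pb + eps), averaged."""
    ra, rb = category_rates(speaker_a), category_rates(speaker_b)
    scores = [1 - abs(ra[c] - rb[c]) / (ra[c] + rb[c] + 0.0001)
              for c in FUNCTION_WORDS]
    return sum(scores) / len(scores)

# A score near 1.0 means the two speakers' styles are closely matched.
print(round(lsm_score("I think we should talk about it",
                      "You never talk to me about the trip"), 2))
```

A tool like US+ would presumably run something of this sort continuously over the transcribed call and trigger a nudge when the balance tips too far one way.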

In practice, then, we have a machine that not only listens carefully to our conversations, but can also – often better than we can – read the other person's emotions and indicate how to react. James Koźniewski from FUTU magazine described it aptly:

Klara connects with her mother via Google Hangouts. She immediately starts talking about an annoying colleague from her yoga class who recently got pregnant and won't stop talking about it. Klara's torrent of words and emotions seems to have no end. Suddenly, a discreet notification pops up in the corner of Klara's screen: “You talk too much about yourself. Ask about your interlocutor's feelings.” Klara stops mid-word and asks her mum how she is feeling, and thanks to that learns about an important matter that has been troubling her recently.

SHORE – scanning your interlocutor

Sounds incredible? I agree – it doesn't just sound amazing, it is amazing. And that is not the end of what these machines can do. Their range also includes recognizing the gender, age and emotional state of our interlocutors, as exemplified by my favorite app, SHORE, developed by the Fraunhofer Institute and originally designed for Google Glass.

It is also an example of how much progress has been made in this field in recent years – recognizing human emotions was possible earlier, but it relied on a range of additional data supplied by, among other things, sensors measuring heart rate or skin perspiration. That is no longer necessary – a view of the face is enough.
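As a rough illustration of the “a view of the face is enough” pipeline, the sketch below finds a face in a webcam frame with OpenCV's bundled Haar cascade and hands the crop to an emotion classifier. The classifier here is a hypothetical placeholder – SHORE's actual SDK is not public in this form – so treat this as the shape of the approach, not the Fraunhofer implementation:

```python
# Sketch of a face-only emotion pipeline: detect, crop, classify.
# Requires the opencv-python package; classify_emotion() below is a
# hypothetical stand-in for a trained model such as the one in SHORE.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_img):
    """Placeholder: a real system would run a trained classifier
    (e.g. a CNN trained on labeled facial expressions) here."""
    return "neutral"

def emotions_in_frame(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1,
                                          minNeighbors=5)
    return [classify_emotion(gray[y:y + h, x:x + w])
            for (x, y, w, h) in faces]

cap = cv2.VideoCapture(0)      # default webcam
ok, frame = cap.read()
if ok:
    print(emotions_in_frame(frame))
cap.release()
```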

Thanks to modern technology we can have – pardon the expression – the emotional intelligence of a glacial boulder and still know unerringly what our interlocutors are feeling at any given moment. One example is a tool called Realeyes, designed to analyze the reactions of people watching videos.



Recognizing microexpressions

I remember how, years ago, I watched with great interest the TV series “Lie to Me”, whose title was rendered into Polish with the usual predictability. The series was about Dr. Cal Lightman, played by Tim Roth, who together with his team read people – maybe not like an open book, but well enough to replace any sensitive polygraph.

The series was based on real research conducted by the American psychologist Paul Ekman and on his theories of non-verbal communication, in particular microexpressions. These are involuntary muscle contractions, lasting about 50 milliseconds, that express our true emotions through facial expressions.

The problem is that (leaving aside the controversies surrounding the subject) microexpressions are invisible to most people. A machine is another matter – a good camera recording many frames per second, plus image-analysis software, can work wonders here. Tools such as Affectiva or Emotient, which analyze the behavior of people on film, are proof of that.
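Simple arithmetic shows why frame rate matters here. Taking the article's ~50 ms figure, this sketch counts how many frames a microexpression would span at typical camera speeds:

```python
# How many frames does a ~50 ms microexpression span at a given fps?
MICROEXPRESSION_MS = 50

for fps in (24, 30, 60, 120, 240):
    frame_ms = 1000 / fps                    # time between frames
    frames = MICROEXPRESSION_MS / frame_ms   # frames covering the event
    print(f"{fps:>3} fps: a frame every {frame_ms:5.1f} ms "
          f"-> ~{frames:.1f} frames")
```

At cinema-style 24 fps the expression may land on a single frame and be easy to miss; a 240 fps camera captures about a dozen samples of it, which is what makes automated analysis feasible.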

It won't end with sales

Why all this? For now the aim is primarily sales: analyzing reactions to an advertisement or to various products, or using “suffering” machines to extract money from us. The possibilities, however, reach much further.

What SHORE and US+ offer us is only a foretaste of what this technology can do. If machines can already analyze our conversations in real time, suggest topics and check our interlocutors' reactions, the near future promises to be very interesting: with the support of technology, each of us could be the life of the party, enjoy a successful sex life, or become a salesman from whom even veterans of the trade like John Bosworth of “Halt and Catch Fire” could learn.

I wonder, though, what happens when everyone starts using such enhancements, carrying them around in the form of glasses or some kind of neural implants. Can you imagine what the world will look like when we all start saying what the machines suggest, each side's algorithms analyzing the other's effectiveness?


We live in interesting times.

