How a computer can know our emotions

Computers will read our emotions much as humans do. The process begins by connecting an array of sensors (cameras, microphones, skin-conductivity devices) to a computer that gathers varied information about facial expression, posture, gesture, tone of voice and more.
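
As a rough sketch of that first step (every sensor field and value below is invented for illustration, not drawn from any real device API), the heterogeneous readings might be fused into a single numeric feature vector:

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    """One moment of multi-sensor input; every field here is hypothetical."""
    facial_landmarks: list[float]  # e.g. normalised brow/mouth positions from a camera
    voice_pitch_hz: float          # fundamental frequency estimated from a microphone
    skin_conductance: float        # microsiemens from a skin-conductivity device

    def as_feature_vector(self) -> list[float]:
        # Flatten the heterogeneous readings into one numeric vector
        # that downstream pattern-matching software can consume.
        return [*self.facial_landmarks, self.voice_pitch_hz, self.skin_conductance]

snapshot = SensorSnapshot([0.8, 0.3], 210.0, 2.1)
print(snapshot.as_feature_vector())  # -> [0.8, 0.3, 210.0, 2.1]
```
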
Advanced software then processes the data and, by referencing a database of known patterns, categorises what the sensors are picking up. The pattern might match any of a range of emotions: anger, disgust, fear, happiness, sadness, surprise, amusement, contempt, contentment, embarrassment, excitement, guilt, pride in an achievement, relief, satisfaction, sensory pleasure or shame.
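
A minimal sketch of that matching step, assuming each emotion is stored as one labelled feature vector (the database and all its numbers are toy values, not how any real system represents emotions):

```python
import math

# A toy "database of known patterns": feature vectors labelled with emotions.
# Values are invented for illustration; a real system would learn these.
PATTERN_DB = {
    "happy":     [0.9, 0.8, 210.0, 2.1],
    "sad":       [0.2, 0.1, 140.0, 1.2],
    "surprised": [0.95, 0.9, 260.0, 3.4],
    "angry":     [0.3, 0.7, 230.0, 4.0],
}

def classify(features: list[float]) -> str:
    """Return the emotion whose stored pattern lies closest to the input."""
    return min(PATTERN_DB, key=lambda label: math.dist(features, PATTERN_DB[label]))

print(classify([0.85, 0.75, 215.0, 2.0]))  # -> "happy"
```

Nearest-neighbour matching is only one stand-in for "referencing a database of known patterns"; production systems typically use trained statistical models rather than a literal lookup.
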

The system uses a feedback loop to learn and improve. If it is connected to other systems, what one system learns can be learned by all.
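
To make the feedback loop concrete, here is a minimal sketch reusing the label-to-vector database shape from above: each confirmed or corrected observation nudges a stored pattern toward what was actually seen, and a crude averaging step stands in for connected systems sharing what one of them learns.

```python
LEARNING_RATE = 0.1  # how strongly one observation shifts a stored pattern

def learn_from_feedback(db: dict[str, list[float]],
                        features: list[float], true_label: str) -> None:
    """Nudge the stored pattern for true_label toward the observed features."""
    pattern = db[true_label]
    for i, observed in enumerate(features):
        pattern[i] += LEARNING_RATE * (observed - pattern[i])

def share_patterns(local_db: dict[str, list[float]],
                   remote_db: dict[str, list[float]]) -> None:
    """Crude stand-in for networked learning: average overlapping patterns,
    adopt unfamiliar ones wholesale."""
    for label, remote in remote_db.items():
        if label in local_db:
            local_db[label] = [(a + b) / 2 for a, b in zip(local_db[label], remote)]
        else:
            local_db[label] = list(remote)

# Usage: a correction pulls the stored "sad" pattern toward what was observed.
db = {"sad": [0.2, 0.1, 140.0, 1.2]}
learn_from_feedback(db, [0.4, 0.3, 150.0, 1.4], "sad")
print(db["sad"])  # -> approximately [0.22, 0.12, 141.0, 1.22]
```
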
Here’s where it gets scary for some. With ageing populations and more people living alone, there is rising demand for companions and helpers at home and at work to perform tasks that people are reluctant to do.
This need will increasingly be met by anthropomorphic, artificially intelligent robots that look and behave like humans. Over time, these will become ever more lifelike, because people like to project human qualities onto the things they live with.
While it is still in the realm of science fiction, so much effort is going into their creation that a world in which people live and interact with humanoid robots is not far off.
