Facial recognition software works remarkably well. It measures various parameters, such as the distance between a person's eyes and the height from the lip to the top of the nose, and then compares those measurements against photos of people in a database.
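The matching step can be pictured as a nearest-neighbour search over measurement vectors. The sketch below is purely illustrative: the feature names, the numbers, and the Euclidean-distance rule are all assumptions for the example, not how any particular recognition system works.

```python
import math

# Hypothetical feature vectors: (eye distance, lip-to-nose height, face width),
# in arbitrary normalised units -- illustrative values, not real biometric data.
database = {
    "alice": (2.10, 1.35, 4.80),
    "bob":   (2.45, 1.10, 5.20),
}

def closest_match(probe, db):
    """Return the database identity whose measurements are nearest to the probe."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(db, key=lambda name: dist(probe, db[name]))

print(closest_match((2.12, 1.33, 4.85), database))  # -> alice
```

Real systems use far richer features and learned similarity measures, but the compare-against-a-database structure is the same.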

Why not create emotion recognition software that can use its own custom parameters? 

Dev Drume Agrawal, Shiv Ram Dubey and Anand Singh Jalal of GLA University in Mathura, writing in the International Journal of Computational Vision and Robotics, have taken a three-phase approach to a software emotion detector.

The first involves developing an algorithm that can precisely identify and define the features of the human face. The second analyses the particular positions and shapes of those features. The third phase then associates them with a person's emotional state to decide whether they are happy, sad, angry, surprised, fearful or disgusted. Preliminary tests gave a 94 percent success rate, the team reports.
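The three phases can be sketched as a simple pipeline. Every function body below is a hypothetical stand-in (a single toy rule replaces the paper's trained classifier, and "landmarks" arrive pre-computed); only the phase structure reflects the description above.

```python
# A minimal sketch of the three-phase structure, with hypothetical stand-in logic.

EMOTIONS = ["happy", "sad", "angry", "surprised", "fearful", "disgusted"]

def detect_features(image):
    # Phase 1: locate facial landmarks (eyes, brows, mouth corners, ...).
    # Here we pretend the "image" already carries a dict of landmark points.
    return image["landmarks"]

def analyse_geometry(landmarks):
    # Phase 2: derive shape/position measurements from the landmarks,
    # e.g. how high the mouth corners sit relative to the mouth centre.
    left, right, top = (landmarks["mouth_left"],
                        landmarks["mouth_right"],
                        landmarks["mouth_top"])
    curvature = ((left[1] + right[1]) / 2) - top[1]  # positive -> corners above centre
    return {"mouth_curvature": curvature}

def classify_emotion(measurements):
    # Phase 3: map measurements to one of the six emotion classes.
    # A single toy rule stands in for a real trained classifier.
    return "happy" if measurements["mouth_curvature"] > 0 else "sad"

def detect(image):
    return classify_emotion(analyse_geometry(detect_features(image)))

smiling = {"landmarks": {"mouth_left": (0, 5), "mouth_right": (10, 5), "mouth_top": (5, 3)}}
print(detect(smiling))  # -> happy
```

In a real detector each phase would be a substantial model in its own right, but chaining feature detection, geometric analysis and classification in this order is the essence of the approach.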

The 1960s notion that half of human communication is non-verbal has been debunked, but facial expressions and body language do convey a lot of information about a person's thoughts and emotional state. Such information, if it could be interpreted by a computer, would allow us to enhance human-computer interactions.

Obviously such a feature would make its way into the iPhone 10, where Siri could change the background image or music based on whether you looked happy, or sad after repeating the same question three times. But in a more practical application, the recognition of anger, pent-up aggression, or fear at airport screening might allow suspicious individuals to be channeled to security sooner rather than later, while those with nothing to hide would be funneled through to the usual physical checks with less delay. In "Minority Report" fashion, authorities could stop crimes before they happen, except those of actual sociopaths, who would never have anything but a neutral face.

"Our experimental results suggest that the introduced method is able to support more accurate classification of emotion classification from images of faces," the team says. They add that additional refinements to the classification algorithms will improve their emotion detector still further.