Cognitive intelligence has long been quantified through recognised examinations such as IQ tests. Emotional intelligence has proved harder to capture. An Artificial Intelligence (AI) startup called Human may have cracked the EQ code.

The company claims it has developed Emotional Artificial Intelligence (EAI) software that can decipher deep personality traits in real time by analysing subliminal facial expressions.

Video captures a person's individual expressions, which machine learning evaluates both to reveal their emotions in a given moment and to understand their underlying behavioural traits. The results can indicate whether someone is lying, disagreeing, nervous or passionate.

The software can help companies identify the ideal candidate for a job, minimise human bias, gauge customer sentiment and predict human behaviour from emotional cues.

"We look into the micro-expressions," says Human's CEO and founder Yi Xu. "These are the milliseconds of movement on the face which the naked human eyes often get wrong."

These brief, involuntary facial movements reveal the emotions that humans can otherwise hide.

Human's machine learning models classify them into a range of emotional categories, such as levels of happiness, and character traits, such as confidence.
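Human has not published how its models work, but the general idea of turning frame-by-frame classifications into a more stable summary can be sketched in a few lines of Python. The category names, scores and simple averaging below are illustrative assumptions, not the company's method.

```python
from collections import defaultdict

# Hypothetical per-frame output of an emotion classifier: each video frame
# yields a score for every category the model recognises.
frame_predictions = [
    {"happiness": 0.62, "surprise": 0.21, "confidence": 0.05},
    {"happiness": 0.58, "surprise": 0.30, "confidence": 0.04},
    {"happiness": 0.11, "surprise": 0.07, "confidence": 0.71},
]

def summarise(frames):
    """Average frame-level scores into a more stable summary profile.

    Real trait scoring would be far more sophisticated; this only
    illustrates the step from momentary signals to a summary.
    """
    totals = defaultdict(float)
    for frame in frames:
        for category, score in frame.items():
            totals[category] += score
    return {category: total / len(frames) for category, total in totals.items()}

print(summarise(frame_predictions))
# {'happiness': 0.44, 'surprise': 0.19, 'confidence': 0.27} (approximately)
```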

The startup was founded in London in 2016 and has offices in the UK, China and the US. The team includes data scientists, microexpression coders, psychologists and Yi, whose CV includes stints as an investment banker at Credit Suisse and a news anchor for Shanghai's state-owned Finance Channel. It was in the latter role that the idea for Human emerged.

"We had about 600 million audience coverage, but I had zero interaction with my audience," she recalls. "They just switch channels. I didn't know what I did to cause that, but I realised it had something to with my memorable communication.

"I conveyed my message to them through the space, and I wanted to quantify this to understand my own performance. And here we go; the first form of Human was born."

How Human works

Emotional intelligence has been assessed by AI before. Human adds value by also evaluating character traits.

Emotions can vary dramatically from one second or situation to the next, but character traits remain relatively stable, helping Human to provide a more reliable personality profile.

They're broken down into seven separate types: passion; confidence; honesty; nervousness; curiosity; disagreeing; and judging.

Human can assess the emotional states of numerous individuals as they change. Credit: Human

Clients can also ask Human to tailor categories to their specific needs, such as measuring an individual's level of distress or depression.

"We give a company a combination of the emotion characteristics, and then they interpret that into what the profile looks like," explains Yi.

"There is no right or wrong. It doesn't mean that not being passionate is better. It could be that for this particular purpose I want to find someone who is a calmer person, for example, or a less active person.

"We tend not to interpret the data for our clients. We provide the solution and we minimise the human bias in the data for them. However, it's ultimately a human decision process. We leave the decision in our clients' hands, for them to decide how they want to use this data, and how they want to look into the data."

Human's diverse roster of clients

Recruitment companies have been particularly interested in Human's technology. Video and voice interviewing company Montage has been working with Human to reduce discrimination in job interviews.

"They send us the videos," explains Yi. "We don't know who is who, we just see video data, and then we send back the analysis. The idea behind this process is to remove the bias or discrimination in the screening process."

The software also allows recruitment companies to screen candidates by emotional intelligence and understand who they are as well as what they know.

"You don't know who the person is. Race, gender, or age don't matter. You tell us that you want to screen candidates for the most passionate, or the most confident. Then we rank these, and provide you a list of candidates ranked highest with the traits you select.”

Human is also helping sports intelligence companies and teams, including an unnamed Major League Baseball (MLB) club, to scout players and assess their mental state before and during games, which they can compare with both their teammates and competitors.

Identity fraud detection is another early use case. Clients who talk to potential fraudsters can assess stress reaction, honesty and nervousness, adding another layer of data to their investigations.

Yi believes the technology can also make a deeper contribution to society. It could help prevent violence by assessing publicly available information for emotional risk, or proactively identify problem gambling by spotting signs of distress in players.

Human is currently working with a public authority to identify suicide risk by conducting real-time analysis on CCTV footage of faces in a crowd.

"They know at certain spots, people have a higher likelihood to commit suicide," says Yi. "It's never early enough to prevent these. Now, they want to use a machine to send alerts."

By recognising the expressions associated with suicide risk, the authority can immediately reach out to the person at risk and offer help.
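Neither Human nor the authority has described the alerting pipeline, so the sketch below is only a guess at its shape: score every face detected in the live feed and notify responders when a risk estimate crosses a threshold. The interfaces and the threshold value are invented for illustration.

```python
ALERT_THRESHOLD = 0.8  # hypothetical cut-off for raising an alert

def monitor(stream, score_face, send_alert):
    """Score each detected face in a CCTV stream and alert on high risk.

    `stream`, `score_face` and `send_alert` are placeholders for the camera
    feed, the risk model and the responders' channel; none of these
    interfaces is publicly documented.
    """
    for frame in stream:
        for face in frame.detected_faces:
            risk = score_face(face)
            if risk >= ALERT_THRESHOLD:
                send_alert(camera=frame.camera_id, risk=risk)
```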

Yi says the client is responsible for addressing privacy concerns and following data protection regulations, but Human has its own measures in place to limit the risk.

"We are very cautious who our clients are," she says. "We want all of our clients to get on top of data privacy, but at the same time, we are aware for a potential breach of data privacy by certain parties in the future, where, we haven't seen so far, but we are ready. When that case comes up, we are ready to fix it."

Limitations of EAI

Sceptics may question how accurately AI can map an individual's distinctive facial expressions to their emotional state. Expressions vary strikingly from person to person: some of us are more emotive than others, and at times we deliberately disguise our true feelings in our faces.

Yi claims that Human is capable of recognising extremely subtle variations.

"In every facial expression, every emotion, or characteristic trait, there's always so many things that jump up, but you'd be surprised by how sensitive [the details are that] machine can pick up, because it looks at a lot more data than the human eye can see, and at a lot higher frame frequency," she says.

The effect of cultural differences on facial expressions is another area of debate.

Charles Darwin argued that they are innate and universal, but other prominent scientists believe they're culturally learned.

Dr David Matsumoto, a professor of Psychology at San Francisco State University and expert on microexpressions, has conducted research that supports Darwin's claims. Matsumoto analysed thousands of photographs of both sighted and blind judokas at the 2004 Olympic and Paralympic Games, and discovered that they displayed similar facial expressions during their wins and losses.

Human's algorithms are adjusted to account for any potential impact of cultural differences, but the company leaves any actions up to its client.

"You wouldn't consider a Finnish passion level is the same as American passion level, for example, right?" Yi asks.

"That's a cultural adjustment that we take into account. But we only do the adjustment. We would never, say, discriminate, separate the groups completely and then compare them."

Her ultimate ambition is to use EAI to develop a deeper level of understanding about human nature and behaviour.

"Even psychologists use human judgment," she says. "So I want to see if there is any way we can decipher this part, we can really understand who we are better."
