II.1 AI makes Happiness Measurable by Aggregating the Wisdom of the Crowd
Computers and the Internet empower us to measure inter-human interaction at a high level of granularity and detail. Sensors combined with AI give us the capability to continuously analyze and interpret communication. In this chapter we will learn how to leverage the most recent technological advances of the Internet, wearable technology, cloud computing, and artificial intelligence to measure happiness, wellbeing, workplace satisfaction, and stress, and to mirror these measurements back to the individual. This will lead to more connected, collectively aware, entangled team members, and thus to teams collaborating in groupflow.
The recent progress in artificial intelligence allows users to accomplish typically human tasks of common-sense pattern recognition, such as recognizing human faces or writing newspaper articles, that were impossible for machines until now. While the “wisdom of crowds” has been described and tested many times, AI combines the limited intelligence of isolated individuals and thereby empowers them to accomplish tasks that no individual could achieve in isolation. For instance, there is a test called “Reading the Mind in the Eyes” that measures emotional intelligence by showing only the eye region of a face and asking the reader to correctly identify the emotion of the person the eyes belong to. It is one of the tests used to identify autistic people; it has also been shown that men perform much worse on this test than women. When I took the test, I got about 50% of the faces wrong. Using facial image recognition, AI can reach up to 90% accuracy in recognizing the emotion of a face. This is done by combining the wisdom of the crowd: a machine learning system is trained with thousands of face pictures that have previously been labeled by different people, each with the emotion that the individual doing the labeling thinks she sees in the face. Thus, even if an individual is as bad as I am at recognizing the emotions of others from their eyes, combining the collective emotional intelligence of many people allows the AI system to make a 90% accurate prediction. It gives the “emotionally challenged” individual a tool for accurately recognizing the mood of the person the individual is talking to. For instance, wearing smartglasses, a user can apply this automated facial emotion recognition by looking at other people’s faces and having the computer tell the wearer the emotions of the people being looked at, thus overcoming a potential deficiency in reading the mind in the eyes.
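The core idea of the paragraph above – that aggregating many imperfect individual judgments yields a far more accurate collective judgment – can be illustrated with a small simulation. This is a hypothetical sketch, not the actual training pipeline of an emotion recognizer: it assumes each crowd labeler tags a face correctly with 65% probability (roughly the accuracy of a mediocre human rater) and shows that a majority vote over 25 such labelers is right far more often.

```python
import random
from collections import Counter

def majority_label(labels):
    """Return the most common label among the crowd's annotations."""
    return Counter(labels).most_common(1)[0][0]

def simulate(n_labelers, p_correct, n_faces=1000, seed=42):
    """Fraction of faces whose majority-vote label matches the true
    emotion, when each labeler is right with probability p_correct.
    The true emotion of every simulated face is 'happy'."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_faces):
        labels = ["happy" if rng.random() < p_correct else "sad"
                  for _ in range(n_labelers)]
        if majority_label(labels) == "happy":
            correct += 1
    return correct / n_faces

# One labeler at 65% accuracy vs. a crowd of 25 such labelers:
solo = simulate(1, 0.65)
crowd = simulate(25, 0.65)
```

Running this, the lone labeler stays near 65% while the crowd's majority vote lands around 90%, mirroring how a model trained on crowd-labeled faces can outperform any individual annotator.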
Figure 15. Emotion recognition with Vuzix Smartglasses
Figure 15 shows the view from the smartphone camera of an app that we built, which runs on an Android phone in combination with the Vuzix smartglasses. It captures the face of the person the wearer of the glasses is looking at. In this case I am looking at myself while wearing the Vuzix glasses running our software; the picture on the smartphone shows the wearer the emotions of the person being looked at. In this example I get a confirmation that I am showing a happy face.
On the technical level we combine social network analysis, online social media tracking, machine learning, and AI to build a social map of the self and its environment as the foundation for measuring and improving entanglement in groups and between two individuals. Our approach tells you where you are on the journey of finding yourself, knowing who you really are, and how to better navigate your social landscape for a happier future. It includes the calculation of human emotions from facial expression recognition, and of body language through sensors. The same methods are applied to analyzing global networks such as Twitter or Reddit to find entangled swarms of Trump followers and global warming activists. In organizational networks constructed from communication archives such as e-mail, entanglement in teams, departments, branch offices, and divisions of companies and other organizations can be measured. In direct interpersonal interaction between two individuals and in small teams, their degree of entanglement can be tracked. Our approach is based on social network analysis, a statistical method grounded in graph theory that computes the structural properties of networks and compares them with the business task the members of the network are working on. In addition, we use natural language processing to analyze the content of network interactions, such as the content of tweets, e-mails, and Skype, Teams, or Slack messages. Using image and sound processing, we calculate emotions from facial expressions and voice, as well as from body movements captured by the sensors of a smartwatch. All of these metrics extracted from communication archives are fed into a time series analysis of changes over time. Using these values as inputs for machine learning predicts the future behavior of network nodes – individuals and groups – based on their statistical properties.
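To make the graph-theoretic side of this approach concrete, here is a minimal sketch of one structural property mentioned above: degree centrality computed from an e-mail communication archive. The edge list and the names in it are hypothetical illustrations, not data from an actual study; a real analysis would use a dedicated network library and many more metrics.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Degree centrality: the fraction of the other nodes that each
    node is directly connected to (a standard graph-theoretic measure)."""
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    n = len(neighbors)
    return {node: len(nbrs) / (n - 1) for node, nbrs in neighbors.items()}

# Hypothetical e-mail exchanges between five team members:
emails = [("ana", "ben"), ("ana", "carl"), ("ana", "dana"),
          ("ben", "carl"), ("dana", "eve")]
centrality = degree_centrality(emails)
# "ana", who exchanges e-mail with three of the four others,
# emerges as the structural hub of this toy network.
```

Values like these, computed per person and per team over time, are the kind of structural metrics that feed into the time series and machine learning stages described above.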
Analyzing online social media using this method will foresee tomorrow’s mood of Donald Trump, the spread of COVID-19 through online social media, and tomorrow’s most popular news. Creating online virtual tribes reveals customer demographics based on their online expressions. Examining corporate e-mail allows a company to create a “weather forecast” of tomorrow’s customer, employee, and supplier satisfaction. Tracking the body signals of individuals through smartwatches, and face and voice emotion recognition with cameras and microphones, will calculate individual and team happiness and meeting satisfaction.
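The “weather forecast” metaphor can be sketched in its simplest possible form: a one-step-ahead forecast of a daily satisfaction score by exponential smoothing. The sentiment numbers below are invented for illustration; the production systems described in this chapter use far richer inputs and full machine learning models rather than this single smoothed series.

```python
def exponential_smoothing_forecast(series, alpha=0.5):
    """One-step-ahead forecast by simple exponential smoothing:
    level = alpha * observation + (1 - alpha) * previous level."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Hypothetical daily team-sentiment scores (0 = negative, 1 = positive):
sentiment = [0.62, 0.58, 0.66, 0.71, 0.69, 0.74, 0.78]
tomorrow = exponential_smoothing_forecast(sentiment)
# The forecast weights recent days more heavily, so the rising
# trend pushes tomorrow's predicted satisfaction toward 0.75.
```

Even this toy forecaster shows the principle: yesterday’s measured mood, weighted against the recent past, becomes tomorrow’s prediction.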