Apple’s Acquisition of Emotient
Apple just bought Emotient, a technology company dedicated to facial expression recognition. I have to admit that this kind of thing gives me chills, or fear, or something I can't quite name. I'm not talking about the acquisition itself (which should worry us anyway, given the tremendous power large companies hold and how they buy each other up), but about the recognition of human facial expressions.
The Problem of Involuntary Privacy Loss
Because it shows that, when it comes to privacy and respect for ourselves, there seems to be no limit. All our actions in the digital world (with or without our consent) feed large data-mining systems that are used for all sorts of purposes. Against that, we at least have the option of not participating, of leaving no trace: not using digital services, seeking out anonymity tools, and so on.
But when it comes to reading faces, we have no way to opt out. Nobody asks whether we consent to a computational system reading and processing our expressions. Yet our expressions are, in a sense, public: we don't walk around with our faces covered, so they are there to be read and processed.
The Terrifying Landscape of Technological Convergence
At first it may seem like harmless fun, but if we start adding up related technologies, the picture becomes terrifying. Let's add them up: facial expression recognition + cameras in almost every device + cameras on the streets.
Concerning Possible Uses
Mood-targeted Advertising
Advertising targeted at people's moods. Are you sad? I sell you antidepressants. Are you happy? You'll seize the moment to make some foolish decision, so I sell you something you don't need.
Massive Emotional Experimentation
I experiment with your expressions and see what happens. Simplified: are you sad? I show you something that makes you happy. Or the inverse: are you happy? I show you something that makes you sad. Done at massive scale, this lets me discover which elements make people sad or happy. It's not that being sad or happy is bad in itself; what seems wrong to me is that they experiment with our feelings and that we are the test subjects, the guinea pigs.
The Potential of Massive Analysis
Imagine putting a camera in a movie theater and capturing the expressions of every attendee. The potential of the resulting data is tremendous.
The Risks of Proprietary Software
Now Apple owns this technology. Apple's operating systems are closed, so I have no way to verify what they actually do. What if the camera on their phones or laptops activates without my noticing and my mood/face is monitored? What if they exploit that vulnerability of mine (my mood) to take advantage of me when my defenses are down? What if, with all that data processing, they discover new ways to manipulate the human mind?
Final Reflection
Maybe my thinking is a bit apocalyptic; maybe part of this has already been happening for a long time and we simply didn't notice. We don't know.
Somehow this relates to what I wrote some time ago about Google Glass.
And once again I cite the cartoon Mighty Man and Yukk.

Yukk and the Mighty Man
Update
In the end, Apple bought this company to enable unlocking phones with your face. I think I was a bit exaggerated and pessimistic in this post…