The Engineer Using A.I. to Read Your Feelings
Rosalind Picard is an electrical engineer on a mission to engineer emotion into artificial intelligence. In the world of experts who speak in zeros and ones, Picard is schooled in the language of affective computing, which focuses on the importance of encoding emotional intelligence into our digital devices. She wants technology to recognize subtle changes in our mood and adjust what it does accordingly. Artificial intelligence, she says, should respect human emotions.
A tenured professor at MIT’s Media Lab, where she directs the Affective Computing Research Group, Picard likes to cite the cautionary example of Microsoft Word’s Clippy, the anthropomorphic paper clip that used to pop up on the screen with the message, “It looks like you’re writing a letter.” Clippy was meant to be helpful, but users were irritated by the smiling and winking paper clip bouncing across their monitor. “If that were a person, you would probably not invite them back to your office,” Picard says.
Picard’s work has led to new tools that sense changes in the body — namely, sweat and accompanying shifts in how our skin conducts electricity — that stem from changes in the brain. After an early start creating tools to help people with autism communicate, Picard eventually developed an FDA-approved device for predicting epileptic seizures and is now pursuing a more detailed map of the link between the brain and the body.
Her aim, she says, is to empower people through A.I. rather than simply create the next great robot. Picard wants to make wearables and other devices that contribute to people’s lives and “that we’ll feel good on our deathbed saying we helped create.”
The interview has been condensed and edited for clarity.
Rosalind Picard: One of the biggest driving forces for me has been moving from just building machines to refocusing on helping people have better lives. My original vision for A.I. was to build something autonomous and intelligent. I argued that whatever we built had to be emotionally intelligent. You wouldn’t tolerate an annoying robot. Eventually, we built the MACH [My Automated Conversation CoacH], an A.I. system that could respond to nonverbal cues. Our focus had gradually shifted from creating “eye candy” A.I. meant to impress to actually helping people. MACH was designed to sense and respond to nonverbal information and reflect it back to people so they could learn how they were coming across, so they could fix things before, say, going to that important interview.
People who have a hard time reading facial expressions motivated some of our early projects. Some individuals with autism asked us for face-reading software that could give them a real-time interpretation of the facial expression of the person they were speaking with. We installed a tiny wearable camera into eyeglasses and connected it to a computer equipped with a program that could scan the individual’s eyes, nose, mouth, and other facial features and categorize the expression as indicating confusion, interest, thinking, or other feelings that people subtly signal to each other in conversations.
We created a device that measures changes tied to the sympathetic nervous system, the seat of our “fight or flight” responses. Specifically, the device measured electrical changes in the skin. The activation of our fight-or-flight system causes a whole bunch of changes, including opening sweat glands under the skin. Even if you’re not sweaty on the surface, the skin will start to conduct more electricity, which can be measured.
Around 2007, an undergraduate student gave these wrist sensors to his brother, an autistic child around seven years old, to try out. The child’s electrical conductance went up very high, but only on one wrist. I had no explanation for this. I thought our sensors were broken. How can you be stressed on one side of your body and not the other? But the sensors weren’t broken; they were working fine. And the signal was real, not some quirk of the brother’s biology.
I called the student to ask if he knew what had happened, because the data didn’t make sense. He’d been keeping a diary and had the exact moment written down. He told me that the response happened 20 minutes before his brother had a grand mal seizure. I didn’t know the child had epilepsy, which often accompanies autism.
It turned out that the body was behaving in a way I didn’t know it could behave. Our device had been designed to monitor what’s known as a general arousal response: your palms and armpits sweat at the same time, all over the body. Nothing I had read or studied prepared me for the fact that you could sweat on the right wrist and not on the left, unless something was wrong with you.
I was not the first person to discover this. Neurologists and dermatologists knew this already. But it wasn’t generally known in the literature on electrodermal activity. The dogma is that the response is general.
We started a more carefully time-synchronized study at a hospital and found that the signal the sensor registered on the wrist precisely coincided with the onset of unusual electrical activity in the brain — seizures. We published that work in [the journals] Neurology and Epilepsia in 2012. That work led to the creation of a wrist sensor for epileptics to wear that alerts them about the onset of a seizure.
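In software terms, the kind of wrist-worn alerting described above amounts to watching a skin-conductance trace for a surge well above its recent baseline. The sketch below is purely illustrative: the sampling assumptions, baseline window, and threshold multiplier are hypothetical, not taken from the published detector.

```python
# Hypothetical sketch: flag a sustained surge in electrodermal activity (EDA).
# Values represent skin conductance in microsiemens; the 30-sample baseline
# window and 3x threshold are illustrative assumptions, not published values.

def flag_eda_surge(samples, baseline_len=30, factor=3.0):
    """Return indices where conductance exceeds `factor` times the
    mean of the preceding `baseline_len` samples."""
    if len(samples) <= baseline_len:
        return []
    alerts = []
    for i in range(baseline_len, len(samples)):
        baseline = samples[i - baseline_len:i]
        mean = sum(baseline) / baseline_len
        if mean > 0 and samples[i] > factor * mean:
            alerts.append(i)
    return alerts

# Example: a flat baseline around 0.5 microsiemens, then a sharp rise.
trace = [0.5] * 40 + [2.0, 2.5, 3.0]
print(flag_eda_surge(trace))  # prints [40, 41, 42]
```

A real device would also have to reject motion artifacts and, as the interview notes, compare the two wrists, since the asymmetry of the signal is what made it diagnostically interesting in the first place.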
Given that a seizure in a particular part of the brain can cause a particular patch of skin to sweat or experience an electrical change, we wondered how much of this connection could be mapped out. So, say, if the right side of your amygdala is activated, maybe it changes one patch and not another. If we map those connections, then when the activity happens on the skin, we can infer something is happening in your brain that’s helpful to know about.
Sometimes this change in electrical activity can be detected using electrodes attached to the scalp. But sometimes activity deep in the brain, such as in the hippocampus, may not trigger something on the scalp but can be picked up on the wrist. When you start to understand how these things are connected, you can see that this part of the brain was activated even though you’re not measuring it directly.
We are conducting a study in which patients who are undergoing brain surgery for other reasons are invited to contribute data about what’s going on in their brains concurrent with information from a device that they wear on their wrists and other body parts during surgery. This study has begun at Dartmouth-Hitchcock Medical Center, and we are awaiting approval at two other sites.
It’s pretty wacky stuff. I would have written it off entirely if I’d heard of it. But many neurologists and psychologists are very open to recognizing that the brain influences the body in ways we are just beginning to understand. Many scientists used to consider the field of neurostimulation—triggering brain activity through electrical stimulation—as another wacky area, but it is very credible and respected now.
All of this is related to our emotions, which play a powerful role in our lives. They’re connected to many health disorders. And, like seizures, they trigger changes on the surface of our skin.
Over the years of studying emotional intelligence in A.I., I began to wonder about the role emotions play in making us sick or healthy. And how do we enable each individual to have control over that?
Forecasting seizures had a big influence on this line of thinking. Knowing what’s about to happen gives you control. Uncertainty increases stress, which is closely tied to mental health. We want to give individuals something to help them do better, rather than just focusing on A.I. that only people in powerful positions have access to. We are working with positive psychologists to better understand how people who are already well stay well. It’s shifted our focus to not only working with people who are unhealthy but also figuring out how to keep people healthy.
In the world of A.I., some of us are stepping back now and asking what we are doing to human health. What leads to true human flourishing and well-being? Are we enabling the kind of A.I. that gives wealth and power to a smaller and smaller number of people? Or are we enabling A.I. that helps people?
We’ve been doing a lot of reflecting to figure out how to build tools that will enable a better future for humans, not for machines. When we build technology that reminds people that they matter, that helps them achieve something they couldn’t do before, I just feel joy from head to toe.