What's In A Face? Using Deepfake Tech To Spin Gender Bias On Its Head
Posted on Oct 28, 2020
Last year, Megan Sirockman and Kelsey Chudiak of digital experience agency Critical Mass presented at a conference. So did their (male) colleague. When they compared their feedback with his, they discovered one major difference: his was about the content of his talk, and theirs was about them as people.
Turns out, that's pretty common. Women get feedback about how they look and act 35x more often than men do.
Megan and Kelsey decided to shed some light on the bias that exists within feedback by using deepfake technology to turn themselves into dudes. Same talk, same presenters, just a different gender. They quickly discovered they had a lot to learn about deepfake technology, and about how bias manifests in performance reviews, feedback, their industry, and society.
Using everything from Harvard-designed testing programs and bias-tracking metrics to case studies powered by leading-edge deepfake technology, they dissected the whys and wherefores of gender inequality – and what you can do to be part of the solution, at both the individual and the enterprise level.