A New Advisory Helps Domestic Violence Survivors Prevent and Stop Deepfake Abuse
April 25, 2018
New advancements in technology are commonly announced with fanfare and excitement. Domestic violence advocates seldom share that enthusiasm. Experience has taught them to prepare quickly for how a new technology will be misused to harm the people they serve. Artificial intelligence, it turns out, is no exception.
The latest trend, deepfake technology, uses an artificial intelligence method called deep learning to recognize and swap faces in pictures and videos. The technique begins by analyzing a large number of photos or a video of someone’s face, training an artificial intelligence algorithm to manipulate that face, and then using that algorithm to map the face onto a person in a video. Although this technique may have legitimate uses, it can also be used to perpetrate intimate partner abuse — for example, by making it appear that one’s partner appeared in a pornographic video when in fact they did not.
The attached Domestic Violence Advisory, authored by attorneys Erica Johnstone of Ridder, Costa & Johnstone LLP and Adam Dodge of Laura’s House, explains how California family courts can prevent and stop deepfake abuse under the state’s Domestic Violence Prevention Act.