In past conflicts, authoritarian regimes have attempted to exploit American POWs for propaganda gain. These efforts often took the form of video and audio recordings, as well as photographs, of prisoners of war, even though such activities are a clear violation of the Geneva Conventions. Advanced digital capabilities such as deepfake technology now offer potential adversaries an important new tool. The US military must prepare for the possibility that these technologies will be used against prisoners of war in future conflicts.
A deepfake is a video and audio manipulation that makes it appear a person said or did something they never said or did. The technology uses pre-existing audio and video of the target to create a video (potentially even a real-time, live feed) in which another person controls what the deepfake subject appears to say, duplicating the target's face, features, expressions, and voice. The end product is often not merely passable but convincing.
Many deepfake clips have circulated widely on social media. Belgian visual effects artist Chris Umé gained international attention when he created compelling manipulated videos featuring what appeared to be Tom Cruise. The person in the videos is actually actor Miles Fisher, whom Umé altered to look like Tom Cruise. It took Umé two months to create the Tom Cruise deepfake, working without access to Cruise himself and therefore unable to extract audio or facial data to speed up the process. Today, deepfakes can be created in less than five minutes. In a POW scenario, the captor's direct access to the prisoner would make it very simple for the captor to create a deepfake of that POW.
From the perspective of prisoners of war and personnel recovery, this technology creates two distinct concerns.
The first concern is the release of a deepfake of a prisoner of war to the public. Despite violating the Geneva Conventions, such deepfakes could be used to fabricate confessions of war crimes and atrocities, denunciations of the US war effort, demands to end the war, and other propaganda. Video and audio aimed at the American home front could be widely distributed to undermine the war effort and will to fight, strain families, influence politicians, and create divisions in society that weaken support for the war.
The second concern is that the captor may show deepfakes to POWs in order to manipulate them while they are in captivity. The captor can use deepfakes to indoctrinate the captive, destabilize them psychologically, and manipulate their mental state. This risk grows in a prolonged conflict where captivity may last for years. Even if a POW initially dismisses every deepfake as a potential fake, over time the isolation and pressure of captivity could lead the prisoner to accept a deepfake as real.
In our view, the deepfake threat to POWs must be addressed before a potential conflict in which such tactics could be used. Planning and research initiatives must begin to address these increasingly likely possibilities. Initial efforts should include: (1) establishing methods for identifying deepfakes shortly after their publication; (2) exploring the possibility of recording authentic, validated video of all service members as an aid to identifying deepfakes, with sufficient data deposited prior to deployment (much like what is currently done with ISOPREP); (3) preparing both the military and the public in advance for the possibility of deepfakes; and (4) including deepfake awareness in prisoner-of-war training in order to prepare service members for the possibility that deepfakes may be used against them in captivity.
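As a minimal illustration of the second recommendation, the sketch below shows one way a pre-deployment deposit of authenticated reference footage might work: a signed cryptographic digest is stored before deployment, so that any later release claiming to be that footage can be checked against the deposited record. This is purely a hypothetical sketch; the registry, key handling, and function names are our assumptions, not any existing DoD system, and a real program would compare biometric features rather than raw bytes.

```python
import hashlib
import hmac

# Hypothetical demo only: a real system would use PKI-managed keys and
# biometric comparison, not a shared key and byte-level hashing.
SIGNING_KEY = b"registry-demo-key"

# Illustrative registry: service-member ID -> (digest, signature)
REGISTRY = {}

def deposit(member_id: str, footage: bytes) -> None:
    """Store a signed SHA-256 digest of authenticated reference footage."""
    digest = hashlib.sha256(footage).hexdigest()
    signature = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    REGISTRY[member_id] = (digest, signature)

def matches_deposit(member_id: str, footage: bytes) -> bool:
    """True only if footage is byte-identical to the deposited original
    and the registry entry's signature is intact."""
    if member_id not in REGISTRY:
        return False
    digest, signature = REGISTRY[member_id]
    expected = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False  # registry entry was tampered with
    return hmac.compare_digest(digest, hashlib.sha256(footage).hexdigest())

original = b"authenticated reference interview, recorded pre-deployment"
deposit("SM-0001", original)
print(matches_deposit("SM-0001", original))          # True: matches deposit
print(matches_deposit("SM-0001", original + b"x"))   # False: altered footage
```

The key design point is that the deposit happens before capture is possible, so the captor cannot retroactively substitute a manipulated "original."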
Jan Kallberg is a research scientist at the Army Cyber Institute at West Point and an assistant professor at the US Military Academy. Colonel Stephen Hamilton is the chief of staff and technical director of the Army Cyber Institute at West Point and an associate professor at the US Military Academy. The views expressed are those of the authors and do not reflect the official policy or position of the Army Cyber Institute at West Point, the US Military Academy, or the Department of Defense.