US Military Group Wants Weaponized Deepfakes, Better Biometric Tools
At least some in the U.S. military have heard enough about deepfakes and they want in.
Investigative-news publisher The Intercept has obtained a lengthy technology wish list that its editors believe was created by the U.S. Special Operations Command. Two items in the document are biometric in nature.
The command, most often referred to as SOCOM, performs the United States’ most secret and daring military missions. And officers want to add the ability to create and deploy deepfakes against those outside the country.
They also want to up their game in biometrically identifying individuals using, among other techniques, touchless fingerprint capture over long distances and in all environments. Officials also want rapid handheld DNA-collection gear. These items can be found under the Biometrics section of the document.
In all cases, SOCOM wants to cut false positives and gain the ability to compare scanned biometrics against watch lists on handheld devices or remote databases. Those handhelds will need to perform all common biometric analyses, including DNA comparisons.
But the showstopper is the unit’s deepfake ambitions (under Military Information Support Operations in the document). The leaders of many advanced economies, including various agency heads in the United States, have publicly stated their wariness of deepfakes.
(Three years ago, a NATO panel dismissed concerns about deepfakes. Even last year, there were those telling people not to worry.)
Many feel military deepfakes belong in a category of weapons that by their nature cannot be reliably controlled once unleashed. There is no end to the scourges that could result, which could include rape, biological and chemical attacks, and nuclear bombings.
Some military officers and military experts think deepfakes can be interpreted as at least partly illegal under the international laws of war. They likely run afoul of Article 37 of Additional Protocol I to the Geneva Conventions, which prohibits perfidy.
One common example of perfidy is pretending to want to surrender. So is a soldier pretending to be wounded or to be a civilian.
The law is less clear in situations where nations, or even individual soldiers on the battlefield, might use a deepfake to convince civilians that a particularly heinous attack is coming, creating panic at the least.
A case can be made that perfidy has already occurred in Ukraine, where a deepfake of the country’s president appeared, telling his nation to stand down in its defense against Russia’s invasion. It has been widely reported that Russian troops have tried to pass themselves off as civilians.
Brigham Young University law professor Eric Talbot Jensen wrote about this topic three years ago and decided, “Deepfakes present an inevitable innovation” in war making.
In his analysis for the scholarly publication Articles of War, Jensen’s suggestions are few.
The international community has to judge which uses of deepfakes are illegal in war. And military leaders have to find uses for deepfakes that are safe for civilian populations.
Jim Nash is a business journalist. His byline has appeared in The New York Times, Investor’s Business Daily, Robotics Business Review and other publications. You can find Jim on LinkedIn.