Senate Foreign Relations Chair Cardin targeted in deepfake attack


Full story

The chairman of the Senate Foreign Relations Committee was the target of a deepfake attack in which someone posed as a top Ukrainian official during a Zoom call. The incident, which happened earlier this month, involved someone impersonating the now-former Ukrainian Foreign Minister Dmytro Kuleba during a conference with Sen. Ben Cardin, D-Md.  

The story was exclusively reported by Punchbowl News. Straight Arrow News reached out to the committee to confirm the incident and the details. 

The Punchbowl report said the person impersonating Kuleba asked questions that participants found bizarre, such as, “Do you support long-range missiles into Russian territory?”

According to Punchbowl, the Senate Security Office warned a group of leadership aides and security chiefs to be vigilant about similar impersonation attempts.

Sources told the outlet that Kuleba’s voice was likely recreated using artificial intelligence. The security notice stated the deepfake had “technical sophistication and believability.”

The motive has not been confirmed. However, people briefed on the matter said Russia may have staged the call in an attempt to create propaganda showing a senior U.S. senator saying he supports attacks on Russian territory.

The FBI is investigating. 

