An audio clip falsely attributed to Kamala Harris has been taken down by TikTok after being widely shared in online posts.
Concern has risen over the dissemination of digitally altered images and videos, known as deepfakes, particularly in the lead-up to the presidential election. With evolving technology, creating convincing deepfakes has become easier, potentially leading voters astray by portraying candidates as saying or doing things they never actually said or did.
The most recent political deepfake making the rounds online involved a doctored audio clip of Harris seemingly speaking incoherently.
The audio was taken from a speech delivered by Harris at Howard University on April 25, 2023, during an abortion rights rally.
Harris stated, “I think it’s very important, as you have heard from so many incredible leaders, for us at every moment in time, and certainly this one, to see the moment in time in which we exist and are present, and to be able to contextualize it, to understand where we exist in the history and in the moment as it relates not only to the past but the future.”
The manipulated audio circulating online misrepresents Harris, making it appear as if she said, “Today is today, and yesterday is today yesterday. Tomorrow will be today tomorrow, so live today so the future today will be as the past today, as it is tomorrow.”
In 2023, the fake video initially went viral before being widely debunked. A post on X that gained over 4.2 million views now includes a disclaimer stating the video is altered.
PolitiFact, a fact-checking website, also shared a video on TikTok explaining how to identify tampered footage, citing disparities between Harris' mouth movements and the voiceover.
Despite efforts to debunk it, the video recently resurfaced on social media. An X user reposted it on July 16, prompting discussions and drawing 3.4 million views. Comments on the post warning that it is manipulated direct users to Reuters' debunking.
TikTok has stringent policies against harmful AI-generated content and misleading media; the platform removes such content swiftly and collaborates with fact-checkers to verify accuracy in real time.
Nevertheless, some users continue to share the doctored audio, leading TikTok to block posts and notify users that they have breached community guidelines, as happened in a discussion on one audio post.
This is not the only deepfake to surface ahead of the 2024 election. A doctored clip of Joe Biden falsely claiming that Russia had occupied Kyiv for a decade was debunked in March. In January, a fake clip supposedly showing Biden threatening war on Texas emerged and was soon debunked by PolitiFact.