TikTok and a Phony ‘Commitment to America’ Infographic: 2022 Midterm Disinformation in Real Time
TL;DR: What happens when a piece of midterm election disinformation that successfully goes viral is traced back to a video from a TikTok account that was quietly deleted? Will anyone take note, or will it slip by unnoticed?
Greetings everyone! In the past few weeks I managed to fight back against my post-dog-bite fears by receiving the required six-shot regimen of tetanus and rabies vaccines. For the next six months I will be so chock-full of rabies antibodies that I could be the one going around biting dogs if I wanted to. And truth be told, the more upscale hipster Russian mobilization dodgers I see toting Corgis, I just might.
Should have left little Sashi the Corgi back in Moscow, Sergei. You’ll be running with the big dogs here in the Caucasus region.
Dog bites? So what.
Multiple COVID infections? Whatever.
Weird to have gone through the past few years and have my one main post-COVID symptom be that my feet looked like freckled trout for the first time. I’ll tell you something, for a guy like me who was always convinced he was going to die young, there’s no way I would allow the surprise of reaching middle age to be hindered by the fear of a mystery ailment that may or may not have originated in a Chinese fish market.
My significant other marveled at my lack of apprehension about vaccines. It’s easy to understand, I explained. Pancho Villa once said it’s better to die on your feet than to live on your knees. For me, at this point in my life, I’d rather die from the supposed myocardial side effects of an American vaccine than live with the aftereffects of full-blown COVID-19 sans vaccine. I even got some snarky direct messages from a twentysomething anti-vax tech bro who objected to me calling him out for labeling his irregular heartbeat as a heart attack. Cream soda gives you an irregular heartbeat, son.
But I digress, as with all those other times I am back to talking about TikTok.
No, this time I won’t be writing about state-sponsored anti-Pfizer messaging on TikTok — although I have a really interesting story about that. And, unlike those other errant ramblings about the identities of white supremacist extremists on TikTok or pro-Kremlin advertising gimmicks by Moscow social media marketing agencies, this time I have something about a piece of content that originated on TikTok and managed to successfully go viral on Twitter and Facebook.
I am still working on this one, and of course, I am a lazy and undisciplined thinker. But, I wanted to get some of this down before I forget. Below you’ll find the obligatory letter to TikTok’s security division.
Back on October 21st, I was scrolling through Facebook when I noticed a university-era friend of mine had shared an unusual screenshot: a bizarre infographic that looked like this:
After spotting a screenshot of this image on Facebook, I saw that it tracked back to a Twitter user. From there, attribution was easy.
I tweeted about this bogus image on October 21st. Please note: the “Stay Informed” warning message was not on this image/tweet at that time.
I did all the usual tricks to determine the attribution and/or find a source for the image. Since the language was so particular, I started there:
A search led to a meme hosting website. Crucially, the image they displayed gave attribution to a TikTok account:
Of course, this link now leads to a 404. That means the evidence of what happened (aka how to attribute it) is being wiped without the public ever being told why.
Although it is now a 404 error page, for a while this meme hosting link provided proof of a username: either “stevet7la” or “stevet71a.”
The time and date stamps indicate that the image first appeared on TikTok. A corresponding search of Twitter shows that the sharing of the content was coordinated across other social media platforms.
A quick check of Twitter reveals that links to the now deleted video are still up:
But the evidence of the (possible) crime is now gone. Was this account run from a bot farm in the Philippines? Or an influence operation in China or Iran? The copy-pasta allegations automatically put the onus on TikTok to be more forthcoming.
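One practical countermeasure to this kind of link rot is to check whether the Internet Archive captured the evidence before it vanished. As a minimal sketch (my own illustration, not part of the original investigation), the Wayback Machine’s availability API can be queried for the account and video links cited in this post:

```python
# Sketch: check whether the Internet Archive holds snapshots of evidence URLs
# before they 404. The Wayback Machine "availability" endpoint is a real API;
# the TikTok URLs below are the ones cited in this post.
import json
import urllib.parse

WAYBACK_API = "https://archive.org/wayback/available?url="

def wayback_query(url: str) -> str:
    """Build an Internet Archive availability query URL for a given page."""
    return WAYBACK_API + urllib.parse.quote(url, safe="")

def parse_availability(payload: str):
    """Return the closest archived snapshot URL from the API's JSON reply, or None."""
    data = json.loads(payload)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None

if __name__ == "__main__":
    evidence = [
        "https://www.tiktok.com/@stevet71a",
        "https://www.tiktok.com/@stevet71a/video/7155877856596282666",
    ]
    # Fetching these query URLs (e.g. with urllib.request) returns JSON that
    # parse_availability() can read; here we just print the queries to run.
    for url in evidence:
        print(wayback_query(url))
```

If no snapshot exists, the same lesson applies going forward: save screenshots and submit suspicious URLs to an archiving service the moment you spot them, because platform takedowns erase attribution evidence without notice.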
This was all too much. After this incident, which included unique evidence attributing election misinformation back to one particular deleted TikTok account and video, I knew I needed to email TikTok’s security division.
Here’s the letter:
Dear TikTok Security-
Hello! Thank you for taking the time to read this message. This is a message for TikTok’s security team, and for the division in charge of election integrity. If you could pass this message on to Eric Han in particular, I would greatly appreciate it.
While this is kind of an open letter (I am really not expecting a response) I do think it is wise to write this down. The Reuters Fact Check team will be carbon copied on this message as well, since it is related to their recent reporting.
The TL;DR: A recent “Commitment to America” infographic that featured disinformation about the Republican Party agenda for the upcoming midterm elections originated from a video posted by a TikTok user named SteveT396. TikTok deleted this user and removed the video without explanation. Was this account part of an organized effort by adversaries of the United States? Will TikTok share this information with Congress and the media?
What happens if a still image from a TikTok video about the midterm elections manages to make an impact and go viral, yet the evidence of it is erased from existence? This isn’t a hypothetical scenario; it just happened in recent days.
This message today isn’t about something that is still on the TikTok platform — it’s about something that has been removed.
A few days ago, a friend of mine in Seattle shared a screenshot of a tweet on Facebook. The screenshot was of a tweet purportedly from a Twitter user named Michael Capanzzi, who shared a graphic called “Commitment to America” containing a number of easily disproved falsehoods about a policy agenda the Republican Party has supposedly articulated for the 2022 midterm elections.
As soon as I saw this graphic I strongly suspected it to be part of an organized disinformation effort. The Reuters news agency published a Reuters Fact Check about this fake graphic image after it went viral. You can read the Reuters Fact Check article here:
I noticed a few troubling aspects about this graphic. But, most importantly, I believe that this graphic originated on TikTok in a video posted by a TikTok user named SteveT396 on or around October 18th of 2022.
The dubious infographic that went viral across social media first appeared in a TikTok video posted by TikTok user “SteveT396.”
This is a link to the video which has been removed: https://www.tiktok.com/@stevet71a/video/7155877856596282666?_t=8Wc9bmpKmo2&_r=1
Here’s a link to that account page which has also now been removed: https://www.tiktok.com/@stevet71a
This account had long been called out for copy-pasta behavior online:
There was also a burst of activity promoting these videos on Twitter. A Twitter search for the term “Check Out Steve T396's” will show numerous Twitter accounts sharing videos from this account this week:
I was just curious to know if TikTok:
Was aware that a major source of election-related disinformation that spread across Twitter and Facebook in recent days originated in a TikTok video (https://www.tiktok.com/@stevet71a/video/7155877856596282666?_t=8Wc9bmpKmo2&_r=1) from a user named “Steve T396.”
Would be receptive to sharing more information about the video that was removed with journalists, researchers, and Congress.
Would be receptive to sharing more information about the account that was deleted with journalists, researchers, and Congress.
Unlike other social media platforms, when TikTok removes or suspends an account, there doesn’t appear to be any kind of in-app notification. A simple “Couldn’t find this account” message appears for accounts that have been deep-sixed, while videos that were removed come with no explanation as to why.
In this instance, it appears clear that a graphic image that originated in a TikTok video posted by Steve T396 became a major source of election disinformation across multiple social media platforms. How can the American people know that there wasn’t malign intent by a foreign power to use TikTok as a means of inserting this graphic into public discourse?
It’s for this reason that I believe TikTok should preserve all evidence related to the Steve T396 account (https://www.tiktok.com/@stevet71a) and present it to researchers and journalists. This is an important event in the history of TikTok, and will likely be a source of future questions for TikTok executives who appear before Congress.
Thank you for taking the time to read this message. I knew I needed to jot this down quickly. I am continuing to research this, and I would be happy to answer any questions about this issue.
I’ll continue to work on this one — mostly because of how unique the set of circumstances surrounding this incident is.