TikTok and a Phony ‘Commitment to America’ Infographic: 2022 Midterm Disinformation in Real Time

TL;DR: What happens when a piece of midterm election disinformation that goes viral is traced back to a video from a TikTok account that was quietly deleted? Will anyone take note, or will it slip by unnoticed?

Screenshot of a TikTok video from a since-deleted TikTok user that went by the handle Steve T396. This TikTok video produced an inaccurate image (text in an info-graphic) that was widely successful on other social media platforms. This is a rare example of one specific piece of TikTok content that managed to seed 2022 midterm election misinformation onto a multitude of other platforms. It will hopefully serve as a textbook example of how TikTok can be a vector for text-based images and misinformation.

Greetings everyone! In the past few weeks I managed to fight back against my post-dog-bite fears by receiving the required six-shot regimen of tetanus and rabies vaccinations. For the next six months I will be so chock-full of rabies antibodies that I could be the one going around and biting dogs if I wanted to. And truth be told, the more upscale hipster Russian mobilization dodgers I see toting Corgis, the more I just might.

Should have left little Sashi the Corgi back in Moscow, Sergei. You’ll be running with the big dogs here in the Caucasus region.

Dog bites? So what.

Multiple COVID infections? Whatever.

It’s weird to have gone through the past few years and find that my one main post-COVID symptom was that my feet looked like freckled trout for the first time. I’ll tell you something: for a guy like me who was always convinced he was going to die young, there’s no way I would allow the surprise of reaching middle age to be hindered by the fear of a mystery ailment that may or may not have originated in a Chinese fish market.

My significant other marveled at my lack of apprehension about vaccines. It’s easy to understand, I explained. Pancho Villa once said it’s better to die on your feet than to live on your knees. For me, at this point in my life, I’d rather die from the supposed myocardial side effects of an American vaccine than live with the aftereffects of full-blown COVID-19 sans vaccine. I even got some snarky direct messages from a twenty-something anti-vax tech bro who objected to me calling him out for labeling his irregular heartbeat as a heart attack. Cream soda gives you an irregular heartbeat, son.

But I digress; as with all those other times, I am back to talking about TikTok.

No, this time I won’t be writing about state-sponsored anti-Pfizer messaging on TikTok — although I have a really interesting story about that. And, unlike those other errant ramblings about the identities of white supremacist extremists on TikTok or pro-Kremlin advertising gimmicks by Moscow social media marketing agencies, this time I have something about a piece of content that originated on TikTok and managed to successfully go viral on Twitter and Facebook.

I am still working on this one, and of course, I am a lazy and undisciplined thinker. But, I wanted to get some of this down before I forget. Below you’ll find the obligatory letter to TikTok’s security division.

Back on October 21st, I was scrolling through Facebook when I noticed a university-era friend of mine shared an unusual screenshot. The screenshot was a bizarre info-graphic that looked like this:

An inaccurate and misleading info-graphic that was widely shared on social media platforms in mid-October 2022. This info-graphic became the subject of a fact-checking campaign.

After spotting a screenshot of this image on Facebook, I saw that it tracked back to a Twitter user. From there, attribution was easy.

I tweeted about this bogus image on October 21st. Please note: the “Stay Informed” warning message was not on this image/tweet at that time.

A person could literally tweet all day about bogus (and possibly state-sponsored) agitprop online. But, even in a sea of disinformation, this particularly odious info-graphic stood out.

I did all the usual tricks to determine the attribution and/or find a source for the image. Since the language was so particular, I started there:

A Google search for the “Make social security solvent” sentence gave more clues.
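For anyone who wants to retrace that step, the trick is nothing more than an exact-phrase query: wrap the distinctive wording in quotes so the search engine only returns pages containing that literal string. Here is a minimal sketch in Python (standard library only; the phrase is the one from the graphic, and the plain Google search URL is just one way to run it):

```python
# Open an exact-phrase Google search for the distinctive wording from the info-graphic.
# Quoting the phrase tells the search engine to match the literal string only.
import webbrowser
from urllib.parse import quote_plus

phrase = '"Make social security solvent"'  # wording lifted from the graphic
webbrowser.open(f"https://www.google.com/search?q={quote_plus(phrase)}")
```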

A search led to a meme hosting website. Crucially, the image they displayed gave attribution to a TikTok account:

This tweet explains the path to the meme website that helped boost a bogus info-graphic.

Of course, this link now leads to a 404, meaning that evidence of what happened (i.e., how to attribute it) is being wiped without providing the public with the reasons why.

In the words of Doc Brown: “Erased — from existence.” One of the sources of a major piece of midterm election disinformation is now just a 404 error message.
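This is why archiving as you go matters. Here is a minimal sketch of pushing a link into the Wayback Machine before it vanishes, using the Internet Archive’s public Save Page Now endpoint; the target URL below is a placeholder, not the actual meme link:

```python
# Ask the Wayback Machine to capture a page before it can be quietly deleted.
# GET https://web.archive.org/save/<url> triggers the Internet Archive's
# "Save Page Now" crawler; the response usually redirects to the new snapshot.
import requests

target = "https://example.com/meme-page"  # placeholder; swap in the link to preserve
resp = requests.get(f"https://web.archive.org/save/{target}", timeout=120)
print(resp.status_code, resp.url)
```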

Although it is now a 404 error page, for a while this meme hosting link provided proof of a username. I spotted either “stevet7la” or “stevet71a.”

Evidence pointing towards a TikTok account that was a source for midterm disinformation.

Here’s a conservative TikTok user calling out a Stevet71a account for engaging in copy-pasta — AKA inauthentic behavior.

After doing a little more digging, more evidence emerges about @stevet71a. This is a TikTok account that managed to successfully cause a piece of blatant midterm election misinformation to spread from its origins on TikTok to Twitter and Facebook.
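For context, “copy-pasta” flagging usually comes down to spotting identical wording posted by many different accounts. A hypothetical sketch of that check in Python (the sample posts below are invented for illustration, not real data from this incident):

```python
# Group posts by a fingerprint of their normalized text; identical wording
# showing up under multiple accounts is the classic copy-pasta signal.
from collections import defaultdict
import hashlib
import re

posts = [  # invented sample data for illustration
    {"account": "account_a", "text": "Check out this Commitment to America graphic!"},
    {"account": "account_b", "text": "check out this   commitment to america graphic"},
    {"account": "account_c", "text": "Totally unrelated post about Corgis."},
]

def fingerprint(text: str) -> str:
    normalized = re.sub(r"\s+", " ", text.lower()).strip()  # collapse case and whitespace
    return hashlib.sha256(normalized.encode()).hexdigest()

groups = defaultdict(set)
for post in posts:
    groups[fingerprint(post["text"])].add(post["account"])

for accounts in groups.values():
    if len(accounts) > 1:
        print(f"Possible copy-pasta: {sorted(accounts)} posted identical text")
```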

The time and date stamps indicate that the image originated first on TikTok. A corresponding search of Twitter shows that the sharing of the content was organized on other social media platforms.
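You can check the TikTok side of that timeline from the video ID alone, even after the video is gone. TikTok video IDs appear to follow the common “snowflake” convention of packing a Unix timestamp into the upper 32 bits, so here is a minimal sketch in Python using the ID from the since-deleted link quoted later in this post (the decoding assumes that convention holds):

```python
# Recover the posting time from a TikTok video ID by reading its upper 32 bits
# as a Unix timestamp (an assumption based on the common snowflake-ID layout).
from datetime import datetime, timezone

video_id = 7155877856596282666  # from https://www.tiktok.com/@stevet71a/video/7155877856596282666
posted_at = datetime.fromtimestamp(video_id >> 32, tz=timezone.utc)
print(posted_at)  # comes out to October 18, 2022 (UTC), if the convention holds
```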

A quick check of Twitter reveals that links to the now deleted video are still up:

A Tweet that is still up on Twitter that links to a since-deleted TikTok video from a TikTok user called Steve T396. Archived.

But the evidence of the (possible) crime is now gone. Was this account from a bot farm in the Philippines? Or an influence operation in China or Iran? The copy-pasta allegations automatically put the onus on TikTok to be more forthcoming.

Reuters Fact Check noticed the misleading info-graphic as well.

This was all too much. After this incident, which included unique evidence attributing election misinformation back to one particular deleted TikTok account and video, I knew I needed to email TikTok’s security division.

Here’s the letter:

Dear TikTok Security-

Hello! Thank you for taking the time to read this message. This is a message for TikTok’s security team, and to the division in charge of election integrity. If you could pass this message onto Eric Han in particular, I would greatly appreciate it.

While this is kind of an open letter (I am really not expecting a response) I do think it is wise to write this down. The Reuters Fact Check team will be carbon copied on this message as well, since it is related to their recent reporting.

The TL;DR: A recent “Commitment to America” info-graphic that featured disinformation about the Republican Party agenda for the upcoming midterm elections originated from a video posted by a TikTok user named SteveT396. TikTok deleted this user and removed the video without explanation. Was this account part of an organized effort by adversaries of the United States? Will TikTok share this information with Congress and the media?

What happens if a still image from a TikTok video about the midterm elections manages to make an impact and go viral — yet the evidence of it is erased from existence? This isn’t a hypothetical scenario; it just happened in recent days.

This message today isn’t about something that is still on the TikTok platform — it’s about something that has been removed.

A few days ago, a friend of mine in Seattle shared a screenshot of a Tweet on Facebook. This screenshot was of a tweet purportedly from a Twitter user named Michael Capanzzi. This Capanzzi Twitter user shared an image of a graphic called “Commitment to America” which contained a number of easily disproved falsehoods about a supposed policy agenda that the Republican Party has articulated for the 2022 midterm elections.

As soon as I saw this graphic I strongly suspected it to be part of an organized disinformation effort. The Reuters news agency published a Reuters Fact Check about this fake graphic image that went viral. You can read the Reuters Fact Check article here:

https://www.reuters.com/article/factcheck-republicans-graphic/fact-check-fabricated-commitment-to-america-graphic-circulates-inaccurate-republican-agenda-ahead-of-midterms-idUSL1N31P0ON

I noticed a few troubling aspects about this graphic. But, most importantly, I believe that this graphic originated on TikTok in a video posted by a TikTok user named SteveT396 on or around October 18th of 2022.

The dubious infographic that went viral across social media first appeared in a TikTok video posted by TikTok user “SteveT396.”

This is a link to the video which has been removed: https://www.tiktok.com/@stevet71a/video/7155877856596282666?_t=8Wc9bmpKmo2&_r=1

Here’s a link to that account page which has also now been removed: https://www.tiktok.com/@stevet71a

This account had long been called out for copy-pasta behavior online:

https://www.tiktok.com/@shellytheenforcer/video/7141184285071166766?_r=1&_t=8Wh7UkUPc0H&is_from_webapp=v1&item_id=7141184285071166766

There was also a burst of activity promoting these videos on Twitter. A Twitter search for the term “Check Out Steve T396's” will show numerous Twitter accounts sharing videos from this account this week:

https://twitter.com/search?q=Check%20Out%20Steve%20%20T396%27s&src=typed_query&f=live

I was just curious to know if TikTok:

Was aware that a major source of election-related disinformation that spread across Twitter and Facebook in recent days originated in a TikTok video (https://www.tiktok.com/@stevet71a/video/7155877856596282666?_t=8Wc9bmpKmo2&_r=1) from a user named “Steve T396.”

Would be receptive to sharing more information about the video that was removed with journalists, researchers, and Congress.

Would be receptive to sharing more information about the account that was deleted with journalists, researchers, and Congress.

Unlike other social media platforms, TikTok doesn’t appear to give any kind of in-app notification when it removes or suspends an account. A simple “Couldn’t find this account” message appears for accounts that have been deep-sixed, while videos that were removed carry no explanation as to why.

In this instance, it appears clear that a graphic image that originated in a TikTok video posted by Steve T396 became a major source of election disinformation across multiple social media platforms. How can the American people know that there wasn’t any malign intent by a foreign power to use TikTok as a means of inserting this graphic into public discourse?

It’s for this reason that I believe that TikTok should preserve all evidence related to the Steve T396 account (https://www.tiktok.com/@stevet71a) and present it to researchers and journalists. This is an important event in the history of TikTok, and will likely be a source of future questions for TikTok executives who appear before Congress.

Thank you for taking the time to read this message. I knew I needed to jot this down quickly. I am continuing to research this, and I would be happy to answer any questions about this issue.

-Robby Delaware

I’ll continue to work on this one — mostly because of how unique the circumstances surrounding this incident are.
