When the Capitol riot happened, mainstream social media sites — like Twitter and Facebook — took action.
Then-President Donald Trump was banned from Twitter permanently and suspended from Facebook for at least two years. The social media company, now known as Meta, said it would reassess the risk of violence near the end of Trump's suspension in January 2023.
While getting Trump off the sites was an immediate reaction from social platforms, lawmakers have spent the last year looking into the role that these social media sites might have played in the actual planning of the Capitol insurrection.
In 2021, we saw big tech CEOs in the hot seat on Capitol Hill as U.S. lawmakers grilled them about the spread of misinformation and how their sites enabled extremists.
Lawmakers have to consider the "broader ecosystem" and "not just technology platforms we use," former Twitter CEO Jack Dorsey said.
"I believe that the former president should be responsible for his words and that the people who broke the law should be responsible for their actions," Meta CEO Mark Zuckerberg said.
Zuckerberg caught a lot of heat for that comment, which critics saw as Facebook downplaying its role in the events of Jan. 6. And to put some numbers to all this, an investigation from ProPublica and Newsy's partners at The Washington Post, published earlier this week, found that between Election Day and the Capitol riot there were at least 650,000 posts in Facebook groups attacking the legitimacy of President Joe Biden's win, with many calling for political violence.
So, have these social media sites made major changes to avoid becoming tools for political violence? Yes and no.
The companies have put in some new measures.
Meta made sure its oversight board was up and running to make decisions with real impact. The group is supposed to be independent of Meta and work as a "Supreme Court" of sorts. It issued its first five decisions in January of 2021.
The company also introduced new enforcement protocols for content posted by public figures during times of civil unrest and violence in June of last year.
But Meta's critics say there's still a lot of work to be done, and the company has faced even more heat since a whistleblower came forward last year saying it did NOT do enough to combat misinformation and political violence planned on its platforms, including content related to the 2020 presidential election and the Jan. 6 insurrection. The whistleblower's complaint claims the company allowed that content to spread in order to "promote virality and growth on its platforms."
"These problems are solvable," Facebook whistleblower Frances Haugen said. "A safer, free-speech-respecting, more enjoyable social media is possible. But there is one thing I hope you take away from these disclosures: Facebook can change, but is clearly not going to do so on its own."
This year, Meta says it will give updates when it leaves up content that violates its rules because of newsworthiness, and it will no longer presume that speech from politicians is inherently of public interest.
Then there's Twitter. While the site is nowhere near as large as Facebook, Trump did have a massive following and tweeted frequently during his presidency. Twitter ended that just a few days after the Capitol riot.
Immediately after the events of Jan. 6 last year, the company stopped allowing people to reply to, like or retweet posts that violated its updated civic integrity policy. Then, the company permanently suspended thousands of accounts that mainly shared QAnon content.
Twitter also launched a pilot program called Birdwatch in which users can identify tweets they think are misleading and need more context. Finally, the site teamed up with The Associated Press and Reuters to elevate credible information on the platform.
It's worth noting that co-founder and CEO Jack Dorsey stepped down back in November and was replaced by Twitter's chief technology officer, Parag Agrawal.
To mark one year since the insurrection, the social networking site convened a team to monitor any content on its platform associated with Jan. 6 that could lead to more political violence.
However, even with these safeguards, restrictions and community guidelines in place on the mainstream social networks, there are still plenty of other services that don't mind becoming a home for misinformation or extremism.