Answers to Most Frequently Asked Questions

Verity7 is a consulting company focused on promoting truth in media.
We provide anti-disinformation training for a wide range of organizations.

The difference lies in intent. Misinformation may be simply mistaken or inadvertently incorrect. Disinformation is false information created on purpose to manipulate and confuse the reader.
Section 230 of the 1996 Communications Decency Act is often credited with having created the social media business model. It says:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” (47 U.S.C. § 230(c)(1)).
According to the Electronic Frontier Foundation, “Section 230 embodies [the] principle that we should all be responsible for our own actions and statements online, but generally not those of others. The law prevents most civil suits against users or services that are based on what others say.”

Absent any detailed examination, this argument makes sense. But it is now outdated and no longer helpful to the body politic.

What’s changed is that platforms like X and Meta are now publishers in every meaningful way and ought to be held responsible for what they amplify and monetize. The claim that they are merely bystanders providing a neutral forum is a clever ruse to avoid the liabilities all publishers are subject to. The result is a firehose of hate speech in your world, generating a firehose of cash in their bank accounts.

Gonzalez v. Google is important in the fight against disinformation and hate speech because it illustrates the power of Section 230 and demonstrates why social media as we know it today would have to change radically if Section 230 were repealed or substantially amended.

Section 230 permits platforms like YouTube to allow users to post violent videos designed to radicalize certain viewers and incite harm. Under Section 230, YouTube can promote and monetize these videos yet bears no liability for any harm that results from their publication and promotion.

In November 2015, ISIS terrorists attacked multiple sites in Paris, murdering 130 innocent people in cold blood. One of the victims was Nohemi Gonzalez. Her father, Reynaldo Gonzalez, sued Google (YouTube’s parent company), claiming that YouTube was a proximate cause of his daughter’s wrongful death because it helped radicalize the attackers.

The US Supreme Court effectively ruled in Google’s favor in 2023. It declined to address Section 230 directly, holding instead that the underlying claims failed under the reasoning of the companion case Twitter v. Taamneh, and thus left Section 230’s protections intact.

No, they were not. Fox News is a traditional media company, not a digital platform, and is therefore liable as a publisher for the lies it openly promotes. Yet the exact same lies were promoted and profited from by social media platforms without generating any liability, because those platforms are protected under Section 230.
According to reporting in Daily Kos, Russia interfered in the 2016 election, and part of its effort involved disinformation. Today, troll farms run by overseas operators in Russia, China, and elsewhere assault US social media users daily with targeted disinformation designed to influence US elections. The biggest lie in the electoral disinformation toolkit is that the 2020 election was stolen by Biden and that Trump won “bigly.” Of course this has been proven false over and over, but disinformation has a habit of remaining in discourse long after it has been proven incorrect.
Often a content provider whose content has been deprioritized by a social media platform will mistakenly claim to have been censored. This is a self-serving misuse of the term. Censorship is when the government (and ONLY the government) makes a law that literally removes your right to speak about a subject. Under censorship laws you cannot discuss censored subjects with your family, your friends, your co-workers, or the general public, and of course you cannot print handbills or send emails about the censored speech. It is profoundly unlike mere lack of amplification.

Lack of amplification is not censorship. Lack of amplification simply means that you, a free citizen, have not had your content published (for free) to the entire world by a social media platform. It is the direct equivalent of your book not getting published, or your letter to the editor not being printed. This is not censorship; it simply means you have not been granted access to global amplification engines. You can still stand on the corner with a megaphone and tell everyone about your content. In so doing, you would prove that you are not, in fact, censored.

Astroturfing is a term for making it seem like there is “grass roots” (i.e., widespread, unsolicited) support for a topic or cause when in fact there is no such support. Usually it is achieved by creating numerous fake “bot” accounts on social media platforms and by identifying users likely to forward material without questioning it. The technique is often used to manufacture false “movements” for or against one political cause or another.

AstroTurf itself is a brand of artificial grass, created in the 1960s for the playing field of the Houston Astrodome, a domed baseball stadium. It looks like grass, but it’s made of plastic.
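
To make the mechanics concrete, here is a minimal, illustrative sketch in Python of one telltale footprint astroturfing leaves behind: the same message posted by many distinct accounts within seconds of one another. Everything in it (account names, the message, the thresholds) is hypothetical, not real data, and real detection systems are far more sophisticated.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sample posts: (account, text, ISO timestamp).
# All account names, messages, and thresholds are illustrative.
posts = [
    ("user_a", "Senator X stands for real change!", "2024-03-01T09:00:00"),
    ("user_b", "Senator X stands for real change!", "2024-03-01T09:00:05"),
    ("user_c", "Senator X stands for real change!", "2024-03-01T09:00:07"),
    ("user_d", "Looking forward to the weekend.", "2024-03-01T09:01:00"),
]

WINDOW_SECONDS = 60  # identical posts inside this window look coordinated
MIN_ACCOUNTS = 3     # minimum distinct accounts repeating the same message

def flag_coordinated(posts):
    """Flag messages that many distinct accounts post nearly simultaneously,
    one crude signal of bot-driven astroturfing."""
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text].append((account, datetime.fromisoformat(ts)))

    flagged = []
    for text, entries in by_text.items():
        accounts = {account for account, _ in entries}
        times = sorted(t for _, t in entries)
        within_burst = (times[-1] - times[0]).total_seconds() <= WINDOW_SECONDS
        if len(accounts) >= MIN_ACCOUNTS and within_burst:
            flagged.append((text, sorted(accounts)))
    return flagged

for text, accounts in flag_coordinated(posts):
    print(f"Possible astroturf message {text!r} from accounts: {accounts}")
```

Real campaigns vary their wording and timing to evade exactly this kind of check, so production systems draw on richer signals (account age, follower overlap, posting cadence), but the underlying idea of spotting coordination is the same.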

The cost of disinformation to organizations can be significant and include:

1. Reputation Damage: Loss of trust and credibility among stakeholders, customers, and the public.
2. Financial Losses: Due to reduced sales, declining stock prices, or increased advertising and PR efforts.
3. Legal Expenses: Defending against false claims, lawsuits, or regulatory fines resulting from misinformation.
4. Cybersecurity Costs: Protecting against cyberattacks and breaches fueled by disinformation campaigns.
5. Employee Productivity: Time spent addressing disinformation internally instead of focusing on core tasks.
6. Crisis Management: Resources spent managing the fallout and mitigating damage caused by disinformation.
7. Increased Advertising Costs: Need for additional ad spend to counteract negative narratives or misinformation.
8. Lost Opportunities: Missed partnerships, collaborations, or business ventures due to damaged reputation.
9. Policy and Compliance Costs: Adapting to new regulations or industry standards aimed at curbing disinformation.
10. Customer Acquisition Costs: Increased efforts needed to win back trust or acquire new customers.
11. Monitoring and Analysis Tools: Investment in tools and services to track and combat disinformation.
12. Staff Training and Education: Investment in anti-disinformation training programs for employees.
13. Loss of Market Share: Competitors may gain market share if the organization’s reputation is tarnished.
14. Long-term Repercussions: Effects on brand perception and trust, which may persist even after the incident.
15. Stakeholder Relationships: Damage to relationships with investors, partners, and stakeholders due to mistrust.