By Aaron McDuffie
Merriam-Webster defines disinformation as “false information deliberately and often covertly spread (as by the planting of rumors) in order to influence public opinion or obscure the truth.” A Congressional Research Service report characterizes disinformation as a subset of information warfare, noting: “Unlike misinformation, disinformation is intentionally false. Examples include planting false news stories in the media and tampering with private and/or classified communications before their widespread release.” NATO defines disinformation as “the deliberate creation and dissemination of false and/or manipulated information with the intent to deceive and/or mislead. Disinformation seeks to deepen divisions within and between Allied nations, and to undermine people’s confidence in elected governments.” Taken together, these definitions frame disinformation as a form of information warfare: the dissemination of false or manipulated information with the intent to deceive or influence populations.
Disinformation creates or deepens divides across the U.S. population by flooding the marketplace of ideas with conflicting information and disinformation that overwhelms consumers’ rationality and pushes them to fall back on implicit biases. This is generally done through four approaches: 1) fear-based messaging to draw in audiences; 2) social messaging to highlight and widen divisions on hot-button issues; 3) political messaging for and against candidates; and 4) economic messaging that turns populations against technological or industrial developments. The result can be violence against “others,” distrust in institutions, and a general reduction in the quality of information in the marketplace of ideas. While history is littered with examples of disinformation, more recent examples include the Russian and Iranian amplification of a 2016 campaign to have California secede from the United States, narratives that COVID-19 is a U.S. bioweapon being used on the world, and, most notably, former President Trump’s lie that the 2020 U.S. Presidential Election was “rigged” and “stolen” from him.
Legal Remedies to Disinformation
Any effort to curb the dissemination of disinformation would inevitably conflict with the First Amendment’s protection of free speech. Of the limited exceptions to that protection, the incitement exception is the most applicable. The leading case on the incitement exception is Brandenburg v. Ohio, which replaced the “clear and present danger” test of Schenck v. United States with the “imminent lawless action” test. Brandenburg holds that “the constitutional guarantees of free speech and free press do not permit a State to forbid or proscribe advocacy of the use of force or of law violation except where such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action.” In other words, unless speech is 1) directed to inciting or producing imminent lawless action and 2) likely to incite or produce such action, the speech is protected and cannot be prohibited by legislation.
Unfortunately, this broad protection of speech means legislation and policy cannot directly address the problem of disinformation and information warfare; the law must instead address it indirectly. History’s response to prior eras of fake news and disinformation was the development of journalistic ethics and objective journalism, but that development was the result of business decisions rather than legal ones. Moreover, the digital revolution’s impact on newspapers, media outlets, and local investigative journalism appears to have hindered objective journalism for the time being. At bottom, though, disinformation is an individual problem: it works by presenting information in an inflammatory way that hijacks and overwhelms the rational, deliberative part of people’s minds.
Because there is little relevant law directly on point, the judiciary is an inappropriate vehicle of redress. Instead, legislatures must create statutory schemes that attempt to draw borders for the marketplace of ideas and, as with the free market, correct imbalances and monopolies. Direct intervention may be improper and run afoul of other constitutional limitations, but Congress can fund studies on matters relating to disinformation, such as sustainable business models for objective journalism, or research into cognition and how to psychologically equip individuals to respond rationally to information.
Summing Up
The issue of disinformation is complex, and it is made up of smaller but still significantly complex sub-issues. The law is largely prohibited from directly and reactively addressing disinformation, but it is not prohibited from indirectly and proactively addressing its foundations. The marketplace of ideas may be flooded with sensationalism and conspiracy theories, but that does not prevent Congress from funding the development—not the production—of higher-quality ideas in that marketplace. If we desire better public discourse, government can play a role by fostering the development of standards for information exchanged in our public square.