The Oct. 12 Facebook post from a Fairfield man said the photo showed chicken defrosting in the sun out behind a local restaurant. “I’ll never eat this food again and I definitely would not recommend it,” the post ended.

Except that the owner said it was cabbage, not chicken, drying out for use by the staff, not for serving the public. The state, alerted by the viral post, agreed — adding that the restaurant’s owner had never failed an inspection. Also, the photo was taken more than a year ago.

The Morning Sentinel told the full story on Oct. 17, but not before the post had taken on a life of its own — by Tuesday morning, it had more than 1,500 shares and had elicited more than 500 comments.

It’s just one small way in which systems that allow anyone to write just about anything, for almost any reason, and spread it widely favor rumor and sensationalism over facts. And while a larger debate is underway over what to do with the tech giants that control who sees what, there is an immediate threat to the 2020 election, at every level, that warrants attention.

In the Fairfield case, it was just a single person with seemingly good intentions who spread the misleading post.

More often, however, such posts are coordinated by powerful actors. They are intentionally misleading and targeted to do the most damage — to turn people against each other, destabilize institutions, and disorient the electorate.


In the best-known case, as confirmed by the recent Senate Intelligence Committee report, as well as by the U.S. intelligence community and the Mueller report, Russia used Facebook, Instagram and Twitter during the 2016 election to spread misinformation and sow discontent.

Using false reports, trolls and conspiracy theories, all dressed up to appear authentic, the Russians aimed at fissure points in American society, most often race, at times whipping people up against their political adversaries and at other times working to drive down turnout.

Those efforts continue to this day, and they are not the only ones. Facebook announced earlier this week that it had found and taken down four state-sponsored disinformation operations, one from Russia and three from Iran, the latest of dozens of such operations to be removed.

And overall, from last October through March, Facebook removed 3 billion fake accounts, twice as many as in the previous six months.

Fake Facebook pages are trying to influence local elections, too, as seen in recent Lewiston and Waterville municipal races, where trolls posing as the opposition publish over-the-top posts meant to drive away undecided voters and spur their own side into action.

Political operatives, taking advantage of an increasingly partisan electorate and the diminishing newspaper industry, also have created partisan websites designed to look like unbiased news. The executive director of the Maine Republican Party used one such site, created anonymously, to spread bogus claims about a candidate in the 2017 Lewiston mayoral race.


These sites, run by Democrats and Republicans alike, don’t disclose their funding or, often, any affiliations they may have. Particularly once their posts are shared on social media, they blend into the media landscape and look legitimate to most people.

Whether by political operatives or foreign governments, these efforts are meant to pull at people’s insecurities and fears and play on their biases. The misinformation is fed through social media, specifically targeting people it is most likely to influence.

Pretty soon, everybody’s playing with different “facts.” What happens next is much worse than leaving some chicken out in the sun.
