60 Minutes recently interviewed Peter McIndoe, the founder of the Birds Aren’t Real movement. The movement’s theory posits that the government has replaced birds with robotic drones whose sole purpose is to spy on us.
McIndoe came up with the idea while witnessing a parade in support of some conspiracy theory or other, and decided to test the crowd’s tolerance for misinformation with his absurd new claim.
The movement now boasts a million followers.
However, most of them are in on the joke: the movement is a parody of conspiracy theories and the people who believe them.
The 60 Minutes segment was certainly tongue-in-cheek, but it did bring attention to the dangers of misinformation.
Unfortunately, misinformation has gained traction with a larger swath of society than ever before, facilitated by allegedly reputable media outlets that should know better but play along anyway. Maybe for ratings and advertising dollars?
Anyway, Birds Aren’t Real attempts to combat this by — as they put it — holding up a mirror to America in the Internet age.
But my question is:
How absurd is too absurd?
At what point do, let’s just say, the “most suggestible” among us decide that some outlandish bit of propaganda is a bridge too far?
See if this is believable:
There aren’t enough COVID vaccine doses. There never were.
In an effort to avoid panic, the government lied about having all of the doses the country would need. Knowing this, what could they do if and when it became apparent that demand exceeded supply?
One thing they could try is to discourage demand. But how could they do that, when they already claimed that everyone who wanted a dose could get a dose?
Maybe they could circulate any discouraging messages by proxy: use others, not attached to the government, to spread a message that would suppress vaccine demand. If that message could be picked up by a critical mass of the population, it would go viral. Those most susceptible to misinformation would voluntarily opt out of getting the life-saving vaccine, preserving the supply of medicine for others.
It was a matter of planting negative information about the vaccine.
So who would be most likely to believe, and then disseminate, misinformation about a government program? The answer is lots and lots of people, if you put it on their favorite social-media outlets.
And thus Operation: Bad Vaccine was covertly launched. By this time, vaccines were being distributed around the country by appointment. Supply was dwindling as people, fearful of the pandemic and tired of restrictions, clamored for the protection.
South Dakota had ramped up its vaccination rate impressively. Its distribution program was a model for the rest of the country. COVID cases and hospitalizations declined rapidly.
But what’s in the vaccine? You don’t know. Why would you put that in your body?
I heard that the government put a chip in the shot, to track you.
I saw a woman on Facebook who’s now magnetic after getting the shot. Metal is sticking to her arm. Can’t be good.
South Dakota’s progress stagnated. Instead of being a model for the other states, it became a laggard in getting people vaxxed. The positive trend in cases and hospitalizations reversed.
How? Operation: Bad Vaccine had worked brilliantly.
In the old days, planting a rumor would only go so far. Then the Internet greatly broadened the reach of dubious information. And today, thankfully for this government program, that kind of message gets picked up by major media outlets with little-to-no vetting. They unwittingly did the government’s bidding in suppressing demand for the vital medicine. Limited supplies could now meet the reduced demand.
The covert program succeeded, but at a deadly cost.
Believable? Human nature suggests that whether it is or not depends on what one wants to believe. We all have to be on guard against that, with a healthy skepticism.
For a limited time, new visitors to the grand opening of my new website will get a free download of Ghosts of Forgotten Empires. It’s available in any of the formats below at https://michaeljfoy.com/.