Living in a Post-Truth World

I thought about writing this three years ago; it’s a bug that’s been stuck in my mind for years prior to the current ‘AI boom’. But here we are, at a point where the issue is too hard to ignore, too obvious, too damaging. We are in an era where truth is no longer defined by evidence and reason. The current technology landscape and generative machine learning have created an environment wherein the validity of anything we see, read, or hear is in doubt. Truth seems increasingly elusive, and when found doesn’t carry the weight it once did.

When ChatGPT arrived, it rapidly gained a following for what was, at the time, a novel and intensely capable generative chat system. Even then, I had a feeling that this was going to be a problem. Specifically that the ability to generate, with a simple prompt, endless streams of profound gibberish would rapidly result in any original, interesting, useful content being buried under layers of AI garbage. That is now just part of our new reality.

We’ve moved quickly from a philosophical question into a state of existence where these issues are current, pervasive, and have no practical or safe solutions. We are in a crisis, driven by the proliferation of techno-weapons placed into the hands of a general, and generally bored populace.

Truth used to be anchored in the tangible. We had faith that photographs and recordings accurately portrayed reality. While there were always exceptions, the realm of crazies and conspiracy theorists, they were easy to spot. Now, the guardrails are gone. The ability to manipulate reality is absolute, and we are falling for it. Hard.

The Weaponization of ‘Proof’: A Fake Worth a Thousand Lies

A lie will go around the world while the truth is pulling its boots on. If this was the case in the days of horses, couriers, and print, it is exponentially so today. People have always believed what they’ve wanted to believe. And the truth rarely makes the most interesting gossip. The truth has always been something that needs to be sought, and then nurtured and protected once found.

The last hundred years or so have seen a dramatic shift in how we are able to present the truth. Photographs, videos, and audio recordings were able to accelerate the truth into the forefront. It was much easier to put evidence to claims. Manipulation of the source data was difficult or impossible. So lies would increasingly come in the form of ‘spin’ in the media: taking evidence, cherry-picking the bits that are convenient, and framing them in a way that supports your side. People and organizations got very good at putting spin on the available evidence, but the truth still remained as long as the original material was available. The ability to spin the narrative on a massive scale was, perhaps, the advent of the death of truth.

There are no longer barriers to producing false evidence. There is no longer a need to spin the narrative of available media. Now the ability to entirely fabricate reality is in the hands of an astounding number of people on Earth. This isn’t new. Photoshop notoriously took flak as a tool for magazines to invent an unrealistic image of their models. But that type of manipulation doesn’t even hold a candle to the ability of generative AI models to produce convincing media. Images, sound, and video are now realistic enough to fool more than 99% of the people who see them.

If a picture is worth a thousand words, then a (near) perfect fake is worth a thousand lies. These lies don’t take skill to produce. All it takes is a person, a phone, and an idea to produce something that thousands to millions of people could see, believe, and react to. Yes, this is certainly more likely to happen among groups of people who are historically not critical of the things presented to them, but it’s now also occurring among people who generally have had a critical eye. Even for those of us who can identify, evaluate, or dismiss 99 out of 100 fake pieces of information that come across our feeds, how much damage does that last one do?

A Double-Edged Sword: In Doubt of the Truth

This issue is truly a double-edged sword. On one side, there is the massive potential for people to believe and act on things that they see that are fundamentally not true. To believe without critical analysis things that are completely fabricated. On the other side, the simple knowledge that this problem exists can cast doubt on anything that is actually true. The truth, already struggling to keep pace with the sensational, is dragged even further behind.

It’s easy for people to believe the things that suit them, and disbelieve, however true, the things that do not. But criticality and rational thinking are a sliding scale, full of grey area. What leaves doubt for one person may be gospel for another. There have always been certain thresholds. One is the threshold of doubt, where anything below it is easy to dismiss and not worry about. The other is the threshold of belief, where anything beyond it is believed unquestioningly. These have to exist, or else living in the world would be too exhausting. Having to doubt, challenge, and prove every single thing you are presented with is not sustainable. These thresholds are increasingly pushing towards the extremes, leaving us in the grey quagmire of doubt.

The most obvious issue is the one that we see occurring day to day. People are producing, consuming, sharing, and acting on information that is inherently made up and/or blatantly false. The consequences could be as simple as having to roll your eyes at a coworker who now believes that Jesus is showing up in swarms of shrimp, to significant economic impacts perpetrated by fake news of attacks on critical buildings or businesses. It’s easy to see how this plague of nonsense begins to erode an otherwise fairly stable foundation of knowledge, of reality. It’s easy to see how this sort of generated content can easily reinforce the beliefs of people who see it, and in turn create a vehicle to continue to polarize those that buy in.

Inundated with convincing evidence, at a scale that was never before possible and quality that most people alive never would have imagined, people are overwhelmed and emotionally exhausted. No, they don’t want to verify everything that comes their way. Yes, that person seems to have their same opinions and values, so why would they lie to me? Of course this is happening.

The idea of an echo chamber isn’t new. However, the ability to fabricate a convincing reality has taken the echo chamber to new volumes. It’s no longer about whether you think something is right or wrong. It’s no longer about whether your social or moral values align with some cause. It has literally become a cognitive dissonance where the perceived realities of different people or groups fundamentally don’t come from the same inputs. It’s no longer a matter of interpreting the same evidence or circumstances and coming to different conclusions. It’s a matter of being presented with entirely different pictures and assuming everyone else is seeing the same thing.

The other side, and perhaps the most insidious and most damaging, is the new state of enhanced doubt that so many of us now live in. We doubt the things we see, even more than ever before. It seems too convenient when something lines up with our beliefs; it seems too sinister when something goes fundamentally against them. We don’t know if we can trust images, videos, and recordings. We don’t know if sources we once considered authoritative now are pushing things that are fundamentally false. We don’t know, so we hesitate. Is this true? Is this something that I could invest emotion in? Is this something I should spread and share? What happens if it’s fake and I’ve taken up the cause? Do I capitulate, lose face, lean in?

This environment of doubt only benefits the guilty, the deceivers. The environment of confusion and doubt is where the truth goes to die. It’s easy to see; it’s happening all around us, every election cycle, every scandal. Somebody does something wicked or unsavory. It’s caught on tape, video, or image, written about, and rightfully brought to the public’s attention. Thanks to the environment of doubt and confusion, they are able to claim that the evidence is made up, the situation is fabricated, they never did that (or whatever they did wasn’t as bad as it’s presented), and they win. They win because the truth is in doubt, reality is up for debate, and their sycophantic followers don’t actually care about the truth. They don’t care about exposing evil. They just care about maintaining their bubble of fabricated reality and making sure nothing scary gets in to disrupt it.

Social Media: Ground Zero in the War on Truth

While the internet is just one theater in the war on truth, the battlefield of social media is probably the most incendiary. In my 20 years using social media I’ve seen it go from an incipient quirky thing (Hey Tom!) to a fundamental tool for connection and information to a vehicle for a new kind of monetary scheme and now to a dead zone populated by bots and bullshit with little left in the way of actual engaged users, unique thoughts, or truth.

Social media companies sometimes give you the numbers. They sometimes pull back the curtains. Even the glimpses they have given us have shown that double-digit percentages of social media users are bots. And these bots make up a disproportionately large percentage of the posts and comments, because they are just doing their jobs. They don’t have lives, they don’t have expectations, they don’t operate in reality. They exist only to pollute their spaces with garbage. Garbage posts, garbage comments, garbage traffic. Anybody who has been on any of the major platforms in the last few years can see how obvious this is.

Bots can drive the truth into the ground. Bots can raise lies from the dead. Bots give the power to the puppeteers to wield undue influence inside systems that already hold too much sway over public perception and opinion. They create the narrative. They create the content. They share the content. They engage with the content. They drown out the opinions and viewpoints of real people who might doubt the reality being constructed by the bots. Real people do not belong here. Cannot survive here.

The platforms can count all of this as ‘engagement’. They don’t care where it comes from; their metrics don’t consider it a factor. For their actual users, they make it even easier to put in zero effort to produce engagement. Under posts on LinkedIn you’ll frequently see easy buttons that allow users to say ‘Congratulations!’ or ‘That’s so interesting, thanks for the post!’ or similar. Or Facebook, where there’s now an easy button to wish people ‘Happy Birthday!’ followed by a string of emoji. Let’s automate every aspect of every user on every platform; then nobody has to think or try, but the companies will have their engagement.

Real people, real content, real life, are getting buried beneath the weight of generative garbage and automated agents. The people, the users, have absolutely no say in what happens next. This is a new reality, and the algorithm is god. Unless these massive, pervasive companies actually do something more than provide lip service to the people raising concerns, nothing is going to change and things are only going to get worse.

Collateral Damage of the Social Media Warzone

The damage caused by this is real. It’s real, and it exists in every aspect of our lives and society. Every aspect of our world. There are countless examples of events in the last few years that are directly attributable to the increased ability of AI to shift and affect reality. A light is starting to shine on them, but too often people aren’t even aware that this category of problems exists.

Politics is an obvious target. Political tampering, propaganda, espionage. Call it what you want, but the fact remains that the ability of bad actors to influence the political narrative of any election or issue across the globe is unprecedented with the tools currently available. Targeted misinformation just confirms existing beliefs, further polarizes people, and adds fuel to a fire that need not exist in the first place. Painting the picture that anybody different from you, or anybody with different opinions, is the enemy is far too easy to do. Writing them into a different reality is far too destructive. The cult of personality is stronger than ever, and the momentum is increasing. Things are increasingly chaotic on the political front, and they are only going to get worse as people sink further and further into the lies guided and perpetuated by the reigning powers. It’s easy for their followers to contribute; anybody can. The weapons are in all of our hands.

The issues apply to more mundane aspects of life as well. Shopping for something on Amazon? Good luck. Amazon is completely flooded with cheap knock-off products sold by companies that are nothing more than a name registered as a storefront, product pages littered with AI-generated images and text (most of which is so low quality it may as well not exist). If you somehow find your way through that morass, you might actually stumble upon real products sold by real companies. Now you need to decide if any of these products are actually good. Reviews are now written by bots, hired by the companies selling the product, or hired by competitors to drive the ratings down. You can’t trust the good, you can’t trust the bad. Every now and then you come across a real person who is just telling it like it is (and not trying to write in whatever way the algorithm rewards). What a breath of fresh air!

The school system is not equipped to deal with the advent of AI. The old methods don’t work and there are no new options available. Our educators and education systems are under-funded, under-staffed, and under-appreciated. They aren’t going to come up with clever solutions anytime soon, and an entire generation of students is going to suffer for it. I’ve seen it firsthand. Generative models make homework of almost every type entirely trivial. How much is good enough when it comes to grades? How well are attention spans and reading comprehension doing in the face of TikTok and the flood of inconsequential slop served up by the most sophisticated techniques? It’s all going badly. Colleges are finding that many new students don’t even have middle school levels of math or reading comprehension.

An Assault on the Vulnerable

While the state of education is bleak, what’s even more concerning is the lasting social and emotional damage that AI and interrelated technologies and habits are having specifically on young people. AI is being used for everything from basic interaction to flat-out perversion. The increasing use of AI to communicate with friends, significant others, or anyone else leaves a void in the ability to learn from these experiences. South Park hit the nail on the head with its Deep Learning episode. It was current, but also in many ways predictive. People are building their relationships on lies and misunderstanding, at some points acting as nothing more than a go-between for an AI conversation. This isn’t living. Too often now there are stories about people who are attempting online dating, having amazing connections, and then having it all come apart when they meet in person. It’s frequently because one or both sides of that communication were using AI to develop a persona that doesn’t actually exist. When virtual reality and reality meet, they rarely come to agreeable terms.

But homework, text messages, and common dating are just the surface. AI has now been implicated in several cases of suicide, murder, and death, where children and other vulnerable people leaned into AI hard and destroyed their perception of reality to the absolute bitter end. This isn’t a one-off thing. This is becoming commonplace. And that’s just the surface that we know about. How many countless traumas are unfolding all around us, invisible, because they exist in a space between the vulnerable and an uncaring agent, with nobody around to hear the cries for help? How easy is it for predators to use these tools to develop new means of catching their prey? For victimizing people who barely understand the world they are traveling in? Keeping vulnerable populations safe has always been a challenge, but it is significantly harder when trust is placed in unknown actors and AI agents rather than in reality, because that is where the most immediate, targeted comfort and communion is on offer.

This extends even further into the depths of victimization. People, often children, predominantly female, become victims for doing nothing more than having pictures of themselves online. It’s increasingly easy (and surely getting easier and more accessible by the month) for others to turn those images and videos from Instagram, Facebook, or any other source into convincing nude photos or porn videos. The generated media could be used for blackmail. It could be used to build an emotional and social grip on a victim. Or it could just be put out into the world as chaos for the sake of chaos. This isn’t a solitary event. This is an increasingly available form of dark and disturbing control. How often will this occur where the victim doesn’t say anything, feels like they have nowhere to turn, and ends up deeper in trouble? How much worse will things get with increasingly prevalent access to tools that can completely blindside and emotionally devastate their victims? The truth does not matter to the victim. It doesn’t matter to the abuser. The damage is done, and the truth here is utterly irrelevant in the face of reality.

Living in a Post-Truth World

The joke stopped being funny a long time ago. The reality is that everybody is now part of the joke: producing, proliferating, or falling victim to a new and sinister state of existence. The line between what’s comically absurd and what’s plausible truth has been erased, never to be redrawn.

We are at war. With our friends, our neighbors, ourselves, but more importantly with the technology and the people who create and facilitate it. We have got to be more critical of ourselves, of others, and of the tools and businesses we use. We’ve got to raise the bar across the board.

We have opened Pandora’s box, and there is no putting the demons back. The technology isn’t going away, and the ability to ‘regulate’ it isn’t going to keep up with the technology itself. The attempt to do so would create a gross infringement of privacy and security, and, much like any similar laws, would be completely irrelevant to the bad actors who are going to pursue illicit means anyway.

We need to use our minds, hone our instincts, and teach empathy as well as strength. We can be passengers as generative AI and related technologies destroy truth and our society, or we can fight back with the brains we were born with.
