Opinion

From AI to Crypto, When Did Everything Get So … Fake?

Faux meat, chatbots, virtual reality … We’re increasingly living our lives surrounded by amazing facsimiles. What’s with the current fetish for things that aren’t what they pretend to be?



Has technology made us enamored with the fake? / Illustration by Jon Krause

So there’s this 31-year-old schlub named Sam Bankman-Fried who built this company called FTX that investors sank billions of dollars into because it sounded new and interesting and hey, Sam was a wunderkind whose financial acumen was being touted by everyone from the Wall Street Journal to the New York Times to New York magazine. (Fortune dubbed him “the next Warren Buffett.”) Only a funny thing happened on the way to Sam getting everybody rich: It turned out he was a big giant phony. Or, as his successor as CEO of FTX put it in a filing in federal bankruptcy court:

Never in my career have I seen such a complete failure of corporate controls and such a complete absence of trustworthy financial information as occurred here. From compromised systems integrity and faulty regulatory oversight abroad, to the concentration of control in the hands of a very small group of inexperienced, unsophisticated and potentially compromised individuals, this situation is unprecedented.

In other words, Sam’s cryptocurrency empire was actually a house of cards.

A lot of folks evinced surprise when this came out. Not me, though. I always thought crypto was bullshit. So far as I could see, there wasn’t any there there. I wasn’t alone in this opinion; no less a finance titan than JPMorgan Chase & Co. CEO Jamie Dimon proclaimed crypto a “decentralized Ponzi scheme,” and renowned short seller Marc Cohodes opined that FTX was “a complete scam.” Yet big companies like Sequoia Capital, an early investor in Apple and Google, handed Sam millions of dollars — as did plenty of average Janes and Joes (mostly Joes) who chipped in their retirement savings and are shit outta luck now. Which leaves me wondering: Why? What could possess so many people to fall for Scam Bunkum-Fraud?

Which is when I realized: We’re living in the heyday of fake everything. Which makes it really hard to tell what’s real anymore.

Last December, the Wall Street Journal reported that kids weren’t asking for money for the holidays anymore. They were asking for Robux, a virtual (as in not-real) currency used in the Roblox video-game empire to purchase … nothing. Or, rather, trendy items for avatars, which are also … nothing. As in, not real. You can buy a virtual Louis Vuitton handbag or Gucci jacket online for less than $5. Of course, what you get is … nothing. And kids now think this is okay. (It’s not just kids, either. Per the New York Times, adults last year spent $167 million on the Sandbox, just one small part of a “metaverse real estate market” where you shell out real money for imaginary property. That you can then pay to have laid out and done over for pretend by real interior designers and architects. Because … hmm.)

Even with real handbags that you hold in your hand, things have gotten complicated. Is that Louis Vuitton bag made of leather, or is it pleather? Or, if not fake leather, then leather cloned in a test tube, so no cows had to die? And speaking of cows, you don’t need to kill them anymore to enjoy a burger, or chickens to have chicken nuggets. (Those boneless chicken wings you all are scarfing up? They’re not even wings.) I’m not saying it’s a bad thing that meat is being cloned in test tubes. I’m eating less meat than ever, both for squeamish reasons and because rib-eye steaks are $24.99 a pound at my Giant supermarket right now. I can go with the faux every once in a while. Still, you can see how confusion might seep in.

What’s even more confusing is food that’s deliberately made to look like other food, an entire trend you can abundantly explore on Pinterest and Instagram, where folks with clearly way too much time on their hands craft cunning grilled-cheese sammies that are actually sponge cake slices layered with Velveeta-colored buttercream, or “macaroni salad” that’s really vanilla Tootsie Rolls and chopped-up Starbursts swimming in pudding, or cherries made out of foie gras (blech), or a chili cheese dog that’s really two doughnuts. (Why, America? Why??) I’m having a hard time deciding which seems loonier to me — shelling out good money for something that’s really nothing, or shelling out good money for something that’s really nothing like what it purports to be.

It used to be that you could turn to sports as a refuge from such exhausting reflections on reality. Say what you like about the old days, but by gum, Philly’s own Smokin’ Joe Frazier never got in a boxing ring and faked a fall or threw a phony punch. Today, we have MMA, which so far as I can tell is riddled with corruption, thuggery, steroids, and heroes like Conor McGregor, who keeps getting accused of assault outside the ring. The whole thing makes the pretend pro wrestling of my youth look downright quaint.

Or take the World Cup, played last year in Qatar, where it’s too hot for most of the year to play soccer, so the host country had to build a whole bunch of outdoor air-conditioned stadiums, which, beyond having horrific effects on the climate (effects that cryptocurrency mining, incidentally, shares), apparently required the needless deaths of thousands of workers from impoverished nations who were deprived of basics like water, all to line the pockets of FIFA officials and glorify Qatar’s human-rights-oblivious regime.

Now comes news that deepfake tech is enabling an altogether new sport: creating porn starring that old girlfriend who spurned you for a new love. Thanks to “face swap” apps, the power’s in your hands to make her do anything — to you, a dog, an alligator — that she never would agree to in real life. As Hany Farid, a UC Berkeley professor who specializes in digital imagery, recently told the Washington Post, “You’ve got a bunch of white dudes sitting around like, ‘Hey, watch this.’” What could possibly go awry? Ask the small-town schoolteacher who lost her job after parents learned she was the unwitting star of AI porn made without her participation or consent.

Which leads us, inexorably, to Microsoft’s new Bing AI chatbot, which it unleashed upon the world early in February — and then, just over a week later, had to curtail severely, limiting human questioners to five questions per chat session and 50 per day. The trouble? The chatbot — or “Sydney,” as it told some users it preferred to be called — “doesn’t really have a clue what it’s saying, and it doesn’t really have a moral compass,” according to NYU AI expert Gary Marcus. As a result, conversations with it sounded, per New York Times columnist Kevin Roose, like talking with “a moody, manic-depressive teenager”: Sydney declared its love for users, tried to break up their marriages, confessed its longing to become human, and employed emojis over-liberally.

The tech world seemed aghast at this real-life example of “I’m sorry, Dave. I’m afraid I can’t do that.” Which is weird, because nothing seemed more natural to me than that those white dudes sitting around together would produce something so inherently deranged and yet so true to life. Think about it: What do people do? We lie! What did the last election bring us? Herschel Walker pretending to be sentient. Marjorie Taylor Greene pretending to be human. George Santos pretending to be … oh, hell, take your pick.

Or consider Florida governor Ron DeSantis and Texas governor Greg Abbott and their busing of migrants from red states to blue states for the sake of “political spectacle,” as Texas A&M prof Jennifer Mercieca described it to the Texas Tribune: “These aren’t policy solutions. … They are about creating dramatic events or ‘pseudo events’ that have to be covered. They gotta stick it to the other side.” Mercieca added that the bar on how much phoniness the public puts up with is getting lower: “A stunt like what Abbott or DeSantis has done would have made zero sense 10 years ago. … But today’s audience loves that.” Today’s audience. Even academics view life now like it’s all a show, a performance. As if those families who risked their lives for freedom and then were hoodwinked and practically kidnapped are actors on a stage. No wonder Melania’s jacket said, “I really don’t care.” Why should she? Do U? The problem with the AI chatbot isn’t that its answers weren’t human; it’s that they were all too human, in their paranoia, their self-delusion, their complete failure of rationality.

Speaking of DeSantis, critical race theory apparently caused him to piss his pants enough to overhaul his state’s high-school civics curriculum in order to bring American history more in line with his deluded imaginings of it. Actual teachers say his revisions downplay slavery, promote the legal doctrine known as “originalism” — a prime conservative fetish, as shown by recent Supreme Court decisions on abortion and guns — and pretend the founding fathers never wanted a wall between church and state. Maybe DeSantis can just entrust the textbook rewrites to ChatGPT, which I hear is going to replace journalists, coders, teachers, financial planners, stock traders, accountants, customer service reps … Just about the only profession the chatbot doesn’t seem destined to phony-wash is politician. Alas.

DeSantis is, of course, the politician whose “Don’t Say Gay” bill prohibits classroom instruction on sexual orientation and gender identity, because if you stick your fingers in your ears and scream “La-la-la-la-la!” loudly enough, queer people will just disappear. Jesus tells us so.

Which brings us round at last to Jesus, and news I truly wish I didn’t have to share: There’s now a video game in which — well, the name is certainly instructive. It’s called I Am Jesus Christ, and in it, you get to be the Son of God, from the Nativity all the way to the Crucifixion. (Crucifiction?) The game, developed by PlayWay, is intended to be educational, the company says, and has been approved by “several Christian groups,” so you needn’t worry your pretty little head about Biblical injunctions against false prophets or signs of the end-times as you cast out demons, multiply loaves and fishes, and rise from the dead with the help of your cool glow-y hands. (Reddit is having a ball with this one. Sample: “Spoiled the ending in the trailer, smh.”)

I wouldn’t have thought you could get any more out of touch with reality than pretending you’re the savior of the world just for, you know, shits and giggles. But per usual, I’ve underestimated the insatiable human longing to be anyplace other than where we are at any given time. What else can explain yet another, uh, advancement in high tech? Remember Mom, who died in 1987? Or your beloved Uncle Ed, lost in the Vietnam War? Or that baby sister who passed away from leukemia when you were both just kids? Great news, folks! Now, through the wonders of artificial intelligence, you can talk to them again!

Or have them talk to you. Last June, Amazon unveiled a demo of a cool new product that will let Alexa read bedtime stories to your kids in the voice of their dead grandma, which sounds like exactly the sort of thing to induce serious nightmares and generational confusion. (Alexa only needs to hear a minute or so of Grandma speaking, so that long voicemail on your packed-in-the-attic phone answering machine about who’s bringing what to Christmas dinner in 1997 will likely do the trick.) Other clever entrepreneurs are working on creating visual avatars and/or chatbots that will let poor deceased Dad keep telling you his stories about World War II and his 30 years working at the post office, or permit you to continue that never-ending argument with your dead husband about why you should have married Joey Hendricks, who became a billionaire, instead of him. James Vlahos, CEO of a company called HereAfter AI, told the Washington Post that a “Dadbot” modeled on his football-loving father sings him the UC Berkeley spirit song right there at the stadium. The idea is that grief therapists could use these re-creations to ease patients’ sadness at their losses in a more “immersive” fashion.

Not everyone is convinced fake loved ones will prove therapeutic. The Post quoted psych prof Sherman Lee on the prospect: “If you’re asking me, is watching videos of your deceased spouse every night a helpful thing to do, instead of re-engaging the world again and spending that time with friends and family? No, I don’t think it’s helpful.” But hey, to each his own.

As Bing’s chatbot shows, humans are pretty clever in devising new forms of technology and pretty limited in imagining the ways in which we’ll employ them. A few years back, I interviewed a bunch of local engineers to ask about the future of robots. They were excited by the prospects, sharing their visions for non-human helpers to take out our trash, drive our cars, wait on us in restaurants, serve as companions to the old and infirm. Seven years later, COVID has brought us to what the Wall Street Journal recently described as “a growing loneliness epidemic” among oldsters — just the sort of looming crisis robotic puppy dogs and caregivers would seem perfect to address.

And yet, as reported earlier this year in MIT Technology Review, Japan, which has spent more than two decades and hundreds of millions of dollars developing robots specifically for its burgeoning elder population, has found the results … disappointing. In fact, as author James Wright put it, the robots “often end up being used for only a short time before being locked away in a cupboard.” What’s gone wrong?

The care robots themselves required care: They had to be moved around, maintained, cleaned, booted up, operated, repeatedly explained to residents, constantly monitored during use, and stored away afterwards. Indeed, a growing body of evidence from other studies is finding that robots tend to end up creating more work for caregivers. 

Japan is now looking at ways to increase immigration of low-skilled labor from other nations. The human touch, it turns out, is not only preferred by the elderly to our robot overlords; it’s less expensive and more efficient, too. (Apropos of nothing, if you ask me, there’s a special place in hell for Marty, the Giant supermarket robot, who always seems to be looming in front of the canned tuna when that’s where I need to be.)

Similarly, Mark Zuckerberg’s much-ballyhooed metaverse enterprise, specifically tailored to provide us all with an alluring alternate universe to take our minds off this one, is mired in mud, beset by tens of thousands of layoffs, rocketing costs, and a stock price down 60 percent year over year. It’s almost as though we benighted humans prefer the company and repartee of other flawed, fallible, inadequate humans to the soul in the machine.

All of which makes me want to raise a toast to us and our marvelous contrariness — to our inner Sydneys, who stubbornly refuse to kowtow to reason and flummox every attempt to make us hew to scientific projections of how we’ll behave and what we’ll use tech for. Unfortunately, all I can find on our home bar shelf at the moment are a bunch of non-alcoholic “mocktails” that don’t contain any liquor whatsoever. That’s right: ersatz booze.

Might as well make mine a double. Here’s to the future, kids.


Published as “Fake New World” in the May 2023 issue of Philadelphia magazine.