Facebook Trying to Civilize Young People; Young People Resist

Can anything take the mean out of the Internet?


It’s no surprise to anyone who’s ever had one that kids are mean. Really mean. Unspeakably mean. They’re enormously invested in social status, and the way to attain it, as far as they can see (they’re short), is to tear others down. For many eons, young people could only do this to those in their immediate vicinity, but now the miracle of technology allows them to stomp all over the feelings of young people around the world and drive them to suicide. (You can read about some particularly egregious examples here, if that’s how you like to spend your spare time.) This is why bullying, and cyberbullying in particular, has become such a hot topic. According to Pew Research, 65 percent of those between the ages of 18 and 29 say they’ve been cyberbullied, and 92 percent have seen it done to somebody else.

Now Facebook is attempting to address the problem, at least on its pages, by teaching its users to empathize with others. A recent story in the New York Times discussed the work of Arturo Bejar, director of engineering for Facebook’s Protect and Care Team, which is exploring ways that Facebook users might let others know when their feelings are hurt by a post.

So far, Bejar’s team is focusing on providing “pre-populated messages” that users can send to indicate the objectionable posts’ effects on them. And they’ve found that the more specific the messages, the likelier young people are to use them. The vague “Embarrassing” yields fewer users than “It’s embarrassing,” both of which falter before “This post is mean. It makes me feel sad and I don’t want it on Facebook,” which 85 percent of teens in their research liked enough to send. Specificity pays.

Interestingly, though, when Facebook experimented with offering a blank box in which users could type a personalized message describing how a post hurt them, it didn’t go so well. The Times quotes Marc Brackett of the Yale Center for Emotional Intelligence, who’s working with the team: “If kids are given a blank box, oftentimes they are going to say things that are not going to be helpful.” That, the story notes, includes cursing at their friends.

Good luck with that, Facebook. A story in Wired by Adrian Chen shows what you’re up against. “The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed” details the task of “content management” workers who cull objectionable material from the Internet for companies like Facebook and Twitter. Here’s a brief summary of the job of one worker, named Baybayan:

A list of categories, scrawled on a whiteboard, reminds the workers of what they’re hunting for: pornography, gore, minors, sexual solicitation, sexual body parts/images, racism. When Baybayan sees a potential violation, he drills in on it to confirm, then sends it away — erasing it from the user’s account and the service altogether — and moves back to the grid. Within 25 minutes, Baybayan has eliminated an impressive variety of dick pics, thong shots, exotic objects inserted into bodies, hateful taunts, and requests for oral sex.

Hundreds of thousands of workers do this, all around the world, hour after hour, day after day after day. Some of the companies that employ them offer counseling to help them deal with the effects of their work, which can result in what one such counselor calls “a form of PTSD.” It takes its toll in a variety of ways:

Workers quit because they feel desensitized by the hours of pornography they watch each day and no longer want to be intimate with their spouses. Others report a supercharged sex drive. “How would you feel watching pornography for eight hours a day, every day?” [the counselor] says. “How long can you take that?”

Ask one “quality-assurance representative,” who double-checks what the moderators allow to make sure they’re not missing anything offensive. “I get really affected by bestiality with children,” she says. “I have to stop. I have to stop for a moment and loosen up, maybe go to Starbucks and have a coffee.”

Here’s the thing. These screenings aren’t performed for the sake of young people, who apparently aren’t fazed at all by “the Internet’s panoply of jerks, racists, creeps, criminals, and bullies.” According to Wired, the problem is the “Grandma factor”: Old people “won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video.” The former chief of security for MySpace estimates that there’s an army of more than 100,000 Internet scrubbers out there, trying to protect us from ourselves.

Meantime, at Facebook, Bejar says his team is also experimenting with aural emojis — grunts, sighs, giggles — that users could employ to express how they feel about a post. I hate to imagine where that might go. Oh, and he says that “technology still has a lot of work to do to humanize each other.” Somehow I don’t think that’s a job for technology. Anyway, the silver lining is that once my generation dies off (can’t be long now, can it?), all these content managers can go find other work. They won’t be needed anymore.

Follow @SandyHingston on Twitter.