Why Facebook keeps stepping in it
By Barbara Ortutay
NEW YORK — Years of limited oversight and unchecked growth have turned Facebook into a force with incredible power over the lives of its two billion users. But the social network has also produced unintended social consequences — and they’re starting to catch up with it:
— House and Senate panels investigating Russian interference in the 2016 elections have invited Facebook, along with Google and Twitter, to testify this fall. Facebook just agreed to give congressional investigators 3,000 political ads purchased by Russian-backed entities, and announced new disclosure policies for political advertising.
— Facebook belatedly acknowledged its role in purveying false news to its users during the 2016 campaign and announced new measures to curb it. Founder and CEO Mark Zuckerberg even just apologized, more than 10 months after the fact, for calling the idea that Facebook might have influenced the election “pretty crazy.”
— The company has taken flak for a live video feature that was quickly used to broadcast violent crime and suicides; for removing an iconic Vietnam War photo for “child pornography” and then backtracking; and for allegedly putting its thumb on the scale of a feature that ranked trending news stories.
Facebook is behind the curve in understanding that “what happens in their system has profound consequences in the real world,” said Fordham University media-studies professor Paul Levinson. The company’s knee-jerk response has often been “none of your business” when confronted about these consequences, he said.
When such issues arise, Facebook generally restricts itself to bland assertions that its policies prohibit misuse and that it’s difficult to catch everyone who tries to abuse its platform. When pressed, it tends to acknowledge some problems, offer a few narrowly tailored fixes, and move on.
But there is a larger question the company hasn’t addressed directly: Has Facebook taken sufficient care to build policies and systems that are resistant to abuse?
Facebook declined to address the subject on the record, although it pointed to earlier public statements in which Zuckerberg described how he wants Facebook to be a force for good in the world. The company also recently launched a blog called “Hard Questions” that attempts to address its governance issues in more depth.
But Sheryl Sandberg, the company’s No. 2 executive, has suggested that Facebook has work to do on this front. In a recent apology, she wrote that Facebook “never intended or anticipated” that people could use its automated advertising to target ads at “Jew haters”—that is, users who expressed anti-Semitic views in their Facebook profiles.
That, she wrote, “is on us. And we did not find it ourselves—and that is also on us.”
Moving fast, still breaking things
Facebook’s often unresponsive response to crisis may not work much longer for a company that sometimes still seems to hew to its now-abandoned slogan—“move fast and break things.”
Facebook has so far enjoyed seemingly unstoppable growth in users, revenue and its stock price. But along the way, it has also pushed new features onto users even when they protested, targeted ads at them based on a plethora of carefully collected personal details, and even engaged in behavioral experiments that sought to influence their mood.
How it got here has to do with its exceptionalist company culture, a hands-off approach that values free speech over monitoring what its users post, and the fact that no matter how many people it hires, it will always have what amounts to a skeleton crew to deal with its huge user base.
“There’s a general arrogance—they know what’s right, they know what’s best, we know how to make it better for you so just let us do it,” said Notre Dame business professor Timothy Carone, who added that this is true of Silicon Valley giants in general. “They need to take a step down and acknowledge that they really don’t have all the answers.”
Market incentives and solutions
Facebook depends on signing up as many users as possible—and pulling in as many advertising dollars as possible—to run its business. Its systems for signing up and for buying ads are both highly automated, a fact that makes the company both efficient and highly profitable.
In the first six months of 2017, Facebook pulled in sales of more than $17 billion and reported a profit of almost $7 billion.
That automation also helps explain not only why Facebook can seem so disengaged from its controversies, but also why it’s vulnerable in the first place, said David Gerzof Richard, a communications professor at Emerson College.
Russia, for instance, was able to exploit “the capitalist nature of what motivates Facebook,” Gerzof Richard said. If the company was truly focused on the “content, message and quality of ads,” he said, “there would be a very different platform for how you buy and place ads on Facebook.”
Gerzof Richard thinks Facebook should view the “social hacking” of its platform—that is, the unintended uses that spring from human nature—much the way it looks at technological challenges such as spam and data breaches.
Facebook already gives out “bug bounties”—that is, prizes for people who find technical flaws in its platform. Why not do the same for oversights that allow social hacks of its ad system, user newsfeeds and the like?
“We as a species are very, very inventive,” Gerzof Richard said. “You give someone a power tool and they will figure out ways to use it that the maker has never intended.”