In Wednesday’s testimony before the combined Senate Judiciary and Commerce committees, Mark Zuckerberg exchanged the Silicon Valley startup uniform of tight-fitting t-shirt and hoodie for a finely pressed suit. The switch was more than situational; it was Zuckerberg appearing in his true form as the uniformed head of a global monopoly, one that comprises thousands of employees who, despite their various roles and intentions, ultimately share a corporate goal that has become toxic both to Facebook the organization and to the social world impacted by this social network: grow at all costs.
Let’s begin by exposing the economic roots of the problem. Facebook is a publicly traded company whose business rests on a simple proposition: it’s free. The business is seemingly like Zuckerberg himself: open, casual, friendly, all while aggressively pursuing maximum growth and maximum market value. The users who populate its pages understand this fundamental bargain at some level, but, as with any relationship, there are limits. What worked while sharing photos with friends several years ago is not the same as seeing news stories influenced, say, by your facial expressions while FaceTiming your sick father.
Facebook has a long history of asking for forgiveness rather than permission. Apologizing is perhaps one of its truest company values. Apology has been substituted for actual change, and for actually understanding the manipulative effects its technology has on the very users whose lives it seeks to improve by “connecting” them.
The Problem Is Not What We Think
Last week, when the New York Times reported that the Cambridge Analytica scandal had compromised more than 87 million accounts, Facebook’s lax management was given a numerical value. But for anyone who is truly serious about diagnosing the company’s problems, focusing only on Cambridge Analytica specifically and privacy concerns generally misses a much larger point. Beyond the misuse of data by application developers, or confusing users with overly complex privacy settings, Facebook capitalizes on personal information in ways far beyond what most people understand and what users ever consciously agreed to.
Zuckerberg has pointed out that the “data access” problem leveraged by Cambridge Analytica was fixed in 2014; app developers no longer get access to that data. And Facebook already has extensive, albeit complex, user settings to manage who sees what user information. Except, that is, when it comes to Facebook’s own application developers.
Facebook developers have access to everything: phone calls and metadata, messaging text, photos, changes in privacy settings, even when you change your mind about a privacy setting. Facebook developers write computer algorithms, including artificial intelligence, to analyze everything they can about their users. They buy information from data brokers to understand non-Facebook behavior. Based on the company’s patent applications, they’re planning a great deal more of the same. This behavior analysis, in fact, is Facebook’s bread and butter, not giving data to advertisers.
Interestingly, Zuckerberg said exactly that in his testimony: “Let me be clear, we do not share user data with advertisers.” Most of the Senators had trouble grappling with this comment, since it flies in the face of most coverage of Facebook’s privacy issues.
All the time we (as users) waste on the site isn’t an accident. Facebook uses algorithms to figure out what to show its users, whether paid advertisements, news (real or fake), or friends’ posts. At the most basic level, one can imagine that the algorithms are tuned to maximize user engagement. For instance, if Facebook shows you ads based on which e-commerce sites you visited, friends’ feeds based on your comments, and news based on political leanings gleaned from your posts, you are more likely to stick around and more likely to click on Facebook ads. That’s how Facebook makes money.
Zuckerberg was transparent on this point as well. “Advertisers choose who they want to reach,” he said [based mostly on demographic criteria]. “We decide which ad and who sees it.”
The predatory aspect of this model is that Facebook can (or will) take advantage of your emotional states, health, the vagaries of your relationships or employment status, addictions, vulnerabilities and pretty much anything else you can imagine, in order to influence mood, opinions, buying decisions, or pretty much anything else you can imagine an advertiser might want to influence.
Zuckerberg claimed that users consent to such behavior. But only some of the personal information Facebook developers leverage is actually configurable in Facebook settings. The rest is “consented to” through the type of online terms-and-conditions click-throughs virtually everyone ignores.
Three Things Facebook Can Do Now to Restore User Trust
First and foremost, Facebook should adhere to its corporate values, and users, shareholders, and the media should hold the company accountable to the statements Zuckerberg likes to repeat. Facebook’s five founding principles, first articulated in a letter accompanying its initial SEC filing, are not only to “Move Fast” and “Be Bold,” but also to “Focus on Impact,” “Be Open,” and “Build Social Value.” I am not sure what “Focus on Impact” means: a positive impact on the world, or a positive impact on Facebook’s bottom line? So first, Facebook might start by sharpening its core value statements.
Second, Facebook should understand what these values mean to its users and to its employees. A company’s brand is defined by its relationship with its customers. In testimony, Zuckerberg said Facebook needed to work on its relationship with users. This is positive awareness. But Facebook’s brand right now is not “we bring the world closer together,” no matter how often the company says it. Facebook’s brand is that it doesn’t respect its users’ privacy.
Similarly, a company’s culture is defined by its relationship with its employees. Zuckerberg also testified that Facebook rightfully did nothing about the infamous memo written by a Facebook exec, a memo that clearly showed behavior out of line with the company’s core values. Your company has no core values if behavior is not aligned with them. It’s apparent that Facebook’s culture is one that doesn’t trust its own value statements and suffers from low morale.
Facebook employees, including executives, should meet with a broad range of Facebook users. They should try to develop empathy. They should understand their users’ needs, desires, pains and passions, hopes and aspirations. Instead of merely reacting to parents’ insistence that Facebook change features that affected their children, Facebook should get out in front and understand more deeply what concerns those parents. And the same goes for the teenagers.
Third, Facebook should decide what it is willing to give up. Interestingly, Zuckerberg has argued that company values are only relevant if one recognizes the sacrifice made by holding oneself to that value. Values “are not free,” Zuckerberg says. For example, he says Facebook sometimes suffers from poor features or buggy software because it moves fast. But who is actually suffering, Facebook or its users? Who suffers when users’ private information is abused?
The bitterest medicine is giving up growth for growth’s sake. Zuckerberg states repeatedly that he is trying to create social value by connecting the world. Maybe he should connect less of it, until Facebook learns what value it can create and what the flip side looks like. In other words, what harm is done by creating the value? In testimony, Zuckerberg immediately pointed to artificial intelligence (AI) as the cure for ensuring foreign governments don’t interfere with elections. But what about all the potential harm Facebook’s AI might create?
To Zuckerberg’s credit, several times during his testimony he expressed an openness to government regulation and acknowledged that regulation has a place in capitalism. His views were far more balanced than those of several anti-regulation US Senators who pretended to have the backs of today’s college-age entrepreneurs. An easy start would be to adopt the EU’s GDPR rules in the US. Zuckerberg stopped well short of that, but expressed willingness to work with Senators on legislation. The fear is twofold: first, that Facebook’s massive lobbying engine undermines Zuckerberg’s claims; and second, that Facebook might be satisfied helping lawmakers draft legislation tightening privacy laws without addressing the real issue described above. Again, Facebook must be willing to hold employees to corporate values.
Finally, breaking up the company in order to decrease market dominance is a real option. Zuckerberg would not admit that Facebook is a monopoly, but he couldn’t name a true competitor. User privacy and innovation are more likely outcomes when companies compete to create value in society, which is, after all, Facebook’s fifth founding principle.