Facebook’s recent challenges should serve as a lesson to all leaders to identify and mitigate ethical concerns before they escalate.
Facebook has been subject to significant scrutiny lately, as The Wall Street Journal’s series, The Facebook Files, detailed how the company allegedly put user engagement and profits well ahead of a variety of significant social harms. The source of the material was a former Facebook employee who apparently trawled the company’s internal sharing tools and found all manner of documents and research that the company had commissioned, which identified these problems and was ultimately filed away in the digital equivalent of a dusty and disused bookshelf.
SEE: Policy pack: Workplace ethics (TechRepublic Premium)
The front page of the newspaper test
Most of us have heard the old maxim that “if you wouldn’t want something to appear on the front page of the newspaper, you shouldn’t send it,” and it applies in the literal sense to the Facebook story, as well as to the recent news of former NFL coach Jon Gruden’s email correspondence. In the case of allegations as shocking and profound as these, a leader should fear not only the information itself appearing in the news, but also reports of inaction or sheer ignorance of the problem, as is the case with Facebook.
It’s not hard to imagine that a strong focus on technical prowess and capturing market share would make it easy to dismiss or ignore concerns raised by internal researchers and investigators, and a company’s culture can reinforce such willful ignorance. As technologists, it’s all too easy to dismiss ethics as a concern best left to academics and philosophers, or to regard it as something “above my pay grade.”
SEE: WSJ’s Facebook series: Leadership lessons about ethical AI and algorithms (TechRepublic)
However, as technology leaders, we’re increasingly in a position to observe and direct human interactions at an unprecedented scale. Our algorithms are no longer supporting characters in how a business runs; often they are a company’s core asset, or the “canary in the coal mine” that reveals how the company operates and behaves in intended and unintended ways. The case of Facebook shines a light on how the pursuit of a business KPI like user engagement can create a raft of unintended ethical consequences.
When you come across these unintended consequences, try to imagine how it would look if your name appeared in a news story about how your company ignored unethical practices. How might your justifications about lack of time, pressure to meet your metrics, or “hey, I’m just the tech person” look when you’re presented as someone who saw evidence of bad behavior and willfully ignored it? Would you rather be featured in this hypothetical story as the person who aided and abetted the behavior, or as the one who at least attempted to address it with colleagues and other executives?
How to start a conversation about potentially unethical behavior
The most frequent excuse for avoiding conversations about unethical behavior is worrying about one’s standing in the organization. Nobody wants to be perceived as a troublemaker or as someone who routinely reports malfeasance when none exists. Rather than coming to colleagues with accusations or forecasts of doom and gloom, approach the situation as a business problem to be solved and seek to form a team to solve it. Present the concern and use some variation of the “newspaper test” to provide a means for discussing the harm generated by the ethical problem that takes individual actors out of the equation and shifts the discussion away from blame allocation and towards problem-solving.
Don’t be afraid to test your concerns with colleagues outside the impacted areas who have a vested interest in the success of your organization and know the culture but may not be as intimately involved with the matter at hand. If your concerns are either dismissed outright or you receive a hostile response, then you have an extremely important data point on the value your organization places on ethical behavior and one that should guide your individual ethical calculations on how to proceed.
Most large organizations have an internal or external whistleblower hotline. The term whistleblower can carry negative connotations, but think of it as the whistle of a referee: pausing play to make sure the rules are being followed.
SEE: Whistleblower policy (TechRepublic Premium)
If you’re still uncomfortable sharing your concerns internally, whether through your management channels, uninvolved colleagues or whistleblower resources, another option is getting outside eyes. If you pursue this route, be aware that the type of external resource may bias the feedback you receive. For example, if you engage your technical partners to assess the ethics of your algorithm, they may return an assessment that focuses too much on the technology and not enough on your ethical concerns. Outside counsel may focus on the legality of the activity in question, and obviously, things that are legal are not always ethical, nor are they necessarily items you’d like to see on the front page of the newspaper. An appropriate option might be to engage people in academia: professors of business ethics should have a foundational knowledge of various ethical frameworks and a reasonably current understanding of the market and business climate, without being overly biased toward your technology or the legal nuances.
Address ethical issues as early as possible
Without fail, when stories like what’s happening at Facebook and other social media companies break, other people emerge from the shadows to echo the whistleblower’s concerns, lamenting that “I should have done something.” There are many good options short of testifying in front of Congress, so strive to be the leader who identifies and mitigates ethical concerns before they escalate to former employees on the front page of the newspaper.