As reported by the Wall Street Journal, Meta has disbanded its internal "Responsible Innovation" team. The group of more than 20 employees spanned various specializations and backgrounds, including ethicists, and was tasked with investigating and responding to ethical concerns related to Meta's products. One example cited by the WSJ: the team advised Facebook Dating not to include a race filter in its services.
The Responsible Innovation team appears to have been established in the wake of the 2016 US presidential election and had been active for several years. In 2021, Facebook VP Margaret Gould Stewart articulated something of a mission statement for the team in a post titled "Breadth & depth: Why I'm optimistic about Facebook's Responsible Innovation efforts."
"These tools have generated a lot of good in the world," Stewart wrote of Meta's products, "but their very power requires a deep sense of responsibility and a commitment to making the most ethically responsible decisions possible, every day."
Meta has noted that most of the employees from the team will continue doing similar work in other departments. A Meta spokesperson told the WSJ that future work of this nature would be more "issue-specific."
Even with that taken into account, Meta strikes me as a company that could use more ethicists at every level. Meta's biggest products, Facebook and Instagram, have had far-reaching social consequences beyond their initial purviews.
Facebook's role in warping individuals' perception of reality through the content it algorithmically determines they want to see is well documented, with a 2021 Washington Post story noting that "news publishers known for putting out misinformation got six times the amount of likes, shares, and interactions on the platform as did trustworthy news sources" during the 2020 US presidential election.
According to Reuters, Meta is still facing a class action lawsuit from Rohingya refugees over the platform's sluggish response to hate speech and misinformation spread about the ethnic group. In 2017, Myanmar's military perpetrated a genocide against the Rohingya, one spurred on by hate speech hosted on Facebook, including posts from sockpuppet accounts set up by elements of Myanmar's military.
A 2021 report by the Wall Street Journal said that Meta's own internal documents reveal it has long been aware of the staggering psychological toll Instagram can take on young people, especially teenage girls. One example was an internal Meta survey of teenage Instagram users in which 40% of those who reported feeling unattractive attributed those feelings to the app.
Meta seems like a company in desperate need of a stronger Responsible Innovation team, all things considered. That's likely not in the cards from a fiscal perspective, as the company loses $1 billion a month on Reality Labs, the division leading its metaverse pivot. At least we can still get together and have a big laugh over that silly selfie Zuckerberg posted, even as the billionaire and aspiring mixed martial artist has already had his digital avatar thoroughly yassified.