I recently had the opportunity to attend the Senate's combined hearing by its Judiciary and Commerce Committees, during which Facebook owner and creator Mark Zuckerberg fielded some five hours of questions.
The clear concern of the roughly 42 senators was user privacy and the protection of personal information in the wake of the Cambridge Analytica scandal, in which the personal information of some 87 million Facebook users was improperly taken and used for unauthorized political purposes during the 2016 presidential election.
While some Republican senators drew a comparison to the voter targeting done during the Obama presidential campaigns, the difference here is the deliberate flood of misinformation spread by Cambridge Analytica during the 2016 Trump campaign.
Certainly user privacy and the protection of personal data are a concern of major proportions. Facebook is an enormous company, with over $40 billion in annual revenue, more than 25,000 employees and more than 2 billion monthly active users.
However, as I listened to the testimony I couldn't help but think that the true overarching issue is more than privacy; it is responsibility. What is the responsibility of platform providers to manage the content of those platforms?
Mark Zuckerberg readily admitted to the Senate panel that he takes full responsibility for, in a sense, policing the content of Facebook and shoulders the blame for not having done a better job of preventing misuse of user information. He testified that he is in the process of instituting controls to better oversee and monitor instances of hate speech and its removal from the site. The challenge involved in this effort, he pointed out, lies in the nuances of language.
He indicated he is totally committed to monitoring any access of large amounts of data and to using comprehensive audits to ensure proper use of the information accessed. This step, he admitted, should have been taken with regard to Cambridge Analytica and would likely have prevented that scandal. Zuckerberg also indicated he is not against some regulation and supports the bipartisan Honest Ads Act, sponsored by Senators Klobuchar, McCain and Warner, which would require verification of advertisers running political ads.
Encouraging. We now have the head of what is likely the largest social media platform in the world declaring that he and his company are, indeed, responsible for the content on that platform. We are making progress.
But what about other platform providers? What responsibility do they have to manage the content of their sites? Take, for example, Armslist.com, an internet site devoted to bringing together individuals looking to purchase firearms with those seeking to sell them.
“Engaging in the business of selling guns” is the specific language from the Bureau of Alcohol, Tobacco, Firearms and Explosives that determines whether a seller is required to have an FFL, or federal firearms license. A sale by a federally licensed dealer carries with it the requirement of a background check of the prospective buyer through NICS, the National Instant Criminal Background Check System.
By some estimates, forty percent of gun sales are conducted at gun shows and over the internet, and are done without background checks. If any of the sales transacted through Armslist.com turn out to be connected in any way to a mass shooting, for example, what responsibility, if any, should Armslist.com bear?
So, regardless of Mark Zuckerberg's testimony and his commitment to take responsibility for the content on Facebook more seriously than before, the question remains for all other information-exchanging platforms: what is their responsibility for content? Is managing a platform such as Armslist.com or Craigslist or Twitter or Google any different from managing a mammoth social media platform like Facebook? If so, why?
It seems to me that providing a platform, whether to share cat photos or to purchase firearms, carries with it the responsibility to ensure the information exchanged via that platform is not misused, and that establishing protocols to ensure proper use falls to the platform provider in every case.