Over 18 months, the group has written 14 papers (some of which I’ve mentioned here in EiM) on everything from artificial intelligence to intermediary liability, and from accountability solutions to existing legislation. Last week, it added to that bank of research by publishing its final report: Freedom and Accountability: A Transatlantic Framework for Moderating Speech Online.
The report, to give you an idea, calls for greater transparency and accountability from the dominant digital platforms and puts forward what I believe is a sensible, flexible framework for moderation based on five components:
- Regulate on the basis of transparency
- Establish an accountability regime to hold platforms to their promises
- Create a three-tier disclosure structure
- Provide efficient and effective redress mechanisms
- Use an ABC (actors, behaviour, content) framework to combat viral deception, or disinformation
(The full report is readable if you can spare 30 minutes, or you can watch an hour-long presentation of the findings here.)
I wanted to find out more about the recommendations and their possible application in the real world, so I asked Jeff Jarvis, professor at the Craig Newmark Graduate School of Journalism at CUNY and a member of the TWG, to answer some questions via email.
Here’s what he said:
Q The report makes a lot of the lack of trust among tech companies, the public, and government. Why do you think that mistrust exists and when did it start?
To be clear, I speak only for myself, not for the Working Group. Others would disagree, but in my view, what we are seeing is a burgeoning moral panic that arises out of worry about change. The tech companies were too optimistic about human behaviour and did not sufficiently guard against manipulation; they also were far too opaque about their inner workings and haughty atop that. Media and politics – institutions themselves threatened by the change of the net – joined together in attempts at protectionism for their past.
Q Transparency — one of the report’s major recommendations — arguably runs counter to everything about the dominant digital platforms. How likely is it that the likes of Facebook and YouTube will change their ways?
Twitter is fairly public, releasing data on misinformation campaigns for researchers. Facebook has, in fits and starts, tried to release data (see Social Science One) but trips over GDPR, Cambridge Analytica, and its own culture. Following the recommendations of the Working Group, more work is needed to examine in greater detail what transparency is needed and why: to study the impact of the social networks on society, to assess the impact of regulation to date on the social networks and on the public conversation, and to hold the companies accountable for doing what they say they will do. In the long run, transparency will help both companies and governments regain trust.
Q The report recommends a regulator to oversee standards and implement frameworks. What kind of body is best placed to act as that regulator across both the United States and Europe (and beyond)?
Again, I speak for myself here, but personally I do not presume to start with regulatory agencies. As for the group, it is not recommending an extra-governmental, international regulator. It is up to each government (nation and the EU) to decide whether a regulator is needed, whether that regulator could be an existing body (e.g., Ofcom in the UK, the FTC in the US), or whether it should be a new one.
Q The current debate pitches the respective policies of the social media platforms against one another, which distracts from the wider discussion on process and regulation. To what extent would it be beneficial if the tech companies agreed on a common set of community guidelines and terms of service?
The Working Group proposes a flexible framework that specifically enables companies and communities to establish their own standards, opposing one-size-fits-all rules, and also enabling an ongoing multi-stakeholder discussion — among technology companies, governments, researchers, civil society, and users themselves — as new challenges, such as a pandemic, arise.
Q The report says TWG ‘did not seek unanimity on every conclusion or recommendation’. But which part saw the greatest disagreement and why do you think that was?
The discussions in the Working Group were productive, collaborative, and nuanced, informed by research. When former FCC Commissioner Susan Ness convened the Working Group, I was frankly unsure whether such a disparate set of experts from various sectors, nations, and interests could reach an agreement, and so I am impressed with what the Group accomplished. I would not say there was any pattern of disagreement.
Q What reaction has there been from the tech companies to the report so far and which organisations do you expect to be first to implement some of the recommendations?
In discussions with various technology companies, they have focused on the specifics of transparency. As I said, more work is needed to ask and answer what transparency is needed for what purpose and to be diligent about coming to common definitions (for example, what does it mean to “promote” or “demote” a piece of content when countless decisions are made about it by algorithms and users?). This is not about transparency for transparency’s sake but about providing evidence and research to inform decisions by companies and governments; that could be a productive and collaborative discussion.
Thanks to Jeff for taking the time to answer my questions. Who else should I reach out to for a Q&A? Let me know.