Moving Beyond the False Dichotomy for Federation Management

Federation on the fediverse doesn't have to be a binary choice between allowing everything and pre-approving your entire network. That's a false dichotomy.

This article is adapted from an unaccepted application for Harvard Applied Social Media Labs' Fellowship program.


At present, the most widely deployed open-source software for ActivityPub is Mastodon, which, like many Fediverse projects, implements federation management in a way that exposes service operators, moderators, and users to potentially significant harm: by default, it federates openly with anyone and everyone, unless federation is explicitly prevented.

Mastodon does support operating in a different mode, limited federation, in which it only federates with pre-approved servers. This significantly limits the usefulness of the software, though it does enable island networks to form.

The limited federation mode often runs counter to one of the Mastodon project's stated missions: “each Mastodon server is a completely independent entity, able to interoperate with others to form one global social network”. When this is paired with their other goals, such as “Built on open web protocols, Mastodon can speak with any other platform that implements ActivityPub. With one account you get access to a whole universe of social apps—the fediverse.”, we can start to see why federation is open by default.

The result of these policies is that other modes of networking are ignored or passed over, even whilst the network as a whole suffers from abusive and harmful actors. These actors must be dealt with on a server-by-server basis, which requires moderators to be aware of the full extent of the harms that exist out on the wide open web. For instance, moderators need to know that there are servers set up exclusively to publish CSAM (Child Sexual Abuse Material) and push it at random people, or about the number of servers that exist for trolling, troublemaking, harassment, or other offensive content.

🔥 In the Child Safety on Federated Social Media (2023, Thiel & DiResta) research paper, the researchers included significant parts of the wider network of federating services in their scans, which led to substantial findings of CSAM on the network, despite the mainstream network having widely defederated from those servers.

This can mean that many people's first experience of Mastodon is joining the network and then immediately facing harassment and abuse, simply because of who they are.

As noted in New Paradigms in Trust and Safety: Navigating Defederation on Decentralized Social Media Platforms by the Carnegie Endowment for International Peace, written by prominent trust and safety experts and technologists familiar with ActivityPub, one of the most infamous defederation events occurred when Gab, a right-leaning free speech platform, attempted to join the network. Gab was defederated on the basis that its policies did not align with those of a significant portion of Mastodon server operators: that is, it “use[s] the pretense of free speech absolutism as an excuse to platform racist and otherwise dehumanizing content”, as described by the Mastodon founder, Eugen Rochko.

However, we've been presented with a false dichotomy here: we don't need to operate in a world where we either allow everything until blocked, or block everything until allowed. There are other ways to look at federation management. These include consent-based federation, where a moderator or administrator has to review a server and approve or deny it before federation is accepted, and models that resemble a firewall on a computer or server: you don't just have to accept or deny federation, you can also apply filters to the federation between your server and another.
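To make the consent-based option concrete, here's a minimal sketch of what a first-contact flow could look like, written in TypeScript. Every name and type here is an illustrative assumption rather than part of any existing implementation: activity from an unknown server is held for moderator review instead of being automatically accepted or dropped.

```typescript
// A minimal sketch of consent-based federation (illustrative only).
// Activity from a domain with no decision yet is parked in a review
// queue rather than accepted (open federation) or dropped (allowlist).

type FederationDecision = "approved" | "denied" | "pending";

interface FederationRequest {
  domain: string;     // the remote server seeking federation
  firstSeenAt: Date;  // when we first received traffic from it
}

const decisions = new Map<string, FederationDecision>();
const pendingReviews = new Map<string, FederationRequest>();

// Decide what to do with an inbound activity from `domain`.
function onInboundActivity(domain: string): "deliver" | "hold" | "drop" {
  const decision = decisions.get(domain) ?? "pending";
  switch (decision) {
    case "approved":
      return "deliver";
    case "denied":
      return "drop";
    case "pending":
      if (!pendingReviews.has(domain)) {
        pendingReviews.set(domain, { domain, firstSeenAt: new Date() });
      }
      return "hold"; // queued until a moderator approves or denies
  }
}

// A moderator resolves the request after reviewing the server.
function review(domain: string, decision: "approved" | "denied"): void {
  decisions.set(domain, decision);
  pendingReviews.delete(domain);
}
```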

Whilst we cannot necessarily change Mastodon's mind, we can change the software to enable others to experiment with alternative federation modes. At present, Mastodon's federation management is controlled by two separate features: domain blocks and domain allows, only one of which is available depending on your “federation mode”. These two features involve completely separate database tables, completely separate user interfaces, and so on. There is a lot of code duplication between the two, and because it runs counter to the project's mission, domain allows ends up rather limited in functionality: for example, you can't allow federation with a server whilst demoting its content through silencing (where a server's content doesn't appear in shared timelines or trends, and approval of follow requests is mandatory).
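To illustrate that asymmetry, here's a rough TypeScript sketch of the shape of the two features. This is purely illustrative and not Mastodon's actual Rails schema: the block side carries a severity and options, whilst the allow side is a bare list entry with no room for nuance.

```typescript
// Illustrative shapes only; not Mastodon's actual database schema.

// Open federation mode: blocks carry a severity and extra options.
interface DomainBlock {
  domain: string;
  severity: "silence" | "suspend"; // demote content vs. sever entirely
  rejectMedia: boolean;            // refuse to fetch remote media
  publicComment?: string;          // reason shown to users
}

// Limited federation mode: an allow is just a domain, with no way to
// express "allow, but silence" or any other graduated response.
interface DomainAllow {
  domain: string;
}
```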

Instead of this duplicated and inflexible design, we can design a system that is much more flexible and enables downstream forks of Mastodon to experiment with alternative federation modes (such as consent-based federation). One way to achieve this flexibility is to borrow ideas from firewalls, where we have a choice of accepting, filtering, rejecting, or ignoring both inbound and outbound traffic.
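As a sketch of what that could look like (using hypothetical names, not an existing Mastodon API), a single policy record per domain could carry independent inbound and outbound actions, with the server-wide default action determining the overall federation mode:

```typescript
// A unified, firewall-style federation policy (hypothetical names).
// One record type replaces both domain blocks and domain allows.

type PolicyAction =
  | "accept"  // federate normally
  | "filter"  // federate, but demote or limit content (e.g. silencing)
  | "reject"  // refuse traffic, informing the other server
  | "ignore"; // silently drop traffic

interface FederationPolicy {
  domain: string;
  inbound: PolicyAction;  // activities arriving from the domain
  outbound: PolicyAction; // activities we deliver to the domain
  comment?: string;
}

// The default action when no policy matches determines the mode:
// "accept" yields open federation, "ignore" (or holding for review,
// as in the consent-based sketch above) yields limited federation.
function resolve(
  policies: Map<string, FederationPolicy>,
  domain: string,
  defaultAction: PolicyAction,
): FederationPolicy {
  return (
    policies.get(domain) ?? {
      domain,
      inbound: defaultAction,
      outbound: defaultAction,
    }
  );
}
```

The same data model then supports open federation, limited federation, and everything in between, without a separate code path for each mode.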

By moving to this more flexible model for federation management, we remove the artificial barrier to exploring alternative modes of federation in Mastodon. By combining the two distinct features of domain blocks and domain allows into a single federation management feature, we do away with the duplication and technical debt they incur.

When I designed FIRES, I explicitly designed it around this firewall-based approach, and also designed it to be recommendation-based rather than prescribing the actions that must be taken (as denylists currently do). This puts service operators in charge of how they federate with the Fediverse, whilst still providing information from trusted data providers to help them keep their communities safe.
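As a purely hypothetical illustration of that difference (this is not the actual FIRES data format), a recommendation could carry a suggested action together with its provenance, and each operator would decide whether to map it onto their own local policy. This sketch reuses the FederationPolicy type from the previous example.

```typescript
// Hypothetical recommendation record; not the actual FIRES format.
interface FederationRecommendation {
  domain: string;
  recommendedAction: "filter" | "reject" | "ignore"; // advice, not a command
  reason: string;   // why the provider recommends this
  provider: string; // identity of the data provider issuing it
  issuedAt: Date;
}

// Operators opt in per provider; nothing is applied automatically.
function applyIfTrusted(
  rec: FederationRecommendation,
  trustedProviders: Set<string>,
  policies: Map<string, FederationPolicy>,
): void {
  if (!trustedProviders.has(rec.provider)) return;
  policies.set(rec.domain, {
    domain: rec.domain,
    inbound: rec.recommendedAction,
    outbound: rec.recommendedAction,
    comment: `${rec.provider}: ${rec.reason}`,
  });
}
```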


If you'd like to support my work on trust and safety in the Fediverse, please do via https://support.thisismissem.social — every little bit helps make my work more sustainable.