Child porn is a serious problem, but I'm concerned that it will become a wedge for increased mandatory moderation of the Fediverse. Here are a few relevant excerpts from the paper:
"...bad actors tend to go to the platform with the most lax moderation and enforcement policies. This means that decentralized networks, in which some instances have limited resources or choose not to act, may struggle with detecting or mitigating Child Sexual Abuse Material (CSAM). Federation currently results in redundancies and inefficiencies that make it difficult to stem CSAM, Non-Consensual Intimate Imagery (NCII) and other noxious and illegal content and behavior. ...
Fediverse administrators are responsible not only for deciding what their users are allowed to post on their instance (content guidelines), but also for the content posted by users on remote servers that a local user follows. If a local user follows a remote user who posts illegal content, that content will be federated to the local server and potentially be displayed to users in their federated timeline, as well as stored on the server or media cache. The primary method of dealing with this is to defederate from servers with lax content moderation...
Apart from defederation, limiting exposure to illegal or harmful content is by and large left up to users themselves. ...
The ActivityPub specification does not provide any guidance for Direct Messages; in Mastodon, DMs are more akin to “posts with an audience of two”, and are readable by instance admins. Because of this, DMs on Mastodon are unlikely to be a primary channel for child exploitation-related activity. ...
...it is unreasonable to expect users to curate their own filter list of hashtags and keywords—particularly as hashtags and keywords related to CSAM change with high frequency. The ability to perform filtering of hashtags and keywords for discoverability at the server level would be more effective at this task. ...
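The server-level filtering the paper calls for could look roughly like the sketch below. This is my own illustration, not code from the paper or from Mastodon: the names (`BLOCKED_TAGS`, `is_discoverable`) are hypothetical, and a real implementation would pull its lists from a frequently updated admin-maintained source rather than hardcoded sets.

```python
# Hypothetical sketch of admin-maintained, server-level filtering of
# hashtags and keywords for discoverability surfaces (trends, search,
# federated timeline). All names are illustrative.

# In practice these would be updated frequently by admins, since the
# paper notes CSAM-related hashtags and keywords change rapidly.
BLOCKED_TAGS = {"example-banned-tag"}
BLOCKED_KEYWORDS = {"example banned phrase"}

def is_discoverable(post_text: str, hashtags: list[str]) -> bool:
    """Return False if the post should be hidden from discovery
    surfaces on this server."""
    if any(tag.lower() in BLOCKED_TAGS for tag in hashtags):
        return False
    text = post_text.lower()
    return not any(keyword in text for keyword in BLOCKED_KEYWORDS)
```

The point of doing this server-side is exactly what the excerpt argues: one admin-curated list protects every user on the instance, instead of each user maintaining their own.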
...individual use of PhotoDNA [Microsoft's hash-matching service for detecting known CSAM imagery] by every server in the Fediverse is likely to be prohibitive due to the potential for bursts of tens of thousands of near-simultaneous requests. However, given that the majority of Fediverse users reside on the top 10 or so servers, it would be prudent to integrate detection tooling into the codebase itself. For the front-end UI, all that would be necessary to make for easy integration is allowing admins to input their PhotoDNA and CyberTipline API keys...
On the backend, tooling would be needed to perform a request to PhotoDNA for each image ingested by the server, with a positive match triggering the content to be discarded and removed from any intermediary storage mechanism, as well as a report to NCMEC via PhotoDNA containing match details and basic user information. ...
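The backend flow described in that excerpt — scan on ingest, discard and report on a match — could be sketched as follows. This is a hypothetical illustration, not the paper's code: `photodna` and `storage` are stand-in objects, and the real PhotoDNA Match and Report APIs are HTTP services whose request and response shapes differ from this simplification.

```python
# Hypothetical per-image ingest flow, per the excerpt above.
# `photodna` and `storage` are illustrative stand-ins, not real clients.

from dataclasses import dataclass

@dataclass
class MatchResult:
    is_match: bool
    match_details: dict  # metadata the scanning service returns on a hit

def ingest_image(image_bytes: bytes, uploader: dict, photodna, storage) -> bool:
    """Scan an image on ingest; discard and report on a positive match.
    Returns True if the image was accepted for storage."""
    result: MatchResult = photodna.match(image_bytes)
    if result.is_match:
        # Remove from any intermediary storage mechanism (media cache, etc.)
        storage.discard(image_bytes)
        # File a report to NCMEC with match details and basic user info
        photodna.report_to_ncmec(
            match_details=result.match_details,
            user_info={"id": uploader["id"], "ip": uploader.get("ip")},
        )
        return False
    storage.persist(image_bytes)
    return True
```

Note this sketch does one synchronous request per image; the paper's scaling concern is precisely that thousands of small servers doing this independently would produce bursty, duplicated load — which motivates the attestation scheme that follows.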
...we propose an alternative system of attestation of analysis. In such a system, the server on which a post originates would submit imagery to PhotoDNA for analysis; as part of the response, the PhotoDNA service would include a cryptographic hash of the image along with a signature of that hash.
In the event of a match of known CSAM, the PhotoDNA Match API returns match metadata so that the originating server can make an automated report to NCMEC in a subsequent request to the PhotoDNA Report API. In the event of no match, the Match API would return a tracking ID along with the hash and signature. The post and imagery could be distributed to other servers, with an ActivityPub “update” event containing the hash of the image and signature, effectively attesting that the content has been scanned by PhotoDNA. ...
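The attestation idea boils down to: scan once at the originating server, then let receiving servers verify a signature instead of re-scanning. A minimal sketch of the verification side, under loud assumptions: all names here are hypothetical, and I use an HMAC over a SHA-256 digest as a stand-in for the asymmetric signature a real scanning service would produce (receiving servers would verify against the service's public key, not a shared secret).

```python
# Sketch of attestation-of-analysis. The HMAC shared key is a
# simplification standing in for the scanning service's real signature;
# all function names are hypothetical.

import hashlib
import hmac

SERVICE_KEY = b"demo-key"  # stand-in for the scanning service's signing key

def attest(image_bytes: bytes) -> tuple[str, str]:
    """What the scanning service might return alongside a no-match
    result: a hash of the image and a signature over that hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    signature = hmac.new(SERVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest, signature

def verify_attestation(image_bytes: bytes, digest: str, signature: str) -> bool:
    """What a receiving server might check (e.g. from an ActivityPub
    "update" event) before trusting federated media as already scanned."""
    if hashlib.sha256(image_bytes).hexdigest() != digest:
        return False  # image does not match the attested hash
    expected = hmac.new(SERVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

The design win is that verification is a local hash-and-signature check, so the expensive PhotoDNA call happens once per image rather than once per federating server.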
Counterintuitively, to enable the scaling of the Fediverse as a whole, some centralized components will be required, particularly in the area of child safety. Investment in one or more centralized clearinghouses for performing content scanning (as well as investment in moderation tooling) would be beneficial to the Fediverse as a whole. Given new commercial entrants into the Fediverse such as WordPress, Tumblr and Threads, we suggest collaboration among these parties to help bring the trust and safety benefits currently enjoyed by centralized platforms to the wider Fediverse ecosystem."