The problem:

The web has obviously reached a high level of #enshittification. Paywalls, exclusive walled gardens, #Cloudflare, popups, CAPTCHAs, Tor blockades, dark patterns (esp. w/cookies), JavaScript that turns the website into an app (not a doc), etc.

Status quo solution (failure):

#Lemmy & the #threadiverse were designed to inherently trust humans to only post links to non-shit websites, and to only upvote content that has no links or links to non-shit venues.

It’s not working. The social approach is a systemic failure.

The fix:

  • stage 1 (metrics collection): There need to be enshittification metrics for every link. Readers should be able to click a “this link is shit” button on a per-link basis, with tick boxes to indicate the particular variety of shit that it is.

  • stage 2 (metrics usage): If many links with the same hostname show a pattern of matching enshittification factors, the Lemmy server should automatically tag all those links with a warning of some kind (e.g. ⚠, 💩, 🌩).

  • stage 3 (inclusive alternative): A replacement link to a mirror is offered. E.g. youtube → (non-CF’d invidious instance), cloudflare → archive.org, medium.com → (random scribe.rip instance), etc.

  • stage 4 (onsite archive): Good Samaritans and over-achievers should have the option to provide the full text for a given link so others can read the article without even fighting the site.

  • stage 5 (search reranking): whenever a human posts a link and talks about it, search crawlers notice and give that site a high ranking. This is why search results have gotten lousy: the social approach has failed, and humans will post bad links. So links with a high enshittification score need to be obfuscated in some way (e.g. dots become asterisks) so search crawlers don’t overrate them going forward.
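Taken together, stages 1–3 and 5 could be sketched server-side roughly as follows. This is a minimal Python sketch; the report threshold, the mirror table, and all function names are hypothetical illustrations, not Lemmy’s actual API:

```python
from collections import Counter
from urllib.parse import urlparse

# Stage 1 input: (url, complaint) pairs from "this link is shit" clicks.
REPORT_THRESHOLD = 5  # hypothetical cutoff before a hostname gets auto-tagged

# Stage 3: hypothetical mirror table mapping bad hostnames to friendlier ones.
MIRRORS = {
    "medium.com": "scribe.rip",
    "www.youtube.com": "invidious.example",  # placeholder instance
}

def aggregate(reports):
    """Stages 1+2: count complaints per hostname; tag hosts over the threshold."""
    counts = Counter(urlparse(url).hostname for url, _complaint in reports)
    return {host: "\u26a0" for host, n in counts.items() if n >= REPORT_THRESHOLD}

def mirror(url):
    """Stage 3: offer a replacement link when a mirror is known."""
    parts = urlparse(url)
    host = MIRRORS.get(parts.hostname)
    return parts._replace(netloc=host).geturl() if host else url

def obfuscate(url):
    """Stage 5: break the URL (dots become asterisks) so crawlers don't
    count the post as an endorsement of the site."""
    return url.replace(".", "*")
```

For example, five “paywall” reports against `medium.com` links would be enough to auto-tag that hostname, and `mirror()` would then offer the scribe.rip rendering of each such link.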

This needs to be recognized as a #LemmyBug.

  • activistPnk (OP) · 9 months ago

    > Have moderation. Problem solved.

    You’re suggesting that humans do the work of a machine.

    Machines can automatically detect links that are exclusive in a variety of ways. If you want a human to do that work, they will have to use a machine for the detection anyway, so it’s an unnecessary labor burden when moderators are overtasked as it is. Some of the tagging requires humans, but putting that whole effort on mods is a recipe for disaster for the same reason: mods don’t have time to follow every link and tag it. Users do.
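As an illustration of what such machine detection could look like, a bot might classify a link from its HTTP status code and response headers alone. A rough sketch, assuming invented tag names (the header names and status codes are real, but real detection would also inspect the page body for paywall scripts and CAPTCHA markup):

```python
def classify_response(status, headers):
    """Guess exclusivity tags from an HTTP status code and response headers.

    Returns a list of hypothetical tag strings.
    """
    headers = {k.lower(): v.lower() for k, v in headers.items()}
    tags = []
    if headers.get("server", "").startswith("cloudflare"):
        tags.append("cloudflare")      # site fronted by Cloudflare
    if status in (401, 402):
        tags.append("paywall")         # auth or payment required
    if status == 403:
        tags.append("blocked")         # common response to Tor exit nodes
    return tags
```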

    The manual moderation approach also fails because (as others have pointed out) some people want to see paywalls. How does a moderator remove a link for some viewers but not others? That can only be done on the client side, by the client acting on tags.
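Client-side handling of those tags could be as simple as the following sketch, where the tag names and per-user preference actions are hypothetical:

```python
def render_link(url, tags, prefs):
    """Decide how a client shows a tagged link for one user.

    prefs maps a tag to an action: "hide", "warn", or "show" (the default).
    """
    for tag in tags:
        action = prefs.get(tag, "show")
        if action == "hide":
            return None                # suppress the link entirely
        if action == "warn":
            return f"\u26a0 {url}"     # show it with a warning marker
    return url
```

A user who wants to see paywalled links simply leaves the `paywall` tag at its default, while another user maps it to `hide`; the server removes nothing.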

    > Moderators will remove all which you dislike.

    You’ve misunderstood the report. Removal is not proposed.