Getting free from the slop machine

 A photo of comedian and performer Tim Minchin. Superimposed text says: We understand as humans that proximity is the salve of prejudice... We thought that social media would make us more proximate and make us a better democracy.... but it has turned out that the algorithms that we have allowed into the world do the opposite. They are de-humanisation mechanisms.
I read some misguided comments Tim Minchin made about social media. He conflated algorithms with all social media, and claimed they’re intrinsically dehumanising.

He’s half right. Manipulation by recommendation engines and algorithmic oppression are real and present threats that are changing our relationships and our politics.

But social media - interactive technologies that facilitate the networked sharing of information - can be emancipatory. It can expand our worlds in real ways. That sounds Pollyanna-ish until you recall that we’ve experienced that potential ourselves. Made new friends. Seen things in new ways.

It wasn’t always this way

One of my kids was recently quizzing me about Mastodon, and he pulled me up when I said I like the chronological feed.

“What does that mean? If there’s no algorithm, how do they know what you want to see?”

I realised he’s never known a chronological social media feed. Twitter, Instagram and Facebook abandoned theirs before he was born. The idea that all I see are the accounts I follow, and that I see everything they post, in the order they post, broke his brain a bit. He didn’t like it.
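
For anyone else who has forgotten what that looks like, a chronological feed is about the simplest ranking rule there is: filter to the accounts you follow, sort by time, done. Here’s a rough sketch in Python; the Post class and the account names are invented for illustration, not any platform’s real data model.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime

def chronological_feed(posts, following):
    """Everything from the accounts you follow, newest first.
    No engagement scores, no 'you might also like'."""
    return sorted(
        (p for p in posts if p.author in following),
        key=lambda p: p.posted_at,
        reverse=True,
    )

# Example: two followed accounts, plus the kind of stranger an algorithm would inject.
posts = [
    Post("friend_a", "Sunset from the balcony", datetime(2025, 3, 1, 18, 0, tzinfo=timezone.utc)),
    Post("brand_x", "You won't BELIEVE this one trick", datetime(2025, 3, 1, 19, 0, tzinfo=timezone.utc)),
    Post("friend_b", "New bread recipe worked!", datetime(2025, 3, 1, 20, 0, tzinfo=timezone.utc)),
]
for post in chronological_feed(posts, following={"friend_a", "friend_b"}):
    print(post.posted_at.isoformat(), post.author, post.text)
```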

“You might also like” this slop

But the recommendation algorithms aren’t working like they used to either (if they ever really worked at all). Instagram is allowing you to “reset” your recommendation algorithm. Even though they have hundreds of thousands of data points about you, they’re recommending things you’ve long since forgotten. Or worse, things you wish you could forget.

Beyond blocking or unfollowing accounts, there aren’t many ways for us as users on algorithm-driven platforms to control the slop we’re fed. In fact most platforms are doubling down on the slop, serving up content from accounts you never followed in an effort to drive up fake engagement metrics. I don’t know if you’ve checked Facebook lately, but the dead internet is real and your relatives are trapped in it.

That highlights a tension we all experience. We know our privacy is being invaded and our data linked by these recommendation algorithms. We don’t like the idea that we’re living in filter bubbles where something else decides what we see, but we can’t imagine an alternative.

What instead?

This also made me consider a major upside of the fediverse that we don’t promote enough: the best recommendation algorithm is another human, and on the fediverse that’s all we have.¹ There are still human-driven ways to encounter novelty, and to escape the flattened world served up to us by algorithms.

These are some of the ones I use:²,³

  • For academic reading: The Syllabus is expensive, but it offers genuine insights into different fields. It’s expanded my reading and thinking.
  • For links: Metafilter is still the most venerable and reliable source of web-based surprises.
  • For films: Letterboxd has a “films like this” algorithm, but that’s about the extent of the recommendations you’ll find. It’s still what your friends watch, listed in the order they watched them.
  • For photos: Pixelfed or even classico Flickr will make you happier than Instagram.
  • For fiction: BookWyrm eclipses the bizarro, astroturfed morass of snark and spin that is Goodreads. Real humans talking about the books they’ve read, and they’re on the fediverse too.
  • Feeds: Good old RSS feeds. These can be hard to get started with, but people like Molly White have shared the feeds they subscribe to, which you can treat like starter packs (and remember that podcasting is basically audio RSS). There’s a rough sketch of what a feed reader actually does just below this list.
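
If the “hard to get started with” part is what puts you off, it helps to see how little a feed reader actually does. Here’s a rough sketch using only the Python standard library; subscriptions.opml is a placeholder for whatever starter pack you download, and it assumes plain RSS 2.0 feeds rather than Atom.

```python
import urllib.request
import xml.etree.ElementTree as ET

def feeds_from_opml(path):
    """Pull feed URLs out of an OPML 'starter pack' export."""
    tree = ET.parse(path)
    return [
        outline.get("xmlUrl")
        for outline in tree.iter("outline")
        if outline.get("xmlUrl")
    ]

def latest_items(feed_url, limit=5):
    """Fetch an RSS 2.0 feed and return (title, link) for the newest items."""
    with urllib.request.urlopen(feed_url) as response:
        root = ET.fromstring(response.read())
    items = root.findall("./channel/item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

if __name__ == "__main__":
    # 'subscriptions.opml' stands in for whatever feed list you import.
    for url in feeds_from_opml("subscriptions.opml"):
        for title, link in latest_items(url):
            print(f"{title}\n  {link}")
```

Real feed readers do roughly this, plus caching, Atom support and a nicer interface.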

These options often have a cost in money or time, essentially because they rely on human work instead of harvesting your data and mining your attention. If you look at Facebook, TikTok or YouTube, you can see that what appears to be free still has a cost.

A slop-free existence is possible, but it takes a bit of work.

  1. Yes, I know there are bot accounts, but you have to follow them.
  2. You’ll notice that Bluesky is nowhere on the list. That’s because even though it’s not the latter-day 4chan that X-the-everything-app has become, it still relies on graph neural network recommender engines. It’s still slop. The ingredients just aren’t as bad - yet.
  3. Social video is tough and not something I know enough about. I’m not a big viewer of YouTube-like platforms. Peertube would be worth trying, but I don’t use it much. YMMV.


The Lennox Street bridge underpass is an example of the old local government planning proverb: revenge is a dish best served via permanent signage.

The Lennox Street Bridge pedestrian underpass in Parramatta. It's a shaded tunnel in a sandstone bridge. A large sign shows a white woman, former Sydney Morning Herald columnist and failed state political candidate Elizabeth Farrelly. Superimposed on her is the text: "Two brutish square-jawed faux-functional holes whose every semiotic cue says, 'enter here and invite Clockwork Orange-style violence upon your person' will increase the heritage value how, exactly? This is a clear instance of architectural theory running roughshod over the facts."





This is wonderful. Unearthing video snapshots of life between 2009 and 2012.

Dancing, toddlers walking, soccer games, holidays, video messages. Surprisingly sweet insights into the recent past. via MeFi



Data collated by the tertiary education union has revealed there are 306 university executives who earn more than their state’s premier.

Interesting table to look over


I love Law & Order and I heard Noam Chomsky is a fan too. I tried to find a source - according to this 2003 profile, he is:

When Chomsky and Carol are in Cambridge, they usually watch an hour of television at night—“Law & Order” or some other cop show. Carol makes sure that they go to bed right afterward, and they wake up around eight.





An interesting project that imagines New York 20 years from now, drawing on residents' expertise to picture how it could turn out.

I worked with “informed optimism,” which means you are basically working with many of the same institutions, systems, and tendencies that exist today, and understanding how there could be the best possible scenario. When I interviewed people for New York 2044, I asked them if they could use the lens of informed optimism to imagine their scenarios.

News from Home - Urban Omnibus

Cover of New York 2044, a busy-looking zine/newspaper with headlines about New York housing

On the ongoing delegation of military judgement to autonomous systems - to ultimately kill humans

The use of AI to create novel text, images, and video will likely exacerbate the challenge of cognitive warfare. This form of non-lethal warfare explains the social engineering of adversaries’ beliefs, with the overall intent to affect their defense priorities, military readiness, and operations. In this way, countries will attempt to harness AI to produce misinformation and disinformation, which are designed to mislead and deceive opponents, and across the competition continuum ranging from peace to war.

A new military-industrial complex: How tech bros are hyping AI’s role in war - Bulletin of the Atomic Scientists


Some reasonable points here, and a challenge for us is to be open to the message!

Mastodon’s structure and culture of openness present opportunities to avoid many of the epistemic perils of biased and untrustworthy large corporate platforms. However, Mastodon’s risks include techno-elitism, white ignorance, and isolated, epistemically toxic communities.

Beyond Corporate Social Media Platforms: The Epistemic Promises and Perils of Alternative Social Media - @Lakephil@mastodon.social