strypey,
@strypey@mastodon.nzoss.nz

"[SubStack] began encouraging individual writers to recommend one another, funneling tens of thousands of subscribers to like-minded people. It started to send out an algorithmically ranked digest of potentially interesting posts to anyone with a Substack account, showcasing new voices from across the network. And in April of this year, the company launched Notes, a text-based social network resembling Twitter that surfaces posts in a ranked feed."

, 2023

https://platformer.substack.com/p/why-substack-is-at-a-crossroads

strypey,
@strypey@mastodon.nzoss.nz

This is the strongest part of Platformer's criticism of SubStack. One can argue that hosting writing most people find unsavoury, and even making money from doing so, is desirable, at least in democracies that value protecting minorities from the tyranny of the majority. But recent history has taught us that showing people unsavoury writing they're not even looking for is a recipe for trouble.

I don't want people coming to a platform to read my work, then being told 'you might also like Nazis!'

(1/4)

strypey,
@strypey@mastodon.nzoss.nz

The key question is, are SubStack actually doing anything like that? Or are Platformer and others overstating the reach of their author promotion tools, to shoehorn SubStack into a playbook designed for DataFarms that live and die by their engagement-driving algorithms?

Casey describes Notes as;

"a text-based social network resembling Twitter that surfaces posts in a ranked feed."

But it seems like this significantly misrepresents the way Notes works.

https://support.substack.com/hc/en-us/articles/14564821756308-Getting-started-on-Substack-Notes

(2/4)

strypey,
@strypey@mastodon.nzoss.nz

Unless SubStack's website is lying about who can receive Notes from whom, Casey is wrong about Notes as a vector for exposing readers to discourse they're not looking for (note I'm assuming good faith on his part here). He could also be wrong about the virality of;

"encouraging individual writers to recommend one another"

... or ...

"an algorithmically ranked digest of potentially interesting posts"

If so, this blows a Titanic-sized hole in his critique of the platform's moderation policies.

(3/4)

strypey,
@strypey@mastodon.nzoss.nz

Recently I've been watching a documentary about Juul. It seems like they started out with a genuine mission of replacing a dangerous product (cigarettes) with an equally satisfying but much less dangerous one (nicotine vapes). But due to a combination of investor pressure and unwisely following a standard marketing playbook that didn't suit their situation, they ended up a pariah, with their nominal enemies (Big Tobacco) among their owners.

Maybe there are some lessons for SubStack here.

(4/4)

strypey,
@strypey@mastodon.nzoss.nz

"Extremists on Facebook, Twitter, and YouTube for the most part had been posting for clout: those platforms made it difficult or even impossible for them to monetize their audiences."

, 2023

https://platformer.substack.com/p/why-substack-is-at-a-crossroads

Casey, you know you put YouTube in that list, right? This one;

https://www.counterextremism.com/press/extremist-content-online-youtube-permits-monetization-neo-nazi-video

I'm assuming good faith with all my might here, but you're not making it easy.

strypey,
@strypey@mastodon.nzoss.nz

"In three years on Substack, I’ve been recommended plenty of boring posts, but no openly Nazi ones. My experience of them has been unobjectionable.

... It was recommendations on Twitter, Facebook, and YouTube that helped turn Alex Jones from a fringe conspiracy theorist into a juggernaut that could terrorize families out of their homes."

, 2023

https://platformer.substack.com/p/why-substack-is-at-a-crossroads

So you've seen that recommendations work differently (and better) on SubStack? Why bring up old, worse examples?

strypey,
@strypey@mastodon.nzoss.nz

"The moment a platform begins to recommend content is the moment it can no longer claim to be simple software."

, 2023

https://platformer.substack.com/p/why-substack-is-at-a-crossroads

It would be obviously wrong to claim any hosted platform is "simple software". That was never their argument. Neither did they claim to be a "common carrier", although that's much closer. Surely even telcos have minimal TOS that forbid things like incitement to violence?

strypey, (edited)
@strypey@mastodon.nzoss.nz

"If it won’t remove the Nazis, why should we expect the platform to remove any other harm?"

, 2023
https://platformer.substack.com/p/why-substack-is-at-a-crossroads

... and here it is.

As Casey openly admitted in his first piece on the subject, it was never really about "Nazis". Demanding that SubStack boot "Nazi" publications was always about establishing a precedent for banning any other speech that Casey and his fellow travellers deem a "harm".

Classic Four Horsemen of the Infocalypse tactics.

strypey,
@strypey@mastodon.nzoss.nz

"Some of our... customers are people who work in tech policy, content moderation, and trust and safety. They’ve spent years doing the work, making the hard calls, and cleaning up the internet for all of our mutual benefit. It’s only natural that they would resist spending money on a platform that spurns their profession in this way."

, 2023

https://platformer.substack.com/p/why-substack-is-at-a-crossroads

Pulling out of a platform that might reduce the need for your services doesn't sound as noble as you think it does.

strypey,
@strypey@mastodon.nzoss.nz

I bet animal farmers stop buying burgers at Lord of the Fries when they realise it's vegan. That's not principled self-sacrifice; it's ruthless professional self-interest.

strypey, (edited)
@strypey@mastodon.nzoss.nz

Casey Newton, sniping once again at SubStack and others who defend free expression:

"... platforms that have rejected calls to actively moderate content have created a means for bad actors to organize, create harmful content, and distribute it at scale. In particular, researchers now have repeatedly observed a pipeline between the messaging app Telegram and X, where harmful campaigns are organized and created on the former and then distributed on the latter."

https://www.platformer.news/taylor-swift-deepfake-nudes-x/

(1/?)

strypey, (edited)
@strypey@mastodon.nzoss.nz

When read in this context, it becomes undeniable that the weasel words "actively moderate content" are a euphemism for actively preventing anyone the dominant ideology considers a "Bad Actor" from expressing their ideas or communicating with others online. For as long as these "Bad Actors" are actual fascists, or other people most of us are disgusted by, it's easy to ignore this normalisation of mass censorship. Just as most people have ignored the normalisation of mass surveillance.

(2/?)

strypey,
@strypey@mastodon.nzoss.nz

"I'd say there's little chance of that, given that Telegram won't even disallow the trading of child sexual abuse material."

, 2023

https://www.platformer.news/taylor-swift-deepfake-nudes-x/

The same accusation has been leveled at Mastodon;

https://www.secjuice.com/mastodon-child-porn-pedophiles/

... and must be taken with a grain of salt. The 'think of the children' member of the Four Horsemen of the Infocalypse rides again.

(3/?)

strypey,
@strypey@mastodon.nzoss.nz

"In any case, with each passing day it becomes clear that Telegram, which has more than 700 million monthly users, deserves as much scrutiny as any other major social platform — and possibly more."

, 2023

https://www.platformer.news/taylor-swift-deepfake-nudes-x/

Make no mistake, this is a call to "actively moderate content" in private communications channels. If the Newtonites get their way, anyone using encryption to protect the privacy of their communications will be accused of having something to hide.

(4/?)

strypey,
@strypey@mastodon.nzoss.nz

Because if you read even the abstract of the page Casey gives as a reference for the claim;

"Telegram won't even disallow the trading of child sexual abuse material."

... what it actually says is;

"Telegram implicitly allows the trading of CSAM in private channels."

In other words, they don't spy on the private communications of people using their service to make sure they're not trading CSAM. Not the same. As with his comments on SubStack Notes, Casey gets this stuff wrong a lot.

(5/?)

strypey,
@strypey@mastodon.nzoss.nz

To assume good faith at this point, it's necessary to believe that Casey is not well informed about the philosophical disagreements or the facts involved in freedom of expression issues, and doesn't do the work required to make his articles on the subject fair and balanced. So he's vulnerable to getting carried along by the force of moral panics. This is a serious failing for any journalist.

FWIW I do my best to be accurate and fair, but I'm an activist, not a journalist.

(6/6)

tangleofwires,
@tangleofwires@mas.to

@strypey While I'm wary of extrapolating my own online experiences to represent the wider reality, I don't see Substack's Notes working (either in theory or practice) as a social network. With low use, a limited feature set, and inaccurate topic recommendations, there's little reason to spend more than a minute scrolling through it.

strypey,
@strypey@mastodon.nzoss.nz

@tangleofwires
> I don't see Substack's Notes working (either in theory or practice) as a social network

Exactly. It's neither intended to be one, nor designed to work like one. My understanding is that it's basically an extended comment space for the community that forms around a publication, independent of individual articles.
