Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 19 May 2024

Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post, there’s no quota for posting and the bar really isn’t that high

The post-Xitter web has spawned soo many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

mii,

In case you didn’t know, Sam “the Man” Altman is deadass the coolest motherfucker around. With world leaders on speed dial and balls of steel, he’s here to kick ass and drink milkshakes.

Within a day of his ousting, Altman said he received 10 to 20 texts from presidents and prime ministers around the world. At the time, it felt “very normal,” and he responded to the messages and thanked the leaders without feeling fazed.

Archive link.

earthquake,

“10 to 20” so, 10. How many of the messages were just from Nayib Bukele and Javier Milei?

fasterandworse,

just want to share my article from this week. It’s about products that account for their lack of usefulness with ease of use. Using the gpt-4oh as an example. fasterandworse.com/known-purpose-and-trusted-pote…

ebu,

If the house doesn’t have a roof, don’t paint the walls.

i adore this line. because yeah, what i see the rest of the tech industry doing is either:

  • scrambling to erect their own, faster, better, cheaper roofless house
  • scrambling to sell furniture and utilities for the people who are definitely, inevitably going to move in
  • or making a ton of bank by selling resources to the first two groups

without even stopping to ask: why would anyone want to live here?

fasterandworse,

thanks! I think I began saying that when I moved from digital marketing agencies to startups around 2011

froztbyte,

apparently oai has lifetime NDAs, via fasterthanlime[0]:

https://awful.systems/pictrs/image/0c9415c6-b0ab-4e52-a98a-456767ff1a53.png

[0] - technically via friend who sent me a screenshot, but y’know

froztbyte,

(the tweet is here, can’t find a working nitter ritenao)

fasterandworse,

I can’t imagine signing that - unless I was being paid a lot of money

froztbyte,

keep in mind that the company heavily pre-filters for believers. that means that you have a whole set of other decision-influence things going on too, including not thinking much about this

(and then probably also the general SFBA vibe of getting people before they have any outside experiences, and know what is/is not sane)

fasterandworse,

oh that reminds me of Anthropic’s company values page where they call it “unusually high trust” to believe that their employees work there in “good faith”

Unusually high trust

froztbyte,

…that’s going to be some ratsphere fucking bullshit, isn’t it

saucerwizard, (edited )

x.com/soniajoseph_/status/1791604177581310234?s=4…

David, something is cooking behind the scenes here.

blakestacey,

Hmm, a xitter link, I guess I’ll take a moment to open that in a private tab in case it’s passingly amusing…

To the journalists contacting me about the AGI consensual non-consensual (cnc) sex parties—

OK, you have my attention now.

To the journalists contacting me about the AGI consensual non-consensual (cnc) sex parties—

During my twenties in Silicon Valley, I ran among elite tech/AI circles through the community house scene. I have seen some troubling things around social circles of early OpenAI employees, their friends, and adjacent entrepreneurs, which I have not previously spoken about publicly.

It is not my place to speak as to why Jan Leike and the superalignment team resigned. I have no idea why and cannot make any claims. However, I do believe my cultural observations of the SF AI scene are more broadly relevant to the AI industry.

I don’t think events like the consensual non-consensual (cnc) sex parties and heavy LSD use of some elite AI researchers have been good for women. They create a climate that can be very bad for female AI researchers, with broader implications relevant to X-risk and AGI safety. I believe they are somewhat emblematic of broader problems: a coercive climate that normalizes recklessness and crossing boundaries, which we are seeing playing out more broadly in the industry today. Move fast and break things, applied to people.

There is nothing wrong imo with sex parties and heavy LSD use in theory, but combined with the shadow of 100B+ interest groups, leads to some of the most coercive and fucked up social dynamics that I have ever seen. The climate was like a fratty LSD version of 2008 Wall Street bankers, which bodes ill for AI safety.

Women are like canaries in the coal mine. They are often the first to realize that something has gone horribly wrong, and to smell the cultural carbon monoxide in the air. For many women, Silicon Valley can be like Westworld, where violence is pay-to-play.

I have seen people repeatedly get shut down for pointing out these problems. Once, when trying to point out these problems, I had three OpenAI and Anthropic researchers debate whether I was mentally ill on a Google document. I have no history of mental illness; and this incident stuck with me as an example of blindspots/groupthink.

I am not writing this on the behalf of any interest group. Historically, much of OpenAI-adjacent shenanigans has been blamed on groups with weaker PR teams, like Effective Altruism and rationalists. I actually feel bad for the latter two groups for taking so many undeserved hits. There are good and bad apples in every faction. There are so many brilliant, kind, amazing people at OpenAI, and there are so many brilliant, kind, and amazing people in Anthropic/EA/Google/[insert whatever group]. I’m agnostic. My one loyalty is to the respect and dignity of human life.

I’m not under an NDA. I never worked for OpenAI. I just observed the surrounding AI culture through the community house scene in SF, as a fly-on-the-wall, hearing insider information and backroom deals, befriending dozens of women and allies and well-meaning parties, and watching many of them get burned. It’s likely these problems are not really on OpenAI but symptomatic of a much deeper rot in the Valley. I wish I could say more, but probably shouldn’t.

I will not pretend that my time among these circles didn’t do damage. I wish that 55% of my brain was not devoted to strategizing about the survival of me and of my friends. I would like to devote my brain completely and totally to AI research— finding the first principles of visual circuits, and collecting maximally activating images of CLIP SAEs to send to my collaborators for publication.

saucerwizard,

Thanks! I thought the multiple journos sniffing around was very interesting.

dgerard,

some of these guys get in touch with me from time to time, apparently i have a rep as a sneerer (I am delighted)

(i generally don’t have a lot specific to add - I’m not that up on the rationalist gossip - except that it’s just as stupid as it looks and frequently stupider, don’t feel you have to mentally construct a more sensible version if they don’t themselves)

earthquake,

Very grim that she feels the need to couch her damning report with “some, I assume, are good people” for a paragraph. I guess that’s one of her survival strategies.

o7___o7,

Good thing that none of this mad-science bullshit is in danger of working, because I don’t think that the spicy autocorrect leadership cadre would hesitate to hurt people if they could build something that’s actually impressive.

earthquake,

Useful context: this is a followup to this post:

The thing about being active in the hacker house scene is you are accidentally signing up for a career as a shadow politician in the Silicon Valley startup scene. This process is insidious because you’re initially just signing up for a place to live and a nice community. But given the financial and social entanglement of startup networks, you are effectively signing yourself up for a job that is way more than meets the eye, and can be horribly distracting if you are not prepared for it. If you play your cards well, you can have an absurd amount of influence in fundraising and being privy to insider industry information. If you play your cards poorly, you will be blacklisted from the Valley. There is no safety net here. If I had known what I was getting myself into in my early twenties, I wouldn’t have signed up for it. But at the time, I had no idea. I just wanted to meet other AI researchers.

I’ve mind-merged with many of the top and rising players in the Valley. I’ve met some of the most interesting and brilliant people in the world who were playing at levels leagues beyond me. I leveled up my conception of what is possible.

But the dark side is dark. The hacker house scene disproportionately benefits men compared to women. Think of frat houses without Title IX or HR departments. Your peer group is your HR department. I cannot say that everyone I have met has been good or kind.

Socially, you are in the wild west. When I joined a more structured accelerator later, I was shocked by the amount of order and structure there was in comparison.

dgerard,
o7___o7,

When randoms from /all wander into the vale of sneers:

www.buttersafe.com/2008/10/23/the-detour/

mii,
carlitoscohones,

I’m starting to think that this Sam Altman guy just might be a giant asshole.

gerikson,

It took you this long to suss that???

carlitoscohones,

First they came for the eyeballs, and I said nothing.

o7___o7,

Where we’re going, we don’t need eyes.

–Sam Altman, probably

sinedpick,

It’s hard to understand what Samuel Alternativeman hopes to accomplish by making such statements. Does he want everyone to give up on being creative and just defer to AI? Does he think that, without a source of real creativity for training, his products would have any value at all?

mii,

He’s either trying to generate new critihype by making Clippy intelligent again (“It learns just like those pesky hoomans do!”), or slither his way out of that lawsuit by claiming it couldn’t have stolen original ideas when there have never been any original ideas in the first place.

I’m still trying to figure out what’s stupider.

o7___o7,

“It learns just like those pesky hoomans do!”

It’s Furbies all over again

sinedpick,

Here’s today’s episode of “The Left is So Mean and The Right is so Nice, What Gives?” HN edition.

zbyte64,
dgerard,

it’s important to keep in mind that Kevin Roose is the most gullible motherfucker.

carlitoscohones,

“Latecomer’s guide to creating an AI girlfriend” in this Sunday’s NYT magazine.

skillissuer,

MS carbon emissions up 30% due to spicy autocomplete

www.theregister.com/…/microsoft_co2_emissions/

mii,

I know this is like super low-hanging fruit, but Reddit’s singularity forum (AGI hype-optimists on crack) is discussing the current chapter in the OpenAI telenovela and can’t decide whether Ilya and Jan Leike leaving is good, because no more lobotomizing the Basilisk, or bad, because no more lobotomizing the Basilisk.

Yep, there’s no scenario here where OpenAI is doing the right thing, if they thought they were the only ones who could save us they wouldn’t dismantle their alignment team, if AI is dangerous, they’re killing us all, if it’s not, they’re just greedy and/or trying to conquer the earth.

vs.

to be honest the whole concept of alignment sounds so fucked up. basically playing god but to create a being that is your lobotomized slave…. I just dont see how it can end well

Of course, we also have the Kurzweil fanboys chiming in:

Our only hope is that we become AGI ourselves. Use the tech to upgrade ourselves.

But don’t worry, there are silent voices of reasons in the comments, too:

Honestly feel like these clowns fabricate the drama in order to over hype themselves

Gee, maybe …

no , they’re understating the drama in order to seem rational & worthy of investment , they’re serious that the world is ending , unfortunately they think they have more time than they do so they’re not helping very much really

Yeah, never mind. I think I might need to lobotomize myself now after reading that thread.

Amoeba_Girl,

Our only hope is that we become AGI ourselves.

But… wait…

saucerwizard,
sinedpick,

oh wow re-colonizing your mouth with a bacteria that continuously produces an antibiotic is a bad idea? Who could have seen this coming???

saucerwizard,

insane Timothy Treadwell voice: ‘SCOOTER IS SHITTING HIMSELF’

slopjockey,
Soyweiser,

Literally one of the Vogons talked about in the opening, well done substackcommenter2048.

skillissuer, (edited )

ah yes, aduhelm (aducanumab), probably the first drug the FDA approved despite zero evidence that it works, but they wanted to push something, anything that might perhaps show some marginal benefit (it was discontinued a few months ago, but in the time it was for sale they raked in some serious undeserved money). understandably this made a lot of people very angry, especially people who make drugs that work www.science.org/content/…/aducanumab-approval www.science.org/content/…/goodbye-aduhelm

skillissuer,

and can you guess why it was discontinued? it’s because that company has a new, equally useless antibody, which also got approval but has none of that PR stink around it, so they can repeat the entire process again

carlitoscohones,

I have a business idea for him. Small batch alcohol produced by harvesting ethanol produced by this bacteria found in Aella’s teeth scrapings. Call it Mutella 1140, the official alcohol shot of rationalist orgies.

dgerard,

uh

carlitoscohones,

Sorry for that. It was funnier in my head.

dgerard,

i mean you managed fresh horror from Aella, that’s not nothing

o7___o7,

They have that IPA made with yeast from a hipster’s beard flakes, so there’s…uh…precedent.

en.wikipedia.org/wiki/Rogue_Beard_Beer

Architeuthis,

Did the Aella moratorium from r/sneerclub carry over here?

Because if not

https://awful.systems/pictrs/image/7695d330-b809-44c2-a227-e679fc77b430.png

for the record, im currently at ~70% that we’re all dead in 10-15 years from AI. i’ve stopped saving for retirement, and have increased my spending and the amount of long-term health risks im taking

Soyweiser,

Same Aella Same, but moon.

V0ldek,

For the record, I’m currently at ~70% that we’re all dead in 10-15 years from the moon getting mad at us. I’ve stopped saving for retirement, and have increased my spending towards a future moon mission to give it lots of pats and treats and tell it it’s a good boy and we love it, please don’t get mad.

Soyweiser,

As a moonleftalonist, I have increased my spending towards nuking all space travel sites (not going well, apparently those things are expensive and really hard to find on ebay), as it is clear that the moon doesn’t want our attention, it never asked for it, and never tried to visit us. Respect the moon’s privacy!

Our secondary plan is making a big sign saying ‘we are here if you want to talk’.

dgerard,

it’s clear that the moon was right and we have it coming

mii,

Is increasing the amount of long-term health risks code for showering even less?

Amoeba_Girl,

rip aella girl. aella dead girl

BigMuffin69, (edited )

Ugh, this post has me tilted: if your utility function is

max Σ_t log(spending_t) · p(alive_t)
s.t. savings_t = savings_{t−1} · r + work_t − spending_t,

etc.,

There’s no fucking way the optimal solution is to blow all your money now, because the disutility of living in poverty for decades is so high. It’s the same reason people pay for insurance: no one expects their house to burn down tomorrow, but protecting yourself against the risk is the 100% correct decision.

Idk, they are the Rationalist^{tm} so what the hell do I know.
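The arithmetic behind that objection can be checked with a toy version of the problem. A minimal sketch, under simplifying assumptions not in the comment (no interest, a fixed pot of wealth, known survival probabilities); the names `optimal_spending`, `probs`, and the 70%-dead-after-15-years scenario are illustrative:

```python
# Toy version of the commenter's objective: choose spending c_t to maximize
# sum_t p(alive_t) * log(c_t) subject to sum_t c_t = W (interest r = 1 for
# simplicity). The Lagrangian gives the closed form c_t = W * p_t / sum(p),
# i.e. spending is proportional to survival probability, never front-loaded
# to zero in later years.

def optimal_spending(wealth, survival_probs):
    """Closed-form optimum for log utility weighted by survival probability."""
    total = sum(survival_probs)
    return [wealth * p / total for p in survival_probs]

# 30-year horizon; doomer scenario: 70% chance of being dead after year 15.
probs = [1.0] * 15 + [0.3] * 15
plan = optimal_spending(100.0, probs)

# Even a committed doomer keeps ~23% of wealth for the later years rather
# than blowing it all now: year-16 spending is 0.3x year-1 spending, not 0.
early, late = sum(plan[:15]), sum(plan[15:])
```

So even granting the 70% doom estimate, log utility only scales later spending down by the survival probability; abandoning retirement savings entirely is not the optimum.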

corbin,

This article motivating and introducing the ThunderKittens language has, as its first illustration, a crying ancap wojak complaining about people using GPUs. I thought it was a bit silly; surely that’s not an actual ancap position?

On Lobsters, one of the resident cryptofascists from the Suckless project decided to be indistinguishable from the meme. He doesn’t seem to comprehend that I’m mocking his lack of gumption; unlike him, I actually got off my ass when I was younger and wrote a couple GPU drivers for ATI/AMD hardware. It’s difficult but rewarding, a concept foreign to fascists.

dgerard,

good to see that LessWrong is about improving human rationality to decrease the existential risk of superintelligent AI, and not long screeds on current political issues from a reactionary perspective

Amoeba_Girl,

If what you really care about is stemming the ill-effects of large and growing student debt, debt cancellation is a terrible policy. If you want people to consume less of something, the last thing you should do is subsidize people who consume that thing.

Yes, I agree, once debt is cancelled we shouldn’t let people go into debt and attending university should be free.

Soyweiser,

Also that is a weird sort of argument, how would people consume less schooling? ‘Yeah I’m going for my 4th degree now, after Biden canceled the debt of the first one, I hope he will cancel the other 3 as well’

Amoeba_Girl,

I generously assumed that he meant consume less loans but, yeah, doesn’t look great.
