Rounding out the week with day #129 of #Godot games using #GodotSteam! Today is Somnipathy by @Tearcell! A truly awesome point-and-click horror game; just look at that pixel art. The trailers give just a taste of it all but wow. Even sports a demo!
While having sex in a Capitol hearing room is not high decorum, at least it's accomplishing something in a building where so little actually gets done.
@karlauerbach The 7 moved out early on, though the UCLA-NMC moniker stuck around for a time. The first 11 was a 45 and that became UCLA-ATS (ARPA host #1, taking that over from the 7), and early on was booted alternately with ANTS and UNIX 6 (I'm pretty sure it was 6), and then full time with UNIX 6. I was LAUREN@UCLA-ATS on that one. Later when LLL killed their RATS project they shipped the 11/70 down and we set up that one on the other side of the room (also its touch-tone modem and VOTRAX, that I used for my Touch-Tone UNIX speech system). It became UCLA-SECURITY (host #129 the way the IMPs did assignments). I was LAUREN@UCLA-SECURITY on that one.
I forgot that I had written this in September. It was published in November. Since it's behind a paywall, I'll paste the text that you can't access in this thread.
The field of artificial intelligence (AI) cycles through what are called AI summers, epochs where every other news headline seems to be about AI and there is ample funding for the field, and AI winters, which come from the disappointment of undelivered overpromises during the summers. We are currently in perhaps the most intense AI summer ever, where the mere mention of “AI” gets startups 15-50% more in investment funding.
But just like past summers, even the current hype cycle is an “AI summer” only for those profiting from building these systems or the researchers who get funding to work on the dominant paradigm of the day. For many people in the AI pipeline—from the exploited workers supplying and labelling the data that power these systems and the content moderators who filter out toxic content, to the marginalised groups who live in apartheid states being overpoliced because of AI—
At companies such as Google and Amazon, tech workers are protesting against the use of their labour in creating harmful technology.
The labour movement’s pushback against the proliferation of harmful AI systems is not limited to tech workers: many industries that are affected by the potential uses of AI systems have joined the fight. AI was a key topic of contention in the historic strikes by writers and actors in Hollywood in 2023.
Concept artists hired lobbyists and filed class-action lawsuits against companies that generated “AI art” using their work as training data, without consent or compensation. Creatives refused to accept studio terms stipulating that their material could be used to train generative-AI systems that could then put them out of work or devalue their labour.
Connections #129 Wednesday 18 October 2023
Link to Connections: www.nytimes.com/games/connections...