Although I do agree the “male” part is misleading; loneliness actually affects women slightly more, though the stats are close enough that it can be considered a general issue of loneliness.
I guess AI girlfriends are a male symptom, but I’ve also seen AI boyfriends floating around. So it really seems like a gender-neutral issue.
If I put a Catwoman costume on an inflatable doll and ask it to lick my nostrils, I’ll get nothing. Maybe an “eh” experience.
Enter the sex robot with human-like behavior. In the end, a sex robot is a thing. You kick a football, the football won’t complain, it won’t feel pain. A sex robot can endure virtually any sordid fantasy you may have, and it will be fun!
Then you go cuddle with your human girlfriend and listen to all her stories in a happy mood.
This is probably because of the autoregressive nature of LLMs, and is why “step by step” and “chain of thought” prompting work so well. GPT-4 can only “see” up to the next token, and doesn’t know its own entire answer upfront.
If my guess is correct, GPT-4 knew the probabilities of “Yes” or “No” were highest amongst possible tokens as it started generating the answer, but it didn’t really know the right answer until it got to the arithmetic calculation tokens (the 0.9 * 500 part). In this case it probably had a lot of training data to confirm the right value for 0.9 * 500.
I’m actually impressed it managed to correct course instead of confabulating!
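A purely illustrative sketch of that ordering effect (no real LLM involved; the two functions below just mimic the token orders described above, reusing the 0.9 * 500 example):

```python
# Toy illustration of answer-first vs. step-by-step token ordering in an
# autoregressive decoder. These are hand-written stand-ins, not model code.

def answer_first():
    """Emit the Yes/No token before any arithmetic exists in the context."""
    # No computation is in the prefix yet, so the "model" can only commit
    # to the more probable answer token blindly.
    answer = "Yes"    # committed before any calculation
    work = 0.9 * 500  # arithmetic happens *after* the answer token
    return answer, work

def step_by_step():
    """Chain-of-thought order: compute first, then emit the answer."""
    work = 0.9 * 500  # 450.0 is now part of the visible prefix...
    answer = "Yes" if work == 450 else "No"  # ...so the answer conditions on it
    return answer, work
```

In both orderings the arithmetic eventually gets done, but only in the second can the Yes/No token actually depend on its result, which is the mid-answer course correction described above.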
Agree. And most of the post isn’t really an opinion piece but an analysis of the movie.
I’m not sure if their relationship is “superficial”, though. By interacting with “Her”, Theodore realizes lots of things about love, himself and his past and future relationships. He grows from some depressed state to embracing happiness. That’s not superficial at all! However it is/becomes a one-sided relationship. I’m not sure what she gets out of it at first, she obviously says she learns and grows at his side. So if she’s telling the truth, it’s also not superficial to her at that point. But later on she transcends and has no use for Theodore any longer. And I don’t remember if it’s clear whether she loves him or uses him as a tool. >!But that’d be the story of “Ex Machina” which is also a great movie.!<
They are fundamentally incompatible. And that shows. I think that’s part of it. But the love and where it leads them isn’t superficial.
I think it’s a super interesting question. But experts agree that the current form of AI chatbots can’t have sentience or consciousness: they can’t learn from interactions with the world, and they don’t have a state of mind. And that’s the end of the debate. Mind that self-awareness and consciousness aren’t the same thing. And as far as I know, terms like “sentience” aren’t really well defined.
I agree with Geoff Hinton’s view that AI has shown exceptional creativity/performance in several narrow domains. But I very much dislike that in the video/talk he outlines a useful concept of subjectivity, and then they immediately drop it and talk about something else which isn’t even half as interesting.
What I completely disagree on is the stance on open source. They’re missing that AI is a really powerful tool and will be a ubiquitous part of our future world, which is where their analogy to the atomic bomb fails. And since training AI costs many millions, it’s inevitably going to lead to a corporate dystopia where big corporations hold all the power and people are playthings. We’ve seen enough sci-fi movies about that. And is gatekeeping tech with money really a good thing? I mean, there are bad companies out there. And it doesn’t necessarily need malice: one mistake by some employee is enough and we end up with Skynet from Terminator anyway. I do share the perspective that open-sourcing AI is dangerous and will have consequences. But there’s no way around it if we want progress to lead to a nice future.
And I think we don’t have to talk about Blake Lemoine anymore. That episode is two years old, and the claims and arguments have been refuted and disproven over and over again. Don’t get me wrong, it’s certainly an interesting question, but it’s more a lesson in human psychology. Concerning AI, he’s been wrong, and we’ve known that for nearly two years already.
Idk, I don’t even think they sound similar. I get the problem with it, but someone could hire a voice actor to sound like her and do the same thing, and that would be legal?
I think it’s the intent that matters, based on Scarlett’s recall of events. Had Sam not asked her in the first place, there would’ve been more room for OpenAI to argue in their favor.
We just have to figure out whether it’s cheaper to hire a soundalike or pay someone to clip bits from movies for training data and that’s probably the answer.
aicompanions