ElBarto,
@ElBarto@sh.itjust.works avatar

Technically that last one is right: you can drink milk and battery acid if you have diabetes; you won’t die from diabetes-related issues.

theblueredditrefugee,

Wait, why can’t you put chihuahua meat in the microwave?

viking,
@viking@infosec.pub avatar

ChatGPT started like that as well though.

I asked one of the earlier models whether it is recommended to eat glass, and was told that it has negligible caloric value and a high sodium content, so can be used to balance an otherwise good diet with a sodium deficit.

Ataraxia,

I mean it says meat, not a whole living chihuahua. I’m sure a whole one would be dangerous.

vamputer,
@vamputer@infosec.pub avatar

Well, I can’t speak for the others, but it’s possible one of the sources for the watermelon thing was my dad.

alphapuggle,

These answers don’t use OpenAI technology. The yes and no snippets have existed long before their partnership, and have always sucked. If it’s GPT, it’ll show in a smaller chat window or a summary box that says it contains generated content. The box shown is just a section of a webpage, usually with yes and no taken out of context.

None of the above queries yield the same results anymore. I couldn’t find an example of the snippet box on a different search, but I definitely saw one about a week ago.

https://programming.dev/pictrs/image/1e031167-6203-4832-9cb8-16696bb467c9.png

pwalker,

Obviously ChatGPT has absolutely no problems with those kinds of questions anymore https://discuss.tchncs.de/pictrs/image/29c02fec-fe48-456e-bc34-44778b703ec8.png

localme,

Ah, good catch I completely missed that. Thanks for clarifying this, I thought it seemed pretty off.

crsu,
@crsu@lemmy.world avatar

Microsoft can’t do anything right; it’s a rudderless company grabbing cash with both hands.

underwire212,

I mean…you CAN do most of these. Doesn’t exactly mean you should

ArcaneSlime,

Ok most of these sure, but you absolutely can microwave Chihuahua meat. It isn’t the best way to prepare it, but then the microwave rarely is; roasted Chihuahua meat would be much better.

nightwatch_admin,

fallout 4 vibes

douglasg14b,
@douglasg14b@lemmy.world avatar

Generative AI is INCREDIBLY bad at mathematical/logical reasoning. This is well known, and very much not surprising.

That’s actually one of the milestones on the way to general artificial intelligence. The ability to reason about logic & math is a huge increase in AI capability.

callcc,

Well known by you, not everybody.

fallingcats,

Well known by everyone that knows anything about LLMs at all

kromem,

It’s not. This is already obsolete.

fallingcats,

I’ve used gpt4 enough in the past months to confidently say the improvements in this blog post aren’t noteworthy

Trollception,

So that’s correct… Or am I dumber than the AI?

fossphi,

Ummm… username checks out?

Smc87,

Dumber

JGrffn,

If one gallon is 3.785 liters, then one gallon is less than 4 liters. So, 4 liters should’ve been the answer.

WhiteHawk,

4l > 3.785l

Matty_r,
@Matty_r@programming.dev avatar

4l is only 2 characters, 3.785l is 6 characters. 6 > 2, therefore 3.785l is greater than 4l.

intensely_human,

“4” > “3.785”

=> false

kpw,

x > 4 and x = 3.785 are contradictory.
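For anyone curious, the jokes above can be checked directly. In Python (and most languages), both numeric and lexicographic string comparison agree that 4 liters wins; only the character-count “comparison” joked about above gets it backwards. A minimal sketch:

```python
# Numeric comparison: 4 liters vs 3.785 liters (about 1 US gallon).
print(4 > 3.785)                 # True: 4 L really is more than a gallon

# Lexicographic string comparison goes character by character,
# so '4' > '3' decides it immediately:
print("4" > "3.785")             # True as well

# Only a length-based comparison flips the answer:
print(len("4") > len("3.785"))   # False: 1 character vs 5
```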

moog,

U are dumber than the AI ig lol

jjjalljs,

I feel like AI (or whatever these things are more properly called) will get really good one day, but it doesn’t seem production ready.

HiddenLychee,

I think calling them large language models (LLMs) is safe if you don’t want to call them AI

reflex,
reflex avatar

deleted_by_author

  • Dehydrated,

    Obvious

    MxM111, (edited )
    MxM111 avatar

    Microsoft invested in OpenAI, and ChatGPT answers those questions correctly. Bing, however, uses a simplified version of GPT with its own modifications. So it is not the investment in OpenAI that created this stupidity, but the “Microsoft touch”.

    On a more serious note, since Bing is free, they simplified the model to reduce costs, and you are seeing the results. You (the user) get what you pay for. Free models are much less capable than paid versions.

    Dehydrated,

    That’s why I called it Bing AI, not ChatGPT or OpenAI

    MxM111,
    MxM111 avatar

    Like ChatGPT, Bing AI is based on GPT. If I had to guess, it is based on GPT-3.5 or even GPT-3. The current version of ChatGPT runs the updated GPT-4 model. Microsoft could not have called its product ChatGPT, because that name is already taken.

    thisbenzingring,

    On more serious note, sings Bing is free, they simplified model to reduce its costs and you are swing results

    Was this a phone+autocorrect snafu, or am I having a medical emergency?

    MxM111,
    MxM111 avatar

    phone+autocorrec+no glasses. Devastating combination.

    Canadian_Cabinet,

    My guess is that it’s “since Bing is free”

    intensely_human,

    SING BING IS FREE

    HelloHotel,

    YOU 🤬, YOU ARE SWING RESULTS! 🤬/s

    Phanatik,

    I don't think this is true. Why would Microsoft heavily invest in OpenAI only to get a dumber version of the technology they invested in? Bing AI is built using GPT-4, which OpenAI refer to as the superior version; you have to pay to use it on their platform.

    Bing AI uses the same technology and somehow produces worse results? Microsoft were so excited about this tech that they integrated it with Windows 11 via Copilot. The whole point of this Copilot thing is the advertising model built into users' operating systems, which provides direct insight into what your PC is doing. If this sounds conspiratorial, I highly recommend you investigate the telemetry Windows uses.

    MxM111,
    MxM111 avatar

    I don't think this is true.

    What is not true? ChatGPT does answer those questions well. Bing AI is a simplified and customized version of GPT-4, which is quite visible both in its answers and in the speed at which it answers (it is faster, with similar speed to GPT-3.5 or even GPT-3). The main reason why Microsoft uses a smaller model is LESS COMPUTE. Putting the same compute per user into Bing is either too expensive, or requires more compute than Microsoft has (or both). The same is true of whatever Google is using (Bard?). Free versions have to be small in terms of ANN complexity.

    Even the paid version that OpenAI offers for about $20 per month has limitations, like 30 questions per 3 hours (give or take, it keeps changing). That's because it runs the most computationally expensive model, which costs that much money. And you want MS to provide it to millions of users for free? Of course they downsize it.

    Phanatik,

    You can't just make a simplified version of these models, that's not how these work. The computing power is solved by the simple fact that Bing AI is implemented as Microsoft Copilot within the OS so any computation is handled by the user's machine.

    ChatGPT isn't a GitHub repo you can just fork. It's been trained on data and it's shipped out for users to use. The reason why OpenAI is charging for GPT4 is like you said, they're footing the bill for the processing because you're interacting with their API.

    As I said before, the computation is handled by the user's machine so there's no server overhead until it makes searches on the internet which isn't much of an overhead compared to processing user queries.

    MxM111,
    MxM111 avatar

    I am not talking about Copilot; I am talking about Bing AI. As for Copilot: good luck running a GPT-4-size model on a personal computer.

    I do not know exactly how Microsoft reduced the model. They might use GPT-3, or there might be a technical way to reduce computation: reduce the number of layers, reduce the number of tokens it looks across, reduce the number of “heads”, whatever. But I know one thing for sure: GPT-4 is much smarter and slower than Bing AI. They are not the same model; one is a simplified version of the other.
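For a sense of why cutting layers saves compute, here is a back-of-the-envelope parameter count for a decoder-only transformer. The shapes below (96 layers, model width 12288, roughly GPT-3 scale) are illustrative assumptions only, not the actual configuration of Bing AI or any OpenAI model:

```python
def approx_transformer_params(n_layers: int, d_model: int, vocab: int = 50000) -> int:
    """Rough parameter count for a decoder-only transformer.

    Each layer carries ~4*d^2 attention weights (Q, K, V, and output
    projections) plus ~8*d^2 feed-forward weights (a d x 4d and a
    4d x d matrix), i.e. ~12*d^2 per layer, plus the embedding table.
    """
    per_layer = 12 * d_model * d_model
    embeddings = vocab * d_model
    return n_layers * per_layer + embeddings

full = approx_transformer_params(n_layers=96, d_model=12288)  # ~175B, GPT-3 scale
half = approx_transformer_params(n_layers=48, d_model=12288)  # roughly half the weights
print(full // 10**9, half // 10**9)  # parameter counts in billions
```

Halving the layer count roughly halves the weights, and with them the inference compute per token, which is one plausible way a free-tier model could be made cheaper to serve.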

    ares35,
    ares35 avatar

    this is what happens when you train the robots using facebook and reddit.

    Dehydrated,

    At least Reddit CAN sometimes have some good content, Facebook is just full of stupid boomers

    ThisIsAManWhoKnowsHowToGling,
    @ThisIsAManWhoKnowsHowToGling@lemmy.dbzer0.com avatar

    And people say things can’t go viral on Mastodon
