ai6yr, to ai

😬 Apologies for way too much news on "people using AI for porn" this morning. (although... the identities of everyone buying AI porn accidentally exposed is interesting, LOL)

https://infosec.exchange/@josephcox/112473926878476027

troed, to llm
@troed@ioc.exchange avatar

I'm worried for my tech friends.

The vitriol and, honestly, the ignorance around LLM-based "AI" are starting to fill my feeds from normally sane and technologically literate people.

You should be able to see through the hype and misuse. LLMs aren't encyclopedias - they're tools that are able to manipulate data of various sorts in ways that are very similar to how humans do it.

Yes, I compare LLMs to human brains. It's not the same as saying they're conscious (yet) - but the way LLMs work is apparently in many ways similar to how our brains work.

One fascinating insight into that comes from research on what happens to the ability of LLMs to recall information as they are exposed to larger and larger corpora. Apparently they're better at recalling the early and late information, whilst starting to lose some in the middle.

In human psychology we call that the primacy and recency effect - because our brains do the same.

LLMs are absolutely awesome for a wide variety of tasks (and we have by no means found them all). Every second you spend not understanding this is a second on the way to your own irrelevance (if these tools would aid someone in your chosen area of work) or to becoming a grumpy old person yelling at clouds.

angiebaby,
@angiebaby@mas.to avatar

@troed

Snake oil in tech form is still snake oil.

joe, to ai

LLaVA (Large Language-and-Vision Assistant) was updated to version 1.6 in February. I figured it was time to look at how to use it to describe an image in Node.js. LLaVA 1.6 is an advanced vision-language model created for multi-modal tasks, seamlessly integrating visual and textual data. Last month, we looked at how to use the official Ollama JavaScript Library. We are going to use the same library today.

Basic CLI Example

Let’s start with a CLI app. For this example, I am using my remote Ollama server but if you don’t have one of those, you will want to install Ollama locally and replace const ollama = new Ollama({ host: 'http://100.74.30.25:11434' }); with const ollama = new Ollama({ host: 'http://localhost:11434' });.

To run it, first run npm i ollama and make sure that you have "type": "module" in your package.json. You can run it from the terminal by running node app.js <image filename>. Let’s take a look at the result.

Its ability to describe an image is pretty awesome.

Basic Web Service

So, what if we wanted to run it as a web service? Running Ollama locally is cool and all, but it’s cooler if we can integrate it into an app. If you run npm install express to install Express, you can run this as a web service.

The web service accepts POST requests to http://localhost:4040/describe-image with a binary body containing the image that you want described. It then returns a JSON object containing the description.

https://i0.wp.com/jws.news/wp-content/uploads/2024/05/Screenshot-2024-05-18-at-1.41.20%E2%80%AFPM.png?resize=1024%2C729&ssl=1

Have any questions, comments, etc? Feel free to drop a comment, below.

https://jws.news/2024/how-can-you-use-llava-and-node-js-to-describe-an-image/

ojrask, to meta
@ojrask@piipitin.fi avatar

Meta's Workplace shuts down in 2026.

I guess making money somewhat honestly by having customers that actually pay for a service with at least some guarantees of privacy and safety is not as lucrative as having an open platform network where people are tricked into giving out all their data while they are spied upon for whatever reasons.

bornach, to ai
@bornach@masto.ai avatar

Enrico Tartarotti on the current "frenzy" of putting Large Language Model chatbots into everything and marketing everything as having AI:
https://youtu.be/CY_b8w8u9NY

smurthys, to llm
@smurthys@hachyderm.io avatar

I just finished a productive Copilot session on a complex programming task. I came up with much of the algorithm, wrote a lot of the code, and had to guide it a lot throughout, but credit where due: Copilot did make small but meaningful contributions along the way.

Overall, not a pair programmer but someone useful to talk to when WFH alone on complex tasks.

Enough for Copilot to earn a ✋🏽. And I like how it responded to that. It has got that part down. 😉

cigitalgem, to ML
@cigitalgem@sigmoid.social avatar

When you choose to use a foundation model, you accept the risk management decisions made by the vendor without your input. Wonder what they are? Read this paper:

https://berryvilleiml.com/2024/05/16/how-to-regulate-llms/

ramikrispin, to machinelearning
@ramikrispin@mstdn.social avatar

MLX Examples 🚀

MLX is Apple's framework for machine learning on Apple silicon. The MLX examples repository provides a set of examples for using the framework, including:
✅ Text models such as a transformer language model, Llama, Mistral, and Phi-2
✅ Image models such as Stable Diffusion
✅ Audio and speech recognition with OpenAI's Whisper
✅ Support for some Hugging Face models

🔗 https://github.com/ml-explore/mlx-examples

Lobrien,

@ramikrispin @BenjaminHan How do this and corenet (https://github.com/apple/corenet) fit together? The corenet repo has examples for inference with MLX for models trained with corenet; is that it, does MLX not have, e.g., activation and loss fns, optimizers, etc.?

ramikrispin,
@ramikrispin@mstdn.social avatar

@Lobrien @BenjaminHan corenet is a deep learning library, while MLX is an array framework for high performance on Apple silicon. This means that if you are using a Mac with an M1, M2, or M3 chip, it should perform better when using MLX on the backend (I did not test it myself).

drahardja, to ai
@drahardja@sfba.social avatar

D’Youville University featured “Sophia”, a humanoid robot that talks and moves based on AI, in its commencement ceremony.

It’s pretty insulting when a thoughtless robot is invited to give “life advice” to actual human students.

https://www.youtube.com/watch?v=L9a2ToSgWI8

kubikpixel, to gentoo
@kubikpixel@chaos.social avatar

Gentoo and NetBSD ban 'AI' code, but Debian doesn't – yet

The problem isn't just that LLM-bot generated code is bad – it's where it came from.

🐧 https://www.theregister.com/2024/05/18/distros_ai_code/


#gentoo #netbsd #debian #ai #llm #LLMs #bsd #linux #opensource #oss #bot #it

kubikpixel,
@kubikpixel@chaos.social avatar

🧵 …although I tend to favour OpenBSD and Linux for personal reasons, I find this decision OK. Certain open source projects lack clear, reasoned positions and decisions.

»NetBSD’s New Policy – No Place for AI-Created Code:
NetBSD bans AI-generated code to preserve clear copyright and meet licensing goals.«

🚩 https://linuxiac.com/netbsd-new-policy-prohibits-usage-of-ai-code/


#netbsd #bsd #ai #code #copyright #os #license #policy #AIgenerated #oss #linux #openbsd #OpenSourceProjekt

finestructure, to llm
@finestructure@mastodon.social avatar

•This• is the compelling use case for me. If I use a translator to write messages in French, I'm not forced to come up with an initial attempt, and I lose the learning aspect of that.

If instead I put something into ChatGPT and it not only corrects but explains what my mistakes were that's a huge win in terms of learning from your mistakes.

(I still don't trust the thing 100% but it's also not a high stakes situation – I'm not engaging in a nuclear arms treaty after all 😅)

finestructure,
@finestructure@mastodon.social avatar

@groue Yeah, my partner pointed that and a few other things out as well and now I’m less convinced this is working as well as I thought it was.

Yet another example where the answer sounds good but only because I don’t have the expertise to verify.

finestructure,
@finestructure@mastodon.social avatar

@rene I’ve actually noticed that about French punctuation, too, at least when it comes to ! and ? 🤷‍♂️

KatM, to iOS
@KatM@mastodon.social avatar

Autocorrect has gotten even worse! How is this possible, Apple?

Fix your shitty autocorrect! There’s no such thing as “there’re” so quit putting it into my content.

And how come I get a word suggestion as I type, I click on it, and an entirely different word is inserted that wasn’t even one of the options offered - sometimes not even an English word?!

🤯🤬😠😡😤

KatM,
@KatM@mastodon.social avatar

@peterbutler 😭 It didn’t use to be this bad. I’ve been on an iPhone since the first model. I loved them; even after trying Androids twice, I had to go back. But now? May as well use a flip phone.

peterbutler,
@peterbutler@mas.to avatar

@KatM Agreed that using a flip phone is looking more and more appealing. All I would really miss is the camera
