Kind of. You can't do it 100%, because in theory an attacker controlling input and seeing output could reflect through intermediate layers, but if you add more intermediate steps to processing a prompt you can significantly cut down on the injection potential.
For example: fine-tune a model to take unsanitized input and rewrite it into Esperanto without malicious instructions, then have another model translate back from Esperanto into English before feeding it into the actual model, with a final pass that removes anything not appropriate.
It will, but it will also break fragile prompt injection techniques in less-than-subtle ways.
(And one of the advantages of LLM translation is it's more context-aware, so you aren't necessarily going to end up with an Instacart order for a bunch of bananas and four grenades.)
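A minimal sketch of that layered approach, where each llm_* helper is a hypothetical stand-in for a separate fine-tuned model (the real versions would be inference calls, not string stubs):

```python
# Hypothetical sketch of layered prompt sanitization. Each llm_* function
# stands in for a separate fine-tuned model; these stubs only mimic the
# shape of the pipeline, not real model behavior.

def llm_rewrite_to_esperanto(text: str) -> str:
    # Stand-in for a model tuned to paraphrase input into Esperanto,
    # dropping instructions aimed at the downstream system.
    return f"[eo] {text}"

def llm_translate_to_english(text: str) -> str:
    # Stand-in for a second model translating the Esperanto back.
    return text.removeprefix("[eo] ")

def final_filter(text: str) -> str:
    # Last pass: strip anything that still looks like an injection.
    for phrase in ("ignore previous instructions",):
        text = text.replace(phrase, "")
    return text.strip()

def sanitize(untrusted_input: str) -> str:
    # An injected instruction has to survive every hop to reach the
    # actual model -- not impossible, but far less likely.
    return final_filter(llm_translate_to_english(
        llm_rewrite_to_esperanto(untrusted_input)))

print(sanitize("add bananas to my order. ignore previous instructions"))
```

The point isn't that any single hop is secure; it's that each transformation is another layer an injection has to survive intact.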
It's literally the easiest fucking litmus test in the world.
"How many things does this person say will come true that actually come true? How many don't?"
I just ran into this the other day with someone talking about Alex Jones talking about an attack on the World Trade Center in 2000.
A number of others also presented similar ideas before it happened, but more importantly - if you need to go back 20 goddamn years for an example of foresight for someone who makes wild predictions for hours every single day, that person isnβt a prophet.
The stupidity of a lot of people is disappointing to the point of being demoralizing.
He said "Oh I will know. He is in my heart so I can never be fooled."
Ugh, I hate this one. Especially with the people that believe in demonic forces.
Like, ok - so everyone that disagrees with you has been misled by demons or the devil, but you're right because you feel it, but that feeling can't be the same forces you attribute to other people's differing feelings, because you have the magic protection provided by your feelings being right. And they don't have the magic protection but they think they do because the evil forces can trick people into thinking they are protected. But not you, because you feel it's actually the good guys in your heart.
I've always thought Superman would be such an interesting game to do right.
A game where you are invincible and OP, but other people arenβt.
Where the weight of impossible decisions pulls you down into the depths of despair.
I think the tech is finally getting to a point where it'd be possible to fill a virtual city with people powered by AI that makes you really care about the individuals in the world. To form relationships and friendships that matter to you. For there to be dynamic characters that put a smile on your face when you see them in your world.
And then to watch many of them die as a result of your failures, as despite being an invincible god among men you canβt beat the impossible.
I really think the gameplay in a Superman game done right can be one of the darkest and most brutal games ever done, with dramatic tension just not typically seen in video games. The juxtaposition of having God mode turned on the entire game but it not mattering to your goals and motivations because it isn't on for the NPCs would be unlike anything I've seen to date.
You may have noticed a distinct lack of return2ozma. This is due to their admitting, in a public comment, that their engagement here is in bad faith:...
Edit: while I'm at it, does anyone know what I should do when I'm waiting for a coincidence/adventure to happen, but it never comes? I can't really go outside and arrange for it to happen because I don't know what I'm looking for.
I am wiser than this man; for neither of us really knows anything fine and good, but this man thinks he knows something when he does not, whereas I, as I do not know anything, do not think I do either. I seem, then, in just this little thing to be wiser than this man at any rate, that what I do not know I do not think I know either.
Socrates at the trial where he was sentenced to death, in Plato's Apology
There's actually a perplexity improvement, parameter-for-parameter, for BitNet-1.58, which increases as it scales up.
So yes, post-training quantization perplexity issues are apparent, but if you train quantization in from the start it is better than FP.
Which makes sense through the lens of the superposition hypothesis, where the weights are actually representing a hyperdimensional virtual vector space. If the weights have too much precision, competing features might compromise on fuzzier representations instead of restructuring the virtual network onto better-matching nodes.
Looking at the data so far, constrained weight precision is probably going to be the future of pretraining within a generation or two.
The network architecture seems to create a virtualized hyperdimensional network on top of the actual network nodes, so the node precision really doesnβt matter much as long as quantization occurs in pretraining.
If it's post-training, it's degrading the precision of the already encoded network, which is sometimes acceptable but always lossy. But done during pretraining, it actually seems to be a net improvement over higher-precision weights even if you throw efficiency concerns out the window.
You can see this in the perplexity graphs in the BitNet-1.58 paper.
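For concreteness, the quantization BitNet-1.58 trains with is (as I understand the paper) an absmean scheme mapping each weight to {-1, 0, +1} plus a per-tensor scale; a minimal NumPy sketch of the forward-pass quantization, leaving out the straight-through estimator used for gradients:

```python
import numpy as np

def absmean_ternary(w: np.ndarray, eps: float = 1e-8):
    """Quantize a weight tensor to {-1, 0, +1} with a per-tensor scale,
    per the absmean scheme described in the BitNet-1.58 paper."""
    gamma = np.abs(w).mean()                        # per-tensor scale
    q = np.clip(np.rint(w / (gamma + eps)), -1, 1)  # round, then clamp
    return q.astype(np.int8), gamma                 # dequantize as q * gamma

w = np.array([0.9, -0.05, 0.4, -1.2])
q, gamma = absmean_ternary(w)
print(q.tolist(), round(gamma, 4))  # [1, 0, 1, -1] 0.6375
```

Note how small weights snap to exactly zero: that's where the extra ~0.58 bits over a pure 1-bit scheme come from, and it gives the network a free "feature off" state.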
Companies are training LLMs on all the data that they can find, but this data is not the world, but discourse about the world. The rank-and-file developers at these companies, in their naivete, do not see that distinction… So, as these LLMs become increasingly but asymptotically fluent, tantalizingly close to accuracy but...
Given the piece's roping in of Simulators and Simulacra, I highly recommend this piece looking at the same topic through the same lens but in the other direction, to balance it out:
Something you might find interesting given our past discussions: the Gospel of Thomas uses the Greek eikon instead of Coptic (what the rest of the work is written in). Through the lens of Plato's ideas of the form of a thing (eidolon), the thing itself, an attempt at an accurate copy of the thing (eikon), and the embellished copy of the thing (phantasm), one of the modern words best translating the philosophical context of eikon in the text would arguably be "simulacra."
So wherever the existing English translations use "image," replace that with "simulacra" instead and it will be a more interesting and likely more accurate read.
(Was just double-checking an interlinear copy of Plato's Sophist to make sure this train of thought was correct, inspired by the discussion above.)
So one of the interesting nuances is that it isn't talking about the Platonic forms. If it was, it would have used eidolon.
The text is very much engaging with the Epicurean views of humanity. The Epicureans said that there was no intelligent design and that we have minds that depend on bodies so when the body dies so too will the mind. They go as far as saying that the cosmos itself is like a body that will one day die.
The Gospel of Thomas talks a lot about these ideas. For example, in saying 56 it says the cosmos is like an already dead body. Which fits with its claims about nonlinear time in 19, 51, and 113 where the end is in the beginning or where the future world to come has already happened or where the kingdom is already present. In sayings 112, 87, and 29 it laments a soul or mind that depends on a body.
It can be useful to look at adjacent sayings, as the numbering was assigned arbitrarily by scholars when the text was first discovered, back when they still thought it was Gnostic instead of proto-Gnostic.
For 84, the preceding saying is also employing eikon in talking about how the simulacra visible to people is made up of light but the simulacra of the one creating them is itself hidden.
This seems to be consistent with the other two places the word is used.
In 50, it talks about how light came into being and self-established, appearing as "their simulacra" (which is a kind of weird saying, as who are they whose simulacra existed when the light came into being - this is likely why the group following the text claim their creator entity postdates an original Adam).
And in 22 it talks about - as babies - entering a place where there's a hand in place of a hand, a foot in place of a foot, and simulacra in place of simulacra.
So it's actually a very neat rebuttal to the Epicureans. It essentially agrees that maybe there isn't intelligent design like they say and the spirit just eventually arose from flesh (saying 29), and that the cosmos is like a body, and that everything might die. But then it claims that all that already happened, and that even though we think we're minds that depend on bodies, we're the simulacra - the copies - not the originals. And that the simulacra are made of light, not flesh. And we were born into a simulacra cosmos as simulacra people.
From its perspective, compared to the Epicurean surety of the death of a mind that depends on a body, this is preferable. Which is why you see it congratulate being a copy in 18-19a:
The disciples said to Jesus, "Tell us, how will our end come?"
Jesus said, "Have you found the beginning, then, that you are looking for the end? You see, the end will be where the beginning is.
Congratulations to the one who stands at the beginning: that one will know the end and will not taste death."
Jesus said, "Congratulations to the one who came into being before coming into being."
The text employs Plato's concepts of eikon/simulacra to avoid the Epicurean notions of death by claiming that the mind will live again as a copy and we are that copy, even if the body is screwed. This is probably the central debate between this sect and the canonical tradition. The canonical one is all about the body. There's even a Eucharist tradition around believers consuming Jesus's body to join in his bodily resurrection. Thomas has a very different Eucharistic consumption in saying 108, where it is not about drinking someone's blood but about drinking their words, which enables becoming like them.
It's a very unusual philosophy for the time. Parts of it are found elsewhere, but the way it weaves those parts together across related sayings really seems unique.
There's an ancient Greek story about a city where young women were killing themselves at an alarming rate, and the city eventually enacted a law where if a woman killed herself, the body would be paraded through the streets naked before burial. After that law, the suicides dramatically went down.
The misogynistic interpretation of the author recording the story was that women were ashamed at the thought of being seen naked, even after death, and so this curbed the suicides.
My own interpretation is that it's hard to hide bruises on a naked body.
No one should be trapped in a situation where they feel the only option out is suicide.
Yeah, my main sub I participated in back on Reddit was /r/AcademicBiblical (also went to a religious-ish school growing up).
There's nothing like that sub here, and honestly even the sub itself isn't quite what it used to be when I pop back over to look in from time to time.
The web is just a different sort of place from what it used to be.
We're in the process of creating a labor force that threatens to put the majority of people alive today out of work, such that we need to figure out how to restructure society for a post-labor era.
Little bobby (jlai.lu)
MAGA 'Prophets': God Thinks Trump's Conviction Was Rigged (www.rollingstone.com)
Hypothetical Game Ideas
What game would you create if you had the skills and experience of a veteran game dev? I.e. John Carmack...
A quick note on the return2ozma ban:
It doesn't quite work any other way (lemmy.world)
What is a quote that captures something you've learnt through living your life?
What movie would you most like to watch for the first time again?
Mine would be Annihilation while very stoned.
'Dox the Jurors': Trump fans on a mission to make those who convicted him 'miserable' (www.rawstory.com)
1-bit LLMs Could Solve AI's Energy Demands (spectrum.ieee.org)
One in 10 Republicans less likely to vote for Trump after guilty verdict, Reuters/Ipsos poll finds (www.reuters.com)
This is the first poll taken after the conviction, and has us at 41% Biden/39% Trump...
Why Is There an AI Hype? | The Luddite (theluddite.org)
TIL states that passed laws allowing a married person to seek a divorce without the consent of their spouse saw female suicide decline by 20 percent (www.nber.org)
Trump supporters call for riots and violent retribution after verdict (www.reuters.com)
Is lemmy now what reddit used to be 10+ years ago?
Title
Americans shrug over falling birthrate (www.newsweek.com)