How to de-radicalize my mom's youtube algorithm?

She’s almost 70, spends all day watching QAnon-style videos (but in Spanish), and every day she’s anguished about something new. Last week she was asking us to start digging a nuclear shelter because Russia had dropped a nuclear bomb on Ukraine. Before that she was begging us to install reinforced doors because the indigenous population was about to invade the cities and kill everyone with poisonous arrows. I have access to her YouTube account and I’m trying to unsubscribe and report the videos, but the recommended videos keep feeding her more crazy shit.

LostCause, (edited )

My mother’s partner had some small success. On the one hand, doing what you do already: unsubscribing from the bad stuff and subscribing to other stuff she might enjoy (nature channels, religious stuff which isn’t so toxic, arts and crafts...), plus blocking some tabloid news on the router. On the other hand, he tried getting her outside more. They go on long hikes now and travel around a lot, plus he helped her find some hobbies (the arts and crafts one) and sits with her sometimes, showing lots of interest and positive affirmations when she does that. Since religion is so important to her, they also often go to (a not so toxic or cultish) church and church events and the choir and so on.

She’s still into it to an extent. Anti-vax will probably never change, since she can’t trust doctors and hates needles, and she still votes accordingly (which is far right in my country), which is unfortunate. But she’s calmed down a lot and isn’t quite so radical and full of fear all the time.

Edit: oh, and as for myself, I found a book called "How to Have Impossible Conversations". The topics in there can be weird, but it helped me stay somewhat calm when confronted with outlandish beliefs and not reinforce them, though who knows how much that helps, if anything (I think her partner contributed a LOT more than me).

niktemadur,
niktemadur avatar

By now it is beyond apparent that corporations cannot be relied upon to regulate themselves. In the mindless dynamic they have set in motion, the endgame is to keep us in a perpetual state of anguish.
That this type of vulgarly cynical content is not considered obscene and banned is a failure of the highest magnitude.

DieguiTux8623,

Reading this makes me realize I’m not as alone as I thought. My mother too has gone completely out of control in the last 3 years with every sort of conspiracy theory (freemasonry, 5 families ruling the world, radical anti-vax, this Pope is not the real Pope, the EU is to blame for everything, etc.). I manage to “agree to disagree” most of the time but it’s tough sometimes… People, no matter their degree of education, love simple explanations to complex problems and an external scapegoat for their failures. This content will always have an appeal.

LostCause, (edited )

Yeah, you are not alone at all. There is an entire subreddit for this, r/qanoncasualties, and while my mother doesn’t know it, a lot of this stuff comes from this Q thing. Why else would a person like her in Austria care so much about Trump or Hillary Clinton or a guy like George Soros? Also a lot of it is thinly veiled, borderline Nazi stuff (it’s always the Jews or other races or minority groups at fault), which is the next thing: she says she hates Nazis, and yet they got her to vote far right and support all sorts of agendas. There is some mental trick happening there where "the left is the actual Nazis".

Well, she was always a bit weird with religion, aliens, wonder cures and other conspiracy stuff, but it really got turned towards these political aims in the last few years. It feels malicious, and it was distressing to me, though I’m somewhat at peace with it now. The world is a crazy place and out of my control; I just try to do the best with what I can affect.

HunterBidensLapDog,
@HunterBidensLapDog@infosec.pub avatar

Replace her existing account with a new one. Make up some excuse. Use the don’t recommend channel option when a bad recommendation comes up.

Sarsaparilla,
Sarsaparilla avatar

I used to watch a lot of controversial/conspiracy/right-wing stuff because I wanted to keep tabs on their crazy. Eventually it all became way too toxic and bad for my mental health, and I needed to fix my YouTube feed. There was lots of unsubscribing and channel blocking, but mostly it was watching lots of trivial videos that helped clean it up.

I don't think downvoting helps, but upvoting does. The algorithm knows people will engage more with content that makes them angry/afraid ... don't let it know when you feel that way.

Maybe scroll through a lot of shorts and then like and subscribe to videos & channels that are more innocuous, or completely trivial (like cooking tips, makeup & fashion for older women, celebrity gossip entertainment, comedy, feel-good compilations etc). If you haven't already, definitely subscribe to Daily Dose of Internet. YouTube also seems to like pushing lo-fi, meditation, and binaural beats (especially at night) ... sub a few of those.

Block those channels that have Rogan or Tate or similar in their video titles. Unsub from the likes of Dr John Campbell & Russell Brand. If she likes news and current affairs, try to find world news channels that are highly regarded but that she wouldn't recognise as "ultra left" from her anti-browsing. Channels like Reuters, AFP, or maybe news stations from other countries (like ABC News in Australia, or CBC and Global News in Canada). Block "Rebel News" and unsub from "NewsNation", or anyone else that pushes clickbait headlines. Swap her subscription from "Sky News Australia" (block) to simply "Sky News" - the Australian one is made by News Corp (Murdoch) and targets the American right-wing audience, while the other is a legitimate UK news channel which is not owned by Murdoch - she won't notice the subtle difference in the logos, but the content will be dramatically different.

Also, I would recommend a channel called The Why Files ... he does videos on conspiracies, lays out all the conspiracy evidence, and then also lays out any debunking facts. He treats the subjects seriously, so watchers don't feel stupid for believing something, but then does a great job of presenting the debunking facts as well, so viewers get a really balanced presentation about controversial topics.

I hope this is helpful. Good luck.

P.S. If you haven't done so yet, I would recommend watching a documentary called "The Social Dilemma" if you are able to find a copy (it's on Netflix).

001100010010,
@001100010010@lemmy.dbzer0.com avatar

I’m a bit disturbed how people’s beliefs are literally shaped by an algorithm. Now I’m scared to watch Youtube because I might be inadvertently watching propaganda.

Entropywins,
Entropywins avatar

I watch a lot of history, science, philosophy, stand-up, jam bands and happy uplifting content... I am very much feeding my mind lots of goodness, and I love it...

masquenox,

I have to clear out my youtube recommendations about once a week… no matter how many times I take out or report all the right-wing garbage, you can bet everything that by the end of the week there will be a Jordan Peterson or PragerU video in there. How are people who aren’t savvy to the right-wing’s little “culture war” supposed to navigate this?

shortgiraffe,

You should use an extension like blocktube.

masquenox,

I probably should… but I have to admit that I kinda enjoy reporting them.

Thanks - I’ll certainly look into it.

static, (edited )
static avatar

My normal YT algorithm was OK, but shorts tried to pull me to the alt-right.
I had to block many channels to get a sane shorts algorithm.

"Do not recommend channel" really helps

AstralPath,

It really does help. I’ve been heavily policing my YouTube feed for years, and I can easily tell when they make big changes to the algorithm, because it tries to force-feed me polarizing or lowest-common-denominator content. Shorts are incredibly quick to smother me in rage bait, and if you so much as linger on one of those videos too long, you get a cascade of alt-right bullshit shortly after.

Andreas,
@Andreas@feddit.dk avatar

Using Piped/Invidious/NewPipe/insert your preferred alternative frontend or patched client here (Youtube legal threats are empty, these are still operational) helps even more to show you only the content you have opted in to.

niktemadur, (edited )
niktemadur avatar

You watch this one thing out of curiosity, morbid curiosity, or by accident, and at the slightest poke the goddamned mindless algorithm starts throwing this shit at you.

The algorithm is "weaponized" for whoever screams the loudest, and I truly believe it started due to myopic incompetence/greed, not political malice. Which doesn't make it any better, as people don't know how to protect themselves from this bombardment, but the corporations like to pretend that people can, so they wash their hands of it for as long as they are able.

Then on top of this, the algorithm has been further weaponized by even more malicious actors who have figured out how to game the system.
That's how toxic meatheads like Infowars and Joe Rogan get a huge bullhorn that reaches millions. "Huh... DMT experiences... sounds interesting", the format is entertaining... and before you know it, you are listening to anti-vax and QAnon excrement, and your mind starts to normalize the most outlandish things.

EDIT: a word, for clarity

Jaywarbs,

Whenever I end up watching something from a bad channel I always delete it from my watch history, in case that affects my front page too.

emptyother,
@emptyother@lemmy.world avatar

Huh, I tried that. Still got recommended incel videos for months after watching a moron “discuss” the Captain Marvel movie. Eventually I went through and clicked “don’t recommend this” on anything that showed up on my front page; that helped.

Sludgehammer,
@Sludgehammer@lemmy.world avatar

I do that, too.

However, I’m convinced that YouTube still has a “suggest list” bound to IP addresses. Quite often I’ll have videos that other people in my household have watched suggested to me. Some of it can be explained by similar interests, but it happens suspiciously often.

Drunemeton,
@Drunemeton@lemmy.world avatar

I can confirm the IP-based suggestions!

My hubs and I watch very different things. Him: photography equipment reviews, photography how to’s, and old, OLD movies. Me: Pathfinder 2e, quantum field theory/mechanics and Dip Your Car.

Yet we both see stuff in the other’s Suggestions of videos the other recently watched. There’s ZERO chance based on my watch history that without IP-based suggestions YT is going to think I’m interested in watching a Hasselblad DX2 unboxing. Same with him getting PBS Space Time’s suggestions.

weeahnn,
@weeahnn@lemmy.world avatar

At this point, any channel that I know is either bullshit or annoying af I just block. Out of sight out of mind.

youthinkyouknowme,

Same. I have ads blocked and open YouTube directly to my subbed channels only. Rarely open the home tab or check related videos because of the amount of click bait and bs.

weeahnn,
@weeahnn@lemmy.world avatar

Ohh I just use BlockTube to block channels/ videos I don’t want to see.

jerdle_lemmy,

I mean, you probably are, especially if it’s explicitly political. All I can recommend is CONSTANT VIGILANCE!

Thorny_Thicket,

I find it interesting how some people have such vastly different experiences with YouTube than me. I watch a ton of videos there, literally hours every single day, and basically all my recommendations are about stuff I’m interested in. I even watch occasional political videos, gun videos and police bodycam videos, but it’s still not trying to force any radical stuff down my throat. Not even when I click that button which asks if I want to see content outside my typical feed.

livus,
livus avatar

My youtube is usually ok but the other day I googled an art exhibition on loan from the Tate Gallery, and now youtube is trying to show me Andrew Tate.

scottyjoe9,

At one point I watched a few videos about Marvel films and their negatives. One was about how Captain Marvel wasn’t a good hero because she was basically invincible and all-powerful, etc. I started getting more and more suggestions about how bad the new strong female leads in modern films are. Then I started getting politically right-leaning shit. It starts really innocuously, and it’s hard to figure out that it’s leading you a certain way until it gets further along. It really made me think when watching content from new channels. Obviously I’ve blocked/purged all channels like that and my experience is fine now.

DaGuys470,
DaGuys470 avatar

Just this week I stumbled across a new YT channel that seemed to talk about some really interesting science. I almost subscribed, but something seemed fishy. Went to the channel, saw the other videos, and immediately got the hell out. Conspiracies and propaganda lurk everywhere and no one is safe. Mind you, I'm about to get my bachelor's degree next year, meaning I have received a proper scientific education. Yet I almost fell for it.

Mikina,

My personal opinion is that it’s one of the first large cases of misalignment in ML models. I’m 90% certain that Google and other platforms have for years been using ML models that take a user’s history and all the data they have about them as input, and produce as output which videos to offer them, with the goal of maximizing the time spent watching videos (or on Facebook, etc).

And the models eventually found out that if you radicalize someone, isolate them into a conspiracy that makes them an outsider or a nutjob, and then provide a safe space and an echo chamber on the platform, be it Facebook or YouTube, they will eventually start spending most of their time there.

I think this subject was touched upon in the Social Dilemma movie, but given what is happening in the world and how conspiracies and disinformation seem to be getting more and more common and people more radicalized, I’m almost certain that the algorithms are to blame.

Ludrol,
@Ludrol@szmer.info avatar

If YouTube’s “algorithm” is optimizing for watch time, then the most optimal solution is to make people addicted to YouTube.

The scariest thing, I think, is that the way to optimize the reward is not to recommend a good video, but to reprogram a human to watch as much as possible.
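The dynamic described above can be sketched as a toy simulation (all numbers and category names here are made up, not anything YouTube actually does): a recommender that only maximizes expected watch time, with no notion of user well-being, drifts toward whichever content category holds attention longest, even if that category is rage bait.

```python
import random

random.seed(42)

# Hypothetical average watch minutes per category -- the "environment".
# The recommender never sees these directly; it only observes noisy samples.
TRUE_WATCH_TIME = {"nature": 4.0, "cooking": 5.0, "rage_bait": 11.0}

estimates = {cat: 0.0 for cat in TRUE_WATCH_TIME}
counts = {cat: 0 for cat in TRUE_WATCH_TIME}

def recommend(epsilon=0.1):
    """Epsilon-greedy: usually exploit the best-looking category."""
    if random.random() < epsilon:
        return random.choice(list(estimates))
    return max(estimates, key=estimates.get)

for _ in range(5000):
    cat = recommend()
    watched = random.gauss(TRUE_WATCH_TIME[cat], 1.0)  # noisy observation
    counts[cat] += 1
    # Incremental mean update of the watch-time estimate for this category.
    estimates[cat] += (watched - estimates[cat]) / counts[cat]

print(max(counts, key=counts.get))  # the category it ends up pushing hardest
```

Nothing in the reward signal says "rage bait is bad for the user", so the optimizer happily converges on it; that's the misalignment in miniature.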

Mikina,

I think that making someone addicted to YouTube would be harder than simply slowly radicalizing them into a shunned echo chamber around a conspiracy theory. If you try to make someone addicted to YouTube, they still have an alternative in the real world, friends and family to return to.

But if you radicalize them into something that makes them seem like a nutjob, you don’t have to compete with their surroundings - the only place where they feel understood is on YouTube.

MonkCanatella,

Fuck, this is dark and almost awesome, but not in a good way. I was thinking the fascist funnel was something of a deliberate thing, but maybe these engagement algorithms have more to do with it than large shadow actors putting the funnels into place. Then there’s the folks who will create any sort of content to game the algorithm, and you’ve got a perfect trifecta of radicalization.

floofloof,

Fascist movements and cult leaders long ago figured out the secret to engagement: keep people feeling threatened, play on their insecurities, blame others for all the problems in people’s lives, use fear and hatred to cut them off from people outside the movement, make them feel like they have found a bunch of new friends, etc. Machine learning systems for optimizing engagement are dealing with the same human psychology, so they discover the same tricks to maximize engagement. Naturally, this leads to YouTube recommendations directing users towards fascist and cult content.

MonkCanatella,

That’s interesting. That it’s almost a coincidence that fascists and engagement algorithms have similar methods to suck people in.

archomrade,

100% they’re using ML, and 100% it found a strategy they didn’t anticipate

The scariest part of it, though, is their willingness to continue using it despite the obvious consequences.

I think misalignment is not only likely to happen (for an eventual AGI), but likely to be embraced by the entities deploying them because the consequences may not impact them. Misalignment is relative

nLuLukna,
@nLuLukna@sh.itjust.works avatar

Reason and critical thinking are all the more important in this day and age. They’re just no longer taught in schools. Learn some simple key skills, like noticing fallacies or analogous reasoning, and you will find that your view on life is far more grounded and harder to shift.

Redonkulation,

Texas basically banned critical thinking skills in the school system

MonkCanatella,

imagine if they taught critical media literacy in schools. of course that would only be critical media literacy with an american propaganda backdoor but still

cynar,

Just be aware that we can ALL be manipulated, the only difference is the method. Right now, most manipulation is on a large scale. This means they focus on what works best for the masses. Unfortunately, modern advances in AI mean that automating custom manipulation is getting a lot easier. That brings us back into the firing line.

I’m personally an Aspie with a scientific background. This makes me fairly immune to a lot of manipulation tactics in widespread use. My mind doesn’t react how they expect, and so it doesn’t achieve the intended result. I do know however, that my own pressure points are likely particularly vulnerable. I’ve not had the practice resisting having them pressed.

A solid grounding gives you a good reference, but no more. As individuals, it is down to us to use that reference to resist undue manipulation.

tinfox,

deleted_by_author

    cynar,

    The only way you can’t be manipulated is if you are dead. All human interaction is manipulation of some sort or another. If you think you’re immune, you’re likely very vulnerable to it delivered in the correct way, since you’re not bothering to guard against it.

    An interesting factoid I’ve run across a few times: smart people are far easier to rope into cults than stupid people. The stupid have experienced that sort of manipulation before, and so have some defenses against it. The smart people assume they wouldn’t be caught up in something like that, and so drop their guard.

    In the words of Mad-eye Moody “Constant vigilance!”

    Dark_Arc,
    @Dark_Arc@lemmy.world avatar

    I think it’s worth pointing out that “no longer” is not a fair assessment, since this is regularly an issue with older Americans.

    I’m inclined to believe it was never taught in schools, and it’s probably a subject teachers are increasingly likely to want to teach (i.e. if politics didn’t enter the classroom it would already be being taught, and it might be in some districts).

    The older generations were given catered news their entire lives, only in the last few decades have they had to face a ton of potentially insidious information. The younger generations have had to grow up with it.

    A good example is that old people regularly click malicious advertising, fall for scams, etc. They’re generally not good at applying critical thinking to a computer, whereas younger people (typically, though I hear this is regressing some with smartphones) know about this stuff and are used to validating their information (or at least have a better “feel” for what’s fishy).

    clobubba,

    deleted_by_author

    Mikina,

    It’s even worse than “a lot easier”. Ever since the advances in ML went public, with things like Midjourney and ChatGPT, I’ve realized that ML models are way better at doing their thing than I thought.

    The Midjourney model’s purpose is to receive text and give out a picture. And it’s really good at that, even though the dataset wasn’t really that large. Same with ChatGPT.

    Now, Meta has (EDIT: just a speculation, but I’m 95% sure they do) a model which receives all the data they have about the user (which is A LOT) and returns which posts to show him and in what order, to maximize his time on Facebook. And it was trained for years on a live dataset of 3 billion people interacting daily with the site. That’s a wet dream for any ML model. Imagine what it would be capable of even if it was only as good as ChatGPT at its task - and it had an incomparably better dataset and learning opportunities.

    I’m really worried for the future in this regard, because it’s only a matter of time before someone with power decides that the model should not only keep people on the platform, but also make them vote for X. And there is nothing you can do to defend against it, other than never interacting with anything with curated content, such as Google search, YT or anything Meta - because even if you know that there’s a model trying to manipulate you, the model knows there are a lot of people like that, and it’s already learning how to manipulate even people like that. After all, it has 3 billion test subjects.

    That’s why I’m extremely focused on privacy and my data - not that I have something to hide, but I take a really great issue with someone using such data to train models like that.

    Cheers,

    Just to let you know, Meta has an open source model, LLaMA, and it’s basically state of the art for the open source community, but it falls short of GPT-4.

    The nice thing about the LLaMA branches (Vicuna and WizardLM) is that you can run them locally at about 80% of ChatGPT 3.5 quality, so no one is tracking your searches/conversations.

    MelonTheMan,

    Great advice in here. Now, how do I de-radicalize my mom? :(

    alphacyberranger,
    @alphacyberranger@lemmy.world avatar

    I too faced this dilemma. So I uninstalled every ad blocker and made watching videos very tedious. It kinda helped.

    MrFagtron9000,

    I had to log into my 84-year-old grandmother’s YouTube account and unsubscribe from a bunch of stuff, “Not interested” on a bunch of stuff, subscribed to more mainstream news sources… But it only works for a couple months.

    The problem is the algorithm that values viewing time over anything else.

    Watch a news clip from a real news source and then it recommends Fox News. Watch Fox News and then it recommends PragerU. Watch PragerU and then it recommends The Daily Wire. Watch that and then it recommends Steven Crowder. A couple years ago it would go even stupider than Crowder, she’d start getting those videos where it’s computer voice talking over stock footage about Hillary Clinton being arrested for being a demonic pedophile. Luckily most of those channels are banned at this point or at least the algorithm doesn’t recommend them.

    I’ve thought about putting her into restricted mode, but I think that would be too obvious that I’m manipulating the strings in the background.

    Then I thought: she’s 84, she’s going to be dead in a few years, she doesn’t vote. Does it really matter that she’s concerned about trans people trying to cut off little boys’ penises, or thinks that Obama is wearing an ankle monitor because he was arrested by the Trump administration, or that aliens are visiting the Earth because she heard it on Joe Rogan?

    luis123456,

    I did the same thing, although mixing in some videos about other things will declutter the algorithm too.

    Jackolantern,

    Oof that’s hard!

    You may want to try the following, though, to clean the algorithm up:

    Clear her YouTube watch history: This will reset the algorithm, getting rid of a lot of the data it uses to make recommendations. You can do this by going to “History” on the left menu, then clicking on “Clear All Watch History”.

    Clear her YouTube search history: This is also part of the data YouTube uses for recommendations. You can do this from the same “History” page, by clicking “Clear All Search History”.

    Change her ‘Ad personalization’ settings: This is found in her Google account settings. Turning off ad personalization will limit how much YouTube’s algorithms can target her based on her data.

    Introduce diverse content: Once the histories are cleared, start watching a variety of non-political, non-conspiracy content that she might enjoy, like cooking shows, travel vlogs, or nature documentaries. This will help teach the algorithm new patterns.

    Dislike, not just ignore, unwanted videos: If a video that isn’t to her taste pops up, make sure to click ‘dislike’. This will tell the algorithm not to recommend similar content in the future.

    Manually curate her subscriptions: Unsubscribe from the channels she’s not interested in, and find some new ones that she might like. This directly influences what content YouTube will recommend.
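For the last step, if the subscription list is long, a first pass can be scripted. This is only a sketch - the channel names and keyword list below are made up, and in practice you'd pull the real subscriptions via the YouTube Data API (or just work through the list by hand) before deciding what to unsubscribe from:

```python
# Hypothetical triage: flag subscriptions whose titles match known
# conspiracy/rage-bait keywords so a human can review and unsubscribe.
FLAG_KEYWORDS = {"qanon", "deep state", "they don't want you to know",
                 "wake up", "the truth about"}

def flag_subscriptions(subscriptions):
    """Split channel titles into (keep, review) using a keyword blocklist."""
    keep, review = [], []
    for title in subscriptions:
        lowered = title.lower()
        if any(kw in lowered for kw in FLAG_KEYWORDS):
            review.append(title)   # candidate for unsubscribing
        else:
            keep.append(title)
    return keep, review

# Made-up example channel titles:
subs = ["Gardening with Rosa", "The TRUTH About Vaccines",
        "Cocina Fácil", "WAKE UP Sheeple News"]
keep, review = flag_subscriptions(subs)
print(review)  # → ['The TRUTH About Vaccines', 'WAKE UP Sheeple News']
```

A keyword match is only a hint, not a verdict - the point is to shrink the list a human has to look at, not to automate the judgment call.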

    ScaNtuRd,

    Migrate her over to a federated alternative. No aggressive algorithms there.

    ilco,

    Block YouTube/Facebook/any social media through DNS with Pi-hole for the time being, and in the meantime try to reset her YouTube account (or delete it and create a new one, if she doesn't use Gmail).
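If you don't run Pi-hole, a cruder single-machine version of the same DNS block is a hosts-file entry. Here's a sketch that just generates the lines (the domain list is an assumption, and you'd append the output to /etc/hosts yourself - Pi-hole's deny list does the equivalent for the whole network):

```python
# Generate null-route hosts entries that make these domains unresolvable.
# The domain list is illustrative -- YouTube in particular also serves video
# from other domains, so a DNS block like this is blunt and easy to bypass.
BLOCKED = ["youtube.com", "www.youtube.com", "m.youtube.com",
           "facebook.com", "www.facebook.com"]

hosts_lines = [f"0.0.0.0 {domain}" for domain in BLOCKED]
print("\n".join(hosts_lines))
```

Mapping a domain to 0.0.0.0 tells the resolver there's nowhere to connect, which is why the page simply fails to load instead of redirecting anywhere.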

    Ozymati,
    @Ozymati@lemmy.nz avatar

    Log in as her on your device. Delete the history, turn off ad personalisation, unsubscribe and block dodgy stuff, like and subscribe healthier things, and this is the important part: keep coming back regularly to tell YouTube you don’t like any suggested videos that are down the qanon path/remove dodgy watched videos from her history.

    Also, subscribe and interact with things she’ll like - cute pets, crafts, knitting, whatever she’s likely to watch more of. You can’t just block and report, you’ve gotta retrain the algorithm.

    lingh0e,

    Would it help to start liking/subscribing to videos that specifically debunk those kinds of conspiracy videos? Or, at the very least, demonstrate rational concepts and critical thinking?

    RGB3x3,

    Probably not. This is an almost 70 year old who seems not to really think rationally in the first place. She’s easily convinced by emotional misinformation.

    Probably just best to occupy her with harmless entertainment.

    driving_crooner,
    @driving_crooner@lemmy.eco.br avatar

    We recommended her a YouTube channel about linguistics, and she didn’t like it because the PhD in linguistics was saying that it’s OK for language to change. Unfortunately, there comes a time when people just want to see what confirms their existing worldview, and anything that challenges it is taken as an offense.

    sergih123,

    Yeah, when you go through the feed, make sure to click on the 3 dots for every recommended video and choose “Don’t show content like this” and also “Block channel”, because chances are, if they uploaded one of these stupid videos, their whole channel is full of them.

    brainwashed,

    If she has no account, maybe try to block YouTube from setting cookies, so she'll start over every time.

    ChameleonMan,

    She probably uses the TV app and doesn’t watch it on a PC

    rustydrd,
    @rustydrd@lemmy.world avatar

    It could also help to deactivate the personalized advertising functionality in the Google/YouTube settings (basically wipe the currently stored preferences, then forbid YouTube from making suggestions based on your interests). This will keep her feed fairly generic (and bad, oh boy), so she would have to actively search for or subscribe to these videos.

    KeisukeTakatou,

    I just want to share my sympathies on how hard it must be when she goes and listens to those assholes on YouTube and believes them but won’t accept her family’s help telling her what bullshit all that is.

    I hope you get her out of that zone op.

    zombuey,

    YouTube has a delete option that will wipe the recorded trend. Then just watch a couple of videos and subscribe to some healthy stuff.

    _g_be,

    Very cool, had no idea it was that simple

    zombuey,

    support.google.com/youtube/answer/55759?hl=en#zip…

    When you delete your account it also deletes your history. I assume she isn’t a content creator, so the rest won’t be concerning. The next time you log in using your Gmail it will just be like when you first logged into YouTube.
