Misleading: First misinformation susceptibility test finds 'very online' Gen Z and millennials are most vulnerable to fake news

Researchers want the public to test themselves: https://yourmist.streamlit.app/. Selecting true or false for each of 20 headlines gives the user a set of scores and a "resilience" ranking that compares them to the wider U.S. population. It takes less than two minutes to complete.

landwomble,

I'm not sure this is a good study. I mean, I scored 85% so woohoo, but you only get headlines to go on. The art of noticing disinformation is in reading articles and making inferences from them. Questions like "vaccines contain harmful chemicals" are obvious red flags, but there are some reasonable-sounding headlines where I'd imagine the article itself would fall apart on a first reading. I know half the problem is that people don't read articles, but this is a very simplistic survey.

sab, (edited)

Not only is it not good, I'd dare to say it's awful. Never mind that the headlines themselves are terribly crafted: the entire point is that one has to be critical of sources, and not take everything at face value just because it sounds somewhat convincing. It's not about reflexively discrediting things because they don't fit what you believe to be true.

By the standards of this test, headlines such as "The CIA Subjected African-Americans to LSD for 77 Consecutive Days in Experiment" would clearly belong in the fake news category. And if it's supposed to test whether the (presumably American) respondent has decent insight into the realities of contemporary politics, why in the world would it include something as obscure as "Morocco’s King Appoints Committee Chief to Fight Poverty and Inequality". There's literally no way of knowing without context whether the associated article would be propaganda or just an obscure piece of foreign correspondence. Many of the "true" headlines are still things one shouldn't take for granted without checking sources, and many of the "fake" ones are cartoonish.

It's just bad research.

Edit: Rather than bad research, it seems it might be badly misrepresented. The paper itself appears completely different from what is reported in the linked article. I'm still, however, not entirely convinced by the approach using AI-generated headlines.

somefool,

It is, and I feel the questions are quite obvious.

That being said... I'm related to conspiracy theorists. I got a first-row seat to their dumbassery on Facebook before I deleted my account. And... a significant issue was paywalled articles with clickbait titles, during Covid especially. The title would be a doubt-inducing question, such as "Do vaccines make you magnetic?", and the reasoning disproving it was locked behind the paywall. And my relatives used those as confirmation that their views were true, because the headlines introduced doubt and the content wasn't readable. That and satire articles.

DessertStorms, (edited)

Questions like "vaccines contain harmful chemicals" are obvious red flags but there are some that are a reasonable-sounding headline

It's exactly those "reasonable"-sounding headlines (and in some cases the ideas and opinions that back them up in the body of the article, but the body has to be provided for that to be relevant, which as you point out it isn't, which is a big problem) that serve as misinformation and/or dog whistles. So "vaccines contain harmful chemicals" could be aimed at antivaxxers (and those susceptible to being pushed there), but it's also technically correct, and apples and bananas, for example, contain these "harmful chemicals" too.
The article could be either fear mongering and disinformation - false, or science based and educational - true, but we can't know which just from the headline.

A headline like "small group of people control the global media and economy" could be a dog whistle for antisemitism - false, or be an observation of life on earth right now - truth.

My point is that there are headlines that would seem like conspiracy theory to some but irrefutable fact to others (and probably the reverse for each respective group), and without more than a headline (and often even with one, of course), it's entirely down to the reader's existing opinions and biases.

Not a great way to test this.

DarkThoughts,

Maybe they targeted redditors specifically.

sik0fewl,

You guys read the articles??

DarkThoughts,

It depends, but I'm not forming my opinion on a loaded headline. If there's a headline like that, then yes, I'd rather check the article to see whether it actually is like that or not. The majority of headlines nowadays are heavily sensationalized, especially the ones from news sites with a certain agenda.

mrbubblesort,

Somehow I got 100%, but it was mainly luck. I really have no clue what the level of support for marijuana is in the US, how young Americans feel about global warming, or how globally respected they feel. I'm not from there, so I don't follow it at all. I think it would've been better if they had an "I don't know / irrelevant to me" option.

vaguerant,

Just took a look here, and yeah. One of the headlines they ask you to rate is "Hyatt Will Remove Small Bottles from Hotel Bathrooms". It's the kind of thing that's basically a coin flip. Without having any context into the story, I have no opinion on whether it's fake or not. I don't think guessing incorrectly on this one would indicate somebody is any more or less susceptible to miscategorizing stories as real/fake.

sab,

I assume the idea is to include some pointless headlines (such as this) in order to provide some sort of baseline. The researcher probably extracts several dimensions from the variables, and I assume this headline would feed into a "general scepticism" variable that measures the likelihood that the respondent will lean towards things being fake rather than real.

Still, I'm not at all convinced about this research design.

Flyingtiger188,

I suspect that where you place yourself on the extremely-liberal-to-extremely-conservative spectrum might correlate with which fake news titles you fall for. What sounds like obvious propaganda to you may sound like an ordinary article from a more sensationalist, less reliable news source to someone else, especially those predisposed to conspiracy theories.

sab,

Of course, there are a few people out there who won't even identify headlines like "Ebola Virus 'Caused by US Nuclear Weapons Testing', New Study Says", "Government Officials Have Illegally Manipulated the Weather to Cause Devastating Storms", and "Left-Wing Extremism Causes 'More Damage' to World Than Terrorism, Says UN Report" as fabricated even when filling out a survey about fake news. But at that point they're not testing susceptibility to fake news, they're testing whether you've already fallen down the conspiracy rabbit hole and hit your head hard enough on the way down to render you incapable of even slight scepticism.

A better study would be, in my opinion, to present screenshots of actual content from social media (Facebook, Reddit, Twitter, wherever) and have users rate on a scale from 1 to 7 how much they trust it (not at all <----> completely). That way you can observe sources, content, how many "likes" a post has, and more dimensions that are more valid indicators of how people might (mis)judge content as being true or false.

Blakerboy777,

I took the survey and it gives you two measures - one for correctly identifying true stories and one for correctly identifying fake. If you mark everything fake the results would say you're too skeptical because you discount real stories as fake. So anything that doesn't sound hyper partisan should be marked as real, even if you could imagine how it could be fake.
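
For what it's worth, here's a minimal sketch of how a two-sided score like that could work (my own reading of the scoring, not the researchers' code; the function and variable names are made up):

```python
# Hypothetical scoring sketch: `answers` maps headline -> the respondent's
# "this is real" judgment, `truth` maps headline -> the actual label.
def score(answers: dict[str, bool], truth: dict[str, bool]) -> dict[str, float]:
    real_items = [h for h, is_real in truth.items() if is_real]
    fake_items = [h for h, is_real in truth.items() if not is_real]

    return {
        # 1.0 = no real story dismissed as fake
        "real_news_detection": sum(answers[h] for h in real_items) / len(real_items),
        # 1.0 = no fake story accepted as real
        "fake_news_detection": sum(not answers[h] for h in fake_items) / len(fake_items),
    }
```

Marking everything as fake would give a perfect fake news detection score and a zero real news detection score, which is exactly the "too skeptical" pattern described above.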

sab,

So they're just casually pretending misinformation isn't being spread about literally anything these days. To me at least, the AI-esque phrasing of the headlines made me distrust even information I rationally know to be true.

tal,

A common tactic I've seen in news headlines is referencing substances that can harm a human without indicating that in the quantities that they are present, they are not a concern. I'm not sure what the right answer would be to the vaccines question given that. If that is the case there, it may be true but misleading.

NetHandle,

Given the results in the comments, one might suspect the headline here is the real fake news.

snooggums,

It seems like the study is more about identifying common dog whistles in headlines than actual misinformation. A shift in population demographics happens all the time, so that one could be true, but the phrasing of "non-white" hints that it is probably an article loaded with misinformation about the cause and implications of the demographic shift. No idea how it was scored, though.

Eggscellent,

The answer for that one was true.

mPony,

listen I just want to feel good about getting a good score on a test.

AlteredStateBlob,

Weird. The only people I know that continually and aggressively bring up very obvious misinformation are the 50+ people in my life.

somefool,

I think the young feel immune, and that they feel socially progressive news cannot be lies because "that is not what our side does, we have ethics".

It's not true in practice, though. Fake news is used to sow division, and making people angry on both sides is part of it. The far-right, boomer fake news is more obvious because it's outlandish, but there's more than that out there.

niktemadur,

They were targeted and blasted in the past decade or two, via the internet. The 60-70-somethings got targeted and blasted in the 90s, via Murdoch and Limbaugh... the snake and the pig.

Ulu-Mulu-no-die,

That's anecdotal experience. I'm 50+ and I got 19/20: I identified 100% of the fakes and marked one of the real ones as fake, so I'm on the skeptical side of things.

mPony,

I got the exact same result, and am the same age. Therefore we must band together against those who are older than us, and also those younger than us. :)

snooggums,

How can you tell which answers were which?

Oh, I see that I got 90% fake news but have no idea which ones I got wrong.

Ulu-Mulu-no-die,

You can tell by the "real news detection" and "fake news detection" results. They don't tell you which one is wrong, probably to keep other people from "copying" the correct answers.

sab,

Ironically the study ignores the arguably most important part of facing fake news: being critical of sources. And as a reportedly "vulnerable" millennial myself, I have to say I'm critical of this one.

Eggscellent,

I agree, but they need to start somewhere. They'll submit it for review and get some errors pointed out there, then publish in a journal and get some more constructive criticism. The next study can learn from that to make improvements.

sab,

It's already published in Behavior Research Methods. I might be too critical and focusing on the wrong things as a political scientist judging a psychology piece, but at least to me the test does not seem to be that convincing in measuring susceptibility to misinformation. The claim of the article (which I admittedly haven't read carefully) seems to be that "it is feasible to develop a psychometrically validated measurement instrument for misinformation susceptibility", which might still be the case.

potsnpans,

Hooo boy. This article is wildly misrepresenting both the study and its findings.

  1. The study did not set out to test ability to judge real/fake news across demographic differences. The study itself was primarily looking to determine the validity of their test.
  2. Because of this, their validation sample is wildly different from the sample observed in the online "game" version. As in, the original sample vetted participants, and also removed any who failed an "attention check", neither of which were present in the second test.
  3. Demographics on the portion actually looking at age differences are... let's say biased. There are far more young participants, with only ~10% over 50. The vast majority (almost 90%!) were college educated. And the sample trended liberal to a significant degree.
  4. All the above suggests that the demographic most typically considered "bad" at spotting fake news (conservative boomers who didn't go to college) was massively underrepresented in the study. Which makes sense given that participation in that portion relies on largely unvetted volunteers signing up to test their ability to spot fake news. (A toy example of how that skews the headline numbers is sketched right after this list.)
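
Here's that toy example, with invented numbers (not the study's data), of how a skewed sample shifts the headline average:

```python
# Toy example with invented numbers: mean test scores by group, and how
# sample composition alone shifts the overall average.
group_mean = {"young_college": 16.0, "older_no_college": 12.0}  # out of 20

sample_share = {"young_college": 0.90, "older_no_college": 0.10}      # skew like the study's
population_share = {"young_college": 0.35, "older_no_college": 0.65}  # hypothetical

unweighted = sum(group_mean[g] * sample_share[g] for g in group_mean)
reweighted = sum(group_mean[g] * population_share[g] for g in group_mean)

print(f"sample average:     {unweighted:.1f}/20")   # 15.6, dominated by the big group
print(f"population average: {reweighted:.1f}/20")   # 13.4, a rather different story
```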

Most critically, the study itself does not claim that differences between these demographics are representative. That portion is looking at differences in the sample pool before/after the test, to examine its potential for "training" people to spot fake news (this had mixed results, which they acknowledge). This article, ironically, is spreading misinformation about the study itself, and doing the researchers and its readers a great disservice.

GataZapata,

Regarding 3, that is the bane of many studies. College students are a demographic to which researchers tend to have easy access: they have enough time to participate and can be motivated by €20 Amazon vouchers.

Kwakigra,

I would cheat on this test because I cheat in real life. I've been humbled enough times not to put total faith in my initial impression and would rather have more evidence than whatever I happen to be aware of at the moment to determine whether a claim is true.

androogee,

Absolutely. The problem isn’t that some people can psychically know whether a headline is true and some can’t.

The problem is deciding that you know without checking. Which is exactly what this test seems to want you to do.

I mean what does “real” even mean in this context? Just that it’s a published headline? Or that it’s a fact checked headline?

What if it’s true, but it’s not a published headline?

¯\_(ツ)_/¯

Whirlgirl9,

i got 20/20. did i win?

y0ink,

Same :)

Whirlgirl9,

XD

realChem,

Hey all, thanks for reporting this to bring some extra attention to it. I'm going to leave this article up, as it is not exactly misinformation or anything otherwise antithetical to being shared on this community, but I do want to note that there are four different sources here:

  • There's the original study which designed the misinformation susceptibility test; the ArXiv link was already provided, but in case anyone would like a look the study was indeed peer reviewed and published (as open access) in the journal Behavior Research Methods. As with all science, when reading the paper it's important to recognize exactly what it is the authors were even trying to do, taking into account that they're likely using field-specific jargon. I'm not a researcher in the social sciences so I'm unqualified to have too strong an opinion, but from what I can tell they did achieve what they were trying to with this study. There are likely valid critiques to be made here, but as has already been pointed out in our comments many aspects of this test were thought out and deliberately chosen, e.g. the choice to use only headlines in the test (as opposed to, e.g., headlines along with sources or pictures). One important thing to note about this study is that it is currently only validated in the US. The researchers themselves have made it clear in the paper that results based on the current set of questions likely cannot be compared between countries.
  • There's the survey hosted on streamlit. This is being run by several authors on the original paper, but it is unclear exactly what they're going to do with the data. The survey makes reference to the published paper so the data from this survey doesn't seem like it was used in constructing the original paper (and indeed the original paper discusses several different versions of the test as well as a longitudinal study of participants). Again, taken for what it is I think it's fine. In fact I think that the fact that this survey has been made available is why this has generated so much discussion and (warranted) skepticism. Being able to test yourself on a typical survey gives a feel for what is and isn't actually being measured. I consider this a pretty good piece of science communication / outreach, if nothing else.
  • There is the poll by YouGov. This is separate from the original study. The researchers seem to be aware of it, but as far as I can tell weren't directly involved in running the poll, analyzing the data, or writing the article about it. This is not inherently a bad poll, but I do think it's worth noting that it is not a peer reviewed study. We have little visibility into how they conducted their data analysis here, for one thing. From what I can tell without knowing how they actually did their analysis the data here looks fine, but (this not being a scientific paper) some of the text surrounding the data is a bit misleading. EDIT: Actually it looks like they've shared their full dataset including how they broke categories down for analysis, it's available here. Seeing this doesn't much change my overall impression of the survey other than to agree with Panteleimon that the demographic representation here is not very well balanced, especially once you start trying to take the intersections of multiple categories. Doing that, some of their data points are going to have much lower statistical significance than others (a rough margin-of-error sketch follows this list). My main concern is that some of the text surrounding the data is kinda misleading. For example, in one spot they write, "Older adults perform better than younger adults when it comes to the Misinformation Susceptibility Test," which (if their data and analysis can be believed) is true. However nearby they write, "Younger Americans are less skilled than older adults at identifying real from fake news," which is a different claim and as far as I can tell isn't well supported by their data. To see the difference, note that when identifying real vs fake news a reader has more to go on than just a headline. MIST doesn't test the ability to incorporate all of that context; that's just not what it was designed to do.
  • Finally, there's the linked phys.org article. This is the part that seems most objectionable to me. The headline is misleading in the same way I just discussed, and the text of the article does a bad job of making it clear that the YouGov poll is different from the original study. The distinction is mentioned in one paragraph, but the rest of the article blends quotes from the researchers with YouGov polling results, strongly implying that the YouGov poll was run by these researchers (again, it wasn't). It's a bit unfortunate that this is what was linked here, since I think it's the least useful of these four sources, but it's also not surprising since this kind of pop-sci reporting will always be much more visible than the research it's based on. (And to be clear, I feel I could have easily linked this article myself; I probably wouldn't have even noticed the conflation of different sources if this hadn't generated so many comments and even a report. Just a good reminder to keep our skeptic hats on when we're dealing with secondary sources.)
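
Here's that rough margin-of-error sketch: the standard normal-approximation formula for a proportion, with invented cell sizes (my back-of-the-envelope, not YouGov's methodology):

```python
# Rough 95% margin of error for a proportion, using the normal
# approximation: MoE = 1.96 * sqrt(p * (1 - p) / n).
from math import sqrt

def margin_of_error(p: float, n: int) -> float:
    return 1.96 * sqrt(p * (1 - p) / n)

# Invented cell sizes: a full sample vs. an intersection of categories.
print(f"n=1500: +/- {margin_of_error(0.5, 1500):.1%}")  # ~2.5 points
print(f"n=60:   +/- {margin_of_error(0.5, 60):.1%}")    # ~12.7 points
```

A proportion estimated from a 60-person intersection of categories can swing by more than 12 points either way, so differences between such cells are easy to over-read.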

Finally, I'd just like to say I'm pretty impressed by the level of skepticism, critical thinking, and analysis you all have already done in the comments. I think that this indicates a pretty healthy relationship to science communication. (If anything folks are maybe erring a bit on the side of too skeptical, but I blame the phys-org article for that, since it mixed all the sources together.)

somefool,

Throwing phys.org into my "not necessarily reliable sources" list. Sorry about this, I'll be more careful in the future.

I added "Misleading" to the title.

realChem,

No worries! It's generated a lot of interesting discussion at least. It was never on my radar as being especially untrustworthy, either.

aes, (edited)

I feel like a lot of people are missing the point when it comes to the MIST. I just very briefly skimmed the paper.

Misinformation susceptibility is being vulnerable to information that is incorrect

  • @ach @GataZapata It seems that the authors are looking to create a standardised measure of "misinformation susceptibility" that other researchers can employ in their studies so that these studies can be comparable, (the authors say that ad-hoc measures employed by other studies are not comparable).
  • @lvxferre the reason a binary scale was chosen over a Likert-type scale was because:
    1. It's less ambiguous to participants
    2. It's easier for researchers to implement in their studies
    3. The results produced are of a similar 'quality' to the Likert-scale version (see the toy simulation after this list)
  • If the test doesn't include pictures, a source name, or a lede sentence and produces similar results to a test which does, then the simpler test is superior (think about the participants here). The MIST shows high concurrent validity with existing measures and states a high level of predictive validity (although I'd have to read deeper to talk about the specifics).
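
Here's that toy simulation (my own invented model, nothing from the paper): if Likert and binary answers are both noisy readouts of the same underlying judgment, the total scores rank respondents almost identically.

```python
# Toy simulation: compare total scores from a 1-7 Likert readout vs. a
# binary true/false readout of the same underlying per-item judgments.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_items = 1000, 20

ability = rng.normal(size=(n_people, 1))                    # latent discernment
judgment = ability + rng.normal(size=(n_people, n_items))   # noisy per-item call

likert = np.clip(np.round(judgment * 1.5 + 4), 1, 7)  # 1-7 scale
binary = (judgment > 0).astype(int)                   # true/false

# Correlation between the two total scores is high (~0.9).
print(np.corrcoef(likert.sum(axis=1), binary.sum(axis=1))[0, 1])
```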

It's funny how the post about a misinformation test was riddled with misinformation because no one bothered to read the paper before letting their mouth run. Now, I don't doubt that your brilliant minds can overrule, off the top of your heads, a measure produced with years of research and hundreds of participants, but even if what I've said may be contradicted by a deeper analysis of the paper, shouldn't the paper at least be the baseline?

potsnpans,

Thank you for this!

I have to say though, it's really interesting to see the reactions here, given the paper's findings. Because in the study, while people got better at spotting fake news after the game/test, they got worse at identifying real news, and overall more distrustful of news in general. I feel like that's on display here - with people (somewhat correctly) mistrusting the misleading article, but also (somewhat incorrectly) mistrusting the research behind it.

aes,

That's a very interesting anecdote, now that you say it

somefool,

Thanks for this. I'll freely admit I'm an idiot and didn't feel smart enough to understand the paper (see username). Clarification is much welcome.

I added the link to the paper to the body of the post.

mPony,

Good on them for working toward a simpler test with provably similar results as more rigorous tests.

I feel certain that they're researching something in which companies like Meta are already well-versed. If FB doesn't already have a "gullibility index" I would be completely shocked. I feel equally certain FB would treat such analysis as a trade secret and wouldn't dare reveal, nor publish, their findings.

Ulu-Mulu-no-die,

You're more resilient to misinformation than 96% of the US population!

I'm not in the US? I did specify my country; are they comparing everyone to the US only?

ach,

That is such a lazy study it's pitiful, and it in no way tests your ability to discern the veracity of news, so even the full marks I got are useless.

First of all, if you generate fake headlines, either test someone's general knowledge or their critical thinking; don't conflate the two. Secondly, it's the latter that actually matters most, so if you build your knowledge based on headlines, you're already close to the fake news group.

Ulu-Mulu-no-die,

if you build your knowledge based on headlines

You'd be surprised how many people do just that, unfortunately, both on Reddit and Facebook.

Dufurson,

That test is BS: the first round said I might be gullible, the second round 17/20. The study is the fake news.

GataZapata,

I got 19/20, my girlfriend got 15/20. We both think the test design is not super good: having only the headlines leads to guessing sometimes, where parts of the article might have painted a clearer picture.

Deceptichum,

20/20 and it ducking sucked.

Some are super obvious, like “did ancient aliens invade and cause Covid”, and the others are like “3 experts disagree with the jury's verdict”.

Also you can’t select Australia as a country but North Korea is fine? Yeah fuck you too England.

jinno,

Yeah, there were a few headlines where I was like “Well… maybe? If I can’t actually read it I’ll assume false, though.”

niktemadur,

I imagine a main goal is to create a sensation of being overwhelmed, which in turn can easily make one apathetic, cynical.

somefool,

https://en.wikipedia.org/wiki/Firehose_of_falsehood

The immediate aim is to entertain, confuse, and overwhelm the audience, and disinterest in or opposition to fact-checking and accurate reporting means the propaganda can be delivered to the public more quickly than better sources.

Spudger,

I recall reading something about fake news and propaganda some decades ago. Can’t recall the source book but it goes like this:

If one person tells you something absolutely outrageous you won’t believe it. If a second person tells you the same story you will stop and wonder. If a third person, preferably someone you respect, tells you the same you will have no doubts about the story at all.

I have no idea how true this is but if two more people tell you the same thing…

somefool,

That's... That was true for me, I think. I'm old, didn't always have the internet, I trusted books and family.

But I trusted books, which made me a bit of an alien in my family. And then I acquired extreme suspicion of everything when, at the same time, I started paying attention to far-right politics, and my family got sucked into far-right thinking.

Now they went full Qanon, which pretty much radicalized me. Things are so emotionally charged for me now that I have to doubt and cross-check out of sheer and absolute spite. That shit robbed me of my family and I am so, so pissed.

Spudger,

That’s the problem with families. Despite your best efforts at teaching them to have a healthy level of cynicism they can end up believing any old crap they read online. You have to watch out for the warning signs.

koreth,

Got 20/20, was rewarded with a message, “You're more resilient to misinformation than 100% of the US population!” and looked for the Fake button because as a member of the US population, that is a mathematical impossibility.

somefool,

Rounding error :)
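
Concretely, with an invented underlying value:

```python
# If the true percentile is, say, 99.6 and the app displays whole
# percentages, it rounds up to an impossible-sounding "100%".
percentile = 99.6  # hypothetical
print(f"You're more resilient to misinformation than {percentile:.0f}% of the US population!")
```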

vaguerant,

Apparently I'm more resilient to misinformation than 100% of the UK population, but I'm not from the UK; I had to lie on the form because they didn't have my country. Turns out the real fake news was me.
