BorisBarbour

@BorisBarbour@mastodon.social

Neuroscience (CNRS, ENS); pubpeer.com; peeriodicals.com; referee3.org. Views my own. Inactive @BorisBarbour on Twitter

BorisBarbour, to random

Dispiriting to hear that the government is jealous of Italy and Spain because of their publication output. The equation is very simple: an APC at (for example) Hindawi always costs less than an ANR grant.

https://www.radiofrance.fr/franceculture/podcasts/le-temps-du-debat

erinnacland, to academicchatter

"Wiley to stop using “Hindawi” name amid $18 million revenue decline"

"In March, Clarivate removed 19 Hindawi journals from its Web of Science index for failing to meet editorial quality criteria. Wiley later shut down four Hindawi journals it had identified as 'heavily compromised by paper mills.'"

"[Retraction Watch has] logged more than 3,400 retractions from Hindawi journals [...] In April, an executive said the publisher would retract 1,200 more"

https://retractionwatch.com/2023/12/06/wiley-to-stop-using-hindawi-name-amid-18-million-revenue-decline/
@academicchatter

BorisBarbour,

@erinnacland @academicchatter

That strategy could also backfire badly if they don't get a grip on the problem and end up contaminating their mother brand.

BorisBarbour, to random
adredish, to random

An interesting consequence of the hypothesis that human social structures are built on "assurance" or coordination games rather than prisoner's dilemmas:

Coordination games have two stable states. If you are living in a world where everyone else is cooperating, it is in your best interests to cooperate as well. If you are living in a world of cheaters, cooperation is for suckers.

This means your perception of your community has a big impact on your own behavior.

While I agree that we do need things like @deevybee 's defense against the dark arts and @PubPeer and the enforced "share the data as is" regulations that @BorisBarbour has been talking about, I think we also need to make sure that we CELEBRATE openness, integrity, and we make sure that we report it to the world. We do not want all of our news reports to be about fraud.

So, can I recommend a policy? For every fraud that gets reported, find a positive success to talk about. I guarantee they are out there. In fact, I bet they are so common, we don't notice them.

All the people who share their code and fix the bugs that others find. The labs that say "come on by and we'll show you how we do stuff". The people who work with others to make their data useful and not just "out there". There are lots and lots of these positive examples. I worry they get lost because they are so common. We need a hashtag for celebratory cooperation in the sciences. I'm open to suggestions.

PS. For those who don't know it, the coordination game is structured so that, for player A (given player B's choice): C(C) > D(C) > D(D) > C(D), as compared to the prisoner's dilemma, which is: D(C) > C(C) > D(D) > C(D).

In the coordination game, it is best to do what the other player is doing. In the prisoner's dilemma it is best to defect. There are n-player extensions of this as well.
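
A minimal sketch (an editorial addition, not part of the thread), with payoff numbers chosen only to satisfy the two orderings above; it shows that the best response in a coordination game is to match the other player, while defection dominates in the prisoner's dilemma:

```python
# Illustrative payoffs only (assumed numbers), chosen to satisfy:
#   coordination:       C(C) > D(C) > D(D) > C(D)
#   prisoner's dilemma: D(C) > C(C) > D(D) > C(D)
# payoff[my_move][their_move] -> my payoff
coordination = {
    "C": {"C": 4, "D": 0},
    "D": {"C": 3, "D": 1},
}
prisoners_dilemma = {
    "C": {"C": 3, "D": 0},
    "D": {"C": 4, "D": 1},
}

def best_response(payoff, their_move):
    """Return the move that maximises my payoff given the other player's move."""
    return max(("C", "D"), key=lambda my_move: payoff[my_move][their_move])

for name, game in [("coordination", coordination),
                   ("prisoner's dilemma", prisoners_dilemma)]:
    print(name,
          "| best reply to C:", best_response(game, "C"),
          "| best reply to D:", best_response(game, "D"))
# coordination: match the other player (C vs C, D vs D) -> two stable states
# prisoner's dilemma: defect whatever the other player does
```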

BorisBarbour,

@adredish @deevybee @PubPeer

In other words, researchers should be evaluated on their practices and not on the number of their publications or the impact factors of their journals?

To some extent we can make that happen, but there are many other places than social media that may be more effective. Promotions, hiring, grants, reviews...

elduvelle, to random
@elduvelle@neuromatch.social

So just made an agreement with... the predatory publisher

I know not everyone liked it, but I was actually finding ResearchGate quite useful. Now this flushes what little reputation they had left down the toilet. MDPI might have a few good journals, but overall it is notorious for predatory practices like ignoring the reviewers and not leaving enough time between submission and publication for proper review.

This is a really disappointing move from ResearchGate, and one that's not in their interest. I guess they got a big check for it? Makes you wonder what other unscientific content they are getting paid to promote.

I'm going to write to them (e.g. press@researchgate.net) to ask that they cut all ties with MDPI and any other predatory publishers. I hope that many users will complain too. If it has no effect, I'll just have to close my account and never go there again...

"MDPI’s commitment to delivering a high-quality service for our authors" :rofl:​

I guess you could say that not having proper peer-review is a service to the authors in a way?? 🤔​

source: publisher announcement from ResearchGate

BorisBarbour,

@brembs @Etche_homo @thartbm @elduvelle

Re: parallel universes of predatory and non-predatory papers, I'd say that the separation exists but is not hermetic and is probably becoming less so.

@deevybee recently covered that ground:

https://deevybee.blogspot.com/2023/11/defence-against-dark-arts-proposal-for.html

BorisBarbour, to random
BorisBarbour,

Note also the pathetic irrelevance of promises to share the data.

eLife, with a supposedly robust data-sharing policy, are also perfectly content to hide behind a promise that the authors do not keep:

https://pubpeer.com/publications/62FDB39609D2177EA85FC04E84A4BF#5

BorisBarbour,

@albertcardona

Out of curiosity I asked the authors to share the data. This was Dalli's reply:

"If you wish to access the raw data files we are also happy to facilitate this. Due to ongoing IP restrictions, and in line with the funders policy (Wellcome Trust), we are not in a position to release these files... We instead can facilitate data access either virtually or in person where I will be happy to take you through the raw data myself."

BorisBarbour,

@albertcardona

Presumably that data policy was allowed by an eLife editorial decision following a prior access request (not from me). They chose not to enforce a completely reasonable request under their policy. They have contributed to the rearguard defence of this garbage.

Sadly, eLife's data-access policy, much like PLoS's, has devolved into window-dressing and open-washing. It's neither observed nor enforced. The PLoS policy is moreover egregiously self-contradictory even as written.

BorisBarbour,

@albertcardona

Of course, nothing prevents authors from sharing data if they wish, wherever they publish, but a lot of the benefits for quality and integrity only flow if access is mandatory. Otherwise the dodgy stuff just remains hidden, as in this case. Journals have largely failed at this and it seems that it's going to be up to funders (governments).

BorisBarbour,

@adredish @albertcardona

I disagree with almost all of that... Most of my arguments are laid out here:

https://referee3.org/2019/12/25/data-sharing-should-be-mandatory-public-and-immediate/

Rather than me litigating each point here, if you are interested, read the blog and let me know which points you disagree with.

The problem is that when you need the data because a problem is suspected, as in this specific case, you can't get it. That's basically what always happens. The dodgiest data is the hardest to get. The excuses are jokes in this case.

BorisBarbour,

@neuralreckoning @adredish @albertcardona

Although fabrication of whole datasets very much does happen (when a papermill paper is investigated), it's obviously much rarer for "real" research. However, if the data were always fine, they really shouldn't be that difficult to obtain upon request. Yet they are. Pressing blood from a stone doesn't cover it.

I'll thread a few replies to your points, David.

BorisBarbour,

@neuralreckoning @adredish @albertcardona

  1. I think biology places far too much weight on "concepts" that are not fully supported by the data. We don't have much that compares with principles like the conservation of energy. So it can be critical to examine the underlying data, sometimes in ways that the authors didn't plan for or might not wish to facilitate.

If you want or need to examine a conclusion, you can do a lot with the data and code. Full replications are rarely feasible.

BorisBarbour,

@neuralreckoning @adredish @albertcardona

2a. I don't know of any systematic study of a correlation between data-sharing and quality either, but my own experience of having to publish data and a (barely) reproducible analysis certainly made me feel that the results were more robust. And there's a lot more (still anecdotal) evidence for the converse, that crap data tends to remain hidden.

BorisBarbour,

@neuralreckoning @adredish @albertcardona

2b. Preparing data for sharing can be a burden (and therefore a cost), but I'd argue that it's mostly the burden of producing high-quality research. For now I'm arguing that data should be shared "as is". If your work is sloppy and your data are in crap shape, that's precisely when and how they should be shared. No extra burden. If you are not prepared to share your data, why should your paper be trusted?

BorisBarbour,

@neuralreckoning @adredish @albertcardona

  3. Agree that wholesale fabrication is rare in the papers we'd be interested in. However, through involvement in PubPeer, I have just seen sooo many cases where:
  • there are screamingly legitimate concerns
  • somebody desperately wants to analyse the data
  • the data are never shared

The papers of the initial post are a good example. Really nice concept (SPMs). Comically bad science. 80 publications. Impossible to get the data.

BorisBarbour,

@neuralreckoning @adredish @albertcardona

  4. So it turns out we agree. Share the data as it is. We'll slowly learn how to format it better. If somebody is interested, they'll work it out.

Is that really such a significant extra burden that would make "science more expensive and move more slowly"?

BorisBarbour, to random

News a week old, but:

https://www.nature.com/articles/d41586-023-03398-4

“UNLV... is committed to maintaining the highest standards for research integrity campus wide”.

BorisBarbour,

'... says Karl Ziemelis, chief physical sciences editor at Nature. “What the peer-review process cannot detect is whether the paper as written accurately reflects the research as it was undertaken.” '

Yet

“Virtually every serious condensed-matter physicist I know saw right away that there were serious problems with the work,” says Peter Armitage, an experimental physicist at Johns Hopkins University in Baltimore, Maryland.

Quite apart from the high-profile prior problems.

BorisBarbour, to random

"According to Schrag, he, Patrick, and Bik might file a federal whistleblower lawsuit to receive a portion of any NIH funds the government claws back from USC if federal authorities deem Zlokovic’s work fraudulent."

Seems like a good idea:

  1. deserved compensation when nobody else will help
  2. institutions will only act when it costs them more not to, and funders (cough NIH) could but aren't making that happen.

BorisBarbour,

It's also worth taking a step back to consider how many young scientists would be corrupted or broken in this sort of lab.

BorisBarbour, to random

What a house of cards. Much success, little truth.

https://www.science.org/content/article/misconduct-concerns-possible-drug-risks-should-stop-stroke-trial-whistleblowers-say

Congratulations to Charles Piller and Science.

A few other points of interest below.

BorisBarbour,

Two more sleuths come out: Kevin Patrick and Mu Yang. Congratulations!

BorisBarbour,

“USC takes any allegations relating to research integrity seriously.”

"NIH told Science it takes research integrity concerns very seriously"

Let us know if you actually do something?

BorisBarbour,

Anonymous scientist quoted in the article:

"My main residual question is, why? Why would one bother to go to these lengths to change images, when the guy has the resources to generate loads of great papers without doing this?”

It's amazing how many people forget that real experiments wouldn't give the desired answers.
