jonny, (edited)
@jonny@neuromatch.social

Apparently #Clarivate's Web of Science, one of the major proprietary indexes that employers use to determine whether papers in a journal can be considered in tenure & promotion decisions, denied @joss's request to be indexed without even telling us. This is not the first time JOSS has been rejected.

I checked their "objective review criteria" and JOSS easily passes all the qualifications.

Speaking strictly in my personal capacity and not in my role as a JOSS editor or reviewer, but as a matter of fact: these indexers are a fucking racket.

https://github.com/openjournals/joss/issues/1283

Edit: here's the cause of rejection:

We received a desk rejection (based on an initial check) as we don't have:

Editor titles and affiliations listed.
A postal address for the publisher.

https://github.com/openjournals/joss/issues/1283#issuecomment-1971277233

But they are listed: https://joss.theoj.org/about

And the submission was last May; we only just got the response.

jonny,
@jonny@neuromatch.social

@rmounce wrote a piece on this recently, and imo his conclusions are correct:

"I talked about a tale of two open access journals catering for the same authors, one of which has author-side article processing charges (APCs): SoftwareX, and the other: Journal of Open Source Software (JOSS), which does not charge APCs.

they publish a high volume of papers, with over 300 in SoftwareX and over 400 in JOSS in 2023, challenging the notion that “diamond open access can’t scale”. However, that is where their similarities end.

Yet two proprietary journal indexers have not given these journals equal treatment. Scopus (Elsevier) and Web of Science (Clarivate) have accepted SoftwareX into their indexes but have refused to index JOSS, despite multiple applications from the JOSS team

The best solution here is not to beg for JOSS to be included in these proprietary indexes, but rather to call institutions and departments relying on Scopus and Web of Science to review and change their policies."

https://council.science/current/blog/open-science-round-up-january-2024/

Refusal to index JOSS is transparently an attempt to deter submissions to a journal that costs next to nothing to operate while providing high-quality, collaborative, open peer review that perfectly matches the needs of the community it serves. JOSS is too compelling an example of what waits on the other side of the abolition of commercial publishing, and the refusal to index it shows us what barriers remain to reach it.

jonny,
@jonny@neuromatch.social

To anyone in any position to affect T&P criteria at their institution: here's a smoking gun demonstrating that these indexes are not based on "rigorous, objective review criteria," but are instead tools by which the commercial publishers prop up their system of profit extraction, stamping out free-to-publish, free-to-read journals while pocketing billions in public funding from their APC-driven prestige treadmill.

Restricting the allowable journals to those indexed by Scopus and WoS serves no one's interests but the commercial publishers'.

giorginolab,
@giorginolab@mstdn.science

@jonny Self-subjugation to proprietary indices is a serious part of a serious problem. It should be patently obvious that's a recipe for disaster (already in progress). Very few (but >0) forward-thinking institutions are taking note. The root of the problem is that the actual service indices provide is "shielding from accountability (when convenient)", a highly sought-after commodity in an increasingly bureaucratised environment.

brembs,
@brembs@mastodon.social

@jonny @joss

Obviously they're a racket! They let the journals decide which impact factor they get:

https://bjoern.brembs.net/2016/01/just-how-widespread-are-impact-factor-negotiations/

To really drive this home, compare these two WoS entries and decide whether to laugh or cry:

villavelius,
@villavelius@mastodon.online

@brembs @jonny @joss
The Impact Factor is primarily meant for the marketing department of the publisher in question. Its impact on the scientific community is collateral damage.

jonny,
@jonny@neuromatch.social

@villavelius
@brembs @joss
If by collateral damage you mean the entire mechanism by which that marketing number has any effect, then yes.

manisha,
@manisha@neuromatch.social

@jonny @joss so they don't consider email addresses as sufficient contact information and want a postal address listed on the website (for a purely digital journal)? what kind of archaic requirement is that?!

Anyway, if you don't have a business address yet and don't want to put any personal addresses on the website (which is wise!), but still want to be indexed by them, have you considered a virtual business address like those provided by the companies listed here? That's what several purely digital companies and non-profits use.

jonny,
@jonny@neuromatch.social

@manisha
@joss
I personally think we should go the shady Delaware LLC route and once a year have a summit at the storage container we rent and use as an address

nicolaromano,
@nicolaromano@qoto.org

@jonny @joss If I could boost this multiple times I would. I've seen institutions and grant/job applications specifically asking for the IF of papers, and/or requiring 'x papers with IF > y' as a tenure/promotion requirement. I have even heard 'too bad you published in <very decent and actually quite famous journal>, it would have been better not to have that on your CV'. This is (part of) what drives bad science, and it's often driven by entitled people who don't realise they got lots of high-IF papers because they come from 'big name' laboratories that have the political power to publish anything wherever they want.
I think we all know too many examples of very shaky science published in big journals...

jonny,
@jonny@neuromatch.social

@nicolaromano nothing to add but amen.

albertcardona, (edited)
@albertcardona@mathstodon.xyz

@nicolaromano @jonny @joss

In a way that's fortunate: such a signal clearly indicates which institutions you most definitely do not want to work at. Those are institutions led by cowards who wouldn't for anything in this world use their own judgement to evaluate a piece of research or a scientist, and who instead hide behind the judgement of others, such as journal editors and their pick of reviewers, salted by self-reinforcing power dynamics. Many in leadership positions aren't even aware that this is their strategy, being convinced instead that they stand on high moral ground, the ground supposedly granted by quantitative bibliometrics with its (false) promise of impartiality. They were promoted supposedly for their prowess as scientists (if not mismeasured by impact factors), yet such is the depth of their incompetence as leaders and academic administrators.
