Wellcome Open Research's Weak Flex

Four years in, the F1000-powered site remains confusing and unpopular with authors


Wellcome Open Research has issued a report boasting about new heights, higher growth, and lower costs. But when you scratch the surface, you see something stalled out, unpopular with authors, and lacking a clear purpose.

Let’s call it another weak flex, akin to attempts by eLife and Gates Open Research last year — instead of impressing, it draws attention to fundamental shortcomings.

In this case, the report seems to underscore that Wellcome Open Research may be as much about content and brand marketing as anything else — after all, papers published in Wellcome Open Research have to include at least one Wellcome-funded author. It’s a house organ.

Wellcome Open Research uses the F1000 Research infrastructure and review approach, where a paper is published and then reviewers are asked to rate it as “Approved,” “Approved with Reservations,” or “Not Approved.” There is no blinded, pre-publication review process, and no editorial accountability beyond the open review process. If one reviewer rates a published submission “Approved” and another rates the same item “Approved” or “Approved with Reservations,” it gets indexed in PubMed (via the PMC back door) and Scopus. Submissions not reaching these lofty standards remain available anyway. Of 2020 submissions, 44% (128 articles) have not yet been approved. Wellcome Open Research has a long tail of unapproved items — at the end of 2020, 14% of items submitted in 2019 were still not approved. Overall, ~34% of submissions have not been approved, yet all remain publicly available — behavior reminiscent of a preprint server.

F1000 Research and the weak flex

The F1000 Research approach to approval has caused consternation at NLM and elsewhere dating back to 2012, when the idea was first posited. F1000 Research reset its editorial standards when Scopus required at least one “Approved” rating for indexing. The quality of the reviews is uneven, and reviewers have been shown to prefer shorter items, leaving many authors of longer works hanging — something traditional peer review rarely does. Reviews also tend to be far shorter than those authors receive from closed peer review.

The process seems to presume virtue and stability, when neither is assured. For instance, if a submission receives two “Approved” statuses, gets indexed, remains open to review, and later receives two or more “Not Approved” statuses, is it de-indexed? What about a submission that receives numerous “Approved with Reservations” and one “Approved,” gets indexed, with a revised version posted, and the revised version receives numerous “Not Approved” statuses? What about a paper that gets one “Approved” and one “Approved with Reservations,” with the next version approved only by the reviewer who had reservations? Is this two “Approveds” if the first reviewer doesn’t sign off on the changes? Or did only one reviewer approve the “Approved” version?

What “Approved” means may be less substantial than you think, according to definitions offered in an F1000 Research application for PMC indexing back in 2012:

Regardless of the convoluted ideas here, the approach outsources accountability in a way that makes it impossible to enforce, with parallels to the now-infamous Parler adjudication system in which five random jurors were asked to review controversies, with their judgment final, while the CEO avoided responsibility or accountability.

Wellcome Open Research has had an identity crisis since it was first discussed in 2016. Back then, Robert Kiley of Wellcome Trust was reluctant to call it a “journal,” with both F1000 Research and Wellcome Trust initially calling it a “publishing initiative.” Kiley specifically said “Wellcome Open Research is not a preprint server,” yet earlier this week he nodded toward Wellcome Open Research bringing the “benefits of pre-printing.” Just as with current preprint servers, there is no delete button and no review bad enough to remove something once posted.

The weak flex becomes more apparent in the graphic adorning an accompanying blog post by Kiley and F1000 Publishing Director Michael Markie, which boasts that a Wellcome-funded and -sponsored publication venue has become — hope you’re sitting down for this:

I can’t help but feel like this is a grocery store claiming it’s the #1 store for workers using their employee discount. Could the fact that Wellcome-funded researchers don’t have to pay a publication fee or APC if they use Wellcome Open Research have something to do with this dubious achievement?

Even with such incentives, it’s more of a horse race than you might think, with two high-volume Springer Nature OA journals each fully publishing — with peer review, editorial accountability, and so forth — nearly as many papers by Wellcome-funded authors:

  1. Wellcome Open Research — 292
  2. Nature Communications — 286
  3. Scientific Reports — 206
  4. PLOS One — 187
  5. eLife — 166
  6. BMJ Open — 116
  7. PNAS — 105
  8. Nature — 85
  9. PLOS Neglected Tropical Diseases — 73
  10. Neuroimage — 64

Just among these outlets, only about 18% of the funded research was published in Wellcome Open Research. The percentage when calculating all Wellcome-funded research published across the literature is probably around 8%, based on prior-year disclosures of Wellcome APC spending and articles.

On top of this, there doesn’t seem to be a growing fan base for Wellcome Open Research among Wellcome-funded authors, either — only 12% of authors who have ever published with Wellcome Open Research have published there more than once, despite the ability to publish smaller units of information.

Of the 292 items published in Wellcome Open Research in 2020, 149 were research articles. This more representative count would put Wellcome Open Research just ahead of BMJ Open, and far lower on the list. Throughout its existence, only 51% of Wellcome Open Research outputs have been research articles — and it’s unclear how many of these have been “Approved.” Spot-checking the 2020 articles, a decent percentage are probably not “Approved,” based on the F1000 Research criteria. With the journals, you know the counts represent reviewed and approved research outputs.

The lower counts and lower yields of Wellcome Open Research suggest a few things:

  • Wellcome Open Research is not as prestigious or attractive a venue as others for 88-92% of Wellcome-funded researchers
  • Authors may be shifting lesser works to Wellcome Open Research to test the platform or satisfy the funder
  • Wellcome Open Research counts quantity rather than quality when boasting about its accomplishments

Not to say some good papers haven’t made their way into Wellcome Open Research. After all, a broken clock is right twice a day, and they have a few legitimately good papers to boast about, both involving Covid-19. However, the range of items in the outlet — from strong articles to unapproved, languishing ones — doesn’t seem healthy or necessary.

Wellcome Open Research also boasts about its costs to publish being 66% lower than the average APC Wellcome paid in 2018-19. Well, if you don’t have to pay for editors, peer-reviewer management systems, professional staff, or normal overheads — farming out most costs to F1000 Research via a contract — costs can be a lot lower. But if what you deliver is full of compromises, many of which may be inappropriate or unhelpful, it may be that you’re missing the point of scientific and scholarly publishing.

Virtue-signaling based on cutting corners to save money while achieving substandard results? Not a great look.

Based on the report, F1000 Research receives ~$1,125 from Wellcome for each item published, whether it’s a research article, a letter, or a data note. For 2020, Wellcome was invoiced $328,500 by F1000 Research, up from $283,000 in 2018, when the cost worked out to $952 per item published. That translates to an 18% increase in per-item cost from 2018 to 2020, or roughly 9% per year — a pretty steep annual increase. Given its eccentric model, the switching costs if Wellcome wanted to move away from F1000 Research may be financially, technically, and reputationally impossible to surmount down the road, even if per-item costs outstrip average APCs elsewhere in a few years.
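The per-item arithmetic is easy to check. A minimal sketch, using only the figures quoted above; the annualized rate assumes smooth compounding across the two years:

```python
# Figures quoted from the report (USD).
invoice_2020 = 328_500   # invoiced by F1000 Research for 2020
items_2020 = 292         # items published in 2020
per_item_2018 = 952      # per-item cost in 2018, as stated

per_item_2020 = invoice_2020 / items_2020        # ~1,125 per item
increase = per_item_2020 / per_item_2018 - 1     # ~18% over two years
annualized = (1 + increase) ** 0.5 - 1           # ~8.7% per year, compounded

print(f"2020 per item: ${per_item_2020:,.0f}")
print(f"Two-year increase: {increase:.0%}")
print(f"Annualized: {annualized:.1%}")
```

Note that the headline invoice total itself rose only ~16% ($283,000 to $328,500); the steeper 18% figure is the change in cost per item published.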

Despite paying more for the platform, Wellcome Open Research is slowing down, with times to “Approved” increasing 20% between 2019 and 2020, up to 94 days. Speed may not be as much of a competitive advantage as Wellcome or F1000 Research seem to think. Most journals take longer, and as we saw above, most Wellcome papers end up in such journals. But if speed is an advantage they promote, they are falling behind.

Why would Wellcome spend hundreds of thousands of dollars per year with F1000 Research to make sure 8% of its funded research gets released via a house organ? It’s a good question. Content marketing is something more corporations are doing — custom publishing, custom platforming, and so forth are being used to control distribution and stage a narrative. In Wellcome’s case, that narrative seems to be infused with a “publishers are expensive and unnecessary” bitterness. But after four years, their funded authors don’t appear to feel the same way — they aren’t using or returning to Wellcome Open Research, and the vast majority are using more traditional outlets.

Flex away, Wellcome Open Research. We see what you’ve got.

