Finally, our most recent review was published in the journal Biology. For my taste, its title is too long: Proteolytic Cleavages in the VEGF Family: Generating Diversity Among Angiogenic VEGFs, Essential for the Activation of Lymphangiogenic VEGFs. On top of that, the title contains an abbreviation: VEGF.
Most publishers are in it for the money
This is the first time we have published with MDPI. Although the criticism of MDPI has not entirely gone away since its 2015 vindication from Beall's List, the critique is now mostly centered around the accusation that MDPI is after the money and not the quality. That said, most publishers are in it for the money. The stock-market-listed publishers in particular are legally obligated to be in it for the money: Elsevier, John Wiley & Sons, etc. And many others, such as Springer Nature, are desperately trying to join the stock-market-listed club. We as scientists could probably use a little bit of this attitude, because we are naturally bad at making money (we can generate knowledge, but not revenues).
Concerns over review quality
Additionally, MDPI's review process has been criticized as not very rigorous. What was our experience? Our paper was apparently scrutinized by four reviewers, and you can read the reviewers' comments here: https://www.mdpi.com/2079-7737/10/2/167/review_report.
How did we experience the reviewers' quality of feedback?
Three out of the four reviewers gave real feedback. Reviewer 3 could, for all practical purposes, be replaced by Grammarly.com or the Microsoft Word spell checker. Reviewer 2 also focused almost entirely on style and presentation. Don't get me wrong: good style and presentation are very important! But first of all, let's get the science right!
The remaining two reviewers were apparently not very familiar with the research topic, vascular biology. This agrees with my own experience: I often get requests from MDPI journals to review papers outside of, or only tangential to, my area of expertise (requests I immediately decline, since I already have too many papers to review).
You could argue in MDPI's favor that our manuscript was already quite good when we first submitted it. But please: we cobbled it together in a hurry during December, and I know that it still contains quite a few errors (which we noticed just minutes after the paper was published).
We wanted to try an Open Review, but failed
It is striking that we had opted for open review (i.e., using only reviewers who agree to publish their identity together with their reviews), yet none of our reviewers revealed their identity. There is probably some fine print somewhere saying that the editor can override this choice when no reviewers can be found who agree to give up their anonymity. To a certain extent, I can feel the editor's pain: it is difficult enough to find good peer reviewers in the first place, because - despite many attempts at change - reviewing manuscripts is not rewarded in the current scientific system.
Is it fast? Imho, some of it was too fast
The review process was fast. The revisions were even faster. And the publishing was much too fast. I was terrified when I saw that we might not get a second set of proofs. The first proofs had to be modified extensively because the layout had changed the image placement. As a consequence, almost all the references needed renumbering. Moreover, during proof generation, some parts of the figure legends were mistaken for body text. Therefore, we had to shift large portions of text around during proofing. And if you have ever used Word, you know that this is nothing to be excited about. At least MS Word doesn't crash anymore, as it used to in the old days - back then, hitting the Ctrl-S key combo after every minute of editing was outsourced from the brain to the spinal cord. But this time, too, Word did not disappoint: it introduced unwanted formatting changes that were impossible to undo.
None of us managed even a glance at the second proofs this Monday before the paper went online, after which the link to the second proofs expired. We have plenty of other things to do: prepare lectures, participate in faculty meetings, take care of students, and - last but not least - we occasionally also like to do some research. Fast publishing is not a virtue in itself, but it certainly helps to keep expenses in check.
After the game is before the game
Next time it will not be a review but original research, and we will choose another publisher. The experience was definitely not a catastrophe, but I can clearly see how such an over-streamlined process can go wrong once in a while with less scrupulous authors. And there are many examples (see the Wikipedia entry).
For those interested, here is the link to the translation of the Norwegian article discussing the MDPI quality issue.
After this experience, a scientific analysis of MDPI journals was performed, looking at self-citations, citation cartels, special issues, APC charges, and review and acceptance times: https://doi.org/10.1093/reseval/rvab020. I only became aware of this article more than half a year after its publication.
The take-home message is that MDPI, as well as the publisher OMICS International, engages in editorial behaviours that fall under the umbrella of predatory practices. At the same time, their operation "has reached such a level of sophistication that they totally or partially comply with the formal criteria that serve to differentiate between predatory and legitimate journals". The authors of this analysis use the term "non-evident/hidden predatory publisher".
At the same time, many legitimate and respected scientists serve on the editorial boards of MDPI journals; I personally know quite a few. As always, the truth is not black or white but some shade of grey. Besides, this article was published by the reputable publisher Oxford University Press, which could be interpreted as a conflict of interest.