Discussion about this post

Neural Foundry

The beige flag framing is perfect. WPS translation errors explain the mechanism, but the real issue is how these papers passed review with such obvious formatting problems and circular discussions. I ran into something similar when reviewing a submission last year where Google Translate artifacts were everywhere, but the editorial system just waved it through. The author declarations of no GenAI use while using WPS AI tools are also kinda wild, like they're technically correct but missing the point entirely.

Gabor Schubert

Apart from the Kong moth, the references in the MDPI sweetener review (https://doi.org/10.3390/foods14183182) are pretty messed up. I spot-checked a few references about specific sweetener types in the article:

Monellin [101] points to an article about Neotame (https://doi.org/10.2903/j.efsa.2025.9480)

Neotame [102] points to an article about Advantame (https://doi.org/10.1016/j.fct.2011.06.046)

Lugduname [111] points to an article about Sucralose and Aspartame (https://doi.org/10.17807/orbital.v16i4.21357; Lugduname is mentioned once in the text, but the article is not about it)

Curculin [178] points to an article about Glycyrrhizic acid (https://doi.org/10.1016/j.ultsonch.2021.105696)

It seems that many references are off or point to irrelevant articles. It might be a human mix-up, it could be AI-generated, or both. The usefulness of such a review article is doubtful.
