[visionlist] Plagiarism checks in Empirical Manuscripts

Mufti Mahmud muftimahmud at gmail.com
Mon Jul 10 13:05:15 -05 2017


Hello,

I'm sorry to hear that. Your frustration is easy to understand.

You correctly identified the high number of false positives. CrossCheck is
far from being smart, and I totally agree that we need better systems!
But from my experience as an associate editor, I have found CrossCheck to be
a useful tool for identifying submissions that don't deserve to be
submitted to a journal at all. Alas, because of a handful of dishonest authors,
each manuscript now goes through an additional screening process. It's
unfortunate that the editorial manager of your journal activated the
automated messaging rather than a manual check of the detected matches in
the manuscript.

In my humble opinion, even though many publishing companies charge hefty
'article processing charges', the editorial and peer-review processes are
still community services, and there should be a system that stops bad papers
from being sent to the reviewers. CrossCheck does that at an elementary level,
and the responsibility then falls on the journal editors to manually screen the
matches detected in the manuscripts and act accordingly.
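
To make the "elementary level" point concrete, here is a toy sketch of the
kind of word-shingle overlap screen such tools are commonly assumed to use.
This is an illustration only, not CrossCheck's actual (proprietary) algorithm;
the example texts, the 5-word window and the function names are all invented.
It shows why equipment names, stock Methods wording and citation strings
surface as "matches" unless an editor reviews them by hand:

import re

def shingles(text, n=5):
    # Lowercased word n-grams; digits and dots are kept, so citation
    # strings such as "persike et al., 2015" form shingles as well.
    words = re.findall(r"[\w.]+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_report(manuscript, sources, n=5):
    # For each indexed source text, collect the word n-grams it shares
    # with the manuscript.
    manu = shingles(manuscript, n)
    report = {}
    for name, src in sources.items():
        common = manu & shingles(src, n)
        if common:
            report[name] = sorted(common)
    return report

manuscript = ("Stimuli were generated with a ViSaGe system "
              "(Cambridge Research Systems) and calibrated with a "
              "ColorCal colorimeter (Persike et al., 2015).")
sources = {
    "hypothetical earlier paper": (
        "All stimuli were generated with a ViSaGe system "
        "(Cambridge Research Systems) and luminance was calibrated "
        "with a ColorCal colorimeter (Persike et al., 2015).")
}
for name, matches in overlap_report(manuscript, sources).items():
    print(name)
    for phrase in matches:
        print("  flagged:", phrase)

Every "flagged" line in the output is boilerplate Methods wording, a brand
name or a citation; none of it is evidence of misconduct. That judgement is
exactly what the manual screen has to add.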

Best regards,
Mufti



On Mon, Jul 10, 2017 at 4:59 PM, Persike, Malte <persike at uni-mainz.de>
wrote:

> Dear Vision Community,
>
>
>
> During the publication of a recent manuscript, I received a request from the
> editorial office to alter a number of sections in said manuscript. The
> request was triggered by an automated plagiarism check using CrossCheck.
> The whole process left me so puzzled that I thought I’d share my experience
> here, combined with a humble request for a broader debate about the issue
> of plagiarism in empirical research.
>
>
>
> First, what happened? The report contained a whopping 24 different items,
> each asserting plagiarism of the works of others. The email from the
> editorial office was phrased accordingly. It asked us to “amend the affected
> sections by either identifying the fact that it has been reproduced or by
> using original words”, thus presuming all 24 instances of supposed
> plagiarism to be veridical. Most of them were not.
>
>
>
> After a very thorough debate about all 24 items with my co-authors and an
> expert in good scientific conduct at our university’s library, 22 of the
> 24 items were discarded. The remaining 2 items were far from verbatim
> copies of whole sections. They were small parts of larger sentences
> together with explicit citations of the sources from which those parts were
> derived. The other 22 items were discarded not due to subjective reasoning
> but due to obvious glitches in the plagiarism checking algorithms. This
> amounts to a rate of 91.6% false positives. I’ll describe some of the
> sillier instances at the end of this text, but that is not the reason for my
> posting here.
>
>
>
> Instead, the point I would very much like to discuss with you is the handling
> possible plagiarism in empirical studies. Do we have an agreed code of
> conduct for authoring pieces of empirical science? Let me highlight only a
> few points.
>
>
>
> 1) How do we treat Materials and Methods? The Stimuli section will
> necessarily contain similar phrasings when reporting about research that
> uses identical paradigms. The Apparatus section will also be quite similar
> between related studies, as will the Participants section. The same holds
> for the Ethics Statement and the Measures and Analysis. Is it really
> desirable that we need to come up with ever so slightly different
> formulations for identical things, only to avoid verbatim copies? Are there
> not limitations as to how a temporal 2-AFC task can be described with
> appropriate brevity? And would it – particularly for Methods and Results
> – perhaps even be prudent to stick to a rather formulaic language protocol
> in order to make comprehension easier? I for one would certainly not wish to
> read a Methods section which goes like “Stimuli were created according to
> XY (2004). Handling of Participants was similar to XY (2010). Apparatus was
> as described in XY (1998). Task was taken from XY (2001). Analysis and
> measures are according to XY (1992).” This does not help me to efficiently
> understand what’s being done.
>
>
>
> 2) How do we handle self-citations? Many of us work on the same topics
> over long stretches of time, sometimes decades. Good scientific research
> usually means to advance present knowledge step by step, pulling only very
> few levers at once for each new experiment. Is it not to be expected that
> at some point we have arrived at concise, well-formulated, and most
> comprehensible ways to verbally introduce specific concepts? Is it really
> necessary that we find ever new ways to phrase the exact same ideas?
>
>
>
> 3) Is it the prime virtue of empirical research to be phrased originally?
> Is it not first and foremost the results and their implications that define
> original and interesting work? Even if we set high standards of originality
> for the prose in empirical articles, how should brief verbatim copies be
> handled? Let me give one example. Suppose the abstract of a paper reads
> like “We used faces, non-face objects, Gabors, and colored Gaussian blobs
> to investigate the role of stimulus complexity on visual processing.” I
> find it highly questionable to then have a sentence like “XY (2001) used
> faces, non-face objects, Gabors, and colored Gaussian blobs to investigate
> the role of stimulus complexity.”, written by another author, qualify as
> plagiarism or as needing quotation marks. Why should the latter author attempt to
> rephrase something that had already been so concisely summarized by the
> original authors?
>
>
>
> 4) Is it common consensus that automated plagiarism checking without
> editorial oversight is the yardstick against which to evaluate the
> originality of scientific manuscripts?
>
>
>
> I’d very much love to have an informed discussion with you. In part
> because I imagine that the plagiarism report I received may turn out to be
> the rule rather than the exception, in which case we might all face hours of
> checking and re-checking during future publishing attempts.
>
>
>
> Kind regards to all of you
>
>
> Malte Persike
>
>
>
> --
>
>
>
> And here are some of the highlights from the plagiarism report issued to
> me.
>
>
>
> (i) My institutional address “Johannes Gutenberg University Mainz,
> Wallstr. 3, D-55122 Mainz, Germany” and the immediately following heading
> “Abstract” were flagged as plagiarism.
>
>
>
> (ii) The e-mail addresses of the authors were flagged as plagiarism.
>
>
>
> (iii) Citations and year numbers, e.g. “(Persike et al., 2015)” were
> included in the word count for multiple items. This had ridiculous
> consequences. To name only two of the most blatant ones: one plagiarism
> item was defined by the words “Author et al., 1993 […] et al. […] the […]
> et al., 1997” (with a few unflagged words in between); another item was
> defined by the words “Author1 and Author2, 1994; Author3 and Author4, 2001)
> and”.
>
>
>
> (iv) Mathematical symbols, brand names and notational terms had been
> included in the report. One item therefore consisted almost entirely of
> parts of a mathematical formula, the product names “ViSaGe” and “ColorCal
> colorimeter”, the brand name “Cambridge Research Systems LLC”, the term
> “Michelson contrast”, and the phrase “were run in Matlab”.
>
>
>
> (v) Many of the report items contained common phrases used in neuroscience
> research, some of which were even multiply counted. For example, one item
> was defined by the mere phrase “to the ability of the visual system”,
> counted two times, plus a reference. A quick Google search turned up more
> than 150,000 hits for this exact phrase and Google Scholar yields more than
> a hundred authors who have also used this phrase in their works.
>
>
>
> (vi) The CrossCheck system invented false positives. One item contained
> the phrase “V2 neurons are highly selective”, another item referred to the
> phrase “to a particular combination of line components”. These phrases were
> not copied from anywhere but are original. In fact, they are so original
> that Google Scholar yields precisely zero search results for each of them.
> The sources from which these phrases were claimed to be derived do not
> include such sequences of words anywhere in the entire texts.
>
>
>
>
>
> --
>
> Dr. Malte Persike
>
> Department for Statistical Methods
> Psychological Institute
> Johannes Gutenberg University Mainz
> Wallstr. 3
> D-55122 Mainz
>
> fon:    +49 (6131) 39 39260
> fax:    +49 (6131) 39 39186
> mobile: +49 (1525) 4223363
>
>
> _______________________________________________
> visionlist mailing list
> visionlist at visionscience.com
> http://visionscience.com/mailman/listinfo/visionlist_visionscience.com
>
>


-- 
Mufti Mahmud, PhD
Postdoctoral Research Fellow
NeuroChip Lab <http://www.vassanellilab.eu/>
- Dept. of Biomedical Sciences <http://www.biomed.unipd.it/?L=1>
University of Padova
Via f. Marzolo 3
35131 - Padova, Italy
Tel.: +39 049 827 5308
Fax: +39 049 827 5301
https://sites.google.com/site/muftimahmud/
http://orcid.org/0000-0002-2037-8348
https://scholar.google.com/citations?user=L8em2YoAAAAJ&hl=en