Recently the Times Higher Education carried a review, written by Jack Grove, of a paper which seriously considers the hypothesis that “students cheat more because of the existence of the internet”.
Regarding Jack Grove’s review of David Ison’s paper in the Journal of Academic Ethics, I feel that the headline “Older papers had higher similarity index than the more recent ones” is rather misleading. When I read David’s paper I found that the difference between the similarity indices of the newer and older dissertations was far smaller than the standard deviation of the values. To me this indicates that today’s PhD students are neither better nor worse than the older students. The fact that the standard deviation of both numbers was about 60 % of the mean values suggests to me that the degree of similarity varies considerably from student to student.
The extreme incidents of plagiarism (Simindex > 40 %) in Dr Ison’s paper are small in number: only three such events exist in his data set. Using the statistics (Poisson) commonly applied in radioactive decay counting, where the standard deviation is the square root of the number of events observed (SD = √N), a count of 3 events has an SD of about 1.7, so this is not a good measurement of the incidence of such extreme plagiarism events. There is a 16 % chance that fewer than 0.7 % of dissertations fall into this shocking class, and equally a 16 % chance that more than 2.6 % of dissertations fall into this class.
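The counting-statistics argument above can be sketched in a few lines of Python (the count of 3 is from Dr Ison’s data set; the code itself is merely an illustration of the SD = √N rule):

```python
import math

count = 3                    # extreme plagiarism events (Simindex > 40 %) observed
sd = math.sqrt(count)        # Poisson counting statistics: SD = sqrt(N)

print(f"count = {count} +/- {sd:.1f}")   # count = 3 +/- 1.7
```

The one-sigma band 3 ± 1.7 is what gives the 16 % tail probabilities quoted above.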
A classic way to improve the statistics in radiometric work is to increase the count number, so I would like to suggest that the study be repeated with at least ten times as many dissertations. If this new data set includes 30 such documents then the SD on that count will be about 5.5. Such an improved measurement would give us additional insight into plagiarism. I would also like to suggest that those dissertations with high similarity indices be examined in detail, firstly to rule out “false positives” and secondly in an attempt to reconstruct what the students did. Finally, I would like a study to be made of the incidence of plagiarism as a function of subject area.
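To see why a tenfold increase in counts helps, note that the relative uncertainty of a Poisson count is √N/N = 1/√N. A rough illustration (the counts 3 and 30 come from the text; the code is just a sketch):

```python
import math

def relative_sd(n):
    """Relative standard deviation of a Poisson count of n events: sqrt(n)/n."""
    return math.sqrt(n) / n

print(f"{relative_sd(3):.0%}")    # about 58 % with 3 events
print(f"{relative_sd(30):.0%}")   # about 18 % with 30 events
```

So ten times as many events shrinks the relative uncertainty by a factor of about √10 ≈ 3.2.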
I think that while the existence of web pages has made it easier to cheat, the existence of the internet also makes it much easier to catch a cheat. Some years ago someone I know had a student who cheated in a literature review assignment by copying another person’s text. It was very easy to spot this vile misconduct: the student’s style of writing suddenly changed for the better, from that of a typical student into that of a very polished and experienced academic. The problem was proving that the cheating had occurred; the academic had to search high and low for the source article. In the end the academic found it, but nowadays automated tools such as “Turnitin” will find it very quickly for you. Some years ago I tested one such tool by writing a text full of total random nonsense with the occasional sentence taken from another source dropped in. I used things like obscure bible verses (these were found), UN reports and other even more exotic documents. Within 24 hours the search tool had given me a list of the URLs of all of the sources I had used for this test of the system.
My advice to students is “do not try to cheat by copying text off the internet”; it is a form of cheating which I think is rather silly, because people who cheat in this way are very easy to catch.