Software tools to assist with the detection of non-originality are well established but variable in their efficacy (Satterwhite & Gerein 2001). This paper reports the rates of non-originality detected by two different tools over three sequential cohorts of computing students at London South Bank University (LSBU), measured at the start and end of their undergraduate progression. The tool used for the first-year students was OrCheck, which makes programmatic use of the Google search engine (Culwin & Lancaster 2004). The tool used for the final-year students was the JISC-supplied Turnitin service, which uses its own private detection engine technology. The use of these tools is only one part of a wider and more comprehensive pro-active academic misconduct policy (Carroll 2002, Culwin & Lancaster 2001) in the computing department at LSBU.
The paper continues with further discussion of the methodologies involved in collating evidence regarding the extent of academic misconduct, followed by a description of the pro-active context in the department where the data was collected. Essentially, students are actively introduced to issues of misconduct at the start of the first year, as described in Culwin (2006); they are reminded in a core unit at the start of the second year and again at the first project lecture at the start of the final year. This is complemented by publicised Intranet resources and advice, and also by wall poster displays.
This paper was submitted to the International Integrity & Plagiarism Conference, which ran from 2004 to 2014. The paper was peer reviewed by an independent editorial board and features in the conference proceedings.