PgmNr E8035: Student peer review: an educational and assessment tool for upper-year genetics courses.

Authors:
Krassimir Yankulov


Institutes
University of Guelph, Guelph, ON, CA.


Abstract:

Peer review is the generally accepted mode of quality assessment in scholarly communities; however, it is rarely used for evaluation at the college level. Over a period of seven years I have run a peer review simulation in a senior-level course in molecular genetics at the University of Guelph. Briefly, I ask pairs of authors to write research proposals on one of four topics announced at the beginning of the semester. On average, 10-12 proposals are written each year. Specific guidelines on how to compose the proposal are given to the whole class ahead of time. The proposals are submitted to a dedicated web tool available at our university (PEAR: Peer Evaluation, Assessment and Review) and anonymously distributed to class members for peer review. This simulation of grant submission and review has significant educational value and is very well accepted by the students. The PEAR site is designed to fully protect the identities of the authors and the reviewers and meets the highest standards of a double-blind review process.

I have used the data in PEAR to analyze the metrics of this simulation exercise. I show that student peer marks are highly variable and not suitable as a precise performance evaluation tool. Subsequent analyses have suggested ways to improve the precision of student evaluations, but their applicability is yet to be established. Interestingly, student peer reviews can clearly recognize substandard performance, but peers struggle to distinguish between good and excellent performance. These findings provide provocative insight into the process of peer review in general.
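The variability pattern described above can be illustrated with a minimal sketch. All marks below are invented for illustration (they are not the actual PEAR course data); it shows how per-proposal spread can be summarized, and why a clearly weak proposal separates from the rest while two strong proposals overlap within reviewer noise:

```python
import statistics

# Hypothetical peer marks (out of 10) for three proposals, each scored by
# five student reviewers. All numbers are invented for illustration only.
peer_marks = {
    "proposal_A": [9, 6, 8, 5, 9],   # strong, but marks are widely scattered
    "proposal_B": [8, 7, 9, 6, 8],   # also strong; overlaps heavily with A
    "proposal_C": [3, 4, 2, 4, 3],   # substandard; reviewers agree it is weak
}

def summarize(marks):
    """Return the mean mark and the standard deviation across reviewers."""
    return statistics.mean(marks), statistics.stdev(marks)

for name, marks in peer_marks.items():
    mean, sd = summarize(marks)
    print(f"{name}: mean={mean:.1f}, sd={sd:.2f}")
```

With these invented numbers, proposal C's mean sits several standard deviations below A and B (substandard work is easy to flag), while A and B differ by less than one reviewer standard deviation, so their ranking is not reliable from peer marks alone.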