I have just spent the morning reviewing proposals for the Vocational Education and Training Network (VETNET) strand at the European Conference for Educational Research.
I have never enjoyed reviewing papers. I worry that my own knowledge of the subject is often too limited, and still more that I have only an abstract idea of what comprises quality.
However, as a community-building process, I find it more interesting. I have been involved with VETNET for fourteen years. In the early days nearly everything used to be accepted. But as time went on a discussion emerged over improving the quality of VETNET, and a formal review procedure was developed.
VETNET remains a somewhat traditional academic conference with paper and symposium presentations. I suspect that the community’s desire for Vocational Education and Training to be taken seriously as a part of mainstream education research has tended to make us somewhat conservative in our approaches to formats and quality.
Over time as a community we have started defining quality indicators – even though they may be contested. We have had long debates about the relation between research focused on a particular system or country and wider European agendas. How important is the quality of language (English) in assessing a contribution? Should leeway be given to emerging researchers to encourage them to contribute to the community? How important is a clear methodology when considering a submission?
This year’s debate has been over work in progress. It started innocuously enough with one reviewer emailing that he was concerned that many submissions referred to research which was not yet finished. Should we only consider completed research with clear results, he suggested? This provoked a flurry of replies with major differences between the reviewers. Some agreed with the original email; others (including myself) saw presentations based on work in progress as a potentially useful contribution to the community and a means for researchers to test their ideas in front of a wider international audience. In the normal way of things this debate will be reviewed at the VETNET board meeting at this year’s conference, and revised guidelines agreed for next year’s conference.
In this way I think the review process does work well. It allows community rules and standards to emerge over time.
The other big change in the review system has been the use of an electronic reviewing system, ‘conftool’. The major benefit is to support the management of the review process. VETNET receives some 120 proposals each year. The use of the system ensures every paper receives at least two reviews. More interestingly, it makes transparent where there are disagreements between reviewers, providing a view showing the overall score for each proposal and the span between reviewers’ scores. I was allocated nine proposals to review. Four of them have already been reviewed by a second reviewer. And somewhat to my surprise, the span between my score and the other reviewers’ was small – the highest of the four was 1.7 (though the other reviewer has recommended rejection of that proposal and I have recommended acceptance!).
I welcome that when we have finished our reviews we are able to see other reviews of the same submission. This provides me with an opportunity for reflection and learning – and strengthens the potential of the academic review becoming part of the process of community emergence.