27 September 2019 - Top tips for evaluation of technological teaching tools

EUNIS19


Based on presentations and conversations at the EUNIS 2019 conference in Trondheim, which I attended courtesy of a ucisa bursary, and on other resources, this blog post offers some food for thought on how to evaluate the technology used in your learning and teaching.

Surveys! The most talked-about evaluation tool at the conference was the trusty survey, and I think that is because of how rich the data set can be (if you get enough respondents, of course). If you design the questions well (the Harvard tip sheet in the references below has some useful advice) and keep the survey short and sweet, you can gather both qualitative and quantitative data to measure your impact.
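To give a flavour of the quantitative side, here is a minimal Python sketch of summarising survey results. It assumes a hypothetical CSV export with a 1-5 rating column and a free-text comments column; the file and column names are made up, so adjust them to match whatever your own survey tool exports.

```python
# Minimal sketch: summarising survey results with pandas.
# The file name and columns ("ease_of_use", "comments") are
# hypothetical placeholders, not any particular tool's export format.
import pandas as pd

responses = pd.read_csv("survey_export.csv")

# Quantitative: response count, mean score and score distribution.
print("Responses:", len(responses))
print("Mean ease-of-use score:", round(responses["ease_of_use"].mean(), 2))
print(responses["ease_of_use"].value_counts().sort_index())

# Qualitative: pull out the free-text comments for manual theming.
for comment in responses["comments"].dropna():
    print("-", comment)
```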

Stats: How many people have used the technology? How many have clicked on the resources or borrowed the equipment? Usage figures don't necessarily tell you how well people used the technology or how much impact it had on their learning or teaching, but they help to build the evaluation case.
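If your VLE or equipment-booking system can export an activity log, a few lines of Python will get you the headline numbers. Again, this is only a sketch under assumed column names ("user_id", "resource"); real exports will differ.

```python
# Minimal sketch: headline usage stats from an activity log.
# Assumes a hypothetical CSV with one row per event and
# "user_id" and "resource" columns.
import pandas as pd

log = pd.read_csv("activity_log.csv")

print("Total events:", len(log))
print("Distinct users:", log["user_id"].nunique())

# How often each resource was clicked on, most popular first.
print(log["resource"].value_counts())
```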

Studies: A scientific research study is the strongest tool for measuring the impact of learning technologies. Take this study from Loughborough University's Mathematics Education Centre on students learning mathematical proofs (Alcock et al., 2015): although the students reported that they thought they had learnt more with the e-learning tool, the results showed they actually did worse…

Student feedback: Your institution will have its own mechanisms for routinely collecting feedback from students, for example module feedback questionnaires or staff–student liaison committees. If you can add a question about the learning technology to these existing processes, it can be an easy way to gather feedback. Although it may not give you much depth, it is a good way to get a feel for how the implementation or use of the technology is going.

Speak to people: Some may argue that the big issue with measuring the impact of such tools is how subjective and qualitative the evidence can be, but don't let that stop you from speaking to colleagues internally and externally about the tool. Other people's experiences are vital in helping you build up a picture of the benefits of using the technology.

The Jisc team presented the Digital Insights survey at EUNIS19, which looked to be a great way of collecting a rich data set about your institution's learning technologies; more information is available from Jisc.

The 2018 TEL survey run by ucisa gives some very interesting results, particularly in Question 4 on how institutions have measured evaluation and impact, with surveys and focus groups among the most popular choices; see the ucisa website for more information.

References:
Alcock, L., Hodds, M., Roy, S. & Inglis, M. 2015, 'Investigating and Improving Undergraduate Proof Comprehension', Notices of the AMS, vol. 62, no. 7, pp. 742–752.

Harvard University Program on Survey Research, Questionnaire Design Tip Sheet, Harvard University.