In preparation for a staff meeting this week we have been asked to think about what we would like to measure in the LRC this year, preferably something that impacts teaching and learning, and what qualitative feedback we would like to obtain from our work with students. That got me thinking about how my role in resources and content impacts teaching and learning in the College - what impact does my work have, and how can I measure it?
To help answer this, I outlined what my role entails, what we currently measure, and anything new that I could record:
1. Responding to stock requests, placing orders and looking after existing stock: stock requests often come through teaching staff, so I would like to look at usage stats for ordered items - how much are they being used? I would also like to know from students what types of resources they would prefer to support their individual learning styles.
2. Several institutions attempt to measure usage through browsing - they encourage students to leave items on desks, or return them to a designated trolley, rather than returning them to the shelves. Staff then record each time they are used. Figures from these could be amalgamated with our issue and renewal stats to show a fuller picture of resource use.
3. Managing our online presence in our VLE (Moodle): I already track visitor stats to our different pages, but it would be interesting to hear students' perspectives - what do they think of it?
4. Monitoring usage statistics for our e-resources: I already record usage statistics from our e-resources. A frequent response in our annual student questionnaire is that students don't know about their LRC eResources. This suggests that either we aren't promoting them enough or we're not promoting them in the right way. As part of our student focus groups I would like students to think about how they would like resources marketed to them. What can we do to regularly remind them of specific and relevant resources, and what formats should we use to make the most impact?
5. Promoting resources through physical display and online advertising: we already record usage of display items (items featured in the online advertising are more often than not the items also used in the physical display). Usage for a lot of displays is appallingly low, however (see my previous post discussing displays). The most successful has been our display of revision guides. This suggests that either we're not displaying material that students want to borrow, or they don't feel they can take items off the display stand. We've also tried to promote teaching resources to tutors around national events, such as Anti-Bullying Week. However, they have been just as reluctant to come and borrow related items.
6. Housekeeping for our LMS (Heritage): I already record statistics for issues, renewals and returns and can pull numerous reports from Heritage.
7. Delivering inductions: we record the number of inductions delivered and the students who receive them. A colleague is looking into how we can measure their impact by comparing students' results against whether or not they received an induction.
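To make the "fuller picture" idea in point 2 concrete, here is a minimal sketch of how in-house browsing counts could be merged with issue/renewal figures to give a total-use figure per item. All titles and numbers below are invented for illustration; real counts would come from our desk tallies and Heritage reports.

```python
# Hypothetical sketch: combine in-house browsing tallies with
# circulation (issue + renewal) counts to rank items by total use.
# Titles and figures are made up, not real LRC data.

browsing = {"GCSE Maths Revision": 14, "Of Mice and Men": 3}
circulation = {"Of Mice and Men": 9, "Animal Farm": 6}

# An item may appear in one source but not the other, so take the
# union of titles and treat a missing count as zero.
total_use = {}
for title in set(browsing) | set(circulation):
    total_use[title] = browsing.get(title, 0) + circulation.get(title, 0)

# Print items from most to least used.
for title, count in sorted(total_use.items(), key=lambda t: -t[1]):
    print(f"{title}: {count} uses")
```

Even a simple combined ranking like this could show which items are heavily browsed in the LRC but rarely borrowed - use that issue stats alone would miss entirely.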
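The induction comparison in point 7 might start as something like the sketch below: split students by whether they received an induction and compare an outcome measure such as an average assessment score. The students, field names and scores here are entirely hypothetical, and of course a raw comparison like this shows association, not cause.

```python
# Hypothetical sketch: compare average scores for students who did
# and did not receive an LRC induction. All data is invented.

students = [
    {"name": "A", "inducted": True,  "score": 68},
    {"name": "B", "inducted": True,  "score": 72},
    {"name": "C", "inducted": False, "score": 61},
    {"name": "D", "inducted": False, "score": 65},
]

def mean_score(group):
    """Average score across a list of student records."""
    scores = [s["score"] for s in group]
    return sum(scores) / len(scores)

inducted = [s for s in students if s["inducted"]]
not_inducted = [s for s in students if not s["inducted"]]

print(f"Inducted:     {mean_score(inducted):.1f}")
print(f"Not inducted: {mean_score(not_inducted):.1f}")
```

With real data, the same split could be run per course or per year group to see where inductions appear to make the most difference.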
A lot of our work involves making assumptions about how we think students and staff want to use and hear about resources and services. One thing is clear - we need to maintain ongoing communication with our users (students and staff) to discover what they want and how they want it. We then need to see whether they're actually using what they asked for and, if not, find out why!