Making good better: using peer review to improve services

Driven by the desire to improve online communications with customers, the team at Barrington Library (Cranfield University) invited all library staff to peer review the online enquiries they had been answering. Lauren Vizor and Rachel Daniels share the story of what happened.


Like all librarians, we care a lot about customer service and the customer experience. As over 85% of our Barrington Library customers are part time, the majority of our customers’ experiences with us are online – via our website and email. Our written customer support therefore has to be strong, consistent, accurate, and effective. In 2016 we started to consider how we could effectively review, reflect and improve in this area.

The conversation started in our Customer Experience Working Group, a group made up of library staff of all levels (but most are information advisors) who actively investigate any physical or digital area where we can improve the Barrington Library customer experience. In June 2016, the group took the first step: determining “What qualities make a customer support email successful?”

The group were asked to recall times when they, themselves, had been the customer in experiences outside of the Library: times when they felt they had received amazing customer support, either because the service was great or because something that went wrong was handled well.

We landed on the following 9 criteria:

  • Accurate: Guidance is factually correct and consistent with best practice.
  • Timely: Response is sent within an appropriate timeframe.
  • Clear: Guidance is well written, understandable, logical.
  • Complete: All questions are answered fully.
  • Focused: Answer is personalized and excludes irrelevant information.
  • Language: Text includes appropriate grammar and spelling, excludes unnecessary technical jargon.
  • Format: Layout makes text easy to read (appropriate paragraphs, bullet points, etc).
  • Tone: Response is polite (either professional or casually friendly reflecting the tone of the enquiry).
  • Next steps: Response leaves the door open for customers to ask more questions; they are actively invited to contact us again.

Now we had to figure out how to determine whether our customer support emails met this standard. Our proposal was to use an Enquiry Peer Review.

How did the review work?
Advance notice of the upcoming review was given to all staff and included our checklist of qualities. We started with a limited, three-month trial period, with the option to extend and repeat. We used an online survey of randomly selected, anonymised enquiries. Each Monday, all staff were emailed a couple of enquiries to review, chosen by a random number generator. All customer names were replaced with ‘Carl’ and all staff names with ‘Beverly’. Staff had the rest of the week to complete the online review, which was produced using Google Forms (free and easy-to-use survey software). For each question, reviewers chose between Yes, Mostly, and No, and we translated these answers into number scores: Yes = 1, Mostly = 0.5, No = 0. Library staff could also include (anonymous) commentary to add context to their scores.
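To make the mechanics concrete, the selection, anonymisation and scoring steps could be scripted roughly as in the minimal Python sketch below. The enquiry records, names and criteria shown are hypothetical; in practice we used a simple random number generator and Google Forms rather than a script.

```python
import random

# Hypothetical sample of answered enquiries exported from an enquiry system.
enquiries = [
    {"id": 101, "customer": "Priya Shah", "staff": "Tom Park",
     "text": "Dear Priya Shah, the report you requested is ... Regards, Tom Park"},
    {"id": 102, "customer": "Ade Okoye", "staff": "Sam Lee",
     "text": "Dear Ade Okoye, you can renew the item by ... Best wishes, Sam Lee"},
]

def anonymise(enquiry):
    """Replace the customer's name with 'Carl' and the staff member's with 'Beverly'."""
    text = enquiry["text"].replace(enquiry["customer"], "Carl")
    text = text.replace(enquiry["staff"], "Beverly")
    return {"id": enquiry["id"], "text": text}

# Each Monday, pick a couple of enquiries at random for everyone to review.
weekly_sample = [anonymise(e) for e in random.sample(enquiries, k=2)]

# Reviewers answer Yes / Mostly / No against each criterion; translate to numbers.
SCORE = {"Yes": 1.0, "Mostly": 0.5, "No": 0.0}
example_review = {"Accurate": "Yes", "Timely": "Mostly", "Clear": "Yes"}
numeric = {criterion: SCORE[answer] for criterion, answer in example_review.items()}
print(numeric)  # {'Accurate': 1.0, 'Timely': 0.5, 'Clear': 1.0}
```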

Feedback to and from staff
We ensured all staff received the results, which included specific enquiry feedback, general feedback and tips for all staff, and a monthly summary report of results with suggestions for managers to action or approve.
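To give a flavour of what goes into a monthly summary, per-criterion averages across all completed reviews could be computed along these lines. This is a hypothetical sketch with made-up scores; our actual reports were compiled from the Google Forms results.

```python
from collections import defaultdict

# Hypothetical month of review scores: one criterion -> score mapping per review.
monthly_reviews = [
    {"Accurate": 1.0, "Timely": 0.5, "Clear": 1.0, "Tone": 1.0},
    {"Accurate": 1.0, "Timely": 0.0, "Clear": 0.5, "Tone": 1.0},
    {"Accurate": 0.5, "Timely": 0.5, "Clear": 1.0, "Tone": 0.5},
]

totals = defaultdict(list)
for review in monthly_reviews:
    for criterion, score in review.items():
        totals[criterion].append(score)

# Average score per criterion, lowest first, to highlight qualities needing attention.
summary = {criterion: sum(scores) / len(scores) for criterion, scores in totals.items()}
for criterion, avg in sorted(summary.items(), key=lambda kv: kv[1]):
    print(f"{criterion:10s} {avg:.2f}")
```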

We also conducted a staff feedback survey so that we understood how staff felt about taking part in the process and whether they felt the exercise had been effective. Engagement is obviously important to the success of peer review.

So what was the outcome?
Staff reactions:
Library staff liked it! About two-thirds of library staff regularly participated in the review. The feedback survey confirmed staff were overwhelmingly in favour of continuing the reviews, despite a few initially not feeling totally confident about scoring others’ enquiries. It was interesting to note that with each review period, staff grew in confidence:

  • Number of confident staff in first survey – 64% (7 of 11)
  • Number of confident staff in second survey – 91% (10 of 11)

Enquiry scores:
Most category scores improved during the three-month review, so it appeared that the act of doing the review may have helped even before any changes were made. It should be noted that we reviewed just under 10% of enquiries during each timeframe; we sacrificed statistical significance to make it easy for staff to participate.

Learning from others:
Staff copied from the best and reflected on the ‘less-well-loved’ responses. The review changed the way staff felt they wrote their own enquiry responses (10/11 agreed). Favoured phrases began popping up in other enquiries. Picking apart why they didn’t like a particular response helped them avoid the same pitfalls in their own work.

Changes based on results:
The peer review results and recommendations brought about policy changes. Here are just a few examples:

  • We re-wrote our standard customer renewal notice text for items borrowed from the Reports Section – a special collection.
  • We created an FAQ on our policy regarding book posting and room bookings.
  • We improved the information posted on our website about the best methods for accessing full-text off-site; desk staff began to use this frequently.

These reviews are now part of our business as usual.

Want to give it a go yourself? Here are our tips:

  • Start with a trial period and make iterative process improvements. With each new round we tweaked the process where appropriate:
    • We moved from dividing staff into three groups (so a wider range of enquiries was reviewed) to having all staff review the exact same enquiries.
    • By staff request, we provided more feedback, more quickly.
  • Think about your culture. Building staff buy-in and confidence is crucial:
    • It helped enormously that the ‘criteria’ we used to judge enquiries came from front-line library staff who deal with customer enquiries on a daily basis. The system was not imposed by senior management or by external consultants.
    • Staff did genuinely seem to enjoy the process, but some felt a little under-qualified or lacked confidence when scoring certain enquiries.
  • Don’t be surprised if library staff are hard on themselves. Our staff are *still* not convinced we answer enquiries quickly enough!
  • Make it easy for managers to approve suggestions. Monthly reports included concrete recommendations for staff training and policy changes. 

We’re not finished!

Things we may want to try in the future…

  • Joining this up with feedback from customers.
  • Timing – splitting the reviews up throughout the year.
  • Expanding the reviews to all three Cranfield libraries.

If you would like more information, or are happy to share your experiences and ideas about peer review, then please do get in touch!

l.vizor@cranfield.ac.uk
r.j.daniels@cranfield.ac.uk
_______________________________

This case study was presented at Internet Librarian International 2019.