More than ever, libraries and information services need to be accountable to their users, their institutions and their funding bodies. They need to demonstrate their value, outcomes and impact - but how?
Traditionally, we have gathered statistics - slightly obsessively perhaps - as if the very act of gathering the variables provided us with some solace and comfort that we were performing well. We could inform our funding bodies just how many books we had, how many we had issued, how many we had shelved - counting them out and counting them back in again - a myriad of variables pouring in from every service point, from every transaction, from access and egress, from dawn to dusk, week to month, year to year - we had been quantified. However, this data described the activity around library services rather than our achievements; it described quantity not quality, efficiency not effectiveness, industry not impact.
Impact is where Sharon Markless and David Streatfield come in: in a single, slim volume they have provided a 'tool book' for evaluating this impact and a very readable discourse on the subject. Both are prolific writers, consultants and respected educationalists. Markless is a lecturer at King's College London and a contributor to Information Management Associates, the consultancy of which Streatfield is the Principal.
The book itself is a by-product of the consultancy and facilitation work they both undertook as part of the SCONUL (Society of College, National and University Libraries)/ LIRG (Library and Information Research Group) Impact Measurement Initiative. This national research project led by Phillip Payne and Angela Conyers explored the impact of Higher Education libraries on learning, teaching, and research, with the aim of developing a toolkit of impact research methodologies for HE libraries.
The research began as a one-year pilot project, running from July 2003 - July 2004, with ten Higher Education libraries taking part, each identifying an impact theme and a topic that they wished to explore in depth. A second phase of the initiative then ran from July 2004 with a new cohort of twelve HE libraries, including my own institution, Staffordshire University. The details and the results from the participants and the project can be found at the lis-impact JISCmail Web site.
The demand for evidence of impact requires that we use the best evidence we can possibly obtain, qualitative and quantitative, gathered using rigorous and authoritative methods. Impact analysis methodology is discussed, and tools, hints, reflections and suggestions are provided throughout.
The authors open by setting the context and describing the concept, asking what impact and impact methodology mean and proffering real-life explanations. At Staffordshire this meant setting objectives, establishing success criteria and identifying the methodology by which we could measure impact. Ours was relatively straightforward: a merged help desk - a single point of access for IT and Library enquiries. The objectives were to provide ease of access to IS learning resources and services, to optimise staffing and resources, and to provide a service where staff can address, answer or refer the majority of learning support enquiries.
Our methodology involved critical incident interviews - when, what, why, how - and an analysis of usage data. We interviewed pre- and post-help desk merger, held focus groups pre- and post-merger, conducted interview-led questionnaires and benchmarked joint help desk services with other institutions - a range of robust techniques and methods. Impact usually means impact on people, and we were looking for changes in behaviour (doing things differently), competence (doing things better) and levels of knowledge, and for an improvement in attitudes from our customers towards our services and our staff.
We were very pleased with the result - staff were engaged in the process and planning of the project, in defining roles and responsibilities, policies and procedures and, importantly, in the evidence-gathering and impact analysis.
Part 2 of the book provides the methodology. This is where the 'tool book' kicks in; in fact the authors invite those 'whose wishes are purely practical' to 'please go on' to this section. As described above in the Staffordshire experience, the methodology involves setting objectives, establishing success criteria, activities and outputs, evidence-gathering, analysis, presenting and planning.
The third part of the book provides more discourse and analysis, delving deeper into the concept, exploring the bigger questions and offering more information and sources on evidence-based library work.
Interspersed throughout the book are the 12 'Laws of Impact Evaluation', which are not really laws or rules at all but serve as useful synopses. A companion Web site provides additional tools and materials. This could be better, and I advise readers to visit the lis-impact archives mentioned above for a more comprehensive Web experience.
The strength of the book lies in its duality as both a tool and an entertaining, insightful analysis of the context, concepts and methodologies required to demonstrate the effectiveness of your library.
- JISCmail archive for lis-impact list http://www.jiscmail.ac.uk/archives/lis-impact.html
- Facet Publishing companion Web page to the publication: Evaluation tools and materials http://www.facetpublishing.co.uk/evaluatingimpact/