Friday, May 4, 2012

PL 19/12: Academic indicators

Filed under: statistics — plinius @ 8:00 am

In the academic library sector we find only one set of indicators.

This consists of the indicators officially recommended by ABM for use in academic libraries. The recommendation has gone through a series of revisions. The version below is the most recent one. It was published in the report Indikatorer for norske universitets- og høgskolebibliotek (Oslo: ABM-utvikling, 2010).

We should note that ABM was responsible for library statistics from 2003 till 2010. Afterwards the task passed to the National Library. This has not, so far, led to any changes in the indicators used or recommended.

In the introduction to the report (p. 8) ABM outlines the role of indicators:

We can observe a growing demand for documentation and management by objectives (rapportering og målstyring) in our educational institutions. The project Benchmarking with result and management indicators (Benchmarking med bruk av resultat- og styringsindikatorer) was established in the spring 2008. … Indicators for libraries in higher education have existed for many years. … The project group has continued this work.

ABM recommends indicators as a management tool. Using indicators, libraries should, in the ideal case, be able to measure today's service levels and hence their ability to innovate and adapt to future demands and changes. Several indicators ought to be used together in order to give a complete picture of the library's activities and quality.

The actual indicator set looks as follows:

A. Resources / Access / Infrastructure / Development

  • A1. Hours open (with staff)
  • A2. Users per FTE
  • A3. Share of primary users who use BIBSYS (percent). [BIBSYS: national library system for academic libraries]
  • A4. Users per seat
  • A5. Turnover (loans/stock)
  • A6. Share of total loans supplied by own collection (percent)
  • A7. Share of reported scientific publications included in the institutional repository
  • A8. Share of working hours used on electronic publishing (hours per FTE)
  • A9. Share of working hours used on organized teaching (hours per FTE)
  • A10. Share of working hours used for pre-booked advisory sessions (hours per FTE)
  • A11. Share of working hours dedicated to (own) professional training (hours per FTE)

B. Use/Visits

  • B1. User satisfaction
  • B2. Use of digital documents purchased by the library (downloads)
  • B3. Use of collections (loans + downloads)
  • B4. Loans to external users
  • B5. Physical visits per primary user
  • B6. Use of seats
  • B7. Use of collections in written papers

C. Economic indicators

  • C1. Salary and media expenses as a share of the institutional budget (percent)
  • C2. Media costs per primary user
  • C3. Cost per loan/downloaded document
  • C4. Share of (salary + media expenses) that go to media
  • C5. Share of media expenses that go to electronic resources
  • C6. Media costs per academic staff person
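Most of these indicators are simple ratios of the raw variables that libraries already report in the annual statistics. To make the arithmetic concrete, here is a minimal sketch in Python of a few of them (A5, C2, C3 and C5). All figures are invented for illustration; the variable names are my own, not drawn from the ABM report.

```python
# Hypothetical raw variables for one academic library, one year.
# All monetary amounts in NOK; all figures invented for illustration.
raw = {
    "loans": 42_000,                    # physical loans
    "downloads": 310_000,               # downloads of licensed e-documents
    "stock": 120_000,                   # physical items in the collection
    "primary_users": 8_500,             # students and staff of the institution
    "media_expenses": 6_200_000,        # total spending on media
    "e_resource_expenses": 4_900_000,   # part of media expenses spent on e-resources
}

def turnover(r):
    # A5: loans divided by stock
    return r["loans"] / r["stock"]

def media_cost_per_user(r):
    # C2: media costs per primary user
    return r["media_expenses"] / r["primary_users"]

def cost_per_use(r):
    # C3: cost per loan or downloaded document
    return r["media_expenses"] / (r["loans"] + r["downloads"])

def electronic_share(r):
    # C5: share of media expenses that go to electronic resources
    return r["e_resource_expenses"] / r["media_expenses"]

print(f"A5 turnover:            {turnover(raw):.2f}")
print(f"C2 media cost per user: {media_cost_per_user(raw):.0f} NOK")
print(f"C3 cost per use:        {cost_per_use(raw):.1f} NOK")
print(f"C5 electronic share:    {electronic_share(raw):.0%}")
```

The point of the sketch is only that the indicators add nothing beyond the variables themselves: once the raw statistics are collected, the ratios are trivial to compute and publish.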

The set of recommended indicators has changed over time, but there is no reference to actual results from the past. The report points to a growing demand for documentation, yet ABM did not document its own experiences with indicators. The language seems to float in the air:

  • indicators … have existed
  • libraries should, in the ideal case
  • several indicators ought to be used

There is a curious lack of practical knowledge or interest. Indicators are presented as “a good idea”, but not as an ongoing practice. The organization that asks academic libraries to use performance indicators avoids them in its own practice. In its annual reports on statistics from academic and special libraries, ABM restricted itself to variables (absolute frequencies). Statistics Norway, in charge of all official statistics, does the same. Not an indicator in sight.

During the last decade many libraries in higher education have actually tried to use indicators in their daily work. Vestfold University College is an outstanding example. But their experiences are not part of the process. Failures and successes are not analyzed, compared and brought into the open professional debate. We do not learn from the past.

ABM may argue that they have studied and discussed such experiences. But busy librarians cannot learn about indicators from debates that take place in committee rooms and behind closed doors. Like the sciences, professions need to discuss and exchange arguments in public.

That means, first of all, on the open web. Here, librarianship differs from traditional academic disciplines. In the academic world, most discussions take place in small networks of colleagues (“invisible colleges”), often at seminars and conferences, and through the established mechanism of peer review. In young professions like teaching, nursing, accountancy and librarianship, practitioners get their information from easy-to-read articles on the web and in technical (not academic) magazines. Their professional discussions mostly take place with colleagues on the job – not with specialists abroad.


As a small experiment I am publishing the paper Indicators without customers, for our satellite conference in Turku, as a series of blog posts.



  1. PL 15/12: Top-down or bottom-up?
  2. PL 16/12: Change work.
  3. PL 17/12: Public library indicators.
  4. PL 18/12: Big public libraries.
