Plinius

Sunday, May 6, 2012

PL 22/12: NPM from below

Filed under: statistics — plinius @ 9:03 am

In Norway, during the last seven or eight years, we have tried hard to develop a greater interest in library statistics.

This has not been easy. In the middle of the decade we conducted a few statistics courses for librarians, but had to struggle to recruit participants. Librarians in the public sector preferred more cultural topics, such as presentations of the latest books. Only now, after half a decade of hard work, have we seen a change, both at the official and the practical level.

The official recognition of indicators as an important field is new. The most recent White Paper on libraries (2006) states:

– An important task ahead will be to strengthen statistical and analytical work. This also includes following and conveying what is happening internationally. Various indicators of what constitutes good service performance both in the archive, library and museum sectors will be developed. These indicators – and specialised studies – should be used to generate up-to-date reports on the archive, library and museum fields. [my translation]

The development process that led to the new indicator sets (ABM24 and ABM30) had many weaknesses, in particular with regard to the public sector. The committee members were recruited from the field of practice, but the actual committee work was controlled by ABM. The technical discussions that took place were not shared with the field. The pre-testing of the indicators has never been documented. Draft versions of the reports were presented for feedback – a good thing in itself – but there was no professional debate. This is typical of the modern administrative approach: we will listen to your views, but we will not respond to them.  

The official interest in indicators does not come from below (libraries) or from the people at ABM itself. It represents New Public Management. NPM is a European and even a global trend towards the intensive use of empirical data in public management. In Norway, KOSTRA is a prime example of state-conducted indicator development. Indicators support control by revealing information. KOSTRA combines knowledge and power (Foucault).

NPM from below

In the Norwegian library sector we are trying hard to find an answer to NPM from below. We support the trend towards greater use of statistics and indicators. But we want to influence the design of the statistical systems run by the government and to develop better systems for local production of statistics. To influence the political environment – both locally and nationally – we will have to provide statistical arguments on a regular basis. Ad hoc studies may lead to discussion and a bit of goodwill then and there, but their impact does not last. We need fewer projects, in the traditional sense, and more development of our statistical routines.

Statistics is a subject with its own professional standards. When we measure something – using our best statistical tools – we must treat the results with respect. We should not argue or take for granted that statistics will always show libraries and librarians in a flattering light.

Statistical reasoning has its own logic. It is almost meaningless to separate statistics for advocacy from statistics for management. In learning the one, you also learn the other. It is useful to emphasize advocacy, since this helps us focus on the consequences rather than the mechanics of library services. But the skill we need to develop is that of effective reasoning based on statistics. Evidence-based advocacy and evidence-based practice are processes that need a regular flow of standardized data. Meaningful relationships between numbers and concepts are important. We must produce indicators that politicians and their advisors can understand.

I am happy to report that much has been achieved during the last three years. In 2006 we set up a network of library organizations with an interest in statistics (Samstat). All the major players participate, from the Norwegian Library Association to the trade unions that organize librarians. In 2009 we received a grant from ABM to conduct a statistical workshop lasting three full weeks: one in October, one in November and one in February (2010/11). Ten committed librarians participated. The graduates have since become statistical resource persons in their own libraries and regions.

The nineteen county libraries, which function as development units for the public library sector, are now supporting basic training in statistics. Samstat ran two Statistics for Advocacy courses in 2011 and three more in the spring of 2012. Additional courses are in the pipeline. Buskerud County Library organized a regional statistical network in 2011. This year the participating libraries are carrying out a series of standardized traffic observation studies. The surveys, based on methods developed by the present author, generate indicators designed for comparative use. All these activities reflect initiatives that come from below. We have no conflict with the center, however, and look forward to cooperating with all interested parties.
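To make the comparative idea concrete, here is a minimal sketch in Python of how observation counts might be turned into indicators that can be placed side by side. The activity categories, data values and normalizations are invented for illustration; they are an assumption on my part, not a description of the actual Samstat or Buskerud procedure.

```python
# A minimal, hypothetical sketch of turning traffic observations into
# comparative indicators. The categories and numbers are invented for
# illustration and do not reproduce the actual observation method.

from dataclasses import dataclass

@dataclass
class ObservationRound:
    """One walk through the library, counting users by activity."""
    at_computers: int
    reading_or_studying: int
    browsing_shelves: int

    def total(self) -> int:
        return self.at_computers + self.reading_or_studying + self.browsing_shelves


def indicators(rounds, weekly_opening_hours, population):
    """Aggregate a week of observation rounds into comparable indicators."""
    total_observed = sum(r.total() for r in rounds)
    avg_present = total_observed / len(rounds)  # average users present per round
    return {
        "avg_users_present": round(avg_present, 1),
        "share_at_computers": round(sum(r.at_computers for r in rounds) / total_observed, 2),
        # Scaling by population lets small and large municipalities be compared.
        "present_per_1000_capita": round(1000 * avg_present / population, 2),
        # Rough weekly user hours, assuming the average holds across opening hours.
        "user_hours_per_week": round(avg_present * weekly_opening_hours, 0),
    }


# Invented example: a small branch library observed four times in one week.
rounds = [
    ObservationRound(3, 5, 4),
    ObservationRound(2, 7, 1),
    ObservationRound(4, 6, 3),
    ObservationRound(1, 4, 2),
]
print(indicators(rounds, weekly_opening_hours=45, population=8000))
```

The point lies in the shape of the output: averages, shares and per-capita rates rather than raw counts, so that libraries serving very different communities can be compared on the same scale.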

Global developments

Library statistics is clearly developing. In a global perspective there are many positive signs of growth. IFLA has taken a strong interest in statistics for advocacy and has adopted a statistical manifesto. In Germany, the library index BIX is well established both in the public and in the academic library sector. Libraries participate on a voluntary basis. But steady work with real data over many years has made an impact.

In the United States, the new LJ Index for public libraries is professionally designed and very well presented. In its Global Libraries Programme, the Bill and Melinda Gates Foundation insists on systematic data collection and evaluation. A dozen countries have benefited from this hard-nosed approach: Botswana, Bulgaria, Chile, Latvia, Lithuania, Mexico, Poland, Romania, Ukraine, Vietnam, and – not least – the United States itself. The Gates-financed studies of data use in American libraries show statistical survey methods at their best.

In academic libraries, LibQual is well established – and has contributed to a culture of assessment. Like the German indicator system BIX, LibQual is also moving into other countries. The web facilitates horizontal interaction and networking, and can support the social (as well as the centralized) approach to indicator development. A fair number of countries now make detailed library statistics available on the web. I am aware of the following – but there may be others:

  • The Nordic group: Denmark, Finland, Norway, Sweden
  • The Commonwealth group: Australia, Canada, New Zealand
  • The Netherlands

In public libraries, workable indicators of web traffic have started to appear – with Denmark as the front runner. Tools for observing user behavior inside libraries are also emerging – in Canada, the US, Norway and Sweden.

Contexts and communities

Libraries are service organizations. To understand libraries, we have to understand the environments they serve. There are important differences between the academic and the public library sector.  Academic libraries belong to professional organizations. Universities and colleges are – by their very nature – specialized, knowledge intensive institutions. Their libraries are shaped by the academic context. Academic and special libraries tend to reflect the standards and the culture of their mother organizations. Public libraries are much more local in nature. They range from large and professional metropolitan institutions to poorly equipped libraries in provincial towns and distant villages. Public libraries reflect the economic, social and cultural situation of their communities. To understand public library statistics, we have to understand the communities in which they work.

I think it is fair to say that international work in the field of library indicators has tended to concentrate on definitions, standards and general methods. These topics are relatively independent of the social context. But we cannot study the collection, analysis, interpretation, publication and actual use of empirical data without looking at concrete countries and cases. The actual situations we find – going from Germany to Australia to Poland to Botswana – are extremely diverse. In some countries, there are also great differences between urban and rural areas. The way we work with indicators has to reflect these variations.

I strongly believe that the library field needs more professional debate, based on more systematic data collection and wider sharing of statistical information and analyses. By professional I mean peer-based conversations guided by statistical, sociological and economic reasoning rather than by ideological, administrative or political concerns. So far, only a few countries combine decent statistical systems with lively professional debate. Practical indicator development requires both.

In general, evidence-based librarianship is (slowly) gathering momentum. We can build on that when we work with indicators. There are many tasks ahead.  Library researchers and library agencies need to present their methods, their data and their discussions on the open web. If this is not done, ordinary librarians will lack access to the results of empirical studies. The professional debate among experts will also be hampered by lack of information. Our library authorities spend too much effort on repetitive data collection – and far too little on professional presentation and analysis of the results. As a consequence, ordinary librarians take little interest in the statistics. Many countries lack functioning statistical systems at the national level. In such countries we need to offer simple methods for regular data collection that interested libraries can manage on their own.

Resources

As a small experiment I have published the paper Indicators without customers, written for our satellite conference in Turku, as a series of blog posts.

Summary

Sections

  1. PL 15/12: Top-down or bottom-up?
  2. PL 16/12: Change work
  3. PL 17/12: Public library indicators
  4. PL 18/12: Big public libraries
  5. PL 19/12: Academic indicators
  6. PL 20/12: Academic libraries
  7. PL 21/12: The social approach
  8. PL 22/12: NPM from below

References

  • Baathuli Nfila (2009). Innovative system and generation of management and operation statistics for decision making in academic libraries – the case of the University of Botswana.
  • Bibliotekreform 2014 (2006). White Paper on libraries.
  • Derfert-Wolf, Lidia; Marek M. Górski and Marzena Marcinek (2005). Quality of academic libraries – funding bodies, librarians and users perspective: a common project of Polish research libraries on comparable measures.
  • Klug, Petra (2003). Setting benchmarks with BIX – the German Library Index.
  • McRostie, Donna and Margaret Ruwoldt (2009). The devils in the detail – the use of statistics and data for strategic decision making and advocacy.
  • Wimmer, Ulla.
