
CONSORTIAL PURCHASING



Time runs out for unnatural selection


The future of consortial purchasing and site licensing is under review. Mark Rowse, CEO of Ingenta, asks whether the days of the 'Big Deal' are numbered.


In 2002, the Ingenta Institute, a non-profit organisation funded by Ingenta plc, embarked on a major international investigation into the historical and likely future development of the consortial site license, and its perceived advantages and disadvantages for institutions, libraries, publishers and intermediaries. In particular, the research focused on the implications of the 'Big Deal' for all stakeholders in the scholarly communication process, and has provided substantive new insights into the way this market works, and how it is evolving.

The rapid uptake of consortial licenses over the past five years or so has been far-reaching: the research showed that, in the sample of libraries surveyed, the proportion of consortia-member libraries' holdings derived from consortium deals averages around 50 to 60 per cent, while many large and medium-sized serials publishers now rely on library consortia for between 25 and 58 per cent of their total revenues.

With consortia sales fast becoming the standard purchasing method within the library industry, the 'Big Deal' has played a significant part in the rapid uptake of electronic content by library users throughout the world.

First pioneered by Elsevier Science and Academic Press, and now offered by the majority of major academic publishers, the Big Deal may consist of hundreds of titles - often the publisher's entire journal list - sold in a bundled package to a consortium of libraries on a one-price, one-size-fits-all basis. Typically, pricing is based on historical subscription purchases, and a publisher might supply a whole list for the price of the sum of the original print subscriptions of a library consortium, with an electronic premium added, generally in the range of 5 to 15 per cent. Members of the consortium gain access to a greatly expanded pool of content for relatively little additional cost.
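The pricing logic described above can be sketched in a few lines of Python. The figures and the function name are hypothetical illustrations, not data from the research; the model is simply historical print spend plus an electronic premium.

```python
# Hypothetical sketch of Big Deal pricing as described above: the
# consortium pays the sum of its members' historical print
# subscription spend, plus an electronic premium of 5-15 per cent.

def big_deal_price(print_spend_per_library, premium=0.10):
    """Total package price for a consortium, given each member
    library's historical print subscription spend."""
    base = sum(print_spend_per_library)
    return round(base * (1 + premium), 2)

# Three member libraries with illustrative print spends:
spends = [120_000, 80_000, 40_000]
print(big_deal_price(spends, premium=0.10))  # 264000.0
```

With a 10 per cent premium, three libraries spending 240,000 between them on print would pay 264,000 for access to the whole list - which is why the package looks cheap per title even though the total bill rises.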

As many of these deals were struck for three- to five-year periods, many libraries and publishers have yet to undergo a full-scale renewal process. But as contracts run their course, it is clear that we are approaching a turning point.

The extra funding that initially secured the 'Big Deal' package may not be available at renewal, with static or contracting budgets expected at the next round of contract negotiation and renewal.

The questions now being asked are: how long will the 'Big Deal' last in its current form? Was it the transitory product of an exceptional period, as libraries and publishers migrated from a primarily print-based information economy to an electronic one? And will new models emerge that retain the benefits of the Big Deal while dispensing with some of the problems associated with this form of bulk purchasing? These are some of the issues addressed by the three independent studies commissioned by the Institute, undertaken by Professor Donald W. King of the School of Information Sciences, University of Pittsburgh; by Key Perspectives, the UK-based research consultancy; and by Professor David Nicholas, Department of Information Sciences, City University London and CiBER.

The Advantages of the Big Deal
The Big Deal has clearly been successful on a number of levels. One of the major beneficial effects has been the rapid and widespread penetration of electronic content, and the consequent increase in usage of that content.

Smaller libraries have been given access to a greatly increased number of electronic titles. One of the features of the Big Deal highlighted by the research was that the small library tends to be the most reliant on the Big Deal to supply the largest part of its holdings and access, while larger libraries still purchase a relatively high proportion of content independently.

Libraries of all sizes have gained budgetary stability via price-capped, multi-year deals with agreed-upon inflationary increases. Where average annual price increases for print holdings had been running at 16 to 22 per cent, Big Deal annual increases have averaged around 6 to 7 per cent.
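The value of that price cap is easiest to see when the quoted increases are compounded over a typical contract term. The sketch below uses the percentages cited above; the starting budget of 100 is an arbitrary index, not a figure from the research.

```python
# Compound effect of annual price increases over a five-year term,
# using the rates quoted above: print-era rises of ~16-22 per cent
# versus price-capped Big Deal rises of ~6-7 per cent.

def compounded(start, annual_rate, years):
    """Cost index after compounding an annual percentage increase."""
    return start * (1 + annual_rate) ** years

print(round(compounded(100, 0.16, 5), 1))  # 210.0 - print-era low end
print(round(compounded(100, 0.06, 5), 1))  # 133.8 - Big Deal low end
```

Even at the low end of each range, uncapped print pricing roughly doubles the bill over five years, while a capped deal adds about a third - the core of the budgetary-stability argument.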

There are further advantages for the library to be gained from whole-list deals: inter-library lending, document delivery, shelving and processing costs may be reduced. Some consortia have also been able to take advantage of specialised servers and software provided by some Big Deal arrangements, and many have benefited from efficiencies in centralised license negotiations.

Participating publishers, too, have experienced many benefits. Although they may have begun to offer Big Deal packages in response to competitor activity, they soon discovered the advantages of additional revenue streams, stable revenues over long license periods and the ability to protect existing subscriptions through the introduction of non-cancellation policies, not to mention the step-increases in market share as greater amounts of content were made available to a vastly expanded user base.

For both libraries and publishers alike, the increase in usage has been greatly welcomed. Although there is still some uncertainty over early usage data from consortia, it would appear that titles previously unavailable to users are accessed to a significant extent once they are made available in electronic form to a consortium user base. Participating publishers have reported dramatic increases in downloads, and this increased usage is believed, in turn, to lead to increased citation and, ultimately, protection against cancellation. Authors have also benefited from greater exposure for their work, and readers from access to a broader information resource and from efficiencies in the discovery and access of articles.

Why, then, given these manifest benefits to all stakeholders in the information value chain, is there doubt about the Big Deal's future? Is this all-you-can-eat model not a win/win/win for publishers, libraries and users?

Disadvantages of the Big Deal
Despite the advantages, there are reservations within the library community about the effect that the Big Deal is having on the collection-building role of the librarian. The all-you-can-eat model, coupled with non-cancellation policies, leaves little room for librarian choice. Acquisition decision-making has, in many cases, moved upstream as consortia purchasing becomes more centralised, while the all-encompassing nature of the Big Deal means a library may be burdened with titles that have little relevance to its users, or which are of low quality, with reduced funds left over for the purchase of additional subscriptions from smaller or society publishers.

Publishers, too, are suffering. Those that have not yet entered into consortia selling have found themselves left out in the cold, as consortia budgets become allocated to already-established players. Subscriptions that are not part of consortial Big Deal packages also look likely to become the target of cancellation drives as libraries look to make savings elsewhere. The research confirmed that the Big Deal has had a negative impact on non-participating publishers, particularly small and learned/professional society publishers.

Even some of the larger publishers have reservations about bulk sales. While libraries may be worried that the Big Deal brings an over-reliance on the publisher, publishers too are concerned that the Big Deal makes them overly dependent on large consortia, and vulnerable to whole-list cancellations on a scale that would have a dramatic impact on revenue and market share. The implications of price-capped deals, and high levels of market penetration, also mean that there are now more limited opportunities for future revenue growth for those publishers now dominating the consortia marketplace.

Market Now Evolving
After rapid adoption of consortial purchasing, the market is now evolving, and alternative models look set to introduce changes on a number of counts. One theme that emerged strongly was that there will be a return to greater selectivity in title purchasing. Some libraries expressed a desire to move away from whole-list deals to license agreements for high-quality journals or packages of content bundled by subject. Others believed there could be more of a role for intermediaries (who so far have been more notable for their absence), and assert that the subscription agent could play a role in selecting and clustering publishers' content on consortia's behalf.

Others believe that consortia themselves will evolve. Currently, the majority of consortia tend to have a heterogeneous membership, often consisting of academic, public and special libraries. It has been suggested that consortia made up of libraries of similar type and purpose could prove the most successful in future, able to negotiate licenses for collections of content that are more consistently appropriate to their members' needs. This would be coupled with a growing trend for individual libraries to belong to several consortia.

We are also beginning to see the emergence of new initiatives designed to help the small and society publisher to participate in international consortial sales. ALPSP, the Association of Learned and Professional Society Publishers, is considering the development of a multi-publisher consortium that would help bring clusters of smaller lists to market. PCG, the Publishers Communication Group, a subsidiary of Ingenta, has recently launched Consortialink, a new initiative that makes it possible for small and medium-sized publishers to participate in consortia negotiations via multi-publisher packages and centralised professional sales negotiation. Over 260 journals are currently included in the Consortialink package, giving libraries the opportunity to purchase titles from smaller publishers in a cost-effective and efficient way.

Usage data is another critical area highlighted for innovation and development. At present, there is wide variation in the usage statistics remitted to libraries by publishers and intermediaries. Only a minority of consortia that load data locally (OhioLINK being one) are able to analyse usage data closely. All participants agree that there needs to be an improvement in the way usage of electronic resources is measured. While libraries need to gain more detailed information about how the content they have licensed is being used, publishers similarly would like a better understanding of how the products they are selling are being accessed. Project COUNTER, a working group made up of many key industry players, is currently working on an internationally agreed Code of Practice, which should pave the way for significant improvements in the consistency and accuracy of usage data.

These improvements will be fundamental to future innovations in purchasing models. Usage statistics are thought to be particularly critical to the formulation of new pricing strategies. There will inevitably be a move away from prices based on historical subscriptions, and both publishers and libraries alike believe that usage will be key in determining the value of content.

Conclusion
While the research highlighted much that has been gained from the Big Deal, preliminary results indicate that it is highly unlikely that things will remain as they are. Both publishers and libraries see the current system as transitory, and most predict that the Big Deal will not be with us in five years' time.

One of the central problems is that the library serials budget is not increasing. Although many libraries secured one-off increases in funds to cover initial Big Deal purchasing, libraries will find it increasingly difficult to cover and to justify the Big Deal bill, and will need to find savings elsewhere to keep up payments for bulk packages of content - for example, in the book budget, through journal cancellations, or by cutting back on inter-library loan and document delivery. This will have a negative impact, both on library collections and on the small, society and book publishers who are the early candidates for cutbacks.

Publishers will need to focus on adding further value to their collections to secure existing subscriptions, and to tap additional budgets for new sales. Those with high market penetration will need to innovate if they are to generate increases in income over and above the revenue from set annual increases in Big Deal pricing. They will need to develop the quality, coverage, and user-features of their products, and may need to penetrate markets beyond the academic if they are to achieve significant organic growth over time.

Nevertheless, while there are pressures for change, inertia in the journals market exerts a strong force. Some foresee a scenario in which hybrid purchasing models will emerge. Library consortia will negotiate licensed access to well-used material, and will turn to document delivery, inter-library loan or transactional systems to access those journals that are used infrequently. It is likely that we shall see growing sophistication in the services and tools offered to libraries to enable them to track and audit usage, and that new price points will emerge based on different types and levels of use.

The traditional distinctions between single article sales, single subscription purchasing and bulk licensing will begin to blur as more and more content goes online-only. Rather than dedicating a significant proportion of their budgets to large collections of predetermined content, libraries will prefer those deals that offer a core collection of high quality content, with the option to access more marginal titles on a more occasional basis.

The full reports referred to in this article will be published as the Proceedings of the Ingenta Institute 2002, available by e-mail.

