A question of stewardship

Terry Hulbert reports on discussions at STM’s Annual Spring Conference

‘We’re running out of money’; ‘We’re producing too many PhDs’; ‘We can’t keep up with the data we’re producing’; ‘We seem to be doing each other’s job’.

These are some of the sound bites from STM’s Annual Spring Conference, held in Washington DC, USA at the end of April. With an overarching theme of ‘the future of scholarly publishing,’ the topics covered were wide and varied, including stewardship, funding, metrics and the now-ubiquitous big data.

In the opening keynote, José-Marie Griffiths, vice president for academic affairs at Bryant University, USA, gave a stark introduction to the challenges of scholarly stewardship. She observed that initiatives seeking solutions to this challenge often fail to produce any longer-term solutions or ideas. What’s more, she argued, the participants in such initiatives are frequently only those who can afford to take part, already a self-selecting group.

As Griffiths observed the disappearance of some academic institutions, she asked, perhaps poignantly, ‘what will happen to their output?’ She went further by wondering if universities should focus on their core missions and avoid the trap of becoming publishers.

Panellists in the next session of the conference echoed these thoughts. Publishers are mission driven, seeking to provide researchers with the tools to record, store, and use findings, the panellists said. However, publishers are not alone in this, since authors, academic institutions, societies, and governments all share elements of this broad mission. Several panellists noted overlapping and conflicting priorities among these groups. Who should be taking the lead in which area? Inefficiencies are introduced into the system when different parties end up doing the same thing, and this will not help build a sustainable system and process for the future.

Griffiths returned to her earlier theme, asking whether universities are falling prey to mission creep: in a climate of falling budgets, they feel there are ever more things they could and should do. These observations suggested that stewardship and preservation might be areas for the publishing industry to resolve, a topic that Clifford Lynch of the Coalition for Networked Information (CNI) returned to elsewhere in the event.

Alan Thornhill of the United States Geological Survey (USGS) highlighted the size of his organisation, with 3,000 researchers and an annual budget of $1bn, but also asked whether the government is a reliable long-term steward. He highlighted in particular those examples where data is jointly published with another US agency or an overseas organisation. Questions such as who owns what, and how much, are still far from resolved.

The final session of the first day addressed the impact of budget cuts. Alan DeCherney of the National Institute of Child Health & Human Development (NICHD) offered a chilling opinion on funding research in the USA. While budget cuts are felt everywhere, in defence and public housing as well as in labs, DeCherney claimed that ‘research is an easy target to cut, but has profound long-term negative results’.

NIH funding has been falling over the last four years while the number of grant applications is increasing. In his view, the gap between available funding and the demands of medical research will never be closed, however quickly funding might be increased. He suggested that ‘perhaps we’re turning out too many PhDs’.

The growth in researcher numbers and changes in research patterns have another familiar consequence too: the growth in discussions about ‘big data’, an area that has its own emerging vocabulary, as Mark Davis of Dell Computers discussed at the conference. He spoke of the concepts of data momentum (big data is hard to move) and data gravity (ecosystems of solutions grow up around data if it is persistent and reliable).

The scale of the big data issue was illustrated in a presentation from Alex Szalay of Johns Hopkins University about the Sloan Digital Sky Survey, a project that ran from 1992 until 2008. At the outset, it was anticipated that the survey would result in 2.5 terapixels and over 10 terabytes of raw data. In fact, the eventual output far outstripped this, with over 5 terapixels of images and over 400 terabytes of raw data collected. The sheer scale and growth of data for such projects can be overwhelming.

So what does all this mean for publishers? Lynch of CNI made the provocative, although some would say accurate, comment that publishers have not yet achieved much in the digital age. Content has been digitised and reference and citation pathways have been created, but, despite this, the process and output of science would still be recognisable to a scientist from 100 years ago. A sobering opinion. He believes there is still a need to do something truly profound and re-think how we can represent research results.

Big data does provide an opportunity for the publishing community to codify and modify the new norms of behaviour in this area. Everything is new and uncharted and publishers can seize this domain, set the standards and show the value that they add to the publishing process. Lynch predicted that ‘data curation will be dynamic; it won’t simply be in one place forever’.

Publishers may feel they have come far but, as Lynch suggested in his presentation, this may not be so, and he may have a point. Many areas remain in their infancy even if we believe otherwise. Turmoil and uncertainty remain, but paradoxically this provides excitement and opportunity. There is still much to apply our minds to, problems to solve and challenges to meet, and the industry is clearly engaged. Even though we operate in turbulent times, the intellectual curiosity and stamina evident in Washington was a strong counter to any who might accuse the publishing industry of complacency.

Terry Hulbert is a consultant in the scholarly publishing industry and helped to organise the STM Innovations Seminar and STM Annual Spring Conference.