Librarians still have a vital role in the Web 2.0 era


Information professional Hervé Basset shares his observations about the role of Web 2.0 technology in science intelligence in industry

In recent months much has been written about the much-hyped Web 2.0. Evangelists have talked about applying it to almost everything published or distributed online, and many people expect scientists to lead the Web 2.0 pack. Observing the behaviour of my end users, however, it is not yet clear to me that this prophecy has been fulfilled.

The concept of science intelligence (SI) is to science what business intelligence is to business. SI is defined as the combination of technology, methods and tools used by an organisation to monitor its scientific environment, in order to maintain its level of knowledge and to face challenging issues.

It includes the processes of gathering, analysing, storing and disseminating information. It is about making a science company more innovative, more efficient, more compliant and more competitive.

The concept is not a brand new one, but companies rarely have a global view of the entire system. One group might be excellent at gathering information, for example, while another might focus on internal repositories. For a science organisation, therefore, it can be tempting to be seduced by the evangelical promises of Science 2.0 as a way of improving some of these processes.

Information gathering

Online science has achieved great success in the last few years, thanks to close collaboration between publishers and library services. Over 90 per cent of STM journals are online, more content is available to more users than ever before, and all at a lower cost per article than ever. Researchers are less and less dependent on physical libraries and, according to a 2006 report by the Publishing Research Consortium (PRC), 90 per cent of scientists think that virtual libraries have enabled them to become more efficient and to save significant time in finding and retrieving articles. Researchers can now spend more time analysing information than gathering it. This is a real achievement, but it owes nothing to Web 2.0.

In terms of information gathering, scientists are pretty conservative. Peer review remains the gold standard, conference networking the norm, and they prefer small circles of colleagues built on personal contacts. On the whole, scientists are satisfied with traditional ways of staying informed about scientific progress: the PRC study found that 97 per cent of biomedical researchers believe they are very up to date with the current literature in their field. Databases (literature, patents, etc.) and their favourite journals sustain them on a daily basis. They feel no need to turn to emerging channels such as blogs, wikis and social networks to feed their brains.

For collecting scientific information, a handful of initiatives have introduced elements of Web 2.0 (such as Novoseek for searching Medline), but PubMed remains the reference despite its old-fashioned interface. STM publishing blockbusters such as Web of Science and Derwent seem reluctant to introduce disruptive features. EndNote and ISI impact factors are still the standards, despite various competitors.

In addition, several studies have shown that corporate researchers still prefer paid tools to free web services. Time saving, relevance of results, authoritativeness and current-awareness services are the main arguments given by end users. They also like email: I notice that my end users are not really enthusiastic about the RSS feeds installed, for instance, on our intranet. To keep up to date with the information collected, most scientists seem to prefer receiving plain emails (despite their daily complaints of email overload). They are probably afraid of missing some critical data. RSS, which requires users to check their reader regularly, is poorly suited to business-critical monitoring; the risk of missing important news is too high if the reader goes unchecked for a week.

Expertise and knowledge storage

Information analysis is clearly the most complex and sensitive part of the SI system, because it directly influences end users' decisions. This stage is the most time-consuming, but also the most interesting. Researchers can probably benefit from efficient tools to help them with this tedious task, and librarians can educate or assist end users with resources such as comprehensive guides to document analysis and automatic tools for digesting large amounts of data (summarisers, data-mining solutions, clustering displays, etc.). In the life sciences, databases such as Thomson Pharma or Elsevier's Illumin8 offer a range of visualisation tools to help make sense of the data, and natural language processing to refine queries. Once again, these tools do not come from the Web 2.0 generation; they are closer to Web 3.0 concepts, where systems add a layer of intelligence to search results.

In terms of knowledge management, there is nothing really exciting under the 2.0 umbrella either. Microsoft SharePoint, which tends to be the standard electronic document management system (EDMS) in companies, is far from user-centred in its features. Security still takes precedence over ease of use, although the 2010 release will certainly bring more flexibility and customisation.

Open archive initiatives are often associated with Web 2.0 because both share the ambition of setting knowledge free. Such initiatives are welcome, but they are struggling to get off the ground, and companies are reluctant to adopt this disruptive way of publishing.

Collaboration

There is no doubt that collaboration is the stage where the most visible Web 2.0 changes could be expected, because blogs, wikis and social networks have profoundly changed how we communicate in our private lives. However, Science 2.0 is far from a massive success. Technorati tracks some 180 million blogs worldwide, but only about 40,000 of them are related to science. ScienceBlogs.com draws its content from only 70 blogs, while ResearchBlogging (which aggregates posts about peer-reviewed research) takes its content from just a few dozen. Some 90 per cent of French researchers have never published a post, according to research by INIST and CNRS. This is not because they don't want to share their knowledge: 85 per cent of them are involved in external communication projects, and 60 per cent feel ready to contribute to scientific forums and wikis.

Nor is the lack of uptake due to the relative newness of these technologies: after all, killer applications such as the iPod have conquered large audiences in only a few months. In 2008, David Crotty posted a brilliant summary of the situation on his Bench Marks blog: 'when you step away from the enthusiasts and speak with the majority of scientists, you find out that they don't have much interest in using many of these new technologies.' The major reasons for this failure are well known: scientists don't have time, and feel no need to supplement their traditional resources with blogs maintained by young researchers, little-known specialists or science writers; they don't rely on web information in general; and they dislike the clutter of personal trivia, irrelevant widgets, links to commercial shops, redundant and outdated news, ego-centred content and so on. All of this has contributed to a general lack of credibility.

As far as social networks are concerned, the excellent BioMedExperts has gone a good way towards leading the pack, with 80,000 users in its first year. But despite various initiatives such as Nature Network and SciLink, none has yet matched the success of the two proto-social sites of the 1990s, ChemWeb and BioMedNet, which had more than one million members at their height, as David Bradley, author of ScienceBase, reminds us. In the same way, the successful science wikis can be counted on the fingers of one hand; SciTopics is among the few worth reading.

Overall, the Science 2.0 landscape appears very confused, with too many redundant services and a lack of incentives. Neither blogs nor social networks come close to replacing traditional networking over coffee at conferences or information from traditional publishing.

Low adoption

At the time of writing, I do not think that Science 2.0 has deeply affected the main stages of science intelligence within organisations. Major benefits are not obvious – not yet, anyway, the Web 2.0 evangelists would say.

It is probably a shame, but it is a reality that, as a German blogger wrote recently, 'most of today's scientists use the internet for science at roughly the level of 1994: browsing and email'. Some surveys predict the advent of Science 2.0 within five years; promises and expectations are high.

It is not yet obvious how Web 2.0 could bring many benefits to an organisation's science intelligence system. Web 2.0 is wonderful for private or casual purposes but not for critical business, which is why so few companies have yet embraced Science 2.0.

The role of librarians

What's more, and perhaps surprisingly, Web 2.0 does not give researchers real autonomy in terms of search quality. A recent survey by Akel & Associates showed how much corporate end users rely on librarians: among researchers who work with librarians, 90 per cent believe that librarians make significant contributions to their R&D efforts. Studies tend to demonstrate that the presence of information professionals drives successful research and helps companies stay competitive. 'The more info, the more important the info pro,' said industry consultant Mary Ellen Bates. For this reason and many others, librarians continue to play the role they have always played: facilitators between information and end users.

Hervé Basset is a librarian at a major pharmaceutical company, where his responsibilities include coordinating different R&D libraries in Europe and the USA. His current interests focus on monitoring technologies (science intelligence) and the application of Web 2.0 to the science business.
