This essay was originally published in the Current Contents print editions of July 18, 1994, when Clarivate Analytics was known as the Institute for Scientific Information (ISI).

The Clarivate Analytics Impact Factor, as explained in the last essay,1 is one of the evaluation tools provided by the Clarivate Analytics Journal Citation Reports (JCR). Many features of the JCR can be applied to the real-world task of journal evaluation, and the specific needs of the user ultimately determine which of those components is most appropriate for the task. Doomsday predictions about the exponential growth of scientific literature have not come to pass. While the growth has been slower than forecast, it nevertheless warrants concern. Even though the current situation is not nearly as frightening as had been anticipated, the need to be selective in journal management is all the more imperative.
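
As a refresher on the measure itself, the sketch below applies the standard two-year impact factor reported in the JCR: citations received in a given year to a journal's items from the two preceding years, divided by the number of citable items published in those years. The journal figures are hypothetical and serve only to illustrate the arithmetic.

```python
def two_year_impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Two-year impact factor for year Y: citations received in Y to items
    published in Y-1 and Y-2, divided by the citable items from Y-1 and Y-2."""
    if citable_items_prior_two_years == 0:
        raise ValueError("no citable items published in the two-year window")
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 1,200 citations in 1993 to its 1991-92 articles,
# of which there were 400 citable items, giving an impact factor of 3.0.
print(two_year_impact_factor(1200, 400))
```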

As Bradford’s Law predicts, a small percentage of journals accounts for a large percentage of what is published. An even smaller percentage accounts for what is cited. In other words, there are diminishing returns in trying to cover the literature exhaustively. Careful selection is, therefore, an effective way to avoid “documentary chaos.” This term, coined by Samuel C. Bradford, the former librarian of the Science Museum in London, refers to the anxiety one feels in contemplating the information explosion. Recognizing readers’ need to scan the most significant journals published was the raison d’être for Current Contents.
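
To make the concentration Bradford described concrete, the short sketch below ranks a set of journals by how often they are cited and counts how many of them account for a given share of all citations. The citation counts are invented for illustration, not drawn from the JCR.

```python
def journals_covering_share(citation_counts, share=0.80):
    """Count how many of the most-cited journals are needed to account for
    `share` of all citations -- the Bradford-style concentration effect."""
    total = sum(citation_counts)
    running = 0
    for rank, count in enumerate(sorted(citation_counts, reverse=True), start=1):
        running += count
        if running >= share * total:
            return rank
    return len(citation_counts)

# Invented citation counts for 12 journals in one specialty.
counts = [950, 620, 410, 250, 140, 90, 60, 40, 25, 15, 10, 5]
print(journals_covering_share(counts))  # 4 titles already cover 80% of the citations
```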

It is understandable that publishers are concerned about whether their journals are selected by Clarivate Analytics for inclusion in its database. Indeed, it is sometimes argued that the survival of a particular journal depends on Clarivate Analytics’ decision to cover it in Current Contents. A journal’s ultimate success, however, depends on its quality, distribution, and many other competitive factors, including cost and timeliness. Any one of these factors, including coverage by Clarivate Analytics, can make the difference between success and failure.

Many publishers regularly use the JCR to conduct market research. A concrete example of the JCR’s role in journal market research was presented in an essay about pathology journals.2 Evaluating JCR data made it possible to show that a journal of applied virology was needed, and not long after that essay appeared, such a journal was established.

The JCR can benefit the user in a number of ways. Not only are rankings important, but even more interesting are trends that can be gleaned from the various listings, including the source data, the half-life, and the cited and citing journal listings.

A journal’s reputation may not tell the complete story about its impact on the scholarly community. In fact, a study by Christenson and Sigelman on social science journals suggests quite the opposite.3,4 Their research showed a nonlinear relationship between a journal’s reputation and its impact, especially at the extremes of the prestige scale. They conclude that citation data “permit scholars to evaluate the importance of journals based not on opinion but on the frequency of citations” and that “frequency of citation implies scholarly acceptance, or at least acknowledgment of importance through utilization of others’ work.” The researchers go on to note that “journals have prestige, but their prestige is only derived from the usefulness of the articles they publish.”

The JCR satisfies the need for quantitative measures. It provides a detailed picture of the scientific literature. It shows the journal-to-journal relationships and permits the discerning user to track important trends or changes over the years, such as a shift from pure to applied research. These changes are not always reflected in the names of the journals. For instance, while the title of the Journal of Experimental Medicine conveys one image, its primary focus today is in fact immunology.5

Realizing the need for selectivity and recognizing the JCR as a valuable tool for finding information about journals are both key to effective management of library collections. Strategies to implement effective selection plans include use of the impact factor to determine cost-effectiveness and to identify appropriate journals for a collection.

To deal with essentially static budgets in the face of rising journal costs, Prof. Henry H. Barschall of the University of Wisconsin suggests that the ratio of cost per printed character to journal impact is a good indicator of a journal’s cost-effectiveness.6
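
A minimal sketch of that ratio, using invented prices, printed-character volumes, and impact figures, might look like the following; lower values indicate that a subscription buys more cited content per dollar.

```python
def cost_effectiveness(annual_price, characters_per_year, impact_factor):
    """Barschall-style indicator: cost per printed character divided by
    journal impact. Lower values suggest better value for the money."""
    cost_per_character = annual_price / characters_per_year
    return cost_per_character / impact_factor

# Hypothetical comparison of two journals in the same field.
journal_a = cost_effectiveness(annual_price=1500.0,
                               characters_per_year=30_000_000,
                               impact_factor=3.0)
journal_b = cost_effectiveness(annual_price=400.0,
                               characters_per_year=8_000_000,
                               impact_factor=1.0)
print(journal_a < journal_b)  # True: journal A is the more cost-effective title
```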

Tony Stankus, science librarian at the College of the Holy Cross in Worcester, Massachusetts, has written several articles and books on the use of citation data to characterize publishing trends. In an article coauthored with Carolyn Mills, Stankus suggests that a good rule of thumb is to include in a science library collection the journals that have held impact factor leadership within their specialty over the course of a 10-year period.8 Those journals, in turn, will lead to others cited by them.
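
A hedged sketch of that rule of thumb appears below: it keeps only the journals that stayed near the top of their specialty’s impact-factor ranking in every year of a ten-year window. The journal names and rankings are entirely hypothetical.

```python
def sustained_leaders(yearly_rankings, top_n=3, years_required=10):
    """Return the journals that appeared in the top `top_n` of their
    specialty's impact-factor ranking in every year of the window."""
    if len(yearly_rankings) < years_required:
        raise ValueError("need at least a full window of yearly rankings")
    leaders = None
    for ranking in yearly_rankings[-years_required:]:
        top = set(ranking[:top_n])
        leaders = top if leaders is None else leaders & top
    return leaders

# Ten invented years of one specialty's ranking (best journal listed first).
rankings = [["Cell Q", "Bio Letters", "J Hypothetical Biol", "Ann Example"]] * 10
print(sustained_leaders(rankings, top_n=2))  # the two titles that led every year
```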

Evaluation of journals is a formidable but necessary task considering the wide range of choices available. Limited funding and space, as well as other factors, dictate the need for a carefully planned strategy of journal selection. The JCR offers many valuable indicators—including the impact factor—to help deal with the series of decisions involved in the establishment and maintenance of an effective library collection.

Dr. Eugene Garfield 
Founder and Chairman Emeritus, ISI


References


  1. Garfield E. The impact factor. Current Contents® (25):3-7, 20 June 1994.
  2. ——————. Citation analysis of pathology journals reveals need for a journal of applied virology! Essays of an Information Scientist. Philadelphia: ISI Press®, 1977. Vol. 1. p. 400-3.
  3. Christenson J A, Sigelman L. Accrediting knowledge: Journal stature and citation impact in social science. Soc. Sci. Quart. 66:964-75, 1985.
  4. Garfield E. Prestige versus impact: Established images of journals, like institutions, are resistant to change. Essays of an Information Scientist. Philadelphia: ISI Press®, 1989. Vol. 10. p. 263-4.
  5. ——————. Journal citation studies. III. Journal of Experimental Medicine compared with Journal of Immunology; or, How much of a clinician is the immunologist? Essays of an Information Scientist. Philadelphia: ISI Press®, 1977. Vol. 1. p. 326-9.
  6. Barschall H H. Cost-effectiveness of physics journals. Phys. Today 41(7):56-9, 1988.
  7. O’Neill A L. The Gordon and Breach litigation: a chronology and summary. Library Resources and Technical Services 37(2):127-33, 1993.
  8. Stankus T, Mills C V. Which life science journals will constitute the locally sustainable core collection of the 1990s and which will become “fax-access” only? Predictions based on citation and price patterns 1979-1989. Science and Technology Libraries 13(1):73-114, 1992.