Among the many discoveries that Derek de Solla Price made during his lifetime, I find Figure 1 the most inspiring (1). In this picture, de Solla Price provides a graphic illustration of the exponential growth of the scientific journal literature since the appearance of the first journals in 1665. De Solla Price had been fascinated with journals and their exponential growth in size and numbers ever since his first study of the Philosophical Transactions of the Royal Society of London from its very beginning in 1665 (2, 3).
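The growth that the figure charts can be made concrete with a toy calculation: if the number of journals grows exponentially, a log-linear fit recovers the growth rate and hence the doubling time. The counts below are synthetic, constructed only to mimic a tenfold increase per half century; they are not taken from de Solla Price's data.

```python
import math

# Synthetic journal counts at 50-year intervals (illustrative only,
# NOT de Solla Price's data): a tenfold increase every half century.
years = [1700, 1750, 1800, 1850, 1900, 1950]
counts = [10, 100, 1_000, 10_000, 100_000, 1_000_000]

# Least-squares fit of log(count) = a * year + b; a is the growth rate.
n = len(years)
logs = [math.log(c) for c in counts]
x_mean = sum(years) / n
y_mean = sum(logs) / n
a = sum((x - x_mean) * (y - y_mean) for x, y in zip(years, logs)) \
    / sum((x - x_mean) ** 2 for x in years)

doubling_time = math.log(2) / a
print(f"doubling time ~ {doubling_time:.1f} years")  # → doubling time ~ 15.1 years
```

Under these assumed figures, the doubling time comes out at roughly fifteen years, the order of magnitude commonly associated with de Solla Price's growth curves.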

On the basis of an experimental version of the Science Citation Index in 1961, de Solla Price formulated a program for mapping the sciences in terms of aggregated journal-journal citation structures as follows:

“The total research front of science has never, however, been a single row of knitting. It is, instead, divided by dropped stitches into quite small segments and strips. From a study of the citations of journals by journals I come to the conclusion that most of these strips correspond to the work of, at most a few hundred men at any one time. Such strips represent objectively defined subjects whose description may vary materially from year to year but which remain otherwise an intellectual whole. If one would work out the nature of such strips, it might lead to a method for delineating the topography of current scientific literature. […] Journal citations provide the most readily available data for a test of such methods” (4).

Organization of knowledge

Over the past 20 years, I have addressed the question of whether the aggregated citation relations among journals can be used to study clusters of journals as representations of the intellectual organization of the sciences. If the intellectual organization of the sciences is operationalized using journal structures, three theoretically important problems can be addressed:

1. In science studies, this operationalization of the intellectual organization of knowledge in terms of texts (journals), as distinct from the social organization of the sciences in terms of institutions and people, would enable us to explain the scientific enterprise as a result of these two interacting and potentially coevolving dimensions (5, 6, 7).

2. In science policy analysis, the question of whether a baseline can be constructed for measuring the efficacy of political interventions was raised by Kenneth Studer and Daryl Chubin (8; cf. 9, 10). Wolfgang van den Daele et al. distinguished between parametric steering, in terms of increased institutional activity due to additional funding, and the relative autonomy and potential self-organization of scientific communication into specialties and disciplinary structures (11).

3. While journal Impact Factors are defined with reference to averages across the sciences (12, 13), important parameters of intellectual organization, such as publication and citation frequencies, vary among disciplines (14). In fact, publication practices across disciplinary divides are virtually incomparable (15, 16, 17). The Impact Factor is a global measure that does not take into account the intellectual structures in the database.
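For concreteness, the (two-year) Impact Factor referred to here is a simple ratio: citations received in year Y to a journal's items from the two preceding years, divided by the number of citable items the journal published in those two years. The figures below are invented for illustration.

```python
# Invented figures for a hypothetical journal in year Y.
citations_in_y_to_prev_two_years = 1200
citable_items_prev_two_years = 400

# Two-year Impact Factor: a single ratio averaged over all papers,
# regardless of the citation practices of the journal's discipline --
# which is the comparability problem raised in the text.
impact_factor = citations_in_y_to_prev_two_years / citable_items_prev_two_years
print(impact_factor)  # → 3.0
```

Because the same formula is applied across all fields, a journal in a slow-citing discipline is mechanically disadvantaged relative to one in a fast-citing discipline, whatever its standing within its own field.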

Mapping the data

De Solla Price conjectured that specialties would begin to exhibit ‘speciation’ when the carrying community grows larger than a hundred or so active scientists (18). Furthermore, the proliferation of scientific journals can be expected to correlate with this because new communities will wish to begin their own journals (4, 19). New journals are organized within existing frameworks, but the bifurcations and other network dynamics feed back on the historical organization to the extent that new fields of science and technology become established and existing ones reorganized.
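A minimal sketch of what analyzing ‘aggregated journal-journal citation structures’ can look like: from a matrix recording how often each journal cites each other journal, one can compute the cosine similarity of citing profiles and group journals whose profiles cohere. The journal names, citation counts, and the 0.5 threshold below are all invented assumptions for illustration; real analyses draw on the Journal Citation Reports.

```python
import numpy as np

# Toy aggregated journal-journal citation matrix: C[i, j] = number of
# times journal i cites journal j in a given year. All values invented.
journals = ["PhysRev", "PhysLett", "Cell", "J Biol Chem"]
C = np.array([
    [50, 30,  1,  0],
    [28, 40,  0,  1],
    [ 0,  1, 60, 35],
    [ 1,  0, 33, 55],
], dtype=float)

# Cosine similarity between citing profiles (rows).
norms = np.linalg.norm(C, axis=1, keepdims=True)
S = (C / norms) @ (C / norms).T

# Simple single-link clustering: connect journals whose similarity
# exceeds an (assumed) threshold, then take connected components.
threshold = 0.5
adj = S > threshold
labels = -np.ones(len(journals), dtype=int)
for seed in range(len(journals)):
    if labels[seed] == -1:
        stack, labels[seed] = [seed], seed
        while stack:
            i = stack.pop()
            for j in np.nonzero(adj[i])[0]:
                if labels[j] == -1:
                    labels[j] = seed
                    stack.append(j)

clusters = {}
for name, lab in zip(journals, labels):
    clusters.setdefault(lab, []).append(name)
print(list(clusters.values()))
# → [['PhysRev', 'PhysLett'], ['Cell', 'J Biol Chem']]
```

In this toy case the two physics journals and the two biology journals separate cleanly; in real data the clusters are the ‘strips’ of de Solla Price's knitting metaphor, and their boundaries shift from year to year.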


Whereas the variation is visible in the data, the selection mechanisms remain latent and can therefore only be hypothesized. On the one hand, these constructs are needed as dimensions for mapping the data. On the other hand, the constructs remain ‘soft’; that is, open to debate and reconstruction. De Solla Price’s dream of making scientometric mapping a relatively hard social science (20) can, with hindsight, be considered fundamentally flawed (21, 22). When both the data and the perspectives are potentially changing, the position of the analyst can no longer be considered neutral (23).


Figure 1 – In this graph showing the number of journals founded (not surviving) as a function of date (NB, the two uppermost points are taken from a slightly differently based list), de Solla Price provides a graphic illustration of the exponential growth of the scientific journal literature since the first journals in 1665 (1).

Professor Loet Leydesdorff
Amsterdam School of Communications Research,
University of Amsterdam



(1) de Solla Price, D.J. (1961) Science Since Babylon. New Haven, USA: Yale University Press.
(2) de Solla Price, D.J. (1951) “Quantitative measures of the development of science”, Archives Internationales d’Histoire des Sciences, Vol. 14, pp. 85–93.
(3) de Solla Price, D.J. (1978) “Toward a model of science indicators”, In: Elkana, Y., Lederberg, J., Merton, R.K., Thackray A. and Zuckerman H. (Eds.) The Advent of Science Indicators, pp. 69–95. New York, USA: John Wiley and Sons.
(4) de Solla Price, D.J. (1965) “Networks of scientific papers”, Science, Vol. 149, pp. 510–515.
(5) Whitley, R.D. (1984) The Intellectual and Social Organization of the Sciences. Oxford, UK: Oxford University Press.
(6) Leydesdorff, L. (1995) The Challenge of Scientometrics: The Development, Measurement, and Self-Organization of Scientific Communications. Leiden, the Netherlands: DSWO Press, Leiden University.
(7) Leydesdorff, L. (1998) “Theories of citation?”, Scientometrics, Vol. 43, No. 1, pp. 5–25.
(8) Studer, K.E. and Chubin, D.E. (1980) The Cancer Mission. Social Contexts of Biomedical Research. Beverly Hills, USA: Sage.
(9) Leydesdorff, L. and van der Schaar, P. (1987) “The use of scientometric indicators for evaluating national research programmes”, Science & Technology Studies, Vol. 5, pp. 22–31.
(10) Leydesdorff, L., Cozzens, S.E. and van den Besselaar, P. (1994) “Tracking areas of strategic importance using scientometric journal mappings”, Research Policy, Vol. 23, pp. 217–229.
(11) van den Daele, W., Krohn, W. and Weingart, P. (Eds.) (1979) Geplante Forschung: Vergleichende Studien über den Einfluss politischer Programme auf die Wissenschaftsentwicklung. Frankfurt a.M., Germany: Suhrkamp.
(12) Garfield, E. (1972) “Citation analysis as a tool in journal evaluation”, Science, Vol. 178, No. 4060, pp. 471–479.
(13) Garfield, E. (1979) Citation Indexing: Its Theory and Application in Science, Technology, and Humanities. New York, USA: John Wiley and Sons.
(14) de Solla Price, D.J. (1970) “Citation measures of hard science, soft science, technology, and nonscience”, In: Nelson, C.E. and Pollock D.K., (Eds.) Communication among Scientists and Engineers, pp. 3–22. Lexington, MA, USA: Heath.
(15) Leydesdorff, L. (2008) “Caveats for the use of citation indicators in research and journal evaluation”, Journal of the American Society for Information Science and Technology, Vol. 59, No. 2, pp. 278–287.
(16) Cozzens, S.E. (1985) “Using the archive: Derek Price’s theory of differences among the sciences”, Scientometrics, Vol. 7, pp. 431–441.
(17) Nederhof, A.J., Zwaan, R.A., Bruin, R.E. and Dekker, P.J. (1989) “Assessing the usefulness of bibliometric indicators for the humanities and the social sciences: A comparative study”, Scientometrics, Vol. 15, pp. 423–436.
(18) de Solla Price, D.J. (1963) Little Science, Big Science. New York, USA: Columbia University Press.
(19) Van den Besselaar, P. and Leydesdorff, L. (1996) “Mapping change in scientific specialties: a scientometric reconstruction of the development of artificial intelligence”, Journal of the American Society for Information Science, Vol. 47, pp. 415–436.
(20) de Solla Price, D.J. (1978) “Editorial Statement”, Scientometrics, Vol. 1, No. 1, pp. 7–8.
(21) Wouters, P. and Leydesdorff, L. (1994) “Has Price’s dream come true: is scientometrics a hard science?”, Scientometrics, Vol. 31, pp. 193–222.
(22) Wouters, P. (1999) The Citation Culture. Amsterdam, the Netherlands: Unpublished Ph.D. Thesis, University of Amsterdam.
(23) Leydesdorff, L. (2006) “Can scientific journals be classified in terms of aggregated journal-journal citation relations using the journal citation reports?”, Journal of the American Society for Information Science & Technology, Vol. 57, No. 5, pp. 601–613.