Issue 7 – September 2008

Articles

Celebrating the legacy of de Solla Price

The relevance of Derek de Solla Price’s work may have taken a long time to be fully recognized, but 25 years after his death he is far from forgotten. Dr. Eugene Garfield looks back at his legacy.

Twenty-five years after his death, Derek de Solla Price is still explicitly cited in about 100 scholarly publications each year. The implicit citation of his work is undoubtedly much greater. Rarely does a week go by without someone referring to his aphorism that “80 percent to 90 percent of the scientists who ever lived are alive now.” Having just reread my own 1984 tribute to Derek, I can say that there is little I could add to those remarks to further demonstrate the impact of his work.
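
The aphorism is a straightforward consequence of exponential growth. As a back-of-the-envelope sketch, assume (purely for illustration; the figures below are not Derek's own) that the number of scientists doubles roughly every 15 years and that a scientific career lasts about 45 years:

```latex
% N(t) grows exponentially with doubling period d, so N(t) = N_0 * 2^{t/d}.
% Treat "alive now" as everyone who entered science within the last T years (one career).
\[
\frac{\text{alive now}}{\text{ever lived}}
  \;\approx\; \frac{\int_{t-T}^{t} N_0\,2^{s/d}\,ds}{\int_{-\infty}^{t} N_0\,2^{s/d}\,ds}
  \;=\; 1 - 2^{-T/d},
  \qquad T = 45,\ d = 15 \;\Rightarrow\; 1 - 2^{-3} = 87.5\,\% .
\]
```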

That impact will increase as the field of scientometrics continues to experience its own exponential growth. And the award of the Derek de Solla Price Medal will be a regular reminder of his pioneering role. Those who wish to know more about his influence on me and on several generations of citation analysts, bibliometricians and science policy enthusiasts should visit my personal Web page, where his presence and influence are immediately apparent. Of particular interest is the Citation Classic commentary Derek wrote a few months before his death about his most cited work, Little Science, Big Science.

Delayed recognition

From a long-term historical perspective it is worth noting that de Solla Price’s career exemplifies delayed recognition. His 1951 paper on the exponential growth of science, “Quantitative measures of the development of science”, published in the relatively obscure Archives Internationales d’Histoire des Sciences (1), was essentially ignored.

Over a decade later there was still little or no recognition of his seminal observation. Even after Science Since Babylon was published in 1961, there was only a trickle of recognition. Then, in 1963, his future Citation Classic, Little Science, Big Science (2) was published. However, another two decades would pass before citations to his work would reach their peak.

Although Derek was a few years older than me, when he died it felt like I had lost a younger brother. In many ways Derek was a teenager till the end. He had an impish personality. I often had to chastise him for inappropriate behavior for which he always immediately apologized. Derek’s untimely death denied him the opportunity of using citation analysis to support nominations for the Nobel Prize. He had just been elected to the Royal Swedish Academy of Sciences. Unbeknownst to either of us at the time, the librarian of that prestigious institution had been using the Science Citation Index to provide documentation support to all nominations submitted to the Nobel committees.

To demonstrate the citation impact of Derek’s published work, we recently updated several HistCite collections.

Data with far-reaching potential

In closing, since it is unlikely that most readers will have access to the printed volumes, it is worth calling special attention to Derek’s foreword to volume 3 of Essays of an Information Scientist (3). In it he recalls the day we met when I appeared before the Science Information Council of the National Science Foundation (NSF) seeking support to create the experimental Genetic Citation Index. NSF refused the request but NIH funded the study.

Notwithstanding the refusal, I personally was immediately struck by the realization that citation links represented a radically new kind of data with far-reaching potential. Though we couldn’t predict with absolute certainty how much a citation index might be used, or even to what purpose, it seemed clear to me that such an index must be developed. It also seemed clear to me that such an index would have a good chance of becoming a commercial success, instead of becoming a permanent burden on the Federal budget; though a new immigrant to the land of Federal fiscal matters, I was able to recognize that prospect as being nearly unique.

Bit by bit we have begun to understand how citations work, and in the course of this, there has emerged a new sort of statistical sociology of science that has thrown light on many aspects of the authorship, refereeing and publication of scientific research papers. The Society for Social Studies of Science now has an annual meeting devoted to this new method of understanding science that has grown, almost as an accidental by-product, from the indexing technology developed by the Institute for Scientific Information. Our initial intuitive perceptions have turned out to be correct.

Dr. Eugene Garfield

Contact him directly

References:

(1) de Solla Price, D.J. (1951) “Quantitative measures of the development of science”, Archives Internationales d’Histoire des Sciences, Vol. 14, pp. 85–93.
(2) de Solla Price, D.J. (1963) Little Science, Big Science. New York, USA: Columbia University Press.
(3) de Solla Price, D.J. (1977–1978) “Foreword”, Essays of an Information Scientist, Vol. 3, pp. v–ix.

How de Solla Price influenced my work

When Professor Leo Egghe met Derek de Solla Price in 1981, he had little idea of the influence de Solla Price would have on his informetrics career. Here, Egghe recalls how de Solla Price’s universal philosophy on the science of science has inspired his thinking.

I was fortunate enough to meet Derek de Solla Price at a lecture he gave in Brussels in 1981. At that time, I was at a crossroads in my career: after my Ph.D. in mathematics in 1978, I became chief librarian of the Limburgs Universitair Centrum (now Universiteit Hasselt), a position I still occupy. In 1983, together with the then chief librarian of the University of Antwerp, Prof. H. Vervliet, I helped lay the foundations of the degree programme in library and information science. In that year, I became a part-time professor in this field and still teach courses on Quantitative Methods in Library and Information Science and on Information Retrieval. After finishing a book on mathematics in 1984 (1), I switched to informetrics research. When I met de Solla Price, I was not yet an informetrician and had no idea of the influence he was going to have on my future career.

The science of science

It was not so much de Solla Price’s mathematical work that influenced me as his universal philosophy on the science of science. His book Little Science, Big Science (2) describes growth distributions and size- and rank-frequency distributions of very different phenomena in information science, the physical world, linguistics, econometrics and so on. This book showed me that many of these phenomena obey common laws and can be described in one framework, which I called Information Production Processes (IPPs) (3, 4). I defined an IPP as a system in which ‘sources’ have or produce ‘items’; as de Solla Price explained (2), such systems can be constructed far beyond information science.

A classic bibliography is an example of an IPP. Authors have papers, yielding another example. But papers can also be sources, producing or receiving items in the form of references or citations. Books are sources of their borrowings; words are sources (known as ‘types’ in linguistics) and their occurrences in the text are the items (‘tokens’ in linguistics). Beyond informetrics, as de Solla Price describes, we have communities (cities and villages) as sources and their inhabitants as items (demography), and in econometrics one can consider employees as sources and their production or salary as items (2).

This universality is not the only remarkable thing. De Solla Price noted that all these phenomena (or IPPs) also satisfy the same sociometric (informetric) laws:

  • exponential or S-shaped growth functions;
  • size-frequency functions (expressing the number of sources with a certain number of items) of power-law type, such as Lotka’s law (5); and
  • rank-frequency functions (expressing the number of items in the source at rank r, where sources are arranged in decreasing order of the number of items they have), also of power-law type but with a different exponent from the size-frequency case, such as the laws of Zipf (linguistics), Mandelbrot and Pareto (econometrics).

Essentially, these are all the same laws and are equivalent to Lotka’s law.
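
To make the source/item vocabulary concrete, here is a minimal sketch in Python; the data are invented for illustration and are not drawn from Egghe’s or de Solla Price’s work. It computes both functions for the same IPP:

```python
# A minimal sketch (invented data) of the two equivalent views described above:
# the size-frequency function f(n) (how many sources have n items, Lotka-type law)
# and the rank-frequency function g(r) (items of the r-th largest source, Zipf-type law).
from collections import Counter

# Hypothetical IPP: authors (sources) and the number of papers (items) each produced
items_per_source = [1, 1, 1, 1, 1, 1, 2, 2, 2, 3, 3, 5, 8, 13]

# Size-frequency: n -> number of sources with exactly n items (Lotka's law says roughly C / n^a)
f = Counter(items_per_source)
print("size-frequency:", dict(sorted(f.items())))

# Rank-frequency: rank r (1 = most productive source) -> its number of items
g = sorted(items_per_source, reverse=True)
print("rank-frequency:", {r: n for r, n in enumerate(g, start=1)})
```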

Success breeds success

It is remarkable that, while rank-frequency functions are studied in informetrics, linguistics and econometrics, only informetrics studies size-frequency functions, via Lotka’s law. De Solla Price introduced Lotka’s law into informetrics and – although it is equivalent to the rank-frequency laws – the size-frequency function (Lotka’s law) is easier to work with since it does not use source rankings.

The universality of de Solla Price’s view of the science of science has influenced my entire informetrics career.

De Solla Price even introduced the econometric principle of ‘success breeds success’ (SBS) into informetrics, building on the earlier work of Nobel Prize-winner Herbert Simon (6, 7). SBS is the principle that (in my terminology) the probability that a new item is produced by a source that already has many items is higher than the probability that it is produced by a source with only a few items. This led de Solla Price to a partial explanation of Lotka’s law (7).
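
A Simon-type urn scheme makes the principle concrete. The sketch below is illustrative only (the parameter values and implementation details are mine, not de Solla Price’s or Simon’s original formulation): each new item is attached to an existing source with probability proportional to the items that source already holds, and the resulting size-frequency function develops the heavy, Lotka-like tail that the principle predicts.

```python
import random
from collections import Counter

def simon_sbs(steps=200_000, alpha=0.2, seed=42):
    """Simulate a 'success breeds success' (Simon-type) urn process.

    At each step a new item arrives. With probability alpha it founds a
    brand-new source (with one item); otherwise it is attached to an
    existing source chosen with probability proportional to the number
    of items that source already has (cumulative advantage).
    """
    rng = random.Random(seed)
    items_per_source = [1]          # start with one source holding one item
    item_owners = [0]               # one entry per item: index of its source
    for _ in range(steps):
        if rng.random() < alpha:
            items_per_source.append(1)
            item_owners.append(len(items_per_source) - 1)
        else:
            # picking a random *item* selects its source with probability
            # proportional to that source's current item count
            src = item_owners[rng.randrange(len(item_owners))]
            items_per_source[src] += 1
            item_owners.append(src)
    return items_per_source

sizes = simon_sbs()
freq = Counter(sizes)               # size-frequency function f(n)
for n in (1, 2, 4, 8, 16):
    print(n, freq.get(n, 0))        # heavy, power-law-like tail expected
```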

More recently, de Solla Price’s work (8) has lent itself to research I am currently undertaking on the relation between productivity (number of papers) and collaboration (co-authorship). He indicated (in my terminology) that if one takes an author as the IPP, with his or her papers as sources and the co-authors of each paper as items, one may find that researchers produce more papers when they collaborate more – a finding that seems to be confirmed in my recent work (in progress).

The universality of de Solla Price’s view of the science of science has influenced my entire informetrics career. Since 1985, I have worked so much with IPPs and Lotka’s law that I published a mathematically oriented book (9) in which Lotka’s law is used as an axiom from which many mathematical results in all subfields of informetrics follow.

Professor Leo Egghe
Universiteit Hasselt, Belgium, and Universiteit Antwerpen, Belgium

Contact him directly

References:

(1) Egghe, L. (1984) “Stopping time techniques for analysts and probabilists”, London Mathematical Society Lecture Notes Series 100. Cambridge, UK: Cambridge University Press.
(2) de Solla Price, D.J. (1963) Little Science, Big Science. New York, USA: Columbia University Press.
(3) Egghe, L. (1989) The Duality of Informetric Systems with Applications to the Empirical Laws. Ph.D. Thesis, City University, London, UK.
(4) Egghe, L. (1990) “The duality of informetric systems with applications to the empirical laws”, Journal of Information Science, Vol. 16, No. 1, pp. 17–27.
(5) Lotka, A.J. (1926) “The frequency distribution of scientific productivity”, Journal of the Washington Academy of Sciences, Vol. 16, No. 12, pp. 317–324.
(6) de Solla Price, D.J. (1976) “A general theory of bibliometric and other cumulative advantage processes”, Journal of the American Society for Information Science, Vol. 27, pp. 292–306.
(7) Simon, H.A. (1957) “On a class of skew distribution functions”, In: Models of man: Social and Rational, Ch. 9. New York, USA: John Wiley and Sons.
(8) de Solla Price, D.J. and Beaver, D.B. (1966) “Collaboration in an invisible college”, American Psychologist, Vol. 21, pp. 1011–1018.
(9) Egghe, L. (2005) Power Laws in the Information Production Process: Lotkaian Informetrics. Oxford, UK: Elsevier.

De Solla Price and the evolution of scientometrics

Has scientometrics changed over the last two-and-a-half decades? And would Derek de Solla Price have enjoyed the changes? Professor Wolfgang Glänzel answers our questions.

Wolfgang Glänzel is Professor of Quantitative Science Studies in the Faculty of Business and Economics at Katholieke Universiteit Leuven, Belgium. He is also the Director of the Steunpunt O&O Indicatoren, which is housed within the Faculty of Economics and Applied Economics. This is an inter-university consortium of all Flemish universities. Its mission is the development of a consistent system of indicators for the Flemish Government to quantify R&D efforts at Flemish universities, research institutes and industry.

Prof. Glänzel answers our questions about his memories of Derek de Solla Price and the changes that have taken place in bibliometrics over the last two-and-a-half decades.

RT: What are your memories of de Solla Price?

WG: I didn’t meet him personally. I studied mathematics in Budapest and joined Tibor Braun’s team in 1980. De Solla Price passed away in 1983, so there was unfortunately little opportunity to meet him. Everything I know about him comes from the literature and from the anecdotes of people who knew him personally. I was shocked by his unexpected death, which felt like the close of an important chapter in the field.

RT: What elements of de Solla Price’s work were the most influential in the field of scientometrics?

WG: He was one of the founders of scientometrics and he paved the way for future scientometric research. He published books and important papers that addressed fundamental issues for our field, such as how to move beyond methods and models adopted from other fields towards a methodology specific to scientometrics.

De Solla Price proposed the growth model and studied scientometric transactions, i.e. the network of citations between scientific papers. He found that a frequently cited paper is more likely to attract further citations than one cited less often, and he created a model for this phenomenon. He also conducted scientometric studies with policy implications and for research evaluation, thus opening the door to present-day evaluative bibliometrics.

RT: How did de Solla Price’s work influence your own?

WG: His career as a scientist was an example to me of how to approach and conduct interdisciplinary research. De Solla Price had a Ph.D. in experimental physics, and then gained a second doctorate in the history of science. He founded a new discipline but also remained a prominent member of his own scientific community.

I also learned that scientometrics is much more than a mere umbrella for a diversity of tools used to measure the output of research. In several papers and lectures I have expressed my concerns regarding some recent developments in our field (1, 2).

There are several topics already tackled by de Solla Price that inspired me to continue his research or to answer unresolved questions. Among these are mathematical models for the cumulative advantage principle and for scientometric transactions, and the question of the obsolescence of scientific information in different fields.

RT: When you won the Derek de Solla Price Medal, Le Pair described your work as being broad as well as focused – qualities that were at the heart of de Solla Price’s research. What similarities would you draw from this?

WG: I’m afraid that I’m not objective enough to be able to answer that question.

RT: Twenty-five years after his passing, how do you think bibliometrics has changed and do you think de Solla Price would have enjoyed the new elements of the field?

WG: I think he would have enjoyed several new elements. First, scientometrics has evolved from an invisible college to an established field with its own scientific journals, conference series, an international academic society and institutionalized education.

In de Solla Price’s day, data processing for bibliometrics was still slow, expensive and limited. Access to bibliometric information has since been transformed by the development of information technology, and I think de Solla Price would have enjoyed this development. The World Wide Web, which emerged only after his death, would also have interested him; in its infancy no one could have predicted its success.

Important bibliometric results have also been published since, and I think he would have enjoyed reading about these advances in the field. However, his dream that scientometrics would become a hard science has not yet come true, as discussed in “Has Price’s dream come true: is scientometrics a hard science?” (3).

I also see the uninformed use and misuse of bibliometric results. By uninformed use I mean that bibliometric data are not used in the proper context, but unconsciously so; by misuse I mean that the data are consciously presented and interpreted incorrectly or deliberately used in an inappropriate context. However, I believe the positive achievements of scientometrics over the past 25 years prevail. New elements such as open access, electronic publication and communication, and the extension of bibliographic databases represent new challenges to be taken on by the scientometric community.

Professor Wolfgang Glänzel
Faculty of Business and Economics at Katholieke Universiteit Leuven, Belgium

Contact him directly

References:

(1) Glänzel, W. and Schoepflin, U. (1994) “Little scientometrics – big scientometrics ... and beyond”, Scientometrics, Vol. 30, Nos. 2–3, pp. 375–384.
(2) Glänzel, W. (2008) “Seven myths in bibliometrics – About facts and fiction in quantitative science studies”, ISSI Newsletter, Vol. 4, No. 2, pp. 24–32.
(3) Wouters, P. and Leydesdorff, L. (1994) “Has Price’s dream come true: is scientometrics a hard science?”, Scientometrics, Vol. 31, No. 2, pp. 193–222.

Journals as retention mechanisms of scientific growth

Professor Loet Leydesdorff has spent the last 20 years developing an idea first posed by Derek de Solla Price in 1961. He asks whether the aggregated citation relations among journals can be used to study clusters of journals as representations of the intellectual organization of the sciences.

Among the many discoveries that Derek de Solla Price made during his lifetime, I find Figure 1 the most inspiring (1). In this picture, de Solla Price provides a graphic illustration of the exponential growth of the scientific journal literature since the appearance of the first journals in 1665. De Solla Price had been fascinated by journals and their exponential growth in size and numbers ever since his first study of the Philosophical Transactions of the Royal Society of London from its very beginning in 1665 (2, 3).

On the basis of an experimental version of the Science Citation Index in 1961, de Solla Price formulated a program for mapping the sciences in terms of aggregated journal-journal citation structures as follows:

“The total research front of science has never, however, been a single row of knitting. It is, instead, divided by dropped stitches into quite small segments and strips. From a study of the citations of journals by journals I come to the conclusion that most of these strips correspond to the work of, at most, a few hundred men at any one time. Such strips represent objectively defined subjects whose description may vary materially from year to year but which remain otherwise an intellectual whole. If one would work out the nature of such strips, it might lead to a method for delineating the topography of current scientific literature. […] Journal citations provide the most readily available data for a test of such methods” (4).
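
As a rough illustration of what “a study of the citations of journals by journals” can yield, the sketch below groups journals whose aggregated citing profiles are similar. The journal names, citation counts and similarity threshold are invented for the example; this is not Leydesdorff’s actual methodology.

```python
# Minimal sketch of mapping 'strips' from aggregated journal-journal citations:
# journals with similar citing profiles end up in the same cluster.
from itertools import combinations
from math import sqrt

journals = ["J Phys A", "Phys Rev X", "Cell Bio J", "Mol Cell Y", "Scientometrics"]

# cites[i][j] = number of references in journal i pointing to journal j (invented counts)
cites = [
    [50, 40,  1,  0,  2],   # J Phys A
    [45, 60,  0,  1,  1],   # Phys Rev X
    [ 1,  0, 70, 55,  0],   # Cell Bio J
    [ 0,  1, 60, 65,  1],   # Mol Cell Y
    [ 5,  4,  3,  2, 30],   # Scientometrics
]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Greedy single-linkage grouping: merge journals whose citing profiles
# exceed a similarity threshold.
THRESHOLD = 0.8
cluster = {j: {j} for j in journals}
for (i, a), (k, b) in combinations(enumerate(journals), 2):
    if cosine(cites[i], cites[k]) >= THRESHOLD and cluster[a] is not cluster[b]:
        merged = cluster[a] | cluster[b]
        for name in merged:
            cluster[name] = merged

print({frozenset(s) for s in cluster.values()})   # two field clusters plus Scientometrics
```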

Organization of knowledge

Over the past 20 years, I have addressed the question of whether the aggregated citation relations among journals can be used to study clusters of journals as representations of the intellectual organization of the sciences. If the intellectual organization of the sciences is operationalized using journal structures, three theoretically important problems can be addressed:

1. In science studies, this operationalization of the intellectual organization of knowledge in terms of texts (journals), as different from the social organization of the sciences in terms of institutions and people, would enable us to explain the scientific enterprise as a result of these two interacting and potentially coevolving dimensions (5, 6, 7).

2. In science policy analysis, the question of whether a baseline can be constructed for measuring the efficacy of political interventions was raised by Kenneth Studer and Daryl Chubin (8; cf. 9, 10). Wolfgang van den Daele et al distinguished between parametric steering, in terms of more institutional activities due to increased funding, versus the relative autonomy and potential self-organization of scientific communication into specialties and disciplinary structures (11).

3. While journal Impact Factors are defined with reference to averages across the sciences (12, 13), important parameters of intellectual organization, such as publication and citation frequencies, vary among disciplines (14). In fact, publication practices across disciplinary divides are virtually incomparable (15, 16, 17). The Impact Factor is a global measure that does not take into account the intellectual structures in the database.
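
For reference, the two-year Impact Factor that point 3 refers to is essentially the following global average (following the standard definition in (12, 13)); nothing in the quotient distinguishes between the disciplinary ‘strips’ from which a journal’s citations come.

```latex
% Two-year Impact Factor of journal j in year y (standard definition, cf. (12, 13)):
\[
\mathrm{IF}_{y}(j) \;=\;
  \frac{\text{citations received in year } y \text{ by the items that } j \text{ published in } y-1 \text{ and } y-2}
       {\text{number of citable items that } j \text{ published in } y-1 \text{ and } y-2} .
\]
```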

Mapping the data

De Solla Price conjectured that specialties would begin to exhibit ‘speciation’ when the carrying community grows larger than a hundred or so active scientists (18). Furthermore, the proliferation of scientific journals can be expected to correlate with this because new communities will wish to begin their own journals (4, 19). New journals are organized within existing frameworks, but the bifurcations and other network dynamics feed back on the historical organization to the extent that new fields of science and technology become established and existing ones reorganized.

De Solla Price’s dream of making scientometric mapping a relatively hard social science can, with hindsight, be considered as fundamentally flawed.

Whereas the variation is visible in the data, the selection mechanisms remain latent and can therefore only be hypothesized. On the one hand, these constructs are needed as dimensions for the mapping of the data. On the other hand, constructs remain ‘soft’; that is, open for debate and reconstruction. De Solla Price’s dream of making scientometric mapping a relatively hard social science (20) can, with hindsight, be considered as fundamentally flawed (21, 22). When both the data and the perspectives are potentially changing, the position of the analyst can no longer be considered as neutral (23).


Figure 1 – In this graph showing the number of journals founded (not surviving) as a function of date (NB, the two uppermost points are taken from a slightly differently based list), de Solla Price provides a graphic illustration of the exponential growth of the scientific journal literature since the first journals in 1665 (1).

Professor Loet Leydesdorff
Amsterdam School of Communications Research,
University of Amsterdam

Contact him directly

References:

(1) de Solla Price, D.J. (1961) Science Since Babylon. New Haven, USA: Yale University Press.
(2) de Solla Price, D.J. (1951) “Quantitative measures of the development of science”, Archives Internationales d’Histoire des Sciences, Vol. 14, pp. 85–93.
(3) de Solla Price, D.J. (1978) “Toward a model of science indicators”, In: Elkana, Y., Lederberg, J., Merton, R.K., Thackray A. and Zuckerman H. (Eds.) The Advent of Science Indicators, pp. 69–95. New York, USA: John Wiley and Sons.
(4) de Solla Price, D.J. (1965) “Networks of scientific papers”, Science, Vol. 149, pp. 510–515.
(5) Whitley, R.D. (1984) The Intellectual and Social Organization of the Sciences. Oxford, UK: Oxford University Press.
(6) Leydesdorff, L. (1995) The Challenge of Scientometrics: The Development, Measurement, and Self-Organization of Scientific Communications. Leiden, the Netherlands: DSWO Press, Leiden University.
(7) Leydesdorff, L. (1998) “Theories of citation?”, Scientometrics, Vol. 43, No. 1, pp. 5–25.
(8) Studer, K.E. and Chubin, D.E. (1980) The Cancer Mission. Social Contexts of Biomedical Research. Beverly Hills, USA: Sage.
(9) Leydesdorff, L. and van der Schaar, P. (1987) “The use of scientometric indicators for evaluating national research programmes”, Science & Technology Studies, Vol. 5, pp. 22–31.
(10) Leydesdorff, L., Cozzens, S.E. and van den Besselaar, P. (1994) “Tracking areas of strategic importance using scientometric journal mappings”, Research Policy, Vol. 23, pp. 217–229.
(11) van den Daele, W., Krohn, W. and Weingart, P. (Eds.) (1979) Geplante Forschung: Vergleichende Studien über den Einfluss politischer Programme auf die Wissenschaftsentwicklung. Frankfurt a.M., Germany: Suhrkamp.
(12) Garfield, E. (1972) “Citation analysis as a tool in journal evaluation”, Science, Vol. 178, No. 4060, pp. 471–479.
(13) Garfield, E. (1979) Citation Indexing: Its Theory and Application in Science, Technology, and Humanities. New York, USA: John Wiley and Sons.
(14) de Solla Price, D.J. (1970) “Citation measures of hard science, soft science, technology, and nonscience”, In: Nelson, C.E. and Pollock D.K., (Eds.) Communication among Scientists and Engineers, pp. 3–22. Lexington, MA, USA: Heath.
(15) Leydesdorff, L. (2008) “Caveats for the use of citation indicators in research and journal evaluation”, Journal of the American Society for Information Science and Technology, Vol. 59, No. 2, pp. 278–287.
(16) Cozzens, S.E. (1985) “Using the archive: Derek Price’s theory of differences among the sciences”, Scientometrics, Vol. 7, pp. 431–441.
(17) Nederhof, A.J., Zwaan, R.A., Bruin, R.E. and Dekker, P.J. (1989) “Assessing the usefulness of bibliometric indicators for the humanities and the social sciences: A comparative study”, Scientometrics, Vol. 15, pp. 423–436.
(18) de Solla Price, D.J. (1963) Little Science, Big Science. New York, USA: Columbia University Press.
(19) Van den Besselaar, P. and Leydesdorff, L. (1996) “Mapping change in scientific specialties: a scientometric reconstruction of the development of artificial intelligence”, Journal of the American Society for Information Science, Vol. 47, pp. 415–436.
(20) de Solla Price, D.J. (1978) “Editorial Statement”, Scientometrics, Vol. 1, No. 1, pp. 7–8.
(21) Wouters, P. and Leydesdorff, L. (1994) “Has Price’s dream come true: is scientometrics a hard science?”, Scientometrics, Vol. 31, pp. 193–222.
(22) Wouters, P. (1999) The Citation Culture. Amsterdam, the Netherlands: Unpublished Ph.D. Thesis, University of Amsterdam.
(23) Leydesdorff, L. (2006) “Can scientific journals be classified in terms of aggregated journal-journal citation relations using the journal citation reports?”, Journal of the American Society for Information Science & Technology, Vol. 57, No. 5, pp. 601–613.

The invisible college: working within the Pricean tradition

It is almost impossible to explain how Derek de Solla Price has influenced her work, says Professor Katherine McCain, since her entire career has been within the Pricean tradition. She discusses what this means to her.

It’s hard to isolate and focus on “how the work of Derek de Solla Price influenced and continues to influence your work”, as the invitation put it, because I, and several other Drexel faculty and students, have worked within the ‘Pricean tradition’ of research and scholarship throughout our scholarly careers.

De Solla Price’s contributions to the history of science and developmental trends, quantitative patterns of citation and scholarship in the sciences (and non-sciences), and the role of the invisible college in scientific communication were part of the Drexel research environment in the 1970s and 1980s. This came about directly through the friendship between de Solla Price and the late Belver Griffith and, more generally, through the key role that studies of bibliometrics and scientific communication played in faculty and doctoral research at the time, under the guidance of Griffith, Howard White and Carl Drott.

In my case, I came to information science and bibliometrics/scientometrics fairly late in life, after two degrees in the life sciences, five years managing a biology library and a strong interest in the history of science and technology. I had actually encountered de Solla Price’s work at high school through his article on the Antikythera Device (1) but of course had no idea who the author was and no inkling of his influence on my future career.

Index for the natural sciences

I first encountered the kinds of phenomena that de Solla Price focused on when tallying citations to journals in faculty and Ph.D. student papers in the Biology Department at Temple University, Philadelphia, US (2). Most citations could be described by what I subsequently discovered was de Solla Price’s Index for the natural sciences (majority of citations to articles published in the previous five years) (3).
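
For readers to whom the measure is unfamiliar, Price’s Index is usually operationalized as the share of references that point to literature at most five years older than the citing publication. A small sketch with hypothetical data (the function name, the reference list and the exact five-year cut-off convention are mine):

```python
# A small sketch (hypothetical data) of Price's Index: the share of a paper's
# references that point to literature at most five years old, which Price used
# to separate research-front fields ("hard" sciences) from more archival ones.
def prices_index(citing_year, reference_years, window=5):
    """Fraction of references published within `window` years of the citing paper."""
    recent = sum(1 for y in reference_years if citing_year - y <= window)
    return recent / len(reference_years)

# Hypothetical reference list of a 1980 biology paper
refs_1980 = [1979, 1978, 1978, 1976, 1975, 1971, 1969, 1977, 1980, 1962]
print(f"Price's Index: {prices_index(1980, refs_1980):.0%}")   # 70% here
```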

I think the most important links between de Solla Price’s work and my own are ... his general thoughts on mapping.

As a doctoral student at Drexel with an interest in quantitative studies of science and mapping, I read most of de Solla Price’s major works in my doctoral coursework and as background for my thesis research. Much of it resonated particularly with me because of my previous experiences in the life sciences. The concepts of the invisible college and research front (4, 5) seemed to fit my observations of the way that zoology and marine biology worked and, as noted above, de Solla Price’s Index described the citation patterns of the biologists I worked with at Temple University.

Mapping as scientific representation

Looking at my post-Ph.D. publications, I see several influences of de Solla Price’s work on my own. Generally, I focus on quantitative data that describe trends and activities in the sciences, as de Solla Price did, supplementing it with interviews and other methods of knowledge elicitation. De Solla Price’s interest in patterns of citation in journal literature is reflected in my studies of journals in the natural and social sciences and what citation and co-citation patterns can tell us about the core literature of a field. Invisible colleges and research fronts can be identified in co-citation maps – a major part of the bibliometric work at Drexel.
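
For readers new to the technique, the raw material of such maps is simply a count of how often pairs of items appear together in reference lists. A minimal sketch with invented identifiers (not the Drexel group’s actual pipeline):

```python
# Minimal sketch of document co-citation counting: two items are co-cited
# whenever they appear together in the reference list of the same citing paper.
from collections import Counter
from itertools import combinations

# Each citing paper -> the items it references (hypothetical identifiers)
reference_lists = [
    {"Price1965", "Garfield1972", "Lotka1926"},
    {"Price1965", "Garfield1972"},
    {"Price1965", "Lotka1926", "Zipf1949"},
    {"Garfield1972", "Zipf1949"},
]

cocitations = Counter()
for refs in reference_lists:
    for a, b in combinations(sorted(refs), 2):
        cocitations[(a, b)] += 1

# Pairs with high counts sit close together on a co-citation map
for pair, count in cocitations.most_common(3):
    print(pair, count)
```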

However, I think the most important links between de Solla Price’s work and my own are his comment on an early author co-citation map of information science and his general thoughts on mapping. In the first case, Howard White (6) quotes de Solla Price as saying that he “knew everyone about halfway across and then no one”. In the second, de Solla Price commented in 1979 that the major features of science could be represented in two-dimensional maps (7). These two observations validate the usefulness of mapping as a way of representing a field of science more broadly than even the most knowledgeable expert could, and they have continued to inspire me over the past 30 years.

Professor Katherine W. McCain
College of Information Science & Technology, Drexel University, Philadelphia, US

Contact her directly

References:

(1) de Solla Price, D.J. (1959) “An Ancient Greek computer”, Scientific American, Vol. 200, No. 6, pp. 60–67.
(2) McCain, K.W. and Bobick, J.E. (1981) “Patterns of journal use in a departmental science library: a citation analysis”, Journal of the American Society for Information Science, Vol. 32, No. 4, pp. 257–267.
(3) de Solla Price, D.J. (1986) “Citation measures of hard science, soft science, technology, and nonscience”, In: Nelson, C.E. and Pollock, D.K. (Eds.) Communication Among Scientists and Engineers, pp. 155–179. New York, USA: Columbia University Press.
(4) de Solla Price, D.J. (1961) Science Since Babylon. New Haven, USA: Yale University Press.
(5) de Solla Price, D.J. (1965) “Networks of scientific papers”, Science, Vol. 149, No. 3683, pp. 510–515.
(6) White, H.D. (1990) “Author cocitation analysis: overview and defense”, In: Borgman, C. (Ed.) Bibliometrics and Scholarly Communication, pp. 84–106. Newbury Park, CA, USA: Sage.
(7) de Solla Price, D.J. (1979) “The revolution in mapping of science”, Proceedings of the American Society for Information Science Annual Meeting, Vol. 16, pp. 249–253.



Why “The citation cycle” is my favorite de Solla Price paper

Dr. Henk Moed’s research over the past 25 years strongly builds upon the pioneering work of Derek de Solla Price. Here, Moed discusses his favorite de Solla Price paper.

My research into quantitative science studies over the past 25 years with my colleagues at the Centre for Science and Technology Studies (CWTS) strongly builds upon the pioneering work of Derek de Solla Price. Most, if not all, of the research topics, ideas and methodologies we have been working on were introduced and explored in his publications.

For instance, in the recent development of research performance assessment methodologies in the social sciences and humanities, de Solla Price’s analysis of “citation measures of hard science, soft science, technology, and nonscience” (1) plays a crucial role. Studies of national research systems and the production of national science indicators build upon de Solla Price’s proposals for a model and a comprehensive system of science indicators (2, 3). Current work on mapping the structure and development of scientific activity, and on identifying networks of scientific papers, individual researchers and groups, builds on de Solla Price’s papers “Networks of scientific papers” (4) and “The citation cycle” (5).

The citation cycle

“The citation cycle” is my favorite paper. It contains many ideas, analyses and suggestions for future research. In my view, this paper is an exemplar of genuine, creative and original bibliometric-scientometric analysis of a large database, such as Eugene Garfield’s Science Citation Index (6). One of our publications presented an update and a further extension of some elements of the citation cycle (7).

For instance, while de Solla Price found that, in 1980, a team of 2.5 authors published 2.2 papers, our later study in 2002 found that a team of 3.8 authors produced on average 2.8 papers. An in-depth analysis inspired by de Solla Price (5) categorized authors publishing in a year into continuants, movers, newcomers and transients. De Solla Price defined active scientists in a year as scientists who published in that year. But since active scientists do not necessarily publish papers in each year of their career, the new study proposed a new measure of the number of scientists active in a year based on the assumption that active scientists publish at least one paper every four years.
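
One plausible way to operationalize this bookkeeping is sketched below; the category definitions and the four-year activity window are illustrative readings of the description above, not quotations from Moed’s study.

```python
# A minimal sketch of the kind of author bookkeeping described above. The exact
# definitions used in the study may differ; the ones below are illustrative only.
def classify(author_years, y):
    """Classify an author who published in year y by his or her other activity."""
    before, after = any(t < y for t in author_years), any(t > y for t in author_years)
    if before and after:
        return "continuant"
    if after:
        return "newcomer"
    if before:
        return "terminating/mover"   # label is a guess at the 'mover' category
    return "transient"               # publishes only in year y

def active_in(author_years, y, window=4):
    """Active in year y if at least one paper appeared in the last `window` years."""
    return any(y - window < t <= y for t in author_years)

# Hypothetical publication histories
authors = {"A": {1998, 2000, 2002, 2005}, "B": {2002}, "C": {2001, 2002, 2003}}
print({a: classify(ys, 2002) for a, ys in authors.items() if 2002 in ys})
print(sum(active_in(ys, 2004) for ys in authors.values()), "scientists active in 2004")
```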

Most, if not all, of the research topics, ideas and methodologies we have been working on were introduced and explored in de Solla Price’s publications.

Publication productivity

Our research question was: did scientists’ publication productivity (defined as the number of published papers per scientist, one of the key parameters in the citation cycle) increase during the 1980s and 1990s? We found that, while the average scientist published more research articles during the period considered, from the global perspective introduced in de Solla Price’s citation cycle the overall publication productivity (defined as the total number of articles published by all authors in a year divided by the number of scientists active in that year) remained approximately constant during those two decades.

This paradox is explained by the fact that scientists are collaborating more intensively, so that the teams authoring papers are getting larger. At the level of disciplines, however, basic and applied physics and chemistry tend to show an increase in publication productivity over the years, while the medical and biological sciences show a decline.
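
The arithmetic behind the paradox fits in one line; the notation below is mine, not Moed’s or de Solla Price’s.

```latex
% S = scientists active in a year, p = papers per scientist (whole counts),
% a = mean number of authors per paper, P = distinct papers published that year.
\[
P \;=\; \frac{S\,p}{a}
\qquad\Longrightarrow\qquad
\frac{P}{S} \;=\; \frac{p}{a},
\]
% so if p and a grow by the same factor (each scientist signs more papers, but only
% because teams are proportionally larger), the overall productivity P/S is unchanged.
```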

A detailed interpretation of these outcomes, taking the effects of policy into account, goes beyond the scope of this contribution. Scientists may have successfully increased their individual publication output through more collaboration and authorship inflation, possibly stimulated by the use of ‘crude’ publication counts in research evaluation, without increasing their joint productivity.

An alternative interpretation holds that the amount of energy and resources absorbed by collaborative work may be so substantial that it held overall publication productivity back. More research is needed to further clarify the issues addressed. The main objective of this contribution is to emphasize how strongly these issues are related to de Solla Price’s citation cycle, and how an update, a longitudinal analysis and a further extension of his creative works may generate policy-relevant observations.

Dr. Henk F. Moed
Centre for Science and Technology Studies, Leiden University, the Netherlands

Contact him directly

References:

(1) de Solla Price, D.J. (1970) “Citation measures of hard science, soft science, technology, and nonscience”, In: Nelson, C.E. and Pollock, D.K. (Eds.) Communication Among Scientists and Engineers, pp. 3–22. Lexington, MA, USA: D.C. Heath and Company.
(2) de Solla Price, D.J. (1978) “Towards a model for science indicators”, In: Elkana, Y., Lederberg, J., Merton, R.K., Thackray, A. and Zuckerman, H. (Eds.) Toward a Metric of Science: The Advent of Science Indicators, pp. 69–95. New York, USA: John Wiley and Sons.
(3) de Solla Price, D.J. (July 1980) “Towards a comprehensive system of science indicators”, Conference on Evaluation in Science and Technology – Theory and Practice, Dubrovnik.
(4) de Solla Price, D.J. (1965) “Networks of scientific papers”, Science, Vol. 149, No. 3683, pp. 510–515.
(5) de Solla Price, D.J. (1980) “The citation cycle”, In: Griffith B.C. Key Papers in Information Science, pp. 195–210. White Plains, NY, USA: Knowledge Industry Publications.
(6) Garfield, E. (1964) “The citation index – a new dimension in indexing”, Science, Vol. 144, pp. 649–654.
(7) Moed, H.F. (2005) Citation Analysis in Research Evaluation. Dordrecht, the Netherlands: Kluwer.

Plus ça change, plus c’est la même chose: de Solla Price’s legacy and the changing face of scientometrics

Derek de Solla Price has always been a major source of inspiration for Professor Anthony van Raan. He looks back at the development of the science of science since the 1960s.

The invention and development of the Science Citation Index by Eugene Garfield in the 1960s was a major breakthrough in the study of science. This invention enabled statistical analyses of scientific literature on a very large scale. The great scientist Derek de Solla Price immediately recognized the value of Garfield’s invention, particularly from the perspective of the contemporaneous history of science.

Scientists have always been fascinated by basic features such as simplicity, symmetry, harmony and order. The Science Citation Index motivated de Solla Price to work on a ‘physical approach’ to science, in which he tried to find laws to predict further developments, inspired by the principles of statistical mechanics.

Cognitive and social indicators

Specific parameters, ‘indicators’, are guides to finding and understanding such basic features. The most basic feature concerns the cognitive dimension: the development of the content and structure of science. Other indicators relate to the social dimension of science, in particular to aspects formulated in questions such as:

  • How many researchers?
  • How much money is spent on science?
  • How ‘good’ are research groups?
  • How does communication in science work, particularly the role of books, journals, conferences?

And beyond that there is another, often forgotten, question:

  • What is the economic profit of scientific activities?

A landmark in the development of science indicators was the first publication in a biennial series of the Science & Engineering Indicators report (as it is now called) in 1973. Encouraged by the success of economists in developing quantitative measures of political significance for areas such as unemployment and GNP, the US National Science Board started this series of reports, which focus more on the demographic and economic state of science than on its cognitive state.

What is the difference between data and indicators?

An indicator is a measure that explicitly addresses some assumption. To begin with, we need to discover which features of science can be given a numerical expression. Indicators cannot exist without a specific goal; they must address specific questions. They have to be created to gauge important ‘forces’; for example, how scientific progress is related to specific cognitive and socio-economic aspects. If indicators are not problem-driven, they are useless. They have to describe the recent past in such a way that they can guide us in, and inform us about, the near future.

A second and more fundamental role of indicators is their potential to test aspects of theories and models of scientific development and its interaction with society. In this sense, indicators are not only tools for science policymakers and research managers, but also instruments in the study of science.

But we also have to realize that science indicators do not answer typical epistemological questions such as:

  • How do scientists decide what will be called a scientific fact?
  • How do scientists decide whether a particular observation supports or contradicts a theory?
  • How do scientists come to accept certain methods or scientific instruments as valid means of attaining knowledge?
  • How does knowledge selectively accumulate? (1)

De Solla Price strikingly described the mission of the indicator-maker: to find the simplest pattern in the data at hand, and then look for the more complex patterns that modify the first (2). What should be constructed from the data is not a number but a pattern: a cluster of points on a map, a peak on a graph, a correlation of significant elements in a matrix, a qualitative similarity between two histograms.

What has also changed is the mode of publishing. Electronic publishing and electronic archives mark a whole new era.

If these patterns are found, the next step is to suggest models that produce such patterns and to test these models with further data. A numerical indicator or an indicative pattern alone has little significance. The data must be given perspective: the change of an indicator with time, or different rates of change of two different indicators. It is crucial that geometrical or topological objects or relations are used to replace numerical quantities.

Now, 25 years after the passing of de Solla Price, plus ça change, plus c’est la même chose rings true. What has changed is the very significant progress in application-oriented indicator work based on the enormous increase of available data and, above all, the almost unbelievable – compared to the 1970s – increase of computing power and electronic facilities. What has also changed is the mode of publishing. Electronic publishing and electronic archives mark a whole new era.

What has remained the same, however, are some of the most fundamental questions. For instance, to what extent can science maps derived from citation or concept-similarity data be said to exist in a strict spatial sense? In other words, do measures of similarity imply the existence of metric space? This question brings us to an even more fundamental problem formulated by de Solla Price: that the ontological status of maps of science will remain speculative until more has been learned about the structure of the brain itself.

The ideas and work of de Solla Price have always been one of my major sources of inspiration and I take pride in being a winner of an international award that bears his name.

Professor Anthony F.J. van Raan
Centre for Science and Technology Studies, Leiden University, the Netherlands

Contact him directly


Did you know

Little science, big idea

When Derek de Solla Price’s famous book, Little Science, Big Science, was first published in 1963, it marked a turning point in the development of the science of science. In this short book, de Solla Price developed the statistical basis of such concepts as Big Science and the invisible college, and introduced the enduring observations that “80 to 90 percent of all the scientists that have ever lived are alive now” and that “80 to 90 percent of all scientific work achieved” has been carried out within living memory. An excellent overview of its enduring impact has recently been published (1, 2).

The 1986 edition, Little Science, Big Science … and Beyond, comprises the first four chapters of the first edition and a further nine chapters reprinting some of de Solla Price’s seminal journal articles and book chapters. Both editions continue to be heavily cited, with 60 citations from the journal literature in 2007 alone (according to Scopus). Out of print for many years, copies of the second edition from online booksellers now fetch up to US$200.

References:

(1) Furner, J. (2003) “Little Book, Big Book: Before and after Little Science, Big Science: A review article, Part I”, Journal of Librarianship and Information Science, Vol. 35, No. 2, pp. 115–125.

(2) Furner, J. (2003) “Little Book, Big Book: Before and after Little Science, Big Science: A review article, Part II”, Journal of Librarianship and Information Science, Vol. 35, No. 3, pp. 189–201.
