One of the main attractions at the Science and Technology Indicators (STI) conference (1), held in Leiden in September 2014, was “The Daily Issue”. Invented by Diana Wildschut, Harmen Zijp and Patrick Nederkoorn, “The Daily Issue” gave its reporters three hours to find out what was happening at the conference and report on it using 1950s equipment, without telephones or internet (2). The result was a hilarious newsletter, published every day and handed to the audience, who came to realize how the world of Scientometrics looks to outsiders. One example was an item on serendipity in the scientific process, which resulted in the invention of “Serendipimetry”, a metric that measures serendipity (see additional interpretation in the text box below).


The Daily Issue, No. 72

Some of the most valuable scientific outcomes are the result of accidental discoveries. This article explores the possibility of a metric of serendipity. Firstly, a clear distinction has to be made between a serendipity indicator and a serendipitous indicator. The latter may only be meaningful in the way it could assist chance events in finding information. More interesting, however, would be to actually measure, or at least estimate, the degree of serendipity that led to a research result. And yet another angle would be the presentation of research that might facilitate its receivers, e.g. the readers of an article, in making odd detours, living through paradigm shifts et cetera.

Alongside the traditional topics often discussed at the STI conference, such as the statistical representation of scientific output in the form of performance indicators and metrics, this year the conference put a strong focus on innovation. New datasets and algorithms were among the topics given significant attention. Examples include new data derived from funding systems, which were explored in relation to productivity, efficiency, and patenting. Looking at the factors that influence participation in government-funded programs, Lepori et al. (3) found a very strong concentration of participations in a very small number of European research universities. They also showed that these numbers can be predicted with high precision from organizational characteristics, especially size and international reputation. A study of the relationships between funding, competitiveness and performance (4) contradicted previous findings: here the researchers found that the share of institutional funding does not correlate with competitiveness, overall performance, or top performance. Additional research papers using funding systems data are available here.

Newly available gender and career data brought forth a series of studies dedicated to the relationship between gender, career level and scientific output. Van der Weijden and Calero Medina (5) studied the oeuvres of female and male scientists in academic publishing using bibliometrics. Using data from the ACUMEN survey (6), their analysis confirmed the traditional gender pattern: men produce on average a higher number of publications than women, regardless of their academic position and research field, and women are not evenly represented across authorship positions. Paul-Hus et al. (7) studied the place of women in the Russian scientific research system in various disciplines and how this position has evolved during the last forty years. They found that gender parity is far from being achieved and that women remain underrepresented in terms of their relative contribution to scientific output across disciplines. Sugimoto et al. (8) presented a global analysis of women in patents from 1976 to 2013, which found that women’s contribution to patenting remains even lower than would be predicted given their representation in Science, Technology, Engineering, and Mathematics.

Career-related studies also open new paths to understanding the relationships between academic positions, publishing, and relative scientific contributions of researchers throughout their careers. Derycke et al. (9) studied the factors influencing PhD students’ scientific productivity and found that scientific discipline, phase of the PhD process, funding situation, family situation, and organizational culture within the research team are important factors predicting the number of publications. Van der Weijden (10) used a survey to study PhD students’ perceptions of career perspectives in academic R&D, non-academic R&D, and outside R&D, and assessed to what extent these career perspectives influence their job choice. She found that several career-related aspects, such as long-term career perspectives and the availability of permanent positions, are judged much more negatively for the academic R&D sector.

A session on University-Industry collaborations featured interesting research topics such as the relationship between industry-academia co-authorships and their influence on academic commercialization output (Wong & Singh (11); Yegros-Yegros et al. (12)) as well as global trends in University-Industry relationships using affiliation analysis of dual publications (Yegros-Yegros & Tijssen (13)). Related to this topic was a session on patents analysis which was used to study topics such as scientific evaluation and strategic priorities (Ping Ho & Wong (14)).

Measures of online attention, a topic of discussion in the past couple of years, were given special focus at the conference, with probably the largest number of studies featured in a single session. Studies covered topics such as Mendeley readership and its relationship with academic status (Zahedi et al. (15)), tweets about the Nobel Prize awards and their impact (Levitt & Thelwall (16)), and gender biases in altmetrics (Bar-Ilan & Van der Weijden (17)).

True to its slogan, “Context counts: Pathways to master big and little data”, this conference selected a wide range of studies using newly available data to explore topics that provide context to scientific output, including gender, career, university-industry relationships and measures of engagement. In addition, the selected keynote lectures provided some overall strategic insight into metrics development. Diana Hicks and Henk Moed encouraged the audience to think more strategically about the application of metrics for evaluation purposes. The seven-principles manifesto suggested by Diana Hicks provides evaluators with a framework for assessing researchers, institutions and programs. This manifesto was picked up by the CWTS group in Leiden, headed by Paul Wouters, which is now working on creating an agreed-upon set of principles that could potentially inform evaluation and funding systems (18).

Henk Moed (19) called for special attention to be given to the context and purpose of evaluation, using meta-analysis to inform the choice of data and methodology. Presenting “The multi-dimensional research assessment matrix”, he gave examples of how to compile correct and fair evaluation indicators using guiding questions that inform the process (20).

If there is one message to be drawn from this conference, it is that the plethora of recently available data, statistical analyses and indicators is a positive development only if they are used in the correct context and can answer the questions posed. No single metric fits all evaluation objectives; therefore the data selected, the methods used and the conclusions drawn must be chosen carefully, keeping in mind that context is probably the key factor in successful assessment.

Daily Issue example

The Daily Issue | Edition 73, commenting on Diana Hicks’s seven principles of research evaluation manifesto (comments follow each dash)

  1. Metrics are not a substitute for assessment – Don’t blame it on the metrics
  2. Spend time and money to produce high quality data – Print your results on glossy paper
  3. Metrics should be transparent and accessible – Everyone can have a say even if they don’t know s**
  4. Data should be verified by those evaluated – Be careful not to insult anyone
  5. Be sensitive to field differences – Use long words to avoid homonyms
  6. Normalize data to account for variations by fields and over time – If your data is useless for one field, make slight adaptations and use them for another field or try again in 10 years
  7. Metrics should align with strategic goals – Follow the money
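Principle 6 refers to field normalization, a standard bibliometric technique in which a paper’s citation count is divided by the average citation count of papers in the same field and publication year. As a purely illustrative sketch (not from the conference materials; the data layout and function name are hypothetical), a mean normalized citation score can be computed like this:

```python
from collections import defaultdict

def mean_normalized_citation_score(papers):
    """Illustrative MNCS-style field normalization (hypothetical data layout).

    Each paper is a dict with 'field', 'year', and 'citations'. A paper's
    normalized score is its citation count divided by the average for its
    (field, year) reference set; the unit's indicator is the mean of those
    ratios, so a score above 1.0 means above-average impact for the fields.
    """
    # Average citation count per (field, year) reference set.
    totals = defaultdict(lambda: [0, 0])  # key -> [citation sum, paper count]
    for p in papers:
        key = (p["field"], p["year"])
        totals[key][0] += p["citations"]
        totals[key][1] += 1
    averages = {k: s / n for k, (s, n) in totals.items()}

    # Normalize each paper against its reference set, then average the ratios.
    scores = [p["citations"] / averages[(p["field"], p["year"])]
              for p in papers if averages[(p["field"], p["year"])] > 0]
    return sum(scores) / len(scores)

papers = [
    {"field": "physics", "year": 2013, "citations": 10},
    {"field": "physics", "year": 2013, "citations": 2},
    {"field": "history", "year": 2013, "citations": 1},
    {"field": "history", "year": 2013, "citations": 3},
]
print(mean_normalized_citation_score(papers))  # 1.0 by construction here
```

The point of the normalization, and of the satirical comment above, is that a raw citation count of 3 in history can represent a higher field-relative impact than 2 in physics; in practice the reference averages come from a large citation database rather than from the evaluated set itself.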



References:

(3) Lepori, B., Heller-Schuh, B., Scherngell, Th., Barber, M. (2014) “Understanding factors influencing participation in European programs of Higher Education Institutions”, Proceedings of STI 2014 Leiden, p. 345
(4) Sandström, U., Heyman, U. and Van den Besselaar, P. (2014) “The Complex Relationship between Competitive Funding and Performance”, Proceedings of STI 2014 Leiden, p. 523
(5) Van der Weijden, I. and Calero Medina, C. (2014) “Gender, Academic Position and Scientific Publishing: a bibliometric analysis of the oeuvres of researchers”, Proceedings of STI 2014 Leiden, p. 673
(6) ACUMEN Survey (2011)
(7) Paul-Hus, A., Bouvier, R., Ni, C., Sugimoto, C.R., Pislyakov, V. and Larivière, V. (2014) “Women and science in Russia: a historical bibliometric analysis”, Proceedings of STI 2014 Leiden, p. 411
(8) Sugimoto, C.R., Ni, C., West, J.D., and Larivière, V. (2014) “Innovative women: an analysis of global gender disparities in patenting”, Proceedings of STI 2014 Leiden, p. 611
(9) Derycke, H., Levecque, K., Debacker, N., Vandevelde, K. and Anseel, F. (2014) “Factors influencing PhD students’ scientific productivity”, Proceedings of STI 2014 Leiden, p. 155
(10) Van der Weijden, I., Zahedi, Z., Must, U. and Meijer, I. (2014) “Gender Differences in Societal Orientation and Output of Individual Scientists”, Proceedings of STI 2014 Leiden, p. 680
(11) Wong, P.K. and Singh, A. (2014) “A Preliminary Examination of the Relationship between Co-Publications with Industry and Technology Commercialization Output of Academics: The Case of the National University of Singapore”, Proceedings of STI 2014 Leiden, p. 702
(12) Yegros-Yegros, A., Azagra-Caro, J.M., López-Ferrer, M. and Tijssen, R.J.W. (2014) “Do University-Industry co-publication volumes correspond with university funding from business firms?”, Proceedings of STI 2014 Leiden, p. 716
(13) Yegros-Yegros, A. and Tijssen, R. (2014) “University-Industry dual appointments: global trends and their role in the interaction with industry”, Proceedings of STI 2014 Leiden, p. 712
(14) Ho, Y.P. and Wong, P.K. (2014) “Using Patent Indicators to Evaluate the Strategic Priorities of Public Research Institutions: An exploratory study”, Proceedings of STI 2014 Leiden, p. 276
(15) Zahedi, Z., Costas, R. and Wouters, P. (2014) “Broad altmetric analysis of Mendeley readerships through the ‘academic status’ of the readers of scientific publications”, Proceedings of STI 2014 Leiden, p. 720
(16) Levitt, J. and Thelwall, M. (2014) “Investigating the Impact of the Award of the Nobel Prize on Tweets”, Proceedings of STI 2014 Leiden, p. 359
(17) Bar-Ilan, J. and Van der Weijden, I. (2014) “Altmetric gender bias? – Preliminary results”, Proceedings of STI 2014 Leiden, p. 26
(18) CWTS (2014) “The Leiden manifesto in the making: proposal of a set of principles on the use of assessment metrics in the S&T indicators conference”
(19) Moed, H.F., Halevi, G. (2014) “Metrics-Based Research Assessment”, Proceedings of STI 2014 Leiden, p. 391
(20) Moed, H.F., Plume, A. (2011) “The multi-dimensional research assessment matrix”, Research Trends, Issue 23, May 2011