Strata is the latest addition to Elsevier’s SciVal suite of research performance and planning tools, and provides methods of assessing individuals and groups through their publication histories. In the last issue, we showed how SciVal Spotlight could visualize the research landscape of the United States; in this article we look at the ways SciVal Strata can chart the research performance of an individual or wider group, either alone or in comparison.

 

Reading between the lines

Science is an inherently progressive, cumulative enterprise. Each year brings more qualified scientists and researchers, more papers and ever more citations to those papers. So the standard view of citations over time in SciVal Strata might come as a surprise. Figure 1 shows the average number of citations per paper, by publication year, for papers published in the field of Ecology, Evolution, Behavior and Systematics. At a glance it appears that citations in the field are plummeting, perhaps signaling the implosion of these scientific disciplines.

Figure 1 – Average citations per paper published in the field of Ecology, Evolution, Behavior and Systematics: the three benchmarks show UK papers (purple), European papers (green), and all world papers (blue). Source: SciVal Strata.

Of course, that isn’t the case. The downward trend reflects a decline neither in scientific quality nor in citation quantity: the default view shows, for each publication year, the average number of citations received to date by documents published in that year, rather than counting citations cumulatively over time. Since recent publications have typically had less time to accumulate citations, and so have received fewer on average, the shape of the curves makes sense.
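
By way of illustration, the calculation behind this default view can be sketched in a few lines of Python. The record format and the numbers below are invented for the example, not taken from SciVal Strata:

from collections import defaultdict

# Hypothetical records: (publication_year, citations_received_to_date)
papers = [
    (2000, 42), (2000, 15), (2001, 30),
    (2007, 8), (2008, 3), (2008, 5),
]

totals = defaultdict(lambda: [0, 0])  # year -> [citation_sum, paper_count]
for year, citations in papers:
    totals[year][0] += citations
    totals[year][1] += 1

for year in sorted(totals):
    cites, count = totals[year]
    print(year, round(cites / count, 2))  # average citations per paper for that year

Even with invented numbers, the pattern of Figure 1 emerges: older publication years show higher averages simply because their papers have had longer to be cited.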

Benchmarks can be set up quickly in Strata by the user, so that researchers can be compared against average citation rates in their field. Figure 2 shows a researcher compared with the world average. The downward slope noted above is still visible, but an individual’s record reveals highs and lows that tend to be smoothed out of averages. This researcher clearly had success with papers published in 2000, shown by the sharp rise in the line at that year. Smaller rises can also signify success: although the value for 2008 is low relative to earlier years, once the shorter time that 2008 papers have had to accumulate citations is taken into account, it was clearly a successful year.

Figure 2 – Average citations per document by publication year of a researcher in the field of Ecology, Evolution, Behavior and Systematics (red) compared with the world average (blue). Source: SciVal Strata.
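
One simple way to read a comparison like Figure 2 is to express the researcher’s yearly average as a multiple of the benchmark’s, so that values above 1 indicate above-average impact for papers published in that year. A rough sketch with invented numbers, assuming both series have already been computed as above:

# Average citations per paper by publication year (invented figures)
researcher = {2000: 38.0, 2004: 12.5, 2008: 4.0}
world_avg = {2000: 20.0, 2004: 10.0, 2008: 2.5}

for year in sorted(researcher):
    ratio = researcher[year] / world_avg[year]
    flag = "above" if ratio > 1 else "below"
    print(f"{year}: {ratio:.2f}x the world average ({flag})")

On these invented figures, 2008 comes out well above the benchmark even though its raw average is the lowest of the three years, which is exactly the point made about Figure 2.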

  

Team players

SciVal Strata enables the analysis of citation patterns for entire research fields as well as for individual researchers, and it offers several ways of showing the contribution an individual makes to a research group; Figures 3 and 4 show two of them. In Figure 3, a researcher is directly compared with his or her team: the two lines weave in and out as the individual or the group outperforms the other, and some disparity is easy to see in 2001 and 2003.

Figure 3 – A researcher (red) compared with their research group (brown) and the world average (blue), showing average citations per document by publication year. Source: SciVal Strata.

Another way of examining the contribution this researcher makes to the research group is to compare two versions of the research group: one with the researcher included, and one without (see Figure 4).

Figure 4 – A comparison of the same research group either with (brown) or without (red) one of its researchers, showing average citations per document by publication year. World average is also shown (blue). Source: SciVal Strata.
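
Conceptually, the ‘with and without’ view amounts to recomputing the group’s average citations per document after excluding one member’s papers. The sketch below illustrates the idea on invented records pooled across years; Strata itself, as Figure 4 shows, breaks the comparison down by publication year:

# Hypothetical group output: (group members on the paper, citations to date)
papers = [
    ({"A", "B"}, 40),
    ({"A"}, 25),
    ({"C"}, 6),
    ({"B", "C"}, 10),
]

def avg_citations(records):
    """Average citations per document, or 0.0 for an empty list."""
    return sum(c for _, c in records) / len(records) if records else 0.0

with_a = papers
without_a = [p for p in papers if "A" not in p[0]]  # drop every paper researcher A authored

print("group with A:   ", round(avg_citations(with_a), 2))     # 20.25
print("group without A:", round(avg_citations(without_a), 2))  # 8.0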

 

Open to other views

Bibliometricians commonly warn against the use of a single indicator to assess research output and quality: different measures must often be used to evaluate different aspects of performance. For instance, in an assessment of an individual, the number of invited lectures at international conferences is a useful, non-bibliometric indicator. In SciVal Strata, any comparison, whether between individuals, groups, or any other ‘cluster’ of researchers, can be made looking not only at average citations per paper, but also at h-index, citation and publication counts, or the ratio of cited to uncited papers. Figure 5 shows two researchers compared using their h-index values, and Figure 6 their cited and uncited papers from each year. This range of indicators, and the flexibility it allows, means that a comprehensive view of a researcher or group can be used to aid important decisions about promotion, recruitment, and collaboration.

Figure 5 – Comparison of two researchers’ h-index values. The curves show the citations received by each researcher’s papers when arranged in descending order of citations. Dropping a line to either axis from the intersection of each curve with the black line (at 45 degrees from the origin) gives the h-index: here one researcher (green) shows a higher h-index than the other (red). Source: SciVal Strata.
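
The geometric construction described in the caption corresponds to the usual definition of the h-index: the largest value h such that the researcher has h papers with at least h citations each. A short, self-contained sketch of that calculation:

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # descending order, as plotted in Figure 5
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the curve is still on or above the 45-degree line
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3, 1]))  # 4: four papers with at least four citations each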

 

Figure 6 – Comparison of the outputs of two researchers per year. The bars show total number of documents, and each is split into solid and faded sections showing the documents that are, respectively, cited and uncited to date. Comparison of the faded and solid areas shows the uncited rate of documents published each year: as expected, this is higher in more recent years as recent documents have had less time to become cited. Source: SciVal Strata.
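
The solid and faded sections of Figure 6 reduce to a per-year tally of documents that have or have not yet received a citation. A minimal sketch with invented records:

from collections import defaultdict

# Hypothetical documents: (publication_year, citations_to_date)
docs = [(2005, 12), (2005, 0), (2008, 3), (2010, 0), (2010, 1), (2010, 0)]

counts = defaultdict(lambda: [0, 0])  # year -> [cited, uncited]
for year, cites in docs:
    counts[year][cites == 0] += 1  # True indexes position 1, i.e. uncited

for year in sorted(counts):
    cited, uncited = counts[year]
    rate = uncited / (cited + uncited)
    print(f"{year}: {cited} cited, {uncited} uncited (uncited rate {rate:.0%})")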

However, while bibliometric indicators offer a clear view of an individual’s performance, particularly when a wide range of them is available, it is important to note that they may not tell the whole story. For example, if each co-author of an article is assigned one full credit for the publication, this can mask differences in their actual contributions to the article: one author may have done the majority of the work. Rather than hide such difficulties, bibliometricians and others involved in research assessment need either to use more sophisticated approaches, such as the comparisons available in SciVal Strata, or to combine bibliometric assessment with other indicators of research performance and researcher prestige.
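
To make the co-authorship point concrete, the sketch below contrasts full counting, in which every author of a paper receives one credit, with fractional counting, in which the credit is split equally among the authors. The names and papers are invented, and fractional counting is offered only as one common alternative rather than as something SciVal Strata necessarily applies; indeed, even fractional counting assumes equal contributions, so it does not fully resolve the difficulty described above.

from collections import defaultdict

# Hypothetical author lists for three papers
papers = [
    ["Smith", "Jones", "Lee"],
    ["Smith"],
    ["Jones", "Lee"],
]

full = defaultdict(float)
fractional = defaultdict(float)
for authors in papers:
    for a in authors:
        full[a] += 1.0                       # full counting: one credit per author
        fractional[a] += 1.0 / len(authors)  # fractional counting: credit shared equally

for a in sorted(full):
    print(f"{a}: full = {full[a]:.1f}, fractional = {fractional[a]:.2f}")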
