On March 28th in Beijing, Elsevier and The Institute of Scientific and Technical Information of China (ISTIC) hosted a half-day seminar attended by over 100 people. The seminar focused on the importance of evidence-based approaches to scientific performance analysis, especially when such analysis is used to inform science policy decisions. Evidence-based research relies on including diverse datasets in the analysis in order to obtain an in-depth and accurate understanding of scientific progress, competencies and potential.

Image 1 – Hosts of the Evidence-based Science Policy Seminar, Beijing, March 28, 2012.

The seminar was hosted by Mr. Wu Yishan, Deputy Director of ISTIC, and featured speakers including Dr. Zhao Zhiyun, Deputy Director at ISTIC; Prof. Dr. Diana Hicks, Chair of the School of Public Policy, Georgia Institute of Technology; Prof. Peter Haddawy, Director of the International Institute for Software Technology at the United Nations University in Macau; and Dr. Henk Moed, Elsevier Scientific Director.

In her opening presentation, Dr. Zhao discussed ISTIC's approaches to evidence-based research, which include analyzing internal and external bibliographic databases, patent repositories and technical literature. To that end, ISTIC seeks to include reliable and comprehensive scientific datasets from around the world and to apply diverse bibliometric methodologies in order to position China in the science world and better understand China's international scientific collaborations.

Image 2 – Mr. Wu Yishan, Deputy Director of ISTIC, opening the seminar.

Dr. Zhao’s presentation opened up the discussion about bibliometrics as a methodology and whether or not it has an actual impact on science policy. To answer this question, Prof. Dr. Diana Hicks presented a series of case studies titled “Powerful Numbers,” in which she demonstrated how absolute figures, taken from different bibliometric studies, were molded and used by several national administrations in the USA, UK and Australia to make decisions about science funding. After presenting examples of such use of bibliometric figures, Dr. Hicks concluded that “policy makers over the past few decades have drawn upon analytical scholarly work, and so scholars have produced useful analyses. However, the relationship between policy and scholarship contains tensions. Policy users need a clear number. Scholars seem afraid to draw a strong conclusion, and do not encapsulate their discoveries in simple numbers.”

In the same context, Dr. Henk Moed discussed the use and misuse of the Journal Impact Factor and the ways in which it can be manipulated to achieve certain results, reinforcing the notion that no single figure or absolute numeric value can represent productivity, impact or competency. He presented a new journal metric, SNIP (Source-Normalized Impact per Paper), and discussed its strengths and limitations. Dr. Moed stressed that any conclusion or decision regarding scientific analysis must be preceded by careful consideration of the purpose of the analysis, the appropriate metric and the unit under consideration.
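To make the contrast concrete, here is a minimal sketch in Python, using invented toy numbers, of how a raw impact-per-paper figure differs from a field-normalized one in the spirit of SNIP. Note that the actual SNIP formula derives its "citation potential" from the reference lists of citing papers; the normalization below is a simplified stand-in for illustration, not the published definition.

```python
# Toy illustration: raw impact-per-paper vs. a field-normalized figure.
# All numbers are invented; the normalization is a simplified stand-in for
# SNIP's citation-potential correction, not the published SNIP formula.

def impact_per_paper(citations_received, papers_published):
    """Raw impact: citations in the window divided by papers in the window
    (the Journal Impact Factor uses a two-year window of 'citable items')."""
    return citations_received / papers_published

def normalized_impact(raw_impact, field_citation_potential):
    """Divide raw impact by the field's average citation density, so journals
    in low-citation fields (e.g. mathematics) are not unfairly penalized."""
    return raw_impact / field_citation_potential

# Two journals of comparable standing in fields with different citation habits:
math_raw = impact_per_paper(citations_received=120, papers_published=100)    # 1.2
bio_raw  = impact_per_paper(citations_received=1200, papers_published=100)   # 12.0

# Hypothetical field citation potentials (avg. citations per paper per field):
math_norm = normalized_impact(math_raw, field_citation_potential=1.2)    # 1.0
bio_norm  = normalized_impact(bio_raw, field_citation_potential=12.0)    # 1.0

print(math_raw, bio_raw)     # 1.2 12.0 -> raw figures differ tenfold
print(math_norm, bio_norm)   # 1.0 1.0  -> normalized figures are comparable
```

The toy output mirrors Dr. Moed's point: the raw figures differ by an order of magnitude purely because of field citation habits, while the normalized figures put the two journals on a comparable footing.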

The seminar concluded with Prof. Peter Haddawy, who presented the Global Research Benchmarking System (GRBS), which provides multi-faceted data and analysis to benchmark research performance in disciplinary and interdisciplinary subject areas. The system demonstrates how using SNIP, publication counts, citations and h-index figures, among other data points, enables a comprehensive ranking of universities’ research.
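As one example of the figures such a system aggregates, the h-index is simple to compute from a list of citation counts. The sketch below (Python, with made-up citation counts) implements the standard definition: the largest h such that at least h publications have at least h citations each.

```python
def h_index(citation_counts):
    """Largest h such that at least h publications have >= h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i   # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

# Made-up citation counts for one researcher's publications:
print(h_index([25, 8, 5, 4, 3, 1, 0]))  # -> 4 (four papers with >= 4 citations)
```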

In conclusion, the seminar underlined for the audience the importance of broadening analytical work on productivity, impact and competency in science to include as many relevant datasets as possible, and of relying on more than one metric or a single number. Evaluation must be multi-faceted and comprehensive, much like the research it tries to capture, which is collaborative, international and multi-disciplinary.
