Articles

Research Trends is an online magazine providing objective insights into scientific trends, based on bibliometric analyses.

Where government, industry and academia meet

It is becoming increasingly clear that the challenges of climate change cannot be solved without a combined effort from government, industry and the academic community. Research Trends reports on the 2009 Times/Smith School World Forum on Enterprise and the Environment, which brought these groups together to pool their efforts.

The 2009 Times/Smith School World Forum on Enterprise and the Environment, held at the University of Oxford in early July, brought together leaders in the policy-making, business and academic communities to address the issue of carbon emissions in a radically changed economic climate. Eminent speakers, including former US Vice President Al Gore, discussed the challenges ahead during the three-day forum hosted by the Smith School of Enterprise and the Environment.

Presidential commitment

Mohamed Nasheed, President of the Maldives, was one of two heads of state in attendance at the World Forum (the other was Paul Kagame, President of Rwanda). Nasheed explained that the people of his island nation would be among the first to suffer if decisive action is not taken on the climate crisis. With an average ground level of just 1.5 meters above sea level, the Maldives is the lowest country in the world. While Global Mean Sea Level (GMSL) has recently been rising at a rate of about 3mm per year, from 105mm in 1996 to 142mm in 2006 (1), this rise is not evenly distributed across the globe. Indeed, data suggest that the rate may be closer to 4mm per year in the Maldives over recent years (2).
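As a quick sanity check (using only the two GMSL values cited above), the average rate implied by those figures sits between the global figure of about 3mm per year and the Maldives estimate of closer to 4mm:

```python
# Average GMSL rise implied by the two cited data points (values in mm
# relative to the study's reference baseline).
gmsl_1996 = 105
gmsl_2006 = 142

rate_mm_per_year = (gmsl_2006 - gmsl_1996) / (2006 - 1996)
print(f"Implied average rise: {rate_mm_per_year:.1f} mm per year")  # 3.7
```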

John Church at the Commonwealth Scientific and Industrial Research Organisation (CSIRO) says it is clear that sea levels are continuing to rise: “The Greenland Ice Sheet seems to be making an increasing contribution and there are indications of an increasing contribution from parts of the Antarctica Ice Sheet.” He adds that a major challenge is “whether we can avoid crossing thresholds leading to a larger and more rapid contribution from the ice sheets – meeting this challenge requires urgent mitigation”.

Encouragingly, the global output of research articles focused on climate change-led sea level changes has recently outstripped the rise in the GMSL itself (see Figure 1). With so much at stake, the Maldives has recently committed to lead by example in the fight against climate change by pledging to be the world’s first carbon-neutral country within a decade.

Four challenges

In his keynote speech, Lord Browne of Madingley set forth four challenges that must be met if we are to tackle climate change. President of the Royal Academy of Engineering and former group chief executive of BP, Lord Browne first noted the direct effect on consumers’ pockets, with fuel bills expected to rise by 2–3% each year for the next two decades as emissions are reduced. Second, investment in ambitious sustainable energy projects must be stimulated, with emissions trading schemes (“cap and trade”) implemented to motivate investors. Third, opportunities and incentives must be created for businesses to pursue low-carbon technology in their core activities. Finally, global governance must involve all stakeholders, since two thirds of potential emissions reductions could be achieved at half the cost in the developing world.

A unique position

The World Forum was hosted by the Smith School of Enterprise and the Environment, founded in 2008 as a unique interdisciplinary hub where leading academics work with the private sector and government to meet the environmental challenges of our times.

As Dr John Hood, Vice-Chancellor of the University of Oxford, said in his introductory remarks to the World Forum: “All this was thanks to benefactor Martin Smith who realized we had a large number of scholars researching climate change in many different fields, but no interaction with business. This was the gap that the Smith School could fill. The School had the capacity to bring together private companies, academic institutions, governments and non-governmental organizations to meet the climate challenge. The location of the Smith School at Oxford University meant its research could be fully interdisciplinary.”

In this way, the Smith School embodies the concept of the Triple Helix of university-industry-government interactions. This model of innovation, developed by Henry Etzkowitz and Loet Leydesdorff in the 1990s, invokes a spiral of complex and dynamic interactions and knowledge flows between the three players but places the university as the leader in the creation of knowledge and economic development.

The school’s research fellows, visiting research fellows and faculty associates provide expertise in fields as diverse as engineering, physics, geography, economics, law and philosophy. Founding Director Sir David King, who served as the UK Government’s Chief Scientific Adviser and Head of the Government Office of Science from 2000 to 2007, told the World Forum that, “the Smith School is a global hub to facilitate governments, the private sector and academia to meet the climate challenge”. Sir David has published more than 400 articles since the 1960s; collectively, these attracted more than 750 citations in 2008 alone (source: Scopus). A surface chemist, he has worked on catalysis on solid surfaces, paving the way for improvements in the efficiency of industrial processes and reductions in the cost of catalytic converters for cars.

An inconvenient truth

Former US Vice President Al Gore, co-winner of the 2007 Nobel Peace Prize with the Intergovernmental Panel on Climate Change, gave the closing address to the World Forum. His remarks, while critical of the role he felt government had so far failed to play in the climate crisis, were also optimistic about our chances of avoiding catastrophe: “I say we ought to approach this challenge with a sense of joy. What a privilege to have work so worthy of one’s best efforts. A challenge so crucial to all future generations.”

Gore, who brought climate change awareness to a mass audience with the 2006 documentary An Inconvenient Truth, noted that one way forward was continued advances in renewable energy science and technology. “I come away from that journey absolutely convinced that we have the tools available to us to solve three climate crises. We only have to solve one.”

In his final remarks at the World Forum, Gore said of the future of climate change action: “It can happen. It will happen. We have everything we need, except political will – and political will is a renewable resource.”

Sir David King added: “Uniquely, the first Smith School World Forum provided a venue for entrepreneurs, scientists and business leaders to seek and assess ways of defossilizing our economies. As the Copenhagen protocol develops, the World Forum will become the key annual event for leaders of governments, the private sector and academe to examine progress and the many innovative solutions to this unavoidable challenge.”

Figure 1 – Annual global output of articles on “sea level” and “climate change”, and global mean sea level (mm), 1996–2006. Source: Scopus

References:

(1) Church, J.A. and White, N.J. (2006) “A 20th century acceleration in global sea-level rise”, Geophys. Res. Lett., 33, L01602. Updated data.
(2) Khan, T.M.A., Quadir, D.A., Murty, T.S., Kabir, A., Aktar, F. and Sarkar, M.A. (2002) “Relative sea level changes in Maldives and vulnerability of land due to abnormal coastal inundation”, Marine Geodesy, 25, 133–143.

Biomass and biofuels – the promising potential of oilgae

Biomass, commonly defined as renewable energy derived from living or recently living organisms, is not a new idea. As early as the beginning of the last century, pioneers such as Henry Ford and Rudolf Diesel designed cars and engines to run on biofuels, and before World War II the UK and Germany sold biofuels mixed with petrol or diesel.

However, as the case for climate change becomes more widely accepted, interest in renewable energies in general and biomass in particular is increasing. One outcome of the G8 summit in Italy in July was a document entitled Responsible Leadership for a Sustainable Future (1), which introduced the idea of a “green recovery” through investment in ecological R&D, industry and infrastructure.

The G8 leaders also expressed a specific interest in biofuels: “We welcome the work of the Global Bioenergy Partnership (GBEP) in developing a common methodological framework to measure greenhouse gas emissions from biofuels and invite GBEP to accelerate its work in developing science-based benchmarks and indicators for sustainable biofuel production and to boost technological cooperation and innovation in bioenergy.” (1) Interestingly, the list of the most prolific institutes in the field in recent years (see Table 1) reflects this international awareness.

Food for thought

However, there is also an ethical issue associated with biofuels, as the agricultural land required to produce them takes scarce farmland away from food crops. “Biofuels could help mitigate emissions from the transportation sector,” says Oliver Inderwildi, research fellow at the Smith School of Enterprise and the Environment, University of Oxford, “but not all biofuels are low-carbon fuels. Some emit even more carbon than conventional fossil fuels. Moreover, feedstock farming has serious effects on water resources, food prices and ecosystems. A considerable amount of research is focused on those environmentally unfriendly, high-carbon biofuels. This is because energy security concerns were the key drivers for the recent hype in biofuel research, rather than environmental concerns.”

This is where algal fuel (also known as oilgae) comes in: not only is its productivity higher than that of other biofuel crops, but algae do not need arable land or potable water to thrive (2), reducing competition with food crops. Furthermore, algae sequester CO2 as they grow, making them a carbon-neutral energy source, and the by-products of oilgae production can have other applications, such as in animal feed or as a replacement for common petroleum products such as plastics or cosmetics. There are even claims that genetic engineering has allowed scientists to modify algae to produce crude oil (3), which can be used to generate different types of fuels and thus eliminates the need to retro-engineer existing engines.

Yusuf Chisti, Professor of Biochemical Engineering at Massey University in New Zealand, believes that using microalgae to produce renewable, carbon-neutral transport fuels is the only way forward. “The technology for transforming crude algal oils into diesel, gasoline and jet fuels exists already,” he comments. “Only microalgae have the potential to provide crude oil in sufficient quantities to meaningfully displace petroleum-derived fuels. Microalgae can do this without affecting our food supply, animal feed or freshwater; producing fuels from algae will not cause deforestation. Algal fuels are currently expensive, however. But due to concerns about climate change, it’s likely that we’ll eventually have to switch to these environmentally friendly fuels.”

However, genetic engineering brings its own concerns: although the engineered organisms currently require an artificial environment in which to live and multiply, they could escape and adapt to survive in a natural environment (4), where their potential impact is unknown.

Challenges and champions

Despite the controversies, scientific literature on biomass has increased by 9% per annum over the past decade, representing a three-fold increase in articles, reviews, and conference papers between 1996 and 2008 (see Figure 1). At the same time, research into the subfield of oilgae grew 6% per annum, with the number of articles, reviews, and conference papers published each year on the subject more than doubling between 1996 and 2008. This is a strong indicator of the potential of this field.
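Those annual growth rates and cumulative changes are mutually consistent; assuming simple annual compounding over the twelve years from 1996 to 2008, a quick check gives:

```python
# Compound growth implied by the reported annual rates over 1996-2008.
years = 2008 - 1996  # 12 years of compounding

biomass_factor = 1.09 ** years  # biomass literature: 9% per annum
oilgae_factor = 1.06 ** years   # oilgae subfield: 6% per annum

print(f"Biomass output multiplied by {biomass_factor:.1f}")  # 2.8, roughly three-fold
print(f"Oilgae output multiplied by {oilgae_factor:.1f}")    # 2.0, i.e. doubled
```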

While the high-yield capacity of algae has been experimentally proven, the production of algal biofuels in vast quantities remains a challenge. However, the G8 leaders expressed a multinational commitment to develop biofuels and invest in renewable energies as part of the “green recovery” plan. Significant R&D investment could help the industrial production of oilgae take off. Further developments on the subject are expected to be unveiled at the United Nations Climate Change conference in Copenhagen in December.

Rank | Institute | Country | Articles, reviews and conference papers | Average citations per publication
1 | Chinese Academy of Sciences | China | 1135 | 1.79
2 | USDA Agricultural Research Service | USA | 462 | 3.24
3 | University of Florida | USA | 312 | 2.83
4 | Universidade de Sao Paulo | Brazil | 302 | 3.33
5 | Zhejiang University | China | 296 | 1.81
6 | Wageningen University | Netherlands | 286 | 3.81
7 | UC Davis | USA | 280 | 3.87
8 | Russian Academy of Sciences | Russia | 264 | 1.29
9 | Sveriges lantbruksuniversitet | Sweden | 261 | 4.26
10 | Oregon State University | USA | 248 | 6.38
11 | University of British Columbia | Canada | 229 | 4.73
12 | Lunds Universitet | Sweden | 223 | 5.04
13 | Michigan State University | USA | 221 | 5.44
14 | Cornell University | USA | 209 | 5.31
15 | Tsinghua University | China | 206 | 2.70
16 | USDA Forest Service | USA | 204 | 4.19
17 | Iowa State University | USA | 204 | 2.75
18 | University of Tokyo | Japan | 200 | 3.45
19 | Universiteit Gent | Belgium | 197 | 4.48
20 | Consiglio Nazionale delle Ricerche | Italy | 194 | 3.63

Table 1 – Most prolific institutes in biomass and biofuel research; publication and citation years: 2005–2008. Source: Scopus

Figure 1 – Scientific literature on biomass has grown steadily over the past 12 years. Source: Scopus

Useful links:

United Nations Climate Change Conference 2009
G8 Summit 2009

References:

(1) Documents of the G8 Summit 2009
(2) Jha, A. (2008) “‘Oil from algae’ promises climate friendly fuel”, The Guardian
(3) Jha, A. (2008) “Gene scientist to create algae biofuel with Exxon Mobil”, The Guardian
(4) Rosenberg, J.N., Oyler, G.A., Wilkinson, L. and Betenbaugh, M.J. (2008) “A green light for engineered algae: redirecting metabolism to fuel a biotechnology revolution”, Current Opinion in Biotechnology, 19 (5), pp. 430–436

Tackling climate change on three fronts: politics, public opinion and science

Public and political interest in tackling climate change has grown in recent years as a result of scientific research. It is now becoming clearer that government policies and public opinion are also spurring further research. Research Trends measures the growing pace.

US President Barack Obama’s pledge to battle global warming and the success of Al Gore’s documentary An Inconvenient Truth are just two examples of the growing political and public interest in climate change. In 2001, Gerald Stanhill published a bibliometric study of the growth of climate change science (1). He found that over the 25-year period between 1970 and 1995, the annual number of publications on climate change in the abstracting journal of the American Meteorological Society increased from 14 to 372.

Stanhill published his analysis in the journal Climatic Change, one of the first journals dedicated to the problem of climate variability and change. Climatic Change’s publication data from 1996 onward show significant growth, indicating a continuation of the trend that Stanhill describes: the number of articles and reviews published in the journal increased from 83 in 1996 to 162 in 2007, while simultaneously maintaining a positive citation trend, as indicated by the average citations per paper (see Figure 1). A modest rise in the number of unique authors and unique author affiliations in the journal over the same period suggests that this growth in output is at least partly the result of attracting new minds to the problem.
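For context, the compound annual growth rates implied by these publication counts can be estimated (assuming steady compounding between the endpoint years):

```python
# Implied compound annual growth rate between two publication counts.
def annual_growth(start_count, end_count, years):
    return (end_count / start_count) ** (1 / years) - 1

# Stanhill's figures: 14 papers (1970) to 372 (1995) in the AMS abstracting journal.
stanhill = annual_growth(14, 372, 1995 - 1970)
# Climatic Change: 83 articles and reviews (1996) to 162 (2007).
climatic_change = annual_growth(83, 162, 2007 - 1996)

print(f"Stanhill period: {stanhill:.0%} per year")        # 14% per year
print(f"Climatic Change: {climatic_change:.0%} per year") # 6% per year
```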

The power of public opinion

In his article, Stanhill emphasizes the impact of extra-scientific factors, such as public interest and government support, on the growth of climate change studies: “The continued public interest and political support needed for this to occur is at least partially dependent on the emergence within the near future of unambiguous and palpable evidence of widespread and economically damaging climate change, preferably in accordance with current scientific predictions.”

According to Hans-Martin Füssel, senior research fellow and head of the working group on adaptation in the research domain Sustainable Solutions at the Potsdam Institute for Climate Impact Research (PIK) in Potsdam, Germany: “The growth in scientific publications on climate change identified in the review study by Stanhill is remarkable in itself, but it covers only part of the literature. The last decade saw a remarkable growth in climate change-related studies in the social sciences, economics and engineering that are unlikely to be covered in the Meteorological and Geoastrophysical Abstracts. This growth is also reflected in the recent launch of several specialized journals, such as Mitigation and Adaptation Strategies for Global Change, Climate and Development, and Carbon and Climate Law Review. It would certainly be worthwhile to conduct an updated review study that includes literature from all three working groups of the Intergovernmental Panel on Climate Change.”

Stephen Schneider, Founder and Editor of Climatic Change, adds: “I don’t know if the scientific output on climate change will continue to increase as it has over the years. I do know, however, that despite the launch of several new journals, Climatic Change has grown and shows growth again this year.”

Stephen Schneider with the bust of Svante ArrheniusStephen Schneider with the bust of Svante Arrhenius, who performed the first CO2 climate change calculations over 110 years ago.

Stephen H. Schneider is the Founder and Editor of the journal Climatic Change. He has published more than 215 articles that have been cited more than 3,800 times since 1996 (source: Scopus). He is co-author of the article
“Fingerprints of Global Warming on Animal and Plants” (2003), Nature, 421, pp. 57–60, which has received 730 citations to date. His new book Science as a Contact Sport; Inside the Battle to Save Earth’s Climate will be published by National Geographic Society Press later this year.

Schneider agrees that the development of climate change research depends, at least partially, on public interest: “It is a matter of policy and in the end it comes down to the question of how much the public is willing to spend. It is true that people want evidence of global warming; people want certainties. On the other hand, people also understand that uncertainties can never be ruled out completely. Financial investments in climate change research are like any other insurance that people buy to protect against future uncertainties. Unfortunately, extreme weather events like hurricane Katrina or the heatwave in 2003 in Europe that killed more than 50,000 people are more likely to get the media’s and other people’s attention.”

Harnessing political will
According to Schneider, the US, as a major contributor of CO2 emissions and an industrial and political power, plays a critical role in the course climate change policy will take: “President Obama’s approach is a dramatic improvement from that of the previous administration. His biggest challenge will be to get support from Congress, which tends to think local and short term, while a global approach and long-term vision are needed.”

Schneider writes in the introduction of his new book scheduled for publication later this year, Science as a Contact Sport; Inside the Battle to Save Earth’s Climate (2): “Today, climate change is acknowledged by most climatological experts around the world. Some have replaced the term global warming with global heating or the global heat trap or simply climate disruption, to indicate our human agency in what has occurred. The more jargon-bound scientists, in their endless striving to prove dispassionate objectivity, call this anthropogenic climate change, an accurate phrase, but not a favorite of newspaper headline writers and TV anchors.

“This acknowledgement of global concern has been achieved through surmounting numerous obstacles along the way. Policymakers, lobbyists, financial interests and extreme skeptics have struggled mightily to steer public opinion – and the funds associated with it – in their preferred directions. Most mainstream scientists have fought back with the weapons at their disposal: methods of truth seeking, such as peer review, responsible reporting of research data, best practice theory, international cooperation and cautious calls for policy consideration. The battle is by no means won. The world needs all our combined strengths to cope with the dangerous climate impacts already in the pipeline, much less prevent far more damaging climate change 20 or more years from now.

“If only President Obama and former rival Senator John McCain – an early supporter of climate action – could unite in showing leadership from one end of Pennsylvania Avenue to the other, we might at last achieve meaningful climate policy.”

Figure 1 – Publications and citation trends in the journal Climatic Change have risen steadily during the last decade. Source: Scopus

Figure 1 – Publications and citation trends in the journal Climatic Change have risen steadily during the last decade. Source: Scopus

Useful links:

Intergovernmental Panel on Climate Change
An Inconvenient Truth
World Meteorological Organization
American Meteorological Society
Stephen H. Schneider
Environmental science and ecology, Elsevier

References:

(1) Stanhill, G. (2001) “The growth of climate change science: a scientometric study”, Climatic Change, Vol. 48, pp. 515–524.
(2) Schneider, S. (2009) “Science as a contact sport; inside the battle to save earth’s climate”, to be published by National Geographic Society Press.
VN:F [1.9.22_1171]
Rating: 0.0/10 (0 votes cast)

US President Barack Obama’s commitment to tackling global warming and the success of Al Gore’s film An Inconvenient Truth are just two examples of the growing political and public interest in climate change. In 2001, Gerald Stanhill published a bibliometric study of the growth of climate change science (1). He found that over the 25-year period between 1970 and 1995, the annual number of publications on climate change in the abstracting journal of the American Meteorological Society increased from 14 to 372.

Stanhill published his analysis in the journal Climatic Change, one of the first journals dedicated to the problem of climate variability and change. Showing significant growth rates, Climatic Change’s publication data from 1996 onward indicate a continuation of the trend that Stanhill describes: the number of articles and reviews published in the journal increased from 83 in 1996 to 162 in 2007 while simultaneously maintaining a positive citation trend, as indicated by the average citations per paper (see Figure 1). A modest rise in the number of unique authors and unique author affiliations in the journal over the same period suggests that this growth in output is at least partly the result of attracting new minds to the problem.
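These publication counts imply quite different compound annual growth rates for the journal and for the field as a whole. A quick check of the arithmetic (the counts are those quoted in the text; the `cagr` helper is illustrative, not part of either study):

```python
# Compound annual growth rates implied by the publication counts quoted
# in the article. The helper function is illustrative only.
def cagr(start_count, end_count, years):
    """Compound annual growth rate over `years` yearly intervals."""
    return (end_count / start_count) ** (1 / years) - 1

# Climatic Change articles and reviews: 83 (1996) -> 162 (2007)
journal_growth = cagr(83, 162, 2007 - 1996)

# Stanhill's count in the AMS abstracting journal: 14 (1970) -> 372 (1995)
field_growth = cagr(14, 372, 1995 - 1970)

print(f"Climatic Change: {journal_growth:.1%} per year")  # ~6.3% per year
print(f"Field-wide:      {field_growth:.1%} per year")    # ~14.0% per year
```

On these figures the journal’s growth, while steady, is considerably slower than the field-wide expansion Stanhill measured.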

The power of public opinion

In his article, Stanhill emphasizes the impact of extra-scientific factors, such as public interest and government support, on the growth of climate change studies: “The continued public interest and political support needed for this to occur is at least partially dependent on the emergence within the near future of unambiguous and palpable evidence of widespread and economically damaging climate change, preferably in accordance with current scientific predictions.”

According to Hans-Martin Füssel, senior research fellow and head of the working group on adaptation in the research domain Sustainable Solutions at the Potsdam Institute for Climate Impact Research (PIK) in Potsdam, Germany: “The growth in scientific publications on climate change identified in the review study by Stanhill is remarkable in itself, but it covers only part of the literature. The last decade saw a remarkable growth in climate change-related studies in the social sciences, economics and engineering that are unlikely to be covered in the Meteorological and Geoastrophysical Abstracts. This growth is also reflected in the recent launch of several specialized journals, such as Mitigation and Adaptation Strategies for Global Change, Climate and Development, and Carbon and Climate Law Review. It would certainly be worthwhile to conduct an updated review study that includes literature from all three working groups of the Intergovernmental Panel on Climate Change.”

Stephen Schneider, Founder and Editor of Climatic Change, adds: “I don’t know if the scientific output on climate change will continue to increase as it has over the years. I do know, however, that despite the launch of several new journals, Climatic Change has grown and shows growth again this year.”

Stephen Schneider with the bust of Svante Arrhenius, who performed the first CO2 climate change calculations over 110 years ago.

Stephen H. Schneider is the Founder and Editor of the journal Climatic Change. He has published more than 215 articles, which have been cited more than 3,800 times since 1996 (source: Scopus). He is co-author of the article “Fingerprints of Global Warming on Wild Animals and Plants” (2003), Nature, 421, pp. 57–60, which has received 730 citations to date. His new book, Science as a Contact Sport: Inside the Battle to Save Earth’s Climate, will be published by National Geographic Society Press later this year.

Schneider agrees that the development of climate change research depends, at least partially, on public interest: “It is a matter of policy and in the end it comes down to the question of how much the public is willing to spend. It is true that people want evidence of global warming; people want certainties. On the other hand, people also understand that uncertainties can never be ruled out completely. Financial investments in climate change research are like any other insurance that people buy to protect against future uncertainties. Unfortunately, extreme weather events like Hurricane Katrina or the 2003 European heatwave that killed more than 50,000 people are more likely to get the media’s and other people’s attention.”

Harnessing political will

According to Schneider, the US, as a major source of CO2 emissions and an industrial and political power, plays a critical role in the course climate change policy will take: “President Obama’s approach is a dramatic improvement from that of the previous administration. His biggest challenge will be to get support from Congress, which tends to think local and short term, while a global approach and long-term vision are needed.”

In the introduction to his new book, Science as a Contact Sport: Inside the Battle to Save Earth’s Climate (2), Schneider writes: “Today, climate change is acknowledged by most climatological experts around the world. Some have replaced the term global warming with global heating or the global heat trap or simply climate disruption, to indicate our human agency in what has occurred. The more jargon-bound scientists, in their endless striving to prove dispassionate objectivity, call this anthropogenic climate change, an accurate phrase, but not a favorite of newspaper headline writers and TV anchors.

“This acknowledgement of global concern has been achieved through surmounting numerous obstacles along the way. Policymakers, lobbyists, financial interests and extreme skeptics have struggled mightily to steer public opinion – and the funds associated with it – in their preferred directions. Most mainstream scientists have fought back with the weapons at their disposal: methods of truth seeking, such as peer review, responsible reporting of research data, best practice theory, international cooperation and cautious calls for policy consideration. The battle is by no means won. The world needs all our combined strengths to cope with the dangerous climate impacts already in the pipeline, much less prevent far more damaging climate change 20 or more years from now.

“If only President Obama and former rival Senator John McCain – an early supporter of climate action – could unite in showing leadership from one end of Pennsylvania Avenue to the other, we might at last achieve meaningful climate policy.”

Figure 1 – Publications and citation trends in the journal Climatic Change have risen steadily during the last decade. Source: Scopus

Useful links:

Intergovernmental Panel on Climate Change
An Inconvenient Truth
World Meteorological Organization
American Meteorological Society
Stephen H. Schneider
Environmental science and ecology, Elsevier

References:

(1) Stanhill, G. (2001) “The growth of climate change science: a scientometric study”, Climatic Change, Vol. 48, pp. 515–524.
(2) Schneider, S. (2009) Science as a Contact Sport: Inside the Battle to Save Earth’s Climate, to be published by National Geographic Society Press.

…a classic paper?

Why do researchers continue to cite classic papers for many decades? Is it to formally acknowledge an intellectual debt or is it the ‘done thing’ in the field? We ask two researchers why they cited a classic paper.

Citation classics (articles receiving many more than the expected number of citations for their area) exist in all fields of research. Often marking technological or theoretical advances, they can stimulate a generation of researchers to make significant advances that might otherwise not have been possible. As such, they become highly cited and may persist for many years.

Tyge Payne

Why do researchers continue to return to these classic papers long after the initial wave of excitement has passed? Are they cited to formally acknowledge an intellectual debt or is it just the ‘done thing’ in the field?

Conceptual shorthand

Michael Jensen and William Meckling’s landmark 1976 paper, ‘Theory of the firm: Managerial behavior, agency costs and ownership structure’ (1), offered a unique synthesis of three existing theories (of agency, of property rights and of finance) to formulate a theory of the ownership structure of a firm. The article has been cited over 3,900 times since 1996 in journals on topics as diverse as business, management, accounting, economics, econometrics, finance, decision sciences and psychology, and its broad impact can be attributed to its inclusive scope and general theoretical framework.

Associate Professor Tyge Payne at Rawls College of Business, Texas Tech University, cited this classic paper in two of his recent papers on management decision-making and performance (2, 3). Dr Payne notes that in doing so: “I was drawing on classic agency theory and, therefore, used Jensen and Meckling as a means of establishing that line of thinking.”

David Jones

He continues: “For me, citation classics are a way of communicating certain ideas to the reader. It positions the reader in an established research stream and develops a measure of credibility without having to extensively develop a particular line of thinking.”

Acknowledging past achievements

Citation classics may offer methodological advances that subsequently become the standard or reference procedure for work in a given field – and sometimes beyond. Despite running to just six pages, James Murphy and John Riley’s 1962 article, ‘A modified single solution method for the determination of phosphate in natural waters’ (4), has been cited more than 4,200 times since 1996. As one of the authors recalled in a 1986 commentary: “I suppose that this paper has been so extensively cited [because] it provides a simple, highly reproducible technique for the determination of microgram amounts of phosphate. Almost 25 years later, the method is still the recommended standard procedure for the analysis of fresh and potable waters, as well as seawater. Although it was originally developed for the analysis of phosphate in natural waters, it has been widely adopted in many other fields, including, for example, botany, zoology, biochemistry, geochemistry, metallurgy, and clinical medicine. Indeed, kits for the determination are available commercially for use in physiological investigations and water analysis.”

Professor David L. Jones at the School of the Environment and Natural Resources, Bangor University, Wales, has cited this paper three times this year (5, 6, 7). He says: “This is the definitive paper on measuring phosphorus in soil solution. Murphy and Riley published the method first, and it only seems fitting to acknowledge that achievement.”

Citation classics punctuate the scholarly landscape and mark waypoints in our development as a scientific society. But they are more than that: in the words of Eugene Garfield, who coined the term “citation classic” in 1977 (8), “this is the human side of science”.



Busting the open access myth

Open access has been touted as the future of scientific publishing, claiming benefits such as wider readership and, crucially, significantly higher citation rates. However, research carried out by Phil Davis at Cornell University suggests that the manner of publication may have very little to do with citations. He discusses his latest research (1).

Research Trends (RT): Your methodology is unusual for citation analysis. How did you decide on a randomized controlled trial?

Phil Davis (PD): Previous studies that measured the citation advantage were all based on observational methodologies. Essentially, researchers counted citations to open-access articles and compared them to subscription-access articles. This is a very weak methodology, as it ignores factors other than access that lead to a citation. It also ignores the direction of causality.

The only way to adequately control for confounding explanations and to rule out the possibility of reverse directionality was to set up a proper scientific trial. By randomizing which articles were given the open-access “treatment” we could effectively control for other possible causes and focus entirely on the effect of access on readership and citations. This methodology makes our study much more rigorous than other observational studies that were done in the past.
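The randomization step Davis describes can be outlined in a few lines: randomly assign a fraction of articles to the open-access “treatment” and leave the rest as subscription-access controls. This is only an illustrative sketch (the article IDs and treatment fraction are hypothetical; the actual trial operated inside each publisher’s platform):

```python
import random

def assign_treatment(article_ids, treatment_fraction=0.25, seed=42):
    """Randomly split articles into an open-access 'treatment' group
    and a subscription-access control group."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(article_ids)
    rng.shuffle(shuffled)
    cutoff = int(len(shuffled) * treatment_fraction)
    return set(shuffled[:cutoff]), set(shuffled[cutoff:])

# Hypothetical article identifiers, not real DOIs
articles = [f"art-{i:03d}" for i in range(100)]
open_access, control = assign_treatment(articles)
print(len(open_access), len(control))  # 25 75
```

Because assignment is random, any systematic difference in later downloads or citations between the two groups can be attributed to access conditions rather than to article quality or author self-selection.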

RT: How did you get publishers to participate in your study?

PD: It was much easier than I expected! I focused on recruiting scientific societies, since I knew they had an interest in the outcome of the study. Ultimately, their participation depended on trust: they trusted that I would conduct a rigorous, scientific study and that I was going to be fair and objective in reporting the results. All but one publisher gave me access to their online publishing system so that I could manipulate the access conditions without their involvement, thus minimizing potential publisher influence and bias. Every publisher gave me full access to their statistical reporting systems. This says a lot about the integrity of these people and their dedication to the scientific process.

RT: You found evidence that open access increases readership but not citations. What does this mean?

PD: A large open access “citation advantage” would suggest that the subscription model is doing a very poor job of disseminating information to the research community. The fact that we were unable to detect a citation difference suggests that the subscription model is operating efficiently, at least for authors. Yet, the research community is not the only group that reads the scientific literature. We were able to document a large increase in full-text article downloads and a smaller, but significant, increase in PDF downloads and unique visitors to the journal websites. This suggests that open-access publishing may reach a wider readership community, although this may not translate into more citations.

RT: Who are these additional readers of the scientific literature?

PD: It is difficult to say from our data. We know that they are accessing the literature from outside subscriber IP addresses. But we don’t know who they are, nor do we know their intention. They could be people like my dad – who had triple bypass heart surgery – typing a search query into Google and landing on an article published by the American Heart Association. They could be teachers, students, physicians or journalists, or just interested people trying to learn from the primary literature. The research field is wide open on answering this question.

RT: Why is measuring a citation difference so important in making the case for open access?

PD: Most scientists view citations as a form of reward, and thus an incentive, for where and how they publish. The potential of getting a 50–250% return in expected citations by publishing in an open-access journal or by making your articles freely available from an institutional archive has been used repeatedly as an argument to change the behavior of scientists. There are many other good reasons for making one’s results widely available – a citation advantage, however, does not appear to be one of them.

RT: You take issue with the phrase “open access”. Why?

PD: “Open access” assumes a dissemination model in which information only flows from the publisher to the reader. It’s a model that completely ignores the high degree of sharing of articles that takes place within informal networks of authors, readers and libraries. I’m very privileged to belong to an institution with such rich access to the literature, and yet I still depend on my peers for copies of research articles and manuscripts. Secondly, “open access” implies a right to information; I much prefer “free access”, which implies a privilege.

RT: Some have criticized you for reporting too early on your study. What is your response?

PD: Our first article, reporting initial results within the first year after publication (1), was indeed published early in the study. We felt confident that the main results wouldn’t change over time, and they haven’t. After two years, we have yet to detect a difference in the citations to the open-access articles compared to the control articles. Remember that other studies had reported huge differences after very short periods of time, some within the first few months after publication. I was confident that if we didn’t see a difference within the first year, we were unlikely to see a difference in the future. I’m glad we made the decision to publish early. Similar findings from other journals in the sciences, medicine, social sciences and humanities will be coming out in the next few years.

RT: The scholarly publishing field is changing very rapidly. How relevant will your study be, in say, five years?

PD: I imagine that the main results of our study will largely be moot in another five years. The information landscape is changing very rapidly right now, with new granting and institutional policies and new publishing business models. Bibliometrics is a very powerful tool, although it requires theory from other disciplines to give it meaning. This is why my professors have pushed me to read into the history of science, economics, communication, law and sociology. When this study runs its course, I hope to be ready for the next big question.

Reference:

(1) Davis, P. (2008) “Open access publishing, article downloads, and citations: randomised controlled trial”, BMJ, Vol. 337, a568; doi:10.1136/bmj.a568


Open access has been touted as the future of scientific publishing, claiming benefits such as wider readership and, crucially, significantly higher citation rates. However, research carried out by Phil Davis at Cornell University suggests that the manner of publication may have very little to do with citations. He discusses his latest research (1).

Research Trends (RT): Your methodology is pretty unique for doing citation analysis. How did you decide on a randomized controlled trial?

Phil Davis (PD): Previous studies that measured the citation advantage were all based on observational methodologies. Essentially, researchers counted citations to open-access articles and compared them to subscription-access articles. This is a very weak methodology, as it ignores factors other than access that lead to a citation. It also ignores the direction of causality.

The only way to adequately control for confounding explanations and to rule out the possibility of reverse directionality was to set up a proper scientific trial. By randomizing which articles were given the open-access “treatment” we could effectively control for other possible causes and focus entirely on the effect of access on readership and citations. This methodology makes our study much more rigorous than other observational studies that were done in the past.

RT: How did you get publishers to participate in your study?

PD: It was much easier than I expected! I focused on recruiting scientific societies, since I knew they had an interest in the outcome of the study. Ultimately, their participation depended on trust: they trusted that I would conduct a rigorous, scientific study and that I was going to be fair and objective in reporting the results. All but one publisher gave me access to their online publishing system so that I could manipulate the access conditions without their involvement, thus minimizing potential publisher influence and bias. Every publisher gave me full access to their statistical reporting systems. This says a lot about the integrity of these people and their dedication to the scientific process.

RT: You found evidence that open access increases readership but not citations. What does this mean?

PD: A large open access “citation advantage” would suggest that the subscription model is doing a very poor job of disseminating information to the research community. The fact that we were unable to detect a citation difference suggests that the subscription model is operating efficiently, at least for authors. Yet, the research community is not the only group that reads the scientific literature. We were able to document a large increase in full-text article downloads and a smaller, but significant, increase in PDF downloads and unique visitors to the journal websites. This suggests that open-access publishing may reach a wider readership community, although this may not translate into more citations.

RT: Who are these additional readers of the scientific literature?

PD: It is difficult to say from our data. We know that they are accessing the literature from outside subscriber IP addresses. But we don’t know who they are, nor do we know their intention. They could be people like my dad – who had triple bypass heart surgery – typing a search query into Google and landing on an article published by the American Heart Association. They could be teachers, students, physicians or journalists, or just interested people trying to learn from the primary literature. The research field is wide open on answering this question.

RT: Why is measuring a citation difference so important in making the case for open access?

PD: Most scientists view citations as a form of reward, and thus an incentive, for where and how they publish. The potential of getting a 50–250% return in expected citations by publishing in an open-access journal or by making your articles freely available from an institutional archive has been used repeatedly as an argument to change the behavior of scientists. There are many other good reasons for making one’s results widely available – a citation advantage, however, does not appear to be one of them.

RT: You take issue with the phrase “open access”. Why?

PD: “Open access” assumes a dissemination model in which information only flows from the publisher to the reader. It’s a model that completely ignores the high degree of sharing of articles that takes place within informal networks of authors, readers and libraries. I’m very privileged to belong to an institution with such rich access to the literature, and yet I still depend on my peers for copies of research articles and manuscripts. Secondly, “open access” implies a right to information; I much prefer “free access”, which implies a privilege.

RT: Some have criticized you for reporting too early on your study. What is your response?

PD: Our first article, reporting initial results within the first year after publication (1) was indeed published early in the study. We felt confident that the main results wouldn’t change over time, and they haven’t. After two years, we are yet to detect a difference in the citations to the open-access articles compared to the control articles. Remember that other studies had reported huge differences after very short periods of time, some within the first few months after publication. I was confident that if we didn’t see a difference within the first year, we were unlikely to see a difference in the future. I’m glad we made the decision to publish early. Similar findings from other journals in the sciences, medicine, social sciences and humanities will be coming out in the next few years.

RT: The scholarly publishing field is changing very rapidly. How relevant will your study be, in say, five years?

PD: I imagine that the main results of our study will largely be moot in another five years. The information landscape is changing very rapidly right now, with new granting and institutional policies and new publishing business models. Bibliometrics is a very powerful tool, although it requires theory from other disciplines to give it meaning. This is why my professors have pushed me to read into the history of science, economics, communication, law and sociology. When this study runs its course, I hope to be ready for the next big question.

Reference:

(1) Davis, P. (2008) ‘Open access publishing, article downloads, and citations: randomised controlled trial’, BMJ, 337:a568; doi:10.1136/bmj.a568


What’s leading the curve: research or policy?

Stem cell research often attracts headlines due to the controversial nature of human embryonic stem cell research, and most countries have strict rules governing what can and cannot be done with public funding in this field. Research Trends investigates the relationship between policy changes and publication rates in recent years.

Read more >


Stem cells are characterized by the ability to renew themselves and differentiate into a diverse range of specialized cell types. Therefore, stem cell research opens the possibility of new technologies with great therapeutic potential for many medical conditions. The two main types of mammalian stem cells are adult stem cells, which are found in adult tissues, and embryonic stem cells, which are isolated from the inner cell mass of early embryos. The latter are at the center of a moral and ethical debate on research involving the creation, use and destruction of human embryonic stem cells.

Stem cell research is an exciting but controversial field and has been the subject of intense debate in recent years. Despite the controversy, the field has sustained strong growth over the past decade (~11%). National policies on stem cell research have been evolving with the debate and influencing research outputs, as can be seen in the following overview of the most prolific countries during the past decade.

Stem cell research in the USA, Japan, Germany, the UK and France may be affected by national policy.

France: ranked 5

Since the 1994 bioethics laws, embryo research has in principle been forbidden in France. However, these laws are re-evaluated every five years, and the bioethics law of August 6, 2004, allowed research with a therapeutic aim to be carried out under strictly controlled conditions. Research is only permitted on embryos created as a by-product of IVF, and only with the parents’ agreement that the surplus embryos can be used for research purposes. Given that research manuscripts can take up to a year to appear in a journal, this relaxation of the law may be reflected in a slight increase in the number of articles on this subject published by French authors in 2005. The law is due to be re-examined at the end of the year, so the future of stem cell research in France remains uncertain.

UK: ranked 4

The UK has a well-established system for regulating the creation and use of embryos: the Human Fertilisation and Embryology Act of 1990, which established the Human Fertilisation and Embryology Authority (HFEA). The act allows the creation and use of embryos for research, provided that the research is for one of five specified purposes and has been granted a license by the HFEA. The UK Stem Cell Initiative, launched in March 2005, aims to define a 10-year vision for UK stem cell research and to coordinate its public and private funding. This initiative coincides with a modest increase in the number of articles published by UK authors in the following years.

Germany: ranked 3

Any creation of human stem cells that involves the use of embryos is prohibited in Germany by the Embryo Protection Law of January 1, 1991. However, this law does not cover the importation of stem cell lines produced from human embryos in other countries, or their use in Germany for research purposes. In response, the Bundestag adopted the Stem Cell Law on July 1, 2002. In principle, this law prohibits the importation and use of human embryonic stem cells, except for research under exceptional conditions. Germany’s peak in output in 2001 could be interpreted as an anticipatory reaction to the impending 2002 Stem Cell Law. In May 2007, the Stem Cell Law was discussed by the German Parliament’s (Bundestag) Committee on Education, Research and Appraisal of the Consequences of Technology, which could suggest a need for reform.

Japan: ranked 2

Japan’s law on Human Cloning Techniques and Other Similar Techniques has been in force since June 6, 2001. This law specifically bans reproductive cloning and calls for national guidelines on the creation of “Specified Embryos” for research purposes. On July 23, 2004, Japan’s Council for Science and Technology Policy, the government’s highest science and technology policy body, approved the final report of its Bioethics Expert Panel on human embryo and stem cell research. The report recommended allowing the creation of human embryos for stem cell research. Japan’s relatively steady increase in output could reflect the absence of any drastic new legislation in recent years.

USA: ranked 1

On August 9, 2001, President George Bush allowed taxpayer funding of stem cell research, but only under strict limits. The USA’s peak in output in 2001 could correspond to accelerated publication of publicly funded research at the time of President Bush’s restrictive statement; surprisingly, commentary on the legislation is scarce in the scientific literature of the time. On May 24, 2005, the House of Representatives passed a bill to expand federal financing for embryonic stem cell research, defying a veto threat from then President Bush. On March 9, 2009, President Barack Obama issued Executive Order 13505, Removing Barriers to Responsible Scientific Research Involving Human Stem Cells, revoking President Bush’s statement of August 9, 2001, as well as its supplemental executive order of June 20, 2007, opening the door to new horizons for the future of stem cell research in the US. The continued upswing in research output on stem cells may reflect the fact that many researchers have sought non-federal funding for their research or diversified their efforts into permissible techniques for acquiring human stem cells. Recently, two leading bioethicists have even argued that Bush’s restrictive policy may have inadvertently pushed stem cell research, and thinking about the underlying ethical dilemmas, much further forward (1).

Open future

As the debate on stem cell research evolves, national policies follow: advances in the field raise new ethical issues, the controversy shifts, and new regulations are needed. The ethics of stem cell research remain contested and, despite a recent tendency towards greater legislative permissiveness, the future of this exciting field of research is still to be written.

Reference:
(1) ‘Benefits of the stem cell ban’, The Scientist, available at: www.the-scientist.com/news/display/55752/ (free registration required)



Analyzing the multidisciplinary landscape

Many of our most urgent scientific challenges require multidisciplinary approaches; however, research performance is typically measured on a unidisciplinary basis. Research Trends learns about a new study seeking to measure output in alternative-energy research in a novel way.

Read more >


Many of today’s most pressing scientific challenges, such as identifying alternative energy sources, require a multidisciplinary approach. However, traditional methods for assessing research output cannot adequately measure multidisciplinary research output.

Current methods of organizing, and thus analyzing, science are based on journal categories. Yet, since journals are based on single disciplines, this classification system cannot capture the changing landscape. This means it is impossible for research executives and government policymakers to gain insight into which institutions, countries and regions are leading in such fields as alternative energy.

Leaders in alternative-energy research
Alternative-energy research is, by its very nature, multidisciplinary, and any attempt to identify leaders in this field must take this into account. To rank leaders in alternative-energy research, Boyack and Klavans first identified alternative-energy-related paradigms using search terms from relevant websites. They found that 1,100 paradigms contained alternative-energy research, and divided these into three equally distributed topic groups:

1. Solar/PV
2. Fuel cells
3. Environmentally related (efficiency + renewable + biomass + biodiesel + biofuel + nuclear + wind + cogeneration + clean coal + carbon + bioenergy + security + hydroelectric + geothermal)

They then counted the alternative-energy papers for over 3,000 major academic and government players within the global research community, ranked them according to output and calculated distinctive competencies for each of the top-50 institutions on the list.

To rank the research leaders in this field (see Figure 1), they determined which of the 1,100 paradigms from the three topic groups belonged to a distinctive competency, and counted the number of alternative-energy papers falling within distinctive competencies for each university or laboratory.

This information was aggregated to identify country (see Figures 2, 3 and 4) and regional leaders in alternative-energy research.
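The counting and aggregation steps above can be sketched in a few lines. This is a minimal illustration only: the `rank_by_distinctive_competency` function and the synthetic paper lists are hypothetical stand-ins for Spotlight’s actual paradigm data, with totals chosen to match Figure 2 (solar/PV).

```python
from collections import Counter

def rank_by_distinctive_competency(papers):
    """Aggregate paper-level records to country level.

    `papers` is an iterable of (country, in_dc) tuples, where `in_dc`
    flags whether the paper falls inside one of that country's
    distinctive competencies (DCs).
    """
    totals, in_dc = Counter(), Counter()
    for country, flag in papers:
        totals[country] += 1
        if flag:
            in_dc[country] += 1
    # Rank countries by the share of their output inside DCs.
    return sorted(
        ((c, totals[c], in_dc[c], round(100 * in_dc[c] / totals[c]))
         for c in totals),
        key=lambda row: row[3],
        reverse=True,
    )

# Synthetic paper lists whose totals match Figure 2 (solar/PV):
papers = (
    [("United States", True)] * 454 + [("United States", False)] * 439
    + [("Japan", True)] * 149 + [("Japan", False)] * 306
    + [("Germany", True)] * 335 + [("Germany", False)] * 35
)
for country, total, dc, pct in rank_by_distinctive_competency(papers):
    print(f"{country}: {total} papers, {dc} in DCs ({pct}%)")
```

Ranked by DC share, Germany (91%) tops the list even though the United States publishes almost two and a half times as many solar papers, which is exactly the leadership signal this method is designed to surface.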

Research executives need accurate research performance information to identify areas of research strength and make strategic decisions. Developing an accurate picture of how universities and countries are performing is critical to advancing the frontiers of science.

A new way to measure multidisciplinary impact

Senior Development Advisors Kevin Boyack and Dick Klavans, together with Elsevier, have developed a new method of measuring output in multidisciplinary research. Based on co-citation analysis, SciVal Spotlight displays research performance from an interdisciplinary perspective.

Using Scopus as its underlying data source, SciVal Spotlight draws upon 5.6 million research papers published between 2003 and 2007, along with another two million reference papers that these publications cite heavily. This content was divided into about 80,000 paradigms, each of which is centered on a separate topic (e.g. alternative energy) in science.
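The co-citation counting that underlies this kind of paradigm building can be sketched briefly. The `cocitation_counts` function and the reference lists below are invented for illustration; Spotlight’s actual clustering pipeline over millions of papers is far more elaborate.

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(reference_lists):
    """Count how often each pair of references is cited together.

    Pairs that are frequently co-cited end up in the same cluster
    ('paradigm') when the co-citation graph is partitioned.
    """
    pairs = Counter()
    for refs in reference_lists:
        # Sorting makes each pair order-independent: (a, b) == (b, a).
        for a, b in combinations(sorted(set(refs)), 2):
            pairs[(a, b)] += 1
    return pairs

# Invented reference lists from four citing papers:
citing_papers = [
    ["solar-A", "solar-B", "fuel-X"],
    ["solar-A", "solar-B"],
    ["solar-B", "fuel-X", "fuel-Y"],
    ["fuel-X", "fuel-Y"],
]
counts = cocitation_counts(citing_papers)
print(counts[("solar-A", "solar-B")])  # co-cited twice
print(counts[("fuel-X", "fuel-Y")])   # co-cited twice
```

Here the solar references cluster together and the fuel-cell references cluster together purely from citing behavior, without any journal-category labels, which is the point of the approach.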

These paradigms were used to identify an institution’s distinctive competencies. Researchers tend to focus within a unique set of related paradigms, which form natural clusters based on the research networks at their institution. These clusters can be seen as the institution’s distinctive competencies, and are the areas in which the institution is a research leader. One of the unique features of this method is that it can identify those distinctive competencies that link multiple disciplines within an institution, indicating that research within the university is not being done in isolated silos. If work does not appear as part of a distinctive competency, this does not mean that it is not good work, but rather that it is isolated, and not part of a larger network.

An institute is identified as a research leader if it displays substantial activity and impact in the topics associated with the paradigm.

True leadership is in distinctive competencies

Using this new methodology to measure which institutions, countries and regions are research leaders in alternative energy-related science gave some surprising and insightful results (see box for method).

At the institute level, the top-10 world institutes are almost all in the United States, with Germany a close second (see Figure 1). In fact, the United States is ahead in all of the topic groups on a single-country basis; however, the only area in which it has overwhelming leadership is in environmentally related research. In fuel cells and solar energy, leadership is more diffuse, and Germany and China are significant players in these two fields.

While Germany’s total number of papers remains lower than the United States’, its percentage of papers in distinctive competencies in both solar-energy and fuel-cell research is higher. This indicates that Germany is a formidable competitor in these areas, particularly in solar energy, where it has 335 papers in distinctive competencies compared with 454 for the United States.

Identifying distinctive competencies, rather than simply relying on citation counts, shows where competition could come from in the future. While Germany may not yet be leading the United States in alternative-energy research, it is certainly developing deep expertise in a wide range of disciplines, which could result in breakthroughs in the near future.

If our most urgent scientific challenges, such as alternative energy, require a multidisciplinary approach, then we urgently need to find ways of measuring output in these areas. Future breakthroughs in such areas are expected to emerge from the institutes and countries drawing on the widest range of their research capabilities to answer specific questions. And this methodology helps us see where those breakthroughs are likely to emerge.

Institution Country Total
1 NASA Goddard Space Flight Center US 309
2 National Renewable Energy Laboratory US 271
3 Hahn-Meitner-Institut DE 240
4 Forschungszentrum Jülich DE 234
5 Pennsylvania State University US 168
6 National Oceanic and Atmospheric Administration US 121
7 University of California at Irvine US 101
8 Osaka University JP 97
9 California Institute of Technology US 97
10 Harvard University US 84

Figure 1 – Top-10 institutions for alternative-energy research

Country Total papers Papers in DCs % in DCs
United States 893 454 51%
Japan 455 149 33%
Germany 370 335 91%

Figure 2 – Top-three countries for solar/photovoltaic research

Country Total papers Papers in DCs % in DCs
United States 1006 377 38%
China 574 157 27%
Japan 531 94 18%

Figure 3 – Top-three countries for fuel-cells research

Country Total papers Papers in DCs % in DCs
United States 1997 797 40%
China 425 75 18%
Japan 216 0 0%

Figure 4 – Top-three countries for environmentally related energy research

Useful links:

SciVal Spotlight
Research leadership redefined – measuring performance in a multidisciplinary landscape. Listen to the webinar here
USA Today, ‘US institutes lead in environmental research expertise’



Learning from our mistakes

Critics have long held that only positive results (where the outcome fits the hypothesis) are published in journals. However, science has always progressed by learning from its mistakes as well as its successes. Research Trends investigates the impact of negative findings.

Read more >


Human discovery, scientific and otherwise, has always moved forward in response to the positive and negative outcomes of our experiences. The experimental nature of scientific research, based on the testing of hypotheses, implies a distinct possibility of negative results. The very essence of science is based on using both positive and negative results as steps along the continuum.

Medical and scientific theories are developed over time as new research challenges and builds upon received wisdom. For instance, medical research overturned the assumption that conditions like scurvy and beri-beri were caused by infection, finding that they are actually symptoms of vitamin deficiency due to malnutrition.

However, there is a growing feeling in the research community that publishing negative results, despite their scientific value, can be damaging, and many are choosing not to submit such findings to journals.

Publishing negative results

Much research does result in negative findings, and these are rarely published. However, prior knowledge that a particular hypothesis or experiment leads to a negative result could help other researchers modify their experiments, or save them the time of duplicating the failed work. In an article in Nature, Jonathan Knight asked whether scientific progress is being hampered in some areas by this practice (1).

William F. Balistreri, MD, Editor-in-Chief of The Journal of Pediatrics, says: “We agree with the International Committee of Medical Journal Editors (ICMJE). They have made a clear statement regarding the obligation to publish negative studies: ‘Editors should consider seriously for publication any carefully done study of an important question, relevant to their readers, whether the results for the primary or any additional outcome are statistically significant. Failure to submit or publish findings because of lack of statistical significance is an important cause of publication bias.’

“The Journal of Pediatrics serves as a practical guide for the continuing education of physicians who diagnose and treat disorders in infants, children and adolescents. We seek original work, which undergoes peer-reviewed scrutiny overseen by the Editorial Board, and have accepted articles that clearly documented a lack of efficacy of therapeutic agents or procedures. We believe that evidence-based medicine must be based on the best evidence.”

In an attempt to encourage researchers to publish negative results, BMC launched the Journal of Negative Results in BioMedicine in 2002. This journal publishes research that covers: “aspects of unexpected, controversial, provocative and/or negative results/conclusions in the context of current tenets, providing scientists and physicians with responsible and balanced information to support informed experimental and clinical decisions.”

Spectacular blunder
Polywater was initially described in 1962 as a new form of water generated from regular water inside glass capillaries. Polywater was believed to have different properties from normal water, including a significantly higher boiling point (reportedly three times that of water) and higher viscosity. This led to considerable research over several years until it was eventually confirmed that polywater was actually normal water containing impurities so concentrated that they significantly affected the properties of their solvent, i.e. water. Polywater stands as a rather large negative result, and World Records in Chemistry has described it as a “spectacular blunder” (5).

The polywater effect

The effects of negative results and wide-scale research failures have also caught the attention of the scientometric community. The polywater (see box) research front has been analyzed both bibliometrically and econometrically to assess its impacts on citation activity and economics.

In two papers published in Scientometrics, Eric Ackermann followed the progression of polywater research, demonstrating that seminal papers published in 1962 led to an “information epidemic” that proliferated through the literature and peaked in 1970 with over 100 articles (2, 3). Ackermann found 445 papers on polywater between 1962 and 1974. The research penetrated numerous disciplines, with 85% of papers appearing in five subject fields: nuclear science and technology, physics, multidisciplinary science, electro-chemistry and analytical chemistry.

Ackerman’s findings show how rapidly a new research front can spread and how readily researchers alter their own direction in the light of seminal papers, regardless of whether the research carried out turns out to be true or not.

References:
(1) Knight, J. (2003) ‘Null and Void’, Nature, 422 (6932), pp. 554–555
(2) Ackermann, E. (2005) ‘Bibliometrics of a controversial scientific literature: Polywater research, 1962–1974’, Scientometrics, 63 (2) pp. 189–208
(3) Ackermann, E. (2006) ‘Indicators of failed information epidemics in the scientific journal literature: A publication analysis of Polywater and Cold Nuclear Fusion’, Scientometrics, 66 (3), pp. 451–466
(4) Diamond, A.M. (2009) ‘The career consequences of a mistaken research project – the case of polywater’, American Journal of Economics & Sociology, 68 (2), pp. 387–411
(5) Quadbeck-Seeger, H-J. (Ed.); Faust, R.; Knaus, G.; Siemeling, U. (1999) World Records in Chemistry. New York: Wiley-VCH

VN:F [1.9.22_1171]
Rating: 0.0/10 (0 votes cast)

Human discovery, scientific and otherwise, has always moved forward in response to both the positive and negative outcomes of our experiences. The experimental nature of scientific research, based on the testing of hypotheses, implies a distinct possibility that our experiments will yield negative results. The very essence of science lies in using both positive and negative results as steps along a continuum.

Medical and scientific theories are developed over time as new research challenges and builds upon received wisdom. For instance, medical research overturned the assumption that conditions such as scurvy and beri-beri are caused by infection, showing that they are in fact symptoms of vitamin deficiency resulting from malnutrition.

However, there is a growing feeling in the research community that publishing negative results, despite their scientific value, can be damaging, and many are choosing not to submit such findings to journals.

Publishing negative results

Much research does produce negative findings, yet these are rarely published. However, knowing in advance that a particular hypothesis or experiment leads to a negative result could help other researchers modify their experiments or avoid wasting time reproducing it. In an article in Nature, Jonathan Knight asked whether scientific progress in some areas is being hampered by this practice (1).

William F. Balistreri, MD, Editor-in-Chief of The Journal of Pediatrics, says: “We agree with the International Committee of Medical Journal Editors (ICMJE). They have made a clear statement regarding the obligation to publish negative studies: ‘Editors should consider seriously for publication any carefully done study of an important question, relevant to their readers, whether the results for the primary or any additional outcome are statistically significant. Failure to submit or publish findings because of lack of statistical significance is an important cause of publication bias.’

The Journal of Pediatrics serves as a practical guide for the continuing education of physicians who diagnose and treat disorders in infants, children and adolescents. We seek original work, which undergoes peer-reviewed scrutiny overseen by the Editorial Board, and have accepted articles that clearly documented a lack of efficacy of therapeutic agents or procedures. We believe that evidence-based medicine must be based on the best evidence.”

In an attempt to encourage researchers to publish negative results, BMC launched the Journal of Negative Results in BioMedicine in 2002. This journal publishes research that covers: “aspects of unexpected, controversial, provocative and/or negative results/conclusions in the context of current tenets, providing scientists and physicians with responsible and balanced information to support informed experimental and clinical decisions.”

Spectacular blunder
Polywater was first described in 1962 as a new form of water generated from ordinary water inside glass capillaries. It was believed to have different properties from normal water, including a significantly higher boiling point (reported as three times that of water) and much higher viscosity. This prompted considerable research over several years until it was eventually confirmed that polywater was simply ordinary water containing impurities so concentrated that they significantly altered the properties of their solvent. Polywater thus represents a negative result on a grand scale, and World Records in Chemistry has described it as a “spectacular blunder” (5).

The polywater effect

The effects of negative results and wide-scale research failures have also caught the attention of the scientometric community. The polywater (see box) research front has been analyzed both bibliometrically and econometrically to assess its impacts on citation activity and economics.

In two papers published in Scientometrics, Eric Ackermann followed the progression of polywater research, demonstrating that seminal papers published in 1962 led to an “information epidemic” that proliferated through the literature and peaked in 1970 with over 100 articles (2, 3). Ackermann found 445 papers on polywater between 1962 and 1974. The research penetrated numerous disciplines, with 85% of papers appearing in five subject fields: nuclear science and technology, physics, multidisciplinary science, electro-chemistry and analytical chemistry.
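As a rough sketch of the publication-count analysis behind such a study, the toy script below tallies papers per year and picks out the peak year. The years used here are invented for illustration only; they are not Ackermann’s actual data.

```python
from collections import Counter

# Hypothetical publication years standing in for a polywater-style data set;
# the real analysis covered 445 papers published between 1962 and 1974.
years = [1962, 1966, 1968, 1969, 1970, 1970, 1970, 1971, 1972, 1974]

counts = Counter(years)  # papers per year
peak_year, peak_count = max(counts.items(), key=lambda kv: kv[1])

print(peak_year, peak_count)  # the year with the most papers, and how many
```

The same tally, plotted over time, is what reveals the rise-and-fall shape of an “information epidemic”.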

Ackermann’s findings show how rapidly a new research front can spread, and how readily researchers change their own direction in the light of seminal papers, regardless of whether the research in question ultimately proves valid.

References:
(1) Knight, J. (2003) ‘Null and Void’, Nature, 422 (6932), pp. 554–555
(2) Ackermann, E. (2005) ‘Bibliometrics of a controversial scientific literature: Polywater research, 1962–1974’, Scientometrics, 63 (2), pp. 189–208
(3) Ackermann, E. (2006) ‘Indicators of failed information epidemics in the scientific journal literature: A publication analysis of Polywater and Cold Nuclear Fusion’, Scientometrics, 66 (3), pp. 451–466
(4) Diamond, A.M. (2009) ‘The career consequences of a mistaken research project – the case of polywater’, American Journal of Economics & Sociology, 68 (2), pp. 387–411
(5) Quadbeck-Seeger, H-J. (Ed.); Faust, R.; Knaus, G.; Siemeling, U. (1999) World Records in Chemistry. New York: Wiley-VCH


…So many papers outside your field?

Computer science is one field that is displaying great leaps towards multidisciplinarity. We ask a computer scientist why he cited outside his subject area.

Read more >


In this issue of Research Trends, we have analyzed the multidisciplinary nature of research and developments. One area that is becoming more multidisciplinary over time is computer science.

Fionn Murtagh

A good example is Professor Fionn Murtagh’s recent paper, “The structure of narrative: The case of film scripts” in Pattern Recognition, cited in Nature (1). Murtagh is from the Computer Science Department at the University of London. His paper is clearly multidisciplinary, citing many papers from linguistics. Murtagh adds: “it also strongly cites media arts and digital humanities, mathematics and statistics.”

Following a theme

One of the linguistics papers referred to is a paper by Yves Bestgen (2). Murtagh says: “We cited the Bestgen paper due to its content – readying input data for analysis of discourse (in the case of that particular author) and analysis of the particular narrative form provided by a filmscript (in the case of our paper). But I paid no interest whatsoever to whether this paper was categorized as linguistics or otherwise. The way I work is to pursue themes that I think are (very) important, find supporting data, perform extensive evaluation and write all that up.

“Then, if I am convinced at that point that it is presentable, I start thinking of an appropriate journal. I publish, or have published, regularly in journals that are categorized as computer science, statistics, mathematics, physics, astronomy, geology, and other areas.”

Murtagh goes on to explain that in this instance they chose a computer science journal rather than a linguistics journal because “I always seek the most appropriate journal, irrespective of area. I have published in Pattern Recognition before, my first being in 1984, and it is high on my list of ‘personal best’ journals. I am also mindful of discipline-specific evaluations at national and other levels, which can have career implications. I therefore ensure that I have sufficient publications in any given area when I think this is necessary.”

Breaking boundaries

On the topic of multidisciplinarity in general, he says: “I personally have research interests overlapping many fields. My personal aspiration is to always pursue my interests, irrespective of the labels applied to the fields or journals. I would suggest that the core of what computer science is all about is ‘computational thinking’. This is applicable to all disciplines and beyond – to humanities, and to governance and management too.

“However, I do admit that career structures in particular mitigate strongly against cross-disciplinarity. In universities you are in a particular discipline and your performance in all aspects, including research, is evaluated in accordance with the discipline you are in. No one ever said that life is easy!”

References:

(1) Merali, Z. (2008) “Here’s looking at you, kid”, Nature, Vol. 453, p. 708.

(2) Bestgen, Y. (1998) “Segmentation markers as trace and signal of discourse structure”, Journal of Pragmatics, Vol. 29, pp. 753–763.


Promoting innovation in Italy

The European paradox, whereby Europe trails the United States in its ability to transfer academic knowledge to industry, is mirrored in Italy, which is falling behind the major European countries. We speak to Giovanni Abramo and Ciriaco Andrea D’Angelo, who believe that bibliometrics could be part of the solution.

Read more >


In 2000, the Lisbon Agenda, which aims to increase European competitiveness, identified numerous areas for improvement. One of its key recommendations was that governments should invest in public research as a source of innovation for industry.

Nine years on, the question remains: has this goal been achieved? While European governments have increased funding, are they also getting return on this investment?

Ciriaco Andrea D’Angelo

For Giovanni Abramo of the Italian National Research Council and Ciriaco Andrea D’Angelo, both based at the University of Rome “Tor Vergata”, this has not been the case in Italy. “Our ability to transfer knowledge to industry is very weak. Where Europe lags behind the US, Italy lags behind the major European countries.”

Knowledge transfer is Abramo’s area of expertise. Together with D’Angelo, he has built a database containing information on the research output of all Italian researchers. A free query on any subject of interest returns a ranking of relevant experts, based on their productivity and the quality of their output. This effort led them to bibliometrics and research assessment.

Giovanni Abramo

Meanwhile, Italy launched its first research-evaluation exercise, the triennial VTR, covering the period 2001–2003. The VTR assesses a sample of each institution’s research for ranking and funding purposes and is based entirely on peer assessment. Abramo and D’Angelo used this exercise as the springboard for their research, investigating whether bibliometrics can deliver results comparable to peer assessment and, if so, whether it could be used to support peer review in general (1).

Towards better and more complete assessment

Abramo and D’Angelo particularly wanted to explore whether bibliometrics could address some of peer assessment’s limitations. Abramo says: “Two major shortcomings of peer reviews are, first, that they can only be carried out on a sample of an institution’s research output. This means it cannot measure productivity. Second, it relies on research institutes being able to select their own best outputs.”

To begin, they tested for correlations between the results of the VTR and a bibliometric analysis of the same data set. “We started by assessing all papers submitted for review and then compared the quality rankings – both methods gave the same results. This means that bibliometrics can be used to support peer review when assessing the hard sciences, thus avoiding peer review’s shortcomings, while also offering the advantages of time and cost efficiencies.”
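The comparison described here can be illustrated with a rank correlation. The sketch below computes Spearman’s rho between a hypothetical peer-review ranking and a hypothetical bibliometric ranking of the same papers; both rankings are invented, and a value near 1 indicates that the two assessment methods broadly agree.

```python
def spearman(rank_a, rank_b):
    """Spearman rank correlation for two complete rankings without ties."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n * n - 1))

peer_ranks = [1, 2, 3, 4, 5]    # hypothetical peer-review quality ranking
metric_ranks = [1, 3, 2, 4, 5]  # hypothetical bibliometric ranking

rho = spearman(peer_ranks, metric_ranks)
print(rho)  # close to 1 -> the two methods largely agree
```

In practice, a tie-aware implementation such as scipy.stats.spearmanr would be used on real data; the closed-form formula above only holds when neither ranking contains ties.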

Abramo also says that relying on universities to select their own best work is a dangerous practice: according to his research, many universities are inaccurate when selecting their ‘best’ publications for review. Taking their bibliometric analysis of Italian research output as a starting point, they found that some areas were particularly weak, with certain universities submitting publications of far lower quality than the median of their own portfolio. In mathematics, for instance, around a quarter of submitted papers had a quality ranking below the median. “This suggests that the universities themselves cannot assess their own value. And if the national assessment is based on what they submit for review, this means the national assessment is meaningless,” he adds.

He suggests that bibliometrics could help at both the selection and submission level within the university – helping them identify their best work – and at the national assessment level. In this way, bibliometric data can help both at the beginning and the end of any research-assessment exercise.

Abramo and D’Angelo hope that Italy will move towards more metric-based assessment in the future. They believe that it is the only way to help Italy improve its ability to allocate scarce public resources more efficiently. “It is also important to consider the transfer of knowledge to government, not just to industry. Our policymakers should be using the output of research that they are actually funding,” says Abramo.

Encouraging collaboration

In another paper, “University-industry collaboration in Italy: A bibliometric examination”, Abramo and D’Angelo explored where collaborations between universities and industry occur most frequently and how collaboration with industry affects a researcher’s reputation (2).

They discovered that in terms of sheer numbers, most collaboration occurs in the fields of medicine and chemistry. However, the highest concentrations of university-industry co-authored papers are found in information technology and engineering. “This reflects the industries in which Italy is strong,” explains Abramo.

More interesting was their analysis of whether collaboration positively affects quality of output. Their research suggests that it does when academics collaborate with colleagues at other universities or public research institutions, but not when industrial partners are involved. They also studied the motivations for university-industry collaborations: while industry seeks new applications and patents, universities want to publish research results. However, prestigious journals are less inclined to publish this kind of applied research, which means academics must forgo high-impact publications. So what is in it for them?

According to Abramo: “The incentive for universities is simple: they need the cash to fund research. For academics, it is a tradeoff: they get their funding, but for less prestigious research. They can then do more of the kind of basic research that gets published in high-impact journals.”

In a subsequent investigation, Abramo and D’Angelo found that the way companies select university partners is far from efficient. Even allowing for the effect of geographic proximity, in 65% of cases companies could have selected a closer academic partner with stronger scientific performance than the one actually chosen. The bibliometric database set up by Abramo and D’Angelo can help companies identify the best experts.
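The inefficiency test described here can be sketched as a dominance check: a chosen partner counts as an inefficient choice if some alternative was both closer and scientifically stronger. All numbers below are invented for illustration and do not come from Abramo and D’Angelo’s data.

```python
def dominated(chosen, alternatives):
    """True if some alternative is both closer and higher-performing."""
    dist, perf = chosen
    return any(d < dist and p > perf for d, p in alternatives)

# Each case: the chosen partner and the available alternatives,
# as (distance_km, performance_score) pairs. Values are made up.
cases = [
    ((120, 0.6), [(40, 0.9), (300, 0.5)]),   # a closer, stronger partner existed
    ((30, 0.8),  [(200, 0.9), (150, 0.4)]),  # the chosen partner was not dominated
]

share = sum(dominated(c, alts) for c, alts in cases) / len(cases)
print(share)  # fraction of inefficient choices in this toy data
```

Run over a real data set of collaborations, a share of this kind is what underlies the 65% figure quoted above.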

Collaboration is key to innovation

Abramo and D’Angelo believe that increasing industry-university collaboration is essential if Italy is to achieve its potential: “I cannot understand why governments are prepared to invest so much in research, only to ignore its results,” says Abramo.

He adds that according to the results of a study they have just completed, bibliometrics can not only support peer review in assessing research efficiency, it can also help in evaluating how universities perform in collaborations with industry (3).

Abramo believes that increased options are the solution. “For me, the ability to better assess public research institutes on a wide range of criteria means we now have the tools to stimulate much better research and technology transfer efficiency than ever before.”

Useful links:
Osservatorio Ricerca Pubblica Italiana (Interface in Italian; queries in English)
Laboratory for Studies on Research and Technology Transfer, University of Rome “Tor Vergata”
IREG-4 Conference
NCURA Magazine

References:

(1) Abramo, G., D’Angelo, C.A. and Capasecca, A. (2009) “Allocative efficiency in public research funding: Can bibliometrics help?”, Research Policy, Vol. 38, pp. 206–215.
(2) Abramo, G. et al. (2009) “University-industry collaboration in Italy: A bibliometric examination”, Technovation, doi:10.1016/j.technovation.2008.11.003
(3) Abramo, G. et al. (2009) “Assessing the performance of universities in research collaboration with industry”, working paper available pre-publication in English and Italian at: Laboratory for Studies on Research and Technology Transfer, University of Rome “Tor Vergata”
  • Elsevier has recently launched the International Center for the Study of Research (ICSR) to help create a more transparent approach to research assessment. Its mission is to encourage the examination of research using an array of metrics and a variety of qualitative and quantitative methods.