
Should we rethink the way we evaluate research?

Shreya Ghosh


The last few decades have seen a steady increase in research output from India’s rapidly growing pool of academicians and scientists. While this is certainly a positive development for our country, institutional policies, particularly those in place for hiring and evaluating researchers, have struggled to keep up. Current policies suffer from a lack of transparency and consistency, and many researchers have expressed concerns that they impose flawed benchmarks for scientists to meet in order to progress in their careers. In the wake of this, the Indian National Science Academy (INSA) recently released a policy statement that provides specific recommendations on the way basic research should be disseminated and evaluated by the Indian scientific community.

Quantifying research ‘impact’

At present, any assessment of the quality of scientific contributions made by an individual scientist or an organisation often relies on the use of bibliometric standards such as the impact factor of the journals that papers are published in, the total number of publications authored by a scientist, or a myriad of citation-based indexes. “Almost nobody reads what is published,” says Subhash Chandra Lakhotia, INSA fellow and one of the lead authors of the policy statement, “They simply go by impact factor or the name of the journal, which is completely wrong.”

Metrics like impact factors reduce research output to numbers. On one hand, this makes the assessment of a large number of applications quicker and easier for already over-burdened scientists. On the other hand, this often results in a loss of the big-picture understanding of the scientific merits of a certain body of work.

Fears have also been expressed that the need to publish in high-impact factor journals sometimes creates a subtle pressure on scientists to modify the focus and interpretation of their research. “By putting a premium on publication in such journals, we are asking our young researchers to do a certain kind of research and to ensure that that research is published in a certain kind of journal,” says Praveen Chaddah, the other lead author of INSA’s policy statement. “Our interpretation of our research, the way we present it, is now being conditioned by the need to publish in such journals,” he says.

“The reason we are in this situation is because the metrics we have for numerically quantifying research impact are flawed,” says Megha, post-doctoral fellow at the National Center for Biological Sciences (NCBS), Bangalore, adding that prescriptive and restrictive policies tend to mask the diversity of Indian research.

The use of publication metrics as a standard for assessing scientific merit may also extract a heavy toll from young graduate students who are just beginning their careers and already feel the pressure to publish in high-impact journals. “It takes away from that wonder and excitement,” says Rashna Bhandari, Center for DNA Fingerprinting and Diagnostics (CDFD), Hyderabad, “What they are doing primarily needs to be driven by curiosity, by the satisfaction that you get at the end of finding something new, the joy that they will get from sharing it with their peers and with the world.”

The INSA policy statement strongly recommends that research should be evaluated on the basis of “what is published” rather than “where it is published”. Since it may be impossible for hiring, granting and awarding committees to do an in-depth analysis of the scientific merit of every single publication by any particular candidate, the statement suggests that a researcher should be allowed to select their five “best” papers, which may then be classified by an expert committee as being “confirmatory”, “incremental” or “path-breaking” in nature.

The last category of “path-breaking” research has the highest impact on the scientific community at large and should be encouraged, according to Chaddah. “This is the kind of research we really want coming out from our leading institutes, from our best brains,” he says, “Unfortunately, we are not giving rewards for this kind of research.”

Praveen Chaddah (left) and Subhash Chandra Lakhotia (right)

Taking research to the community

Our current policies for evaluating research also have a strong impact on the way researchers choose to disseminate their research. Over-reliance on impact factors and bibliometrics leads scientists to chase high-profile publications in well-known journals, increasing the time required for results to become available to the community at large and increasing the chances of idea-plagiarism and loss of priority on a particular discovery.

To counter this, the INSA policy statement recommends that researchers make use of the several pre-print servers that currently exist, which allow scientists to establish precedence without waiting for the lengthy pre-publication peer review process. Such repositories are publicly accessible and allow peers to discuss and comment on posted content. “Publication process is only the start of the evaluation of the scientific result. The validation process starts only after you have publicly released it within the community,” says Chaddah.

While used quite extensively by mathematicians, physicists and computer scientists, pre-print repositories are yet to see widespread adoption outside these fields. Biologists, in particular, have been slow to make use of bioRxiv, the pre-print server for the life sciences that was launched in 2013. “I think there’s still a lot of wait and watch going on,” says Bhandari.

INSA’s statement also speaks out against the policy that several institutes implement of asking for a researcher’s contributions to be segregated as being published in ‘national’ vs ‘international’ journals – with the implication being that publications in Indian journals are either less reliable or of a lesser value than those in international journals. This discourages scientists from sending their research to Indian journals, starting a vicious cycle which prevents the latter from rising in quality. The present statement strongly discourages such classifications. “Our journals are international journals,” says Bhandari.

Another unfortunate side effect of the pressure to publish has been a rise in the number of predatory journals and conferences which allow scientists to pad their CVs and increase the number of publications “for a fee”. A recent investigative report by The Indian Express revealed the presence of hundreds of such journals in India which charge hefty fees to churn out so-called research papers with minimal editing or reviewing.

“The genesis has to be sought in the fact that for nearly two-three decades, scientists agreed that they will pay open access charges, and processing charges and so on,” says Lakhotia, “For predatory journals and publishers, this was ready ground.”

So, what should young researchers choose to do in such a situation? “Submit the paper to the best journal (i.e. many of your peers read that journal) where you think it can get accepted. Simultaneously, put up your paper on a widely read pre-print archive,” says Chaddah. Lakhotia agrees: “As long as the journal is available publicly in the given discipline and has good policy, they [young scientists] should publish and then instead of worrying about impact factor, they should justify that they did good science.”

INSA’s policy statement was published in the Proceedings of the Indian National Science Academy and is already provoking discussions in several academic circles. “It really helps that INSA has put out this policy document,” says Bhandari, “I think it has been long in the making and really, really needed.”

============================

Do you agree with the policy recommendations put forward by INSA? Let us know in the comments below.

Written By

Shreya Ghosh, Program Manager - Science Communication