Guardian: Researchers, journal editors, and funders have joined together to create the San Francisco Declaration on Research Assessment (DORA). The declaration argues that journal impact factors—rankings of journals determined by the number of citations they receive over a given period—do not necessarily reflect the quality of individual articles or authors and should not be used in hiring and promotion decisions. Impact factors are blamed for driving researchers to avoid potentially groundbreaking work in favor of work considered more publishable. Furthermore, researchers are more likely to favor highly rated journals over those that might be more appropriate for their work. In turn, what gets published, and even what gets submitted, can be affected. Those considerations can encourage participants to try to game the system, which can skew the rankings. DORA is part of a growing effort to increase scientific diversity and to evaluate research, researchers, and journals on their individual merits.
Science: In December 2012, the US Congress expanded the law defining the sanctions against Iran in response to Iran’s continued pursuit of nuclear technology. One of the changes includes preventing all US citizens, regardless of who employs them, from interacting with employees of the Iranian government. US-based journal publishers had already been prevented from publishing research performed by Iranian scientists employed by their government. The updated law now prevents US employees of foreign publishers from editing or reviewing research by employees of the Iranian government. Nongovernmental research can still be handled by US publishers and US citizens, however.
New York Times: The open access movement has been pushing scientific publishers to move away from subscription-based systems for access to academic papers. One side effect of the movement is the rise of journals that give the appearance of being legitimate but lack peer review and use shady business practices. Those practices range from hiding fees from submitting authors to using the names of researchers as supporters without permission. And many of the publications have titles that are very similar to those of well-known journals. The lack of peer review and the willingness to publish anything if the author pays the fee result in the mixing of legitimate research with pseudoscience. That gives pseudoscience a patina of legitimacy, which is damaging when nonexperts are attempting to research a topic. Researchers submitting a paper are often not informed of the cost of publication up front, then are held liable for exorbitant charges once their papers are accepted. Employers who are looking at résumés cannot necessarily distinguish legitimate publications from the mimics. Some people in academia are attempting to catalog the pseudo-journals and their pseudo-conference brethren, but the growth in such entities may be too great to track or limit.
Nature: Hundreds of researchers who thought they were submitting articles to two European science journals have been cheated out of their author fees by criminals who have created counterfeit websites. Neither of the authentic journals—the Archives des Sciences, published by the Society of Physics and Natural History of Geneva, and Wulfenia, published by the Regional Museum of Carinthia in Klagenfurt, Austria—has its own website. The counterfeiters are expert at deception: They have on their websites the journals’ impact factors, postal addresses, international standard serial numbers, and even editorial board listings that include real scientists. Daniel Gamelin, a chemist and materials scientist at the University of Washington in Seattle, and Gerald Cleaver, a high-energy physicist at Baylor University in Waco, Texas, were surprised to learn that they were listed on one of the bogus websites as being on the editorial board of Archives des Sciences. Although the journals are trying to fight the scammers, efforts to shut them down have so far proved ineffective.
Science: On 22 February, the White House Office of Science and Technology Policy (OSTP) released a directive to all federal agencies that spend more than $100 million on scientific research. The new policy requires that the results of all federally funded research be made freely available online within 12 months of a paper’s original publication. The requirement closely resembles the National Institutes of Health’s 2008 decision that all agency research be posted in PubMed Central within 12 months of initial publication. However, the OSTP’s policy does not specify where the agencies will publish the papers or how they will be indexed. Each agency has six months to draft a plan that will determine those details. For the past several years, there has been extensive debate, including a petition to the White House, about whether to make the results of research freely available to the public. The new policy can be seen as a compromise between the position of publishers, who make money from the journals where the papers are originally published, and that of open-access advocates, who want immediate and free access to results.
Science: Although it occurs across all stages of career development, misconduct by scientists is committed more often by men than by women, according to a study published yesterday in the journal mBio. The study was conducted by three microbiologists, who reviewed all the misconduct reports produced by the US Office of Research Integrity (ORI) since 1994. Nearly all instances of misconduct concerned research in the life sciences. Of the 228 people involved, 149, or 65%, were men. Academic rank also played a part: 32% of those who had committed misconduct were faculty members. The findings could influence interventions, which have traditionally centered on trainees. Principal investigators “are a legitimate target for interventions to improve ethics,” said Ferric Fang of the University of Washington in Seattle, one of the study’s authors. “They also, more than anyone else, create the environment in which science is performed.”
Nature: To facilitate publication of scientific research, a new platform called the Episciences Project is scheduled to be launched this April. The brainchild of Jean-Pierre Demailly, a mathematician at the University of Grenoble in France, the project will consist of a series of free, open-access journals, whose articles will be culled from the arXiv preprint server. Each journal will have its own editor and editorial board, which will select the content and organize peer review of the articles. The Episciences platform will be maintained by the Center for Direct Scientific Communication, based in France. The project has the seal of approval of Tim Gowers, a Fields Medal winner and mathematician at the University of Cambridge who last year initiated a boycott of the world’s largest scientific journal publisher, Elsevier, in an effort to reform research publishing.
BBC: A heated debate has arisen over a paper in the New Journal of Physics about a technique that potentially allows for increased data transmission rates via light waves. The paper’s authors claim that extra data can be encoded in light by exploiting its orbital angular momentum modes. (A paper on a similar technique was published recently in Science.) Whether the physicists have done anything new and whether the technique will be effective have been debated in a number of journals. Several electrical engineers have argued that the two modes the researchers used merely re-create a data transmission technique outlined in the 1970s called “multiple input, multiple output.” They also argue that any attempt to increase the number of modes used will fail.
New York Times: Responding to evidence that fraud, plagiarism, and other misconduct explain most biomedical publication retractions, a recent New York Times editorial calls the news “a revealing glimpse of the pressures driving many scientists to improper conduct.” The editors observe that many theories explain “why retractions and fraud have increased.” They report that “a benign view suggests that because journals are now published online and more accessible to a wider audience, it’s easier for experts to spot erroneous or fraudulent papers,” but “a darker view suggests that publish-or-perish pressures in the race to be first with a finding and to place it in a prestigious journal has [sic] driven scientists to make sloppy mistakes or even falsify data.” They conclude, “The solutions are not obvious, but clearly greater vigilance by reviewers and editors is needed.”
Nature: An examination of manuscript flows for bioscience articles reveals that articles published on a second submission attempt are more frequently cited than those accepted on their first submission. Vincent Calcagno of the French Institute for Agricultural Research in Sophia-Antipolis and his colleagues looked at more than 80 000 articles published between 2006 and 2008. One finding reinforces the idea that papers rejected by higher-impact journals generally tend to be published in lower-impact journals. But 75% of papers appear in the journal to which they are first submitted. Regardless of journal impact, the articles that were initially rejected were more highly cited on average within 3–6 years of their publication. Calcagno attributes the citation advantage to the extra round of peer review and the resulting improvements in the articles.