In typically understated fashion, the Canadian funding
scheme for academic science has an answer to excessive
reliance on citation metrics.  

'In your proposal for an individual grant, tell us, briefly,
your 10-year plan and, within that, the project(s) you propose
for your next 5-year individual grant (no convincing
long-term plan = lethally negative marks). Then tell us
your best contributions in the last 5 years (which of course the
reviewers can check, as they see fit, via citation metrics
and other criteria).

'In your proposal for a group strategic grant, tell us how
your proposal relates to public-good science or to private-good
science.' (The criterion tends to be a well-written proposal
with regard to the science and its relation to public-good or
private-good science.)

'In your proposal for a grant in collaboration with industry,
tell us how you will work with your industry partner.'
(The criterion tends to be a well-written proposal with regard
to the science and the cash contribution by industry.)

Your program officer at NSF can relate what they know about 
the degree to which citation rate determines grant outcome. 

David Schneider
Memorial University, St. John's NL Canada




----- Forwarded message from malcolm McCallum <[email protected]>
-----
    Date: Sat, 21 Jan 2012 20:04:50 -0600
    From: malcolm McCallum <[email protected]>
Reply-To: malcolm McCallum <[email protected]>
 Subject: Re: [ECOLOG-L] academic publishers and politics
      To: [email protected]

For people who are interested in the politics of publishing and
citation metrics, the following are really worth reading;
technically, we should all be following this stuff. Although
scientists in general are pretty smart, a huge bunch of us tend to
ignore the continual political undercutting of our profession.
Sometimes I wonder whether we, as a group, are standing up for
ourselves enough!

Lawrence, P.A. 2007. The mismeasurement of science. Current Biology
17(15):R583-R585.
Answer from the hero in Leo Szilard’s 1948 story “The Mark Gable
Foundation” when asked
by a wealthy entrepreneur who believes that science has progressed too
quickly, what he
should do to retard this progress: “You could set up a foundation with
an annual endowment of
thirty million dollars. Research workers in need of funds could apply
for grants, if they could
make a convincing case. Have ten committees, each composed of twelve
scientists, appointed to
pass on these applications. Take the most active scientists out of the
laboratory and make them
members of these committees. ...First of all, the best scientists
would be removed from their
laboratories and kept busy on committees passing on applications for
funds. Secondly
the scientific workers in need of funds would concentrate on problems
which were considered
promising and were pretty certain to lead to publishable results.
...By going after the obvious,
pretty soon science would dry out. Science would become something like
a parlor game.
...There would be fashions. Those  who followed the fashions would get
grants. Those who wouldn’t
would not.”


Todd, P.A., and R.J. Ladle. 2008. Hidden dangers of a "citation
culture." Ethics in Science and Environmental Politics 8:preprint (I
don't have the paginated version). ABSTRACT: The influence of the
journal impact factor and the effect of a ‘citation culture’ on
science and scientists have been discussed extensively (Lawrence 2007;
Curr Biol 17:R583–585). Nevertheless, many still believe that the
number of citations a paper receives provides some measure of its
quality. This belief may be unfounded, however, as there are 2
substantial areas of error that can distort a citation count or any
metric based on a citation count. One is the deliberate manipulation
of the system by scientists trying to ensure the highest possible
number of cites to their papers; this has been examined elsewhere
(Lawrence 2003; Nature 422:259–261). The second area of inaccuracy is
inherent to how papers are cited, indexed and searched for. It is this
latter, lesser known, source of error that we will investigate here.

Campbell, P. 2008. Escape from the impact factor. Ethics in Science
and Environmental Politics 8:5-7.
ABSTRACT: As Editor-in-Chief of the journal Nature, I am concerned by
the tendency within academic
administrations to focus on a journal’s impact factor when judging the
worth of scientific contributions
by researchers, affecting promotions, recruitment and, in some
countries, financial bonuses
for each paper. Our own internal research demonstrates how a high
journal impact factor can be the
skewed result of many citations of a few papers rather than the
average level of the majority, reducing
its value as an objective measure of an individual paper. Proposed
alternative indices have their
own drawbacks. Many researchers say that their important work has been
published in low-impact
journals. Focusing on the citations of individual papers is a more
reliable indicator of an individual’s
impact. A positive development is the increasing ability to track the
contributions of individuals by
means of author-contribution statements and perhaps, in the future,
citability of components of
papers rather than the whole. There are attempts to escape the
hierarchy of high-impact-factor journals
by means of undifferentiated databases of peer-reviewed papers such as
PLoS One. It remains
to be seen whether that model will help outstanding work to rise to
due recognition regardless of
editorial selectivity. Although the current system may be effective at
measuring merit on national and
institutional scales, the most effective and fair analysis of a
person’s contribution derives from a direct
assessment of individual papers, regardless of where they were published.
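
Campbell's point about skew is simple arithmetic: the impact factor is a
mean, so a handful of heavily cited papers can pull it far above what a
typical paper in the journal receives. A minimal sketch, using invented
citation counts, shows how the mean and the median can diverge:

```python
# Hypothetical citation counts for 10 papers in a journal's
# two-year window (invented numbers, for illustration only).
from statistics import mean, median

citations = [0, 1, 1, 2, 2, 3, 3, 4, 120, 250]

impact_factor = mean(citations)    # what the journal reports
typical_paper = median(citations)  # what most papers actually receive

print(f"Impact factor (mean): {impact_factor:.1f}")  # 38.6
print(f"Median citations:     {typical_paper:.1f}")  # 2.5
```

Here two outlier papers give a "high-impact" mean of 38.6 while the
median paper earns fewer than 3 citations, which is exactly the
distortion Campbell describes.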

Weingart, P. 2003. Impact of bibliometrics upon the science system:
inadvertent consequences? Edited version of the keynote address given
at the 2nd conference of the Central Library, Forschungszentrum
Jülich, 5-7 November 2003. In: Bibliometric Analysis in Science and
Research, Schriften des Forschungszentrums Jülich Vol. 11, 2003,
pp. 7-19.

Lawrence, P.A. 2008. Lost in publication: How measurement harms
science. Ethics in Science and Environmental Politics 8:preprint.
ABSTRACT: Measurement of scientific productivity is difficult. The
measures used (impact factor of
the journal, citations to the paper being measured) are crude. But
these measures are now so universally
adopted that they determine most things that matter: tenure or
unemployment, a postdoctoral
grant or none, success or failure. As a result, scientists have been
forced to downgrade their primary
aim from making discoveries to publishing as many papers as
possible—and trying to work them into
high impact factor journals. Consequently, scientific behaviour has
become distorted and the utility,
quality and objectivity of articles has deteriorated. Changes to the
way scientists are assessed are
urgently needed, and I suggest some here.

Ricker, M., H.M. Hernandez and D.C. Daly. 2009. Measuring scientists'
performance: a view from organismal biologists. Interciencia
34(11):830-835. Increasingly, academic evaluations quantify
performance in
science by giving higher rank to scientists (as well as journals
and institutions) who publish more articles and have more citations.
In Mexico, for example, a centralized federal agency uses
such bibliometric statistics for evaluating the performance of all
Mexican scientists. In this article we caution against using this
form of evaluation as an almost exclusive tool of measuring and
comparing scientists’ performance. We argue that from an economic
viewpoint, maximizing the number of journal articles and
their citations does not necessarily correspond to the preferences
and needs of society. The traditional peer review process
is much better suited for that purpose, and we propose "rule-based
peer review" for evaluating a large number of scientists.

Cassey, P., and T.M. Blackburn. 2004. Publication and rejection among
successful ecologists. BioScience 54(3):234-239. Scientific rejection
is a frequent part of the publication process that is rarely
explicitly discussed. Peer review is an essential and well-established
part of the scientific method. But to what degree is manuscript
rejection indicative of scientific inadequacy? Here we quantify the
extent to which
a sample of scientists with successful publication careers in our
discipline, ecology, have experienced manuscript rejection. We show
that publication success and manuscript rejection are definitely not
exclusive.
Notably, we find that the ecologists with the highest number of
publications also
suffered the largest proportion of manuscript rejections. Rejection is
not easy even for the most successfully publishing ecologists;
however, manuscript
rejection does not seem to have deterred our respondents or to have
hampered their career advancement. We hope that our results will
encourage ecologists (and particularly research students) to continue
submitting their studies for publication.

Leimu, R. and J. Koricheva. 2005. What determines the citation
frequency of ecological papers? Trends in Ecology and Evolution
20(1):28-32. Citation frequencies of scientific articles are
increasingly
used for academic evaluation in various disciplines,
including ecology. However, the factors affecting citation
rates have not been extensively studied. Here, we
examine the association between the citation frequency
of ecological articles and various characteristics of
journals, articles and authors. Our analysis shows that
the annual citation rates of ecological papers are
affected by the direction of the study outcome with
respect to the hypothesis tested (supportive versus
unsupportive evidence), by article length, by the number
of authors, and by their country and university of
affiliation. These results cast doubt on the validity of
using citation counts as an objective and unbiased tool
for academic evaluation in ecology.

Scarano, F.R. 2008. Why publish? Revista Brasil. Bot. 31:189-194.
ABSTRACT – (Why publish?). This paper forwards an opinion about
authors’ and journals’ motivations for scientific
writing. Personal and institutional motivations are listed and
discussed and, in regard to biodiversity sciences, I propose
that a nationalistic motivation is also pertinent in a
biodiversity-rich country such as Brazil. Curiosity and
competitiveness
should be combined for better results. Finally I discuss
ground-breaking science within a post-modern perspective, and how
the mere act of scientific writing might trigger both scientific and
social revolutions.



On Sat, Jan 21, 2012 at 1:03 PM, David Inouye <[email protected]> wrote:
>>Here is a nice article warning about recent moves by the big for
>>profit scientific publishers.
>>
>>Opinion by British geologist on Research Works Act.
>>http://www.guardian.co.uk/science/2012/jan/16/academic-publishers-enemies-science
>
> "Academic publishers have become the enemies of science"
>
> "The US Research Works Act would allow publishers to line their
> pockets by locking publicly funded research behind paywalls"



-- 
Malcolm L. McCallum
Department of Molecular Biology and Biochemistry
School of Biological Sciences
University of Missouri at Kansas City

Managing Editor,
Herpetological Conservation and Biology

"Peer pressure is designed to contain anyone with a sense of drive" -
Allan Nation

1880's: "There's lots of good fish in the sea"  W.S. Gilbert
1990's:  Many fish stocks depleted due to overfishing, habitat loss,
            and pollution.
2000:  Marine reserves, ecosystem restoration, and pollution reduction
          MAY help restore populations.
2022: Soylent Green is People!

The Seven Blunders of the World (Mohandas Gandhi)
Wealth w/o work
Pleasure w/o conscience
Knowledge w/o character
Commerce w/o morality
Science w/o humanity
Worship w/o sacrifice
Politics w/o principle


----- End forwarded message -----




