BEE-L Archives

Informed Discussion of Beekeeping Issues and Bee Biology

BEE-L@COMMUNITY.LSOFT.COM

Subject:
From:
Jerry Bromenshenk <[log in to unmask]>
Reply To:
Informed Discussion of Beekeeping Issues and Bee Biology <[log in to unmask]>
Date:
Sun, 16 Dec 2018 19:08:22 +0000
>In the medical field ... peer review system as 'mutual back scratching.'<   I can't speak to the medical field.

I can say, from years of experience publishing and reviewing articles for mainline science journals, that anyone who thinks peer review is 'back-scratching' has neither been subjected to peer review nor served as a peer reviewer.   In fact, I remember my major Ph.D. thesis adviser warning me of 'turf protection' by reviewers.

A good editor's responsibility is to find 3-5 reviewers who have the expertise to properly review a paper.  The same applies to a panel chair selecting reviewers for competitive research proposals.

 

In the past month, I've reviewed two research proposals for a federal agency and two research papers for two different journals.  I have no idea who the other reviewers were, nor what they said about any of the proposals or papers.

For research proposals, standard federal agency practice is review by a panel, often a dozen or more members, who first receive and review all of the proposal submissions, then convene to discuss and, working with the panel chair, provide final rankings and reviews.

In my most recent proposal reviews, I served as an external reviewer; my role was advisory to the main panel.  I don't know who was on the main panel, but the Request for Proposals was not aimed at projects involving bees.  The panel was looking for Innovative Technologies and received two technology proposals that had potential application to bees.

For submitted manuscripts, usually only the editor knows who the reviewers were and sees the full reviews.  In most cases, the editor forwards a summary of the reviews to the submitter.  Seldom does the submitter receive a copy of the full reviews.  

In all cases, the standard practice is for reviewers to remain anonymous; names are not provided to the submitter.  The editor's or panel chair's summaries serve as an additional confidentiality filter.  A good editor or panel chair provides a summary both to those whose papers or proposals were selected for publication or funding and, equally importantly, to those whose submissions were rejected, explaining why.  In that way, peer review is intended as an educational process.

My career was built on 39 years of competitive research grants for research involving honey bees and 20 years as the State Director for research and educational infrastructure-building grants involving the state's three research campuses.

Rejections are part of the process.  For competitive research projects, rejections are common.  It's what one does after a rejection that counts.  Keep in mind, far more proposals are rejected than are funded, especially at the federal level - NSF, DOE, DOD, USDA, etc.

Manuscript rejections are also common.  Years ago, Science warned that it rejected 90% or more of submitted articles.  As such, I'm glad to say that the one article I've published in Science received the most uniformly positive reviews of any paper in my career.  I also had one rejected as not being of sufficient interest to Science's readers.

Typically, whether for a competitive proposal or an article, reviews tend to vary; seldom do all of the reviewers agree unless the proposal or paper is extraordinarily good or bad.

Virtually all peer-reviews ask each reviewer to provide not only a detailed written review but to also select an overall evaluation, usually something along the lines of:

Accept, Accept with Minor Revisions, Accept with Major Revisions, or Reject.   The onus is on the submitter to take appropriate action.  

All of the above describes standard peer-review practice in science, especially at the federal level.  Describing the process is different from experiencing it.  Overall, far more of the proposals that my research teams and I submitted were accepted and funded than rejected, so I'm way ahead of the game; we more than beat the odds.

But there were stumbles and rejections.  I remember my first proposal for funding Graduate Research Fellows in Energy Science - it was a face-plant.  But a generous colleague, who had one of the successful proposals that year, shared his winning proposal with me.  The next year, Montana had the top-ranked proposal.  The comments from the rejection review, the example of the winning proposal, and a year to work on the revision made the difference.

Manuscript-wise, all of mine have been published, and all but one by my original journal of choice.  The one that was rejected by Science, as not of general interest, was published by PLoS.  Still, other than my original paper in Science, I can't remember any manuscript that didn't require some revisions - whether general housekeeping like a mis-numbered reference or a reference format, clarification of statements, or a rewrite to better meet the orientation of the journal or to put a different focus on the narrative.  I haven't yet met anyone who is perfect and who can satisfy everyone on a committee.

Usually, the revisions improve a paper.  That does not mean that, as an author, I have to blindly agree with any and all comments.  I remember one reviewer who misread isopleth maps from a study of pollution dispersion from a point source, as monitored using honey bee sentinel colonies.  He/she was downright nasty.  At one end of the isopleth was the chimney of a heavy-industry point source.  At the other end, over 60 miles away, was a DOE National Laboratory covering over 1,200 acres.  The isopleth map looked like a fan, with the widest part of the fan over the National Laboratory site, which was known to have several possible sources of various airborne chemical pollutants.  The problem was that the National Laboratory site was being impacted by a spreading plume from the industrial plant's chimney - not that the wide-area assemblage of sources at the National Laboratory was somehow funneling down into the chimney of the industrial source.  Needless to say, the editor and the other reviewers agreed.

All said, I don't take rejection well.  And agencies are famous for sending out rejections late on Friday afternoons.  That's intentional: it gives the submitter(s) time to cool off and removes the editor or panel chair from access until the following week.  Generally, I give myself 24 hours to rant and rave, then another 24 to sulk, and then it's back into the game.

             ***********************************************
The BEE-L mailing list is powered by L-Soft's renowned
LISTSERV(R) list management software.  For more information, go to:
http://www.lsoft.com/LISTSERV-powered.html
