BEE-L Archives

Informed Discussion of Beekeeping Issues and Bee Biology

BEE-L@COMMUNITY.LSOFT.COM

Subject:
From: Christina Wahl <[log in to unmask]>
Reply-To: Informed Discussion of Beekeeping Issues and Bee Biology <[log in to unmask]>
Date: Sun, 10 Feb 2013 13:49:46 -0500
Regarding this comment, by Jerry:

"Complex issues, not getting any simpler.  One thing Malcolm Sanford, James Tew, and I agree upon - when we started, one was given a quick Beekeeping 101 course, told to throw in some antibiotic in the spring, and collect the honey.

Vastly different in these days with a diverse array of pests, diseases, and chemicals.  Gone are the days when a grower applied one fungicide once during the growing season (which the Johansen and Mayer book concludes is of little risk to bees).  I've now seen 5-7 fungicide applications to a crop, with 4-6 different chemicals, as just another example of the complexity.

So, I believe you, Stan, when you report problems.  It's just that there's no simple answer without some serious, thorough investigation.  Too many variables, all in play, at the same time."

In my first message, I was hoping to start some discussion about mathematical modeling of multiple risks.  It's very complicated, so I thought: how about starting simple and building up to a more sophisticated model?  I'm not a good mathematician (ask any of my co-authors!!), but since both field and lab testing are expensive, time-consuming, and difficult, wouldn't it be worthwhile to at least try to figure out which sorts of experiments are most valuable to do with the resources available?  A good mathematical model would help with that.

So, I started with Haber's rule because it's about as simple as it gets... and simple does not mean useless.  Haber won the Nobel Prize in chemistry; he was no slouch.  The rule is c x t = k, or concentration times exposure time equals a constant for a given effect.  Rough analogy: water pouring into a bucket.  Water = dose, pour time = time, and size of bucket = effect threshold.  The size of the bucket determines the effect... a small bucket = a big effect from a small dose.  A big bucket = a smaller effect (because it takes more water to fill it).  Let's say the "effect" is "death", but it could be any other level of effect one wants to look at.
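To make the rule concrete, here is a minimal sketch in Python of what c x t = k implies: for a fixed effect threshold k (the "size of the bucket"), halving the concentration doubles the exposure time needed to reach the same effect.  All numbers are invented for illustration, not measured toxicity values.

```python
# Haber's rule sketch: c * t = k for a fixed effect.
# k is the "bucket size" - the cumulative exposure needed for the effect.

def time_to_effect(concentration, k):
    """Exposure time needed to reach the effect threshold k (Haber's rule)."""
    return k / concentration

k = 100.0  # illustrative threshold: concentration x time needed for the effect
for c in (1.0, 2.0, 4.0):
    t = time_to_effect(c, k)
    print(f"concentration {c:4.1f} -> time {t:6.1f}  (c*t = {c * t:.0f})")
```

Every row multiplies back to the same constant, which is the whole content of the rule: only the cumulative product matters, not how it is split between dose and duration.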

Refinements are necessary.  The second part of my earlier message was about that... the National Academy of Sciences paper discussed some of this, in their piece evaluating combat risks to U.S. troops exposed to chemical hazards.  They pointed out that there are three ways to scale time in Haber's rule: first, the half-life of the chemical being dosed (analogy: water evaporating as it enters the bucket); second, the rate of metabolism of the chemical (analogy: punch a hole in the bucket); third, the frequency of the dose (analogy: how fast the water is poured into the bucket).
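Those refinements can be sketched as a "leaky bucket": a toy one-compartment model where each dose adds water and first-order elimination (the hole in the bucket, lumping together chemical decay and metabolism) drains it between doses.  The parameter values here are invented purely for illustration; real toxicokinetic parameters would have to come from experiments.

```python
# Leaky-bucket sketch: repeated doses with first-order elimination.
import math

def body_burden(dose, interval, elim_half_life, n_doses):
    """Internal level right after the n-th dose, assuming each dose adds
    `dose` and a fraction decays away during each dosing interval."""
    k = math.log(2) / elim_half_life     # first-order elimination rate
    r = math.exp(-k * interval)          # fraction surviving one interval
    # Geometric series: dose * (1 + r + r^2 + ... + r^(n-1))
    return dose * (1 - r ** n_doses) / (1 - r)

# With elimination, repeated doses accumulate toward a plateau of
# dose / (1 - r) instead of growing without bound as plain c*t would:
for n in (1, 5, 20):
    level = body_burden(dose=1.0, interval=1.0, elim_half_life=2.0, n_doses=n)
    print(f"after {n:2d} doses: {level:.3f}")
```

The qualitative point is that a fast-draining bucket (short half-life, fast metabolism) flattens out quickly, while slow elimination relative to the dosing frequency lets the level ratchet upward - which is exactly why dose frequency has to enter the model alongside dose size.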

Since posting that, I came across a paper that actually appears to consider this!  It is called "Time-dependent species sensitivity distributions" by Fox and Billoir, published in Environmental Toxicology & Chemistry, vol. 32, no. 2, pp. 378-383.  The abstract reads: "Time is a central component of toxicity assessments. However, current ecotoxicological practice marginalizes time in concentration-response (C-R) modeling and species sensitivity distribution (SSD) analyses. For C-R models, time is invariably fixed, and toxicity measures are estimated from a function fitted to the data at that time. The estimated toxicity measures are used as inputs to the SSD modeling phase, which similarly avoids explicit recognition of the temporal component. The present study extends some commonly employed probability models for SSDs to derive theoretical results that characterize the time-dependent nature of hazardous concentration (HCx) values. The authors' results show that even from very simple assumptions, more complex patterns in the SSD time dependency can be revealed."  So this is a step in an interesting direction.  They present a lot of math, then a 3-D graph showing how concentration, time, and cumulative frequency interact for different species sensitivities.
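To see what a time-dependent HCx looks like in miniature, here is a toy sketch (my own construction, not the Fox and Billoir model): suppose each species obeys Haber's rule with its own threshold k, and log10(k) is normally distributed across species.  Then the concentration affecting a fraction x of species after exposure time t is the x-th percentile threshold divided by t, so HCx falls as 1/t.  The distribution parameters are invented for illustration.

```python
# Toy time-dependent SSD: species thresholds k are log-normal,
# each species is "affected" at time t once c * t >= its k.
from statistics import NormalDist

def hcx(x, t, mu_log_k=2.0, sigma_log_k=0.5):
    """Concentration affecting fraction x of species after exposure time t."""
    log_k_x = NormalDist(mu_log_k, sigma_log_k).inv_cdf(x)  # x-th percentile of log10(k)
    return 10 ** log_k_x / t

# The "safe" concentration HC5 shrinks as exposure time grows:
for t in (1.0, 10.0, 100.0):
    print(f"HC5 at t = {t:6.1f}: {hcx(0.05, t):.3f}")
```

Even this crude version shows the paper's central point: a single fixed-time HCx hides the fact that the protective concentration is a moving target in time.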

That's great, but they didn't really succeed in varying the concentration with time (at least that's what it looks like to me), so they don't appear to be accounting for metabolism or half-life effects.  Still, it is a step forward.

Thinking about the bucket analogy, we could put a lot of holes in the bucket.  But that means fiddling with metabolism, genetics, etc.  There are limits to what we can force/select an animal to do.  OK, then let's get the water to evaporate faster...but no, that won't work, because the stuff is meant to kill target pests and if the half-life is too short it's not useful.

How about a bigger bucket?  I keep thinking about this, and all I can come up with is that monoculture has to stop.  Wouldn't that allow us to use less stuff on the crops, and thus produce two possible effects at once: "decrease the amount of water poured in" and "increase the size of the bucket"?

Theoretically, if almond growers would consider checkerboarding with other crops and then allow wildflowers to grow between and under their trees, ecological theory would predict that every species' health would improve... including that of the trees and the bees.  The same goes for other monocultured crops.  It's what they do in so-called "third world" countries with coffee and palm trees... and it works for them!

Christina

             ***********************************************
The BEE-L mailing list is powered by L-Soft's renowned
LISTSERV(R) list management software.  For more information, go to:
http://www.lsoft.com/LISTSERV-powered.html

Guidelines for posting to BEE-L can be found at:
http://honeybeeworld.com/bee-l/guidelines.htm
