The most “meaty” part of the EPA’s response to the emergency petition to ban clothianidin is the set of supporting documents attached to the main response letter. The EPA’s supporting document takes each argument and citation in the petitioners’ State of the Science report, written by PANNA (Pesticide Action Network North America), and classifies how much weight the EPA considers each study to carry for making a decision.
In this document, the EPA is trying to judge the usefulness of the evidence presented by the petitioners. First, is it a relevant piece of research? That is, are the experiments in a study related to clothianidin (or a closely related neonicotinoid)? Does the experiment represent how clothianidin is used in the field? Second, the EPA needs to know whether the result is a strong one. Was the study well designed? Are there good controls? Were statistics used appropriately? But that’s not how PANNA chose to present their evidence. PANNA and the EPA differ quite a bit on how they use scientific evidence.
The PANNA report1 provides as evidence many studies done on pesticides other than clothianidin (and surprisingly, many weren’t even on neonicotinoids!). The EPA document is thus a bit repetitive and follows a very similar pattern: summarize the study’s methods and results, note any flaws or concerns, then conclude that the results are of “low qualitative value for use in clothianidin risk assessment”. Another set of results provided is fairly interesting behavioral findings for sub-lethal exposures. Unfortunately, most of these studies either didn’t actually use neonics or used exposure amounts larger than are normally seen in the field. These studies may be great science, but they aren’t as useful for the EPA in doing appropriate (and legal) risk assessments.
Sometimes the EPA’s analysis is a bit more interesting and assesses actual methodological flaws. I am intentionally leaving out the study’s authors and title so it isn’t easily searchable2. If you want to find the underlying study, download the source EPA document and look it up there.
Author et al. in a semi-field study (with 2300 bees/colony) reported no decrease in bee attendance at a feeder in the presence of 6 µg/kg imidacloprid, but that activity (defined as feeding on sucrose solution) at this concentration was decreased compared to controls during four days. Bees exposed to fipronil (2 µg/kg) were reported to exhibit both a decrease in attendance and a decrease in activity. The study used an unbalanced design with 8 controls temporally spaced prior to the experiment and one control and three treatment colonies used during the treatment. The authors then pooled the data from the 9 control plots. EFED typically recommends using only concurrently run control and treatment groups and an equal number of control and treatment groups. Of the three treatment groups, two exhibited a significant decrease in bee activity by the 4th treatment day, while one group experienced no decrease. The authors did not quantify this difference in the article other than to report it was significant. The study authors did not report food (sucrose) consumption at the feeder nor any colony parameters, making the relevancy of this study to effects on full colonies under natural field conditions somewhat unclear, but the authors did use environmentally relevant concentrations of imidacloprid. This study would likely be considered of low qualitative use for clothianidin risk assessment purposes because of its unbalanced test design with only one control hive, pooling of controls with previous data prior to conducting statistical analysis, inability to define a dose-response given that only one treatment concentration was tested, unclear definition of “active” bees and how this parameter was observed and validated, and lack of measurement of food consumption by the bees.
Notice that this doesn’t say “we think this study isn’t very good”. Instead, it gives reasons why the study likely isn’t showing what PANNA claims it shows and probably isn’t a good one to use for risk assessment. Note that the study in question is cited elsewhere in the literature — it’s not so poor that it’s been ignored by other scientists. It just has enough flaws that the EPA doesn’t consider it a very strong result. Contrast this now to how the petitioners describe the same study3:
This study investigated the sub-lethal effects of two insecticides on the foraging behavior of honey bees under semi-field conditions. Imidacloprid and fipronil were chosen because both behave systemically, were recently introduced, are considered highly toxic to bees, had shown sub-lethal effects on bees in lab conditions, and had been implicated in honey productivity declines in Europe. The primary aim was to address a gap in the environmental assessment of systemic pesticides by improving on the methods used to quantify changes in foraging behavior. Bee colonies were placed in enclosed tunnels and their feeding behavior video recorded over a period of five days, constituting a cumulative-effects study much shorter than a bee or hive lifecycle study would be. With imidacloprid at 6.0 µg/kg, inactive bees (those visiting the feeder but not feeding) increased over time relative to active bees. With fipronil at 2.0 µg/kg, most bees stopped coming to the feeder by the last day, and the few that did tended to be inactive. Convulsions and paralysis were also observed in bees feeding on fipronil-contaminated food at sub-lethal levels 70 times below the referenced LD50s. They also concluded that their experimental protocol “provided an indispensable interface between controlled conditions in the laboratory and the field”, suggesting its adoption in regulatory testing of sub-lethal effects.
Note that PANNA notes no issues with this study. This is common throughout the document: studies I’ve read before are described in ways that ignore flaws, caveats, or complications and emphasize results that support their position4. To PANNA, each study that at least partially supports their position is more evidence. That some studies are more convincing than others isn’t relevant. Noting flaws in that research, or further work that would be needed to make the case more certain, doesn’t fit into that mindset. In other words, this document, though titled “Pesticides and Honey Bees: State of the Science”, is not a scientific document. I mean this in a philosophical sense: the PANNA document was not written with a scientific mindset of including all relevant issues, even when they do not fully support their policy position.
However, it looks pretty scientific to a casual reader. If you were passed a Facebook post about this petition — save the bees! — and dutifully decided to actually read some of the documents, you’d probably pick out the main “state of the science” report that the petition claims as its main evidence. While reading it, you would see all these scientific studies and think the case to ban clothianidin must be pretty straightforward. When the EPA responds by denying the petition, the EPA looks like it is “ignoring the science”, even though all the EPA is doing is taking a properly critical attitude. Thus the narrative that the EPA is in bed with industry and will always bend the science to protect it is maintained. Hopefully something good can come of this, though: perhaps the suit will force a compromise, and the EPA may regulate use to mitigate known harms (e.g., clothianidin-contaminated planter dust, responsible for some acute bee-poisoning cases4).
- To be clear, I haven’t actually read every last line of all these documents yet. I’m still working through these two main documents. ↩
- It seems unfair to quote an EPA criticism of the authors’ work with their names attached and raise its visibility. This is an example of how differently the EPA and PANNA assess a single study, and of how PANNA is willing to uncritically include anything that supports their position, not specifically a criticism of this study. ↩
- Yes, I intentionally put the EPA’s response before the petitioner’s. Yes, this biases you. The way the EPA called out this result was hilarious to me, and this post started out as just a quick comment on how droll and reserved scientific criticism can be. Then I realized it was a good example of how differently the EPA and PANNA view the body of scientific evidence. ↩
- For example, the PANNA report cites Krupke et al., the study I first looked at. That’s a good study to cite! They even correctly note most of its findings. But when discussing the main route of pesticide exposure — clothianidin-contaminated planter dust — the PANNA report fails to mention that it’s possible to handle the dust more responsibly. The study authors themselves note this in their discussion, citing a 2010 study. PANNA’s report does not include this when citing the Krupke study, but later, in an entirely different section, cites a more recent study on mitigation of planter dust. A casual reader might assume there’s nothing to be done about planter dust, because PANNA only includes a reference to the study that supports their position. Unfortunately I don’t have access to one of the studies, so I can’t really make a judgement … not that two studies would likely be sufficient. ↩