U.S. government underestimated contamination in Gulf of Mexico after Deepwater Horizon oil spill – ‘To see NOAA doing this, that’s inexcusable’
By HENRY FOUNTAIN
19 August 2013 (The New York Times) – An analysis of water, sediment, and seafood samples taken in 2010 during and after the oil spill in the Gulf of Mexico has found higher contamination levels in some cases than previous studies by federal agencies did, casting doubt on some of the earlier sampling methods. The lead author, Paul W. Sammarco of the Louisiana Universities Marine Consortium, said that dispersants used to break up the oil might have affected some of the samples. He said that the greater contamination called into question the timing of decisions by the National Oceanic and Atmospheric Administration to reopen gulf fisheries after the spill and that “it might be time to review the techniques that are used to determine” such reopenings.

Eleven workers died and roughly 200 million gallons of crude oil gushed into the gulf after a blowout at an exploratory well owned by BP caused the Deepwater Horizon drilling rig to explode on 20 April 2010. Nearly two million gallons of Corexit, a dispersant, were sprayed on the surface or injected into the oil plume near the wellhead. In all, more than 88,000 square miles of federal waters were closed to commercial and recreational fishing. Some areas were reopened before the well was capped three months after the blowout; the last areas were reopened a year after the disaster.

Like other studies after the spill, the new analysis, published last week in the journal Marine Pollution Bulletin, found that components of oil were distributed along the Gulf Coast as far west as Galveston, Texas — about 300 miles from the well site — and southeast to the Florida Keys. But the study found higher levels of many oil-related compounds than earlier studies by NOAA scientists and others, particularly in seawater and sediment. The compounds studied included polycyclic aromatic hydrocarbons, some of which are classified as probably carcinogenic, and volatile organic compounds, which can affect the immune and nervous systems.

“When the numbers first started coming in, I thought these looked awfully high,” Dr. Sammarco said, referring to the data he analyzed, which came from samples that he and other researchers had collected. Then he looked at the NOAA data. “Their numbers were very low,” he said. “I thought, what is going on here? It didn’t make sense.”

Dr. Sammarco said that a particular sampling method used in some earlier studies might have led to lower readings. That method uses a device called a Niskin bottle, which takes a sample from a specific point in the water. Because of the widespread use of dispersants during the spill — which raised separate concerns about toxicity — the oil, broken into droplets, may have remained in patches in the water rather than dispersing uniformly. “Sampling a patchy environment, you may not necessarily hit the patches,” he said.

The plastic that the bottles are made from also attracts oily compounds, potentially removing them from any water sample and leading to lower readings of contaminants, Dr. Sammarco said.

Riki Ott, an independent marine toxicologist who has studied the effects of the 1989 Exxon Valdez spill in Alaska as well as the BP spill, said she was “totally shocked” when she read the high numbers in Dr. Sammarco’s study. “To see NOAA doing this, that’s inexcusable,” Dr. Ott said, referring to the use of Niskin bottles. “It has been known since Exxon Valdez that this spotty sampling does not work.” [more]
Gulf Spill Sampling Questioned