Log DOW: Key to Understanding and Regulating Wastewater-Derived Contaminants

2006 ◽  
Vol 3 (6) ◽  
pp. 439 ◽  
Author(s):  
Martha J. M. Wells

Environmental Context. Worldwide, surface water is both a source of drinking water and a recipient of wastewater effluents and pollutants. Many surface water bodies undergo a natural, cyclical, diurnal variation in pH between 7 and 9, and most drinking water and wastewater treatment in the United States is conducted between pH 7 and 8. The pH of water undergoing treatment directly affects the ratio of nonionized to ionized chemical form(s) present, which in turn affects the success of contaminant removal. Many organic wastewater-derived contaminants are highly water soluble at pH 7–8 and are inadequately treated.

Abstract. Wastewater-derived contaminants (WWDCs) occur in surface water due to inadequate wastewater treatment and subsequently challenge the capabilities of drinking water treatment. Fundamental chemical properties must be understood to reduce the occurrence of known WWDCs and to better anticipate future chemical contaminants of concern to water supplies. To date, examination of the fundamental properties of WWDCs in surface water appears to be completely lacking or inappropriately applied. In this research, the hydrophobicity–ionogenicity profiles of WWDCs reported to occur in surface water were investigated, concentrating primarily on pharmaceuticals and personal care products (PPCPs), steroids, and hormones. Because most water treatment is conducted between pH 7 and 8, and because DOW, the pH-dependent n-octanol–water distribution ratio, simultaneously embodies the concepts of hydrophobicity and ionogenicity, DOW at pH 7–8 is presented as an appropriate physicochemical parameter for understanding and regulating water treatment. Although the pH-dependent chemical character of hydrophobicity is not new science, this concept is insufficiently appreciated by scientists, engineers, and practitioners currently engaged in chemical assessment. The extremely hydrophilic character of many WWDCs at pH 7–8, indicated by DOW (the combination of KOW and pKa) rather than by KOW of the neutral chemical, is proposed as an indicator of occurrence in surface water.
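The relationship between DOW, KOW, and pKa summarized above can be made concrete. For a monoprotic acid, DOW = KOW / (1 + 10^(pH − pKa)); for a monoprotic base the exponent reverses. A minimal sketch (the ibuprofen values are approximate literature figures, used only for illustration):

```python
import math

def log_d_acid(log_kow: float, pka: float, ph: float) -> float:
    """log DOW for a monoprotic acid; the neutral form dominates below pKa."""
    return log_kow - math.log10(1 + 10 ** (ph - pka))

def log_d_base(log_kow: float, pka: float, ph: float) -> float:
    """log DOW for a monoprotic base; the neutral form dominates above pKa."""
    return log_kow - math.log10(1 + 10 ** (pka - ph))

# Ibuprofen, a weak acid (approximate literature values: log KOW ~3.97, pKa ~4.91).
# At pH 7.4 the ionized fraction dominates, so log DOW falls well below log KOW.
print(round(log_d_acid(3.97, 4.91, 7.4), 2))  # → 1.48
```

The drop from log KOW ≈ 4.0 to log DOW ≈ 1.5 at environmental pH illustrates the paper's point: screening such a compound by KOW alone would greatly overstate its hydrophobicity during treatment at pH 7–8.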

2006 ◽  
Vol 4 (S2) ◽  
pp. 89-99 ◽  
Author(s):  
Rebecca L. Calderon ◽  
Gunther F. Craun

The nature and magnitude of endemic waterborne disease are not well characterized in the United States. Epidemiologic studies of various designs can provide an estimate of the waterborne attributable risk along with other types of information. Community drinking water systems frequently improve their operations and may change drinking water treatment and their major source of water. In the United States, many of these treatment changes are the result of regulations promulgated under the Safe Drinking Water Act. A community-intervention study design takes advantage of these “natural” experiments to assess changes in health risks. In this paper, we review the community-intervention studies that have assessed changes in waterborne gastroenteritis risks among immunocompetent populations in industrialized countries. Published results are available from two studies in Australia, one study in the United Kingdom, and one study in the United States. Preliminary results from two other US studies are also available. Although the current information is limited, the risks reported in these community-intervention studies can help inform the national estimate of endemic waterborne gastroenteritis. Information is provided about endemic waterborne risks for unfiltered surface water sources and a groundwater under the influence of surface water. Community-intervention studies with recommended study modifications should be conducted to better estimate the benefits associated with improved drinking water treatment.


2001 ◽  
Vol 1 ◽  
pp. 39-43 ◽  
Author(s):  
V. Zitko

Many countries require the presence of free chlorine at about 0.1 mg/L in their drinking water supplies. For various reasons, such as cast-iron pipes or long residence times in the distribution system, free chlorine may decrease below detection limits. In such cases it is important to know whether the water was chlorinated or whether nonchlorinated water entered the system by accident. Changes in the UV spectra of natural organic matter in lakewater were used to assess qualitatively the degree of chlorination during treatment to produce drinking water. The changes were more obvious in the first-derivative spectra. In lakewater, the derivative spectra have a maximum at about 280 nm. This maximum shifts to longer wavelengths by up to 10 nm, decreases, and eventually disappears with increasing chlorine dose. The water treatment system was monitored by this technique for over 1 year, and the UV spectra of water samples were compared with those of experimental samples treated with known amounts of chlorine. The changes of the UV spectra with the concentration of added chlorine are presented. On several occasions, water that received very little or no chlorination may have entered the drinking water system. The results show that first-derivative spectra are a potential tool to determine, in the absence of residual chlorine, whether surface water was chlorinated during treatment to produce potable water.
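The derivative-spectrum technique described above can be sketched numerically: differentiate the absorbance with respect to wavelength and track the position of the derivative maximum. The spectrum below is entirely synthetic (a Gaussian band whose center and width are chosen so the first-derivative maximum lands near 280 nm, not real lakewater data):

```python
import numpy as np

# Synthetic absorbance spectrum (hypothetical, for illustration only):
# a single broad band standing in for natural organic matter.
wavelengths = np.arange(250.0, 350.0, 1.0)                   # nm, 1 nm steps
absorbance = np.exp(-((wavelengths - 308.0) / 40.0) ** 2)    # arbitrary units

# First-derivative spectrum dA/dλ (central differences); derivative
# features are often sharper than those of the raw spectrum.
deriv = np.gradient(absorbance, wavelengths)

# Locate the derivative maximum, analogous to the ~280 nm feature the
# study tracks; chlorination would shift and flatten this peak.
peak_nm = wavelengths[np.argmax(deriv)]
```

In practice one would compare `peak_nm` (and the peak height) across chlorine doses, as the study does against samples treated with known chlorine amounts.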


2020 ◽  
Vol 185 ◽  
pp. 109385 ◽  
Author(s):  
Donatella Feretti ◽  
Mattia Acito ◽  
Marco Dettori ◽  
Elisabetta Ceretti ◽  
Cristina Fatigoni ◽  
...  

Water ◽  
2020 ◽  
Vol 12 (6) ◽  
pp. 1772
Author(s):  
Saria Bukhary ◽  
Jacimaria Batista ◽  
Sajjad Ahmad

Drinking water treatment, wastewater treatment, and water distribution are energy-intensive processes. The goal of this study was to design the unit processes of an existing drinking water treatment plant (DWTP), evaluate the associated energy consumption, and then offset it using solar photovoltaics (PVs) to reduce carbon emissions. The selected DWTP, situated in the southwestern United States, uses coagulation, flocculation, sedimentation, filtration, and chlorination to treat 3.94 m³ of local river water per second. Based on the energy consumption determined for each unit process (validated using the plant's data) and the plant's available landholding, the DWTP was sized for solar PV (as a modeling study) using the System Advisor Model. Total operational energy consumption was estimated to be 56.3 MWh day⁻¹ for the DWTP including water distribution pumps, whereas energy consumption excluding water distribution pumps was 2661 kWh day⁻¹. The results showed that the largest consumers of energy, after the water distribution pumps (158.1 Wh m⁻³), were the processes of coagulation (1.95 Wh m⁻³) and flocculation (1.93 Wh m⁻³). A 500 kW PV system was found to be sufficient to offset the energy consumption of the water-treatment-only operations, for a net present value of $0.24 million. The net reduction in carbon emissions due to the PV-based design was found to be 450 and 240 metric tons CO₂-eq year⁻¹ with and without battery storage, respectively. This methodology can be applied to other existing DWTPs for design and assessment of energy consumption and use of renewables.
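The reported figures are mutually consistent, which a quick back-of-the-envelope check shows; the sketch below uses only the values quoted in the abstract:

```python
# Cross-check of the abstract's energy figures (all inputs are the
# reported values; intermediate results are simple unit conversions).
flow_m3_per_day = 3.94 * 86400            # 3.94 m³/s → ≈ 340,416 m³/day

distribution_wh_per_m3 = 158.1            # distribution pumps (reported)
treatment_kwh_per_day = 2661.0            # plant excluding distribution (reported)

distribution_kwh_per_day = flow_m3_per_day * distribution_wh_per_m3 / 1000
total_mwh_per_day = (distribution_kwh_per_day + treatment_kwh_per_day) / 1000
# total_mwh_per_day ≈ 56.5, close to the reported 56.3 MWh/day total

# Treatment-only energy intensity implied by the reported numbers:
treatment_wh_per_m3 = treatment_kwh_per_day * 1000 / flow_m3_per_day
# ≈ 7.8 Wh per m³ treated, i.e. distribution dominates by ~20×
```

The small gap between 56.5 and 56.3 MWh/day is plausibly rounding in the per-m³ figure; the check mainly confirms that distribution pumping, not treatment, dominates the plant's energy budget.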


2019 ◽  
Vol 24 (1) ◽  
pp. 135-163
Author(s):  
Jader Martínez Girón ◽  
Jenny Vanessa Marín-Rivera ◽  
Mauricio Quintero-Angel

Population growth and urbanization put increasing pressure on drinking water treatment. Additionally, treatment units such as decanters and filters accumulate high concentrations of iron (Fe) and manganese (Mn), which in many cases are discharged into the environment without any treatment when maintenance is performed. This paper therefore evaluates the effectiveness of vertical subsurface wetlands for Fe and Mn removal from the wastewater of drinking water treatment plants, using as an example a pilot-scale wetland with an ascending (upflow) gravel bed planted with two species, C. esculenta and P. australis, in El Hormiguero (Cali, Colombia). The pilot system had three upflow vertical wetlands, two of them planted and the third left unplanted as a control. The wetlands were arranged in parallel, each formed by three gravel beds of different diameters. The results showed no significant difference among the three wetlands in removal percentages for turbidity (98 %), Fe (90 %), dissolved Fe (97 %), and Mn (98 %). Dissolved oxygen showed a significant difference between the planted wetlands and the control. C. esculenta had the highest Fe concentration in the root, (103.5 ± 20.8) µg/g, while P. australis had the highest average Fe concentrations in leaves and stem, (45.7 ± 24) µg/g and (41.4 ± 9.1) µg/g, respectively. It is concluded that subsurface wetlands can be an interesting alternative for treating the wastewater generated during maintenance of drinking water treatment plants. However, more research is needed on the use of vegetation or other technologies to remove or reduce the pollutant load in the wetlands, since each drinking water treatment plant generates maintenance wastewater that in turn requires treatment.


2005 ◽  
Vol 71 (2) ◽  
pp. 1042-1050 ◽  
Author(s):  
Gerald Sedmak ◽  
David Bina ◽  
Jeffrey MacDonald ◽  
Lon Couillard

ABSTRACT Reoviruses, enteroviruses, and adenoviruses were quantified by culture for various ambient waters in the Milwaukee area. From August 1994 through July 2003, the influent and effluent of a local wastewater treatment plant (WWTP) were tested monthly by a modified U.S. Environmental Protection Agency Information Collection Rule (ICR) organic flocculation cell culture procedure for the detection of culturable viruses. Modification of the ICR procedure included using Caco-2, RD, and HEp-2 cells in addition to BGM cells. Lake Michigan source water for two local drinking water treatment plants (DWTPs) was also tested monthly for culturable viruses by passing 200 liters of source water through a filter and culturing a concentrate representing 100 liters of source water. Reoviruses, enteroviruses, and adenoviruses were detected frequently (105 of 107 samples) and, at times, in high concentration in WWTP influent but were detected less frequently (32 of 107 samples) in plant effluent and at much lower concentrations. Eighteen of 204 samples (8.8%) of source waters for the two DWTPs were positive for virus and exclusively positive for reoviruses at relatively low titers. Both enteroviruses and reoviruses were detected in WWTP influent, most frequently during the second half of the year.

