Weather-Based Irrigation Controllers

Small-scale, residential weather-based irrigation controllers (WBICs) first appeared on the scene during the late 1990s. Until then, weather-based irrigation controllers had been available only for commercial landscapes, and those tended to be large, expensive systems. Since over half of residential consumption can be traced to outdoor use, a good portion of it wasteful, water suppliers have been hungry for newer, effective tools to improve outdoor water use efficiency. As a result, residential WBICs have been extensively studied, with the first successful field trial completed in 2001. Many additional trials undertaken since then have added to the corpus of knowledge available about these devices. This paper’s objective is to take stock of what we know and to identify gaps that still need to be filled.

Residential WBICs have come a long way since the 1990s, when only one manufacturer offered this technology. Today there are 20 manufacturers.[1] WBICs can now be purchased easily at big box stores, and costs have dropped with scale and competition. Federal water use efficiency standards have been extended to cover WBICs through EPA’s WaterSense labeling program (analogous to the highly successful EnergyStar labeling program for promoting energy efficiency), which has brought a level of standardization to WBICs that would otherwise have been absent. Finally, water suppliers have actively promoted these devices through hefty rebate programs. Statewide in California, thousands of traditional irrigation timers have been retrofitted with WBICs. In sum, actions taken during the past 15 years by water suppliers, device manufacturers, landscape industry professionals and regulatory agencies have brought significant maturity to the WBIC market.


WBICs are a subset of a larger group of irrigation controllers increasingly referred to as “smart” controllers. A key feature of “smart” controllers is that they can sense environmental conditions and tailor irrigation accordingly, without human intervention. The metrics used to capture these conditions, however, vary: they may be atmospheric measurements, such as temperature, solar radiation, wind speed, and precipitation, used to calculate plant evapotranspiration rates, or direct soil moisture measurements. Apart from their sensing elements, “smart” controllers can also establish a more efficient baseline schedule from the outset by allowing the user to incorporate landscape characteristics, such as plant type, soil type, slope, and shade, into the schedule. This can be a double-edged sword, though; optimal scheduling of a “smart” controller requires a greater degree of scheduling knowledge. Users who blindly rely on default built-in parameters can rob a “smart” controller of some of its ability to save water, hence the significant need for education and outreach to accompany WBIC retrofit programs.

In contrast, a traditional irrigation controller is no more than a timer, blindly irrigating a landscape some number of minutes per day based on the programmed schedule. A great deal of survey research shows that most homeowners change irrigation schedules infrequently, usually only seasonally, leading to wasteful irrigation (more on this topic later).

As mentioned earlier, “smart” controllers can include products that sense either atmospheric weather or soil-moisture content. This paper mainly focuses on the former, although, where germane, comparisons with soil-moisture-based “smart” controllers have also been factored into the discussion.

Weather-based irrigation (“smart”) controllers are in turn divided into two broad groups:

1. Controllers with on-site weather sensors. One or more sensors (for example, for temperature, solar radiation, precipitation) feed weather data into the controller, which then calculates a plant evapotranspiration rate in real time, using this estimate to determine or modify a base irrigation schedule to reflect plant watering needs on any given day. These sensors may be designed to communicate with the controller through a wired connection or wirelessly. The sensors may be sold as part of a full package consisting of a controller, weather sensor and other accessories. Or, they may be sold as add-on units for existing compatible controllers, which become “smart” when coupled with the add-on unit.
2. Controllers that receive weather data from an off-site source. These controllers receive evapotranspiration data appropriate for their location from a central transmitting source, which are then used to create or adjust a baseline schedule programmed into the controller. These weather data may arrive via signals transmitted through the cellular phone spectrum, or through the internet via a home’s Wi-Fi network. Some manufacturers may charge an ongoing monthly fee to provide real-time weather data. Others may offer it free of charge, factoring in the cost of providing real-time weather information into the initial price.
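Whether the weather data arrive from on-site sensors or a central source, both approaches reduce to the same core calculation: scale each zone’s run time by the current evapotranspiration rate relative to the ET assumed when the baseline schedule was programmed. The sketch below is a minimal Python illustration using the Hargreaves equation, a common temperature-only ET approximation; it is not any manufacturer’s actual algorithm, and the function names and inputs are illustrative.

```python
import math

def hargreaves_eto(t_min_c, t_max_c, ra_mj):
    """Estimate daily reference evapotranspiration (mm/day) with the
    Hargreaves equation, a widely used temperature-only approximation.
    ra_mj is extraterrestrial radiation in MJ/m^2/day, a tabulated
    function of latitude and day of year, supplied here as an input."""
    t_mean = (t_min_c + t_max_c) / 2.0
    # 0.408 converts radiation from MJ/m^2/day to mm/day of water.
    return 0.0023 * 0.408 * ra_mj * (t_mean + 17.8) * math.sqrt(t_max_c - t_min_c)

def adjusted_run_time(base_minutes, eto_today, eto_baseline):
    """Scale a zone's baseline run time by today's ET relative to the
    ET assumed when the baseline schedule was programmed."""
    return base_minutes * eto_today / eto_baseline
```

A real controller would layer minimum run times, cycle-and-soak splitting, and rain shutoff on top of this core adjustment before executing the schedule.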

In either type of WBIC, a failure, whether in the external sensor module or in the reception of real-time weather data, causes the smart controller to revert to a default schedule (a baseline schedule or the last calculated schedule saved to memory). Landscapes continue to be watered, just not efficiently. Many residential WBICs can be controlled and programmed through a computer or smartphone interface, with the controller relaying system faults back to the user if one develops. This two-way communication capability was earlier available only among large central irrigation systems found in commercial landscapes, but it looks to become more widespread in residential settings as the “internet of things” slowly expands to cover WBICs.

Water suppliers interested in promoting WBICs through rebate and retrofit programs have looked for ways to ensure that the products they incentivize meet some minimum efficacy standards, where efficacy is defined as adequate irrigation with minimal runoff. Collaborative efforts in this regard between water suppliers and the irrigation industry led to a testing protocol developed under the Smart Water Application Technologies (SWAT) initiative. Under this testing protocol, smart controllers are evaluated using a “virtual landscape” designed to mimic different plant materials and soil types. Test results are published with the consent of the manufacturers, but the test does not result in a “pass” or “fail” grade. It is up to consumers and water suppliers to decide how to use the test results.

The Environmental Protection Agency’s (EPA) WaterSense labeling program, however, works a little bit differently. EPA’s WaterSense endorsement program is meant to raise water use efficiency just as the EnergyStar endorsement program has done for energy. The EPA has been endorsing WBICs as WaterSense compliant since 2012. EPA’s testing protocol is to a great degree based on the SWAT protocol, but it is administered separately and EPA does not publish its test results. Instead, WBICs are allowed to be advertised as WaterSense compliant if they meet minimum efficacy standards defined as 80% irrigation adequacy, less than 10% excess irrigation in any single landscape zone, and less than 5% excess irrigation when averaged across all landscape zones (per the “virtual” testing).
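As a concrete illustration, the WaterSense thresholds just listed can be expressed as a simple screening function. The input format below is hypothetical, and whether the 80% adequacy criterion applies per zone or overall is an assumption here (it is applied per zone in this sketch).

```python
def meets_watersense(zones):
    """Screen SWAT-style 'virtual landscape' results against the
    WaterSense efficacy thresholds described above. `zones` is a list
    of (adequacy_pct, excess_pct) pairs, one per landscape zone; this
    input format is illustrative, not the official test schema."""
    adequacies = [a for a, _ in zones]
    excesses = [e for _, e in zones]
    avg_excess = sum(excesses) / len(excesses)
    return (
        all(a >= 80.0 for a in adequacies)   # at least 80% irrigation adequacy
        and all(e < 10.0 for e in excesses)  # under 10% excess in any single zone
        and avg_excess < 5.0                 # under 5% excess averaged across zones
    )
```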

WaterSense endorsed WBICs must also include several additional design features. These include non-volatile memory so that irrigation settings are not lost during a power outage, the ability to notify the user when the device is not functioning in “smart” mode, the ability to connect to a rain sensor, the ability to easily accommodate utility-mandated watering restrictions, and the ability to ratchet down irrigation by a user-selected percentage to stay within a budget. These controllers also allow the user to set parameters for plant type, soil type, slope, shade, etc. that are important for establishing an efficient baseline schedule.

By and large water suppliers limit their retrofit incentive programs to SWAT-tested or WaterSense compliant WBICs. Efforts to extend the WaterSense endorsement to qualifying soil-moisture based “smart” controllers are also underway.


Several field trials have now been completed to evaluate water savings from WBIC retrofits. Many of the earlier studies, based on small samples, were mostly technology demonstration projects. These studies, while cited, have not otherwise been used for savings quantification[1]. Their limited sample sizes make them less useful compared to the larger-scale studies that have now become available. We largely focus on the latter for assessing the water savings potential of WBIC retrofits (Table 1).

Table 1. Results from Key Field Trials of WBICs

| Location of Field Trial & Report Publication Year | Percent Reduction in Outdoor Use | Percent Reduction in Total Household Use | Reduction in Gallons Per Day Per Household | Sample Size |
| Orange County, CA, 2001[2] | 16-24% | 7-10% | 37-57 gal. | 40 SF homes |
| Orange County, CA, 2004[3] | n.a. | 10% | 41 gal. | 97 SF homes |
| Northern & Southern California, 2009[4] | 7.3% | n.a. | 58 gal. | 1,987 SF homes |
| Orange County, CA, 2010[5] | 9.7% | 7.1% | 37 gal. | 899 SF homes |
| Orange County, CA, 2011[6] | n.a. | 9.4% | 49 gal. | 70 SF homes |

The 2001 Report: The Irvine Ranch Water District was one of the first water agencies to undertake a fairly comprehensive evaluation of WBICs. The study tested an early prototype of what eventually evolved into a WBIC called WeatherTrak, manufactured by Hydropoint, Inc. The results were published in 2001. This study solicited single-family customers from among the top 20% of water users to participate. The study found that the retrofit group reduced its total per-household use by 7%, or 37 gallons per day, representing roughly 16% of outdoor use. The study utilized a pre-versus-post evaluation design, including a control group. Savings were derived by comparing 1 year of post-retrofit data to 2 years of pre-retrofit data, controlling for weather. The study was later extended to a second post-retrofit year. Savings were found to be marginally higher during the second year as irrigation scheduling improved on account of learning by doing. Across the two post-retrofit years, total household water use was shown to have dropped by 41 gallons per day, anticipating the findings of the later 2004 study. The 2001 study, however, was affected by significant self-selection bias. Compared to the control group, the treatment group was shown to have much lower wasteful irrigation prior to the WBIC retrofit. The 2001 study calculated that if WBICs had been installed in homes that resembled the control group (also selected from the top 20%), then water demand reductions would have been closer to 10% of total household use (or 24% of outdoor use, or 57 gallons per household per day) instead of the estimated 7%. The study emphasized the need for proper targeting of wasteful irrigators to ensure a cost-effective program.

The 2004 Report: The 2001 “ET Controller” report was followed by what is informally called the “R3 Study” published in 2004. This study, also conducted in Irvine, California, showed that WBIC retrofits combined with customer education reduced water use by 10% or 41 gallons per household per day in the retrofitted homes. The 2004 study did not provide estimates of the percentage reduction in outdoor use, only total household use. However, under the reasonable surmise that roughly half of total use is accounted for by irrigation, a 10% reduction in total household use translates into a 20% reduction in outdoor use. This study also used a pre-versus-post retrofit comparison, including a control group. Once again the WBIC used in this study was an early prototype of what became WeatherTrak manufactured by Hydropoint, Inc. The R3 study also had other components such as evaluating the reduction in urban runoff as a result of WBIC retrofits, but those elements are not relevant to this paper. The “R3 Study” also targeted some of Irvine’s high water users, but the study does not clearly describe the percentile from which its retrofit group was drawn. The impact of selection bias on savings estimates also remains unexamined.

The 2009 Report: These two successful evaluations set the stage for scaling up WBIC retrofit programs across California, establishing savings from larger samples, and, most importantly, comparing the efficacy of different WBIC models. A large multi-agency retrofit program was undertaken, encompassing both northern and southern California, and the study results were published in 2009. In southern California, very little attempt was made to target high water users. The northern California agencies reportedly did target high water users, but it is not clear from the report how this was done. The 2009 study’s findings stand out as an outlier, being much lower than those of the two studies that came before it or the two since. The 2009 report provides information only about the reduction in outdoor use caused by WBIC retrofits. Estimates are derived by comparing 1 year of pre-retrofit and 1 year of post-retrofit water use, controlling for weather differences between the two periods. The study design did not include a control group. The study also did not develop statistical billing-data models to estimate program impacts. Instead, it estimated outdoor use from minimum-month usage to separate indoor from outdoor use. Use of billing-data models would have led to much more precise and robust results. Many of this study’s findings remain inconclusive (statistically insignificant) in spite of large samples, probably because the study relies on difference-of-means tests instead of models to assess significance.
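The minimum-month approach can be sketched in a few lines: the lowest month of billed use is taken as the year-round indoor baseline, and the remainder in each month is attributed to irrigation. This is a simplification of the study’s actual procedure, shown here only to make the method concrete; the monthly figures are hypothetical.

```python
def split_indoor_outdoor(monthly_use_gpd):
    """Minimum-month method: treat the lowest month's use (gallons per
    day) as the year-round indoor baseline and attribute the remainder
    in each month to outdoor (irrigation) use. A rough heuristic; it
    understates indoor use if some irrigation occurs even in the
    wettest month."""
    indoor = min(monthly_use_gpd)
    outdoor = [max(0.0, m - indoor) for m in monthly_use_gpd]
    return indoor, outdoor

# Hypothetical billing record, Jan-Dec, in gallons per day.
monthly = [200, 210, 250, 400, 600, 700, 750, 720, 500, 350, 240, 205]
indoor, outdoor = split_indoor_outdoor(monthly)
```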

Across all retrofitted sites, outdoor use was reported to have dropped by 6%, with savings in northern California only slightly higher than in southern California (6.8% versus 5.6%). We surmise from this meager difference between north and south that the northern participating agencies attempted only very mild targeting. Table 1, however, highlights savings estimated only for the residential sites retrofitted with WBICs in the 2009 study, to enable an apples-to-apples comparison with the other residential retrofit studies. Of the 2,294 sites retrofitted with WBICs in this study, 1,987 were residential. The 2009 study reported a reduction of 7.3% in outdoor use, or 58 gallons per household per day, among all the residential sites included in the study. That such a small percentage reduction translates into so many gallons of savings implies these residential sites were very large water users, but the study does not provide any information to shed further light on this.
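The implied size of these water users can be checked with simple arithmetic: if 58 gallons per day is 7.3% of pre-retrofit outdoor use, that outdoor use must have been close to 800 gallons per household per day.

```python
def implied_baseline(savings_gpd, pct_reduction):
    """Back out the pre-retrofit baseline implied by reported savings:
    if X gallons per day equals p percent of the baseline, the
    baseline is X / (p/100)."""
    return savings_gpd / (pct_reduction / 100.0)

# The 2009 study's residential figures: 58 gpd at a 7.3% reduction in
# outdoor use imply roughly 795 gpd of pre-retrofit outdoor use.
baseline_2009 = implied_baseline(58, 7.3)
```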

The 2009 report provides important clues as to why its savings estimates are so much lower than the other studies. Roughly 47% of the retrofitted sites were practicing either deficit or efficient irrigation prior to the WBIC retrofit. If savings are derived from the top fifth of this study’s sample (that is, the sample is sorted by the level of over-irrigation taking place prior to the retrofits in descending order and the top fifth is selected for analysis), the reduction in outdoor use rises to roughly 13% instead of the reported average of 6%.[7] While this estimate is still lower than the studies that came before it, the difference is much reduced, once again highlighting the importance of proper targeting.

Recently, a subset of the southern California data from the 2009 study was reanalyzed using billing-data models[8]. A control group was also developed and incorporated into this reanalysis. The billing-data models suggest that WBIC retrofits in the residential sector reduced total per-household water use by roughly 15%. While this revised savings estimate is highly significant, its magnitude is not directly comparable to the original study’s estimates for two reasons: the reanalysis does not include all the southern California retrofitted homes that were included in the original, and it is based on 3 years of post-retrofit data instead of 1. Nonetheless, the reanalysis does bolster the case for favoring models over less efficient statistical techniques, such as difference-of-means comparisons.
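The control-group logic behind such models can be illustrated with its simplest possible form, a difference-in-differences estimate; the actual reanalysis used richer panel billing-data models, and the numbers in the usage comment are hypothetical.

```python
def did_savings(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences estimate of retrofit savings (gpd):
    the treatment group's pre-to-post change in use, net of the change
    seen in a control group over the same period. The control group's
    change absorbs weather and other influences shared by both groups.
    A bare-bones stand-in for a full panel billing-data model."""
    return (treat_pre - treat_post) - (ctrl_pre - ctrl_post)

# Hypothetical example: retrofitted homes fall from 500 to 440 gpd,
# but control homes also fall from 480 to 470 gpd (a wet year, say),
# so only 50 gpd of the drop is attributable to the retrofit.
savings = did_savings(500, 440, 480, 470)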

The 2009 study also evaluated performance differences across the various WBIC models used in the retrofit program. However, since these models were not assigned to study participants at random, any assessment of comparative efficacy must be treated with abundant caution. It is very clear from the study results that excess irrigation taking place prior to the retrofits differed greatly across controller models. Until randomized trials can be completed in the field, greater credence should be given to the SWAT test results for addressing questions about relative performance. Of course, that is what water suppliers are implicitly doing in practice: by and large they allow only SWAT-tested or WaterSense endorsed WBICs to qualify for rebates under the auspices of their retrofit programs.[9]

The 2010 Report: The Municipal Water District of Orange County commissioned this evaluation to examine savings from its scaled-up Smart Timer program. The study included both residential and commercial customers; Table 1 reports only the results applicable to residential customers. This study concluded that WBIC retrofits had reduced total per-household residential use by 7.3% (outdoor use by 9.7%), or 37 gallons per day, quite comparable to the 2001 and 2004 reports. The report does not offer sufficient detail to explain why the percentage reduction in outdoor use is not significantly greater than that in total use. This study did not use statistical models estimated from customer-level billing data, nor did it include a control group to calculate savings. The 2010 report also does not mention anything about customer targeting based on high water use or wasteful irrigation practices, but some of the data in the report’s technical appendices suggest that most residential participants in this program were probably high water users.

The 2011 Report: The Municipal Water District of Orange County once again commissioned an evaluation to examine savings from its scaled-up Smart Timer program. The study included both residential and commercial customers; Table 1 reports only the results applicable to residential customers. The 2011 report concluded that WBIC retrofits had reduced total per-household use by 9.4%, or 49 gallons per day. The results were derived from statistical models estimated from roughly 5 years of customer-level billing data, including a control group. The retrofit group appears to consist of higher water users, but the report is silent about the customer targeting practices used by the Smart Timer program and about the impact of selection bias on savings estimates.

Summary: The discussion above shows that most of the larger studies undertaken to date suggest that savings of 40-50 gallons per household per day, or roughly 10% of total use, can be expected from a residential WBIC retrofit program, assuming such programs target high water users. The criteria on which these targeting rules should be based, however, remain poorly understood. Should one target based on average water use, lot size, irrigated area, level of excess irrigation, or some combination? We don’t really know. It is also noteworthy that many of the large-scale evaluations undertaken to date have taken place in Orange County, California. These studies have generally failed to address how their results should be extrapolated to other areas with different property characteristics and/or weather patterns. That is yet another open question that merits further research.

Finally, it is worth pointing out that some of the early comparative evaluations of soil-moisture-sensor based “smart” controllers suggest that they may be as effective as, or more effective than, weather-based “smart” controllers, especially in humid climates with frequent rainfall (much of this research has been conducted in Florida).[10] It is not yet known with a high degree of confidence whether this conclusion would also hold in California, given the big differences between California’s and Florida’s weather. Future studies need to evaluate this question.


The design of WBIC retrofit programs can greatly affect both the level of water savings, as well as the level of customer satisfaction. When WBICs first became available, many hoped that a “hang it on the wall and walk away” approach would work, allowing water agencies to adopt fairly low cost programs to distribute this new technology.

Figure 1. Peak summer schedule relatively accurate, customer inattentive

Key to this expectation was the assumption that homeowners probably irrigate near the optimum during the peak summer season but fail to properly scale irrigation up or down during the off-summer seasons (Figure 1). A great deal of survey research indicates that residential customers modify their irrigation schedules only infrequently, usually seasonally. In such a scenario, simply transferring summer schedules from the existing controller to the retrofitted controller, or estimating a baseline schedule using simple rules of thumb, could be expected to generate marked savings. However, if in reality excess irrigation results less from inattention and more from a lack of knowledge on the customer’s part (Figure 2), then savings depend to a greater extent on getting the baseline schedule right. That, in turn, implies a higher level of customer service during the retrofit phase, and ongoing customer education thereafter to ensure that the irrigation system as a whole is maintained at a high level of efficiency.
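The two behaviors described by Figures 1 and 2 waste water in different ways, which a toy comparison makes concrete; all numbers below are hypothetical.

```python
def mean_excess_gpd(schedule, demand):
    """Average over-irrigation (gallons per day) across twelve months
    when a programmed schedule is compared with the landscape's actual
    monthly demand. Deficit months contribute zero excess."""
    return sum(max(0.0, s - d) for s, d in zip(schedule, demand)) / len(demand)

# Hypothetical monthly outdoor demand (gpd), Jan-Dec, mild climate.
demand = [80, 100, 150, 220, 300, 380, 420, 400, 320, 220, 130, 90]

# Figure 1 behavior: the peak-summer schedule is about right but is
# never dialed back during the rest of the year.
inattentive = [420] * 12

# Figure 2 behavior: the schedule is adjusted seasonally but is always
# set roughly 25% too high.
unknowledgeable = [d * 1.25 for d in demand]
```

Under these assumed numbers, the inattentive schedule wastes far more water than the attentive-but-uncalibrated one, which is why the “hang it on the wall” approach only pays off in the Figure 1 world.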

Figure 2. Customer relatively attentive, but not knowledgeable

Instead of relying on generalized claims about customer behavior, it is important that agencies carefully assess which of the above two worlds their customers inhabit before embarking on a WBIC retrofit program. The 2001 and 2004 reports carefully examined this issue and showed that Orange County’s residential customers exhibit a fairly sharp seasonal pattern, with overwatering mostly concentrated during the summer and fall seasons—in other words, a pattern more like Figure 2 than Figure 1. It is difficult to say whether these patterns are unique to Irvine, possibly because of its budget-based rate structure, or are more general in nature (suggesting unreliability in prior survey research). That is why agencies must carefully assess this issue while designing their retrofit programs.

Expert opinion is fairly unanimous that a “set it and forget it” approach does not work well. Water conservation goals are difficult to realize unless homeowners are involved in ensuring that their entire system is working properly; that is, the landscape is properly hydro-zoned, is free of leaks, has appropriate nozzle hardware with proper pressure regulation, the controller is correctly programmed, its “smart” features are actually working, and so on. This is true of both weather-based and soil-moisture based “smart” controllers. The homeowner’s involvement may range from something as simple as changing batteries, especially if the weather or soil-moisture sensors are wireless, to a deeper understanding of horticultural and irrigation science in order to correctly program the “smart” controller. Some of the water savings that we expect from “smart” controllers comes from correctly programming parameters such as soil type, plant type, slope, and shade. Relying on default inputs can rob a “smart” controller of its ability to save water.

Finally, it must be remembered that the word “smart” does not always conjure the same meaning for homeowners as it does for irrigation professionals. Many homeowners associate the word “smart” with a controller’s ability to interface with a computer or smartphone, not necessarily with water use efficiency, which is why outreach and education are so important.


Estimating the cost of a new WBIC has become quite complicated given the plethora of models, manufacturers and hardware combinations now available to the residential consumer. At one end are add-on weather sensing modules that can be purchased and connected to an existing controller to make it “smart.” Not all controllers can accommodate an external sensor, but many do. At the other end are fully integrated units that must be purchased as a package to replace an existing controller. Add-on weather sensors for a 6-8 station residential controller can usually be purchased for under $100. Fully integrated “smart” controllers for residential applications can cost anywhere between $200 and $500, with a few models costing a great deal more. Professional installation can easily add another $100 to the cost of a new “smart” controller. The cost range for soil-moisture based “smart” controllers appears similar to that for WBICs. Given the rapid innovation taking place in this technological space, cost information will remain a bit of a moving target, requiring frequent market updates.


There are no good field data for estimating the average life of a WBIC. Expert judgment by landscape professionals generally places it at roughly 10 years. While this may be a good estimate for the main controller mounted indoors or within an enclosure, any outdoor weather sensors may have significantly shorter lives because of their exposure to the elements. How many homeowners replace faulty sensors in a timely manner, and how many simply accept a “smart” controller that has defaulted back to a traditional controller? Similarly, among those who must subscribe to a paid weather data service to keep their WBIC working in “smart” mode, how many cancel the service before the main controller fails? We don’t yet have good answers to these questions.


Several questions remain unanswered despite great interest in WBICs and many completed field trials. These include:

1. How should savings estimates obtained from field trials be extrapolated to other areas with very different property characteristics and/or weather patterns?

2. How should wasteful irrigators be identified and steered toward WBIC retrofit programs? What customer targeting rules lead to a cost-effective retrofit program?

3. What is the longevity of savings associated with WBIC retrofits? What actions should be taken to enhance savings longevity?

4. How do savings compare between weather-based and soil-moisture based “smart” controllers? Which circumstances favor one over the other?

With respect to the first two questions, it is important that future field evaluations build a “look forward” feature into their study design. It is not enough to estimate in retrospect what a certain program saved. It is also important to provide a way for others to assess, given that program XYZ saved a certain amount of water in its own service area, how much a similar program is likely to save in another area with different socio-economic and weather characteristics. Very often, a field evaluation will report savings in terms of gallons per household or percent reduction per household without providing any sense of how these savings varied with household characteristics, such as total water use, lot size, irrigated acreage, or level of wasteful irrigation prior to the retrofit. Without such granularity, others cannot extrapolate a study’s results to their own service areas. We hope future studies will go the extra mile to make their findings more broadly applicable, in the bargain also improving our ability to dissect and compare an evaluation’s results with those of other such evaluations.

Annotated Water Savings Literature

MWDOC and IRWD (2004) report their most recent in-depth study of their 7-year research program in the Residential Runoff Reduction Study (PDFs/runoff-table-of-contents.htm). The study measured the change in metered water consumption and directly measured urban runoff reduction (in flow volume and water quality). It determined that ET controllers reduced household water use on average by 41 gallons per day per single-family household (approximately 10 percent of total household water use); the bulk of the savings occurred in the summer and fall periods. The education-only group of residential customers saved 26 gpd, or about 6 percent of total water use, and the savings from this group were more uniform throughout the year. The report provides a discussion of the additional benefit attributable to peak-period demand-load reduction. In addition, 15 large landscape sites with dedicated landscape meters (ranging in size from 0.14 acres to 1.92 acres) were retrofitted with ET controllers. This portion of the study showed average water savings of 545 gpd.

Compared to a control group, the retrofit group showed a reduction of 71 percent in dry-season runoff. Water quality indicators were highly variable, and low statistical power precluded detection of statistically significant differences. Customer acceptance of ET controllers was robust, with 72 percent of the participants indicating that they liked the controllers and 70 percent ranking their landscape appearance as good to excellent.

IRWD (2001), the “ET Controller Study,” tested a system of controllers that were automatically adjusted using a broadcast signal based on weather conditions. The test group was compared to both a control group without intervention and a group that received postcards with ET information but no automatic controller adjustments. The controllers fitted to the test homes were all pre-programmed with the same irrigation schedule, which was then modified each week by the broadcast signal. Total household consumption was estimated to decline 7 percent in the post-retrofit year, roughly a 16 percent reduction in outdoor consumption, controlling for weather. This translates into a reduction of 37 gallons per household per day. The author cautions the reader against simplistically applying these savings results to other customers, as the program was voluntary and evidence was presented to indicate that the study group’s conservation potential was less than that of average customers with similar initial water consumption.

Aqua Conserve (2002) reports that ET controllers adjusted with historical data and temperature sensors successfully conserved water for high-volume residential customers in Colorado and California. Total outdoor water savings were 21 percent in Denver, with an average savings per participant of 21.47 percent. (A symmetric distribution of savings was reported for Denver.) For the City of Sonoma, total outdoor savings were 23 percent, with an average savings per participant of 7.37 percent. (A skewed distribution of water savings was reported for Sonoma.)

Valley of the Moon Water District reported 28 percent total savings with an average savings per participant of 25.1 percent. (A symmetric distribution of water savings was reported for Valley of the Moon.) Savings were calculated as post-intervention consumption relative to five-year historical consumption. A control group was used to control for test-year weather.

Aquacraft (2003) reports that of the 10 sites included in their study, savings averaged 26,000 gallons per year per site; savings from the 5 largest-saving sites were 68,000 gallons per site. As a group, water application by the controllers was 94 percent of ETo, or 28 inches of water. The sites were a combination of volunteer sites and those with high volume water use; all were residential except for one commercial site.

Bamezai (1996) reports savings in a study that considered the effects of connecting multiple meters to a central irrigation controller that controls watering based on ET for each meter. Controlling for climate and landscape size, the average savings per meter at the site was 34 percent. Most of the savings were achieved on sloped areas with diverse plant materials.

Persistence

Bamezai (2001) reports the results of an analysis of savings in the second year following the ET controller retrofit described in IRWD (2001). Water savings for the entire household were 8.2 percent in the second post-retrofit year, compared with 7.2 percent in the first year. Because these sites were not separately metered, an approximation was used to estimate the savings attributable to outdoor use and the ET controller program; by this approximation, outdoor savings were 18 percent.
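The apportionment described above can be sketched as simple arithmetic (a hedged illustration, not the study's actual method; the 45 percent outdoor share is a hypothetical value chosen only because it makes the reported household and outdoor figures roughly consistent):

```python
def outdoor_savings_pct(household_savings_pct: float, outdoor_share: float) -> float:
    """Scale household-level savings up to an outdoor-only percentage.

    Assumes essentially all of the savings occurred outdoors, so the
    household-level percentage is divided by the outdoor share of total use.
    """
    return household_savings_pct / outdoor_share

# With a hypothetical outdoor share of 45 percent, the 8.2 percent
# household savings implies roughly 18 percent outdoor savings,
# in line with the figure reported by Bamezai (2001).
print(round(outdoor_savings_pct(8.2, 0.45), 1))  # about 18.2
```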

DeOreo (1998) reports the results of a study of soil moisture sensors that work in conjunction with conventional irrigation timers, stopping watering during rain and whenever soil moisture is otherwise adequate. The study reports that after five years the sensors "successfully match irrigation applications to requirements," with seasonal applications "ranging from 52% to 124% of the theoretical, and the average equaling 76 percent." The wide range occurred because the controllers were set to maximum in this test.

Irrigation Controllers


  • For ET controllers to be fully effective, the existing irrigation system must be operated and maintained properly.
  • Some studies had to approximate the outdoor water consumption because target sites did not meter landscape use separately.
  • The studies frequently selected large-volume customers and volunteers. Care should be taken in generalizing these results: large customers tend to generate large absolute savings figures (though not necessarily larger percentage savings), and volunteers tend to be more receptive to conservation than average customers.

Confidence in Estimates


Related Literature



A&N Technical Services (2004), “Residential Runoff Reduction Study, Appendix C: Statistical Analysis of Water Savings,” prepared for the Municipal Water District of Orange County and the Irvine Ranch Water District, July.

A&N Technical Services (2004), “Residential Runoff Reduction Study, Appendix D: Statistical Analysis of Urban Runoff Reduction,” prepared for the Municipal Water District of Orange County and the Irvine Ranch Water District, July.

A&N Technical Services (1997), “Landscape Water Conservation Programs: Evaluation of Water Budget Based Rate Structures,” prepared for the Metropolitan Water District of Southern California, September.

Aqua Conserve (2002), “Residential Landscape Irrigation Study using Aqua ET Controllers,” Information from the CUWCC ET and Weather-Based Irrigation Controllers Workshop, March 2003. URL:

Aquacraft (Undated, Downloaded 2003), “Performance Evaluation of WeatherTRAK Irrigation Controllers in Colorado.” URL:

Ash, T. (1998), “Landscape Management for Water Savings: How to Profit from a Water Efficient Future,” Municipal Water District of Orange County.

Ash, T. (2002), “Using ET Controller Technology to Reduce Demand and Urban Water Run-Off: Summary of the Technology, Water Savings Potential & Agency Programs,” American Water Works Association Water Sources Conference Proceedings.

Bamezai, A. (1996), “Do Centrally Controlled Irrigation Systems Use Less Water? The Aliso Viejo Experience,” prepared for the Metropolitan Water District of Southern California, October.

Bamezai, A. (2001), “ET Controller Savings Through the Second Post-Retrofit Year: A Brief Update,” prepared for the Irvine Ranch Water District, April.

CUWCC 2003, “ET and Weather-Based Irrigation Controllers Workshop: Product Information,” California Urban Water Conservation Council, March. URL:

DeOreo, W.B., et al. (Undated, Approximately 1998), “Soil Moisture Sensors: Are They a Neglected Tool?”

IRWD (2001), “Residential Weather-Based Irrigation Scheduling: Evidence from the Irvine ‘ET Controller’ Study,” Irvine Ranch Water District, the Municipal Water District of Orange County, and the Metropolitan Water District of Southern California, June.

MWDOC and IRWD (2004), “The Residential Runoff Reduction Study,” Irvine Ranch Water District and the Municipal Water District of Orange County, July.

Walker, W., G. Kah, and M. Lehmkuhl (1995), “Landscape Water Management: Auditing,” Irrigation Training Research Center, California Polytechnic State University, San Luis Obispo.

Whitcomb, J. (1994), “Weather Normalized Evaluation,” Contra Costa Water District, August.

Whitcomb, J., G. Kah, and W. Willig (1999), “BMP 5 Handbook: A Guide to Implementing Large Landscape Conservation Programs as Specified in Best Management Practice 5,” California Urban Water Conservation Council, April.


  1. Addink, S., and Rodda, T. W., Residential Landscape Irrigation Study Using Aqua ET Controllers, 2002. Aquacraft, Performance Evaluation of WeatherTRAK Irrigation Controllers in Colorado, 2001 and 2002. Jordan, A., Lang, R., and Gonzales, M., High Tech World Meets the Residential Irrigation Controller to Save Water in Santa Barbara County, 2004. The Saving Water Partnership, Water Efficient Irrigation Study Final Report, 2003.
  2. Residential Weather-Based Irrigation Scheduling: Evidence From the Irvine “ET Controller” Study, 2001.
  3. The Residential Runoff Reduction Study, 2004.
  4. Evaluation of California Weather-Based “Smart” Irrigation Controller Programs, 2009.
  5. Pilot Implementation of Smart Timers: Water Conservation, Urban Runoff Reduction, and Water Quality, 2010.
  6. MWDOC Smart Timer Rebate Program Evaluation, 2011.
  7. Based on this author’s reanalysis of the original study’s data using the same analytic methods as used by the original 2009 study.
  8. Chesnutt, T.W., Statistical Impact Evaluation of Consumption Data from Metropolitan Weather Based Irrigation Controller Program, a white paper prepared for the US Bureau of Reclamation, California Department of Water Resources, and Metropolitan Water District of Southern California, 2013.
  9. We know of at least one manufacturer that rubbishes both SWAT testing and EPA’s WaterSense labeling in favor of the 2009 study’s results when touting its product’s efficacy. Water suppliers need to independently evaluate such products, because the 2009 study’s results are not reliable for ranking the relative efficacy of different controller models: WBICs were not allocated at random across the pool of study participants, so a rigorous apples-to-apples comparison between them is not possible.
  10. Baum-Haley, M., Soil Moisture Sensors, 2013, a potential best management practice evaluation report prepared for the CUWCC (forthcoming).
  1. Weather and Soil-Moisture Based Landscape Irrigation Scheduling Devices, Technical Review Report, 4th Edition, 2012.