How Development of America's Water Infrastructure Has Lurched Through History

Throughout history, as cities grew, new water infrastructure was built to supply this vital resource to increasing numbers of people. Initially, urban dwellers carried water from hand-dug wells and from the lakes and streams that ran through the city. As cities advanced, engineers built aqueducts and canals to import water from great distances. Among the engineering marvels of the ancient world, the Roman water system of elevated aqueducts, underground piping, and the world’s first sewer network is an iconic example of the ingenuity that made possible Europe’s first city of a million people.

Modern water systems owe a lot to the Roman innovations from 2,000 years ago. But instead of celebrating the technology that has allowed millions of people to survive in places where the local water supply is limited, we hide our water infrastructure underground and go about our daily lives oblivious to these lifelines. Today, we talk about urban water systems only when they fail. And therein lies our current problem: Much of the water infrastructure in the United States, Western Europe, and many other places is aging and in serious need of replacement or upgrading, especially to address the effects of a changing climate and a new generation of man-made contaminants.

Due to our complacency, only a serious crisis that could leave people without access to tap water is likely to free up the financial resources needed to bring water infrastructure—which in many places still includes pipes from the 1800s—into the 21st century. Absent an emergency, cash-strapped water utility managers will continue to deal with aging water systems by economizing on routine maintenance and deferring upgrades for as long as possible. This chronic funding shortage is so dire that the American Society of Civil Engineers has awarded the drinking water infrastructure of the United States grades of D-minus or D for over a decade.

Much of the water infrastructure in the United States, Western Europe, and many other places is aging and in serious need of replacement or upgrading.

Our reluctance to invest means that we allow our water systems to deteriorate until they nearly fail and invest in them only after the public decides that the status quo is unacceptable. Our water systems’ shortcomings were brought to the public’s attention by the recent experience of Flint, Michigan. But it doesn’t end there: Water systems are teetering on the edge of viability in numerous cities. We have seen this pattern before—and the present-day warning for us all is that the past is often prologue.

As the United States grew during the 1800s, transforming from an agrarian nation into an industrialized one, its swelling cities built drinking water infrastructure on a grand scale. But these developments had less to do with real planning than with reacting to crises. The first crisis occurred when rapid population growth overwhelmed the water infrastructure of the period—typically shallow wells or small reservoirs located within the city—leaving it unable to provide sufficient quantities of drinking water.

The clearest example of this was in New York, where the population more than tripled, from about 60,000 people to more than 200,000 people, between 1800 and 1830. After decades of denial by city leaders during which the wealthy drank water provided by the Manhattan Water Co. (the predecessor of Chase Bank) while the poor drank well water of dubious quality, New York’s leaders invested $9 million (about $850 per person in today’s dollars) to import water to the city using a system of canals, pipes, and reservoirs situated about 40 miles to the north.

Building upon this early success, New Yorkers spent another $177 million (about $500 per person today) to extend their water system another 60 miles in search of more clean water as the city grew in the subsequent decades. This pattern of population growth outstripping the capacity of local water supplies, followed by investments of hundreds of dollars per person to import water from great distances, also took place in Boston, Washington, Philadelphia, and other cities during this period. The periodic crises of growing East Coast cities taught the young country some valuable lessons. The technological know-how gained from the construction of dams and reservoirs aided the nation’s westward migration several decades later, when leaders of Seattle, San Francisco, and Los Angeles were able to build massive imported water systems before their cities reached a state of crisis.

Data Points

  • $60 billion: The amount the U.S. government invested in the 1970s and ’80s to make America’s waterways fishable and swimmable again after water pollution.
  • America’s waterways had deteriorated after years of inadequate sewage treatment, leading to the passage of the Clean Water Act in 1972.
  • 10% of Americans receive their drinking water from dams on the Colorado River.
  • Water levels in those dams have been falling since 2000 due to climatic shifts and increasing demand from cities and farmers.

These solutions to the nation’s first water crisis, though, spawned its second one. Once city dwellers had access to large quantities of water, per capita water consumption increased as they indulged in baths at home and replaced their outhouses with indoor toilets. The sewage produced by city dwellers flowed to the nearest rivers, which often served as the drinking water supply for the next downstream city. By the late 19th century, typhoid fever and other waterborne diseases had increased to epidemic levels.

The new challenge was to develop treatment plants that could make sewage-contaminated waters safe to drink. By the early 1900s, billions of dollars had been invested in the new technology of drinking water treatment. The National Academy of Sciences has hailed the corresponding decrease in waterborne disease and lengthened life spans as one of the top five technological achievements of the 20th century. Thanks to water filtration and chlorination, the second water crisis was averted.

America’s third water crisis occurred as cities again grew during the economic expansion that followed World War II. As people migrated to urban areas, the increased volume of wastewater they produced overwhelmed the assimilative capacity of the nation’s rivers, lakes, and estuaries, which until then had been able to purify the modest amounts of pollution they received. For the next 25 years, foul smells emanated from urban waterways, dead fish washed up on shorelines, and runaway algal blooms became the norm in lakes. Water pollution was a nuisance, but city leaders lacked the will to tax their constituents to build sewage treatment infrastructure that might benefit downstream communities more than their own—and the state of the nation’s waterways further deteriorated until the early 1970s. It was only then that the nation, fed up with water pollution, came to support the Clean Water Act—a federal law that established requirements for sewage treatment. The federal government provided cities with grants and low-interest loans to upgrade their inadequate sewage infrastructure. During the two decades ending in 1992, it invested over $60 billion (about $700 per person today) to again make America’s waterways fishable and swimmable.

As these investments in sewage treatment improved the environment, cities continued their struggle to keep up with the demand of growing populations. In addition to building more imported water systems, they turned their attention to conservation and passed laws that required low-flow fixtures and less thirsty landscaping in new housing developments.

But as we enter the third decade of the 21st century, two potential crises are again poised to threaten our ability to keep thirsty American cities supplied: continued growth in demand and the growing perception among residents of some communities that their tap water is no longer safe to drink.

The lengthened life spans resulting from these advances have been hailed as among the top five technological achievements of the 20th century.

The availability of water has continued to be an issue as population growth has driven demand. But what is complicating things more than before are climate change-induced shifts in precipitation patterns and a greater recognition that taking too much water from rivers and streams damages aquatic ecosystems. This means that the old model of piping water in from long distances is no longer attractive. For example, the water level in the massive dams on the Colorado River, which supplies some of the drinking water to about 10 percent of the nation’s population, has been falling since 2000 due to climatic shifts and increasing demand from cities and farmers. The imminent declaration by the Colorado River’s managers of a shortage means that water is about to get more expensive, and water rights lawyers will become more plentiful in cities throughout the Southwest as legal disputes increase. Recent droughts of historic duration and intensity from Texas to California also have contributed to a sense that action is needed to enhance water security—that simple notion of having enough available, clean water to meet society’s needs. Atlanta, Tampa, Florida, and Charlotte, North Carolina, are worrying about the security of their existing water supplies because their populations are approaching a point where local water sources will no longer be sufficient, especially during dry years.

Some communities facing water shortages have begun to think ahead by investing in new strategies for decreasing their reliance on imported water. This movement, which is sometimes referred to as water self-sufficiency, is furthest advanced in Southern California, where water has long been a scarce resource. The 2.5 million people of Orange County now recycle nearly all of their wastewater, passing it through an advanced treatment plant and returning it to the aquifer from which they draw their drinking water. The county currently satisfies 75 percent of its drinking water needs by combining water from wastewater recycling with groundwater recharged by rainwater that falls within the county and water from an effluent-laden stream that bisects the county. If the remaining 25 percent of the region’s imported water supply becomes too expensive or unreliable, the county could meet its water needs by building seawater desalination plants, just as its neighbors to the south, in San Diego, and to the north, in Santa Barbara, did in response to their water scarcity concerns.

Elsewhere, the drive toward water self-sufficiency has taken a different form, shaped by local geography and geology. In California’s Salinas Valley, technologies similar to those used to recycle wastewater in Orange County are being repurposed to create drinking water from a mixture of municipal wastewater effluent, runoff from city streets and farm fields, and wash water from food processing plants.

What is complicating things more than before are climate change-induced shifts in precipitation patterns and a greater recognition that taking too much water from rivers and streams damages aquatic ecosystems.

On the East Coast, in eastern Virginia, the local utility is treating wastewater with advanced technologies before using it to recharge the local drinking water aquifer. The project makes sense in that relatively wet part of the country because it eliminates the discharge of nutrient-rich wastewater to the ecologically sensitive Chesapeake Bay and counteracts land subsidence that has made the region increasingly vulnerable to flooding from rising sea levels.

The second potential water crisis is related to a growing public perception that tap water is no longer safe to drink. The failure of the municipal water system in Flint to properly manage its aging pipe network, which contaminated the water supply with lead and Legionella bacteria, was national news a few years ago. More recently, the discovery that chemicals used for firefighting and industrial manufacturing—the per- and polyfluoroalkyl substances referred to as PFAS—have contaminated water supplies for about a quarter of the nation has further highlighted the vulnerability of drinking water systems to man-made pollutants.

Most important, this discovery raises a significant new issue: Can our old water filtration and disinfection plants protect public health? Simply retrofitting treatment plants in places where water supplies are known to be contaminated and banning difficult-to-treat chemicals like PFAS will not protect us from the coming quality challenges. Evidence of the systemic shortcomings of the existing drinking water system is apparent a short drive south of Flint, in Toledo, Ohio, where continued release of nutrients from farms, wastewater treatment plants, and city streets, coupled with warmer temperatures in the Great Lakes, resulted in blooms of toxic algae that made tap water unsafe for several days in 2014. The exact cause of more recent toxic algal blooms that have occurred in Florida, Oregon, Ohio, and other parts of the country is unclear, but most experts suspect that nutrients legally released from farms and cities are the main culprit. Simply put, our aging drinking water systems are not ready for the less forgiving future that will prevail in an era of climate change and inadequate pollution regulations.

Considering the way that change has come about in the past, it seems likely that the nation will have to weather a few more high-profile drinking water contamination incidents before public opinion forces action. When change does come about, it would be useful if the means of evolving our water systems were ready to be deployed. Using the water self-sufficiency movement as a starting point, it may be possible to rapidly adapt existing infrastructure. For example, the reverse osmosis technology used to make municipal wastewater effluent and seawater safe to drink by forcing water through a membrane that captures salts, microbes, and chemicals could be repurposed to remove PFAS and algal toxins from water supplies. With a little more development, emerging technologies that have yet to be deployed at scale, such as energy-efficient LED water disinfection lamps and treatment systems that use electricity instead of difficult-to-manage chemicals to decontaminate water, could provide new approaches for solving water-quality problems. Although advanced treatment technologies will not solve all of the problems related to decaying water pipes, aging dams, and inadequate treatment plants, they may create the means to move away from our historic reliance on massive infrastructure projects that have become too expensive to properly maintain.

For example, point-of-use water filters that purify only the water coming into the kitchen, and building-scale water recycling systems that clean up any contaminants picked up within the underground pipe network, could reduce costs by allowing water used outdoors for cleaning and irrigation to be treated less stringently than drinking water. Additional savings could be realized by investing in underutilized technologies that prevent treated water from escaping aging water pipes between the treatment plant and the user.

Given these needs, our nation’s water systems are on the cusp of a once-in-a-generation change involving costs that could reach $100 billion. Whether the change is preceded by crises that compromise public health and damage local economies will depend upon the investments that are made over the next few years. Federal agencies, including the National Science Foundation and the Department of Energy, along with water-stressed cities in Southern California and Texas, have begun to invest in the research and development needed to adapt urban water infrastructure to a future with greater water scarcity and increasing threats to water quality. Elected officials and community leaders now must recognize that they have an important role to play in reforming the institutions, regulations, and financial policies that impede systemic change. Our history of crisis and response will likely continue, but the more we can anticipate and plan, the better the chance that we’ll have the safe water we all need in a less forgiving future.


David Sedlak is the Malozemoff professor in the Department of Civil and Environmental Engineering at the University of California, Berkeley; co-director of the Berkeley Water Center; and author of Water 4.0: The Past, Present, and Future of the World’s Most Vital Resource.
