(NewsNation) — As the demand for artificial intelligence accelerates, data centers are consuming vast amounts of freshwater, raising concerns over potential strains on water supplies across the United States.

When AI is given a prompt, massive server hubs consume enormous amounts of energy to respond, causing the machines to heat up. To cool down those servers, many data centers use fresh water, which then evaporates.

According to new research from Purdue University, the average data center uses 300,000 gallons of water daily — enough to supply 1,000 homes — and two-thirds of new data centers are built in water-scarce regions, such as Arizona and Texas.

The Houston Advanced Research Center has estimated that data centers in the Lone Star State will use 46 billion gallons of water this year.
