Artificial intelligence may live in “the cloud,” but its footprint is firmly on the ground. As AI systems grow more powerful, the data centers that train and run them are consuming massive amounts of land, water, and electricity—and reshaping regional power grids. What does this surge in demand mean for the environment, energy infrastructure, and the future of innovation?
In this episode, we speak with UChicago computer scientist Andrew Chien, an expert in large-scale and cloud computing, about why these data centers require so much power, why they’re stirring such controversy—and whether there are sustainable approaches that could keep our energy use in check.