AI tools like ChatGPT seem fantastic. But do you know the environmental cost of the queries that you and I are making on ChatGPT?
“An average user’s conversational exchange with ChatGPT basically amounts to dumping a large bottle of fresh water out on the ground, according to a new study.” 😳
Microsoft says that its most recent supercomputer contains 10,000 graphics cards and over 285,000 processor cores, and a machine of that scale needs a serious cooling apparatus. It offers a glimpse of the enormous operation running behind artificial intelligence. OpenAI has not disclosed how long it took to train GPT-3, but we can make a guess.
To keep servers from breaking down from the heat generated by all that computing, server rooms are typically kept chilly, between 50 and 80 degrees Fahrenheit. Cooling towers carry that heat away by evaporating cold water, and it takes a lot of water for them to do the job. Nor can just any water be used: to prevent the corrosion and bacterial growth that come with seawater, data centers draw their water from pristine, freshwater sources.
The study estimates that ChatGPT consumes about 500 millilitres of water to complete a simple conversation with a user of 25 to 50 questions. It also notes that water use could have been roughly three times higher had the model been trained in the company's less energy-efficient Asian data centers.
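To put that figure in per-question terms, here is a rough back-of-envelope calculation using only the numbers quoted above (the 500 mL and 25-to-50-question figures come from the study; everything else is simple arithmetic):

```python
# Back-of-envelope: water per question, from the study's figures.
WATER_PER_CONVERSATION_ML = 500   # ~500 mL per simple conversation
QUESTIONS_LOW, QUESTIONS_HIGH = 25, 50  # questions per conversation

# A shorter conversation means more water per question, and vice versa.
ml_per_question_high = WATER_PER_CONVERSATION_ML / QUESTIONS_LOW
ml_per_question_low = WATER_PER_CONVERSATION_ML / QUESTIONS_HIGH

print(f"Roughly {ml_per_question_low:.0f}-{ml_per_question_high:.0f} mL "
      f"of water per question")
```

So each question you type costs on the order of a tablespoon of fresh water.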
According to the researchers, an average data center uses about a gallon of water for every kilowatt-hour consumed, and by their calculations the clean freshwater needed to train GPT-3 equals the volume of water required to fill a nuclear reactor's cooling tower.
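That gallon-per-kilowatt-hour rule of thumb turns any energy estimate into a water estimate. A minimal sketch, assuming the study's 1 gallon/kWh figure and a US gallon (the 1,000 kWh input is just an illustrative number, not a figure from the article):

```python
# Convert a data center's energy use into an estimated water footprint,
# using the study's rule of thumb of ~1 gallon of water per kWh.
GALLONS_PER_KWH = 1.0       # figure quoted from the study
LITRES_PER_GALLON = 3.785   # US gallon

def water_litres(energy_kwh: float) -> float:
    """Estimate the fresh water (litres) consumed for a given energy use (kWh)."""
    return energy_kwh * GALLONS_PER_KWH * LITRES_PER_GALLON

# Illustrative example: a workload drawing 1,000 kWh.
print(f"{water_litres(1000):,.0f} litres")  # 3,785 litres
```

Plug in any published training-energy estimate for a large model and the litres add up very quickly.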
This problem is not limited to OpenAI's or Microsoft's data centers, which are relatively small compared to Google's and Amazon's. Imagine the water shortages we will be running into soon 🤯