So-called artificial intelligence (AI) is all the rage right now, with everyone from your grandma asking ChatGPT how to code in Python to influencers making videos without having to hire extras, but one growing concern is where the power for all those data centers is going to come from. The MIT Technology Review team did a deep dive into the current situation and whether AI is going to kill us all (with carbon emissions).

Probably of most interest to you, dear hacker, is how they came up with their numbers. With no agreed-upon methodology and different companies doing different types of processing, a number of assumptions were baked into their estimates. Given the lack of information on closed-source models, open source models were used as the benchmark for energy usage, with the results extrapolated to the industry as a whole. Unsurprisingly, larger models have a larger energy footprint. If you want a feel for how that kind of estimate works, there’s a back-of-the-envelope sketch at the end of this post.

While data center power usage held roughly steady from 2005 to 2017, as gains in efficiency offset the growth of online services, by 2023 data centers had doubled their energy consumption from those earlier numbers. The power running into those data centers is already 48% more carbon intensive than the US average, and that figure is expected to rise as new data centers push for increased fossil fuel usage, like Meta in Louisiana or the X data center found to be using methane generators in violation of the Clean Air Act.

Technology Review did find “researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand.” This would mean either reallocating requests to servers in other geographic regions or just slowing down responses for the 80-90 hours a year when the grid is at its highest loads. The second sketch below runs that arithmetic.

If you’re interested in just where a lot of the US-based data centers are, check out this map from NREL. Still not sure how these LLMs even work? Here’s an explainer for you.
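For a sense of the kind of extrapolation involved, here’s a minimal sketch of estimating per-response energy from an open model you can actually instrument. Every number in it (GPU power draw, GPU count, generation time, PUE, query volume) is an illustrative assumption of ours, not a figure from the Technology Review piece:

```python
# Back-of-the-envelope per-response energy estimate, in the spirit of
# benchmarking an open model and scaling up. All constants below are
# illustrative assumptions, not measured or reported values.

GPU_POWER_W = 700           # assumed draw of one H100-class GPU under load
NUM_GPUS = 8                # assumed number of GPUs serving the model
SECONDS_PER_RESPONSE = 2.0  # assumed wall-clock time to generate one reply
PUE = 1.2                   # assumed power usage effectiveness (cooling, etc.)

def joules_per_response(gpu_w=GPU_POWER_W, n_gpus=NUM_GPUS,
                        seconds=SECONDS_PER_RESPONSE, pue=PUE):
    """Energy for one response, including facility overhead via PUE."""
    return gpu_w * n_gpus * seconds * pue

wh = joules_per_response() / 3600  # joules -> watt-hours
print(f"~{wh:.1f} Wh per response")  # ~3.7 Wh with these assumptions

# Extrapolate to a hypothetical service handling a billion queries a day.
daily_kwh = wh * 1e9 / 1000
print(f"~{daily_kwh:,.0f} kWh per day")  # ~3.7 million kWh/day here
```

Swap in more GPUs or longer generation times for larger models and the footprint balloons accordingly, which is the “larger models, larger footprint” result in a nutshell.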
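And here’s the curtailment arithmetic as a toy sketch. The 76 GW figure comes from the quote above; the aggregate load and peak hours are illustrative values back-solved to match it, not numbers from the study:

```python
# Toy illustration of peak-shaving: if data centers shed half their load
# during the grid's worst hours, utilities gain that much headroom for new
# demand. Inputs are illustrative assumptions back-solved from the quoted
# 76 GW figure, not values reported by the researchers.

DATA_CENTER_LOAD_GW = 152    # hypothetical aggregate flexible load
CURTAILMENT_FRACTION = 0.5   # cut usage "roughly in half"
PEAK_HOURS_PER_YEAR = 85     # middle of the 80-90 highest-load hours

headroom_gw = DATA_CENTER_LOAD_GW * CURTAILMENT_FRACTION
print(f"Peak headroom freed: {headroom_gw:.0f} GW")  # 76 GW

# The energy actually deferred (slowed responses) or shifted to other
# regions is tiny relative to the whole year:
shifted_gwh = headroom_gw * PEAK_HOURS_PER_YEAR
hour_fraction = PEAK_HOURS_PER_YEAR / 8760
print(f"Shifted or deferred: {shifted_gwh:,.0f} GWh "
      f"across {hour_fraction:.1%} of the year's hours")
```

The point is that the sacrifice is small: giving up full throughput for about one percent of the year’s hours buys a lot of peak capacity on paper.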