Artificial Intelligence Data Centres and Their Utility Bill: Beginning at Lake Tahoe

By the end of the second week of May 2026, news emerged that an energy company named NV Energy had informed the residents of Lake Tahoe that its energy deliveries would end in May 2027. A new energy vendor will need to be found so that the residents can keep their electricity. The reason stated: growing energy demand from upcoming AI data centres in the Nevada region. Well, that just sounds amazing, doesn't it? Your day-to-day life is being impacted in favour of a large building filled with computers and servers.

Figure 1: Lake Tahoe (Reference: Pexels)

Why was Lake Tahoe chosen? The area has a low average temperature year round; the maximum temperature recorded was 25.5 ℃, back in 2006. This makes the location ideal for a facility requiring constant temperature control. The choice has come as a massive shock to the locals, as their electrical energy is now being shifted to the invading corporation.

The true comedy lies in the fact that this is not the first time the common people of the U.S. have suffered at the hands of these AI data centres.

Let us talk about these data centres first. What even are they? According to IBM: "An AI data center is a facility that houses the specific IT infrastructure needed to train, deploy and deliver AI applications and services. It has advanced compute, network and storage architectures and energy and cooling capabilities to handle AI workloads." Cutting through the big words, what they mean is that an AI data centre is a more expensive and sophisticated data centre. A traditional data centre stores and distributes digital information; an AI data centre trains models on that information. Simple storage and distribution have vastly different hardware requirements compared to processing and training. High-performance memory and graphics processing units like the Nvidia H100 are used in such cases, and the growing computational complexity requires ever more energy to power such advanced hardware. This hardware also produces a tremendous amount of heat; if our phones get hot nowadays, you can imagine the heat that a hall full of servers produces. Huge cooling systems requiring large amounts of water are used to regulate the temperature.

The hardware crisis is ongoing, as seen in ballooning RAM prices, though it has stabilized a bit due to the advancements in AI development. Thermal management happens to be a big obstacle, and water cooling just so happens to be the best solution. I want you to think about what we have been taught all through school: do not waste water. These data centres use up all the water you might have saved, leaving you with precious little. Apparently, one 100-word prompt can use up 500 mL of water. That seems like a small number, for one user. If approximately one million users are prompting at the same time, the consumption becomes 500 kilolitres, or 500,000 L. If we assume this is the amount used in one minute, then in one hour we would have used 30 million L (60 min x 500,000 L) of water. For comparison, Lake Tahoe, our unfortunate victim, holds around 156 trillion litres of water. At that rate, we would use up a Lake Tahoe's worth of water in 5,200,000 hours, equal to roughly 594 years.

Wow, way to make a point, dude; what you just proved is that we have enough water to last us centuries. May I remind you that I used one million people as a starting number. Look at the number of people using AI: your brothers, sisters, teachers, bosses, all of them are using it. India alone has around 2.37 million students appearing for the class 10 Central Board exam in 2025, and that is not including state boards. I am sure about half of those students used some commercially available AI to help in their studies. Moving up the chain, if we take one million students from each remaining year of school and college, we end up with 7 million users (2 years in senior secondary and 4 years in a bachelor's). The number of years has now dropped to roughly 85. This is still an estimate, and may I say, a very conservative one: I have not involved anyone in the workforce or any person outside India. I guarantee you that the time frame will shrink drastically once the correct figures are taken.
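The water estimate above can be sketched as a quick back-of-envelope script. All the inputs are the article's assumptions (500 mL per prompt, one million prompts per minute, ~156 trillion litres in Lake Tahoe), not measured figures:

```python
# Back-of-envelope water estimate using the article's assumed figures.
LITRES_PER_PROMPT = 0.5        # 500 mL per 100-word prompt (assumption)
PROMPTS_PER_MINUTE = 1_000_000 # one million concurrent users (assumption)
TAHOE_LITRES = 156e12          # approximate volume of Lake Tahoe

litres_per_hour = LITRES_PER_PROMPT * PROMPTS_PER_MINUTE * 60  # 30 million L/h
hours_to_drain = TAHOE_LITRES / litres_per_hour                # 5.2 million hours
years_to_drain = hours_to_drain / (24 * 365)                   # ~594 years

print(f"{litres_per_hour:,.0f} L/h, lake drained in ~{years_to_drain:.0f} years")
# Scaling the user base to 7 million cuts the figure by a factor of 7:
print(f"With 7M users: ~{years_to_drain / 7:.0f} years")
```

Scaling linearly with the user count is the whole trick here: every extra million concurrent users knocks another factor off the drain time.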

Going back to electricity, let us do some more short estimations. One Nvidia H100 SXM has an average power consumption of around 400 W. 1000 W used for one hour is 1 kWh, more commonly referred to as one unit of energy, so one H100 SXM will use 0.4 units of energy per hour. One large-scale data centre, of the kind used by industry giants like OpenAI, Google, and Microsoft, houses an average of 1,000-10,000 AI servers. A cutting-edge server like the DGX H100 has 8 H100s fitted inside it, so one server will use 3.2 units of energy per hour; and that is before counting peripherals like SSDs, cooling systems, and other absolutely critical hardware, which can push the usage to roughly three times the estimate, around 9.6 units per hour. For one top company's 10,000-server AI data centre, that number becomes 96,000 units of electricity per hour, which converted to daily usage becomes 2304 MWh.
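The data-centre energy estimate works out the same way. Keeping everything in integer watts avoids rounding; again, the 400 W average draw, the 3x overhead multiplier, and the 10,000-server count are the article's assumptions:

```python
# Rough data-centre energy estimate from the article's assumptions.
GPU_WATTS = 400       # assumed average draw per H100 SXM
GPUS_PER_SERVER = 8   # a DGX H100 holds 8 H100s
OVERHEAD = 3          # cooling, storage, networking multiplier (assumption)
SERVERS = 10_000      # upper end of the article's 1,000-10,000 range

server_watts = GPU_WATTS * GPUS_PER_SERVER * OVERHEAD  # 9600 W = 9.6 units/h
dc_kwh_per_hour = server_watts * SERVERS // 1000       # units (kWh) per hour
dc_mwh_per_day = dc_kwh_per_hour * 24 // 1000          # MWh per day

print(dc_kwh_per_hour)  # 96000
print(dc_mwh_per_day)   # 2304
```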

Adding up all the affected residents at Lake Tahoe, the number comes to roughly 49,000. If we approximate that one U.S. resident uses an average of 30 kWh per day, the total daily energy used by the affected area comes to 1470 MWh: less than a single such data centre. I am sure you see that staggering figure and now know the cost of artificial intelligence, not just to the environment, but most importantly, to our wallets.
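For completeness, here is the residential side of the comparison, using the article's figures (49,000 affected residents, an assumed 30 kWh per person per day):

```python
# Daily energy use of the affected Lake Tahoe community (article's figures).
RESIDENTS = 49_000
KWH_PER_RESIDENT_PER_DAY = 30  # assumed U.S. average

residential_mwh_per_day = RESIDENTS * KWH_PER_RESIDENT_PER_DAY // 1000
print(residential_mwh_per_day)  # 1470

# The single-data-centre estimate from the text was 2304 MWh/day,
# so one facility out-draws the entire affected community.
print(2304 > residential_mwh_per_day)  # True
```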

How is the U.S. of A. managing to cope with this drastic increase in electricity consumption? Quite easily, actually: they just dump the extra cost on the average consumer. In states with a large concentration of data centres, like Virginia, electricity costs have risen by 267% in the past five years. The poorest demographic has been hit the hardest, with the electricity bill now taking up 20% of their income.

Table 1: India's total electricity consumption compared with the USA's data centre consumption. (Reference data: chartingtheglobe, rigzone)



Figure 2: Bar graph showing the electricity consumption of India if we had the same level of data centre penetration as the US.


Figure 3: Percentage of data centre power consumption increase versus year. The trend shows data centre power consumption increasing faster than total consumption.

What does this mean for dear old India? The rise of electric vehicles can already strain our power grid; adding data centres into the mix will increase the challenge tenfold. Suddenly, a large facility that uses a city's worth of power 24x7, without any breaks, takes up the electricity that you were using. All the lessons about saving electricity can go out the window in these circumstances. India's AI market will continue to grow, but will this growth come at the cost of its customers? Will Indians stay silent as the AI machine takes their jobs, electricity, and water?
