Our recent commentary on data center power demand argued that AI can only improve through brute-force application of ever more processing power and ever larger datasets, with exponential increases in energy consumption for a linear improvement in performance. In particular, we cited the cost to train a model, which had grown to between a hundred million dollars and as much as a billion dollars apiece and which traces a clear upward trajectory even on a logarithmic scale. The natural conclusion was that the billion-dollar training mark would be passed, then $10 billion, or even $100 billion. Running these models, which is where long-term power demand enters the conversation, would likewise become ever more energy-intensive. President Trump’s $500 billion Stargate AI data…