When OpenAI’s CFO Sarah Friar used the word “backstop” while talking about AI funding, the internet panicked. People thought she meant a government bailout like the 2008 financial crisis. In reality, she was just referring to long-term support for big infrastructure projects. But the reaction showed one thing: most people have no idea what it actually costs to build artificial intelligence at today’s scale, or why those costs sound so huge.

Building a large AI model isn’t like creating an app or a website. It’s more like building a national power grid or railway system: massive, complicated, and expensive at the start. Three things make up most of the bill: compute (chips), energy (power), and data (training material).

1. Compute

To train a large AI model, companies need
