Most organisations don’t even recognise what good engineering looks like. They treat software development as a commodity – a manufacturing production line – measured by how many features are shipped rather than whether the right outcomes are achieved. Few understand the value of investing in modern software engineering best practices and design – the things that make those outcomes sustainable.
I have never been the best engineer I’ve ever met.
Even if we may be in an AI bubble, it seems Altman is expecting OpenAI to survive the burst. “You should expect OpenAI to spend trillions of dollars on data center construction in the not very distant future,” Altman said. “You should expect a bunch of economists to wring their hands.”
I’m not sure what’s more audacious. Declaring everyone ELSE is delusional, or lighting billions on fire for the equivalent of a summary machine…
Seven years from GPT-1 to the plateau. How many more until we stop trying to build intelligence and start trying to understand what we’ve already built? That’s the real work now - not training the next model, but figuring out what to do with the ones we have. Turns out the singularity looks less like transcendence and more like integration work. Endless, necessary integration work.
I think this is it.
NVIDIA’s earnings are, effectively, the US stock market’s confidence, and everything rides on five companies — and if we’re honest, really four companies — buying GPUs for generative AI services or to train generative AI models. Worse still, these services, while losing these companies massive amounts of money, don’t produce much revenue, meaning that the AI trade is not driving any real, meaningful revenue growth.
We’re three years in, and generative AI’s highest-grossing companies — outside OpenAI ($10 billion annualized as of early June) and Anthropic ($4 billion annualized as of July), and both lose billions a year after revenue — have three major problems:
AI assistants optimize for making tests pass and errors disappear. Without clear direction, they’ll take the path of least resistance. Common shortcuts to watch for:
• TypeScript any types appearing when proper typing gets complex
• Tests getting commented out or skipped when they’re hard to fix
• Quick fixes that address symptoms rather than root causes
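To make the first shortcut concrete, here is a minimal, hypothetical TypeScript sketch (the function names and response shape are invented for illustration): the loose version uses any to silence the compiler, while the stricter version models the expected shape and fails loudly when it doesn’t match.

```typescript
// Shortcut: `any` makes the error disappear instead of modeling the data.
// This compiles, but silently returns undefined (or throws) on bad input.
function parsePriceLoose(response: any): number {
  return response.data.price;
}

// Proper typing: declare the expected shape and check it at the boundary.
interface PriceResponse {
  data: { price: number };
}

function parsePrice(response: unknown): number {
  const r = response as Partial<PriceResponse>;
  if (typeof r?.data?.price !== "number") {
    throw new Error("unexpected response shape");
  }
  return r.data.price;
}
```

The second version is the one an assistant will rarely write unprompted: it costs a few extra lines, but it moves the failure from deep inside the program to the boundary where the bad data entered.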
Great write-up on using the Claude Code CLI. And this was in July, before 3.
So from an executive perspective, lighting comically large piles of money on fire trying to teach graphics cards how to read is, surprisingly, the logical play. The rest, well, that’s all just creative marketing. It’s very difficult to show up to a quarterly shareholder meeting and tell your investors you just vaporized another $10 billion for absolutely no return-on-investment. At least, that is, without them questioning if you’ve completely lost your mind.
Some voices within the industry began to wonder if the A.I. scaling law was starting to falter. “The 2010s were the age of scaling, now we’re back in the age of wonder and discovery once again,” Ilya Sutskever, one of the company’s founders, told Reuters in November. “Everyone is looking for the next thing.” A contemporaneous TechCrunch article summarized the general mood: “Everyone now seems to be admitting you can’t just use more compute and more data while pretraining large language models and expect them to turn into some sort of all-knowing digital god.”
SUMMARY
• 66% of respondents have adopted AI tools in production.
• 85% are focused on internal engineering use cases.
• 59% of respondents feel AI has increased productivity.

The tooling landscape

Whether overzealous leaders have mandated adoption or allowed engineers to discover these tools themselves, it’s safe to say that AI coding assistants and large language models (LLMs) are firmly part of the software developer’s tool belt today. Two-thirds (66%) of respondents have adopted AI tools or models for at least some use cases, with 20% at a pilot stage, and 13% still exploring.
Altman illustrated the productivity revolution with a personal example. He described using an upcoming OpenAI model to complete a complex home automation programming task that would have taken him “days to do” before AI assistance.
The AI completed “almost all of the work” in just “5 minutes,” he said. A year ago, “you would have paid a very high-end programmer 20 hours, 40 hours something like that to do” the same task.
Look, the economics of this just don’t make sense and the “what trillion dollar problem will AI solve?” question takes up a lot of space in my brain. I see people talking about the limited ways in which they work with it now, and wonder what happens when the bill comes due. We wouldn’t, as a society, pay a trillion dollars to solve those problems. Not even close.
That said, I’m still excited to talk about how AI is improving some of the things I do, and it especially helps me code from time to time.