Welcome to my public knowledge archive, where I document insights from articles, research, and ideas worth remembering.
More than three quarters (77%) of companies’ usage of Anthropic’s Claude AI software involved automation patterns, often including “full task delegation,” according to a research report the startup released on Monday. The finding was based on an analysis of traffic from Anthropic’s application programming interface, which is used by developers and businesses.
I think the critical thought here is that maybe users are only leaning into AI where it makes sense right now?
Do I program any faster? Not really. But it feels like I’ve gained 30% more time in my day because the machine is doing the work. I alternate between giving it instructions, reading a book, and reviewing the changes.
I think this is the way, and it matches what I’ve experienced. Sure, you can spool code from the CLI to your editor and review via git diffs. But are you gaining more throughput, or just offloading syntactic structure?
Code review, however, emerged as the most significant challenge. Reviewing the generated code line by line across all changes took me approximately 20 minutes. Unlike pairing with a human developer—where iterative discussions occur at a manageable pace—working with an AI system capable of generating entire modules within seconds creates a bottleneck on the human side, especially when attempting line-by-line scrutiny. This isn’t a limitation of the AI itself, but rather a reflection of human review capacity.
Seven years from GPT-1 to the plateau. How many more until we stop trying to build intelligence and start trying to understand what we’ve already built? That’s the real work now - not training the next model, but figuring out what to do with the ones we have. Turns out the singularity looks less like transcendence and more like integration work. Endless, necessary integration work.
I think this is it.
Automation affects workers in different ways. In some cases, technology acts as a complement to human labor, and in other cases as a substitute for human labor. Over the long run, technological advance creates new goods and services, raises national income, and increases the demand for labor throughout the economy. However, it is important to note that these changes can create winners and losers—some workers will lack the skills to transition to new jobs.
Harris took a different approach. Svelte performs its middle-layer work before a developer uploads code to a web server, well before a user ever downloads it. This makes it possible to remove unnecessary features, shrinking the resulting app. It also reduces the number of moving parts when a user runs the app, which can make Svelte apps faster and more efficient. Wang says he likes to use Svelte for web pages, but he still uses React for larger applications, including his professional work.
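A rough way to picture the difference, as a sketch only (illustrative TypeScript, not actual Svelte compiler output): a compile-time framework can reduce a component’s reactivity to plain event listeners and targeted DOM writes, so no diffing runtime has to ship to the browser.

```ts
// Illustrative sketch only -- not real Svelte output. A counter "component"
// compiled down to direct DOM updates: no virtual DOM, no framework runtime.
export function mountCounter(target: HTMLElement): () => void {
  let count = 0;

  const button = document.createElement("button");
  const render = () => {
    button.textContent = `Clicked ${count} ${count === 1 ? "time" : "times"}`;
  };

  // Reactivity becomes a plain listener plus a targeted DOM write,
  // instead of re-rendering and diffing a component tree at runtime.
  const onClick = () => {
    count += 1;
    render();
  };

  render();
  button.addEventListener("click", onClick);
  target.appendChild(button);

  // Teardown, roughly analogous to unmounting the component.
  return () => {
    button.removeEventListener("click", onClick);
    button.remove();
  };
}
```

That smaller, runtime-free output is the “shrinking the resulting app” point above; React keeps its reconciliation layer in the browser, which is part of why Wang still reaches for it on larger applications.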
Some of us lean on AI coding to push side projects faster into the delivery pipeline. These are not core product features but experiments and MVP-style initiatives. For bringing that kind of work to its first version, the speed-up is real. … output quality gets worse the more context you add. The model starts pulling in irrelevant details from earlier prompts, and accuracy drops. … AI can get you 70% of the way, but the last 30% is the hard part.
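One practical reading of the context problem (a hedged sketch; `ChatClient` and `send` are hypothetical stand-ins, not a real SDK): scope each task to its own short conversation rather than one ever-growing thread, so earlier, unrelated details can’t bleed into later answers.

```ts
// Hypothetical interfaces for illustration only -- not a real SDK.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

interface ChatClient {
  send(messages: ChatMessage[]): Promise<string>;
}

// Keep each task in its own fresh, minimal context instead of appending
// to one long-running thread where stale details accumulate.
async function runTask(client: ChatClient, taskPrompt: string): Promise<string> {
  const messages: ChatMessage[] = [{ role: "user", content: taskPrompt }];
  return client.send(messages);
}
```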
“We need to stop talking about AI as a magic fix and instead focus on the specifics: where are the biggest points of friction for developers, how can AI help alleviate that friction, and specifically how should developers use AI tools to overcome that friction and move faster?” Laura Tacho, CTO at DX, told LeadDev earlier this year.
Not sure if I captured this before, but writing code was never the bottleneck in software development.
As my friend Kasey put it in a recent conversation, growth is a fire. If you build a nice, sustainable fire, it’ll keep you warm, cook food and sustain life. And if the only thing you care about is how big your fire is, then it’ll set fire to everything around it, and the more you throw into it, the more it’ll burn. Eventually, you’ll have nothing left, but if you desperately desire that fire, you will constantly have to find new things to burn at any cost.
NVIDIA’s earnings are, effectively, the US stock market’s confidence, and everything rides on five companies — and if we’re honest, really four companies — buying GPUs for generative AI services or to train generative AI models. Worse still, these services, while losing these companies massive amounts of money, don’t produce much revenue, meaning that the AI trade is not driving any real, meaningful revenue growth.
We’re three years in, and generative AI’s highest-grossing companies — outside OpenAI ($10 billion annualized as of early June) and Anthropic ($4 billion annualized as of July), and both lose billions a year after revenue — have three major problems: