Expectations have accelerated rapidly. One engineer said that building a feature for the website used to take a few weeks; now it must frequently be done within a few days. He said this is possible only by using A.I. to help automate the coding and by cutting down on meetings with colleagues to solicit feedback and explore alternative ideas.
Lots of ink has been spilled on the productivity of AI.
Every couple of days a new article pops up about how engineers are X% more productive, and how company Y laid off hundreds of developers because they are not needed anymore. … Also, if you are working on a completely fresh codebase, or on a PoC, the gains can be huge. I was able to build in the last 2 months something that would have taken me a year previously.
I have found that AI-generated code is often sloppy, unnecessarily complex, and a lot of the time, just plain wrong. For me, AI code generation is akin to mindlessly copy-pasting code snippets from Stack Overflow, and we all know how that goes. It usually takes me longer to understand AI-generated code than to write my own.
You can’t offload understanding. Using AI to generate entire projects isn’t the solution.
Dohmke [GitHub CEO] described an effective workflow where AI tools generate code and submit pull requests. Developers can make immediate adjustments using their programming skills.
Matches my experience as well. A silly example: I ask AI to add padding to a div and it adds an inline style attribute, not a pt-3 utility class. AI has been great for getting things 80% of the way there. The rest is still up to us.
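To make that contrast concrete, here is a minimal sketch, assuming a React codebase styled with Tailwind (the component names are made up for illustration): the one-off inline style an assistant tends to emit versus the pt-3 utility class the codebase convention calls for.

```tsx
import * as React from "react";

// What the assistant tends to generate: a one-off inline style on the element.
export function CardInlineStyle() {
  return <div style={{ paddingTop: "0.75rem" }}>Card content</div>;
}

// What the codebase convention actually wants: the Tailwind utility class pt-3,
// which resolves to the same padding-top: 0.75rem on the default spacing scale.
export function CardUtilityClass() {
  return <div className="pt-3">Card content</div>;
}
```

Both render identically, but the inline version bypasses the design system, and cleaning that up is exactly the last 20% that stays with the developer.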
Quote Citation: TECHINASIA, “GitHub CEO: manual coding remains key despite AI boom”, 23 Jun 2025, https://www.
Redoing work is now extremely cheap. Code in the small is less important than structural patterns and organisation of the code in the large. You can also build lots of prototypes to test an idea out. For this, vibe-coding is great, as long as the prototype is thrown away and rewritten properly later.
This fits my growing understanding of the shift in software development: away from vibe coding (hello, Ruby on Rails would like a word) and toward using prompts to build design docs and THEN build the software.
Speaking to the US Senate Banking Committee on Wednesday to give his semiannual monetary policy report, Powell told elected officials that AI’s effect on the economy to date is “probably not great” yet, but it has “enormous capabilities to make really significant changes in the economy and labor force.”
No timeline given, but another signal that labor disruption is on the horizon. And fiddling with interest rates isn’t going to fix this one.
Since the start of 2023, more than half-a-million tech workers have been laid off, according to industry tallies. Headlines have blamed over-hiring during the pandemic and, more recently, AI. But beneath the surface was a hidden accelerant: a change to what’s known as Section 174 that helped gut in-house software and product development teams everywhere from tech giants such as Microsoft (MSFT) and Meta (META) to much smaller, private, direct-to-consumer and other internet-first companies.
Eoin Hinchy, cofounder and CEO of workflow automation company Tines, said his team had 70 failures with an AI initiative they were working on over the course of a year before finally landing on a successful iteration.
As Jim Collins says, fire bullets, then cannonballs. ‘AI’ covers so many types of solutions that to say you’re doing ‘AI’ is a lot like saying ‘we have a website’ in the late 90s. Congratulations on recognizing that the internet/AI is transformative.
With AI, code is becoming really cheap. This means that you can now build stuff that you only ever use once without feeling bad about it. Everything that you wish would make your current task easier can just be created out of thin air.
Fits in with being more ambitious because the cost of writing code is zero. But knowing what code to write is priceless. Also some good ideas on git worktrees and task delegation.
These statements betray a conceptual error: Large language models do not, cannot, and will not “understand” anything at all. They are not emotionally intelligent or smart in any meaningful or recognizably human sense of the word. LLMs are impressive probability gadgets that have been fed nearly the entire internet, and produce writing not by thinking but by making statistically informed guesses about which lexical item is likely to follow another.
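For readers who want that mechanism spelled out, here is a toy sketch of what "statistically informed guesses about which lexical item is likely to follow another" means in practice. The bigram table below is invented for illustration; a real model learns a distribution over a vast vocabulary, but the generation loop is the same kind of probability lookup.

```ts
// Toy bigram table standing in for the probabilities a trained model computes
// over its whole vocabulary at every step. Values are invented for illustration.
const nextTokenProbs: Record<string, Record<string, number>> = {
  the: { cat: 0.4, dog: 0.35, idea: 0.25 },
  cat: { sat: 0.6, ran: 0.4 },
  sat: { on: 0.9, quietly: 0.1 },
  on: { the: 0.8, a: 0.2 },
};

// Greedy decoding: pick the single most probable continuation of the last token.
function predictNext(token: string): string | undefined {
  const candidates = nextTokenProbs[token];
  if (!candidates) return undefined;
  return Object.entries(candidates).sort(([, a], [, b]) => b - a)[0][0];
}

// Generate text by repeatedly appending the highest-probability next token.
function generate(start: string, maxTokens: number): string[] {
  const out = [start];
  for (let i = 0; i < maxTokens; i++) {
    const next = predictNext(out[out.length - 1]);
    if (!next) break;
    out.push(next);
  }
  return out;
}

console.log(generate("the", 5).join(" ")); // "the cat sat on the cat"
```

Nothing in that loop models meaning; it only ranks continuations by probability, which is the point the quoted passage is making.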